UNITED KINGDOM

AI use by students surges to 92%, second HEPI survey finds

With 92% of students now using generative AI in some form, AI tools are clearly here to stay, says Josh Freeman, author of a just-published survey of AI use by students across the United Kingdom. “The next phase has to be the integration of AI into course content and into the way we teach.

“If students are equipped to use AI tools effectively – by which I mean productively but also intelligently, critically, and with full understanding of the limits of those tools and where they’re useful and where they’re not – they will be much more productive,” Freeman told University World News. “If students don’t, they will have a disadvantage.”

Freeman is policy manager at the Higher Education Policy Institute (HEPI) based in Oxford, which released the Student Generative AI Survey 2025 as a policy note on 26 February. The poll followed the inaugural 2024 AI Survey and was produced by HEPI in conjunction with the digital learning platform Kortext.

The widespread use of generative AI (GenAI) in British higher education is a fait accompli, and students are using AI tools in various ways in learning and assessment. Freeman believes the next step is entrenchment in teaching and learning.

“A key reason students go to universities is to develop the skills they will need in the workplace, and AI skills are already absolutely essential in the workplace. This is obviously challenging because it takes time to develop new courses and to integrate AI into curricula,” he said in an interview.

“But I do think this is essential and that we’re starting to see a cautious trend in that direction, both in terms of new courses, courses with AI in the titles, but also embedding AI into courses that are otherwise not connected to AI,” he noted.

Internationally, Freeman added: “many of the issues faced in the UK are also going to be faced by students and institutions around the world”.

Previous research has suggested that international students use AI at much higher levels than home students. Language is a part of that, with students for whom English is a second language turning to GenAI for help.

Surge in GenAI use among students

The survey polled 1,041 full-time undergraduate students through the data and marketing research company Savanta in December 2024. The responses were weighted on demographics such as gender, institution type, and year of study to ensure representative results.

One of the most striking findings of Student Generative AI Survey 2025 is the rate of increase in AI use, said Freeman: “the extremely rapid rate of uptake and the extent to which students have been responsive to this new trend and have jumped on it as something that’s going to enhance their productivity”.

According to the policy report, the survey for 2025 found that student use of AI had surged in the last year, with 92% now using AI in some form – up from 66% in 2024 – and 88% having used GenAI for assessments, compared to 53% in 2024.

“The main uses of GenAI are explaining concepts, summarising articles, and suggesting research ideas, but a significant number of students – 18% – have included AI-generated text directly in their work,” said the report.

“When asked why they use AI, students most often find it saves them time and improves the quality of their work. The main factors putting them off using AI are the risk of being accused of academic misconduct and the fear of getting false or biased results,” the report noted.

Women are less enthusiastic about using GenAI and more worried about its risks and weaknesses than men, the survey found. “Men report more enthusiasm for AI throughout the survey, as do wealthier students and those on STEM courses.

“The digital divide we identified in 2024 appears to have widened,” Freeman writes in the report.

The most popular use of GenAI is to ‘explain concepts’ – 58%, up from 36% last year. “The largest increase has been in the use of AI to summarise articles and this is now the second most popular use of GenAI, up from a third in 2024,” stated the report.

One-quarter of students use AI-generated text to help them draft assessments, and 18% submit AI-generated text that they have edited in their assessments.

“The proportion of students using AI to generate text with tools such as ChatGPT has more than doubled from less than a third (30%) to nearly two-thirds (64%), and this is now by far the most popular use of AI.

“Tools for editing work, such as Grammarly, and for working with university textbooks, such as Kortext, are the second and third most popular,” the report noted.

High use of AI in schools is unexpected

Another unexpected finding, from a new question for 2025, was that a high 45% of students had already used AI while at school. “[M]ore students agree AI-generated content would get a good grade in their subject (40%) than disagree (34%),” stated the report.

Universities have been relatively slow to pick up GenAI, and Freeman had expected that schools, with even fewer resources, would be even more cautious. A follow-up question, he said, is whether schoolchildren are being instructed or supported to use GenAI or are trying it out for themselves.

While the use of GenAI has soared, attitudes to it are more mixed, said the report. “In some areas, students are more hesitant about AI use than in 2024,” the report stated. However, “the proportion who consider it acceptable to include AI text in assignments after editing has grown sharply from 17% to 25%”.

Students were asked about factors that encourage or discourage them from using GenAI. As in 2024, students said that saving time was a big reason for using AI, as was improving the quality of their work.

Asked about discouraging factors, more than half of students chose the risks of being accused of cheating and of obtaining false results or GenAI ‘hallucinations’ – made-up ‘facts’, statistics, or citations: 39% of students said they ‘rarely’ experience hallucinations but 30% did ‘quite often’.

Successes and failures

There have been successes as well as failures in higher education’s response to generative AI, which burst onto the global stage in November 2022 with ChatGPT. On the failure side, for instance, 31% of students said their institution bans or discourages AI use.

But the survey found that universities and colleges have been very successful in protecting the integrity of assessments, “with 80% of students surveyed agreeing that their institution has a clear AI policy and 76% saying their institution would spot the use of AI in assessed work – both increases from the 2024 Survey,” the report said.

While 67% of students believed it essential to have good GenAI skills, only 36% said they had received support from their institution to develop their AI skills.

Freeman suggested to University World News: “Institutions have had this really effective response to protecting assessment. But the fact that effort has been spent there possibly means less effort has been put into giving students the essential AI skills that they will need later, for example in the workplace, and supporting students to develop the skills to understand the benefits and the harms of AI.”

One way institutions could support students with their AI skills is by providing AI tools. ChatGPT was listed by students as the tool most provided by their institution, followed by Microsoft’s Copilot tool. Also mentioned were Grammarly, Turnitin, Google Gemini, Adobe AI Assistant and large language models developed in-house by institutions.

The 2024 survey argued that institutions providing AI tools could help to close digital divides based on affordability. This year 53% of students agreed that institutions should provide AI tools, up from 30% last year; 26% said their institution currently does so, up from 9% last year.

The AI literacy of staff has also increased sharply, with 42% of students reporting that staff are ‘well-equipped’ to help them with AI, considerably up from just 18% in 2024.

While staff are widely adopting AI, said the report, they “fear students are becoming too reliant on AI tools, crowding out critical thinking. Meanwhile, AI-related academic misconduct cases have soared”.

In the last year, Freeman added: “you’ve seen a much wider culture of experimentation among staff. I think most staff have now had a go at using AI tools. There are extraordinary experts and real innovation going on in some institutions. We’re starting to see an increase in staff literacy, which is great”.

How AI affects student work productivity

Another really striking finding, said Freeman, came from a new question about whether students would put in more or less effort when they were told their exams would be assessed mainly or partly by AI.

While 34% said they would put in more effort, 29% said they would exert less effort and 27% said their effort would not change.

In other words, the largest share of students said they would put more effort into assessments involving AI.

“That was surprising. We thought students would be quite sceptical of the ability of AI to give them feedback. But they really rate getting instantaneous feedback on their work, which might explain that real openness to using AI tools in this area,” Freeman told University World News.

There was a slightly more sceptical response from Russell Group universities – Britain’s older, research-intensive universities.

“I think if you’re a newer institution, you will possibly have more flexible processes that you’ve been able to adapt more quickly. Russell Group institutions tend to be older and possibly have more entrenched and deeply embedded processes and might be less flexible as a result.”

Also interesting is that while students don’t feel they are being given enough support to improve their AI skills, they do seem to think lecturers are stepping up to the plate.

There was a big year-on-year difference, said Freeman. The proportion of students who said their institution was well-equipped to support them with GenAI rose from 18% to 42% – “a striking increase”.

Conclusions and recommendations

A primary conclusion from the survey is that in just two years the use of AI in higher education has become widespread. There are four other conclusions:

First, on the whole, universities and colleges remain more sceptical of AI for teaching and learning than they are supportive. Secondly, students like AI because it saves them time, and they feel it improves their work – though many are put off by the risk of being accused of cheating.

Thirdly, there are persistent digital divides in AI competency, said the report, not just in gender (with men the more frequent users) and socio-economic group (favouring wealthier students), but also by subject, with arts and humanities students using AI tools less than STEM students.

And fourth, the ethics and norms of AI use are in flux. “Students are deeply divided on what AI can legitimately be used for, particularly regarding the use of AI-generated text in assessments.”

The HEPI report has five recommendations.

First, all institutions should continually review assessments and assessment procedures “to keep up with the growing power of AI tools and students’ competency in using them”.

Secondly, “every member of staff involved in setting exams should have a deep working understanding of AI tools”. Institutions should provide ongoing training in AI tools and robust assessment design, and staff should familiarise themselves with and use AI tools.

Thirdly, institutions should adopt a nuanced policy that reflects the fact that student use of AI is inevitable and often beneficial. “A failure to teach AI actively risks widening digital divides.”

Fourth, institutions should keep AI policies under constant review as the capabilities of AI technologies develop.

And finally: “Institutions should seek opportunities to cooperate. Alone, no one institution can make the great strides required to adapt to AI fully,” the report stressed. “The sector should share best practices and create forums for collaboration around mutual problems.”

Email Karen MacGregor: macgregor.karen@gmail.com.