Experts differ over way forward for research assessment
However, they are doing so with a glaring gap at the heart of government that has left the UK without a science minister since Prime Minister Boris Johnson announced he would step down once a new Conservative Party leader is elected, scheduled for 5 September.
To help fill the vacuum, a dozen different experts have offered their thoughts in a collection of essays published by the Higher Education Policy Institute (HEPI) on 1 September 2022, under the title Research Evaluation: Past, present and future.
According to the outgoing head of Research England, David Sweeney, the variety of voices in the collection, edited by HEPI Policy Manager Dr Laura Brassington, should move the research community beyond ‘Twitter-style’ debates about how to recognise and reward research excellence.
HEPI Director Nick Hillman hopes the report will be at the top of the in-tray for any new science minister, telling University World News: “If we are to get anywhere near the Conservatives’ manifesto commitment to spend 2.4% of GDP on research and development, it is essential the new prime minister appoints a big figure as the next minister for science.”
‘Rip up the rulebook’
There was no holding back contributors to the new HEPI report, with Peter Mandler, professor of modern cultural history at Gonville and Caius College, University of Cambridge, writing: “The time is ripe for a root-and-branch reconsideration. Rip up the rulebook and start again.”
His chapter titled “REF 2028? Think Again” admitted that he used to defend these exercises as “the least-worst way to distribute limited government research funding across the wider range of universities that has emerged since the early 1990s” while “preserving the essential element of peer review”.
Now, he is not so sure, complaining that “in the most recent exercise, direct assessment of research counts for only 60% of the outcome”, with a bundle of measures of research culture called ‘environment’ counting for 15% and an assessment of impact that reaches beyond academia accounting for 25%.
He claimed: “What started out as a research assessment exercise has ended up as more of a public relations assessment exercise, with largely rhetorical documents contributing more and more to the calculus.”
An evolutionary approach
Taking an opposing view, HEPI President Bahram Bekhradnia, who was previously director of policy for the Higher Education Funding Council for England, which presided over several research assessment exercises, said: “The REF has evolved, and arguably been improved, in part in response to criticisms of its processes, and in part to take account of changing academic, social and political realities.”
He agreed that the most significant change, and perhaps the most controversial, has been the inclusion of ‘impact’ in “direct response to a political requirement to show that public investment in research produces clear benefits” to society.
Bekhradnia said that, after a shaky start, “the assessment of impact is a rare example of political interference in essentially academic matters resulting in an improvement”.
He also hailed the evaluation and assessment of interdisciplinary work in the latest exercise following criticism that academics were previously inhibited from undertaking such research.
The REF has evolved and “there can be little doubt that it has been one of the reasons for the international pre-eminence of UK universities in research”, he argued, before accepting that it has its “downsides”.
“As a dominant feature of academic life, it has led academics – and university leaders – to focus on research over teaching and other academic activity.
“The measures taken to increase the value placed on teaching have been puny and unsuccessful compared to the imperative of the REF,” he said.
Bekhradnia suggested it had been “the victim of its own success”, requiring “ever increasing contortions to use REF results in a way that protects the funding of the strongest research universities” while acknowledging that “an increasing number of universities have progressively improved the quality of their research”, as University World News reported in May.
Tension between ‘excellence’ and diversification
In her chapter looking at the history of research assessment, Dr Helen Carasso from the education department at the University of Oxford said: “Although not always as immediately evident, the perceived tension between ‘excellence’ and diversification that underlies many of the policy debates about teaching within higher education institutions is also at the core of discussions about the nature of the UK’s academic research base.”
She said the legacy of six research assessment exercises to date was “a boost for the international standing of UK research, through concentrated support for certain disciplines and universities, at the cost of increased stratification of institutional research capability and individual research opportunities within the country”.
Clare Viney, chief executive officer of the Careers Research and Advisory Centre Ltd, wrote in her chapter on research culture: “The way research is being conducted is changing. Increasingly, researchers are expected to work collaboratively, interdisciplinarily, inter-sectorally and internationally.
“They are expected to share their research, data and publications openly, and demonstrate the social and/or economic impact of their research. These developments raise the importance of values such as research integrity, ethics and reproducibility.”
Who is inside the academy?
In the concluding chapter, “Passing on the baton”, David Sweeney, outgoing head of Research England, accepted that “the move from theorising about research culture to delivery mechanisms is challenging” and said while he talks about “the academy”, there also needs to be discussion about who is inside that academy.
“Our research teams now include experts with a range of professional responsibilities, whether they be technicians, statisticians, librarians, research managers and so on.
“In considering research assessment the work of those teams should be assessed appropriately, not just the outputs which bear the names of the principal investigators and some colleagues.”
As editor of the report, Brassington told University World News the collection “highlights the strength of the UK research landscape, but it also emphasises the need to focus on the four nations and ensure equal attention is paid to higher education institutions across the UK”.
“There is a clear need for more open dialogue between the different stakeholders in research assessment. It’s what makes this edited collection so valuable, in that we were able to provide a platform to air the different views of a wide range of authors.”
Brassington said she “completely agrees there is a clear need for more international dialogue on research assessment and funding allocations” and is working on a second volume of this edited collection in which she intends to compare the means and metrics for research assessment in an international context.
Nic Mitchell is a UK-based freelance journalist and PR consultant specialising in European and international higher education. Follow @DelaCour_Comms on Twitter. Nic also blogs at www.delacourcommunications.com.