Is the rhetoric of research ‘excellence’ holding us back?

‘Transforming Research Excellence: New ideas from the Global South’ by Erika Kraemer-Mbula, Robert Tijssen, Matthew L Wallace and Robert McLean was recently published open access by African Minds, based in Cape Town, South Africa. This is an edited extract of the introduction.

Opting for a broader, more fluid concept of excellence requires developing measures able to capture the multiple dimensions in which we expect research to deliver social value. This process calls for joint efforts involving engagement and co-creation with relevant social actors. Such performance criteria also depend on geography – the location where the science is done, and where the primary users and potential beneficiaries of scientific findings are to be found.

As one moves from a ‘global’ to a ‘local’ perspective, or from science in the Global North to that of the Global South, the core analytical principle should be: scientific excellence cannot and should not be reduced to a single criterion, or to quantitative indicators only.

Any criterion of excellence in Global South science that does not take these considerations into account produces inadequate views and indicators of research performance, inappropriate assessment criteria, and therefore problematic rationales for justifying the exclusivity of those tagged as ‘excellent’.

Excellence becomes even more ambiguous when universities are described (or, more often, self-described) as ‘excellent’. For example, the Research Excellence Framework (REF) in the United Kingdom – which allocates performance-based funding to universities and promotes high-quality research through an explicitly competitive scheme – and statistics on research publication performance both reflect an increasing focus on university rankings, and to a lesser degree country rankings, where the ‘excellence’ rhetoric hinders important debates and capacity building that should take place within these scholarly institutions (Moore et al., 2017).

In the case of rankings, measurement of excellence often relies on less-than-rigorous and frequently opaque methodologies. Politics and public relations exercises blur debates on measurement methodologies. The question is often not ‘how best to characterise the top universities’ but rather, ‘should we be ranking universities at all?’

And excellence does not accrue only to research outputs or impacts: high-quality features or outstanding performance may also emerge in knowledge-sharing or dissemination strategies, in ways of offering access to technical facilities, or in other process-related characteristics of scientific research and its infrastructures.

Measurement out of context

University rankings are often prime instances of measurement out of context. Southern academic leaders have expressed concern that reliance on the predominant approaches to ranking may broadly miss the point for Southern institutions (Dias, 2019). Worse still, rankings may exacerbate systemic bias toward the flawed approaches of the North, and undervalue unique ways of knowing, as well as essential scientific work from the South.

Local relevance should be a leading concern and one of the key performance criteria, especially in the resource-poor research environments of low-income countries of the Global South. A fuller picture can only be revealed by applying assessment criteria and indicators that put researchers and the users of research outcomes at centre stage.

Adopting user-oriented approaches will require dedicated capabilities, cash and care. It will also need a dose of creativity: well-designed experimentation with the science funding models and mechanisms of the Global South is essential to arrive at workable assessment solutions customised for resource-constrained circumstances.

Indeed, the Global South may have a head start in developing and implementing these new and much-needed approaches. By avoiding the entrenched biases and well-described flaws of the mainstay methods of excellence assessment, Southern-derived solutions may offer potential improvements globally. One example is the Research Quality Plus (RQ+) approach developed by the International Development Research Centre (IDRC) with and for its Southern research community.

In short, RQ+ presents a values-based, context-sensitive, empirically driven and systematic approach to defining, managing and evaluating research quality. As such, it is one practical and transferable response to calls to action such as the Leiden Manifesto (see McLean and Sen, 2019 for a comparison of RQ+ vis-à-vis the Manifesto’s principles).

But, as is argued in this book, RQ+ requires further trialling, testing and improvement. Still, its practical validation to date at IDRC, and at a growing number of Southern institutions, demonstrates that another way of evaluating and governing research is possible. A key purpose of this book is further critique of, and experimentation with, new approaches such as RQ+.

New options and alternative experiences

The Global South has an opportunity to do differently, and by doing so, to do better. Rethinking what makes for good science is essential; it is a process from which all can learn. But just as some of these issues can partly be traced back to the ‘blind’ quest for excellence, so too can new visions of excellence and quality have significant impacts on research systems, particularly in the Global South. In the book, we present new options and alternative experiences.

It would be entirely possible for the book to focus solely on discontents with the status quo. But that is not our intent. Our goal is to provide a platform for new perspectives that have been under-represented and undervalued in the global debates and systems driving the status quo of excellence, and thereby offer novel experiences and different ways of thinking.

We hope this lens will benefit readers in either geographical location (South or North), in any discipline of science (from pure maths to public health), and in any component (researcher, funder, university, government) of the global research system. We believe it opens a path toward a fairer, more efficient, more motivating and more impactful global research ecosystem. In the following paragraphs we suggest why.

The adverse consequences of the quest for excellence are most strongly felt in the Global South, given scarce resources and the challenge of attaining visibility on a global scale. Moreover, the less developed regions of the globe also happen to be those where socially relevant research is most needed to address pressing local and regional development issues. Hence, more appropriate criteria and performance indicators, fit for purpose in the Global South, should embrace two other guiding principles: inclusivity and local relevance.

As for inclusivity, with the rise of cooperation in science and team-based research, it has become increasingly complex – and perhaps also less relevant – to assign a quality stamp to one particular ‘excellent’ entity, be it an individual researcher, an organisation or a country. Broader visions of local relevance can also help retain and reward a more diverse set of ‘top’ researchers, and thus a greater diversity of knowledge that can be assessed and compared.

This can be achieved by recognising researchers’ motivations for not only producing high-quality science (as judged by their international peers), but also pushing the boundaries of knowledge to tackle pressing societal problems (as judged by local society).

To move in this direction, quality and excellence can be shaped to embrace a wider community of knowledge producers, brokers and users, reinforcing the ‘social contract’ that provides science with the autonomy and legitimacy to operate in the eyes of decision-makers, as well as the public. In an era where many point to declining trust in evidence and in scientists, this is sorely needed.

Useful, robust knowledge

On a more practical level, accepting a pluralistic vision of research excellence can lead to greater flexibility in research evaluation practices and in setting research agendas that reflect development needs. This highlights the importance of science granting councils which, on a national scale, can link research to national policy priorities and facilitate connections between users and producers of scientific knowledge. This means putting the onus on useful, robust knowledge that can make a difference in a given context.

While retaining what at times is a competitive process (for example, to make funding decisions), research evaluation tools, particularly in the Global South, can be empowered to be more deliberate in recognising ‘success’ or ‘quality’. Perhaps more importantly, moving away from a narrow or ‘blind’ usage of the term ‘excellence’ can enable funders to decide, based on evaluations as well as policy considerations, how to distribute research resources in a given system.

In some cases, focusing on a few ‘top’ researchers or research teams may be desirable, while in others a greater return may be obtained from a more equitable distribution of resources (for example, to promote diversity in approaches to solving grand challenges, or to build capacity in the research system).

What the South does not lack is scientific talent. Researcher capacity is another area where rethinking excellence, and how it is embedded in research systems, holds significant potential and importance for the future.

However, few young people decide on a career in science in order to outperform other researchers in terms of the number of papers published or the popularity of their papers amongst other scientists. Instead, they develop an interest in scientific research – and make the difficult and at times costly choice to enter a career in research – motivated by a desire to do better for people, to advance a business objective or even to benefit the health of our planet.

But the academic incentive and rewards systems tend to favour, compensate and advance researchers based on the number of their publications, not on the socio-economic impacts of their research. This creates an often unnecessary tension between output-driven and impact-inspired science.

Of course, researchers will seek financial rewards for their investments and efforts, and feel good receiving the acknowledgement of their peers. But if these returns were tied to underpinning motivations (say, to help people) rather than to the insular status quo (such as the number of journal publications), a challenging and demanding career choice would offer renewed incentives for hard work.

Measures of excellence that relate to the values and motivations that lead people into research would attract new entrants, and retain the fire and enthusiasm of those who do choose the path.

On a global scale, there is a real opportunity here. As the world population grows, it is expected that more than half of that growth will come from low- and middle-income countries. If Southern actors successfully align incentives to enter research with the right reasons for wanting to do research, there will be an unprecedented renaissance of science across the globe. At such a time, new ideas, advanced knowledge and fresh solutions will be most needed.

Erika Kraemer-Mbula holds the DST-NRF-Newton Fund Trilateral Research Chair in Transformative Innovation, the 4th Industrial Revolution and Sustainable Development, and is a research associate at the Centre for Law, Technology and Society, University of Ottawa, Canada, and at the Institute for Economic Research on Innovation at Tshwane University of Technology, South Africa, as well as a researcher at the DST-NRF Centre of Excellence in Scientometrics and Science, Technology and Innovation Policy (SciSTIP), South Africa.

Robert Tijssen holds the Chair of Science and Innovation Studies at Leiden University in the Netherlands, is a part-time full professor at the Centre for Research on Evaluation, Science and Technology at Stellenbosch University, South Africa, and is affiliated to SciSTIP.

Matthew L Wallace is a senior programme specialist at the International Development Research Centre (IDRC) in Ottawa, Canada.

Robert McLean is a senior programme specialist in policy and evaluation at Canada’s IDRC.