Questions about new ranking for Sub-Saharan universities

The formal launch of a Sub-Saharan Africa (SSA) university ranking on 26 June at Ashesi University in Ghana has yielded mixed responses from within the sector. The ranking is based on data collected from 121 universities and has a strong focus on undergraduate education.

The new ranking comes at a time when higher education systems across Africa are battling financial crises and their consequences, when leading universities have been assertively pursuing a research focus to enhance societal relevance and impact, and when ‘ranking scepticism’ has been growing globally. Questions have also emerged about the focus and criteria of the new ranking produced by Times Higher Education (THE).

Professor Barnabas Nawangwe, the vice-chancellor of Makerere University in Uganda, captured some of the concerns within the sector, saying the THE rankings of late were “questionable with parameters changing at each ranking season …”

“These rankings distract universities from their efforts to address challenges faced by our communities. Moreover, the element of money has sneaked into the process. African universities should develop their own benchmarks for assessing quality,” he told University World News.

But Phil Baty, the chief global affairs officer at Times Higher Education, said in an earlier blog its “mission is to support excellence in all universities, with data, insights and analysis that properly reflect and celebrate the vast diversity of institutional types all across the world – understanding inequalities of resources, and often vastly different operating contexts, missions and priorities”.

According to an explanation by THE of its methodology, a student survey included 20,000 students from 88 institutions (a minimum of 50 responses was required as part of the data collection process).

According to Africa Check, quoting uniRank, there are 1,225 officially recognised higher education institutions on the continent.

The other criteria developed with a consortium of African and international organisations include resources and finances, access and fairness, teaching skills, student engagement and Africa impact, including citations, and African research-authorship.

However, the survey was one reason why some institutions did not participate in the ranking exercise, among them Stellenbosch University (SU) in South Africa.

“A concern has been the usage of stakeholder and student surveys, as these are based on opinion or perception, and not facts.

“The increase in the number of university rankings, coupled with the growing number of subsets of these rankings, does pose a challenge in terms of institutional time spent on submitting information – matters Professor Hester Klopper, the deputy vice-chancellor of strategy, global and corporate affairs at SU, has reflected on,” SU told University World News.

But there are other reasons why some institutions are not focusing on the new rankings.

Dr Mahomed Moolla, the head of the strategic partnerships office at the University of the Witwatersrand (Wits), South Africa, said he believed universities like the University of Cape Town, Wits and SU, which do well in the global rankings, would not devote many resources to the new ranking, as they are already deemed to be in the league of exceptional research universities.

“This ranking would be looking at aspects other than research excellence. On the other hand, it also depends on their mandates: if they are mandated to achieve other targets (besides research excellence), then they might allocate resources to this ranking,” he said.

What do experts say?

Professor Goolam Mohamedbhai, the former secretary-general, Association of African Universities, honorary president of the International Association of Universities, and the former vice-chancellor of the University of Mauritius, told University World News: “As a principle, I am against ranking of universities as I feel they serve little purpose other than providing satisfaction to a small number of universities who keep yo-yoing among the highest ranked and use it as a marketing tool.”

According to Professor Jonathan Jansen, distinguished professor in the faculty of education at SU, and former vice-chancellor of the University of the Free State, there is “now enough research and common sense in circulation to indicate that the rankings are a waste of valuable academic time and a drag on the resources of already struggling universities. It is small wonder that major Ivies have dumped the rankings.

“Rankings measure, first of all, what rankers think a good university is all about and, by their own criteria, much is left out in what counts in, for example, a development university in developing contexts.

“I believe that African universities, for example, should completely ignore these rankings and get on with the kind of teaching, research and public service that counts in their particular national and regional contexts,” Jansen said.

His views align with the ‘ranking scepticism’ that has been reflected in the Hamburg Declaration by university leaders, including vice-chancellors from Africa, following a 14-16 June meeting in Hamburg, Germany, in which they say, among other comments, that they are concerned about “current global university rankings [that] often result in an unhealthy use of competition that reproduces and reinforces social inequities and global science inequalities”.

Earlier, Rhodes University in South Africa also stated that it would not be participating in ranking exercises. Professor Sioux McKenna, the director of the Centre for Postgraduate Studies at Rhodes University, wrote: “The rankings suggest that universities work in an entirely market-based ideology without recognising their responsibility as a public good. Being a public good can, at times, entail taking decisions which militate against an institution’s positioning in the rankings.”

She added that the rankings ignore social concerns, reward elite and exclusive admissions processes, and are methodologically flawed in that they consist of combining unrelated measures to produce a composite score.

McKenna also raised a point made by the other commentators: “Universities increasingly use ranking metrics to steer institutional decision-making, which drives funding away from activities such as community engagement that might otherwise have been deemed central to the institution’s purpose.”

But, Dr Peter Wells, head of education for UNESCO in Southern Africa and the former chief of higher education, UNESCO, Paris, said the criteria and methodologies of the Times Higher Education (THE) ranking of institutions in Sub-Saharan Africa will “undoubtedly” be refined and adjusted based on the feedback of the first edition of the ranking.

He said that, irrespective of opinions on rankings, “like them or loathe them, the reality is they are here to stay – at least for now.”

The key is to know how to read any ranking and to understand what is and, more importantly, what is not being ranked. UNESCO published the volume The Uses and Misuses of University Rankings precisely for this reason.

He said virtually anything could be ranked within higher education systems.

“The list is almost endless. If you want to find something to rank, you can. However, again coming back to ‘love them or loathe them’, rankings have provided an impetus for each higher education institution or even systems to reflect on what it or they consider to be ‘good quality’ – be it in what they teach, research and the community services it or they provide in individual or system contexts.

“Self-reflection and then a strategy for quality enhancement can only be a force for good – and not for the purposes of appearing on, or rising up any ranking list, but as an ethical obligation to be the best they can be for their communities.”

Why an SSA ranking?

THE’s Baty indicated that it was an opportune time to launch a Sub-Saharan Africa-focused ranking.

According to him, the number of African universities in its world rankings has increased from 27 to 97 since 2018; in the most recent rankings, the newcomers were all from Africa; and the universities’ overall scores across the world rankings’ range of 13 performance indicators have risen faster than the world average.

This contributed to the initiative to set up a “pioneering framework” to highlight the strongest universities in Sub-Saharan Africa across a unique and comprehensive range of performance indicators that go beyond the traditional research-focused metrics of the global rankings to cover not only teaching and research but also societal impact.

"THE's mission is to support excellence in all universities, with data, insights and analysis that properly reflect and celebrate the vast diversity of institutional types all across the world – understanding inequalities of resources, and often vastly different operating contexts, missions and priorities," he said.

“A strengthening cadre of top global universities in Africa will help African nations not only stem the brain drain but will also help attract international talent from outside the continent. They will attract inward investment and powerful international research collaborations.

“They will help ensure that Africans are at the forefront of new knowledge creation and technological innovation for thriving, transformed knowledge-driven economies. They will help ensure Africa’s top talent is nurtured, creating the next generation of productive, engaged citizens, supporting peace and strong democracies,” wrote Baty.

He added that the world rankings evaluate global, research-intensive universities, with metrics weighted towards research excellence and academic reputation that can favour the wealthier nations and institutions in the Global North, and the research publishing ecosystems dominated by the Anglo-Saxon world and its research priorities.

Africa’s ranking, in contrast, will explore the impact of universities in Sub-Saharan Africa in addressing some of the toughest challenges facing the continent – embracing key Agenda 2063 and United Nations priorities, said Baty.

Questions about data

Mohamedbhai, based on initial information provided by THE on 3 April, raised a few questions, echoed by other experts.

In addition to the number of participating institutions, he asked which institutions were part of the consortium driving the initiative, and whether they were representative of the continent’s systems so that the criteria could be assessed fairly. He also questioned the exclusion of research as a ranking criterion and pointed to the problem of reliable data, which could have affected the participation of institutions.

But, UNESCO Southern Africa’s Wells said that, if the rankings prompted institutions “to begin to institutionalise regular data collection, monitoring and evaluation – regardless of a THE survey – then, in my opinion, that can only be a good thing for the overall quality enhancement of any system of HEIs [higher education institutions].”

In response to questions by University World News, Wells said that, given the small cohort, his questions were how many institutions submitted data and why many more did not.

If institutions did not submit data and “the reason was because they didn’t have the data – then that is very worrying, since it suggests they don’t collect the data, that they are not monitoring their own effectiveness; that they are not concerned with data to inform their institutional development strategies and, thus, their relevance to the wider community and, subsequently, justifying the public funding to support this.”

Wells, in response to concerns over research metrics, said the new ranking does take research into account under its ‘Africa Impact’ methodology criteria.

“I do, however, feel that the strength in the first round of [the ranking] is that it takes into account other – arguably more relevant – aspects of measuring higher education in the region. As I say, there are three pillars of the university [education, research and community engagement], and not every higher education institution can be everything to everyone – they need to be something to somebody.

“Research is clearly important to inform learning and teaching, but measuring this against research-intensive higher education institutions globally is inappropriate, in my view. Small-scale research projects by students, teachers or academics that can make a difference or an impact are equally vital, not just in Africa, but globally,” he said.

Ellen Hazelkorn, a professor emeritus at the Dublin Institute of Technology, Ireland, said: “All the evidence is that rankings act in the same way as other performance incentives – they drive behaviour towards that which is being measured. This can be positive, but it can also lead to negative and perverse consequences.”

What does ChatGPT say?

Honorary Professor Robert Tijssen of the Centre of Excellence in Scientometrics and Science, Technology and Innovation Policy (SciSTIP), based at Stellenbosch University, and an international authority on rankings, said that, after 10 years of being “a critical ranker”, he had decided to say goodbye to anything related to world university rankings.

Accepting that a differently phrased question might yield another outcome, Tijssen was willing to ask ChatGPT the following question: “Give me a few reasons why it’s better to ignore world university rankings produced by commercial providers.”

In response, the bot said: “While world university rankings produced by commercial providers can offer some valuable insights, there are several reasons why it might be better to take them with a grain of salt or even ignore them altogether:

• Methodological limitations: Commercial university rankings often use complex methodologies that may prioritise certain factors over others. These methodologies might not accurately reflect the quality and suitability of education provided by universities. Additionally, the criteria used can vary significantly, leading to inconsistent rankings across different providers.

• Focus on research performance: Many commercial rankings heavily emphasise research output and citation metrics, which may not necessarily reflect the overall quality of education or the university’s ability to provide a well-rounded learning experience. These rankings tend to overlook other important aspects such as teaching quality, student satisfaction and community engagement.

• Regional and cultural bias: Commercial rankings often have a Western-centric bias, as they tend to favour institutions from English-speaking countries or those with significant financial resources. This bias can lead to the undervaluation of universities from developing countries or non-English-speaking regions, which may excel in specific disciplines or have unique educational approaches.

• Lack of context: Rankings alone cannot provide a comprehensive understanding of a university’s strengths and weaknesses. Factors such as location, campus culture, available resources, faculty expertise and career opportunities are crucial considerations for prospective students. Commercial rankings often fail to capture these nuanced aspects and may oversimplify the evaluation process.

• One-size-fits-all approach: Students have diverse needs and preferences when it comes to higher education. The notion of a single global ranking fails to consider the varying priorities, aspirations, and interests of individual students. Students should focus on finding a university that aligns with their specific academic, career and personal goals rather than solely relying on a ranking system.

• Potential for manipulation: The pursuit of higher rankings can create a competitive environment where universities might be incentivised to prioritise actions that improve their rank rather than focusing on long-term educational development. This can lead to a distortion of priorities and undermine the quality of education provided.

• Evolving landscape: The higher education landscape is dynamic, and universities continuously evolve and adapt to changing needs and demands. Commercial rankings are typically released on an annual basis and might not reflect the most up-to-date information or changes occurring within universities.

It’s important to approach university rankings critically and consider them as just one of many factors in the decision-making process. Engaging in thorough research, consulting multiple sources, visiting campuses, and considering personal preferences are crucial steps in finding the right university fit.”

Said Tijssen: “I couldn’t have phrased it better.”