
Will global rankings boost HE in emerging countries?

When global university rankings started with the publication of the first Academic Ranking of World Universities, or ARWU, by Shanghai Jiao Tong University back in 2003 and the Times Higher Education Supplement-Quacquarelli Symonds (QS) World University Rankings in 2004, interest was patchy. Latin America, South Asia, the Arab and Muslim worlds and East and Central Europe generally ignored them.

Things have changed a lot since then. There are places where global university rankings seem to be a substitute, if not for war, then at least for the World Cup or the Olympics.

As globalisation and competition continue unabated, we find universities using rankings to woo students and talented researchers and to win funds from government and industry. Dire consequences for national economies are threatened if national flagships are unrepresented in the top 200 of the QS or THE – now powered by Thomson Reuters – world charts.

Increasingly, national governments are sponsoring local rankings, ratings and classification schemes or seeking the advice and assistance of the big ranking groups.

It is QS and THE that have been at the forefront of the expansion of the rankings industry into new markets, although Shanghai ARWU has produced a Greater China ranking that includes Taiwan, Hong Kong and Macau, as well as a ranking for Macedonia. Both QS and THE have produced Asian university rankings and young university rankings.

Russian concern

Russia and the former Soviet republics have been particularly concerned about the poor international showing of their universities.

In 2012 only Lomonosov Moscow State University and the Moscow Engineering Physics Institute made it into the top 400 of the THE rankings. MEPhI should not have been there as it is a single-subject institution, and it was removed in 2013.

In May 2012 President Vladimir Putin announced that Russia aimed to have five universities in the top 100 of the global university rankings. The reports did not say which rankings, but he probably had QS in mind.

The Shanghai rankings, which include Nobel and Fields awards as indicators, are difficult to scale quickly and it is likely that local institutions were irked by THE’s elevation of a small research institution, even if a respected one, to the top of the world’s research impact tree by virtue of contributing to just two research reviews.

In 2013, it was reported that an agreement was reached between Interfax, the Russian news agency, and QS to produce two new rankings. One was a ranking of universities in the former Soviet territories of the Commonwealth of Independent States plus the Baltic States and Georgia.

It seems that the CIS rankings were actually conducted by Interfax itself, although it may have used QS’s basic methodology.

The other was a ranking of universities in the emerging BRICS economies of Brazil, Russia, India, China and South Africa. The term ‘BRIC’ was originally coined by economist Jim O’Neill to cover economies with large populations and considerable potential for growth. South Africa joined the group more recently, although it is considerably smaller than the other four.

The idea of a BRICS ranking is not really new. Webometrics already had a BRICS ranking with a Brazilian university, Sao Paulo, in first place and one Russian and eight Chinese universities in the top 10.

THE BRICS plus emerging countries ranking

On 4 December Times Higher Education came out with its own BRICS plus emerging countries ranking that included not just Brazil, Russia, India, China and South Africa but 17 other ‘emerging economies’ as defined by FTSE. The countries ranged from Taiwan to Egypt to Mexico.

As expected, China did very well, with Peking University in first place and Tsinghua University in second and a total of 23 universities in the top 100.

Third place went to the University of Cape Town while National Taiwan University was fourth and Bogazici University in Turkey was fifth. Altogether Taiwan had 21 universities in the top 100, India 10, Turkey seven and South Africa and Thailand five each.

Having seized nearly a quarter of the top 100 places and first and second place, China did have cause for celebration. But when we look at population size, China's achievement shrinks considerably while Taiwan emerges as the undisputed winner; Eastern Europe does well and the gap between Russia and China is drastically reduced.
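To see why, here is a back-of-the-envelope sketch in Python. The top-100 counts are those reported above; the populations are my own approximate 2013 figures, used purely for illustration rather than anything published with the rankings.

```python
# Rough per-capita comparison of THE BRICS & emerging economies top-100 counts.
# Top-100 counts are taken from the ranking as reported above; populations are
# approximate 2013 figures (in millions) and are assumptions for illustration only.
top_100_counts = {
    "China": 23,
    "Taiwan": 21,
    "India": 10,
    "Turkey": 7,
    "South Africa": 5,
    "Thailand": 5,
}
populations_millions = {
    "China": 1360,
    "Taiwan": 23,
    "India": 1250,
    "Turkey": 76,
    "South Africa": 53,
    "Thailand": 67,
}

# Universities in the top 100 per 100 million people.
per_capita = {
    country: 100 * count / populations_millions[country]
    for country, count in top_100_counts.items()
}

for country, score in sorted(per_capita.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{country}: {score:.1f} top-100 universities per 100 million people")
```

On these rough figures Taiwan comes out far ahead of everyone else, while China and India fall to the bottom of the list.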

THE did not change the methodology it had used in the 2013 World University Rankings, so there was nothing new about the top 37: all of them had appeared in the global top 400 in 2013 and they appeared here in the same order.

It is questionable whether a methodology designed to discriminate among the world’s elite research-based universities is suitable for a ranking of universities in such a variety of countries and with such differing missions.

QS BRICS rankings

The QS rankings were published on 17 December. QS did modify its methodology for this ranking, taking 10% off the weighting of the academic survey and adding it to the employer survey. A new indicator – staff with a PhD – was added.

The citations per faculty indicator used in the world rankings was replaced by two indicators, papers per faculty and citations per paper. The weighting for international faculty and international students was reduced from 10% to 5%.
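As a rough guide to how such a composite score is assembled, here is a minimal sketch. The weights reflect my reading of the changes just described, but the exact percentages are my assumption rather than QS's published figures, and the example scores are invented.

```python
# Illustrative composite-score calculation in the style of the QS BRICS ranking.
# The weights below follow the changes described above (academic survey reduced,
# employer survey increased, staff with a PhD added, papers per faculty and
# citations per paper in place of citations per faculty, international indicators
# halved), but the exact percentages are assumed, not QS's published figures.
ASSUMED_WEIGHTS = {
    "academic_reputation": 0.30,
    "employer_reputation": 0.20,
    "faculty_student_ratio": 0.20,
    "staff_with_phd": 0.10,
    "papers_per_faculty": 0.10,
    "citations_per_paper": 0.05,
    "international_faculty": 0.025,
    "international_students": 0.025,
}

def composite_score(indicator_scores: dict[str, float]) -> float:
    """Weighted sum of indicator scores, each already scaled to 0-100."""
    return sum(ASSUMED_WEIGHTS[name] * indicator_scores.get(name, 0.0)
               for name in ASSUMED_WEIGHTS)

# A hypothetical university that scores well on the surveys but poorly on
# internationalisation: half of its total still comes from the two surveys.
example = {
    "academic_reputation": 80, "employer_reputation": 75,
    "faculty_student_ratio": 60, "staff_with_phd": 70,
    "papers_per_faculty": 55, "citations_per_paper": 45,
    "international_faculty": 10, "international_students": 15,
}
print(round(composite_score(example), 1))  # prints the weighted total on a 0-100 scale
```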

QS has made some effort to make its methodology more appropriate to emerging countries by putting more emphasis on the employability of graduates and less on the recruitment of international staff and students, and by giving more weight to the production of research rather than to its impact.

Even so, with half of the score dependent on two surveys, which almost certainly attract very few responses for institutions further down the table, large fluctuations are very likely if these rankings continue.

This would also apply to Times Higher Education, whose reputation survey accounts for nearly a third of its weighting. The citations index, in which a single contribution to a single publication can produce a huge dividend of field normalised citations, is also likely to be a future contributor to instability.
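A simplified sketch of field normalisation shows why. Each paper's citations are divided by the expected world-average count for papers in the same field and year, and the institution's score is the average of those ratios; with a small publication count, a single heavily cited paper dominates. This is a stripped-down version of the idea rather than THE's exact procedure, and all figures below are invented.

```python
# Field-normalised citation impact, sketched from first principles.
# Each paper's citations are divided by the expected (world-average) citation
# count for its field and year; the institution's score is the mean of the ratios.

def normalised_impact(papers: list[tuple[int, float]]) -> float:
    """papers: (citations, expected_citations_for_field_and_year) per paper."""
    return sum(cites / expected for cites, expected in papers) / len(papers)

# A small institution with 50 ordinary papers plus one co-authored review
# cited thousands of times: the single outlier dominates the average.
small_institution = [(8, 10.0)] * 50 + [(3000, 30.0)]
print(round(normalised_impact(small_institution), 2))   # about 2.75

# A large institution with 5,000 ordinary papers and the same outlier barely moves.
large_institution = [(8, 10.0)] * 5000 + [(3000, 30.0)]
print(round(normalised_impact(large_institution), 2))   # about 0.82
```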

Comparing rankings

Comparing these two rankings shows that variations in methodology make a lot of difference.

South African universities seem to have suffered from QS’s methodological changes. The University of Cape Town was third in the THE rankings but 11th in QS, probably because its international students and faculty counted for less. The University of the Witwatersrand was 10th among the BRICS in THE but 31st in QS.

Russia did better in the QS than in the THE rankings. Among the BRICS countries, Lomonosov Moscow State University was sixth in THE, but rose to third place in the QS rankings, which also included some highly specialised institutions excluded by THE such as the Moscow Engineering Physics Institute, Moscow State Institute of International Relations and Moscow Power Engineering Institute.

Indian universities did not do particularly well in either ranking, but there were differences in who was proclaimed the national champion.

In the QS rankings, the Indian institutes of technology were supreme among Indian institutions. There were seven listed before the University of Calcutta appeared in 52nd place. In the THE rankings the best Indian performer was Panjab University, which was absent from the QS rankings.

I suspect that Panjab University is an example of rankings shopping, where universities target one specific ranking, and that there is a very smart person directing its ranking strategy.

Panjab University has invested money in participation in the Large Hadron Collider project, precisely the kind of activity that profits from THE's field normalised citations indicator, while its overall number of publications has not risen excessively.

Recently, the university proposed to establish integrated master's and doctoral programmes, good for two THE indicators, and to increase research collaboration, good for another.

It is likely that we are going to see more regional or special category rankings and equally likely that there will be significant differences, depending on sponsor or collaborator and methodology.

Whether this will contribute to genuine improvement in global higher education is debatable.

* Richard Holmes is a lecturer at Universiti Teknologi MARA in Malaysia and author of the University Ranking Watch blog.