Power and responsibility – The growing influence of global rankings

A few years ago I remember a dean at a Malaysian university urging faculty to look out for potential external examiners. There was one condition. They had to be at universities in “the Times” top 200. The dean, of course, was referring to the then Times Higher Education Supplement-Quacquarelli Symonds World University Rankings.

Time has passed and the THE-QS rankings have since split into two rather different league tables. More global rankings have emerged, along with a succession of spin-offs: regional, reputational, subject and young university rankings.

Rankings have become very big business and they are acquiring a prominent role in the policies of university administrators and national governments.

Times Higher Education used to be proud of the attention its rankings received. The THE ranking archives from 2004-09 contain this introduction:

“The publication of the world rankings has become one of the key annual events in the international higher education calendar. Since their first appearance in 2004, the global university league tables have been recognised as the most authoritative source of broad comparative performance information on universities across the world.

“They are now regularly used by undergraduate and postgraduate students to help select degree courses, by academics to inform career decisions, by research teams to identify new collaborative partners and by university managers to benchmark their performance and set strategic priorities.

“As nations across the globe focus on the establishment of world-class universities as essential elements of economic policy, the rankings are increasingly employed as a tool for governments to set national policy.”

Arbiters of excellence

Rankings have indeed become arbiters of excellence. They are cited endlessly in advertisements, prospectuses and promotional literature.

They influence government strategy in some countries and getting into the top 50, 100 or 200 is often a target of national policy, sometimes attracting as much attention as grabbing medals at the Olympics or getting into the World Cup quarter-finals.

There have even been proposals to use rankings as an instrument of immigration policy, presumably to ensure that only smart people are added to the workforce.

In 2010, politicians in Denmark suggested using graduation from one of the top 20 universities as a criterion for immigration to the country. The Netherlands has gone even further. Take a look at this page from the Dutch government’s London embassy website:

To be considered a ‘highly skilled migrant’ you need:

“A masters degree or doctorate from a recognised Dutch institution of higher education listed in the Central Register of Higher Education Study Programmes (CROHO) or a masters degree or doctorate from a non-Dutch institution of higher education which is ranked in the top 150 establishments in either the Times Higher Education 2007 list or the Academic Ranking of World Universities 2007 issued by Jiao Ton Shanghai University [sic] in 2007.

“The certificate or diploma must also be approved by the Netherlands Organisation for International Cooperation in Higher Education (NUFFIC). To obtain this approval, you need to send your document(s) to: NUFFIC, Postbus 29777, 2502 LT Den Haag, The Netherlands.”

In another document those who meet the above criteria are described as “highly educated persons”.

Admission to The Netherlands under this scheme is not automatic. There are additional points for speaking English or Dutch, being between 21 and 40 or graduating from a university that has signed up for the Bologna declaration.

So suppose you have no job and a poor masters in humanities from the university in 149th place in the 2007 THES-QS world university rankings – City University of Hong Kong.

I suspect you would have problems getting a job in Hong Kong – but you could still be eligible to be a highly skilled migrant to The Netherlands, provided you spoke English and were in your twenties or thirties.

City University of Hong Kong graduates are fortunate. If the Dutch government had picked the 2006 rankings as the benchmark, the university would not have been on the list.

And too bad for those with outstanding doctorates in physics, engineering or philosophy from Tel Aviv university. In 2007 their university would not have been on the list, having fallen from 147th place in 2006 to 151st in 2007.

Also, perhaps someone should tell The Netherlands government about what one has to do to turn a bachelor of arts degree into a master of arts from Oxford or Cambridge.

Recently, Russia indicated that it will make a placing in the major rankings a condition for recognition of foreign degrees, and India has stated that local universities can only enter into agreements with those universities in the Shanghai rankings or THE rankings – to be precise only those in the top 500 of those rankings.

There is something odd about this. THE prints the top 200 universities and has another 200 on an iPhone app. So where are the other 100 coming from? Or was it just a journalistic misunderstanding?

Choice of rankings is disturbing

To some extent, all this appears to be another example of the pointless bureaucratisation of modern universities, where the ability to write proposals or list learning outcomes is more highly valued than actual research or teaching.

Most academics, left to their own devices, could surely judge the suitability of potential collaborators, external examiners or contributors to journals just as well as the THE or Shanghai or QS rankers.

As for using rankings to select immigrants, if the idea is to pick smart people, then the Wonderlic test would probably be just as good. After all, it worked very well for the US National Football League.

More disturbing perhaps is the choice of rankings. Few people would argue with using the Shanghai ARWU rankings to evaluate universities. Their reliability and methodological stability make them an obvious choice.

But the THE rankings are only two years old and underwent drastic methodological changes between the first two editions. Is India proposing to consider the 2010 or the 2011 rankings? And if methods change again in years to come, what will happen to an agreement negotiated with a university that is in the top 500 one year but not the next?

Phil Baty, head of the THE ranking, has just published an article in University World News accepting that rankings are inherently crude and that they should be used with care. This is most welcome and it is certainly an improvement on those previous pronouncements.

Let us hope that the THE rankings do become more transparent, starting with breaking up the clusters of indicators and reducing dependence on Thomson Reuters and its normalised citation indicator.

Another dangerous thing about the Indian government proposals is that Thomson Reuters and its ISI division are the source for two of the Shanghai indicators, publications and highly cited researchers, and they also collect and analyse data for THE.

The idea of a single organisation shaping higher education practices and government policy around the world, even deciding who can live in prosperous countries, is not an attractive one.

How to respond

So what can be done?

The International Rankings Expert Group has been getting ready to audit rankings, but so far there seems to be no sign of anyone actually being audited. Regulation does not seem to be the answer then.

Perhaps what we need is healthy competition between rankings, and constant criticism.

It would help perhaps if governments, universities and the media paid some attention to other rankings such as Scimago, HEEACT from Taiwan, and URAP from the Middle East Technical University, not just to the big two or the big three.

These could be used to assess the output and quality of research since they appear to be at least as good as the Shanghai Rankings, although they are not as broadly based as other world rankings.

But above all, Phil Baty’s admission that there are aspects of academic life where rankings are of little value is very helpful. For things like collaboration and recognition, common sense and disciplinary knowledge and values should be just as valid a guide, maybe more so.

* Richard Holmes is a lecturer at Universiti Teknologi MARA in Malaysia and author of the University Rankings Watch blog.