GLOBAL

Going up and going down
The latest edition of the Times Higher Education, or THE, World University Rankings has just been published, along with a big dose of self-flattery and congratulations to the winners of what is beginning to look more like a lottery than an objective exercise in comparative assessment.

The background to the story is that at the end of last year THE broke with their data supplier Thomson Reuters and announced the dawn of a new era of transparency and accountability.
There were quite a few problems with the THE rankings, especially with the citations indicator, which supposedly measured research impact and was given nearly a third of the total weighting. This meant that THE faced a serious dilemma. Keeping the old methodology would be a problem, but radical reform would raise the question of why they would want to change what they claimed was a unique, trusted and sophisticated methodology with carefully calibrated indicators.
It seems that THE has decided to make a limited number of changes, but to postpone making a decision about other issues.
They have broadened the academic reputation survey, sending out forms in more languages and getting more responses from outside the US. Respondents are now drawn from those with publications in the Scopus database, which is much larger than the Web of Science. Data about publications and citations are also now taken from this source.
In addition, THE has excluded 649 “freakish” multi-author papers, mostly in physics, from their calculations and diluted the effect of the regional modification that boosted the citation scores of universities in low-performing countries.
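The arithmetic behind the regional modification is not spelled out in the rankings themselves, but its effect can be sketched roughly. The snippet below is a minimal illustration only: it assumes, as has been described publicly, that the modification divides a university’s normalised citation impact by the square root of its country’s average impact, and that “diluting” means blending the modified and unmodified figures 50:50. The function name and all numbers are hypothetical, not THE’s own.

```python
# Illustrative sketch only -- not THE's published code or exact formula.
# Assumption: the regional modification divides a university's normalised
# citation impact by the square root of its country's average impact;
# "dilution" is modelled here as a 50:50 blend of modified and raw scores.

def citation_score(uni_impact: float, country_avg_impact: float,
                   blend: float = 1.0) -> float:
    """Return a citation impact with a regional modification applied.

    blend = 1.0 -> fully modified (assumed old approach)
    blend = 0.5 -> half modified, half raw (assumed diluted approach)
    """
    modified = uni_impact / country_avg_impact ** 0.5
    return blend * modified + (1 - blend) * uni_impact

# A university with impact 0.8 in a country whose average impact is 0.5:
print(round(citation_score(0.8, 0.5), 2))       # 1.13 -- boosted well above its raw impact
print(round(citation_score(0.8, 0.5, 0.5), 2))  # 0.97 -- the boost is roughly halved
print(round(citation_score(0.8, 1.0), 2))       # 0.8  -- no boost at the world average
```

The lower a country’s average impact, the bigger the boost, which is why diluting the modification hits universities in low-performing countries hardest.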
These changes have led to some remarkable fluctuations, with some institutions rising or falling by dozens or even hundreds of places. Fortunately for THE, the latest winners are happy to trumpet their success, while the losers so far seem to have lapsed into an embarrassed silence.
Grabbing the headlines
When they were published on 30 September the rankings provided lots of headline fodder about who was up or down.
The Irish Times announced that the rankings showed that Trinity College Dublin had fallen while University College Dublin was rising.
In the Netherlands the University of Twente bragged about its “sensationally higher scores”.
Study International asserted that “Asia Falters” and that Britain and the US were still dominant in higher education.
The Telegraph in London claimed that European universities were matching the US.
The Hindu found something to boast about by noting that India was at last the equal of fellow BRICS member Brazil.
Russian media celebrated the remarkable achievement of Lomonosov Moscow State University in rising 35 places.
And, of course, the standard THE narrative was trotted out again. British universities are world class, but they will only go on being world class if they are given as much money as they want and are allowed to admit as many overseas students as they want.
The latest rankings support this narrative of British excellence by showing Oxford and Cambridge overtaking Harvard, which was pushed into sixth place. But is this believable? Has anything happened in the labs or lecture halls at any of those places between 2014 and 2015 to cause such a shift?
Oxbridge rises
In reality, the Oxbridge duo probably did not do anything much better this year; rather, Harvard’s eclipse came from a large drop, from 92.9 to 83.6 points, in THE’s composite teaching indicator.
Did Harvard’s teaching really deteriorate over 12 months? The most likely explanation is that there were relatively fewer American respondents in the THE survey this time, but one cannot be sure because there are four other statistics bundled into the indicator.
While British universities appeared to do well, French ones appeared to perform disastrously. The École Normale Supérieure recorded a substantial gain going from 78th to 54th place, but every other French institution in the rankings fell, sometimes by dozens of places.
École Polytechnique went from 61st place to 101st, Université Paris-Sud from 120th to 188th, the University of Strasbourg from the 201-225 band to 301-350, in every case because of a substantial fall in the citations indicator. If switching to Scopus was intended to help non-English speaking countries, it did not do France much good.
Meanwhile, the advance of Asia has apparently come to an end or gone into screeching reverse. Many Asian universities slipped down the ladder, although the top Chinese schools held their ground.
Some Japanese and Korean universities fell dozens of places. The University of Tokyo went from 23rd to 43rd place, largely because of a fall in the citations indicator from 74.7 points to 60.9, and Kyoto University fell from 59th to 88th, again with a drop in its score for citations.
Among the casualties was Tokyo Metropolitan University, which used to advertise its perfect score of 100 for citations on the THE website. This year, stripped of the citations for mega-papers in physics, its citation score dropped to a rather tepid 72.2 and its overall position fell from the 226-250 band to the 401-500 band.
The Korean flagships have also floundered. Seoul National University fell 35 places and the Korea Advanced Institute of Science and Technology 66, largely because of a decline in their scores for teaching and research. Pohang University of Science and Technology, or POSTECH, fell 50 places, losing points in all indicators except income from industry.
Turkish tumble
The most catastrophic fall was in Turkey. There were four Turkish universities in the top 200 last year. All of them have dropped out.
Several Turkish universities have contributed to the Large Hadron Collider project, with its multiple authors and multiple citations. Combined with their comparatively small output of research papers and the regional modification, this gave them artificially high scores for the citations indicator in 2014, but not this year.
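A small worked example, with invented numbers, makes the mechanism clear: when a university’s total output is modest, a handful of very heavily cited multi-author papers can dominate its average citation impact, and removing them collapses the score.

```python
# All figures are hypothetical, chosen only to illustrate the arithmetic;
# they are not drawn from THE's data or from any real university.
ordinary_papers = 180    # assumed number of ordinary papers
ordinary_impact = 0.6    # assumed normalised impact of an ordinary paper
mega_papers = 20         # assumed number of multi-author "mega-papers"
mega_impact = 30.0       # assumed normalised impact of a mega-paper

with_mega = (ordinary_papers * ordinary_impact + mega_papers * mega_impact) \
            / (ordinary_papers + mega_papers)
without_mega = ordinary_impact

print(f"average impact with mega-papers:    {with_mega:.2f}")    # 3.54
print(f"average impact without mega-papers: {without_mega:.2f}")  # 0.60
```

On these invented figures, excluding the mega-papers cuts the average impact by more than four fifths, which is broadly the kind of collapse the Turkish universities experienced.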
The worst case was Middle East Technical University, which held 85th place in 2014, helped by an outstanding score of 92 for citations and reasonable scores for the other indicators. This year it was in the 501-600 band, with reduced scores for everything except industry income and a very low score of 28.8 for citations.
The new rankings appear to have restored the privilege given to medical research. In the upper reaches we find St George’s, University of London, a medical school which, according to THE, is the world’s leading university for research impact; Charité – Universitätsmedizin Berlin, a teaching hospital affiliated to Humboldt University and the Free University of Berlin; and Oregon Health and Science University.
It also seems that THE's methodology continues to give an excessive advantage to small or specialised institutions such as Scuola Superiore Sant’Anna in Pisa, which may not be a truly independent university, the Copenhagen Business School, and Rush University in Chicago, the academic branch of a private hospital.
Altogether it is difficult to see how these rankings are credible. If, for example, the poor scores given to French and Japanese universities this year are valid, then how could the very different scores last year also be valid? It would have been better if THE had announced that they were starting afresh with new rankings and a completely new methodology.
So far these rankings appear to have had a good reception in the mainstream media, although we may eventually hear some negative reactions from independent experts and from Japan, Korea, France, Italy and the Middle East.
It is likely that there will be more changes next year and that the THE rankings will be even more interesting.
Richard Holmes is former associate professor at Universiti Teknologi MARA, Shah Alam, Malaysia, and produces the University Ranking Watch Blog.