GLOBAL: US lead slips in world's top 100 universities

American universities continued to lead the latest Academic Ranking of World Universities, but US dominance of the global top 100 list compiled by China's Shanghai Jiao Tong University slipped this year, to 54 institutions against 67 in 2009. Harvard clinched the top slot, as it has since the ranking was first published in 2003.

The University of California, Berkeley, leapfrogged Stanford into second place, while MIT pipped Cambridge to fourth place, pushing the UK university - one of only two non-US institutions in the top 10 - down to fifth. Next came the California Institute of Technology, Princeton, Columbia and Chicago. Oxford retained 10th place for the fifth year in a row.

Although the Shanghai ranking changes little year on year, huge demand to view the 2010 top 500 list crashed the university's web server when it was published on Friday. There were still server problems over the weekend.

Richard Holmes, a lecturer at Universiti Teknologi MARA in Malaysia and author of the influential University Ranking Watch blog, compared performance on the stable Shanghai ranking over the six years from 2004 to 2010 and found "some noticeable changes".

"Cambridge and Oxford have each slipped a couple of places while Imperial College and University College London have moved up a bit, although not as high as their implausible positions in THE-QS. Tokyo has slipped from 14th to 20th and Kyoto from 21st to 24th. The leading Australian universities have also fallen."

Some 1,200 universities are ranked each year and the top 500 are included in the published Shanghai list. Holmes' six-year comparison also noted significant national trends.

"India and Russia have stagnated with two institutions apiece in the top 500 in 2004 and 2010," Holmes commented on University Ranking Watch yesterday.

"The rising stars for scientific research are China (nine in 2004 and 20 in 2010), South Korea (eight in 2004 and 10 in 2010), Brazil (four in 2004 and six in 2010) and the Middle East (none in 2004 and four from Saudi Arabia, Turkey and Iran in 2010)."

Universities in 16 countries made the coveted top 100: 11 nations in Europe, plus the US and Canada, Japan, Australia and Israel.

In the top 100 were 54 American universities, 11 from the UK, five each from Japan and Germany and four from Canada. France, Switzerland, Sweden and Australia each had three universities in the 2010 top 100, Denmark and the Netherlands had two each, and there was one university each from Belgium, Israel, Finland, Norway and Russia.

Although China now has more universities in the top 500 than before, it had none in the top 100; nor did India, Africa or South America. The Middle East had only one institution in the top 100 - the Hebrew University of Jerusalem, which slid from 64 in 2009 to 72 this year.

Institutions that shot up the list from 2009 to this year were the University of Melbourne, which rose to 62 from 75; three Swedish universities - the Karolinska Institute to 42 from 50, Uppsala to 66 from 76, and Stockholm to 79 from 88; and Arizona State University, Tempe, which was up to 81 from 94.

Dramatic place changes - a characteristic of other rankings in the past - are normally attributable to unforeseen consequences of changes in methodology. One of the merits of the Shanghai ranking is its stability due to its use of historic, mainly research-related data.

Criteria include alumni winning Nobel Prizes and Fields Medals (10%), staff winning Nobel Prizes and Fields Medals (20%), highly cited researchers in 21 broad subject categories (20%), articles published in Nature and Science (20%), articles indexed in the Science Citation Index-Expanded and the Social Sciences Citation Index (20%), and institutional per capita academic performance, calculated from the indicators above (10%).
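For readers who want to see how those weights combine, here is a minimal Python sketch of the weighted sum. The indicator names, the dictionary layout and the sample scores are invented for illustration; the actual ARWU methodology also rescales each indicator so that the highest-scoring institution receives 100 before the weights are applied.

```python
# Minimal sketch of the weighted scoring described above.
# Indicator keys and sample scores are hypothetical.

WEIGHTS = {
    "alumni": 0.10,  # alumni winning Nobel Prizes and Fields Medals
    "award":  0.20,  # staff winning Nobel Prizes and Fields Medals
    "hici":   0.20,  # highly cited researchers
    "n_s":    0.20,  # articles published in Nature and Science
    "pub":    0.20,  # articles indexed in SCIE and SSCI
    "pcp":    0.10,  # per capita academic performance
}

def composite_score(indicators: dict) -> float:
    """Return the weighted sum of six indicator scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

# Hypothetical institution: strong publication record, weaker prize record.
example = {"alumni": 70.0, "award": 80.0, "hici": 65.0,
           "n_s": 72.0, "pub": 90.0, "pcp": 60.0}
print(round(composite_score(example), 1))  # 74.4
```

Because the weights sum to one, an institution scoring 100 on every indicator would receive the maximum composite score of 100, and prize-related measures together account for 30% of the total.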

While the initial purpose of the rankings was said to be to determine the global standing of China's top universities, the annual publication has attracted considerable attention from universities, governments and the media worldwide.

In 2009 a study written from the perspective of specialists in multiple criteria decision analysis examined how the Shanghai ranking worked and concluded that the criteria used were not relevant, that the aggregation methodology was plagued by major problems and that the exercise as a whole paid insufficient attention to fundamental structuring issues.

The Shanghai compilers have freely acknowledged that the quality of universities cannot be measured precisely by mere numbers, and that any ranking can be controversial. Rankings should be used with caution and their methodologies must be understood clearly before reporting or using the results, they have warned.

This warning is often disregarded by both the media and universities, which disingenuously use the ranking's findings for their own ends while damning the exercise with faint praise.

Still, said Richard Holmes on University Ranking Watch yesterday, as such exercises go, the methodological stability of the Shanghai ranking - in contrast to the wild fluctuations of the former THE-QS rankings - means that "over the long run it is more likely to reveal real and significant trends".