GLOBAL: Leiden Ranking: Many ways to rate research

It looks as though a two-tier international university ranking system is emerging.

At the top we have the 'big three': Shanghai's Academic Ranking of World Universities, the QS World University Rankings and, since 2010, the Times Higher Education World University Rankings.

These receive massive attention from the media, are avidly followed by academics, students and other stakeholders and are often quoted in promotional literature. Graduation from a university included in these has even been proposed as a requirement for immigration.

Then we have the rankings by SCImago and Webometrics, both from Spain, the Performance Ranking of Scientific Papers for World Universities produced by the Higher Education Evaluation and Accreditation Council of Taiwan, and the Leiden Ranking, published by the Centre for Science and Technology Studies at Leiden University.

These rankings get less publicity but are technically very competent and in some ways more reliable than the better-known rankings.

The Leiden Ranking

The Leiden Ranking of 2011-12, which has just been released, will get little public attention because it does not declare an overall winner. But it is probably more informative about publications and citations than any other ranking.

These rankings have acquired an indirect influence in that they were a precursor of the citations indicator used by Thomson Reuters in the Times Higher Education rankings of 2010 and 2011.

Their key concept is the measurement of research impact through citations and the normalisation of data, which means that citations are benchmarked against the world average of citations for a field, for a year and for the type of publication (letter, review, article). They also rank universities for publications and collaboration.

Normalisation of data is an idea that has merit.

It takes into account the reality that an outstanding paper in literary studies or philosophy might get fewer citations than a routine piece of work in medicine or physics. It should therefore be given credit for exceeding the average for its discipline rather than being compared to papers in disciplines where citation is frequent.
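To make the idea concrete, here is a minimal Python sketch of how field normalisation of citations might work; the field labels, baseline figures and paper records are hypothetical, and the Leiden Ranking's actual calculation is considerably more refined.

```python
# Minimal sketch of field normalisation (illustrative only).
# Each paper's citations are divided by the world average for papers of the
# same field, publication year and document type; an institution's normalised
# score is the mean of those ratios, so 1.0 means "cited at the world average".

from statistics import mean

# Hypothetical world baselines: (field, year, doc_type) -> average citations
world_avg = {
    ("philosophy", 2008, "article"): 1.4,
    ("medicine", 2008, "article"): 11.2,
}

# Hypothetical papers attributed to one institution
papers = [
    {"field": "philosophy", "year": 2008, "type": "article", "citations": 5},
    {"field": "medicine", "year": 2008, "type": "article", "citations": 9},
]

def normalised_score(papers, world_avg):
    ratios = [
        p["citations"] / world_avg[(p["field"], p["year"], p["type"])]
        for p in papers
    ]
    return mean(ratios)

print(round(normalised_score(papers, world_avg), 2))
# The philosophy paper (5 citations against an expected 1.4) outscores the
# medicine paper (9 against an expected 11.2), despite having fewer raw citations.
```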

But normalisation also means that, as we get down to smaller and smaller numbers, there is a greater chance that statistical anomalies will produce odd results.

The Times Higher Education (THE) rankings have added a further twist, namely normalisation by region, which apparently means country or even part of a country.

According to Simon Pratt, in this year's THE ranking Hong Kong was separated from the rest of China, which meant that Hong Kong universities were compared with a much higher benchmark than in 2010; in consequence, the University of Tokyo replaced the University of Hong Kong as the top university in Asia.

The Leiden Ranking also has a threshold of 500 papers per year for inclusion and does not count self-citations. Its compilers would probably do well to consider also excluding citations from within the same journal: there are plenty of stories circulating about journal editors asking authors to cite papers from the journals they edit.
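As a rough illustration of what excluding self-citations involves, the toy Python sketch below drops any citation in which the citing and cited papers share an author; the author names are invented, and the actual author matching used by CWTS is far more sophisticated.

```python
# Minimal sketch of a self-citation filter (illustrative only).

# Hypothetical citation records: each links a citing paper to a cited paper,
# with the author sets of both.
citations = [
    {"citing_authors": {"Smith", "Lee"}, "cited_authors": {"Smith", "Tan"}},
    {"citing_authors": {"Wong"},         "cited_authors": {"Smith", "Tan"}},
    {"citing_authors": {"Tan", "Gomez"}, "cited_authors": {"Smith", "Tan"}},
]

def is_self_citation(record):
    # Treat a citation as a self-citation if the citing and cited papers
    # share at least one author.
    return bool(record["citing_authors"] & record["cited_authors"])

external = [c for c in citations if not is_self_citation(c)]
print(len(external))  # 1: only the citation by Wong is counted
```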

These rankings are extremely valuable in that they use several indicators and have added interactive features that allow us to see how many different ways there are of measuring research excellence and how very different results can be obtained by using different indicators.

Rankings that use only one measure of citations will have serious problems in justifying their methodology after looking at these.

Looking at total publications, which is called the 'P indicator', and unchecking three boxes (so that we count total rather than average publications, give collaborative publications full weight and include non-English publications), we find that the top five are: 1. Harvard; 2. Tokyo; 3. Toronto; 4. Michigan; and 5. the University of California, Los Angeles.

The inclusion of a Canadian university, a Japanese university and two US state schools in the top five suggests that, if we look at the sheer volume of research, the dominance of the Ivy League is beginning to fade. We also have the University of São Paulo in eighth place and University College London in 11th, ahead of Oxford and Cambridge.
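For readers unfamiliar with the difference between giving collaborative publications full weight and dividing them up among the participating institutions, the toy Python sketch below contrasts the two counting methods; the university names and paper records are invented, and the Leiden Ranking's own counting rules are more detailed.

```python
# Minimal sketch of full versus fractional counting of collaborative papers
# (illustrative only).
from collections import defaultdict

# Hypothetical papers, each listing the universities on the author byline
papers = [
    {"universities": ["Harvard", "Toronto"]},
    {"universities": ["Harvard"]},
    {"universities": ["Tokyo", "Harvard", "Toronto"]},
]

full = defaultdict(float)        # every participating university gets 1 per paper
fractional = defaultdict(float)  # each paper is divided among its participants

for p in papers:
    unis = p["universities"]
    for u in unis:
        full[u] += 1
        fractional[u] += 1 / len(unis)

print(dict(full))        # Harvard: 3.0, Toronto: 2.0, Tokyo: 1.0
print(dict(fractional))  # Harvard: ~1.83, Toronto: ~0.83, Tokyo: ~0.33
```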

Looking at citations rather than publications, when we check all three boxes (normalising for size, dividing up publications among collaborating universities and excluding non-English publications) to get the total citation score, we still have Harvard in first place, but the next four are completely different: the University of California, San Francisco (which is actually a medical school), MIT, the University of Massachusetts Medical School and the University of Texas at Dallas.

With normalised citation scores there is yet another picture. According to this measure, MIT replaces Harvard at the top, followed by Göttingen, Princeton, Rice and the University of California, Santa Barbara.

How did Göttingen get into second place for this indicator? It was because of a single paper by George M Sheldrick, "A short history of SHELX", published in Acta Crystallographica Section A in January 2008, which has been clocking up 14 citations a day since its publication.

The paper concerns the development of a suite of computer programs for crystal-structure determination. The abstract concludes with this sentence: "This paper could serve as a general literature citation when one or more of the open-source SHELX programs (and the Bruker AXS version SHELXTL) are employed in the course of a crystal-structure determination."

Another interesting component is the provision of indicators of collaboration.

The most predictable ranking is the one that measures long-distance collaboration in kilometres. First place goes to the University of Hawaii at Manoa. It will probably not be long before branch campuses start setting up on Easter Island.

So counting publications works to the advantage of Harvard and a few large institutions outside the US, while counting total citations benefits medical schools, which are excluded from the QS and THE rankings, or institutions that have them.

With normalisation, Harvard is dethroned and replaced by MIT. For collaboration, first place is held variously by the Australian National University, the London School of Hygiene and Tropical Medicine and the University of Hawaii.

The Leiden Ranking contains a vast amount of information, and rankers that use citations will have to pay very careful attention to it. The measurement of research impact has reached a high level of sophistication; the problem now is how to choose among the various indicators and how to combine them.

* Richard Holmes teaches at Universiti Teknologi MARA Malaysia, and is the author of the blog University Ranking Watch.