
University rankings are innovating and improving

We welcome scrutiny of all university rankings and agree with some of the opinions put forward for discussion by Bahram Bekhradnia in the recent Higher Education Policy Institute (HEPI) report, International University Rankings: For good or ill?

Our Times Higher Education (THE) World University Rankings, which are based on an audited, widely consulted-on and openly published methodology, have thrived and improved for more than a decade precisely because of such interest and discussion. Since we began developing our methodology in 2004, its success in reflecting the mission of a modern, global, research-focused university has owed much to public discourse.

We also agree that more must be done to develop advanced methods for measuring factors such as teaching and outreach. Indeed, we are at the vanguard of such developments and are investing heavily in innovations such as the student experience and teaching-focused ranking of United States universities we created this year with The Wall Street Journal.

We are now working on a methodology for a teaching-focused ranking of Japanese universities and see this as an area of growth and continued investment in the future.

Some of Bekhradnia's criticism, such as his points about ordinal lists and the weighting of indicators, amounts to little more than an explanation of basic arithmetic. It is true that if you change a weighting, the ranking order changes; but the point is that our weightings have been developed in consultation with universities, governments and academics over more than a decade and, consequently, our rankings are uniquely valid.
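For readers who want to see that arithmetic spelled out, here is a minimal sketch with two hypothetical universities, made-up indicator scores and made-up weights (not our actual data or methodology), showing how re-weighting indicators can reorder a table:

```python
# Illustrative only: hypothetical institutions, scores and weights.
indicators = {
    "University A": {"teaching": 90.0, "research": 60.0},
    "University B": {"teaching": 70.0, "research": 85.0},
}

def overall(scores, weights):
    """Weighted sum of indicator scores."""
    return sum(scores[name] * weight for name, weight in weights.items())

def rank(weights):
    """Order the universities by their weighted overall score."""
    return sorted(indicators, key=lambda u: overall(indicators[u], weights), reverse=True)

# Emphasising teaching puts A first; emphasising research puts B first.
print(rank({"teaching": 0.7, "research": 0.3}))  # ['University A', 'University B']
print(rank({"teaching": 0.3, "research": 0.7}))  # ['University B', 'University A']
```

The order flips because each weighting scheme answers a different question; the judgement lies in choosing and justifying the weights, which is where the years of consultation matter.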

Unfortunately, while Bekhradnia did show us a near-final draft of the report for comment, some of his criticisms demonstrate that he has not researched our growing range of rankings, metrics and data tools in depth and has fundamentally misunderstood the current, dynamic global rankings landscape.

For example, we have invested heavily in creating an open user interface on our website that allows anyone to choose which dimension to rank universities by. This means we are publishing not a single ranking each year, but dozens, filtered by geography or broad dimensions such as teaching or international outlook.
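In spirit, choosing a dimension and a geographic filter works like the sketch below, which uses hypothetical records and field names purely for illustration rather than the site's actual data or interface:

```python
# Hypothetical records and field names, for illustration only.
universities = [
    {"name": "University A", "region": "Europe", "teaching": 82.1, "international_outlook": 91.4},
    {"name": "University B", "region": "Asia",   "teaching": 88.3, "international_outlook": 74.9},
    {"name": "University C", "region": "Europe", "teaching": 75.6, "international_outlook": 88.0},
]

def ranked_by(dimension, region=None):
    """Order university names by one dimension, optionally filtered by region."""
    pool = [u for u in universities if region is None or u["region"] == region]
    return [u["name"] for u in sorted(pool, key=lambda u: u[dimension], reverse=True)]

print(ranked_by("teaching"))                         # ['University B', 'University A', 'University C']
print(ranked_by("international_outlook", "Europe"))  # ['University A', 'University C']
```

Every choice of dimension and filter produces a different, equally legitimate list, which is why we describe the output as dozens of rankings rather than one.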

If you want evidence that our rankings provide students with data they can use, independent research last year found that one-third of the five million students who are internationally mobile each year use the THE World University Rankings to help select their universities.

Fundamental misunderstanding

There is a fundamental misunderstanding at the heart of Bekhradnia’s analysis, which we would like to clear up. He sees our rankings as an end in themselves, but they are in fact just one output from one of the world’s most sophisticated databases of higher education performance data, which we have built over the past two years and for which we are now collecting data for a third year.

Bekhradnia argues for different ways of presenting data. We agree. In addition to the multiple rankings available to students, we have invested heavily in software for universities that supports a highly flexible range of analyses, for example comparisons based on multiple measures presented in non-linear ways, such as radar graphs, precisely as Bekhradnia proposes.
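By way of illustration, a radar chart of this kind can be sketched in a few lines; the institutions, measures and scores below are made up and the plotting library is assumed, so this is not our software or our data:

```python
# Minimal radar (spider) chart comparing two hypothetical institutions.
import numpy as np
import matplotlib.pyplot as plt

measures = ["Teaching", "Research", "Citations", "Industry income", "International outlook"]
scores = {
    "University A": [82, 60, 70, 55, 90],
    "University B": [70, 85, 88, 62, 75],
}

angles = np.linspace(0, 2 * np.pi, len(measures), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close each polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for name, values in scores.items():
    closed = values + values[:1]
    ax.plot(angles, closed, label=name)
    ax.fill(angles, closed, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(measures)
ax.legend()
plt.show()
```

The point of such a presentation is that no single axis dominates: each measure is shown on its own spoke, so strengths and weaknesses can be compared without collapsing them into one ordinal score.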

We continue to work with many of the world’s leading institutions to create analytical tools that provide the actionable insights their institutional research teams need.

We are more in agreement than disagreement with much of what Bekhradnia says in this paper. We hope constructive criticism such as this can help raise the scrutiny and quality of rankings, because, as he admits, university rankings will continue to see significant growth as their increasing quality continues to drive their utility for universities, students and policy-makers.

Where we take issue with Bekhradnia is with his final contention that governments should ignore rankings. This does not follow at all from his own analysis, nor does it bear scrutiny when you look at the strategic analysis that higher education data now affords. There is more work to be done in this area, but the current picture is exciting and should provide optimism for the global higher education sector.

Phil Baty is editor of the Times Higher Education World University Rankings.