How should rankings assess teaching and learning?
Times Higher Education (THE) has been at the forefront of international rankings. It has produced Asian, Asia-Pacific, Middle East and North African, Latin American, African, young, golden age, millennial, Generation X and reputation rankings, and has published some of its indicators as stand-alone lists.
Teaching vs research
Rankers have become quite good at measuring scientific research output and quality. There are several good rankings, such as those published by CWTS Leiden, Middle East Technical University and National Taiwan University that measure research in various ways.
The well-known global rankings, the Shanghai Academic Ranking of World Universities, THE World University Rankings and Quacquarelli Symonds’ QS World University Rankings, are either entirely or largely research-based and are aimed at researchers, administrators and policy-makers.
But recently THE has been marketing rankings that claim to be teaching-centred with an appeal to undergraduate students and other stakeholders.
So far there is no international equivalent to the US News or the Guardian national rankings in the United Kingdom which, for better or worse, are consulted by thousands of prospective students and their advisors and rank universities according to the quality of students or graduates.
The QS world rankings include a survey of employers, but it is far from representative. The Center for World University Rankings tables, now published from the UAE, cover the achievements of alumni, but few universities have even one alumnus with a qualifying award.
THE and the Russian Round University Ranking (RUR) go into income, faculty resources, postgraduate teaching reputation and proportion of doctoral students, but these may have little direct impact on the learning process.
The ranking that currently does the most to assess teaching and learning is U-Multirank, published by a consortium of Dutch, German and Spanish institutions. This ranking has presentation issues and its coverage is patchy, especially outside Europe. It does, however, provide useful insights into questions of learning and teaching based partly on a global student survey.
THE used to present themselves as doorkeepers to a rather exclusive club catering for a select group of well-funded research universities with a strong international presence. But lately they have been looking for ways of ranking universities well beyond the borders of the world elite.
Pillars and weightings
The THE Europe Teaching Rankings are the third in a series of teaching-oriented rankings, following the US College Rankings and the Japan University Rankings.
There are four pillars: Resources, Engagement, Outcomes and Environment. The weighting and composition of the pillars vary from ranking to ranking. For example, in the US college rankings the Environment pillar includes international students, student diversity, faculty diversity and student inclusion.
In the Japan rankings it consists of international students, international staff, international exchange programmes and courses in a foreign language. In the Europe rankings, it is gender ratios among faculty and students and nothing else.
If these rankings are aimed at prospective students, their value seems limited. The THE practice of bundling indicators together and withholding the 13 separate scores is likely to prove frustrating to any student or stakeholder looking for detailed information.
If a university has a high score for Resources, what does that mean? Does it have a lot of staff, which would improve the staff-to-student score? Does it have relatively few students? Do its staff write a lot of papers, or does it have relatively few staff, either of which would improve the papers-per-staff score? Does it do well on the two survey questions on which the quality-of-service metric is based? Or is it some combination of these?
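The ambiguity can be made concrete with a small sketch. Suppose, purely hypothetically, that a bundled Resources score is a weighted average of three sub-indicators; the sub-weights and university profiles below are invented for illustration, since THE publishes only the pillar totals and not the 13 component scores. Two very different institutions can then arrive at an identical pillar score:

```python
# Hypothetical sub-indicator weights: THE does not disclose these,
# so the numbers here are illustrative only.
WEIGHTS = {"staff_per_student": 0.5, "papers_per_staff": 0.3, "service_survey": 0.2}

def resources_score(sub_scores: dict) -> float:
    """Weighted average of normalised (0-100) sub-indicator scores."""
    return sum(WEIGHTS[k] * sub_scores[k] for k in WEIGHTS)

# A well-staffed teaching institution with modest research output...
uni_a = {"staff_per_student": 90, "papers_per_staff": 40, "service_survey": 60}
# ...and a research-heavy one with strong survey responses.
uni_b = {"staff_per_student": 50, "papers_per_staff": 90, "service_survey": 85}

print(resources_score(uni_a))  # 69.0
print(resources_score(uni_b))  # 69.0
```

A prospective student seeing only the bundled 69.0 cannot tell which of these very different profiles they are looking at, which is exactly the complaint above.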
These rankings also still show the influence of the research orientation of THE's earlier products. A weighting of 7.5% goes to papers per staff, and academic reputation gets another 10%. The latter draws on the world rankings' survey question about the best universities for postgraduate teaching, which correlates very highly with the question about the best universities for research.
This pilot exercise is not likely to be very useful for students, who will probably find U-Multirank and the various national rankings more informative. But it may well provide publicity fodder for the research universities of the Russell Group and their continental counterparts, plus some gifts for a few student-centred places that get credit in the Engagement indicators or are lucky enough to have a 50:50 gender ratio.
The top university overall in the THE Europe Teaching Rankings is the University of Oxford, which is also first for Outcomes. The University of Cambridge leads for Resources, Comillas Pontifical University in Spain for Engagement and Hanze University of Applied Sciences in the Netherlands for Environment.
The new rankings also have limited coverage, including only universities in Western Europe. Scandinavia, Belgium and the whole of Eastern Europe from Austria to the Urals are excluded, probably because they failed to get enough students to respond to the survey. The Ecole Polytechnique, Erasmus University Rotterdam and the universities of Berlin are also missing.
The biggest problem, though, is that these rankings say little about the academic ability of students or the employability of graduates apart from the subjective responses to survey questions or possibly the graduation rate, which gets a 5% weighting. The ranking of universities for teaching and learning is still very much unmapped territory.
Richard Holmes is editor of University Ranking Watch blog.