GLOBAL: Crucial to measure teaching in rankings
University ranking season is over for 2010. Students, faculty, university administrators and policy-makers are now picking over the contrasting methodologies and contradictory results of an ever-growing array of systems, to try to draw meaningful conclusions.
I can make that task easier by suggesting that for those who want a comprehensive picture of a world-class university, there is only one global system that really counts - the Times Higher Education World University Rankings, powered by data from Thomson Reuters.
Of course, there is some excellent work by other ranking providers. Shanghai Jiao Tong University's Academic Ranking of World Universities is objective, stable and useful - but only if you want a narrow picture of research power. Its six indicators are restricted purely to research, almost exclusively in science.
The Higher Education Evaluation and Accreditation Council of Taiwan's table is also impressive, but it is based solely on journal articles. The Spanish Webometrics Ranking of World Universities also has clear value - but only for universities seeking to monitor their global visibility on the internet, an ever more important issue in a world where brand really matters.
All these and other tables provide useful information, but there's only one world ranking that examines performance across all of a university's core missions: pushing the boundaries of understanding and innovation with world-class research; sharing expertise with the wider world through 'knowledge transfer'; working in an international environment; and, crucially, providing a rich and enriching teaching environment for its undergraduate and postgraduate student body.
Times Higher Education magazine built on its six years of experience in the global ranking game, and on its 40 years of experience in reporting on higher education, to publish a new set of tables on 16 September 2010 - with a radically revised methodology.
The tables were the result of a global survey of user needs and 10 months of open consultation, and were devised with expert input from more than 50 leading figures from 15 countries representing every continent. They use 13 separate indicators - more than any other global system - to take a holistic look at the whole institution.
The Times Higher Education World University Rankings place the most weight on a range of research indicators - we think this is the correct approach in a world where governments are investing heavily in developing the knowledge economy and seeking answers to global challenges such as climate change and food security.
We look at research in a number of different ways, examining research reputation, income and research volume (through publication in leading academic journals indexed by Thomson Reuters). But we give the highest weighting to an indicator of 'research influence', measured by the number of times a university's published research is cited by academics around the globe.
We looked at more than 25 million citations over a five-year period from more than five million articles. All the data were normalised to reflect variations in citation volume between different subject areas.
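The idea behind subject normalisation can be illustrated with a small sketch. All field names and figures below are invented for illustration; the actual Thomson Reuters calculation benchmarks each paper against world averages for its subject, publication year and document type. The principle is the same: each paper's citations are divided by the typical count for its field, so a modestly cited mathematics paper can outscore a heavily cited medical paper.

```python
# Illustrative sketch of field-normalised citation impact.
# All numbers here are hypothetical, not Thomson Reuters data.

# Hypothetical world-average citations per paper, by field
world_average = {"clinical medicine": 18.0, "mathematics": 3.0}

# Hypothetical papers from one university: (field, citations received)
papers = [
    ("clinical medicine", 27),  # 27 / 18 = 1.5x the field norm
    ("mathematics", 6),         # 6 / 3  = 2.0x the field norm
    ("mathematics", 3),         # 3 / 3  = 1.0x the field norm
]

# Normalise each paper against its field baseline, then average
normalised = [cites / world_average[field] for field, cites in papers]
impact = sum(normalised) / len(normalised)

print(round(impact, 2))  # 1.5 -- above the world benchmark of 1.0
```

Without this step, a raw citation count would simply reward universities strong in high-citation fields such as medicine over those strong in low-citation fields such as mathematics.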
This indicator has proved controversial, as it has shaken up the established order, giving high scores to some smaller institutions with clear pockets of research excellence, often at the expense of the larger research-intensive universities.
We make no apology for recognising quality over quantity, but we concede that our decision to openly include in the tables the two or three extreme statistical outliers, in the interests of transparency, has provided some fuel for criticism - and some food for thought for next year's table.
We judge knowledge transfer, at present, with just one indicator - research income earned from industry - and we plan to enhance this category with other indicators. Internationalisation is recognised through data on the proportion of international staff and students attracted to each institution.
The flagship, and the most dramatic, innovation is a new set of five indicators that give proper credit to the role of teaching in universities, with a collective weighting of 30%.
But let's get one thing straight: we are not measuring teaching 'quality'. There are no recognised, globally comparative data on teaching outputs at present. What Times Higher Education does is look at the teaching 'environment' - to give a sense of the kind of learning environment that students are likely to find themselves in.
The key indicator for this category draws on the results of a reputational survey on teaching. Thomson Reuters carried out its Academic Reputation Survey - a worldwide, invitation-only poll of 13,388 experienced scholars, statistically representative of global subject mix and geography - in spring 2010.
It examined the perceived prestige of institutions in both research and teaching. Respondents were asked only to pass judgement within their narrow area of expertise, and we asked them 'action-based' questions to elicit more meaningful responses, such as: "where would you send your best graduates for the most stimulating postgraduate learning environment?"
We also include a staff-to-student ratio. This is admittedly a relatively crude proxy for teaching quality, hinting at the level of personal attention a student may receive from an institution's faculty, and it receives a relatively low weighting of just 4.5%.
We also look at the ratio of PhD to bachelor degrees, to give a sense of how knowledge-intensive the environment is, as well as the number of PhDs awarded, scaled for size, to give a sense of how committed an institution is to nurturing the next generation of academics and providing strong supervision.
The last of our teaching indicators is a simple measure of institutional income scaled against academic staff numbers. This figure, adjusted for purchasing-power parity so that all nations compete on a level playing field, gives a broad sense of the general infrastructure and facilities available.
Our efforts to recognise the importance of teaching have been praised by Philip Altbach, director of the Center for International Higher Education at Boston College in the US. In a recent article, he noted that while there are no global measures of teaching quality, "the new Times Higher Education Rankings have recognised the importance of teaching".
He said that while there are obvious limitations to the proxies, "at least Times Higher Education has recognised the importance of the issue...Times Higher Education gets an A grade for effort, having tried to include the main university functions - research, teaching, links with industry, and internationalisation".
Responses to our new tables have been largely excellent. I will not pretend that there has been no criticism (notably from vice-chancellors whose institutions have taken the biggest hits from our new methodology), but most comments have been positive.
David Willetts, the UK government minister for universities and science, congratulated Times Higher Education for revising the methodology, and Steve Smith, who represents all UK vice-chancellors as president of Universities UK, said that the new methodology "bolstered confidence in the evaluation method".
David Naylor, president of the University of Toronto, summed things up well. He recognised that Times Higher Education "consulted widely to pinpoint weaknesses in other ranking systems and in [our] previous approach".
He said: "They brought in a new partner with recognised expertise in data gathering and analysis. And they also sought peer opinions on the education and learning environment at scores of universities. These are welcome developments."
We will continue to engage with our critics, and to take expert advice on further methodological modifications and innovations. But we are proud of the new and improved Times Higher Education World University Rankings.
* Phil Baty is editor, Times Higher Education World University Rankings