Plugging the employability gap in global HE rankings
We now have a range of rankings that include rigorous attempts to measure and compare university research output and quality. There are, of course, problems with these. They tend to privilege research in physics, engineering and medicine and to favour larger, older and wealthier universities and those in the Anglosphere.
Even so, browsing through the research and citation indicators of University Ranking by Academic Performance, published in Ankara, the National Taiwan University Rankings, Leiden Ranking, Shanghai Ranking’s Academic Ranking of World Universities and Nature Index can provide useful information about comparative trends in research and university performance.
But with regard to teaching and learning, the rankers have done much less well. The data about student admissions, progress or careers that is used for national rankings such as US News Best Colleges in America or the Guardian University Guide, or for global business school rankings, is currently not available for the majority of institutions.
This is unfortunate since, for most people, teaching, especially undergraduate teaching, is the main business of universities, which are judged largely by their ability to give graduates the chance of earning a decent living or, at the very least, of being considered for semi-skilled or white-collar employment. It is not easy to assess universities in this respect, and it is very hard to make transnational comparisons.
One workaround by rankers has been to compare available resources, such as faculty numbers or income. But there are problems here. Having a lot of money does not mean it will be spent wisely and employing a lot of faculty does not mean that they will teach effectively, if at all.
The problem is further compounded by reliance on data provided by institutions, which may not always be completely accurate, rather than from state or third-party sources.
QS, Round University Rankings and Times Higher Education (THE) include faculty-student ratios in their world rankings. THE also claims that institutional income and the number of doctoral students have something to do with a superior learning environment.
However, direct measurement of graduate employability is limited to just four global rankings: the Moscow Three University Missions Ranking (MosIUR), the Emerging/Trendence Global University Employability Ranking published in THE, the Center for World University Rankings (CWUR), now based in the UAE, and the QS Graduate Employability Rankings.
MosIUR reports that it uses measures of employability to construct its global rankings but has not released any indicator scores. Consequently, the ranking’s value for students or other stakeholders is minimal.
THE has published rankings based on a survey of international employers by Emerging, a French consulting firm. The survey is confined to an international group of corporate leaders and recruiters. Its relevance is therefore limited.
That basically leaves two rankings that make a serious and transparent attempt to go beyond surveys and provide additional insights into employability. The first is the Center for World University Rankings (CWUR), which has two relevant indicators. In keeping with its declared objectives, both of these are derived from public sources.
The Quality of Education indicator is measured by the number of alumni who have won major awards, medals and prizes. It is basically an expanded version of the Shanghai Ranking’s alumni criterion, based on Nobel and Fields awards, and includes about 500 universities. This indicator is heavily biased toward the West: even top Chinese universities such as Tsinghua and Peking are barely represented.
The other indicator, Alumni Employment, counts the number of alumni who have held CEO positions in the world’s top 2,000 companies. It covers a larger number of universities, about 1,400.
However, some universities that surely produce employable graduates, such as the University of California, Irvine, the Australian National University, University College London and Leiden University, are placed a long way down this indicator. This is a narrow view of graduate employability, and again one that is biased against Asian universities.
QS Graduate Employability Rankings
The QS Graduate Employability Rankings represent the most serious effort to conduct a broad assessment of graduate employability. There are five indicators. The Employer Reputation Survey, with a 30% weighting, is the largest of its kind, although, as with other data, size is not everything or even the most important thing.
A key feature of this indicator is that QS allows universities the option of proposing their own nominees, something that has received a lot of criticism, although it does allow the survey to reach sectors that otherwise might be ignored.
This indicator has five universities at the top with scores of 100. These are Cambridge, Oxford, Harvard, Massachusetts Institute of Technology (MIT) and Stanford. The University of California, Berkeley, one of the famous six ‘superbrand’ universities, has not taken part in these rankings since 2019.
The other indicators suggest that Chinese, Australian and Asian universities are beginning to catch up with North American institutions in undergraduate education as well as research and innovation. Partnership with Employers shows a strong representation from China, with Zhejiang and Tsinghua universities in second and third place after Stanford.
The Graduate Employment Rate indicator shows that this is an area where Asian and some European institutions are forging ahead of the Ivy League and large US public universities. Front runners include MGIMO University in Moscow, Politecnico di Milano, Taylor’s University in Malaysia and the Indian Institute of Technology Madras. For Employer Student Connections, Chinese universities led by Huazhong University of Science and Technology also perform extremely well.
For Alumni Outcomes, however, the traditional elite of Harvard, Stanford, Oxford, Pennsylvania and MIT dominate.
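As a rough illustration of how a composite ranking score is assembled from indicators like those above, the sketch below combines five indicator scores using a weighted sum. Only the 30% Employer Reputation weighting is stated in this article; the remaining weights and all the scores are illustrative assumptions, not QS’s published figures.

```python
# Sketch of a weighted composite ranking score.
# Only the 30% Employer Reputation weight is stated in the article;
# every other weight and all the scores below are illustrative assumptions.

WEIGHTS = {
    "employer_reputation": 0.30,            # stated in the article
    "alumni_outcomes": 0.25,                # assumed weight
    "partnerships_with_employers": 0.25,    # assumed weight
    "employer_student_connections": 0.10,   # assumed weight
    "graduate_employment_rate": 0.10,       # assumed weight
}


def composite_score(indicator_scores: dict[str, float]) -> float:
    """Weighted sum of normalised (0-100) indicator scores."""
    return sum(WEIGHTS[name] * indicator_scores[name] for name in WEIGHTS)


# A hypothetical university, scored 0-100 on each indicator
example = {
    "employer_reputation": 100.0,
    "alumni_outcomes": 90.0,
    "partnerships_with_employers": 80.0,
    "employer_student_connections": 70.0,
    "graduate_employment_rate": 60.0,
}

print(round(composite_score(example), 2))  # 85.5
```

The point of the sketch is simply that the choice of weights, not just the underlying data, drives the final ordering, which is why questions about indicator weighting matter.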
There are, of course, questions that might be asked about the weighting of indicators and the collection of data, but it seems that the overall scores for the QS rankings are telling a consistent and coherent story of the steady erosion of Western hegemony and the emergence of new centres of excellence in China and perhaps in Australia and parts of Europe.
Nonetheless, these rankings cover only a fraction of the world’s universities and it would seem that the comparison and assessment of teaching and learning still have a long way to go. The world of work and employment has been transformed by the pandemic and a rigorous and transparent assessment of university teaching and learning should be at the top of the rankers’ agenda.
Richard Holmes is editor of the University Ranking Watch blog.