The new battlefront in university subject rankings

The Global Ranking of Academic Subjects, produced by the same team that compiles the Academic Ranking of World Universities, gives 1,410 universities from 80 countries the chance to boast about their academic successes.
This subject ranking is radically different from any other produced, notably the widely used QS subject rankings, which cover a similar number of subjects and institutions. To start with, the Shanghai ranking is aimed at the postgraduate research training market as well as at faculty members seeking greener pastures at higher-ranked institutions.
The current ranking comprises 52 subjects (compared to 12 in 2016) across five broad faculty areas and is dominated by engineering, with 22 listings, followed by the social sciences with 14.
There are five indicators, four of which draw data from Clarivate’s InCites database. The indicator that is not bibliometric-based – awards – draws data from a survey, the Shanghai Ranking’s Academic Excellence Survey. There is no need to get too excited about this survey or worried about influencing the likely outcomes because Shanghai Ranking is not interested in the opinions of academics if their alma mater is not included in the top 100.
In distinct contrast to the Academic Ranking of World Universities (500 institutions listed from 45 countries), the subject ranking offers the opportunity for institutions to highlight areas where they excel. In this case, it is about research output, impact and prestige.
The wider usability of this ranking is limited, and somewhat compromised by the construction of its subject classification. However, let's not get too distracted by these issues, as they are likely to be fine-tuned to entice a wider audience.
Which countries stand out?
At first glance, the United States is the country that appears most often (3,857 times, topping 32 of the 52 subjects), followed by China with 1,289 instances and the United Kingdom with 1,168 listings.
While the US has 257 institutions included and dominates the top 100, the interest in the rankings lies not so much in the top 100, as these are well-resourced institutions with big endowments (or with the ability to draw funds).
The worth of the subject rankings resides in the shifts occurring as institutions vie for higher standing (those within the 101-300 band in any of the ranking schemas). There are also institutions in the 301-500 range, particularly those from upper- and middle-income economies, which benefit from a higher standing in the rankings.
The other subject and faculty rankings from Leiden, QS and Times Higher Education also highlight this same pattern.
Institutions from the US, the UK, Germany, Australia, Canada, Italy and France are well represented in the Shanghai subject ranking. However, once the proportion of subjects ranked per institution is considered, their overall standing diminishes somewhat. This is partially explained by the extremely competitive nature of their national systems and their desire to feature in world university rankings.
Further, universities from various national systems such as Denmark, the Netherlands, Belgium and Switzerland are also well represented among the top 300 across the various subject areas.
What these results suggest is that over the next few years we will see an uplift in the overall standing of universities from China, Singapore, Malaysia and other Asian countries. The extent to which universities from Japan and South Korea remain competitive depends on their ability to stay abreast of the top Asian universities.
Comparison with other rankings
No two ranking schemas bear much similarity; every ranking seeks to be different from the others. Institutions use results from rankings to promote areas where they excel or perform best. Put simply, any new distinct ranking is yet another marketing tool.
The Shanghai ranking by subject resembles QS subject rankings because of the number of subject areas ranked (52 compared to QS’ 46 in 2017), and is based on four to five indicators per subject. While the QS rankings draw considerably from opinion surveys and are influential among international students, Shanghai rankings rely considerably on bibliometric data and focus on research endeavours.
At first sight, QS and Shanghai appear to be in competition because of their volume, detail and coverage, but in fact they complement each other, each offering a different perspective and new possibilities for institutional advancement and benchmarking.
The Shanghai rankings and the Leiden Ranking, which aims to measure the performance of the world's most research-intensive universities, rely entirely on bibliometric data. Both draw on data from Clarivate. Leiden produces tables across five main fields of science and assesses the performance of more than 900 universities globally.
Last year Times Higher Education released rankings in 31 subject areas for the top 100 institutions, drawing data from opinion surveys, institutional input and bibliometric information. Both Shanghai and Times Higher Education emphasise the performance of research-intensive, well-endowed and elite institutions. And, again, both are complementary.
Suffocation by ranking
With rankings fever in its 15th year, there is no sign of it cooling. The release of Shanghai Ranking's academic subject ranking opens up a new front, one likely to result in the increased commodification and consumption of rankings data and a rise in the use of consultancy services. After all, we now live in the age of big data, shifting paradigms and the remaking of a new world order.
Today’s idea of the university is being altered by the rise of performance measurement regimes, shifting geopolitics and a collective sense of insecurity and uncertainty.
Angel Calderon is principal advisor, planning and research, at RMIT University, Australia. He is a rankings expert and a Latin American specialist. He is a member of the advisory board to the QS World University Rankings.