The renaissance of national rankings

There can be no question about it: university rankings evoke emotions. Phil Baty, editor of the Times Higher Education World University Rankings, often starts his presentations by quoting a high-ranking Chinese official who once called him the “education secretary of the world”. Modesty aside, this illustrates how seriously academic rankings are taken.

The appearance of global rankings a decade or so ago has put university rankings in the spotlight. Some have started to see in the global rankings not so much an expression of competition between universities as yet another form of rivalry between nations.

Presidents and kings in France, Saudi Arabia, Russia and a number of other countries have suddenly expressed readiness to spend billions to help universities in their countries advance up the rankings.

The global rush to outdo others in the world-class university business has overshadowed a very important, perhaps the most important and meaningful, type of ranking – national university rankings. They may have been criticised now and again, but national rankings have been recognised and appreciated for some time.

The Washington Post, analysing the phenomenon of the US News & World Report Best Colleges Rankings produced by Bob Morse since 1983, wrote: "Bob Morse is a wonk, a number-cruncher who works in a messy office at a struggling publishing company in Georgetown. He’s also one of the most powerful wonks in the country, wielding the kind of power that elicits enmity and causes angst."

He “has endured for two decades as chief arbiter of higher education’s elite. No one can stake a credible claim to academic aristocracy without a berth on the first page of a US News list. He is to colleges what Robert Parker is to wine."

And: "The annual release of the rankings is a marquee event in higher education."

Three new rankings a year

No matter how much excitement there may be around the global rankings, to a young American looking for a place to study – or a young Chinese who wants to study in the US – the annual ranking by US News & World Report is by far more relevant than all the global rankings put together.

The same goes for national rankings published in the United Kingdom such as The Guardian University League Table, The Times Good University Guide, or The Complete University Guide.

For years they have been the basic tool for British students finishing secondary school and making their higher education choices. Bernard Kingston wrote about this in an article published by University World News.

Rankings published in Germany or Poland, the CHE University Ranking and the Perspektywy University Ranking respectively, enjoy an equally good reputation. The two rankings were the first national university rankings to pass the international audit and receive the “IREG Approved” quality certificate.

The number of national rankings has been growing at an impressive rate. Thirty-two new national rankings have been published since 2005 – on average, three new national university rankings every year.

These rankings, based on a solid methodology which takes into account the cultural and legal context of a country’s higher education system, have increasingly been seen as a tool for helping domestic and international students select which institution to study at.

In this group we find the Best Chinese Universities Rankings, or BCUR, which assesses 657 universities in China. It is prepared by a team from the Shanghai Ranking Consultancy under the leadership of Professor Nian Cai Liu.

The renaissance of national rankings inspired the IREG Observatory on Academic Ranking and Excellence, an international association of ranking organisations and universities, to take a closer look at the phenomenon. As a result we now have a unique picture of national rankings in the form of the IREG Inventory of National Rankings. It can also serve as a useful source of information and an analytical tool.

The Inventory has been prepared by the Perspektywy Education Foundation, a member of the IREG Observatory, with several years of experience in publishing and refining national university rankings in Poland.

Who publishes national rankings?

Making a list of 57 rankings in 35 countries was no easy task. A large number of rankings and ranking-like publications had to be reviewed. Only those rankings that were published at least twice in or after 2010 were included. The Inventory does not take into account the so-called "country rankings" that are extracted directly from international rankings.

In 14 countries more than one ranking is published on a regular basis, including the US (five rankings), Poland (four), Mexico (four) and the UK (three). The remaining countries have one ranking each. The authors of the Inventory were unable to identify any current national rankings in Africa or Australia.

The majority of national rankings (88%) are published annually. Only 5% are published on a semi-annual or biannual cycle. The majority of rankings (58%) are prepared and published by commercial media companies. Only in some cases (7%) have they been prepared and published by a government organisation.

It is unfortunate that the majority of these rankings (58%) are published only in the local language, making them difficult for prospective international students to use. This obstacle, however, can easily be overcome.

The overwhelming majority of national rankings (95%) are aimed at prospective students and their parents. Most of them assess institutions as a whole (78%), but many (59%) also rank by field (33%) or by subject (31%).

Two areas dominate the methodology: teaching (89%) and research (82%), but also important are reputation (62%), internationalisation (44%) and employability (42%). Most of the rankings are based on three to five criteria (70%).

Quality assurance procedures are usually applied to the ranking process and contribute to the quality of these rankings. Among the analysed rankings, consultation is the most widely used tool. An advisory board with broad competencies is another quality assurance tool, used by 38% of rankings.

To summarise, the IREG Inventory is a tool that can help all those interested to access valuable information on national higher education systems in at least 35 countries. The authors of the Inventory promise that it will be systematically updated.

High quality rankings

With facts and figures on national rankings now available in one place, let us ask what makes these rankings popular and what makes their numbers grow.

There are two possible answers. The first is of a “technical” nature. The availability of new, reliable data useful for ranking purposes makes possible significant advances in methodology. The competition between the two main data providers, Scopus and Web of Science, has contributed to the increase of available data and analytical tools.

Easier access to data allows for a radical increase in indicators that can be used both in national and international rankings. Additionally, a number of countries have improved their collection of data on higher education, making the construction of rankings easier and less costly.

As the methodologies have become more sophisticated and data has become more available, the scenario depicted by Ulrich Teichler in the book University Rankings: Theoretical basis, methodology and impacts on global higher education as a "turn towards high-quality rankings” may be coming true.

The other answer has to do with demand and supply. Any product can only develop when there is a genuine need for it. The present context seems to particularly favour national academic rankings.

Who needs national rankings?

The primary users of national rankings are prospective students and their parents. These two groups, as Bernard Kingston points out in his article quoted earlier, are mainly interested in institutions that are outside the top 10.

He says: "This interest tends to be concentrated in the levels below the UK Top Ten. It is a virtue of the league tables’ methodologies that there is very little movement at all in the leading 10 institutions. They remain more or less the same year on year, perhaps with a minor shuffling of places, but with no great shocks."

National rankings, and in particular their coverage of middle-range institutions, are of great importance to international students. When making a decision to study abroad, a prospective student thinks first in terms of country and language. For most of these students the choice of a specific university is secondary; they weigh country, institution and costs.

An average prospective student is not looking to enrol in a world-class research university. He or she will try to find an affordable institution that offers a good level of education, that’s all. In the era of massification of higher education, the middle group of students dominate.

To the majority of international students coming to study in the US, UK or Australia, the fact that they will study and master English at the same time as completing their studies is often more important than the reputation of the university they will attend.

The Ukrainians who come to study in Poland value most the fact that they will study in a European Union country and get a European education. They do not care that much if the institution of their choice ranks high globally, but they check with great interest the ranking of universities in Poland. They try to enrol in institutions in the middle of the pack since the top universities charge fees they cannot afford.

The interest of international students in national rankings can be described as the “internationalisation of national rankings”. I am convinced that this phenomenon will only grow.

We should also bear in mind that prospective students show interest in rankings only once or twice in their life – when they make a choice about where to study for themselves or their children. This is why candidates for undergraduate studies look for classic league tables with an established set of criteria. They are not ranking analysts; they are ranking consumers.

Rankings by subject

A further advantage of national rankings is that they contain rankings by subject, and they cover more subjects than their international counterparts. Here are some examples:
  • The CHE University Ranking from Germany, since it began in 1998, has been published as a ranking by subject only. Its newest edition includes 36 fields of study.
  • The US News & World Report "Best Colleges" rankings, published since 1983, and "Best Graduate School" rankings, published since 1987, cover over 100 fields and specific programmes.
  • The Perspektywy University Ranking from Poland has been publishing rankings by subject since 2000. The current edition covers 43 subjects.
  • The Complete University Guide (UK), published since 2007, consists of 67 rankings by subject.
  • The Folha University Ranking, published since 2012 in Brazil, includes rankings in 40 subjects.
To prospective students, including international students, rankings by subject are of greatest interest since they provide the concrete information they seek, and national rankings are the best source of such information.

A look ahead

National rankings are being prepared in Egypt and in India. There will also soon be new national rankings in Russia.

At the NAFSA: Association of International Educators 2015 Conference in Boston there was a session dedicated specifically to the subject of national rankings, confirming a growing interest in these rankings.

At the session I chaired, “How National Rankings Can Serve as a Tool for Finding Partners Abroad”, participants discussed the different ways national rankings can effectively be used to facilitate contacts between institutions in different countries. Due to their comprehensive and inclusive character, national rankings are an excellent source of detailed information on potential foreign partners.

National rankings also allow rectors or university presidents, as well as politicians, to monitor progress in the implementation of reforms. International rankings are of little help here as they only take into account a few hundred universities worldwide (the magic 500 group).

Theoretically, more institutions could be included in global rankings, but differentiation between institutions would then drop radically. Feedback also comes more slowly in the case of international rankings.

The only honest advice one can give an ambitious rector or president who wants to monitor how his or her university performs is: “Pay attention to national rankings. Only by improving your position in national rankings can your university improve its standing in international rankings!”

The future of national rankings is closely linked to the demand coming from different groups of stakeholders: prospective students (including international ones), the academic community, employers, the authorities responsible for higher education and politicians in general.

The latter treat rankings as a source of up-to-date information on the state of higher education institutions in their country. Interest in academic rankings is growing, rankings are being published in more countries and their quality is improving, so national rankings seem to be heading towards a promising future.

Waldemar Siwinski is vice-president, IREG Observatory on Academic Ranking and Excellence.