GLOBAL

Nineteen nations in new ranking’s global top 100

An assistant professor at King Abdulaziz University in Jeddah, Dr Nadim Mahassen, has just published an unusual ranking of world universities.

Mahassen’s system places 57 US universities in the top 100, then drops rapidly to England and Japan in equal second place with six each, France with five, Canada, Israel and Switzerland with four each, Australia and Germany with two each, and another 10 countries with one apiece; among the last is Scotland, whose sole entry, the University of Edinburgh, gets a separate mention.

The US also takes eight of the top 10 places, while England has Oxford and Cambridge in third and fifth positions respectively. That list runs: Harvard, Stanford, Oxford, Massachusetts Institute of Technology, Cambridge, Columbia, Berkeley, Princeton, Chicago and Yale.

The Canadian-born mathematician produces the ranking through his independent Center for World University Rankings.

He claims that the centre publishes the only global university performance tables that measure the quality of education and training of students as well as the prestige of faculty members and the quality of their research “without relying on surveys and university data submissions”.

Unlike the three main ranking systems criticised in a University World News report last week by Dr Kaycheng Soh, an independent consultant and former head of the Centre for Applied Research in Education at Singapore’s National Institute of Education, the system used by Mahassen relies, he says, on “seven objective and robust indicators” to rank the world’s top 100 universities. He describes the methodology in a six-page commentary.

The seven indicators are: quality of academics, as measured by the number who have won major international awards, prizes and medals; publication of research papers; influence, as measured by the number of papers in “highly influential journals”; citations, measured by the number of highly cited research papers; patents; alumni employment, measured by the number of alumni currently holding CEO positions in the top 2,000 public companies; and quality of education, “as measured by the number of a university's alumni who have won major international awards, prizes, and medals relative to the university's size”.
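In outline, a composite score of this kind is a weighted sum of normalised indicator values. The sketch below is a minimal illustration of that idea; the indicator names follow the article, but the equal weights and min-max normalisation are assumptions made for illustration, not the centre’s published formula.

```python
# Minimal sketch of a composite ranking score built from the seven indicators
# described in the article. Weights and normalisation are assumptions.

INDICATORS = [
    "quality_of_academics",   # faculty who have won major international awards
    "publications",           # research papers published
    "influence",              # papers in highly influential journals
    "citations",              # highly cited research papers
    "patents",                # patent filings
    "alumni_employment",      # alumni holding CEO posts in the top 2,000 companies
    "quality_of_education",   # alumni awards relative to university size
]

def normalise(values):
    """Scale raw indicator values to the 0-1 range so they can be combined."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in values]

def composite_scores(universities, weights=None):
    """Return {university: composite score} from per-indicator raw values.

    `universities` maps a name to a dict of raw indicator values.
    With no weights given, all seven indicators count equally (an assumption).
    """
    weights = weights or {name: 1 / len(INDICATORS) for name in INDICATORS}
    names = list(universities)
    scores = {name: 0.0 for name in names}
    for indicator in INDICATORS:
        scaled = normalise([universities[name][indicator] for name in names])
        for name, value in zip(names, scaled):
            scores[name] += weights[indicator] * value
    return scores
```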

In his commentary on other ranking systems, Mahassen says some rely mainly on research indicators while others place a great deal of emphasis on opinion-based surveys.

“Up to now, there has been no ranking measuring the quality of the learning environment as well as research without relying on surveys and university data submissions.

“If you look at the Shanghai ranking, only 10% of its analysis is devoted to non-research. Some prestigious institutions of higher education, such as École Polytechnique in France, rank extremely low (301-400) because the ranking fails to assess properly the quality of education and training of students.

“In the case of the QS ranking, 50% of its analysis is based on surveys. Another 20% is based on the faculty-to-student ratio. Unfortunately, the number of faculty could easily be inflated to include academic-related and non-teaching staff, resulting in the indicator failing to measure the quality of teaching.

“As for the Times [Higher Education] ranking: it also uses surveys, unfortunately, which make up roughly a third of its analysis.”

Mahassen says his system shows that a ranking measuring the quality of education and training of students as well as the prestige of academics and the quality of their research can be constructed based solely on verifiable data.

In an analysis of the new system, Soh compared the lists of top 100 universities produced by the four different ranking systems and found that only 34 appeared in all four. He says this shows the degree of inconsistency between the systems, partly because some universities were included in one ranking but not the others, and partly because of the different indicators each uses.

For the 34 universities common to all four lists, however, the correlations between their rankings were substantial, suggesting the systems convey much the same information about the top-ranking universities and that maintaining so many of them is unnecessary.
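Soh’s comparison can be reproduced in outline: intersect the four top-100 lists, then correlate the ranks of the universities they share. The sketch below assumes Python with SciPy for the rank correlation; the ranking lists themselves are placeholders, not the published tables.

```python
# Sketch of the comparison Soh describes: find the universities common to all
# ranking systems, then compute pairwise rank correlations over that set.
from itertools import combinations
from scipy.stats import spearmanr  # any rank-correlation routine would do

def common_universities(rankings):
    """rankings: {system_name: [university names in rank order]}."""
    return set.intersection(*(set(r) for r in rankings.values()))

def pairwise_rank_correlations(rankings):
    """Spearman correlation of the shared universities' ranks, per pair of systems."""
    common = sorted(common_universities(rankings))
    results = {}
    for a, b in combinations(rankings, 2):
        ranks_a = [rankings[a].index(u) for u in common]
        ranks_b = [rankings[b].index(u) for u in common]
        rho, _pvalue = spearmanr(ranks_a, ranks_b)
        results[(a, b)] = rho
    return results
```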

"The intention behind the new ranking system is good and respectable but the outcomes do not seem to warrant the effort," Soh says.

Professor Simon Marginson, of the Centre for the Study of Higher Education at the University of Melbourne and an expert on rankings, disagrees with Mahassen’s method of weighting the indicators in the composite index, and with the equal status given to research and three other areas, neither of which, he says, is explained or justified.

“Research is a more important indicator than this, partly because we don't yet have indicators for teaching quality or learning achievement or value added which are valid, objective, standardised and internationally comparative,” Marginson says.

“Weightings are always the main problem with composite rankings. They can be arbitrary and readily used to manipulate the outcomes. I doubt if this system will challenge the standing of the Shanghai Jiaotong ranking, while among the multi-purpose composite indicators the Times Higher enjoys first mover advantage.”
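Marginson’s point about weightings can be seen with a toy example: the same underlying data can put either of two institutions first, depending on the weights chosen. The figures below are invented purely for illustration.

```python
# Toy illustration of how weightings drive the outcome of a composite ranking.
# Both the indicator scores and the weight vectors are invented.

data = {
    "University A": {"research": 0.90, "teaching": 0.40},
    "University B": {"research": 0.60, "teaching": 0.85},
}

def rank(weights):
    scores = {
        name: sum(weights[k] * v for k, v in indicators.items())
        for name, indicators in data.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

print(rank({"research": 0.7, "teaching": 0.3}))  # University A comes first
print(rank({"research": 0.3, "teaching": 0.7}))  # University B comes first
```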