30 August 2014
EUROPE: New ways of ranking universities
An expert group on the assessment of university-based research was established by the European Commission in July 2008. The main objective was to "identify parameters to be observed in research assessment and to analyse major assessment and ranking systems to establish a more valid comparative methodological approach".

Now the group has delivered its final report proposing wide-ranging changes in the world ranking of universities and calling for a more fine-tuned assessment methodology.

The EU Commission asked for the development of a multidimensional methodology to capture more dimensions of academics' work. It also asked the group to identify the types of users of measurements of the quality of university-based research, take stock of the main methodologies in use and identify data requirements for a new multidimensional approach.

The group had 15 distinguished members from 12 EU member states and Australia, and two international organisations. It was headed by Professor Wolfgang Mackiewicz of the Free University of Berlin.

In its report, the group says universities should be funded more for what they do than what they are. Competitive funding "should be based on institutional evaluation systems and on diversified performance indicators".

Although university ranking systems have been widely used since their introduction in 2003, assessment experts have serious reservations about the methodologies they employ. Moreover, the ranking systems tend to focus on the 100 top-ranked institutions.

The EU group wants to develop a new and coherent methodology for assessing the research produced by European universities, one that should also be relevant to the 17,000 higher education institutions around the world.

In a foreword to the report, Commissioner Janez Potocnik, who commissioned the report and who participated in the conference before the report was launched, quotes Einstein: "Not everything that counts can be counted, and not everything that can be counted counts".

The group claims that "the absence of appropriate, verifiable and trustworthy data can undermine the usefulness of cross-national comparisons and benchmarking". It then cites a 2007 article by RV Florian in Scientometrics: "Research has found that the results of the Shanghai Jiao Tong Academic Rankings of World Universities are not replicable, thus calling into question the comparability and methodology used."

The methodology recommended by the group is a fit-for-purpose assessment that combines quantitative indicators and data with qualitative information, undertaken at the level of 'knowledge clusters'.

Such knowledge clusters should be based on an administrative unit such as a faculty, department, school, team, centre, institute or interdisciplinary issue-driven cluster, the report says. They should allow for aggregation to the institutional level.

The group has developed an outline for a "multi-dimensional research assessment matrix" that links specified users with their defined properties and objectives to specific data, quantifiable and qualitative indicators and specific assessment methods.

Based on this exercise, the group demonstrates the method in several case studies: on research excellence initiatives in Australia and Germany; individual universities in Belgium and Finland; national evaluation agencies and processes in France, Germany, Norway, Ireland, Italy, Hungary, the Netherlands, Spain, Sweden and the UK; and on global rankings such as Webometrics, the ARWU, THE-QS rankings, as well as performance rankings of scientific papers and the Leiden ranking on bibliometric indicators.

To take the project forward, the report proposes:

* Establishment of a European observatory for assessment of university-based research.
* Investment in developing a shared information infrastructure.
* The launch of a pilot for a multidimensional research assessment matrix.
* Adapting the multidimensional matrix to web-based technology.
* Launching a project of pilot indicators to measure the social and economic impact of research.
* Developing a financial model to cover the full cost of university-based research.

Comment
I am curious if there has been any analysis of rankings based on a different type of analysis. I suggest that the EU cluster emphasis, the THE research emphasis and, for instance, the Washington Monthly values emphasis all represent different "tells" that may be explored more for what is desired than measured as realized.

Marc Arenstein