How to boost your university’s ranking position
In mid-June QS released its ranking of the top 300 universities in Latin America, which showed no significant movement among the institutions at the top. This suggests the QS ranking is stabilising.
The Universidade de São Paulo, Universidade Estadual de Campinas, Pontificia Universidad Católica de Chile and Universidad Nacional Autónoma de México are the undisputed top four. Others, like the Tecnológico de Monterrey and the Universidad Nacional de Colombia, are consolidating their standing, as is the Universidad de Costa Rica, which is now in the top 20.
Unchanged top countries
Brazil has four of the top 10 universities in the 2016 edition, followed by Chile, Colombia and Mexico with two each. In the top 20, Argentina adds two universities, and Costa Rica and Venezuela one each.
Universities from seven of the region's 20 countries are in the top 20. In this regard there is not much diversity, which is not particularly surprising given the uneven educational development in the region.
Diversity widens somewhat in the top 50, which draws universities from nine countries; this rises to 13 of the 20 countries in the top 100, and 18 countries are represented in the top 200.
The top 300 universities are drawn from all Latin American countries (though some have low or no scores and are thus not internationally competitive). This may sound exciting for a region that promises a lot but has yet to demonstrate quality as a system. It also highlights the gulf between the institutions that perform well (and are presumably better resourced) and those ranked in the middle and at the bottom.
There are a number of methodological changes in the 2016 edition, aligning it with changes made to the QS World University Rankings in 2015. The most significant is the inclusion of an internationalisation metric: QS has judged that internationalisation is best measured by an international research network indicator.
This new indicator assesses the degree of international openness in research activity for each assessed institution. It counts the international collaborating institutions with which a university has co-authored one or more papers indexed in Scopus over a five-year period. Universities with greater numbers of published research papers benefit the most from this change.
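The counting logic described above can be illustrated with a short sketch. The data structure, field names and five-year window below are assumptions for illustration only; QS computes the actual indicator from Scopus records.

```python
# Illustrative sketch (hypothetical data, not QS's implementation):
# the international research network indicator counts the distinct
# foreign institutions that co-authored at least one indexed paper
# with the university within a five-year window.
def international_research_network(papers, home_country):
    """papers: list of dicts with 'year' and 'affiliations', where each
    affiliation is an (institution, country) pair."""
    partners = set()
    for paper in papers:
        if 2011 <= paper["year"] <= 2015:  # assumed five-year window
            for institution, country in paper["affiliations"]:
                if country != home_country:
                    partners.add(institution)  # distinct partners only
    return len(partners)

papers = [
    {"year": 2013, "affiliations": [("Uni A", "Brazil"), ("Uni B", "Chile")]},
    {"year": 2014, "affiliations": [("Uni A", "Brazil"), ("Uni C", "Mexico")]},
    {"year": 2009, "affiliations": [("Uni A", "Brazil"), ("Uni D", "Spain")]},
]
print(international_research_network(papers, "Brazil"))  # 2 (Uni B, Uni C)
```

Note that a partner institution is counted once no matter how many joint papers it shares, which is why high-output universities benefit chiefly through the breadth of their collaborations.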
Eight indicators comprise the QS Latin American university ranking. The academic reputation survey accounts for 30% of the overall score, followed by 20% from the employer survey. Three indicators count for 10% each: faculty to student ratio, staff with doctorates and the new international research network.
In previous editions, papers per faculty and citations per paper accounted for 10% each; these were reduced to 5% each and the freed 10% transferred to the international research network indicator.
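The weighting scheme amounts to a simple weighted sum. The sketch below uses the weights stated in the text (treating the international research network at 10% and papers per faculty and citations per paper at 5% each); the indicator scores are invented for illustration, and the remaining weight sits with indicators not named here.

```python
# Illustrative only: combine normalised indicator scores (0-100) into an
# overall score using the 2016 weights described in the text.
WEIGHTS = {
    "academic_reputation": 0.30,
    "employer_reputation": 0.20,
    "faculty_student_ratio": 0.10,
    "staff_with_doctorates": 0.10,
    "international_research_network": 0.10,
    "papers_per_faculty": 0.05,
    "citations_per_paper": 0.05,
    # remaining weight belongs to indicators not named in the text
}

def overall_score(scores):
    """Weighted sum; a missing indicator contributes zero."""
    return sum(w * scores.get(name, 0.0) for name, w in WEIGHTS.items())

example = {  # invented scores for a hypothetical university
    "academic_reputation": 90.0,
    "employer_reputation": 80.0,
    "faculty_student_ratio": 70.0,
    "staff_with_doctorates": 60.0,
    "international_research_network": 50.0,
    "papers_per_faculty": 40.0,
    "citations_per_paper": 40.0,
}
print(overall_score(example))  # 65.0
```

The example makes the strategic point concrete: because the two reputation surveys carry half the total weight, a change in any single research metric moves the overall score far less than a change in reputation does.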
The introduction of the new indicator has not introduced much variability into the 2016 rankings, but it sends a strong signal to institutions in the region: research activity matters far more than one may like to admit. This change may be the beginning of a behavioural and cultural shift for institutions and drive an international research agenda.
Understandably, QS had to adopt a metric it could reliably measure, and it attained this by drawing on Elsevier's Scopus data. It may have been more pertinent, however, to use a metric that highlights the breadth and depth of institutional activity in teaching and learning rather than adding another bibliometric measure.
An issue with a metric requiring institutional input is that Latin American institutions do not have standardised national data collections as occurs in countries such as Australia, the United Kingdom or the United States.
There are clear messages that emerge for Latin American institutions from the various university ranking schemas. First, there needs to be standardised data collection, reporting and analysis across institutions within national systems regardless of the type of institution. In this regard governments need to drive the policy agenda for improving the collection, reporting and analysis of information.
Second, institutional visibility is critically important as it is influential in driving up (or down) scores in the reputational surveys. Whether we like it or not, the branding of an institution plays a pivotal role in shaping reputation. A clearly articulated strategy to boost institutional visibility is a priority.
Third, universities need to focus on research output and impact. Institutional leaders need to draw up a long-term strategy to generate the levels of research output required for consideration for rankings. This may include targeted professional development programmes and staff incentives for undertaking research training, among many other initiatives.
To boost scores in the international research network indicator, universities can draw on alumni working in academia abroad.
These are examples of an array of initiatives that could work well for institutions wanting to improve their positioning in university rankings.
Angel Calderon is principal advisor, planning and research, at RMIT University, Australia. He is a rankings expert and a Latin American specialist.