A failure to measure up
Schemes to rank universities are being reviewed in a bid to achieve greater transparency.
International university league tables are a relatively recent phenomenon, beginning in 2003 with the Shanghai Jiao Tong University ranking. In the years since, rankings have established a grip on the minds of the media, policymakers and university managers alike.
Politicians began to pay close attention to their national performance and started to demand improvements. University managers also leapt on the results, promoting their successes and looking for ways to improve their positions. All the while, the media showed a growing interest in which institutions were ‘world-class’ and which were not.
The effect has been dramatic. “Rankings only address the elite universities, but their results have an impact upon life in all universities,” says Andrejs Rauhvargers, secretary-general of the Latvian Rectors’ Conference. “All politicians talk about rankings. Money is assigned according to rankings. Universities are closed and created because of the results of rankings.”
But, critics say, the ranking systems have shortcomings. A review of rankings, conducted by Rauhvargers for the European University Association (EUA) and published last month, confirms the most evident weaknesses. Most focus on indicators related to universities’ research activities, not all of which are reliable, while attempts to measure teaching quality involve largely unsatisfactory proxies.
Delving into the methodology of the rankings reveals that most providers stop calculating scores once they have produced, say, a top 500. There are stable rankings for 700-1,000 universities worldwide, but the remaining 16,000 are left off the map entirely.
Several projects are now in hand that are intended to bring greater transparency to existing ranking systems. One, an EUA project, is an attempt to improve awareness of how the rankings are created and what they really mean. A complementary initiative, from the International Rankings Expert Group, aims to audit the various ranking systems, assessing their compliance with guidelines it drew up in 2006.
Rauhvargers’s report observes that, while ranking providers often claim to comply with these guidelines, the reality is somewhat different. “Most, if not all, existing rankings would have to make changes in order to genuinely comply with them,” it says.
Meanwhile, the EU has been funding work on an alternative system that will allow institutions to be compared and assessed without constructing league tables. The first phase, called U-Map, produced a classification tool that reflects the variety of missions and profiles of European higher-education institutions.
The second phase is U-Multirank, a system that will permit comparisons of universities according to different activities, such as research, teaching or knowledge transfer, or according to different academic disciplines. A specific feature of the system is that users must select institutions that have comparable profiles and missions before indicators can be pulled up.
Positive response
Initial results from the feasibility study were presented in June at an EUA conference in Brussels, and have produced a positive response. “It largely confirms that the multi-ranking approach, without going on to establish league tables, is feasible,” said Jan Truszczyński, the European Commission’s director-general for education, at the EUA event.
Full results will be published later in the year, but in the meantime the Commission is thinking about governance and data collection for the system.
“It needs to be picked up and developed into a tool that is self-sustaining and independent of either the EU bureaucracy or any specific group of universities,” Truszczyński said.
The new system has also won support from the university sector. “I think this is quite an elegant solution, because it democratises the process of rankings, it introduces an element of participation in rankings, and deals with diversity,” said Howard Newby, the vice-chancellor of the University of Liverpool.
And given the problems with assessing teaching and learning through existing rankings, the appeal for students is evident. “From the international comparative perspective, you don’t really have anything other than rankings,” said Allan Päll, vice-chairman of the European Students’ Union. “And because the rankings are often said to be so irrelevant and superficial, then anything that you do would be an improvement.”
Ian Mundell is a freelance journalist based in Brussels.