Rankings motivate universities to be more honest
February 15, 2016
Representatives of the companies that compile academic rankings, along with other experts, met at a round table entitled “Changes in rankings: challenges and opportunities for Russian universities”. The two-day event was part of the 15th Seminar-Conference of Project 5-100, which took place at Ural Federal University.
The participants in the round table discussed changes in the methodologies of national and international rankings and possible future developments. Alex Usher (President of the Higher Education Strategy Association (HESA) and Chief Editor of Global Higher Education Strategy Monitor), for example, is certain that we should not expect substantial innovations in this area in the near future. He briefly outlined the history of rankings worldwide and characterized each stage of their development from a methodological point of view. According to Usher, indicators such as graduate employability and post-graduation income are central to the “third wave” of national rankings.
Duncan Ross (Data and Analytics Director at Times Higher Education) said that the methodology of the THE rankings is not going to change in the foreseeable future, at least as far as its core principles are concerned. It will still rest on three pillars: information provided by the universities themselves, the academic reputation survey, and data from Scopus. Nevertheless, Ross reminded the audience of recent changes that have expanded and diversified the current system: the number of universities in the global ranking has increased from 400 to 800; the reputation survey has become more geographically balanced in its coverage of scientists (thanks to the UNESCO database); and articles with an extremely large number of authors are no longer counted in the ranking.
Ben Sowter (Chief Editor of QS Rankings) showed which articles, and in which disciplines, were the most highly cited in 2010-2014 according to Scopus. Life sciences and medicine are the undisputed leaders: this field accounted for almost half (49%) of all citations, while only 1% went to the arts and humanities. “We don’t think that Russian journals must publish articles about Russian literature in English,” Sowter said. “This doesn’t make much sense. Look at China and Japan. Most humanities articles there are published in the native language, and so they should be. We don’t want to put universities in an uncomfortable position, which is why there need to be regulations.”
Sowter also addressed the ethical side of rankings. “We motivate the universities to be honest,” he said, adding that QS verifies the reports provided by the universities. Thus, to prevent manipulation and misinterpretation of terms, students who had not studied outside their own countries were excluded from the “foreign student” category in 2015.
Kazimierz Bilanow (Managing Director of IREG) believes that ranking providers must always be “on top of their game.” For this purpose, the professional community has developed the “Berlin Principles” – quality standards and proper ranking practices. Over the last few years IREG has been conducting audits of ranking organizations with the participation of independent experts.
Richard Holmes (Editor of the blog University Ranking Watch) systematized the practices that accompany changes in ranking systems. He noted that methodological changes are inevitable, even if unwelcome, and recommended a series of measures to “soften the blow”. “Ranking organizations must forewarn the public of upcoming changes,” Holmes urged. He also advised ranking providers to warn universities that results from different years cannot be compared, because when the methodology changes the ranking practically starts from scratch.
According to Oleg Solovyev (Editor of the Round University Ranking), in such situations universities cannot and must not claim that a sharp rise in the ranking is the result of their own efforts. He warned university marketing departments against giving in to this temptation. “We need to be informed consumers of information,” he concluded.
Several speakers devoted their presentations to how rankings are perceived, raising questions from the universities’ perspective. Vitaly Bagan (Head of Strategy Development at MIPT) noted that the main ranking providers do not take into account such important groups as prospective students (applicants) and current students in their calculations. Bagan believes that a survey of prospective students could reveal the strength of a university brand and reduce the influence of academic experts. “We need rankings that collect the maximum amount of objective data on the basis of publicly available information,” he emphasized.
The topic of “blind spots” was continued by Ekaterina Mikhailova (Head of Strategic Planning and Ranking Studies at ITMO University). She pointed out that such important areas as innovation, entrepreneurship and the social function of universities (the adaptation of students), which are currently developing rapidly at Russian and international universities, are not fully reflected in the current rankings.
Sergei Kireev (Dean at MIPT) objected to the exclusion of articles with a large number of authors from ranking calculations. He reminded the audience that such publications grow out of serious breakthrough research conducted with the participation of many countries. “We want to cooperate with the rankings and we want the methodology to be transparent,” Kireev concluded.