QS World University Rankings FAQs
01 Mar 2011
1) How do you plan to address the perceived bias towards English-speaking (and particularly UK) universities?

The developments to our survey approach, both in terms of distribution and analysis, ought to have an impact here. Elsevier is also continually updating Scopus to embrace more non-English-language content. The reality, however, is that in many areas of university competitiveness, operating in English is an advantage. English-language journals are more widely read and cited, and the top four destinations for international students (and, I suspect, also faculty) are the US, Canada, the UK and Australia, all English speaking. Many universities in non-English-speaking Asia, recognising this, are operating more programs in English, and all global rankings currently carry this bias, not just ours. Our objective is to minimise the bias, but it is far from clear whether eliminating it entirely would be appropriate.

2) What would you say is the main difference between the QS methodology and the new system announced by the European Commission?

The final methodology of the European Commission has not yet been announced; in fact, they are due to release only the results of an initial feasibility study in 2011, with the full exercise to emerge some time later. The key observation, perhaps, is their assertion that they will "compare only institutions which are similar and comparable in terms of their missions and structures". Depending on how far this principle is taken, the results of this exercise may not resemble a ranking at all.

3) Following the launch of the government-funded Assessment of Higher Education Learning Outcomes (AHELO) pilot scheme, how do you respond to the suggestion that an insufficient emphasis is given to teaching standards and student skills within the more research-oriented established methodologies?

QS absolutely concurs that teaching and learning are inadequately embraced by any of the existing global rankings, including our own, and is watching the AHELO exercise with great interest to see whether lessons can be drawn and applied to the much broader geographical scope of our rankings. QS is also assessing whether student and alumni inputs can help draw a clearer picture of comparative performance in teaching and learning. On the student skills side, QS is currently the only global ranking taking this aspect seriously, via the Employer Review indicator.

4) Do you think that the low ranking of LSE in the 2009 rankings (67th) is reflective of an inherent bias toward scientific subjects within citations-based methodologies, and if so how do you plan to address this in 2010?

The QS World University Rankings are designed to assess the all-round quality of universities across all disciplines and levels, in teaching, research, employability and internationalisation. LSE is a fantastic institution, as is reflected by its consistently high position in the Social Sciences, the faculty area on which it is focused. In fact, it is so strong within this narrower focus that it manages to compete with world-leading institutions of much broader range. Even taking only the world universities recognised by UNESCO, a Top 100 placing represents the top 1%, a remarkable achievement for an institution that focuses on only a small part of the academic spectrum. To put things in perspective, LSE fails to break the top 200 in the Shanghai Ranking.



5) How can the shift in position of some universities in the THE-QS World University Rankings 2004-2009 be explained?

Intuitively, established rankings with established methodologies should yield stable results, and those of the QS World University Rankings (formerly known as the THE-QS World University Rankings, 2004-2009) have been showing increased stability over time. Indeed, the more specific the context and the smaller the number of evaluated institutions, the more stable the results ought to be. However, in the Times Good University Guide the University of Bedfordshire rose 18 places in 2010; in the FT Business School Rankings, the number of UK schools featured in the Top 50 has fluctuated between 5 and 12 over the past few years; and in the Guardian's 2009 ranking the University of Exeter climbed 20 places to 14th, with an average shift in position of 8.6 across their Top 100, whereas the fluctuation in the QS World University Rankings Top 100 was only 7.4. These other rankings have been established for a long time, and all share one thing in common: they aspire to evaluate multiple aspects of what a university does, not merely focus on a single one. Different aspects of university performance move at different rates: international student flows are changing, the research productivity of Chinese universities continues to grow dramatically, and employability and work skills are at the forefront of many institutions' plans. Since our results are standardized, these changing patterns and the addition of new institutions have an interdependent effect on the overall results and can influence stability.
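To make the last point concrete, here is a minimal sketch in Python of why standardised scores are interdependent. It is not QS's actual calculation, and the institutions and raw figures are hypothetical; it simply shows that a z-score depends on the mean and spread of the whole cohort, so adding a new institution shifts everyone else's standardised result even when their own raw values are unchanged.

    # A minimal illustrative sketch, not QS's actual methodology.
    from statistics import mean, stdev

    def standardise(raw):
        """Return z-scores: (value - cohort mean) / cohort standard deviation."""
        mu, sigma = mean(raw.values()), stdev(raw.values())
        return {name: round((x - mu) / sigma, 2) for name, x in raw.items()}

    # Hypothetical raw indicator values (e.g. citations per faculty member).
    cohort = {"Uni A": 12.0, "Uni B": 9.0, "Uni C": 6.0}
    print(standardise(cohort))   # {'Uni A': 1.0, 'Uni B': 0.0, 'Uni C': -1.0}

    # Adding one strongly performing newcomer changes the cohort mean and
    # standard deviation, so every existing institution's z-score moves too.
    cohort["Uni D"] = 20.0
    print(standardise(cohort))

The same interdependence applies when an existing institution's raw performance changes, which is one reason year-on-year positions can shift even for institutions whose own performance is steady.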