QS ranking downright shady and unethical

On Thursday (8 June), Nanyang Technological University (NTU) was named Asia’s top university in the 2018 Quacquarelli Symonds (QS) World University Rankings, surpassing the National University of Singapore (NUS).

NTU also placed 11th in the world, above other notable institutions such as Princeton University, Cornell University and Yale University, and two positions above its previous spot of 13th.

Below is a response written by John Ousterhout on the question-and-answer site Quora to the question "How accurate are the 2018 QS rankings? They seem to rank Nanyang Tech higher than Princeton, Yale, Cornell, Columbia, and Berkeley."

Ousterhout's response in full
First impressions suggest it is almost laughably inaccurate. UC Berkeley at #27? This is the same university that is affiliated with 91 Nobel Laureates, 13 Fields Medals, 23 Turing Awards and 16 elements of the Periodic Table. The rest of the ranking seems similarly strange: NUS and NTU ranked right alongside or higher than Princeton, Cornell, Yale, Columbia and Johns Hopkins?

Is it just my Anglo-American bias speaking? I know these universities have made rapid strides in funding and encouraging cutting-edge research, so perhaps it is inevitable that they have caught up with and surpassed the better-known Ivy League universities. That is obviously partly the case, but after digging around for a bit, there turns out to be a whole host of articles about the flawed methodology and downright shady practices of the QS organisation.

First, take a look at the flaws in the methodology:

  • 10% of the score is based on the % of international students and faculty. This has absolutely no relation to research or teaching, and favors universities in smaller countries like the UK, Switzerland and Singapore over those in the US. 10% might seem like a small number, but when all these universities score almost the same in the major metrics, it is small factors like these that create all the differences in rankings.
  • 50% of the score comes from reputation surveys from Academics and Employers. The methodology of these surveys has come under immense scrutiny – awarding incentives for completing surveys, letting survey takers recommend the survey to others, surveying random academics rather than the established leaders in the field, etc.
  • Less commonly discussed is the geographic distribution of the survey respondents: 7.3% of respondents come from the UK (Pop: ~65 Mil), 3.7% from Malaysia (Pop: ~20 Mil), 0.8% from Singapore (Pop: ~5 Mil), 4.0% from Australia (Pop: ~20 Mil), and just 10% from the USA (Pop: ~320 Mil)! 15% of survey respondents are administrators and teaching assistants, not academics! No surprise, then, that UK, Singapore and Australian universities rank higher than they otherwise might.
  • The employer surveys ask "employers to identify those institutions from which they source the most competent, innovative, effective graduates", and "international and domestic responses will contribute 50% each to an institution's final score." In short, employers are not actually being asked to compare a hypothetical Harvard grad to an NUS grad. The score for NTU, for instance, is based primarily on which universities Singaporean employers tend to hire graduates from.
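The geographic skew described above can be made concrete with a little arithmetic. The sketch below, a hypothetical illustration using the respondent shares and approximate population figures quoted in the answer (not official QS data), divides each country's share of survey respondents by its population to get a rough per-capita representation figure:

```python
# Rough per-capita comparison of QS survey respondents, using the
# percentage shares and approximate populations quoted in the answer.
# These are illustrative figures, not official QS statistics.

respondents = {  # country: (% of survey respondents, population in millions)
    "UK":        (7.3,  65),
    "Malaysia":  (3.7,  20),
    "Singapore": (0.8,   5),
    "Australia": (4.0,  20),
    "USA":       (10.0, 320),
}

# Share of respondents per million residents, highest first.
for country, (share, pop) in sorted(
        respondents.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{country:10s} {share / pop:.3f} % of respondents per million people")
```

On these figures the USA contributes roughly 0.031% of respondents per million residents, versus about 0.112% for the UK and 0.16% for Singapore, i.e. the US is represented at a fraction of the per-capita rate of the countries whose universities the answer argues are flattered by the survey.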

Second, QS's business model is really shady:

  • A dubious Star rating system, in which universities pay to be evaluated. (Conflict of interest, anyone?)
  • ‘Branding Opportunities’ for $80,000 with QS Showcase
  • A highly lucrative ‘consultancy service’ to help universities rise up the rankings

In short, ranking systems might have their flaws, but this one is downright shady and unethical. The only people who take this ranking seriously are not academics or employers, but relatively uninformed parents and students, which is deeply worrying.

Read: NUS top rank – not necessarily a reflection of teaching quality
