What do university rankings mean?

I refer to the article “NTU overtakes NUS in latest QS university rankings” (Channel NewsAsia, Jun 8).

It states that “Nanyang Technological University (NTU) was on Thursday (Jun 8) named Asia’s top university in the 2018 Quacquarelli Symonds (QS) World University Rankings, surpassing the National University of Singapore (NUS) in the process.

NTU is ranked 11th in the world, above other notable institutions such as Princeton University, Cornell University and Yale University, and two positions above its previous 13th spot.

NUS came in at 15th place in the latest rankings, dropping three spots from 12th.”

In this connection, I understand that a higher percentage of international students and faculty (foreigners) may contribute to a higher placing in a university's rankings.

The estimated percentages of non-Singaporean students and faculty in our public universities are about 40 and 63 per cent, respectively (“62.7% of university faculty foreigners?”, Sep 11, 2016).


In the interest of encouraging a more discerning perspective on the numerous university rankings in the world, the following are some extracts from Philip G. Altbach’s article “The State of the Rankings” (Inside Higher Ed, Nov 11, 2010):

“The investments made in higher education by China, South Korea, Taiwan, Hong Kong, and Singapore in the past several decades have resulted in the dramatic improvement of those countries’ top universities.

Where Is Teaching in International Rankings?

In a word — nowhere. One of the main functions of any university is largely ignored in all of the rankings. Why?

Institutions in Hong Kong and Singapore have the advantage of financial resources, English as the language of teaching and research, and a policy of employing research-active international staff.


The QS World University Rankings are the most problematic. Between 2004 and 2009, these rankings were published with Times Higher Education. After that link was dropped, Times Higher Education began publishing its own rankings.

From the beginning, QS has relied on reputational indicators for a large part of the analysis. Most experts are highly critical of the reliability of simply asking a rather unrandom group of educators and others involved with the academic enterprise for their opinions. In addition, QS queries the views of employers, introducing even more variability and unreliability into the mix.

Some argue that reputation should play no role at all in ranking, while others say it has a role but a minor one. Forty percent of the QS rankings are based on a reputational survey. This probably accounts for the significant variability in the QS rankings over the years. Whether the QS rankings should be taken seriously by the higher education community is questionable.”
