What’s in a rank?

Is there a secret recipe that decides which institutions lead global university rankings?

July 09, 2017 05:00 pm | Updated 05:00 pm IST

When we're short-listing universities to apply to, it's not unusual to rely on global rankings. But who decides these ranks? How are they decided? Does one system carry more heft and credibility than another? Ranking organisations make this information publicly available through their websites. But to get the inside scoop, we spoke with Ben Sowter, Research Director at Quacquarelli Symonds (QS), and Phil Baty, Editorial Director, Global Rankings, Times Higher Education (THE).

Two globally established lists are the THE World University Rankings and the QS World University Rankings. As Ben puts it, “Of the 19 organisations currently publishing rankings of a global scope, Shanghai (Academic Ranking of World Universities), QS and THE seem to be, by far, the most resonant and thus widely referenced.”

Methodology

QS publishes several lists each year, with some ranking universities by region (BRICS, Asia, and so on), some by subject, and one solely for graduate employability, but we looked at their overall ranking. This list is put together by scoring each university on six metrics, each given a different weight, and combining them into a final score: academic reputation index (40%), employer reputation index (10%), faculty/student ratio (20%), citations per faculty (20%), international faculty ratio (5%) and international student ratio (5%).

Similarly, THE produces various annual rankings; its overall list is based on 13 indicators, grouped into five weighted areas: teaching (30%), research (30%), citations (30%), international outlook (7.5%) and industry income (2.5%).
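
Both systems, then, boil down to a weighted sum of indicator scores. Purely as an illustration, here is a minimal sketch of that arithmetic in Python, assuming each indicator has already been normalised to a 0-100 scale. The weights follow the QS and THE breakdowns above, but the function, the dictionary names and the example scores are hypothetical, not either organisation's actual implementation.

```python
# A minimal sketch of how a weighted composite score might be computed,
# assuming every indicator has already been normalised to a 0-100 scale.
# The weights mirror the QS and THE breakdowns described above; the
# function name and the example scores are purely illustrative.

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty_ratio": 0.05,
    "international_student_ratio": 0.05,
}

THE_WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Return the weighted sum of normalised indicator scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(weights[name] * scores[name] for name in weights)

# A hypothetical university: strong on reputation and research,
# weaker on internationalisation.
example = {
    "academic_reputation": 92.0,
    "employer_reputation": 85.0,
    "faculty_student_ratio": 70.0,
    "citations_per_faculty": 88.0,
    "international_faculty_ratio": 40.0,
    "international_student_ratio": 45.0,
}
print(f"{composite_score(example, QS_WEIGHTS):.2f}")  # 81.15
```

In practice, of course, both organisations must first convert raw values (survey responses, citation counts, ratios) into comparable normalised scores; this sketch takes that step as given.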

For QS, 50% of the ranking decision rests on two proprietary surveys: their academic reputation survey and their employer reputation survey. They have 75,000 respondents (academics from universities across the world) for the former and 40,000 respondents (HR professionals worldwide) for the latter.

“Our respondents are professors, heads of department, deans, and/or vice-chancellors. Faculty and student data is drawn from ministry and central statistics sources where possible and from direct submission otherwise. Submitted data is validated against public sources, the historical record and patterns for other institutions of similar type or location,” says Ben.

THE, too, has an academic reputation survey, soliciting responses from 10,000 senior scholars. Phil explains, “The crucial element of this survey is that it is invitation-only — this ensures that we only invite published scholars who are known experts in their field, and that the respondents are statistically representative of their field and their country (by United Nations data).”

The survey accounts for 33% of the score, through two of THE's 13 indicators. For most other indicators, THE collects data directly from the institutions, following “several strict quality control mechanisms,” according to Phil. For example, data is checked against public-domain sources where available. Something that might interest universities is THE's “range of subscription-access data and analytical tools, where universities can access much of the data used to build the rankings, and compare their performance with benchmarking peers.”

Both QS and THE give heavy weightage to citations for research publications — 30% for citations per paper at THE and 20% for citations per faculty at QS. Why is this the case when it seems to be such a one-dimensional representation of a university? Ben explains, “The pursuit of teaching and research alongside one another is at the heart of many definitions of a university. Citations serve as a proxy measure for the quantity and quality of research and can be reliably collected. At QS, citations data is drawn from Elsevier's Scopus database, the largest of its kind.”

THE also uses Elsevier’s Scopus data. “We use it to identify those universities which publish at least 1,000 research papers over a five-year period. We then target this group. This year we collected rich data on more than 1,500 universities across the world, including 67 from India,” shares Phil.

On the flip side, neither the QS nor the THE rankings directly consider alumni or student opinion. Intuitively, these would seem an important resource to tap for parameters like teaching quality, but there are obvious downsides to doing so. In Ben's words, “We conduct extensive student surveys but it is problematic to include them in rankings for a variety of reasons — cultural differences and the fact that the known inclusion of student survey data in rankings tends to influence the validity of the responses.”

Indian universities

Interestingly, Indian universities that don't necessarily enjoy unanimous renown locally can feature higher than expected in global rankings. Phil speculates, “In some cases, quite naturally, we do find some statistical outliers, and when these occur in heavily weighted indicators, they can affect an institution’s overall score. Veltech University received an exceptionally high score for citations, based on a relatively low volume of research papers but with some very highly cited work within that output.”

Ben posits, “There are always divergences between perception and reality and between reality and measurement. Domestic and international reputation can also diverge. For the most part, in India, our results seem to have tracked sensibly with domestic perceptions — the IITs always do well, for example. Whereas, there have been some unexpected outcomes in the case of Panjab University and more recently, Veltech, in THE.”

So, between QS and THE, who has the upper hand? It's hard to say: they seem to cater to different audiences on the basis of varying strengths. This is reflected in their differing outcomes as well. MIT has come out on top six years in a row by QS' count, but THE puts it at number five, with the University of Oxford in the lead.

Ben emphasises the usefulness of the unique employability indicator in the QS rankings, while THE seems to focus more on research-intensive institutions, allocating 30% to research and 30% to citations when computing each institution's final score. As Phil says, “We have a comprehensive process of targeting the leading research universities in the world to collect data from.”

Endgame

Keeping the endgame of college in mind (which, in most cases, is finding employment or furthering one's studies), it becomes important to know: are university rankings taken seriously by senior professors, the universities themselves, and potential employers?

Colin Phillips, Professor of Linguistics and Director of the Maryland Language Science Center, weighs in: “If a student has been successful at a highly competitive undergraduate institution, then we do notice. But we care more about what the student has been able to achieve, relative to the resources available in that institution. If two students from the University of Chicago and a university in Argentina present the very same profile, then we’ll likely take the Argentinian more seriously, as that student has had fewer opportunities.”

Speaking on QS' and THE's subject area rankings, he adds, “I think rankings are systematically biased. In my own field, linguistics, there is an obvious bias for universities that have a strong general undergraduate profile and that are Anglophone. There are some world class programmes that appear way down QS’ list. For example, UC Santa Cruz is widely recognised as having top undergraduate and graduate programmes in linguistics, yet is only ranked 50. I know less about the THE rankings. In my subject area, they combine a range of different fields, 'Languages, literatures, and linguistics', so the results aren’t terribly useful.”

And what about potential employers? “They do not care about these rankings. They will always say that they pay attention to the student rather than to the university or programme’s reputation,” sums up Colin.
