The National Institutional Ranking Framework (NIRF)’s ranking of higher education institutions (HEIs), released in July, has received considerable flak. The broad parameters on which an HEI is ranked by the NIRF are ‘teaching, learning and resources’, ‘research and professional practice’, ‘graduation outcome’, ‘outreach and inclusivity’, and ‘perception’. Each of them is assigned a specific weightage. HEIs are ranked overall, university-wise and college-wise, and also under disciplines such as law, medical, pharmacy, management, architecture, and engineering. To show the contradictions, inconsistencies, and flaws in the NIRF’s methodology, we have taken law as a case in point.
The NIRF places some private multi-discipline institutions higher than many prestigious national law universities (NLUs) and law departments. Students overwhelmingly seek admission to NLUs; private universities and institutions, barring a few, are invariably their last choice. Generally, students who cannot secure a seat in an NLU are admitted to private institutions. Similarly, private universities and institutions are the last choice for those looking for a career in academia. Yet the NIRF ranking shows a private law university scoring 100% in perception. Going by this score, it should have been the most preferred destination for students. But the Common Law Admission Test admission choices show a different picture: this institution figures below 10 NLUs as a preferred place to study.
An analysis of the data submitted by some multi-discipline private universities participating in various disciplines under the NIRF provides evidence of data fudging. The NIRF appears to lack a rigorous system for verifying the data submitted by HEIs. For instance, the faculty-student ratio (FSR) is an important criterion for ranking. Evidence suggests that some private multi-discipline universities have claimed the same faculty in more than one discipline. Faculty in liberal arts have been counted as faculty in law too, to show an improved FSR. This manipulation defeats the purpose of ranking, especially for single-discipline institutions like the NLUs.
There are similar instances of data fudging in parameters like financial resources utilisation (spending on the library, academic facilities, etc.) by multi-discipline institutions. Enormous funds have been claimed as expenditure on laboratory equipment by some private multi-discipline institutions which offer law as a subject. But laboratories are not required for law. An analysis of the 15 top-ranked institutions under law shows that equipment purchased for one department has been claimed in more than one department. In the case of an institution ranked among the top 15 under law, the expenditure on equipment claimed across engineering, law, management, dental, and medical is nearly double the actual amount spent by that institution. Funding for research projects and consultancy is another essential parameter for ranking. Data show that research grants and consultancy charges received in other disciplines appear to have been claimed as those in law. Another sub-parameter where data fudging by certain universities is discernible is the procurement of books for the library and spending on the library.
The NIRF requires that the data submitted to it be published by all participating HEIs on their websites so that the data can be scrutinised. Some private multi-discipline universities have not granted free access to such data on their websites; instead, they require an online form to be filled out along with the details of the person seeking access. Such non-transparency is antithetical to the ranking exercise. There are also discrepancies between the data submitted to the NIRF and the data on these institutions’ websites. For instance, the data uploaded on the websites omit details of the number, names, qualifications and experience of the faculty.
Further, the NIRF applies almost the same parameters to all institutions across varied disciplines in research and professional practice. For this parameter, data on publications and their quality are taken from the Scopus and Web of Science databases. While these may be suitable for medicine and engineering, they are unsuitable for law. There is also a gap between the methodologies employed for accreditation and for ranking. While the National Assessment and Accreditation Council gives due weightage to publications in UGC-CARE listed journals, the NIRF uses publication data only from Scopus and Web of Science.
Thus, severe methodological and structural issues undermine the NIRF ranking process. The methodology must be revised in consultation with all stakeholders.
G.S. Bajpai is the Vice Chancellor of the Rajiv Gandhi National University of Law, Punjab. Dr. Manoj Sharma, Associate Professor, provided inputs for this piece. Views are personal.