School boards in India are often accused of inflating the marks of students in classes 10 and 12 in the board examinations. The usual evidence provided for this accusation is the high pass percentages in the board examinations. There are nearly 42 boards conducting board examinations for classes 10 and 12. The combined pass percentage of students in class 10 was 85%, while the pass percentage in class 12 was 82% in 2023. Besides, the percentage of students securing more than 60% marks was around 61% in class 10 and 56% in class 12. The clustering of most marks at the higher end of the scale is known as mark compression, the twin sister of mark inflation. Mark inflation and mark compression not only undermine the credibility of our education system but also hamper the prospects of students, as they are not adequately prepared for higher education or the job market. The clamour for entrance examinations for higher education emanates from this perceived mark inflation and mark compression.
Variations across boards
In 2023, 1.55 crore students appeared for the class 12 examination, while 1.85 crore students appeared for the class 10 examination. The variation in pass percentages across boards is not high. At the same time, the variation in the percentage of students scoring more than 60% is wide across school boards. In both the secondary and higher secondary examinations, the proportions of students securing more than 60% marks were lower than the national average in Assam, Bihar, Chhattisgarh, Gujarat, Madhya Pradesh, Maharashtra, the other north-eastern States, Odisha, Uttar Pradesh, Uttarakhand, and West Bengal. Does this mean, on a comparable scale, that students in other States performed better than students in these States? Of course not. There is no comparable scale to measure the relative academic credentials of students certified by different boards. However, questions remain. Do the boards indulge in the practice of inflating marks to varying degrees?
Mark/grade inflation in school board examinations is a universal phenomenon. In every country, academics and public intellectuals point out mark/grade inflation in schools and argue for corrective measures. Marks are supposed to reflect a student’s academic knowledge and skills. Usually, school board examination marks are compared with those of standardised tests conducted at the national level to prove mark inflation in the board examinations. We have a few national-level tests, such as the National Eligibility cum Entrance Test, the Joint Entrance Examination and the Common University Entrance Test, that class 12 students appear for. Such tests are not conducted to evaluate a student’s knowledge of a subject as taught in the curriculum prescribed by the school boards. Instead, they are an elimination process to select students with high scores for admission to specific higher educational programmes. Not all students who appear for the class 12 board examinations appear for these national-level entrance tests. Students undergo special coaching classes to secure high scores in these competitive examinations. These tests therefore fail as benchmarks for any comparison of educational standards across States.
The National Council of Educational Research and Training (NCERT) conducts a standardised test for most classes every year, including class 10, but not for classes 11 and 12. The tests are conducted as part of the National Achievement Survey (NAS) for a sample of a few thousand students in every district in the country. This is a standardised test, and the NCERT uses ‘Item Response Theory’ to statistically estimate the scores of each student in five subjects, i.e., English, math, science, social science and a regional language. Though it is a scientifically designed study, its character is too academic to yield any direct policy suggestion.
There is little scope to connect the estimated scores of students with the curriculum design of different boards and the efficiency of schools, among other factors that influence teaching and learning in schools. The NAS thus also fails to serve as a benchmark for studying possible mark inflation by school boards. Continuous annual exercises such as the NAS, with continuous improvements in assessment instruments and marking systems, should help in understanding the differences in teaching and learning across States and possibly in amending the educational processes in States.
Though comparable and independent assessment tests are not available, the high pass percentages and high proportions of students securing more than 60% marks in the board examinations are enough to make one believe that mark inflation and mark compression are in vogue and that this warrants improvements to make the examination system credible.
Standardise assessment systems
The high stakes in the board examinations for both society and students cannot be dismissed. Therefore, the boards should be accountable to society and to every student. The opaque board examination system is the root cause of all the problems. The cover of secrecy should not give scope for wrongdoing.
The processes, right from question paper setting to marking systems and the publication of results, should be transparent. Question paper setting should be automated with clear guidelines that specify question formats and expected answers. Guidebooks for students should be published explaining how learning outcomes are tested and how marks are awarded in an examination, with worked examples. The process of standardising question papers by teachers should be done in confidence. The entire process of question paper setting, printing and distribution should be codified, and standard operating procedures should be published.
The process of the printing of answer books, distribution and collection should also be codified, and a self-correcting audit process should be followed. Partial automation of the valuation of answer scripts — that is, scanning and online evaluation of answer scripts — should be ensured so that errors (other than judgmental errors) in evaluation are completely avoided in the awarding of marks. Every student should have free access to answer scripts after the publication of results and have a chance to apply for revaluation for a nominal fee. A transparent and credible examination system should reduce the scope for revaluation.
Need for transparency
There should be a transparent process of awarding marks for difficult/irrelevant/wrong questions. Along with the publication of results, the minutes of the meeting of the board of examiners should be published. The board of examiners should explain the adequacy of the question paper in terms of measuring learning outcomes, the level of difficulty of questions, and the decisions on awarding moderation marks.
The publication of the marksheet should be in two formats. The first format should have only the actual mark awarded out of the maximum mark for each subject and the aggregate mark. The second format should have the standardised scores in each subject and the aggregate of the standardised scores. The standardised scores are statistical estimates of marks in each subject, based on the distribution of marks (average and standard deviation) and the levels of difficulty of the questions, as revealed by the students’ aggregate ability to answer such questions. There are several statistical techniques, and the board may decide on a technique and publish it before the commencement of the examination. Standardised scores will remove mark inflation, and such scores are comparable with the scores of students in other boards and across different years.
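To illustrate what such standardisation might look like, here is a minimal sketch using a simple z-score transformation rescaled to a mean of 50 and a standard deviation of 10 (a “T-score”). This is only one of the several possible techniques the article refers to, chosen here for illustration; a board would publish its own method, and the function name and the rescaling constants below are assumptions, not a prescribed standard.

```python
# Sketch of one possible standardisation technique (z-scores rescaled
# to mean 50, SD 10). Illustrative only; a board may choose a different
# statistical method and must publish it before the examination.
from statistics import mean, pstdev

def standardise(raw_marks):
    """Convert raw subject marks to T-scores: 50 + 10 * (x - mean) / sd."""
    mu = mean(raw_marks)
    sigma = pstdev(raw_marks)
    if sigma == 0:
        # All candidates scored identically; no spread to standardise.
        return [50.0 for _ in raw_marks]
    return [round(50 + 10 * (x - mu) / sigma, 1) for x in raw_marks]

# A compressed distribution clustered near the top, plus one low mark:
raw = [92, 95, 88, 97, 90, 60]
print(standardise(raw))
```

Because each score is expressed relative to the cohort’s own mean and spread, a cluster of inflated raw marks is pulled apart on the standardised scale, which is what makes such scores comparable across boards and across years.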
The perception that school boards indulge in the practice of mark inflation and mark compression is not without strong evidence. Transparency and accountability that are backed by a good audit system should make our school board examination systems credible and devoid of mark inflation and mark compression.
R. Srinivasan is Member, State Planning Commission, Tamil Nadu; S. Raja Sethu Durai is Professor, Management Studies, Birla Institute of Technology And Science, Pilani (BITS Pilani), Dubai