The ugly face of a crime-fighting move

The implementation of the National Automated Facial Recognition System in India lacks adequate safeguards

Updated - December 04, 2021 10:37 pm IST

Published - August 25, 2021 12:02 am IST


In the monsoon session of Parliament, no meaningful debate could take place due to the controversy over Pegasus, the spyware. Some Indian journalists, civil society activists and political leaders, and a top election strategist were possibly under surveillance. The Government has issued no categorical denial that the Israeli software was purchased. But beyond this lies a much bigger issue, the privacy of the entire citizenry, which has not received much public attention. On June 23, 2021, the Joint Committee examining the Personal Data Protection Bill (2019) was granted a fifth extension by Parliament. While informational privacy is not the Government’s priority, it has simultaneously been exploring the potential of facial recognition technology.

A prying technology

To empower the Indian police with information technology, India approved the implementation of the National Automated Facial Recognition System (NAFRS) to “facilitate investigation of crime and detection of criminals” in a quick and timely manner. Once implemented, it will function as a national-level search platform using facial recognition technology to facilitate the investigation of crime and the identification of persons of interest (e.g., criminals), regardless of face masks, makeup, plastic surgery, beards or hair extensions.


The technology is deeply intrusive: computer algorithms map unique facial landmarks (biometric data) such as the shape of the cheekbones, the contours of the lips and the distance from forehead to chin, and convert these into a numerical code termed a faceprint. For the purposes of ‘verification’ or ‘identification’, the system then compares the faceprint generated against a large existing database of faceprints (typically available to law enforcement agencies through databases of driver’s licences or police mugshots). But the real problem is that facial recognition does not return a definitive result: it ‘identifies’ or ‘verifies’ only in probabilities (e.g., a 70% likelihood that the person shown in an image is the same person on a watch list). Though the accuracy of facial recognition has improved over the years due to modern machine-learning algorithms, the risk of error and bias remains. For instance, there is the possibility of ‘false positives’, where the algorithm finds a match even when there is none, resulting in wrongful arrest. Moreover, research suggests that because facial recognition software relies on pre-trained models, the under-representation of certain types of faces (such as women, children and ethnic minorities) in training datasets negatively impacts its performance on those faces.
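The probabilistic nature of ‘verification’ described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor’s actual algorithm: the four-number faceprints, the similarity measure and the threshold are all invented for the example, whereas real systems use high-dimensional embeddings produced by trained neural networks.

```python
import math

# Hypothetical "faceprints": numerical codes derived from facial landmarks
# (cheekbone shape, lip contours, forehead-to-chin distance, ...).
# Real faceprints have hundreds of dimensions; four are used here for clarity.
probe = [0.12, 0.80, 0.35, 0.41]      # face captured, e.g., from CCTV footage
enrolled = [0.10, 0.82, 0.33, 0.45]   # faceprint stored in an existing database

def cosine_similarity(a, b):
    """Score how alike two faceprints are: 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# An operator-chosen cut-off, not a certainty. Set it too low and strangers
# are "matched" (false positives); too high and true matches are missed.
THRESHOLD = 0.95

score = cosine_similarity(probe, enrolled)
# The system never answers a definitive yes/no -- only a score compared
# against a threshold, which is exactly where error and bias creep in.
print(f"similarity = {score:.3f}, flagged as match = {score >= THRESHOLD}")
```

The key point the sketch makes concrete: the output is a score, and the yes/no decision depends entirely on where a human sets the threshold.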

As NAFRS will collect, process and store sensitive personal information (facial biometrics) for long periods, if not permanently, it will impact the right to privacy. Accordingly, it is crucial to examine whether its implementation is arbitrary and thus unconstitutional: is it ‘legitimate’, ‘proportionate to its need’ and ‘least restrictive’? What is its potential for abuse and misuse, given the pending status of the Personal Data Protection Bill (PDPB) and the absence of clear guidelines for its deployment? How does it impact other fundamental rights such as the right to dissent? Should NAFRS be banned or simply regulated?

The Federal Bureau of Investigation in the United States uses facial recognition technology for potential investigative leads; police forces in England use facial recognition to tackle serious violence. In other cases, countries such as China use facial recognition for racial profiling and mass surveillance — to track Uighur Muslims. With policing and law and order being State subjects, some Indian States have started using new technologies without fully appreciating the dangers involved.


Test of ‘proportionality’

Facial recognition, being an intrusive technology, has an impact on the right to privacy. The Constitution of India does not explicitly mention the right to privacy. However, a nine-judge Bench of the Supreme Court, in Justice K.S. Puttaswamy vs Union of India (2017), recognised it as a precious fundamental right. Since no fundamental right is absolute, even in respect of privacy the state may impose reasonable restrictions on the grounds of national integrity, security of the state, public order, etc.

The Supreme Court, in the K.S. Puttaswamy judgment, laid down a three-fold requirement (reiterated in Anuradha Bhasin while examining the denial of the ‘right to Internet’ to the people of Kashmir) to safeguard against arbitrary state action. Accordingly, any encroachment on the right to privacy requires the existence of a ‘law’ (to satisfy the legality of the action); there must exist a ‘need’, in terms of a ‘legitimate state interest’; and the measure adopted must be ‘proportionate’ (there should be a rational nexus between the means adopted and the objective pursued) and ‘least intrusive’. Unfortunately, NAFRS fails each one of these tests.


First, NAFRS lacks ‘legitimacy’. It does not stem from any statutory enactment (such as the DNA Technology (Use and Application) Regulation Bill 2018, proposed to identify offenders) or an executive order of the Central Government. Rather, it was merely approved by the Cabinet Committee on Economic Affairs in 2009, during United Progressive Alliance rule. Second, and more importantly, even if we assume that there exists a need for NAFRS to tackle modern-day crimes, the measure is grossly disproportionate. To satisfy the test of ‘proportionality’, the benefits of deploying this technology have to be sufficiently great to outweigh the harm. For NAFRS to achieve the objective of ‘crime prevention’ or ‘identification’, the system will have to track people on a mass scale (avoiding a CCTV camera in a public place is fiendishly difficult), turning everyone into a subject of surveillance: a disproportionate measure. In the absence of a strong data protection law or clear guidelines on where this technology can be used, who can be put on a watch list, and how long the system will retain the sensitive personal data of surveilled people, NAFRS will indeed do more harm than good.


Impact on rights

From a technical angle, facial recognition technology can be tasked to ‘identify’ a person, among other use cases. In doing so, one faceprint is compared to many other faceprints stored in a database (known as 1:N matching). In some cases, it is known that the person to be identified exists in the database; in other scenarios, it is not (e.g., when persons are checked against watch lists). This is where its deployment becomes hugely worrisome. Given the element of error and bias, facial recognition can result in the profiling of groups over-represented in the criminal justice system (such as Dalits and minorities).
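The 1:N, open-set scenario described above can be sketched as follows. Everything here is invented for illustration (the watch-list names, the three-number faceprints, the threshold); the point is only to show how a person absent from the database can still be ‘matched’ against it.

```python
import math

# A toy watch list: names and faceprint vectors are entirely hypothetical.
watch_list = {
    "suspect_A": [0.9, 0.1, 0.3],
    "suspect_B": [0.2, 0.7, 0.6],
    "suspect_C": [0.5, 0.5, 0.5],
}

def similarity(a, b):
    # Cosine similarity between two faceprints (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def identify(probe, database, threshold=0.98):
    """1:N search: return the best-scoring entry above the threshold, else None.

    In open-set identification the probe may belong to NO ONE in the
    database; a bystander who merely scores above the threshold becomes
    a false positive against the watch list.
    """
    best_name, best_score = None, 0.0
    for name, faceprint in database.items():
        s = similarity(probe, faceprint)
        if s > best_score:
            best_name, best_score = name, s
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score

# A passer-by who is on no list, but whose faceprint happens to sit close
# to one stored entry -- the system still flags them.
bystander = [0.52, 0.48, 0.50]
match, score = identify(bystander, watch_list)
print(f"flagged as: {match} (score {score:.3f})")
```

Running the sketch flags the bystander as a watch-list entry, which is precisely the wrongful-identification risk the article describes: the algorithm always returns its best score, and nothing in the mathematics knows whether the person was ever in the database at all.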

Further, as anonymity is key to the functioning of a liberal democracy, unregulated use of facial recognition technology will disincentivise independent journalism, the right to assemble peaceably without arms, and other forms of civil society activism. Due to its adverse impact on civil liberties, some countries have been cautious with the use of facial recognition technology. The Court of Appeal in the United Kingdom ruled the use of facial recognition technology by South Wales Police unlawful in the absence of clear guidelines. In the United States, the Facial Recognition and Biometric Technology Moratorium Act of 2020 was introduced in the Senate to prohibit biometric surveillance without statutory authorisation. Similarly, privacy watchdogs in the European Union have called for a ban on facial recognition.


Unchecked pathway

At present, the Information Technology Act 2000, and the Rules framed thereunder, give the Central Government broad powers to infringe privacy in the name of the sovereignty, integrity or security of the state. The Personal Data Protection Bill 2019 is not much different: it gives the Central Government unchecked power for the purposes of surveillance, as it can exempt any agency of the Government from the application of the proposed law in the name of legitimate state interest.

Without adequate safeguards, such as penalties that are dissuasive and sufficiently deterrent, police personnel may use facial recognition technology routinely. In sum, even if facial recognition technology is needed to tackle modern-day criminality in India, without accountability and oversight it carries a strong potential for misuse and abuse. In the interest of civil liberties, and to save democracy from turning authoritarian, it is important to impose a moratorium on the use of facial recognition technology till we enact a strong and meaningful data protection law, in addition to statutory authorisation of NAFRS and guidelines for its deployment. If the Government has the will, it can get any law passed with great speed, just like the recently passed 20 Bills, including the OBC Bill and the three farm Bills.

Faizan Mustafa is Vice-Chancellor, National Academy of Legal Studies and Research (NALSAR) University of Law, Hyderabad. Utkarsh Leo is Assistant Professor at NALSAR, Hyderabad. The views expressed are personal


