Facial recognition technology — is India ready for it?

The technology is admittedly fallible and presents the potential for grave abuse, extending to human rights violations. Its deployment for security and law enforcement should come with strong safeguards against misuse.

September 04, 2019 05:12 pm | Updated 05:18 pm IST

Imagine a human being fooled by a fake moustache or a pair of glasses. The scary thing is that artificial intelligence can be. Now imagine this fallible AI in charge of decisions that affect human rights.

Now that The Great Hack has been out for a while, it is time to ask: if someone told you that a digital representation of your face could be used to gain access to your phone without your consent, or, even worse, as evidence against you in a criminal case, would you believe it? Reporters for Forbes in the United States were able to break into phones using 3D-printed models of their own faces. A university student in Baltimore faced death threats after Sri Lankan authorities, relying on facial recognition technology, incorrectly identified her as a suspect in the recent terror bombings.

Facial recognition technologies are slowly becoming ubiquitous in India. The Maharashtra government recently deployed the technology in Mumbai, where it will be integrated with the city's 10,000-strong network of CCTV cameras. Police in Delhi, Amritsar, and Surat have been using facial recognition since as early as mid-2018. GMR Hyderabad International Airport recently introduced it at its passenger entry points to facilitate paperless travel.

However, this technology, which the police propose to use to identify and prosecute suspects, is far from perfect. It works by sourcing images of a face and converting them into a digital representation that is then matched against an existing database of faces. These images could be sourced from CCTV cameras, driver's licences, news reports, and even social media. Any face that is scanned for matching against the existing database could itself be added to it. If you are a fan of Game of Thrones, think of it as a virtual hall of faces. Consider this: as of 2016, U.S. law enforcement agencies had a facial database covering almost half the country's population. Surely, half the population of the U.S. are not criminals. But given how easy it is to be added to the database, it is not surprising that, as of 2017, 80% of the entries in it were non-criminal. It is not difficult to see how this technology could easily be set up for abuse.
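To make the matching step concrete, here is a minimal sketch in Python of how such a pipeline could work. The function names (embed_face, best_match), the embedding size, and the similarity threshold are illustrative assumptions, not details of any system actually deployed in India; a real system would derive the embedding from a trained neural network rather than the placeholder used here.

```python
# Illustrative sketch only: a face image is reduced to a fixed-length
# numerical "embedding", which is then compared against a database of
# stored embeddings. embed_face() below is a stand-in; deployed systems
# use a trained deep neural network to produce the embedding.

import numpy as np

EMBEDDING_SIZE = 128      # typical length of a face embedding vector (assumed)
MATCH_THRESHOLD = 0.6     # similarity above this counts as a "match" (assumed)

def embed_face(image: np.ndarray) -> np.ndarray:
    """Placeholder: map an image to a unit-length embedding vector."""
    rng = np.random.default_rng(int(image.sum()))   # toy, deterministic "model"
    vec = rng.standard_normal(EMBEDDING_SIZE)
    return vec / np.linalg.norm(vec)

def best_match(probe: np.ndarray, database: dict) -> tuple:
    """Score the probe embedding against every stored embedding with
    cosine similarity and return the closest identity, if any."""
    best_name, best_score = None, -1.0
    for name, stored in database.items():
        score = float(np.dot(probe, stored))  # cosine similarity (unit vectors)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= MATCH_THRESHOLD:
        return best_name, best_score
    return None, best_score   # no confident match

# Example: enrol two faces, then probe with a frame from a CCTV feed.
database = {
    "person_A": embed_face(np.zeros((64, 64), dtype=np.uint8)),
    "person_B": embed_face(np.ones((64, 64), dtype=np.uint8)),
}
cctv_frame = np.zeros((64, 64), dtype=np.uint8)
identity, score = best_match(embed_face(cctv_frame), database)
print(identity, round(score, 3))
```

The point of the sketch is that every probe image is reduced to a score against every stored face, and a "match" is simply whichever score clears a chosen threshold. Set that threshold loosely, or train the model on unrepresentative data, and misidentifications of the kind described above follow.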

There are ample reasons to doubt the accuracy of the technology. Research conducted by the Massachusetts Institute of Technology has revealed that facial recognition algorithms consistently misidentify faces. In one case, the technology misclassified darker-skinned women as men 30% of the time. Would you use a barcode scanner if it worked correctly only 70% of the time? In addition to inaccuracy, the technology also suffers from significant biases. Facial recognition relies on Artificial Intelligence, and the bias inherent in the use and deployment of AI remains a major unsolved problem. In India, this would predominantly affect Dalits, Muslims and tribals, who already comprise 55% of undertrials despite being only 39% of the population. These biases may only get magnified with the use of AI and facial recognition.

Consider the impact the use of this fallible and biased technology could have on your rights. A visual surveillance system that can identify you from a database of faces can easily track your movement through the city, whom you meet, where you assemble, and quite possibly even what you say. All of this directly implicates your personal liberty, freedom of expression, right to assemble peacefully, right to move freely throughout the country, and even your privacy. Just envision a situation where you are incorrectly identified as a suspect in a crime, the police manage to convince a magistrate to issue a warrant for your arrest, and you find yourself in a jail cell wondering how this even came to be.


Companies that have developed such systems have also called for regulation of the technology. For example, Microsoft recognises that apart from violating people's privacy, large-scale use of facial recognition by governments could encroach upon democratic freedoms. The company also says that certain uses of the technology increase the risk of biased outcomes that could violate laws prohibiting discrimination. San Francisco and Oakland, two major cities in the San Francisco Bay Area, have already banned the use of this technology by the police.

So what safeguards do you have against the government's use of this invasive technology? Pretty much none so far. There is no data protection law to regulate the government's use of your data. There appears to be no official policy or document that guides the use of this technology. There is no transparency regarding the data that has been used to train it. There seems to be no system of accountability for its abuse or misuse by law-enforcement authorities. And there is no clarity on whether an accused person would be given access to their data so that they could refute the evidence in court.

Facial recognition technology may show promise for policing and security purposes. But it is imperative that we approach its use with appropriate caution and due regard for constitutional rights. Before this highly invasive technology is allowed, it should be put through strict Parliamentary scrutiny and oversight, in addition to the stringent privacy safeguards proposed as part of the data protection bill. Deployment of facial recognition tech by the government must be preceded and accompanied by a strict and transparent governance and accountability framework.
