How facial recognition technology is redefining biological data

A breakdown of how facial recognition works, how much power we should allow it to hold, and the safety versus privacy debate

July 30, 2019 12:57 pm | Updated 07:34 pm IST

The solution: FR should be limited to one’s own device, and not up on any server or cloud


‘Is this you?’ The Facebook algorithm ventured to ask my friend the other day, once again showing her a photo of her sister, elder to her by five years — the duo look startlingly similar, but not identical. The algorithm’s constant confusion between the two sisters is not just a source of mirth for my friend, but also indicative of a deeper concern: how smart is facial recognition and how much power does this technology deserve?

Most recently, the massive appeal of Wireless Lab’s FaceApp, an application that showed strangely accurate representations of how you might look 50 years down the line, raised the debate around facial recognition again. “Billions of people are handing over their facial data to one company!” “Pshhh, they have all that data already, so what’s the point now!” are the opposing points of view thrown around across the privacy debate wall.

But while India’s netizens were discussing this, the National Crime Records Bureau proposed that a centralised Automated Facial Recognition System be created. It claims that this will help the police maintain a repository of data on suspects, captured from, among other sources, CCTV footage. Even more recently, Delhi and Hyderabad airports have launched face-recognition systems for granting entry.


The anatomy of it all

What Facial Recognition (FR) technology essentially does is to “read” the geometry of your face and its biometrics: the distance between your eyes, between your forehead and chin, between your ears. This facial data can then be used to match with an image already available in the repository of the agency in question, in order to identify a person.
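The matching step described above can be sketched in a few lines of code. The sketch below is purely illustrative, not how any production FR system is implemented: it assumes a face has already been reduced to a short vector of made-up, normalised measurements (a "faceprint"), and matches a probe against a toy repository by Euclidean distance, with a hypothetical threshold deciding whether any record counts as a match.

```python
import math

def euclidean_distance(a, b):
    """Distance between two facial-feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(probe, repository, threshold=0.6):
    """Return the identity whose stored faceprint is closest to the
    probe, provided the distance falls under the threshold."""
    best_name, best_dist = None, float("inf")
    for name, faceprint in repository.items():
        d = euclidean_distance(probe, faceprint)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist < threshold else None

# Toy repository: invented measurements standing in for eye distance,
# forehead-to-chin length and ear-to-ear width, already normalised.
repository = {
    "alice": [0.42, 0.91, 0.55],
    "bob":   [0.38, 0.74, 0.61],
}

print(match([0.41, 0.90, 0.56], repository))  # close to alice's record
print(match([0.10, 0.20, 0.30], repository))  # no record close enough: None
```

The threshold is where the trouble discussed below begins: set it loose and the system confuses similar faces (as with the two sisters); set it tight and it fails to recognise people at all.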

So why be concerned? First, there is no guarantee that FR works accurately. American cities such as San Francisco (the heart of technological development), Somerville and Oakland have banned government use of FR technology because of its inaccuracies. There have also been reports that misidentification rates for minorities were higher than those for Caucasian communities.

“The Metropolitan police in London — a city with a high density of CCTV cameras — commissioned an independent study of FR, and found that the software was wrong four out of five times,” says Apar Gupta, Executive Director at Internet Freedom Foundation, Delhi.

“But even considering FR works well, it is the turning of public spaces into policed zones of conduct that is worrying. It puts people under perpetual watch, resulting in possible moral and social policing,” he says, quoting examples of young couples and minorities.

In 2017, the Supreme Court, in a historic judgement, declared privacy to be a fundamental right. However, the right can be overridden for a reasonable public purpose, such as safety. “Under the guise of women’s safety, women may be more policed in a paternal sense, where their liberties are restricted,” he says.

Moreover, as things stand right now, we don’t even have a legal framework to rely on if our facial data is misused. There is no surveillance oversight by the judiciary, nor a data protection authority to ensure that facial information gathered is not put to other uses. (This column previously discussed deepfakes (face swapping), something that is assisted by FR.)

But what about convenience?

“We call it the privacy paradox,” says Apar. People don’t mind sharing their data as long as it makes life convenient for them: unlocking phones, saving browsing history and passwords. But when that data is used by other parties, how much consent has been given is not clear.

People can consent to their facial data being gathered, but that consent doesn’t extend to being sold or used without their knowledge and agreement. “The only framework in place currently is through a privacy policy, which is too clunky, verbose and inaccessible for the majority,” he points out. Instead, he feels, facial recognition should be limited to one’s own device, and not up on any server or cloud. So when agreeing to share facial data, ask yourself this: would you share your signature or fingerprint with just anyone? And then, use the same logic.




