How facial recognition technology is redefining biological data

The solution: FR should be limited to one's own device, not held on any server or cloud

‘Is this you?’ The Facebook algorithm ventured to ask my friend the other day, once again showing her a photo of her sister, elder to her by five years — the duo look startlingly similar, but not identical. The algorithm’s constant confusion between the two sisters is not just a source of mirth for my friend, but also indicative of a deeper concern: how smart is facial recognition and how much power does this technology deserve?

Most recently, the massive appeal of Wireless Lab's FaceApp, an application that showed strangely accurate representations of how you might look 50 years down the line, revived the debate around facial recognition. "Billions of people are handing over their facial data to one company!" "Pshhh, they have all that data already, so what's the point now!" are the opposing points of view thrown around on either side of the privacy debate.

But while India’s netizens were discussing this, the National Crime Records Bureau proposed that a centralised Automated Facial Recognition System be created. It claims that this will help the police keep a repository of data on suspects, captured from, among other sources, CCTV footage. Even more recently, Delhi and Hyderabad airports have launched face-recognition systems for granting entry.

Automated Facial Recognition System and citizens' privacy

The anatomy of it all

What Facial Recognition (FR) technology essentially does is to “read” the geometry of your face and its biometrics: the distance between your eyes, between your forehead and chin, between your ears. This facial data can then be used to match with an image already available in the repository of the agency in question, in order to identify a person.
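The matching step described above can be sketched in a few lines of Python. This is a toy illustration only, assuming facial geometry has already been reduced to a small numeric vector; the measurements, names and distance threshold below are all invented for the example, and real systems use far richer features extracted by neural networks.

```python
import numpy as np

def match_face(probe, gallery, threshold=0.6):
    """Return the gallery identity whose stored vector is closest to the
    probe vector, or None if no stored face is within the threshold."""
    best_id, best_dist = None, float("inf")
    for identity, stored in gallery.items():
        # Euclidean distance between the two "facial geometry" vectors
        dist = float(np.linalg.norm(probe - stored))
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= threshold else None

# Hypothetical normalised measurements: (eye distance, forehead-chin, ear-to-ear)
gallery = {
    "friend": np.array([0.42, 0.61, 0.55]),
    "sister": np.array([0.44, 0.60, 0.56]),
}
probe = np.array([0.43, 0.61, 0.55])
print(match_face(probe, gallery))  # prints "friend"
```

Note how close the two stored vectors are: when two faces (such as the sisters in the anecdote above) produce nearly identical measurements, a small change in the probe can flip the answer, which is one intuition for why such systems misidentify similar-looking people.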

So why be concerned? First off, there is no guarantee that FR works accurately. American cities such as San Francisco, the heart of technological development, Somerville and Oakland have banned government use of FR technology due to its inaccuracies. There have also been reports that misidentification rates were higher for minorities than for Caucasian communities.

“The Metropolitan police in London — a city with a high density of CCTV cameras — commissioned an independent study of FR, and found that the software was wrong four out of five times,” says Apar Gupta, Executive Director at Internet Freedom Foundation, Delhi.

“But even considering FR works well, it is the turning of public spaces into policed zones of conduct that is worrying. It puts people under perpetual watch, resulting in possible moral and social policing,” he says, quoting examples of young couples and minorities.

In 2017, the Supreme Court, in a historic judgement, declared privacy to be a fundamental right. However, the right can be overridden for a reasonable public purpose, such as safety. “Under the guise of women’s safety, women may be more policed in a paternal sense, where their liberties are restricted,” he says.

Moreover, as things stand right now, we don’t even have a legal framework to rely on in case our facial data is misused. There is no surveillance oversight by the judiciary, nor is there a data protection authority to ensure that facial information gathered is not put to other uses. (This column previously discussed deepfakes, or face swapping, a technique assisted by FR.)

But what about convenience?

“We call it the privacy paradox,” says Apar. People don’t mind sharing their data as long as it makes life convenient for them — unlocking phones, saving browsing history and passwords… But for that data to be used by other parties? How much consent has been given is not clear.

People can consent to their facial data being gathered, but that consent does not extend to the data being sold or used without their knowledge and agreement. “The only framework in place currently is the privacy policy, which is too clunky, verbose and inaccessible for the majority,” he points out. Instead, he feels, facial recognition should be limited to one’s own device, and not held on any server or cloud. So when agreeing to share facial data, ask yourself this: would you share your signature or fingerprint with just anyone? And then, use the same logic.





Byte-sized play-by-plays of tech concepts