While Shalini Kantayya was working on Coded Bias for two years, she would be asked at social gatherings what her projet du jour was. The filmmaker often found it hard to explain the complexities of controversially biased, AI-powered technologies, so she would simply say, ‘I’m working on a film about racist robots’.
Over a video call with MetroPlus from her home in Brooklyn, New York, Kantayya laughs as she recalls these moments. Switching to a more serious, urgent tone, she says, “We are living in an era where it is absolutely necessary for the public to understand complex science and how these systems work. Data rights are the unfinished business of the civil rights movement.”
- Coded Bias premiered at the Sundance Film Festival in January 2020, where it was nominated for the ‘US Documentary Grand Jury Prize’, and went on to earn nominations at numerous other film festivals. This year it won ‘Best Director’ and the ‘Grand Jury Prize for Transparency’ at the Social Impact Media Awards, and the ‘Prize of OMCT (World Organisation Against Torture)’ at the International Film Festival and Forum on Human Rights.
- Coded Bias is not Kantayya’s first tech-focused documentary. Also an environmental activist, she has directed projects about big tech and clean energy: Catching The Sun (2015) and Breakthrough (2016), a National Geographic series.
Problems galore
Coded Bias, currently streaming on Netflix, is making waves for its narrative around the troubling intersections of AI, facial recognition and racial bias, and it is packed with compelling twists about the future of the surveillance state. Kantayya lets researchers Joy Buolamwini, Deborah Raji, Meredith Broussard, Cathy O’Neil, Zeynep Tufekci, Safiya Noble, Timnit Gebru, and Virginia Eubanks, all women from minority communities, explain the concepts.
Kantayya says working with these academics has been “incredibly humbling”. Her driving force for making the film appears in the opening scenes of Coded Bias, where Buolamwini, then a young grad student, tries to get a camera to recognise her face. The software detects nothing until she holds up a blank white mask, which it registers instantly.
Large parts of the film were shot in the United Kingdom, where facial recognition is openly used by the government. Civil liberties group Big Brother Watch features in Coded Bias, with its director Silkie Carlo speaking out against privacy breaches and the racial profiling that has proved contentious for thousands of immigrant families. In one scene, a Black 14-year-old, flagged by a facial recognition system, is flanked by authorities and fingerprinted. Big Brother Watch intervenes, and the authorities, in turn, justify the flawed system. “As many times as I edited and watched this scene over, I never got over it,” Kantayya says, shaking her head. “It could have resulted in a fatality, and it was never explained to the child why they were stopped, but the child is just so calm.”
The filmmaker, whose family comes from Madurai, is well aware of the ‘dark skin obsession’ across offline spaces. “I realised this is not a technology being beta tested on a shelf somewhere in a laboratory. This was tech being sold to the FBI and immigration officials, and being deployed by law enforcement departments across the US with no oversight from anyone we had elected.”
She, like millions of others, is aghast at this lapse in government oversight. “Law enforcement bodies all over the world are picking up the tools of authoritarian states, with no democratic rules in place to protect our civil rights. It is frightening to me that, as we put our trust in these systems, we could roll back the civil rights that help make society more equal.”
Making it accessible
Kantayya is aware she is not a technologist — in fact, this helped her. “Anyone challenging the system has felt like an imposter thinking ‘I didn’t go to MIT or Harvard, who am I to talk about these issues?’ When it comes to technologies such as AI or facial recognition, all of the knowledge and power is in the hands of a few. These technologies dovetail with almost every freedom we enjoy in democracies.”
In the film, mathematician O’Neil points out that this power sits firmly on one side, which makes it hard for the masses to question these technologies and nearly impossible to know when, or if, you are under surveillance. Kantayya cites how China’s law enforcement has “unfettered access” to facial recognition systems that help officials track down members of religious minorities. “If you look at India, there is a long history of social movements and of people’s participation in the democratic process,” she remarks.
Through Coded Bias, she learned how there is no power like that of big tech. “Three years ago I didn’t even know what an algorithm was,” she admits. “Everything I knew about AI came through the imagination of Steven Spielberg or Stanley Kubrick; I think my ‘street cred’ comes from being a sci-fi fanatic. I didn’t understand what facial recognition really was, how algorithms work or how Machine Learning, AIs and algorithms were gatekeepers of opportunity until I discovered Joy’s work. They were gatekeepers in that they decide who gets hired, who gets healthcare and who gets scrutiny.”
For research, Kantayya had also watched Buolamwini’s 2016 TEDx talk ‘How I’m Fighting Bias in Algorithms’ and understood how human beings are actively outsourcing decision-making to machines, “in ways that really shift human destinies.”
Buolamwini’s TEDx talk also shed light on how populations place implicit trust in these enigmatic technologies. “They have not been vetted for gender bias or racial bias, for whether or not they can cause harm to people, or even for some shared standard of accuracy outside of a company that stands to benefit economically.”
The director particularly enjoyed condensing complex science into two-minute sound bites for the layperson, while keeping it all visually stimulating. Kantayya drew on the visual language of science fiction, which is what she knows best. “It was important that the science had integrity, distilled in a way that was easily accessible and digestible for audiences while holding relevance for their lives, instead of remaining abstract.”
She concludes that Coded Bias reveals that where there is more education and dialogue around these technologies, policy follows quickly. She hopes audiences watch Coded Bias not just to understand these technologies better but also to hold big tech companies and governments accountable. “I hope this is the movie that pulls a chair out for all of us and gives us a seat at the table, because these systems influence all of our lives and opportunities.” The film reminds people that, at the end of the day, human beings create these technologies and, in turn, the technologies perpetuate their creators’ biases.