Twitter to investigate racial bias in photo algorithm

Twitter has more analysis to do after its photo algorithm showed signs of racial bias. | Photo Credit: Reuters


Twitter said on Monday it would look into the company’s photo algorithm, which selects part of an image to be shown as a preview.

“We tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing,” Liz Kelly, a member of Twitter communications, tweeted. “But it’s clear from these examples that we’ve got more analysis to do.”

Several users voiced their concerns after noticing that the machine learning (ML) model picked the white person when a pair of images of a Black individual and a white individual was displayed in the photo preview on the timeline.

Tony Arcieri, a former systems engineer at Square, showed the algorithm’s bias with three pairs of photos of the former US President Barack Obama and US Senator Mitch McConnell.

When the first pair of pictures was posted, the ML model highlighted McConnell’s face over Obama’s. In the next pair, Arcieri swapped Obama’s blue tie for McConnell’s red one. For this input, the algorithm brought up Obama’s face over McConnell’s.

"It's the red tie! Clearly the algorithm has a preference for red ties!" Arcieri tweeted.

In the final pair, the images were colour-inverted to produce negative versions. This time, the photo algorithm highlighted Obama’s negative image.

Twitter’s CTO Parag Agrawal tweeted that their team did an analysis before shipping the model, “but [it] needs continuous improvement. Love this public, open and rigorous test, and eager to learn from this.”

Twitter’s CDO Dantley Davis also responded to a few users, describing the findings as interesting and saying the company would dig into other problems with the model.


Oct 31, 2020
