YouTube algorithm recommends violent content, hate speech, says Mozilla

As per the data analysed, 71% of the videos that volunteers of the project reported as regrettable were actively recommended by YouTube’s algorithm.

Updated - July 10, 2021 05:16 pm IST

The 10-month long crowdsourced investigation, which Mozilla claims is the largest-ever, revealed that people in non-English speaking countries saw 60% higher chances of encountering disturbing videos.


New research conducted by Mozilla has found that YouTube's algorithm recommends videos containing misinformation, violent content, hate speech and scams.



The company conducted the research using RegretsReporter, an open-source browser extension through which people voluntarily donate their data, giving researchers access to a pool of YouTube's recommendation data.

According to the data analysed, 71% of the videos that volunteers reported as regrettable were actively recommended by YouTube's algorithm.


YouTube has since removed about 200 of these videos, which had amassed a collective 160 million views before they were taken offline.

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” Brandi Geurkink, Mozilla’s Senior Manager of Advocacy, said in a statement.

“Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies.”

Stressing the point, Geurkink noted that one person who watched videos about the U.S. military was then recommended a misogynistic video titled “Man humiliates feminist in viral video,” while another person who watched a video on software rights was recommended a video about gun rights.

In another instance, a person who watched an Art Garfunkel music video was recommended a highly sensationalised political video titled “Trump Debate Moderator EXPOSED as having Deep Democrat Ties, Media Bias Reaches BREAKING Point.”


Mozilla found that recommended videos were 40% more likely to be regretted than videos volunteers searched for. Moreover, in 43.6% of cases where Mozilla had data about the videos a volunteer watched before a regret, the recommendation was completely unrelated to those previous videos.

To address these issues, Geurkink suggests that common-sense transparency laws, better oversight, and consumer pressure can help rein in the algorithm. Mozilla's other recommendations include publishing frequent and thorough transparency reports that include information about recommendation algorithms, giving people the option to opt out of personalised recommendations, and enacting laws that mandate AI system transparency and protect independent researchers.

