YouTube removes 11.4 million videos as automation replaces human moderators

YouTube removed more videos than ever in the last quarter. | Photo Credit: Reuters

YouTube took down over 11.4 million videos in the April-June quarter, compared with about 9 million videos in the same quarter last year. The increase in removals comes at a time when the video-streaming platform replaced human moderators with algorithms.

Most videos were removed in the US, followed by India and Brazil. YouTube withdrew a little over 2 million videos in the US, 1.4 million in India, and about a million in Brazil.

“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,” YouTube said in a blog post.

“Because responsibility is our top priority, we chose the latter—using technology to help with some of the work normally done by reviewers,” it added.

Each quarter, millions of videos are first flagged by automated systems and later reviewed by a human review team. However, the second quarter of 2020 was the first full quarter in which YouTube operated with automated systems largely without human review.

In a blog post on March 16, the company had said that, for the safety and well-being of its employees, it would start relying more on technology to reduce the workload of human reviewers. It warned that users and creators might see an uptick in the number of videos being removed, including videos that might not violate policies.

To minimise disruption for creators, it added more staff to handle requests and make sure that appeals were quickly reviewed. Both the number of appeals and the reinstatement rate doubled from the previous quarter: the share of videos reinstated on appeal increased from 25% in the first quarter to 50% in the second.

YouTube said that over-enforcement of policies out of caution resulted in a more than threefold increase in the removal of content tied to violent extremism or potentially harmful to children: dares, challenges, or innocently posted content that might endanger minors.

“For certain sensitive policy areas, such as violent extremism and child safety, we accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible,” YouTube said. This also means that, in these areas specifically, a higher volume of content that does not violate policies was removed, it added.
