Facebook moderators press for pandemic safety protections

A petition signed by the contract workers living in various countries said Facebook should guarantee better conditions or allow the workers to continue their jobs from home.

November 19, 2020 11:06 am | Updated 11:56 am IST


More than 200 Facebook content moderators demanded better health and safety protections Wednesday as the social media giant called the workers back to the office during the pandemic.

A petition signed by the contract workers living in various countries said Facebook should guarantee better conditions or allow the workers to continue their jobs from home.

“After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office,” said the open letter released by the British-based legal activist firm Foxglove.

The letter called on Facebook to “keep moderators and their families safe” by maintaining remote work as much as possible and offering “hazard pay” to those who do come into the office.

When the pandemic hit, Facebook sent home most of its content moderators -- those responsible for filtering violent and hateful images as well as other content that violates platform rules.

But the social platform discovered limits on what remote employees could do and turned to automated systems using artificial intelligence, which had other shortcomings.

“We appreciate the valuable work content reviewers do and we prioritize their health and safety,” a Facebook spokesperson said in a statement to AFP.

“The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic,” the spokesperson said.

The workers' letter said the current environment highlights the need for human moderators.

“The AI wasn't up to the job. Important speech got swept into the maw of the Facebook filter -- and risky content, like self-harm, stayed up,” the letter said.

“The lesson is clear. Facebook's algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there.”

The petition said Facebook should consider making the moderators full employees -- a group that, in most cases, may continue working remotely through mid-2021.

“By outsourcing our jobs, Facebook implies that the 35,000 of us who work in moderation are somehow peripheral to social media,” the letter said, referring to a broader group of moderators that includes the 15,000 content reviewers.

“Yet we are so integral to Facebook's viability that we must risk our lives to come into work.”

