
Oversight Board overturns Facebook’s decision to remove a post related to ‘genocide of Sikhs’

Representational Image

The Oversight Board, an independent body set up by Facebook, on Thursday said it has overturned the social network’s decision to remove a November 2020 post alleging that the RSS and PM Narendra Modi were threatening to kill Sikhs in India.

“The Oversight Board has overturned Facebook’s original decision to remove a post about India’s Sikh community under its rules on dangerous individuals and organisations,” the Board said, while expressing concern that Facebook did not review the user’s appeal against its original decision.

“The Board also urged the company to take action to avoid mistakes which silence the voices of religious minorities,” it said.

The case relates to a November 2020 post in which a user shared a video post from online media company Global Punjab TV featuring a 17-minute interview with “a social activist and supporter of the Punjabi culture”, Professor Manjit Singh. In the text accompanying the post, the user claimed the RSS was threatening to kill Sikhs and to repeat the “deadly saga” of 1984. The user also alleged that Prime Minister Modi himself was formulating the threat of a “Genocide of the Sikhs” on the advice of the RSS chief, Mohan Bhagwat.

The post was viewed fewer than 500 times and, after a single report, was taken down by a human reviewer for violating Facebook’s Community Standard on dangerous individuals and organisations.

“This triggered an automatic restriction on the user’s account. Facebook told the user that they could not review their appeal of the removal because of a temporary reduction in review capacity due to COVID-19,” the Board said.

After the user submitted their appeal to the Board, Facebook identified the removal of this post as an enforcement error and restored the content. “Facebook noted that none of the groups or individuals mentioned in the content are designated as ‘dangerous’ under its rules. The company also could not identify the specific words in the post which led to it being removed in error,” it said.

The Board said it found that Facebook’s original decision to remove the post was not consistent with the company’s Community Standards or its human rights responsibilities.

“The Board noted that the post highlighted the concerns of minority and opposition voices in India that are allegedly being discriminated against by the government. It is particularly important that Facebook takes steps to avoid mistakes which silence such voices,” it said.

While recognising the unique circumstances of COVID-19, the Board stated that Facebook did not give adequate time or attention to reviewing this content. “It stressed that users should be able to appeal cases to Facebook before they come to the Board and urged the company to prioritize restoring this capacity,” it said.

The Board also noted that Facebook’s transparency reporting makes it difficult to assess whether enforcement of the Dangerous Individuals and Organisations policy has a particular impact on minority language speakers or religious minorities in India.

In a policy advisory statement, the Board recommended that Facebook translate its Community Standards and Internal Implementation Standards into Punjabi, and that it aim to make its Community Standards accessible in all languages widely spoken by its users.

Further, the Board said Facebook should restore both human review of content moderation decisions and access to a human appeals process to pre-pandemic levels as soon as possible, while protecting the health of its staff and contractors.

“Increase public information on error rates by making this viewable by country and language for each Community Standard in its transparency reporting,” the board has recommended.


Printable version | May 20, 2022 2:30:35 am |