By PTI
NEW DELHI: An independent board set up by Facebook has overturned the social media giant’s decision to remove a user’s post under its rules on ‘Dangerous Individuals and Organizations’, and urged the platform to “avoid mistakes” that “silence the voices of religious minorities”.
In February, the Oversight Board took up the case of a user’s post that was shared from a Punjabi-language online platform and contained insinuations against the Rashtriya Swayamsevak Sangh (RSS) and Prime Minister Narendra Modi.
While the post was initially taken down for violating Facebook’s Community Standards, the social media company later restored the content.
The Oversight Board is an independent body set up by Facebook last year to look into hate speech and other undesirable content on the platform.
“The Oversight Board has overturned Facebook’s decision to remove a post under its Dangerous Individuals and Organizations Community Standard. The Board expressed concerns that Facebook did not review the user’s appeal against its original decision,” a statement said.
The Board also urged the company to “take action to avoid mistakes which silence the voices of religious minorities”.
The Board found that Facebook’s original decision to remove the post was not consistent with the company’s Community Standards or its human rights responsibilities, it said.
The Board noted that the post highlighted the concerns of minority and opposition voices in India that are allegedly being discriminated against by the government, it added.
“It is particularly important that Facebook takes steps to avoid mistakes which silence such voices. While recognizing the unique circumstances of COVID-19, the Board argued that Facebook did not give adequate time or attention to reviewing this content,” the statement said.
The Board also said that Facebook’s transparency reporting makes it difficult to assess whether enforcement of the ‘Dangerous Individuals and Organizations’ policy has a particular impact on minority language speakers or religious minorities in India.
In a policy advisory statement, the Board recommended that Facebook translate its Community Standards and Internal Implementation Standards into Punjabi, and that the platform also aim to make its Community Standards accessible in all languages widely spoken by its users.
It also suggested that Facebook restore both human review of content moderation decisions and access to a human appeals process to pre-pandemic levels as soon as possible while protecting the health of Facebook’s staff and contractors.
The Board was also of the view that Facebook should increase public information on error rates by making these figures viewable by country and language for each Community Standard in its transparency reporting.