The Meta Oversight Board, an independent body overseeing Meta’s content policies, has issued a decision urging Meta to improve how it tracks Holocaust denial content and enforces its ban on such false information.
This comes in response to Meta’s handling of an Instagram post that spread false and distorted information about the Holocaust.
Oversight Board reverses Meta’s decision on Holocaust denial post on Instagram
The Oversight Board overturned Meta’s initial decision to leave the controversial Instagram post untouched.
The post, which featured a meme of the “SpongeBob SquarePants” character Squidward, contained false claims presented as “Fun Facts About The Holocaust.”
Among these claims were doubts about the number of Holocaust victims and the existence of crematoria at Auschwitz.
When the Oversight Board began reviewing the case, Instagram had allowed the post to remain on its platform.
Oversight Board urges Meta to enhance Holocaust denial monitoring
The decision to leave the post up raised concerns, especially considering that the content had originally been posted in September 2020, a month before Meta updated its hate speech guidelines to explicitly prohibit Holocaust denial.
In addition to overturning the decision on the content, the Oversight Board has recommended that Meta take proactive steps to improve its monitoring and enforcement of Holocaust denial content.
One of the key recommendations is the creation of a system to label enforcement data, aiding Meta in tracking Holocaust denial posts and the effectiveness of its enforcement measures.
During the case review, the board identified issues related to Meta’s COVID-19 automation policy, which had been implemented due to reduced human reviewer capacity at the start of the pandemic.
Meta’s content moderation challenged by Holocaust denial case review
The policy resulted in some reports of Holocaust denial posts being automatically closed, further complicating the content moderation process.
The Oversight Board has called on Meta to provide public confirmation regarding the status of its COVID-19 automation policy.
Questions have arisen about why a user’s appeal of Instagram’s decision to keep the post up, filed in May, was automatically closed even though the World Health Organization and the U.S. had already declared the end of the COVID-19 public health emergency.
The post at the center of the case had been reported six times for hate speech, with four reports made before the policy update to ban Holocaust denial.
Meta balances automation, human review following Oversight Board decision
Two reports resulted in human reviews, while others were processed through automation, leading to mixed outcomes.
Meta welcomed the Oversight Board’s decision, acknowledging that the content in question had already been removed.
The company also committed to reviewing similar content with parallel context and taking appropriate action where it is technically and operationally feasible to do so.
Meta Board calls for better moderation against Holocaust denial content
The Meta Oversight Board’s decision highlights the ongoing challenges social media platforms face in moderating sensitive and false content.
It emphasizes the need for stronger measures to combat Holocaust denial and improve content enforcement mechanisms, particularly in the context of evolving policies and automated systems.