Facebook’s ‘Supreme Court’ tackles nudity, Nazi quotes, and Covid misinformation in first cases

Facebook’s Oversight Board, which was established to review the social media giant’s moderation decisions, has accepted its first cases.

“More than 20,000 cases were referred to the Oversight Board following the opening of user appeals in October 2020”, the board said in its announcement.

“As the Board cannot hear every appeal, we are prioritising cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”

The six appeals, five of which were referred by users, focus on the company’s policies on hate speech, adult nudity, and dangerous individuals and organisations.

This includes images of dead children posted alongside criticism of China for its treatment of Uyghur Muslims, an image posted on Instagram of female breasts to raise awareness of the signs of breast cancer, and a quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, that was used to criticise the Trump administration.

Each of the cases will be assigned to a five-member panel that includes at least one person from the region where the content originated. The board will deliberate on the case, and Facebook will act on its decision within 90 days.

The case that Facebook submitted to the board was a video criticising French health officials for not authorising hydroxychloroquine as a cure for the coronavirus, which was viewed 50,000 times and shared 1,000 times.

Facebook removed the video for violating its policy on violence and incitement, and referred it to the board as “an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.”

Board members serve terms of no more than three years; the board currently includes journalists, federal judges, law professors, and the former Prime Minister of Denmark, Helle Thorning-Schmidt.

The development of Facebook’s Oversight Board comes as the company has been repeatedly criticised for its moderation policies.

In a memo published in September 2020, former Facebook data scientist Sophie Zhang described the scale of the problem. “In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” wrote Zhang. “I know that I have blood on my hands by now.”

Another Facebook engineer previously resigned, claiming that the company was “profiting off hate in the US and globally” because of inaction against violent hate groups and far-right militias using Facebook to recruit members.

“We don’t benefit from hate,” a Facebook spokesperson told The Independent in a statement at the time.

“We invest billions of dollars each year to keep our community safe and are in deep partnership with outside experts to review and update our policies. This summer we launched an industry leading policy to go after QAnon, grew our fact-checking program, and removed millions of posts tied to hate organizations — over 96 per cent of which we found before anyone reported them to us.”
