Facebook Oversight Board to Hear First Appeal Cases

Michael Behr

The 20-strong committee will decide whether removed content should be restored to the platform in six cases that could shape Facebook's moderation policy.

The Facebook Oversight Board, an appeals body for content removed from the social media platform, has chosen its first batch of cases to review.

Six cases were selected from over 20,000 pieces of removed content referred to the board since appeals opened in October. They comprise five appeals from users and one case referred by Facebook itself.

These were chosen based on whether the cases have “the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies,” according to the board.

Of the cases, Facebook removed three for violating its hate speech policy, one for violating its adult nudity and sexual activity policy, and one for violating its policy on dangerous individuals and organisations.

The sixth case, referred to the board by Facebook, concerns content removed for violating the company's violence and incitement policy. The content had been shared within a Covid-19-related group.

Facebook referred the case to demonstrate to the board some of the challenges the platform faces over the risk of offline harm due to misinformation about the Covid-19 pandemic.

The removals were appealed for various reasons. In two cases, the users argued that they had posted the content only in order to criticise it. The post that violated the nudity policy was shared to raise awareness of breast cancer, according to the appeal.

Five of the six cases involve non-English content. Facebook has previously struggled to moderate content published in different languages: its moderation team speaks around 50 languages, fewer than half of the 100 that the platform officially supports.

To help address this, at least one member of the five-person panel assigned to each case will come from the region where the content originated. A decision should be made and implemented within 90 days.

“Once the Board has reached a decision on these cases, Facebook will be required to implement our decisions, as well as publicly respond to any additional policy recommendations that the board makes,” a statement from the committee reads.

The decisions will then be used to guide Facebook policy when dealing with similar cases.

The 20-person body is an independent structure designed as a final court of appeal on content acceptability for Facebook. Its members come from a range of backgrounds and include academics, journalists, politicians, and charity workers.

Facebook Director of Governance and Global Affairs Brent Harris noted that the board will include members with controversial views, saying the company chose “people who sit on this board who make a set of people uncomfortable”.

The Facebook Oversight Board was established in late 2019 to deal with issues surrounding content moderation.

Big tech companies and their content moderation policies have been under the spotlight recently, especially over their role in the recent US presidential election. They have faced accusations both of censoring conservative opinions and of allowing misinformation to spread on their platforms.

In response, the companies have been implementing policies to restore faith in their platforms. For example, Facebook has created centres to combat the spread of misinformation about climate change and Covid-19.

Michael Behr

Senior Staff Writer
