
Facebook’s regulations on violence, terrorism, discrimination and exploitation uncovered

Andrew Hamilton

Content moderation is becoming an ever greater issue for social media giants

Facebook’s internal regulations for content moderation have been released by The Guardian.

The information, published in a report titled The Facebook Files, covers more than 100 training and guidance manuals on content moderation and on deciding what content is acceptable. According to the documents, images relating to violence, terrorism, pornography, discrimination and self-harm are all candidates for moderation. The disclosure of these regulations has come at a time of intense socio-political pressure for Facebook, which is still reeling from the broadcast shooting of Robert Godwin Snr. and the more recent murder-suicide of a Thai father and his 11-year-old daughter.

The files reveal the difficulties that content moderators face, both in the images that confront them and in the decisions required to police them. Many of the images that are candidates for deletion are particularly distressing, ranging from cannibalism to child and animal abuse.

Reports are emerging of the effects that prolonged exposure to these images can have on moderators. In January, content managers in a similar role at Microsoft sued the company for exposing them to images of ‘indescribable sexual assaults’ and ‘horrible brutality’, claiming that facing the images resulted in post-traumatic stress disorder (PTSD).

Sarah Roberts, Assistant Professor of Information Studies at UCLA, told The Guardian: “People can be highly affected and desensitized. It’s not clear that Facebook is even aware of the long term outcomes, never mind tracking the mental health of the workers”.

Snap Decision Making

Making a decision on deletion is also extremely arduous. In policing content, moderators are instructed to look for ‘expressions’ of controversial themes from the poster in order to determine whether the images are grounds for removal. Because Facebook’s team of moderators is small (only around 4,500 strong, according to CEO Mark Zuckerberg), units apparently have ‘seconds’ to decide whether content should be removed.

On top of this, Facebook’s guidelines are accused of being confusing: the credibility of a threat, for example, rests largely on the specificity of the poster’s claim. Facebook’s Abuse Standards on Credible Violence draw a fine line between satirical or angry posts and actual intent to cause harm. Meanwhile, threats made against more vulnerable targets such as heads of state, legal witnesses and informants are apparently always deemed credible.

The slides detail: “Facebook is a global community where millions of people connect with each other. Each of these people represents unique opinions, ideals and cultural values. When millions of people get together to share things that are important to them, sometimes these discussions and posts include controversial topics and content.”

Censorship And Self-Harm

Self-harm imagery is another problematic subject: Facebook’s policy is to allow footage while the person shown can still be helped, balanced against the risk of ‘copycat’ acts. Instances of publicised self-harm are on the rise, with around 5,000 reports flagged on Facebook every two weeks, heightening the risk of mimicry. In a recent policy update, moderators were told: “We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.

“However, because of the contagion risk, what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”

Regulation And Legislation

These topics are only one aspect of the full disclosures, which are sure to enrich the debate surrounding Facebook and regulation. Since the 2016 U.S. Presidential Election, the platform has been dogged by allegations of ‘fake news’ and violence. Facebook may now face even tighter regulation under proposals such as the Malicious Communications (Social Media) Bill, currently before the U.K. Parliament, which could afford investigatory powers to communications regulators such as Ofcom.

With a growing global audience of 2 billion monthly users, Facebook has established itself as an integral part of many people’s lives. How users choose to use the platform and what they share will continue to evolve and to challenge both the company and society. How the company responds to these challenges promises to be scrutinised by users, legislators, the media and free speech advocates alike.

Andrew Hamilton

PR & Content Executive at Hutchinson Networks
