
Leaked! Facebook rules that allow death threats, pictures of animal torture and videos of self-harm


It wasn't too long ago that Facebook announced plans to hire 3,000 additional employees. They would join the 4,500 people already on the "community operations team," which is tasked with moderating Facebook content. The announcement was Facebook's response after the murder of 74-year-old Robert Godwin Sr. was streamed live on the social media platform.

When you do the math, that's a total of 7,500 people responsible for monitoring the posts of around 2 billion Facebook users, or roughly 266,666 users for each moderator.

We told you then that the numbers just didn't add up, and that these Facebook employees were being given an impossible job. But one thing we didn't know at the time was what Facebook deemed acceptable to begin with.

Now, more than 100 of Facebook's internal documents have been leaked online, revealing the company's moderation policies on things such as pornography, racism, hate speech, violence and even terrorism. These documents include spreadsheets, flowcharts and training manuals that are used to direct employees on exactly what should be censored.

What guidelines are employees given?

In a recent statement, Monika Bickert, Facebook's head of global policy management, acknowledged how difficult it is to determine which content should be permitted and which should be banned.

"We have a really diverse global community and people are going to have very different ideas about what is OK to share. No matter where you draw the line there are always going to be some [gray] areas," Bickert said. Yet, she also stated, "We feel responsible to our community to keep them safe and we feel very accountable. It's absolutely our responsibility to keep on top of it. It's a company commitment. We will continue to invest in proactively keeping the site safe, but we also want to empower people to report to us any content that breaches our standards."

But just how high are Facebook's standards? Well, not very. Take a look at the company's stance on content that would be considered offensive by many Facebook users.

  • Violent deaths: Videos containing a violent death do not always have to be removed, since they can bring awareness to topics like mental illness. Instead, employees are encouraged to mark videos showing the violent death of an adult as "disturbing." Deaths of children are treated somewhat differently, as the policy states, "we think minors need protection and adults need a choice." What's shocking is that, under these standards, Facebook would have allowed live videos of people jumping from the Twin Towers on 9/11 to be posted and shared.
  • Threats: Before removing a threat, employees must determine if it's actually credible. For example, a threat on the life of President Trump will be deleted, whereas a comment like, "someone please kill my noisy neighbor" will likely slide by.
  • Non-sexual abuse: Unless there is a "celebratory element," abusive acts do not have to be removed.
  • Animal abuse: Photos and videos of animal abuse can stay on the site and should be marked as "disturbing" only if the content is extremely upsetting.
  • Nudity: Artwork such as paintings and drawings may contain nudity and depictions of sex acts; digitally created media, however, cannot.
  • Abortions: A woman is allowed to live stream an abortion as long as no nudity is shown.
  • Self-harm: People attempting suicide or trying to hurt themselves in other ways will not be censored on Facebook, since the company's policy states not to "censor or punish people in distress."

A fine line

It's been obvious for quite some time that Facebook has struggled to adequately deal with controversial content shared on its platform, and these leaked documents reveal just how fine the line is between what's deemed appropriate and what's not.

Facebook moderators have also complained about the volume of work they're responsible for each day, with some even claiming they only have around 10 seconds to make a decision on the appropriateness of the content they're reviewing.

Governments are also getting involved. Earlier this year, officials in both Britain and Australia called Facebook's ability to monitor content into question and demanded that the company find a solution.

Still, we seem nowhere closer to a solution than we were before.

According to Bickert, these gray areas exist because Facebook is a "new kind of company." She explained that Facebook's primary role is to build the technology and take responsibility for how that technology is used. However, Facebook does not create the news that people read and share on the platform.

More problems to consider

While it's true that Facebook has expanded far beyond its original scope, should that be an excuse? In light of this news, experts are also pointing out that Facebook is monetizing its platform through ads that show up in users' News Feeds.

Sarah T. Roberts, a content moderation expert, explained: "It's one thing when you're a small online community with a group of people who share principles and values, but when you have a larger percentage of the world's population and say 'share yourself', you are going to be in quite a muddle. Then when you monetize that practice you are entering a disaster situation."

What should you do?

While Facebook tries to find a solution, there is a little trick you can use to improve what you see in your News Feed. By telling Facebook more of what you'd like to see, and choosing content from reliable sources like Komando.com, you'll reduce your chances of encountering inappropriate content. Click here for step-by-step instructions on telling Facebook which sites and pages you'd like to see more from.

More from Komando.com

Facebook surveillance oversteps boundaries

Get a Facebook privacy checkup

5 ways to lock down your Facebook account for maximum security

Source: The Guardian