
You won’t believe Facebook’s secret rules for removing posts

Over time, Facebook transformed from a place to simply post a profile of yourself into a site where you can share thoughts and ideas, too. Some of what we read from others is helpful and informative.

Much of what is posted is…not. As far as we’re concerned, it’s up to us to scroll past the stuff we do not appreciate or enjoy. Or, if we really want, we can unfriend or at least unfollow the people who are posting it all.

Those options have to do with what we are able to see. As you may have assumed and likely had confirmed during Mark Zuckerberg’s testimony in front of Congress, the site itself regulates what even makes it that far.

It used to be an open secret

That Facebook has policies about what is and is not acceptable on its platform comes as little surprise. As much freedom as we all have to post, the First Amendment does not apply to a private platform, so Facebook can make and enforce its own rules.

The concept was brought to the forefront when Texas Senator Ted Cruz grilled Zuckerberg about a perceived liberal bias, challenging the Facebook boss on a variety of instances where he felt conservative viewpoints were being silenced.

Zuckerberg defended his company’s policies, saying he makes it a point to ensure political bias does not factor into anything they do, but no doubt there are many who believe that is hardly the case. It’s really up to you.

Now, though, Facebook is trying to be a bit more transparent about what it calls the site’s “community standards.” In a blog post, the company explains that the standards are designed to be comprehensive and to apply worldwide to all types of content.

Facebook writes that the goal of the standards is to encourage expression in a safe environment, and that the policies are based on input from the community as well as from experts in technology and public safety.

With that in mind, there are three guiding principles the standards follow.

Safety

Facebook wants everyone who uses it to feel safe on the platform. Therefore, they say they “are committed to removing content that encourages real-world harm.” That includes, but is not limited to, physical, financial and emotional injury.

So, content that encourages suicide or self-injury, child nudity or the sexual exploitation of children, the sexual exploitation of adults, bullying, harassment, privacy violations or infringements of image privacy rights is not allowed.

Voice

Facebook says the goal is to encourage a diverse set of views, going as far as to err on the side of allowing content that some may find to be quite objectionable. That is, “unless removing that content can prevent a specific harm.”

In keeping with this ideal, Facebook says it will sometimes allow content that would normally violate its standards if it is deemed newsworthy, significant or in the public interest.

However, hate speech, graphic violence, adult nudity and sexual activity, and cruel or insensitive content will not be accepted.

Equity

Facebook says its platform “transcends regions, cultures and languages,” which means the community standards can sometimes appear less nuanced than it would prefer. Because of that, there is an understanding that decisions will often be made based on the spirit of the standards, even if that means not following them to the letter.

What about fake news?

Facebook also wants to promote integrity and authenticity. Among the things they do not want on the site are false news, misrepresentation and spam.

For false news, specifically, Facebook writes that reducing the spread of it is “a responsibility that we take seriously,” while at the same time understanding it is a challenging and sensitive issue. The goal is to help people stay informed without stifling productive public discourse.

Because of that, as well as the fine line between false news and satire or opinion, Facebook says they do not remove news deemed to be wrong but rather aim to “significantly reduce its distribution by showing it lower in the News Feed.”

Facebook wants your help, too

Though it has policies in place, Facebook looks to its members to self-police, at least in part. It hopes people will be respectful when using the site, but also report anything they spot that may violate the standards.

To do that, you will need to click on the three horizontal dots in the upper right-hand corner of a post. That will open up a host of options, one of which is “Give feedback on this post.”

If you don’t want to report the post, you can choose to block, unfollow or hide people and/or posts, too. In other words, you do have a bit of control over what you will see on the site.

That doesn’t have anything to do with Facebook tracking us

As we are now learning, the Cambridge Analytica breach involved more than the estimated 87 million users impacted by the “This Is Your Digital Life” app. According to a whistleblower, that was just the tip of the iceberg.
