
Why Parler was removed from Apple and Google’s app stores

A battle over online speech is raging. Social media app Parler is down and not expected to come back anytime soon. Before Amazon cut hosting for the Twitter alternative, it was removed from Apple and Google’s app stores. Aside from the app, you could also access Parler on the web. Its site is down, too.

You may be wondering how exactly this happened and why it was allowed. We’ll walk you through what happened and what it means.

First, a quick note: We’re not taking political sides here.

Knowledge is power, and we want you to understand what’s happening in the background from a tech perspective. Let’s jump in, starting with a quick primer on Parler.

What is Parler?

The Twitter-Facebook alternative launched in 2018, then exploded in November to become America’s No. 1 downloaded news app. Parler, French for “to talk,” was set up to be a nonbiased, free speech-driven platform. While it’s supposed to be pronounced “par-lay,” the world has settled on “par-lour.”

Functionally, Parler is almost identical to Twitter. Instead of tweets, you have “parlays.” Instead of retweets, you have “echoes.” You “upvote” a post to show you liked it.

While Parler was not established as a conservative-only platform, most users are right-wing. At no time during registration are you asked your political affiliation. You can read a more in-depth look at Parler and how it works here.

This is a very important fact to know.

Unlike Twitter, which regulates content shared on the platform using internal company policies, Parler bases its user guidelines on the FCC’s obscenity definitions. Strictly put, this asks whether a post “is sexual in nature,” whether it “is offensive,” and “whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.”

Parler relies on its users to report inappropriate content and posts that violate its guidelines. It doesn’t have any employees or third parties fact-checking posts. Moderation is strictly community-driven.

Late last week following the attack on the Capitol, Apple and Google said those policies are not enough and demanded Parler adopt a stricter moderation policy.

What do Apple and Google have to say?

From the beginning, CEO John Matze sold Parler as a “free speech” alternative to Facebook and Twitter — both of which have come under fire for their policies on what content is acceptable amid the pandemic and the presidential election.

After President Trump was banned from Twitter and Facebook, supporters flocked to Parler. It became the most downloaded free app on Apple’s App Store prior to its removal.

But as posts inciting more violence spread on Parler, Apple and Google called for moderation.

Google suspended Parler from the Play Store on Friday, saying its policies require apps that display user-generated content to have moderation policies and enforcement that remove “egregious content like posts that incite violence.” Google says it has reminded Parler of this policy in recent months.

We’re aware of continued posting in the Parler app that seeks to incite ongoing violence in the US. We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content.

Google’s statement, shared with The Verge

Similarly, Apple gave Parler 24 hours to remove content of a “dangerous and harmful nature” and provide “detailed information about how you intend to moderate and filter this content from your app, and what you will do to improve moderation and content filtering your service for this kind of objectionable content going forward.”

In an email obtained by BuzzFeed News, Apple said it received “numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property.”

Apple’s follow-up indicates Parler did propose some changes within that 24-hour window. Ultimately, Apple decided they were not enough.

You referenced that Parler has been taking this content “very seriously for weeks.” However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action.


By Saturday, Parler was removed from both app stores.

What did Parler CEO Matze have to say? “We won’t cave to politically motivated companies and those authoritarians who hate free speech!” he wrote on the app.

Following the removal from the Apple App Store, Matze said, “We do not own our phones, Apple simply rents them to us. Apple, Google and the rest of the anti-competitive pack of big tech tyrants coordinate their moves and work together to stifle competition in the marketplace.”

Is this legal? How are Google and Apple able to do this?

To make it into the app stores, developers agree to terms and conditions set by Google and Apple. Simply put, if the companies decide an app violates those policies, they can remove it.

Think of it this way. Imagine a retail store discovers one of the products for sale does something destructive and promotes mayhem. Would the store want to take responsibility for selling that product? Likely not.

Whether their logic was right or wrong, Google and Apple were able to remove what they deemed a dangerous product from their online stores.

Ultimately, though, Amazon’s decision to kick Parler off its web hosting services is what stopped users from accessing the app and the website. You need to know more about this situation, too. Read more here about that decision and what it means for the future of Parler.

Remember, Amazon, Apple and Google are big corporations. You agree to their terms and policies when you use their products. They are in control and have all the power.

The Bottom Line

Google struck first and removed Parler from its app store, saying that Parler had violated its terms of service by failing to control violent content. Then Apple hit next. Of course, there is NO social media company — ZERO — that has managed to keep objectionable content off its platform. But the Twitter and Facebook apps are still in the Apple and Google app stores.

Amazon struck next, taking Parler off its cloud servers.

Parler’s owners say that they’ll build their own servers. But that takes time and money. And being shut out of both app stores raises serious questions about whether Parler can survive at all, even with its own servers.

Regardless of your political beliefs, you are witnessing the power of three of Big Tech’s most ruthless players to put a fourth, smaller company out of business.
