YouTube is the go-to site for funny videos, help with DIY projects around the house and all things Komando.
It’s also become a popular home for conspiracy theorists to push crazy ideas.
Did YouTube kick-start the Flat Earth movement?
“Fake news” has been a popular term spreading through the U.S. for a few years now. Russian bots have been a huge source of the problem. In fact, so much fake news has circulated through Facebook that the company created a tool to see if you fell for any Russian propaganda.
Things work a little differently on YouTube. Conspiracy theorists post videos trying to convince others that their theories are correct. The JFK assassination, who was actually behind the terror attacks on 9/11, and whether the moon landing was real are a few of the big ones.
The latest conspiracy theory to catch on is a doozy. Some people actually believe that the Earth is flat. There’s an entire Flat Earth movement going around.
Can you guess where it began? Yep, YouTube.
Researchers at Texas Tech University have been trying to trace the Flat Earth movement back to its roots and said YouTube seems to be the answer. They interviewed Flat Earthers who attended the movement’s annual conference in each of the past two years (yes, there’s an annual conference), and most of them admitted YouTube videos changed their minds on the topic.
Many of them were already on YouTube watching conspiracy theory videos about 9/11 and the Sandy Hook school shooting. If you’ve ever spent time on the site, you know that it suggests other videos that you might like. Well, that’s the problem here.
How YouTube’s algorithm works
YouTube promoted videos claiming the Earth was flat, and conspiracy theorists ate it up. One video in particular really helped push them over the edge. It’s called “200 Proofs Earth is Not a Spinning Ball.”
Some who were interviewed at the conferences said they started watching these videos just so they could debunk them, but ended up being won over by the information. One of their favorite “proofs” is, “Why is the horizon always at eye level?”
One of the researchers presented the results at an annual science meeting recently in Washington, D.C. She said, “There’s a lot of helpful information on YouTube but also a lot of misinformation. Their algorithms make it easy to end up going down the rabbit hole, by presenting information to people who are going to be more susceptible to it.”
YouTube’s recommendations appear to be based on an algorithm that combines data from your browsing history with data from other viewers’ activity, which may help explain why 60% of parents surveyed by Pew Research said their kids have come across inappropriate content.
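To see why that combination of your history and other viewers’ activity can lead down a rabbit hole, here’s a minimal collaborative-filtering sketch. This is not YouTube’s actual system, which is proprietary; the viewers, video names, and scoring rule here are all invented for illustration.

```python
# Toy collaborative filtering: recommend videos that viewers with a
# similar watch history have seen. Purely illustrative data -- not
# YouTube's real (proprietary) algorithm.
from collections import Counter

# Each viewer maps to the set of videos they have watched.
watch_history = {
    "viewer_a": {"diy_repair", "flat_earth_101", "moon_hoax"},
    "viewer_b": {"flat_earth_101", "moon_hoax", "200_proofs"},
    "viewer_c": {"diy_repair", "cooking_tips"},
}

def recommend(target, history, top_n=1):
    """Suggest unseen videos, weighted by overlap with other viewers."""
    seen = history[target]
    scores = Counter()
    for other, videos in history.items():
        if other == target:
            continue
        overlap = len(seen & videos)   # shared watches = similarity
        for video in videos - seen:    # only score videos target hasn't seen
            scores[video] += overlap
    return [video for video, _ in scores.most_common(top_n)]

# viewer_a watched two conspiracy videos, so the most similar viewer is
# viewer_b -- and viewer_b's unseen video gets recommended.
print(recommend("viewer_a", watch_history))  # ['200_proofs']
```

The point of the sketch: once a viewer has watched a couple of conspiracy videos, the viewers most “similar” to them are other conspiracy watchers, so the recommendations skew further in that direction.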
Stuff like this makes you wonder if sites like YouTube are doing more harm than good. This isn’t the site’s only glaring problem, either.
We recently told you how the site might not be safe for children because it has a child predator issue. Hopefully it gets things cleaned up sooner rather than later.