Facebook is in hot water again, but - amazingly - this time not for privacy reasons. It's testing out a new system called "related articles."
When you click on a link to read a story that a friend posted, Facebook might suggest other similar stories you might also like. That doesn't sound like a problem; when you read an article on my site, I show other related articles you might like too.
However, some users are claiming Facebook's system has a big flaw.
A surprise awaited Facebook users who recently clicked on a link to read a story about Michelle Obama’s encounter with a 10-year-old girl whose father was jobless.
Facebook responded to the click by offering what it called “related articles.” These included one that alleged a Secret Service officer had found the president and his wife having “S*X in Oval Office,” and another that said “Barack has lost all control of Michelle” and was considering divorce.
A Facebook spokeswoman did not try to defend the content, much of which was clearly false, but instead said there was a simple explanation for why such stories are pushed on readers. In a word: algorithms.
The stories, in other words, apparently are selected by Facebook based on mathematical calculations that rely on word association and the popularity of an article. No effort is made to vet or verify the content.
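To make the mechanism concrete, here is a minimal sketch of the kind of ranking the article describes: score candidate stories by word overlap with the clicked story, weighted by how widely each is shared. The scoring formula, function names, and data below are illustrative assumptions for this sketch, not Facebook's actual system.

```python
def related_articles(clicked_title, candidates, top_n=2):
    """Rank (title, shares) pairs by word overlap with the clicked
    story, multiplied by popularity. Note that nothing here checks
    whether a story is true -- it is pure word association."""
    clicked_words = set(clicked_title.lower().split())
    scored = []
    for title, shares in candidates:
        overlap = len(clicked_words & set(title.lower().split()))
        if overlap:
            scored.append((overlap * shares, title))
    scored.sort(reverse=True)
    return [title for _, title in scored[:top_n]]

# Hypothetical data: a wildly popular false story can outrank a
# more relevant but less-shared one.
clicked = "Michelle Obama meets girl whose father is jobless"
candidates = [
    ("Michelle Obama plans state dinner menu", 1200),
    ("Barack has lost all control of Michelle", 90000),  # popular but false
    ("Local girl wins spelling bee", 300),
]
print(related_articles(clicked, candidates))
```

The false story wins the top slot on popularity alone, which is exactly the failure mode critics are pointing at.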
The fact that Facebook isn't checking the content before suggesting it is what has people up in arms.
“They have really screwed up,” said Emily Bell, director of Columbia Journalism School’s Tow Center for Digital Journalism. “If you are spreading false information, you have a serious problem on your hands. They shouldn’t be recommending stories until they have got it figured out.”
Facebook, however, doesn't seem to see the problem.
“These news feed units are designed to surface popular links that people are sharing on Facebook,” Facebook spokesman Jessie Baker said via e-mail. “We don’t make any judgment about whether the content of these links are true or false, just as we don’t make any judgment about whether the content of your status updates are true or false.”
I'm leaning toward Facebook's position on this one. It isn't trying to run a reputable news organization - although it does have a separate news service called FB Newswire in the works - so fact-checking stories isn't really its responsibility.
Really, the Internet is no different from newspapers, magazines, books and TV - you can easily find both good information and bad information. Just as in every other medium, Internet users need to think critically and evaluate every bit of information they find - no matter where they find it.
When humanity gets to the point where Facebook or Google has to weed out "bad stories" for us, then we have a bigger problem.