
The traumatic life of a Facebook moderator

It's well known that Facebook, in its grand ambition to "connect people," has always struggled to meet the challenge of moderating all the content that flows through its site.

With the constant flood of videos, photos and personal posts, plus unprecedented self-publishing tools like Facebook Live, stopping someone who sets out to shock, disgust or intimidate the moment they post is a tall order indeed.

Although Facebook uses automation and artificial intelligence to flag disturbing and inappropriate content, those systems are still far from perfect.

So who currently does the bulk of the work of policing the site? None other than good ol' humans. But with millions of shocking and disturbing posts flooding Facebook each day, what price do these moderators pay?

The life of a Facebook moderator

A new in-depth report from The Verge details how daily exposure to graphic and violent content, such as videos of murder and suicide and posts full of hate speech, is leading a growing number of Facebook moderators to cope through sex, drugs, threats of violence and other inappropriate behavior.

The investigation focused on former and current employees of Cognizant, a professional services contractor whose Phoenix site reportedly handles content moderation for Facebook.

The moderators, all identified by pseudonyms because of the non-disclosure agreements they signed with Cognizant, said they have developed symptoms of severe mental trauma from the nature of their work. Worse yet, they said Cognizant is not doing enough to help them.

'Could probably still do the job'

One employee, who went by the pseudonym "Chloe," recalls being struck by a severe panic attack during a training session after moderating a video of a man being stabbed repeatedly. An onsite company counselor eventually advised her that she "could probably still do the job."

Because of the emotional impact of daily exposure to graphic content, Chloe and other moderators worry about the long-term effects on their mental health. Some said they have experienced symptoms of secondary traumatic stress, a condition with symptoms nearly identical to those of post-traumatic stress disorder (PTSD) that results from witnessing trauma experienced by others.

Feelings of isolation and anxiety

According to The Verge, cases like this, combined with the non-disclosure agreements, are "leading to increased feelings of isolation and anxiety" for the moderators.

Although Cognizant provides its employees with onsite counselors (when available), hotlines and therapy sessions through its employee assistance program, the report said employees find these resources lacking. To cope with their stressful jobs, they turn to sex, drugs and offensive humor.

Coping measures

Some moderators tell "dark jokes about committing suicide," others "used marijuana on the job almost daily," and some were even caught having sex in some of the most inappropriate locations around the building: the bathrooms, the stairwells, the parking garage and the room reserved for lactating mothers.

Beyond the mental stress, other employees admit that constant exposure to the conspiracy theories they repeatedly flag on Facebook has led them to start believing those theories themselves.

"One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust," The Verge reports. One former employee even questions the 9/11 attacks and now sleeps with a gun on his side.

'High profit margin'

Adding to the pressure on Cognizant's moderators is their pay. A Cognizant moderator (officially a "process executive") earns an average of only $28,800 a year, a far cry from the average $240,000 a year that a full-time Facebook employee earns.

This type of outsourcing arrangement "helps Facebook maintain a high profit margin," The Verge reported.

A tour of Cognizant

The Verge says it was later invited to visit the Phoenix site and see the operation for itself. The five employees it interviewed there, in the presence of their boss, "acknowledge the challenges of the job but tell me they feel safe, supported, and believe the job will lead to better-paying opportunities."

"Brad," a policy manager for Cognizant, also told The Verge that the majority of the content they review is "essentially benign."

“There’s this perception that we’re bombarded by these graphic images and content all the time, when in fact the opposite is the truth,” Brad said.

“Most of the stuff we see is mild, very mild. It’s people going on rants. It’s people reporting photos or videos simply because they don’t want to see it — not because there’s any issue with the content. That’s really the majority of the stuff that we see,” he continued.

The moderators The Verge spoke to also understand the value of their work. They took "great pride in their work," but wished that actual Facebook employees would start thinking of them as peers.

“If we weren’t there doing that job, Facebook would be so ugly,” an employee said. “We’re seeing all that stuff on their behalf. And hell yeah, we make some wrong calls. But people don’t know that there’s actually human beings behind those seats.”

At the end of the day, scanning Facebook for questionable content is a massive undertaking. Although the social media giant hopes that bots and artificial intelligence will eventually be smart enough to take over, current technology isn't there yet.

In the meantime, the well-being of the human employees doing the work, outsourced or not, should of course take precedence over high profit margins and continued growth.

How to report a problem on Facebook

To help Facebook moderators in their fight against questionable and disturbing content, you as a user can report posts, videos and photos that should be pulled from the site. Here's how:

Reporting posts that violate Facebook's Community Standards

To report something that you believe violates Facebook's Community Standards (hate speech, racist posts, nudity, violence, harassment, terrorism, etc.):

  1. Click the three dots on the upper-right side of the post.
  2. Select “Give feedback on this post.”
  3. Select your reason for reporting the post.
  4. Hit Send.

Facebook will acknowledge your report almost immediately, but the decision itself may take anywhere from a few hours to a few days. If the post is taken down, you will be notified that it has been removed.

Bonus: Is it time for you to break up with Facebook? Here are 10 reasons to dump Facebook in 2019.

YouTube is scrambling to get rid of child predators

Facebook is not the only site struggling to police its content. YouTube is also fighting its own share of issues after it was discovered that pedophiles have been using the site for malicious purposes.

Click or tap to find out exactly what YouTube did.

Source: The Verge