TBS Newsbot

Extreme Moderation: PTSD Growing in Facebook’s Unseen Army

The mental well-being of Facebook’s army of moderators, who filter extreme content on our behalf, is now finally being discussed.

 

A former content moderator is suing Facebook, claiming that the company gave her PTSD as a result of the images she was tasked with shielding us from. The moderator also believes that Facebook didn’t offer her suitable mental healthcare, despite knowing full well that she’d have to view the worst the Internet has to offer on a daily basis. Sadly, that covers the spectrum of beheadings, suicide livestreams, graphic animal abuse, and everything in between you’d rather be spared from viewing.

The moderator, Selena Scola, believes that her experience was “typical” of the unseen army of protectors/screeners that Facebook hires. According to the complaint, Scola was “exposed to thousands of images, videos, and livestreamed broadcasts of graphic violence… Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled. Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

That being said, Facebook does offer countermeasures to protect its moderators, and blazed the trail in that regard, but the complaint alleges that “Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop.”

As it stands, about 7,500 moderators currently work on Facebook’s content. In the past, we’ve had grim snapshots of what they face on a daily basis, but as Facebook’s user base grows, so does the need for more of these embattled souls.

 



The effects of the job shouldn’t be understated, and a case like Scola’s is not an outlier. Back in January 2017, two Microsoft content moderators sued the company, citing similar concerns. They were tasked with reviewing and reporting images of child abuse and murder; per The Guardian, this was material “…the moderators viewed on a regular basis,” and the suit “alleged that the psychological impact has been so extreme that the men are triggered by simply seeing children and can no longer use computers without breaking down.”

In response to the suit, Microsoft disagreed with the claims brought forward (while declining to discuss them), saying it “takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.”

Yes, the employees agree to take on the brutality of the position, but clearly the employers have a responsibility to soften such a heavy blow.

For her part, Scola is seeking a Facebook-funded medical monitoring program to diagnose and treat content moderators for trauma.

Fingers crossed.

 
