Facebook Content Moderators Are Suffering From PTSD, Lawsuit Claims

(CNET)  A new lawsuit alleges Facebook fails to protect moderators who suffer from post-traumatic stress disorder after viewing violent and disturbing content people attempt to post on the social network.

The lawsuit, filed on Sept. 21 in state superior court in San Mateo County, California, says Facebook content moderators working under contract have to look at thousands of “videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder” every day, according to a press release.

Facebook and other internet companies have established industry standards for training, counseling, and supporting content moderators, but Facebook isn’t following the workplace safety guidelines it helped create, the lawsuit said.

The social network said it’s currently reviewing the suit.

“We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously,” said Bertie Thomson, director of corporate communication at Facebook, in an email statement. “Facebook employees receive [training and benefits] in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling (available at the location where the plaintiff worked) and other wellness resources like relaxation areas at many of our larger facilities.”

The lawsuit, which is seeking class action status, was filed on behalf of Selena Scola, who worked at Facebook for nine months under a contract through staffing company Pro Unlimited. Scola was diagnosed with PTSD after experiencing symptoms such as fatigue, insomnia, and social anxiety, according to the release.

Scola is the lone plaintiff at the moment, but if class action status is granted, the lawsuit could affect thousands of moderators. The lawsuit alleges negligence and failure to maintain a safe workplace against Facebook and Pro Unlimited.

“Our client is asking Facebook to set up a medical monitoring fund to provide testing and care to content moderators with PTSD,” said Steve Williams, one of Scola’s lawyers from the Joseph Saveri Law Firm, in the release. “Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized.”

Pro Unlimited didn’t immediately respond to requests for comment.

(Reporting by Marrian Zhou)