YouTube Sued By PTSD-Stricken Moderator Over 'Beheadings, Rape, Child Abuse And Other Disturbing Content'

by Tyler Durden
Tuesday, Sep 22, 2020 - 03:45 PM

An anonymous former YouTube content moderator is suing the Google-owned video sharing platform, claiming she's suffering depression and post-traumatic stress disorder (PTSD) after being forced to watch "objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder."

According to the lawsuit, the former moderator "has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind." The woman says she saw thousands of disturbing videos - including school shootings with dead children, a fox being skinned alive, a person's head getting run over by a tank, and people eating from a smashed open skull.

As CNET notes, the woman also claims that she can't be in crowded places due to fear of mass shootings, and has panic attacks and anxiety which have cost her friendships. She's also frightened to have children and has trouble being around kids.

The proposed class-action lawsuit accuses YouTube of violating California law by failing to provide a safe workplace for content moderators and not doing enough to safeguard their mental health. Moderators spend more than four hours a day reviewing graphic video content because YouTube is "chronically understaffed," the suit says. These long hours run afoul of YouTube's best practices, according to the lawsuit. Workers are required to review "between 100 and 300 pieces of content per day with an error rate of two to five percent," creating stress and increasing the risk that content moderators develop psychological trauma from the job, according to the lawsuit.

The former moderator, who isn't named, is seeking medical treatment, compensation for the trauma she suffered and the creation of a YouTube-funded medical monitoring program that would screen, diagnose and treat content moderators. -CNET

What's more, the woman complains that she's been subject to "conspiracy theories, fringe beliefs, and political disinformation."

The anonymous woman worked at YouTube in Austin, Texas through third-party staffing agency Collabera from January 2018 through August 2019. She claims that YouTube failed to adequately inform prospective content moderators that they would be exposed to a cornucopia of gore that could impact their mental health.

YouTube allows content moderators to step out of the room while the company shows them graphic videos during training; however, workers say they're afraid they'll lose their jobs if they do. At the end of training, they have to pass a test which measures their ability to accurately flag content that violates YouTube's rules.

The ex-moderator who is suing YouTube sought the advice of a wellness coach in 2018 after she felt traumatized by a video she reviewed. The coach recommended the worker take illegal drugs and didn't provide any resilience training or ways to cope with her symptoms, according to the lawsuit. Another coach told a content moderator to just "trust in God." The Human Resources department also didn't provide content moderators with any help, and YouTube requires workers to sign non-disclosure agreements, making it harder for them to talk about their problems. -CNET


PTSD among content moderators is nothing new for Silicon Valley tech giants. In 2018, a Northern California woman sued Facebook after she was "exposed to highly toxic, unsafe, and injurious content during her employment as a content moderator."

Selena Scola moderated content for Facebook as an employee of contractor Pro Unlimited, Inc. between June 2017 and March 2018, according to her complaint.

"Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder," the lawsuit reads. "To maintain a sanitized platform, maximize its already vast profits, and cultivate its public image, Facebook relies on people like Ms. Scola – known as “content moderators” – to view those posts and remove any that violate the corporation’s terms of use."

"You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off," one content moderator told the Guardian.