TikTok is the home of viral cooking hacks and dance videos — but it’s also the latest social media company with a content moderation problem.
Candie Frazier, who works as a contracted content moderator, filed a class action lawsuit in federal court on Thursday against TikTok and its parent company, ByteDance, for psychological trauma. Frazier alleges she developed anxiety, depression, and PTSD as a result of the highly regimented 12-hour shifts in which she would watch a stream of videos containing “animal cruelty, torture, suicides, child abuse, murder, beheadings, and other graphic content.” Troubling videos also included conspiracy theories, Holocaust denial, political misinformation, and other destabilizing content.
TikTok knew the risks of subjecting contractors to this type of work, and did not follow the industry standard protocols meant to protect content moderators’ mental health, the lawsuit claims. Going further, it notes TikTok was part of a coalition that created best practices for safeguarding employees who have to filter out child sexual abuse imagery. However, TikTok hasn’t implemented many of those guidelines, which include limiting how much time moderators are exposed to troubling videos, checking workers’ mental health histories, and providing mental health check-ins, the lawsuit alleges.
TikTok has not responded to a request for comment on the lawsuit.
Frazier is calling upon TikTok and ByteDance to pay for a “medical monitoring program to facilitate the ongoing screening, diagnosis, and adequate treatment” of Frazier and anyone else who joins the class action lawsuit, if it’s allowed to move forward.
TikTok moderators are required to watch multiple 25-second clips of videos simultaneously and in rapid succession. TikTok uses a computer program to make sure moderators stay on task during their 12-hour shifts, the lawsuit claims. Moderators may get their pay dinged if they don’t keep to the tight schedule (which comes with two 15-minute breaks and an hour-long lunch) amid the continuous flow of toxic content.
Content moderators suffering from psychological trauma is a well-known issue. In 2020, Facebook paid $52 million to settle a similar class action lawsuit filed by its own contracted content moderators who developed PTSD.