December 25, 2021

TikTok moderator sues social media giant over trauma from graphic videos

The lawsuit claims brutal work conditions and inadequate support have caused serious mental health problems

Photo: Cottonbro/Pexels.com

A TikTok content moderator has filed a proposed federal class-action lawsuit against the company and its parent, ByteDance Inc., accusing the social media app of failing to adequately protect moderators who are exposed to disturbing and graphic videos throughout their shifts, according to a complaint filed in federal court in Los Angeles. The suit alleges the company has failed to implement guidelines that would better support employees traumatized by viewing hours of such footage.

Like most social media platforms, including Facebook and YouTube, TikTok employs a team of about 10,000 moderators tasked with sifting out graphic and illegal content in order to protect users from unwanted exposure. That can range from rapes and beheadings to suicides, child sexual abuse, animal mutilation and other footage that may be damaging to the mental health of those screening it. 

Candie Frazier, a TikTok content moderator based in Las Vegas, claimed in her lawsuit that she suffers from PTSD after seeing videos of school shootings, deadly falls and even cannibalism, Bloomberg reported.

"Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares,” the complaint says.

Frazier's lawsuit contends that TikTok has not adopted guidelines that other social media platforms have put in place to protect moderators, such as limiting shifts to four hours and providing them with psychological support.

TikTok allegedly requires its moderators to work 12-hour shifts, giving them only a one-hour lunch and two 15-minute breaks. During that time, they are bombarded with a non-stop stream of content, much of which depicts disturbing scenes.

“Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time,” the lawsuit states.

Content moderators for Facebook and YouTube are often hired through third parties, including global professional services firm Accenture, which asks employees to sign consent forms acknowledging that the job may cause PTSD. Facebook was hit with a similar lawsuit in 2018 over claims that the company ignored its duty to protect the well-being of its content moderators.

In recent years, moderators around the world have become more vocal in their criticism of social media companies for not paying wages that reflect the hazards of the job and not offering adequate psychological support to those who need it.

Frazier's lawsuit claims that TikTok and ByteDance never implemented the moderation guidelines recommended after the company joined other social media platforms in developing industry standards to address these concerns.

In 2021, TikTok experienced a meteoric rise in popularity among a growing user base that tends to skew younger than other platforms'. The viral video app will end the year with more cumulative internet traffic than any other domain in the world, including Google, according to Input Mag, and has more than one billion monthly active users.

TikTok's growing popularity has contributed to a number of disturbing and dangerous trends among teenagers, problems that have plagued other apps for years but are now concentrated on a platform designed with younger users in mind.

Frazier's lawsuit seeks compensation for psychological injuries and a court order requiring the company to set up a medical fund for moderators. The company has yet to comment publicly on the complaint.
