Technology, Litigation & Arbitration
May 17, 2023
Moderators’ suit over graphic TikTok videos advances
Two content moderators hired to review and filter offensive videos on TikTok, including depictions of necrophilia and child abuse, allege in their lawsuit that the social media company made them view such content at a pace and frequency it knew would cause them harm.
A federal judge in San Francisco advanced a class action against TikTok and its parent company, ByteDance, alleging that the companies were negligent and failed to adequately protect their content moderators from the consequences of watching graphic and disturbing videos.