Earlier this year, TikTok decided to reward content creators for socially driven work, but the company is now facing legal trouble of its own.
Seven families in France have recently filed a lawsuit against TikTok over content that allegedly led to the deaths of two children.
CNN detailed that the families alleged TikTok exposed their teenage children to harmful content, which contributed to two suicides. Specifically, the families were concerned that the platform's algorithm promoted content about suicide, self-harm, and eating disorders.
This is the first joint legal case of its kind against TikTok in Europe.
TikTok Lawsuit Over Suicides
In the suit, filed in the Créteil judicial court, the parents want the court to hold TikTok legally responsible for its content, arguing that as a commercial platform targeting minors, the company should be held accountable for the risks associated with its product.
Like Telegram, TikTok is no stranger to criticism over content moderation. Even Meta is currently facing numerous lawsuits in the US for allegedly allowing harmful content to reach young users, damaging their health and making its platforms more addictive to them.
A court ruling in favor of the families would likely require TikTok to implement stricter content moderation policies in France, or in Europe overall, especially for content its algorithm serves to minors. The company could also be held legally liable for damages awarded to the victims' families.
TikTok has yet to respond to the lawsuit, but the company has previously stated that it is committed to addressing mental health issues linked to its platform and has been investing in measures to protect young users.