French families sue TikTok over teen suicides

Lawyers blame algorithm for harmful content recommendations

Seven French families have taken legal action against TikTok, claiming the platform's algorithm exposed their teenage children to harmful content, ultimately leading to the suicides of two 15-year-olds.

The lawsuit, filed in the Créteil judicial court, is the first of its kind in Europe.

The families argue that TikTok's inadequate content moderation contributed to their children's severe mental health struggles.

Lawyer Laure Boutron-Marmion, representing the families, says the teens were exposed to a barrage of videos promoting self-harm, eating disorders and suicide.

She says TikTok's algorithm actively recommended the harmful content, exacerbating existing vulnerabilities.

The families aim to hold TikTok accountable for what they describe as negligence. They argue that the company, as a provider of a consumer product, has a duty to protect the safety and well-being of its users, particularly minors.

"The parents want TikTok's legal liability to be recognised in court", Boutron-Marmion said.

"This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product's shortcomings."

This group lawsuit is separate from a criminal complaint filed against TikTok last year by the parents of Marie, one of the two teenagers.

Marie, whose last name has not been disclosed, was 15 when she took her own life in 2021. Her mother claims Marie's death was influenced in part by unmoderated video content she accessed on TikTok.

Like other social media giants, TikTok has come under increasing scrutiny for its content moderation practices.

The platform has faced numerous lawsuits in the US, with claimants alleging that it entices and addicts millions of young users, ultimately harming their mental health.

The EU is also investigating TikTok's compliance with the Digital Services Act, new safety rules that include obligations to protect minors.

In response to the lawsuit, TikTok said its community guidelines strictly prohibit content promoting self-harm and suicide.

The company also claims to have implemented safeguards such as limiting screen time, adjusting content recommendations and providing mental health resources.

CEO Shou Zi Chew told US lawmakers this year that TikTok has invested in measures to protect young users on the app.

In 2023, Ireland's Data Protection Commission (DPC) fined TikTok €345 million for inadequate protections of children's personal data.

The DPC's investigation focused on TikTok's compliance with privacy obligations for users aged 13 to 17, scrutinising the platform's settings for young users and the effectiveness of its age-verification measures.

The inquiry found that TikTok guided users toward privacy-compromising options during both account registration and video posting.

Furthermore, the DPC noted that TikTok's "family pairing" feature, which lets an adult link their account to a child's in order to control its settings, did not verify that the paired adult was actually the child's parent or guardian.

In July 2024, the UK communications regulator Ofcom fined TikTok £1.875 million for providing inaccurate information about its parental control safety feature in response to a statutory information request.

The investigation revealed significant flaws in TikTok's data governance, showing that the company lacked effective checks to prevent inaccurate data submissions and was slow to identify and address these issues.