Seven French families suing TikTok after teenagers’ deaths

Seven French families are suing the Chinese-owned social media platform TikTok over its alleged failure to remove content featuring themes of suicide, self-harm and eating disorders. Two 15-year-old girls from the families involved took their own lives after reportedly being influenced by such videos.

The seven families have united under the name Algos Victima, or Victims of the Algorithm in English, to file a lawsuit against TikTok, the global platform on which users create, share and view short videos, after two teenagers from the south of France died by suicide.

They submitted their claim, which is the first of its kind against the platform in Europe, to a court in Créteil on 4th November.

In a statement to the French media, the group’s lawyer, Laure Boutron-Marmion, said, “TikTok, like other industry giants, must be held accountable for its actions and negligence. The families involved in this lawsuit denounce the devastating effects of the app on the mental and physical health of their children, two of whom sadly took their own lives.”

Among the claimants are the parents of 15-year-old Marie from Cassis, who died in September 2021, and Charlize from Nice, also 15, who died in November 2023. Their stories share tragic similarities: both reportedly suffered from bullying at school and became increasingly withdrawn in the lead-up to their deaths. Both died by suicide, hanging themselves in their bedrooms.

Other parents in the Algos Victima group allege that their children were bombarded with negative and harmful messages by what they describe as TikTok’s “killing machine” algorithm.

One mother told France 3, “My goal is to stop our children being bombarded with harmful content. It’s time we acted, as parents and as adults, to protect our children and demand that TikTok enforce regulations to secure its network, which is destroying our children physically and psychologically.”

She also criticised the lack of action from French public authorities and the government in tackling the issue.

In a statement, TikTok said: “Over 40,000 trust and safety experts worldwide ensure user safety and data protection, including more than 6,000 focused on Europe, with 637 handling French-language content—significantly more than on other comparable platforms.”

According to data published by the platform, TikTok removes all content related to suicide or self-harm. It further asserts that, between April and June 2024, 91% of such videos were removed before being viewed by any users.

TikTok has not publicly commented on the Algos Victima lawsuit.

Photo credit: Solen Feyissa, Unsplash