Out of the massive amount of bad news we are constantly exposed to, whether on TV or on social media, one headline from my home country, Bulgaria, caught my attention: a 13-year-old girl was hospitalized with severe intoxication after ingesting a large amount of paracetamol as part of a TikTok challenge.

This news gave me pause. It is just one of many examples of teenagers engaging in dangerous trends to gain popularity through validation on social media. Instead of judging the parents who allowed this to happen or blaming society, I took a different perspective: what is being done at the regulatory level to combat such practices, and have authorities around the world noticed the problematic nature of TikTok?

In this blog post, we will see why unsafe and harmful content is allowed on such a popular social media platform in the first place, and then look at two major legal battles, in the EU and in dozens of US states, to see how the authorities are approaching the situation.

What is the Paracetamol TikTok Challenge and how did this type of content become a trend?

The Paracetamol challenge is a TikTok trend, started in 2023, which encourages teenagers to take dangerous amounts of paracetamol to see who can stay in hospital the longest. Many countries besides Bulgaria (the USA, Switzerland, Belgium, the Netherlands and others) have reported rising numbers of children hospitalized after paracetamol overdoses. This challenge, however, is not the only toxic and life-threatening trend on TikTok, and concerns over children's health have been raised not only within society but also among regulatory authorities in numerous countries.

It is more than logical to wonder how such dangerous content is even allowed on a globally popular social media platform, and the answer lies in TikTok's "recommendation system". The recommendation system is a central feature of TikTok: a complex series of algorithms that powers the "For You" feed. It provides users with a stream of videos calculated to keep them on the platform for as long as possible. In a nutshell, the platform notices what is interesting, funny or intriguing to you and serves you more and more of the same. The longer users stay on the platform, the richer the advertising revenue for the company.
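To make this feedback loop concrete, here is a deliberately simplified sketch in Python. Everything in it, the topic names, the starting weights and the update rule, is my own illustrative assumption; TikTok's actual system operates on individual videos and far richer engagement signals. The sketch only shows the general mechanism: content the user lingers on gets a larger weight and is therefore recommended more often.

```python
import random
from collections import Counter

# Hypothetical topic catalogue. A real recommender works on individual
# videos and far richer signals (likes, shares, rewatches, follows).
TOPICS = ["comedy", "sports", "cooking", "mental-health", "challenges"]

def recommend(weights):
    """Pick the next video's topic in proportion to accumulated engagement."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]

def simulate(steps=1000, drawn_to="challenges", seed=42):
    random.seed(seed)
    # Every topic starts with the same weight: the feed begins diverse.
    weights = {t: 1.0 for t in TOPICS}
    shown = Counter()
    for _ in range(steps):
        topic = recommend(weights)
        shown[topic] += 1
        # Engagement signal: this user lingers on one topic, skips the rest.
        watch_time = 1.0 if topic == drawn_to else 0.1
        # The loop's only objective is engagement, so it reinforces
        # whatever the user lingered on, harmful or not.
        weights[topic] += watch_time
    return shown

if __name__ == "__main__":
    feed = simulate()
    total = sum(feed.values())
    for topic, count in feed.most_common():
        print(f"{topic:>15}: {count / total:.0%} of the feed")
```

Even in this toy model, the topic the user lingers on quickly crowds out everything else in the feed. Nobody programmed the system to push harmful content; a system optimized purely for watch time simply amplifies whatever keeps a given user watching, which is the "rabbit hole" dynamic the reports below describe.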

The problem arises when, for instance, a child "shows" the algorithm an interest in unsafe content. A recent report on children's online safety by the European Consumer Organisation (BEUC) states that "evidence uncovered that TikTok exploits profiling to push users into harmful 'rabbit holes' of toxic content" and that "internal documents from TikTok reveal that the platform knowingly exposes users to harmful filter bubbles, such as 'SadTok' or 'PainHub', which promote self-harm, eating disorders and other mental health harms". According to two other reports, "Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation" and "I Feel Exposed: Caught in TikTok's Surveillance Web", developed by Amnesty International, the Algorithmic Transparency Institute (ATI) at the National Conference on Citizenship, and AI Forensics, children and young people who watch mental health-related content on TikTok's "For You" page are quickly drawn into "rabbit holes" of potentially harmful content, including videos that romanticize and encourage depressive thinking, self-harm and suicide. The technical research revealed that after 5 to 6 hours on the platform, almost 1 in 2 videos shown were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health.

USA lawsuits against TikTok

In October 2024, dozens of states, led by California and New York, filed complaints against TikTok claiming that the platform "exploits and harms young users and deceives the public about its platform and platform's dangers". The complaint filed by the California Attorney General is written in a strongly critical tone: it lays out numerous accusations with the overall aim of showing that the platform does not care about the mental and other potential harms to children as long as they spend more and more time on it, which ultimately grows its advertising revenue.

According to the complaint, TikTok designs and deploys exploitative and manipulative features to addict young users, which forms a central pillar of the platform's growth strategy. To achieve constantly growing advertising revenue, TikTok created platform features intended to cause excessive, compulsive and addictive use. For young users, the impacts of these features are claimed to be severe: increased levels of depression and anxiety disorders, reduced sleep, self-harm, suicidal ideation and eating disorders. The complaint provides a deep dive into the multiple features TikTok uses to manipulate young users into compulsive and excessive use: filters, autoplay, infinite scroll, stories and lives, and push notifications.

One of the central accusations against the platform is that TikTok markets itself as safe, well moderated and appropriate for young users. According to the Attorney General of California, TikTok's public representations that it removes violative content, including content promoting dangerous activities and challenges, are misleading. Contrary to the platform's Community Guidelines, TikTok permits physically dangerous content by simply removing it from the "For You" feed, making it merely less visible.

As noted above, California is only one of the states that filed claims against the platform, and the allegations are similar across the board: TikTok hooks children and may lead not only to sleep deprivation and depression but may also push them towards life-threatening challenges such as the Paracetamol challenge. The cases are ongoing and TikTok's response is still to come. The platform will probably rely on Section 230, a US federal law, part of the Communications Decency Act of 1996, that shields online platforms from being sued over content posted by users.

The outcome of the lawsuits will have a major impact on content moderation and could result in fines running into the billions for the platform.

Investigation against TikTok under the Digital Services Act

In February 2024 the European Commission opened formal proceedings against TikTok for a potential breach of the Digital Services Act. The accusations are similar to those in the US lawsuits, focusing among other things on the "actual or foreseeable negative effects stemming from the design of TikTok's system, including algorithmic systems, that may stimulate behavioural addictions and/or create so-called 'rabbit hole effects'".

Earlier, the Commission had already persuaded TikTok to permanently withdraw the TikTok Lite Rewards programme from the EU by making TikTok's commitment to do so legally binding. The Commission was concerned that the programme had been launched without a prior diligent assessment of the risks it entails, particularly its potentially addictive effect, and without effective risk-mitigating measures. A rewards programme that may stimulate addictive behaviour could have negative effects on the physical and mental health of users, which is of particular concern for minors, who may have a heightened sensitivity to such features.

If the evidence confirms the allegations, the Commission may impose a fine of up to 6% of TikTok's global annual turnover.

The outcomes of these two major legal battles against TikTok, in the USA and the EU, will have a significant impact on the rules of content moderation and the protection of children online. Both aim to put an end to TikTok's harmful and dangerous content.