The European Commission has accused TikTok of creating an "addictive design" in its app which could harm the physical and mental wellbeing of minors and vulnerable adults.
It said the platform had been guilty of "multiple" violations of the EU's Digital Services Act (DSA).
The commission said the addictive design included infinite scroll features, autoplay, push notifications, and a highly personalised "recommender system", which uses AI to predict the preferences or ratings a user would give a product.
Officials said the recommender system was designed only to increase time spent on the app, and not to take account of negative experiences that a teenage user might have.
The app constantly rewarded users with new content, thus fuelling the urge to keep scrolling and to "shift the brain of users into 'autopilot mode'," said the commission.
This could lead to compulsive behaviour and reduce users' self-control.
In a statement the commission claimed TikTok "did not adequately assess how these addictive features could harm the physical and mental wellbeing of its users, including minors and vulnerable adults."
The tech firm had also "disregarded important indicators of compulsive use of the app, such as the time that minors spend on TikTok at night, the frequency with which users open the app, and other potential indicators."
The commission said TikTok had failed to implement "reasonable, proportionate and effective measures" to mitigate risks stemming from its addictive design.
Screentime management and parental control tools had not effectively reduced the risks stemming from TikTok’s addictive design, the commission said.
"The time management tools do not seem to be effective in enabling users to reduce and control their use of TikTok because they are easy to dismiss and introduce limited friction," the commission said.
The investigation, launched in February 2024, also covered the so-called "rabbit hole effect" when using TikTok, as well as the risk of minors being exposed to inappropriate content if they misrepresented their age.
The commission said TikTok should disable key addictive features such as "infinite scroll" over time, and implement effective screen-time breaks, including after midnight.
The preliminary findings do not prejudge the outcome of the overall investigation, said the statement.
The commission said its conclusions were based on an analysis of TikTok's risk assessment reports, internal data and TikTok's responses to "multiple requests for information", as well as a review of the recent scientific research and interviews with experts in multiple fields, including behavioural addiction.
TikTok can now challenge the preliminary finding by consulting the commission's investigation documents, and can reply in writing.
If the preliminary finding is upheld, TikTok could face a fine worth up to 6% of its global annual turnover.
Senior EU officials said there was growing evidence of TikTok users being active on the app after midnight, citing expert findings that more and more 12- to 15-year-olds were spending "too much time" on TikTok.
The commission said there should be mandatory screen time limits and lock-outs at night "to avoid sleep deprivation".
Officials said that combating behavioural addiction among minors was a mandatory component of the risk assessment, and that TikTok had ignored widespread evidence.
"TikTok did not properly assess these risks…and did not properly mitigate these mental health risks on its platforms … TikTok disregarded relevant evidence on excessive use of its platforms," said a senior EU official.
The commission said it based its investigation on a range of publicly available studies and expertise.
It included a French parliamentary report showing that 8% of 12- to 15-year-olds spent more than five hours on TikTok, a Danish study reporting that users as young as eight used it for more than two hours per day on average, and a Polish study in which TikTok was cited as the platform most used after midnight by 13- to 18-year-olds.
Officials said there had been multiple failures by TikTok to take such evidence into account.
"There's a general rule in the DSA that the best scientific evidence and expert knowledge have to be taken into account when assessing these risks," said an EU official. "We found really serious shortcomings in the system, and not just in one risk assessment report but in multiple ones."
Officials suggested TikTok’s business model linked advertising income to keeping users on the platform as long as possible.
"It's not fundamentally impossible to add friction to the system to make these mitigations effective. And indeed, in the [DSA] guidelines there is rich material on how some of this can be mitigated," said an official.
Despite the preliminary finding, the commission said TikTok had been cooperative throughout the investigation and had come forward with ideas.
Officials denied the finding amounted to censorship.
"The DSA … is not a conte