Written by Mar Negreiro with Öykü Dilara Anaç.
Increased time spent online and regulatory pressure

Social media platforms’ business model relies on keeping users online for as long as possible so that they can display more advertising. The platforms are optimised to trigger dopamine, a neurotransmitter the brain releases when it expects a reward, encouraging repeated and prolonged use. Yet excessive social media use – defined as spending more than three hours a day on online platforms – has been linked to poorer mental health, particularly higher levels of depression and anxiety. A 2025 Pew Research Center survey showed that minors aged 13 to 17 in the United States (US) are much more likely than they were two years ago to describe their social media use as excessive. Nearly half reported that they spend too much time on these platforms, and that they are on the internet ‘almost constantly’.
According to one survey, European teenagers aged 16 and 17 also reported spending more time than they wanted on online platforms and losing sleep at night, which can displace other, healthier activities. For instance, a review shows that teens who are online late at night are more likely to experience shortened sleep duration and poorer sleep quality, both risk factors for depression and irritability.
Among both children and adults, excessive screen time and social media use have been linked to changes in brain function, including reduced attention and weaker impulse control. The adolescent brain is especially vulnerable. Experts warn that constant exposure to comparison cues, curated content and algorithm-driven engagement loops can create psychological stress that persists long after the screen is turned off.
Facebook, Instagram, YouTube and TikTok lead in terms of monthly online users. In the EU, TikTok has more than 200 million active users, making it one of the fastest-growing networks ever. Over 100 million pieces of content are uploaded to the platform daily. Users spend an average of 137 minutes on it per day (up from 27 minutes in 2019) and open it about eight times a day – over 20 % of US teenagers use it ‘almost constantly’.
At present, TikTok is facing regulatory pressure on both sides of the Atlantic. In Europe, the European Commission opened an investigation into TikTok on 19 February 2024 under the DSA, which is still ongoing. In the US, TikTok was obliged to restructure its operations under a majority American-owned joint venture, and it settled ahead of trial in a social media addiction lawsuit in California that also involved other platforms, such as Meta and YouTube. Those platforms were found negligent for designing addictive online platforms – the first time that major social media companies have been found liable by a US jury on these grounds. While the damages awarded (US$6 million) are insignificant for two companies worth trillions of dollars, the verdict sets a precedent and could influence design choices to avoid further litigation.
The DSA as a tool to redress online addictive design choices

The Commission has intensified its scrutiny of addictive design choices on online platforms under the DSA. In 2024, it opened an investigation into Meta (still ongoing), on the grounds that the design of both the Facebook and Instagram platforms might stimulate behavioural addictions in minors. Shein is also under scrutiny.
On 6 February 2026, the Commission preliminarily found TikTok in breach of the DSA for its addictive design features, including infinite scroll, autoplay, push notifications and highly personalised recommender systems. It also found that TikTok disregarded important indicators of compulsive use of the app, such as the time minors spend on TikTok at night, the frequency with which users open the app and other potential signals. Under the DSA, very large online platforms (VLOPs) such as TikTok have to carry out risk assessments (Article 34) and implement effective measures to mitigate the risks identified (Article 35). The term ‘addictive design’ does not appear explicitly in the DSA. Instead, the legal link lies in Article 34 (which covers risks to public health, minors, and users’ physical and mental well-being) and Article 25. The latter prohibits deceptive or manipulative interface design, often associated with ‘dark patterns’. It introduces a general prohibition applicable to all providers of online platforms (not only VLOPs), preventing them from designing or organising their user interfaces in a way that deceives or manipulates users, or that otherwise materially distorts or impairs their ability to make free and informed decisions. In addition, Article 28 stipulates general protection of minors online. There are also specific guidelines for all platforms on protecting children from addictive behaviours and commercial practices online.
The Commission’s assessment is based on an in-depth investigation (still ongoing) that included an analysis of TikTok’s DSA risk assessment reports, internal data and TikTok’s responses to multiple requests for information, a review of research on this topic, and expert interviews. According to the Commission, TikTok’s recommender systems and engagement-maximising interfaces generate systemic risks to the mental well-being of minors and vulnerable adults. The harm thus arises from prolonged, compulsive engagement that users struggle to control, stemming from the persuasive design choices made by the platform. The DSA does not provide an explicit definition of a ‘vulnerable adult’. Instead, it employs a risk-based approach focused on protecting users from systemic risks, particularly minors, people with disabilities and other vulnerable groups.
TikTok can now exercise its right to defence. It may examine the documents in the Commission’s investigation files and reply in writing to the Commission’s preliminary findings. In parallel, the European Board for Digital Services, an independent advisory group to the Commission, will be consulted. If the Commission’s views are ultimately confirmed, the Commission may issue a non-compliance decision, potentially triggering a fine of up to 6 % of TikTok’s total worldwide annual turnover (estimated at over €30 billion in 2025).
The Commission’s preliminary finding is that TikTok needs to change the basic design of its service. Specific measures cited by the Commission include disabling key addictive features, such as infinite scroll, over time; implementing effective screen-time breaks (including during the night); and adapting its recommender system. Incremental adjustments or optional user controls might not be sufficient. Instead, the platform’s core architecture, with features that drive user engagement, might need to be restructured.
Next steps

Safety through design of online platforms for minors is gaining political attention and scrutiny on both sides of the Atlantic. Many argue that age restrictions are not sufficient, as they shift the blame away from platforms’ harmful designs. Likewise, parental control tools are not enough, as they also transfer responsibility from platforms onto children and their parents, and can be difficult to implement depending on parents’ digital literacy. According to the European Consumer Organisation (BEUC), these measures should be complemented with fairness-by-design components.
If confirmed, these findings will set the first European precedent for how platforms must mitigate risks arising from features designed to maximise engagement. The upcoming Digital Fairness Act may introduce even stricter rules, including obligations to switch off manipulative features and greater protections for children. Defining and regulating ‘addictive design’ is complex; the challenge of this investigation is therefore to assess what constitutes acceptable design. At its core also lie the questions of whether online platforms’ business models are compatible with children’s safety, and whether platforms’ declarations of intent are enough to mitigate the risks identified in their annual DSA reports. Civil society organisations have criticised the lack of clarity, arguing that DSA risk assessments should be carried out more transparently, as platforms’ methodologies and claims are not always supported by the indicators and data provided.
The European Parliament has been active on this issue. In a December 2023 resolution on the addictive design of online services, it called for an end to dark patterns and gaps in consumer protection online. The issue has also been considered more recently in the Internal Market and Consumer Protection Committee (IMCO)’s own-initiative report on the protection of minors online, and in another report on the impact of social media and the online environment on young people being prepared by the Culture and Education Committee (CULT).

Read this ‘at a glance’ note on ‘Addictive design on online platforms’ in the Think Tank pages of the European Parliament.