Nemko Digital · Feb 27, 2026 · 4 min read

European Commission Finds TikTok in Breach of the Digital Services Act Over Addictive Design

The European Commission has preliminarily found that TikTok’s platform design, which includes features known to be addictive, is in breach of the Digital Services Act. These preliminary findings highlight the increasing regulatory scrutiny of platform accountability and user safety, particularly where minors are concerned. The decision signals a pivotal moment for EU digital regulation and its enforcement, setting a precedent for how online platforms and other powerful tech services will be held responsible for the impact of their design choices under this landmark law.


This development underscores the importance for all online platforms—especially major internet and social media platforms—of proactively assessing and mitigating the risks associated with their services. For organizations navigating the complex landscape of digital compliance, understanding the implications of the Digital Services Act is crucial to ensuring their own services are aligned with evolving regulatory expectations and user-protection goals.


What the Commission Found

The European Commission's investigation concluded that TikTok failed to adequately assess and mitigate the risks associated with its addictive design. The platform's features, such as its “infinite scroll” and “autoplay” functions, are designed to keep users engaged for extended periods. The Commission's preliminary view is that these features can fuel compulsive behavior and reduce user self-control, posing a significant risk to the physical and mental well-being of users, especially children. This aligns with a growing global focus on online child safety, as seen in measures like the California Child Safety Law.

Crucially, the Commission is also framing these harms as systemic risks that digital service providers must address through stronger platform governance, transparency reporting, and in-depth investigations coordinated across the EU.


Addictive Design Features Under Scrutiny

At the heart of the Commission's concerns are specific design choices that create a highly personalized and immersive experience. The investigation highlighted the following features:

  • Infinite scroll and autoplay: These features create a continuous stream of content, making it difficult for users to disengage from the platform.
  • Push notifications: Constant notifications are designed to draw users back into the app.
  • Highly personalized recommender system: The algorithmic systems that power the “For You” page are designed to be highly engaging, but the Commission is concerned about their potential to create “rabbit holes” of harmful content—including risks linked to compulsive use and the way content is served to minors.

These design questions are increasingly relevant across tech companies, not just TikTok. Similar engagement mechanics exist on other major social media platforms such as Meta’s Facebook and Instagram, as well as YouTube and X (formerly Twitter), and even in some search engines that test continuous feeds. The Commission’s move signals broader enforcement efforts that may ultimately affect both major platforms and, over time, smaller platforms too.

The Commission's findings are detailed in their press release, which outlines the specific articles of the Digital Services Act that TikTok is alleged to have breached.


Risk Mitigation Deemed Insufficient

While TikTok has implemented some risk mitigation measures, such as screen time management tools and parental controls, the European Commission found them to be ineffective. The investigation concluded that these tools are too easy for users to dismiss and do not create enough “friction” to be effective. The parental controls were also found to be lacking, as they require a significant amount of time and technical skill for parents to implement effectively.

From a compliance perspective, the question is also whether platforms can demonstrate meaningful safeguards, user-level transparency, and auditable evidence of what they did to reduce systemic risks—not just publish policies.


What This Means for Digital Platform Compliance

This preliminary finding against TikTok is a clear signal that the European Commission is taking a proactive approach to enforcing the Digital Services Act. It demonstrates a commitment to protecting users from the harmful effects of addictive design and ensuring that platforms are held accountable for the safety of their users. Depending on the final decision, the case could also lead to fines, further remedial measures, and tighter expectations around transparency reports and day-to-day platform governance (including how content moderation decisions are made and documented).

This action is part of a broader push for a Europe Fit for the Digital Age, where technology serves people. More broadly, EU lawmakers are signaling that digital service providers must build products that respect users’ rights and safety-by-design principles—shaping Europe’s digital future through enforceable platform regulations, including the role of Digital Services Coordinators in oversight and coordination.

For companies operating in the digital space, this case serves as a critical reminder of the importance of integrating ethical design principles and robust risk assessments into product development. As regulations like the EU AI Act and the Digital Services Act continue to shape the digital landscape, a proactive approach to regulatory compliance is no longer optional but a necessity. Understanding and implementing sound principles for governing algorithmic systems is key to building trust and ensuring long-term success in the European market—including for companies tracking accountability debates raised by initiatives such as AlgorithmWatch.

Nemko Digital
Nemko Digital is formed by a team of experts dedicated to guiding businesses through the complexities of AI governance, risk, and compliance. With extensive experience in capacity building, strategic advisory, and comprehensive assessments, we help our clients navigate regulations and build trust in their AI solutions. Backed by Nemko Group’s 90+ years of technological expertise, our team is committed to providing you with the latest insights to nurture your knowledge and ensure your success.
