
TikTok under the DSA: Why the EU is scrutinising platform design

  • Luxembourg Centre for European Law (LCEL)
    25 February 2026
  • Category
    Outreach
  • Topic
    EU Governance, EU Law

Infinite scrolling. Autoplay videos. Hyper-personalised feeds.
These features are not accidental; they are central to how platforms like TikTok keep users engaged. Under EU law, they are also subject to regulatory scrutiny.

When the European Commission signalled in early February that its preliminary findings pointed to TikTok breaching the Digital Services Act (DSA), public debate quickly focused on so-called “addictive design”. But the case reveals something broader: the DSA was never just about removing illegal content. It also requires very large platforms to examine whether the way their services are built creates systemic risks for society.

“Looking at platform architecture was always part of the regulation’s logic. What we are seeing now is enforcement.”

Dr Lucas Henrique Muniz da Conceição

Postdoctoral researcher (Luxembourg Centre for European Law, University of Luxembourg)

Join us for an expert explainer with Dr Lucas Henrique Muniz da Conceição, postdoctoral researcher at the University’s Luxembourg Centre for European Law (LCEL).

The perception that the DSA is mainly about removing illegal content is common, and not entirely wrong.

The DSA harmonises how platforms must deal with illegal content across the EU. It still protects online platforms from automatic liability for what users post, as long as those platforms respect certain obligations. It also improves transparency and makes it easier for people to report illegal content and for platforms to respond to those reports.

The regulation establishes a broader framework for digital intermediaries (social media platforms, online marketplaces and hosting services), combining internal market rules with due diligence obligations and structured supervision.

For the largest services, so-called Very Large Online Platforms (VLOPs), the obligations are stricter. The logic is simple: platforms that shape information flows at scale have greater responsibilities.

TikTok falls into this category. As a VLOP, it must:

  • Assess systemic risks linked to its service,
  • Put mitigation measures in place,
  • Undergo independent audits,
  • Operate under enhanced supervision by the European Commission.

Why does design matter? Because the DSA recognises that behaviour is shaped by more than single pieces of content. Features such as recommender systems, autoplay or infinite scrolling influence what gains visibility and how long users stay engaged. In fact, these features can pose greater risks than individual posts, which might reach few users without such amplification.

If, for example, an algorithm consistently pushes extreme or harmful material to minors because it maximises engagement, the legal issue is not just the individual video. It is the system that amplifies it.

For these reasons, the DSA requires very large platforms to assess systemic risks linked to the “design, functioning and use” of their services. These risks can concern fundamental rights, public debate, public health or the protection of minors. The Commission is examining whether TikTok has properly assessed and mitigated foreseeable risks arising from how its platform is designed.

The DSA does not define “addictive” as a legal category.
Instead, it focuses on how platform design features may create systemic risks that platforms are required to identify and mitigate.

The Digital Services Act draws particular attention to interface designs that could undermine users’ capacity to make free and informed choices, especially where children and vulnerable users are concerned.

Under the DSA, the key obligation for a VLOP is to conduct a systemic risk assessment of its service, identify foreseeable harms, and implement proportionate mitigation measures in response. The relevant legal question is therefore not whether scrolling is medically addictive in itself; it is whether the platform has appropriately identified, assessed and addressed the systemic risks arising from its engagement-driven algorithms and design features. Concerns about clinical or psychological effects can inform that risk analysis, but the Commission’s enforcement action is rooted in whether the platform fulfilled its risk-mitigation obligations under the Regulation.

Concerns about engagement-driven design are not unique to Europe. In the United States, courts and state authorities have also examined whether platform design may contribute to harm.

The difference lies in regulatory structure.

In the United States, harmful design is often addressed through litigation after harm is alleged. In the EU, the DSA requires very large platforms to assess and mitigate systemic risks in advance, under regulatory oversight.

One model is primarily reactive and court-driven. The other is preventive and supervisory.

About the expert

Dr Lucas Henrique Muniz da Conceição is a Postdoctoral Researcher at the Luxembourg Centre for European Law (LCEL), University of Luxembourg. His research focuses on EU digital regulation, platform governance and the constitutional dimensions of European Union law.

Read the full interview with Dr Lucas Henrique Muniz da Conceição for a deeper analysis of the constitutional dimensions of the DSA.
