TikTok in Breach of Digital Services Act

The European Commission says TikTok’s infinite scroll, autoplay and recommendations breach the Digital Services Act, citing risks to user wellbeing and minors.

Eliza Crichton-Stuart

Updated Feb 8, 2026

The European Commission has issued preliminary findings stating that TikTok is in breach of the EU’s Digital Services Act (DSA), citing concerns that the platform’s core design encourages compulsive use. The investigation focuses on how features such as infinite scroll, autoplay, push notifications, and personalised recommendations affect user behaviour and wellbeing, particularly for minors and vulnerable users.

According to the Commission, TikTok’s interface continuously rewards engagement, which may weaken users’ ability to stop scrolling. Regulators describe this as pushing users into an “autopilot mode,” where self-control is reduced and session length increases without deliberate choice. Under the DSA, large online platforms are required to identify and mitigate systemic risks tied to how their services are designed and operated, not just the content they host.

How TikTok’s Interface Encourages Continuous Use

At the centre of the case is TikTok’s recommender system and content delivery loop. Infinite scroll removes natural stopping points, while autoplay ensures new videos appear without user input. Combined with highly personalised recommendations, the system is designed to keep users engaged by presenting content aligned closely with their interests and behaviour.

The Commission’s assessment suggests TikTok did not sufficiently evaluate how these mechanics may promote habitual or compulsive use. Regulators argue that repeatedly rewarding users with tailored content can make it harder to disengage, especially for younger audiences who may be more sensitive to behavioural nudges embedded in digital platforms.

This approach reflects a broader regulatory shift in Europe, where algorithmic design and engagement systems are increasingly treated as safety and compliance issues rather than purely product features. That perspective is also relevant to emerging social platforms in web3, many of which rely on similar recommendation and reward mechanics.

Special Focus on Minors and Night-Time Usage

A major part of the investigation looks at TikTok’s impact on minors. The Commission flagged concerns about how long younger users remain on the app during night hours and how frequently users reopen TikTok throughout the day. These behaviours are viewed as indicators of problematic use that should be addressed in a platform’s risk assessment process.

Regulators believe TikTok failed to properly account for these patterns when evaluating potential harm. Prolonged night-time engagement can interfere with sleep and concentration, and frequent reopening of the app may signal dependence on constant content consumption.

While TikTok offers screen time limits and parental control tools, the Commission said these measures currently provide limited friction. They can often be dismissed easily by users, and parental controls may require more technical effort from caregivers than is practical for widespread use.

Why Existing Safety Tools Are Not Enough

The Commission’s findings go beyond identifying risks and question whether TikTok’s mitigation tools actually change behaviour. According to regulators, current features meant to manage screen time do not meaningfully interrupt long sessions or discourage repeated re-entry into the app.

Parental controls were also criticised for requiring too much setup knowledge from guardians, which can reduce their effectiveness in real-world use. Under the Digital Services Act, platforms are expected to implement protections that work by default, rather than placing the burden primarily on users or families to configure safety.

Because of this, regulators believe TikTok may need to adjust fundamental parts of its service instead of relying on optional settings. That could mean introducing unavoidable breaks, changing how recommendations are delivered, or phasing out certain engagement mechanics over time.

What Changes the EU May Force TikTok to Make

At this preliminary stage, the Commission says TikTok may be required to modify core design elements. This could include disabling or reshaping addictive features, introducing meaningful screen-time interruptions, and altering how the recommender system prioritises content.

Rather than focusing only on moderation, the DSA pushes platforms to rethink how user experience design shapes behaviour. For TikTok, that means its signature fast-scrolling, autoplay-driven format may require structural changes in the EU market.

The case also signals how future enforcement may work across the industry. Large social platforms, games with social feeds, and even web3-based social experiences that use algorithmic engagement loops may face similar scrutiny if their systems are found to promote harmful usage patterns.

TikTok’s Legal Options and Possible Fines

These findings are not yet final. TikTok has the right to respond and defend its design choices before the Commission reaches a conclusion. This step allows the company to present evidence, propose changes, or challenge the interpretation of the rules.

If the Commission confirms non-compliance, it can issue a formal decision under the Digital Services Act. That decision may include fines of up to 6 percent of TikTok’s global annual turnover. Beyond financial penalties, TikTok could also be required to implement specific product changes across the European Union.

For TikTok, the outcome could affect both its business operations and how millions of European users interact with the platform on a daily basis.

What This Means for Platforms in Europe

The TikTok investigation shows that the Digital Services Act is not just about what users post, but how platforms are built. Interface design, recommendation systems, and engagement mechanics are now regulatory targets.

By treating addictive design as a systemic risk, the EU is pushing major platforms to balance growth with user wellbeing. As enforcement continues, companies operating in Europe may need to reassess how their algorithms and UI choices influence behaviour, especially for younger users.

For the gaming and social ecosystem, this approach could eventually shape how feeds, progression loops, and reward systems are designed across apps, platforms, and even connected web3 environments.

Source: PocketGamer

Make sure to check out our articles about top games to play in 2026:

Top Anticipated Games of 2026

Best Nintendo Switch Games for 2026

Best First-Person Shooters for 2026

Best PlayStation Indie Games for 2026

Best Multiplayer Games for 2026

Most Anticipated Games of 2026

Top Game Releases for January 2026

Frequently Asked Questions (FAQs)

What did the European Commission accuse TikTok of?
The Commission preliminarily found that TikTok’s design, including infinite scroll, autoplay, and personalised recommendations, may breach the Digital Services Act by encouraging compulsive use and harming user wellbeing.

Why is TikTok’s infinite scroll a problem under EU law?
Infinite scroll removes natural stopping points, making it harder for users to disengage. Regulators say this can weaken self-control and promote excessive usage, especially among minors.

How does this affect minors on TikTok?
The EU is concerned about long night-time usage and frequent reopening of the app by younger users, which may affect sleep, focus, and mental health.

What changes might TikTok have to make?
TikTok may need to introduce meaningful screen-time breaks, adjust its recommender system, and reduce or disable certain addictive engagement features in the EU.

Can TikTok be fined for this?
Yes. If the findings are confirmed, TikTok could face fines of up to 6 percent of its global annual turnover under the Digital Services Act.

Is the decision final?
No. TikTok has the right to respond to the Commission’s findings before a final ruling is issued.

Does this affect other platforms?
Yes. The case shows the EU is regulating platform design, not just content. Other social, gaming, and web3 platforms using similar engagement systems may face future scrutiny.
