EU Flags TikTok’s ‘Addictive’ Design, Threatens Billions in Fines and Forced UX Changes

The European Commission has preliminarily concluded that TikTok’s design features, including autoplay, recommendation systems and a gamified rewards scheme in TikTok Lite, foster addictive behaviour and violate the Digital Services Act. Brussels has proposed design remedies and warned of fines up to 6% of global turnover; TikTok rejects the findings and plans to challenge them. The dispute forms part of a broader global push to curb minors’ exposure to social platforms and tests the EU’s power to regulate product design.


Key Takeaways

  1. The EU Commission’s preliminary finding says TikTok’s design encourages addictive use and breaches the Digital Services Act.
  2. Regulators have proposed measures such as a night‑time screen‑use rest, disabling infinite scroll and algorithm adjustments; non‑compliance could trigger fines of up to 6% of global turnover.
  3. TikTok calls the findings "completely wrong" and intends to contest the decision.
  4. The probe highlights a growing international trend toward restricting minors’ access to social media and raises practical challenges such as age verification and privacy.
  5. If upheld, the ruling would set a precedent for regulators to mandate changes to core product design and algorithmic features.

Editor's Desk

Strategic Analysis

This confrontation is consequential because it moves regulatory scrutiny from platform content to user‑experience mechanics that underpin engagement and revenue. Requiring design changes such as disabling infinite scroll or inserting enforced breaks would force platforms to rethink how they keep attention — with direct implications for ad metrics, time‑on‑site, and the incentives of recommendation systems. The EU’s willingness to impose design prescriptions under the Digital Services Act could inspire similar efforts globally, but enforcement will be technically and legally fraught: robust age verification risks privacy compromises, and users — particularly adolescents — may migrate to lesser‑regulated apps. For TikTok and other major platforms, the practical choice will be expensive compliance, protracted litigation, or fragmented regional services tailored to regulatory regimes. For policymakers, the test will be whether legal pressure can yield safer products without prompting evasive behaviour or unintended privacy harms.

China Daily Brief Editorial

Brussels has delivered a stark message to TikTok: the platform’s user experience is engineered in ways that promote addictive behaviour and may breach the European Union’s Digital Services Act. After a two‑year probe, the European Commission issued a preliminary finding in early February that functions such as autoplay and opaque recommendation systems, together with a gamified rewards scheme in TikTok Lite, create heightened risk for minors and were not subject to adequate risk assessment or mitigation.

The Commission singled out several concrete remedies it expects TikTok to adopt if it is to avoid severe penalties. Regulators proposed a night‑time "screen‑use rest" mechanism, stronger and more usable parental controls, limits on infinite scrolling and adjustments to recommendation algorithms to reduce the platform’s capacity to keep users continuously engaged. Under the Digital Services Act, the EU can levy fines up to 6% of a company’s global annual turnover — a sum that would amount to billions of dollars for a major social media business.

EU officials also revisited a separate inquiry opened in April 2024 into TikTok Lite, deployed in France and Spain, where a "tasks and rewards" programme that pays points for behaviours such as watching videos, liking content, following creators or inviting friends raised particular alarms. Regulators argued the scheme had not been properly risk‑assessed for its potential to foster compulsive use, especially among children, and lacked sufficient safeguards.

TikTok responded quickly and forcefully, calling the Commission’s description "completely wrong" and signalling plans to contest the findings. The company has routinely defended its design choices as consistent with user preferences and technical norms across the sector, and it is likely to mount a legal and public relations campaign to blunt Brussels’s leverage.

The EU action is part of a wider global trend: governments in Europe and beyond have moved to tighten rules on minors’ access to social platforms. Australia has enacted a ban on social‑media use for under‑16s, France and Denmark have advanced age limits of 15 or 16 for new accounts, and the UK is debating similar measures. These policy shifts reflect rising political pressure to put "guardrails" around children’s online exposure, but they also expose practical problems such as reliable age verification, privacy trade‑offs and enforcement across borderless apps.

Beyond the immediate clash between Brussels and a single app, the case matters because it tests the EU’s capacity to regulate the habits mobile platforms cultivate through interface design and algorithmic curation. Forcing changes to core engagement mechanics — autoplay, infinite scroll and recommendation loops — would go beyond conventional content moderation and cut to the business model that underpins social media advertising and retention strategies. The outcome will shape not only TikTok’s operations in Europe but also the template regulators use elsewhere when balancing child protection, privacy and commercial freedoms.

