Analysis based on original reporting by Adam Satariano, The New York Times

European regulators have drawn a sharp line in the sand over the design of social media platforms, signaling that the mechanics that fuel growth may no longer be tolerated when they come at the expense of user well-being—especially that of children.

In a preliminary decision released this week, the European Commission said that core features of TikTok—including infinite scroll, autoplay, and algorithmic recommendations—may constitute an “addictive design” in violation of the European Union’s Digital Services Act. Regulators argued that these features encourage compulsive behavior and pose risks to users’ physical and mental health, particularly for minors and other vulnerable groups.

If upheld, the ruling would mark the first time anywhere in the world that a legal standard has been formally applied to the addictiveness of social media design.

When Engagement Becomes a Liability

TikTok’s rise has been inseparable from its ability to keep users scrolling. Its algorithm quickly learns individual preferences, delivering an endless stream of short-form videos that require little conscious decision-making. According to European regulators, that is precisely the problem.

Officials cited data showing how frequently users—especially younger ones—open the app, as well as the amount of time minors spend on TikTok late at night. The platform, regulators said, continually “rewards” users with new content, putting them on what was described as a kind of mental autopilot.

The conclusion from Brussels was blunt: TikTok may need to redesign the very features that made it a global phenomenon.

“The service needs to change the basic design of its platform,” the European Commission said in its statement.

TikTok Pushes Back

TikTok has forcefully rejected the findings and said it will challenge them.

In a statement, the company described the Commission’s preliminary conclusions as “categorically false and entirely meritless,” signaling a prolonged legal fight ahead. No timeline has been set for a final ruling, but the stakes are high. Under the Digital Services Act, penalties can reach up to 6 percent of a company’s global revenue.

TikTok currently counts more than 200 million users across Europe.

A Global Reckoning for Social Media

The case against TikTok does not exist in isolation. Governments around the world are increasingly questioning whether social media platforms have crossed a line—from offering entertainment to engineering dependency.

In the United States, TikTok and other major platforms face lawsuits modeled in part on litigation against Big Tobacco, accusing them of knowingly designing products that hook young users. Similar concerns have driven policy debates and age-restriction proposals in countries including France, Denmark, Malaysia, and Spain.

European officials say their investigation mirrors claims made in U.S. courtrooms: that endless feeds, autoplay videos, and hyper-personalized recommendations contribute to anxiety, depression, eating disorders, and self-harm among young people.

Last month, TikTok agreed to settle a lawsuit in Los Angeles just before trial—the first in what is expected to be a series of closely watched cases involving TikTok, Meta, YouTube, and Snap. All of the companies deny that their products are addictive or that a clear causal link exists between platform design and mental health outcomes.

Europe’s Regulatory Muscle Flexes Again

For years, the European Union has positioned itself as the world’s most aggressive regulator of the tech industry. Its policies on privacy, competition, and online safety have often set de facto global standards.

That trend continues here. Henna Virkkunen, an executive vice president of the European Commission, said in a statement that social media addiction can have “detrimental effects on the developing minds of children and teens,” adding that European law makes platforms responsible for the consequences of their design choices.

The Commission has shown little hesitation in enforcement. Just last month, it fined Elon Musk’s social media platform X €120 million (about $140 million) for violating transparency rules.

Not everyone is applauding. U.S. officials—including members of the Trump administration—have criticized the European Union for what they see as disproportionate scrutiny of American tech companies, framing the dispute as a broader debate over free speech and regulatory overreach. European regulators counter that the TikTok case demonstrates their willingness to pursue companies regardless of national origin.

What Comes Next

TikTok now has an opportunity to formally respond before regulators issue a final decision. If the ruling stands, it could force not only TikTok but the entire social media industry to reconsider how engagement is engineered—and where responsibility begins and ends.

For platforms built on attention, Europe’s message is clear: growth at all costs is no longer an acceptable business model.

Acknowledgements & Sources
This article is based on reporting by Adam Satariano, technology correspondent for The New York Times, with contributions from Jeanna Smialek in Brussels. Original reporting published February 6, 2026. RedX Magazine has adapted and contextualized the reporting for editorial analysis.