World News
26 October 2025

EU Accuses Meta and TikTok of Breaking Digital Rules

The European Union’s historic crackdown on Meta and TikTok over transparency and user safety failures could lead to billions in fines and force sweeping changes to social media platforms.

In a move that’s rattled the tech world from Silicon Valley to Beijing, the European Union has accused Meta—the parent company of Facebook and Instagram—and TikTok of breaching the bloc’s landmark Digital Services Act (DSA). Announced on October 24, 2025, the preliminary findings from the European Commission mark the first time these social media titans have faced such sweeping regulatory scrutiny under the DSA, a law designed to keep internet users safe and demand accountability from Big Tech.

The Commission’s investigation, launched in 2024, found that both Meta and TikTok failed to meet strict transparency obligations. The DSA, the EU’s digital rulebook, requires platforms to make it easy for users to report illegal or harmful content, such as hate speech, child sexual abuse material, or terrorist propaganda, and prohibits targeted advertising aimed at children. It also mandates that platforms give researchers access to data so they can study the societal impacts of social media, including the spread of disinformation and the effects on minors’ mental health.

Regulators did not mince words in their assessment. According to the European Commission, Meta’s Facebook and Instagram “do not offer a user-friendly and easily accessible ‘Notice and Action’ mechanism for users to flag illegal content, such as child sexual abuse material and terrorist content.” Instead, users face a maze of unnecessary steps and demands, which the Commission characterized as “dark patterns”: deceptive interface designs that discourage people from reporting problematic material.

“Such practices can be confusing and dissuading,” the Commission stated, adding that both platforms “appear to use so-called ‘dark patterns’ when it comes to the ‘Notice and Action’ mechanisms.” The investigation also found that the content moderation appeal systems on Facebook and Instagram are deeply flawed. Users are not allowed to provide explanations or supporting evidence when challenging content decisions, making it nearly impossible to effectively contest removals or bans.

Beyond the user interface, the Commission accused Meta and TikTok of running opaque systems that shield their algorithms and data from outside scrutiny. Both companies allegedly failed to grant researchers proper access to public data, as required by law. This lack of transparency, regulators argue, cripples society’s ability to understand how these platforms shape public opinion and impact mental health—especially among children and teenagers.

Henna Virkkunen, the EU’s executive vice president for tech sovereignty, security, and democracy, underscored the stakes in a post on X (formerly Twitter): “Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.”

The timing of the EU’s announcement is no accident. In just days, a new DSA rule on data access takes effect, granting vetted researchers even greater access to both public and non-public datasets on these platforms. The current findings are being seen as a warning shot: a signal that the era of self-policing by Big Tech is over, and that Europe’s enforcement of the DSA is entering a new, more aggressive phase.

If the violations are confirmed in the final decision, the consequences for Meta and TikTok could be severe. The Commission has the authority to impose fines of up to 6% of a company’s total worldwide annual turnover—potentially costing Meta more than €10 billion. Ongoing penalty payments could also be levied until the companies bring their practices into compliance.

Meta, for its part, has denied any wrongdoing. In a statement, the company said, “We disagree with any suggestion that we have breached the DSA. In the European Union, we have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law.” TikTok echoed this sentiment, insisting that it is “committed to transparency” but also pointing out what it sees as a conflict between the DSA’s requirements and the EU’s own data protection rules under the General Data Protection Regulation (GDPR). “If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled,” a TikTok spokesperson said.

Despite these assurances, Brussels appears unconvinced. Regulators have accused the platforms of deploying business models that prioritize engagement and profit over user safety and transparency. The Commission’s findings echo longstanding concerns raised by critics, including former Meta engineer Arturo Béjar, who previously warned that the company’s tools often fail to protect users—especially minors—from known harms.

This regulatory showdown has not gone unnoticed in Washington. The Trump administration has repeatedly objected to European regulation of US tech giants, with President Trump threatening tariffs on countries that enforce digital services rules targeting American companies. Although the EU and US reached a tariff agreement in the summer of 2025, talks over its implementation are ongoing, and tensions remain high. EU digital spokesman Thomas Regnier pushed back against accusations of censorship, arguing, “When accused of censorship, we prove that the DSA is doing the opposite. It is protecting free speech, allowing citizens in the EU to fight back against unilateral content moderation decisions taken by Big Tech.”

For now, Meta and TikTok can examine the evidence in the Commission’s investigation file, reply in writing to the preliminary findings, and propose remedies; the European Board for Digital Services will also be consulted before any final decision. If the companies fail to satisfy Brussels with their proposals, the Commission can issue a non-compliance decision and impose fines per breach, per platform. The bigger threat for Meta and TikTok, however, may be structural: a final ruling could force them to redesign their user interfaces, overhaul their moderation and reporting systems, and build new, open research tools, steps they have long resisted.

This case is more than a regulatory skirmish; it is the first real test of whether Europe’s Digital Services Act has teeth. The world is watching as the EU positions itself as the digital referee for a new era. The message from Brussels is clear: the days of gentle nudges are over, and the enforcement era has officially begun.

As the legal chess match unfolds, the outcome will shape not only the future of Meta and TikTok but also the global standards for tech accountability and user protection in the digital age.