Technology
26 October 2025

EU Accuses Meta And TikTok Of Breaking Digital Law

European regulators say Meta and TikTok failed to provide data access and user protections required by the Digital Services Act, raising the threat of multibillion-dollar fines as both companies dispute the findings.

On October 24, 2025, the European Union drew a digital line in the sand, announcing that Meta and TikTok had breached the bloc’s sweeping Digital Services Act (DSA). The preliminary findings, released by the European Commission, accused the social media giants of failing to live up to their legal obligations—particularly when it comes to transparency, user empowerment, and accountability. The stakes? Potential fines that could soar into the billions.

According to Ammon News, the Commission’s assessment centered on a critical issue: both Meta and TikTok were found to be withholding “adequate access to public data” from researchers. This isn’t just a technicality. The DSA, which has been called the EU’s “mammoth content law,” was designed to keep internet users safe and make online platforms more accountable to the public. By denying researchers the data they need, the platforms are, in the Commission’s view, undermining public scrutiny and the very aims of the law.

But that’s not all. Meta—whose platforms include Facebook and Instagram—faced a barrage of additional criticisms. The Commission said Meta had failed to provide user-friendly ways for people to flag illegal content, like hate speech or counterfeit goods. Even more troubling, according to the EU, was Meta’s lack of “effective systems” for users to challenge content-moderation decisions. In other words, if you think a post was unfairly taken down, good luck getting it reviewed—at least, as things stand now.

As AP reports, the inquiry also found that both Facebook and Instagram had deployed so-called “dark patterns”—deceptive interface designs that made it confusing or even discouraging for users to report malicious content such as child sexual abuse material or terrorist propaganda. The Commission described these designs as “confusing and dissuading,” warning that they “may therefore be ineffective.” That’s a damning assessment, especially given the DSA’s explicit aim to make flagging harmful or illegal content as easy as possible.

Henna Virkkunen, the EU’s executive vice president for tech sovereignty, security, and democracy, summed up the bloc’s stance in a forceful post on X (formerly Twitter): “We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society. Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.”

The DSA, which became fully applicable in 2024, is widely seen as the world’s most ambitious attempt to regulate Big Tech. It imposes strict requirements on large online platforms, including bans on ads targeted at children, easier ways to report unsafe goods, and, crucially, transparency obligations that force platforms to open up their systems to outside scrutiny. The law’s teeth are sharp: companies that violate its provisions can be fined up to 6% of their annual global turnover. For Meta and TikTok, that could mean billions of dollars on the line.
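To put that ceiling in concrete terms, here is a minimal sketch of the arithmetic, assuming Meta’s publicly reported full-year 2024 revenue of roughly $164.5 billion as the turnover figure; the base a regulator would actually apply, and any final penalty, could differ.

```python
# Illustrative only: the DSA caps fines at 6% of a company's annual
# worldwide turnover. The revenue figure below is an assumption based on
# Meta's publicly reported full-year 2024 revenue (~$164.5 billion);
# the amount the Commission would actually impose may differ.

DSA_FINE_CAP = 0.06  # 6% of annual global turnover

meta_2024_revenue_usd = 164.5e9  # assumption: reported FY2024 revenue

max_fine = DSA_FINE_CAP * meta_2024_revenue_usd
print(f"Maximum possible DSA fine: ${max_fine / 1e9:.1f} billion")
# -> Maximum possible DSA fine: $9.9 billion
```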

When the Commission launched its investigations into Meta and TikTok, it was clear that the focus would be on transparency. Regulators wanted to know whether the companies were making it easy for researchers to access data about how their platforms work and what impact they have on users’ physical and mental health. “Allowing researchers access to platforms’ data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health,” the Commission said in its statement.

But the findings went further. The inquiry concluded that Meta’s Facebook and Instagram didn’t just fail to provide adequate data access—they also made it unnecessarily hard for users to flag illegal content and challenge moderation decisions. This, the Commission argued, was a fundamental breach of the DSA’s requirements. The use of dark patterns only compounded the problem, making it “confusing and dissuading” for users to take action when they encountered harmful material.

Meta, for its part, isn’t taking the accusations lying down. Spokesperson Ben Walters pushed back against the Commission’s assessment, saying, “We have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law in the EU.” Walters emphasized that Meta disagrees with the findings but intends to continue negotiating with the EU over compliance. The company now has the opportunity to file a formal response to the inquiry before any final decision—or fine—is handed down.

TikTok has also signaled its intention to fight its corner. The company raised a thorny issue that’s been simmering beneath the surface of the DSA debate: the potential conflict between the DSA’s transparency obligations and the EU’s General Data Protection Regulation (GDPR), which imposes strict privacy rules. “If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled,” said Paolo Ganino, a spokesperson for TikTok. This tension isn’t just academic—it’s a real-world puzzle for companies operating at the intersection of data privacy and public accountability.

The broader context, as highlighted by Reuters, is that the EU’s actions are part of a global push to rein in the power of large online platforms. By holding Meta and TikTok to account, the Commission aims to set a precedent for how Big Tech is governed, not just in Europe but around the world. The DSA’s requirements for transparency and user empowerment are, in the words of Henna Virkkunen, “a duty, not a choice.”

As things stand, Meta and TikTok both face the possibility of fines of up to 6% of their annual global turnover, a penalty that could run into the billions. But the outcome of the Commission’s inquiry is far from certain. Both companies will have the chance to respond to the findings, and the regulatory process could drag on for months. In the meantime, the case has put a spotlight on the delicate balance between privacy, transparency, and accountability in the digital age.

For users, researchers, and policymakers alike, the stakes couldn’t be higher. The EU’s DSA was designed to give people more control over their online lives and to shine a light on the workings of social media giants. Whether it succeeds may depend on how this high-stakes standoff between Brussels and Big Tech plays out.