World News
25 October 2025

EU Accuses Meta And TikTok Of Breaking Digital Rules

The European Union says Meta and TikTok failed to meet key transparency obligations under the Digital Services Act, exposing them to massive fines as the bloc cracks down on Big Tech's handling of online safety and user rights.

On October 24, 2025, the European Union sent shockwaves through the tech world by accusing social media giants Meta and TikTok of breaching key transparency rules under the Digital Services Act (DSA). The findings, announced by the European Commission, mark the first time Meta—parent company of Facebook and Instagram—has been formally accused of violating the DSA, and they place both companies at risk of fines that could reach billions of dollars.

The DSA, which came into force to bolster user safety online and increase accountability for large digital platforms, imposes strict requirements on how companies handle illegal content, protect children, and open their systems to public scrutiny. The law requires platforms to make it easy for users to report counterfeit or unsafe goods and to flag harmful or illegal content such as hate speech, and it bans targeted ads aimed at children. But according to the EU’s preliminary findings, both Meta and TikTok have fallen short on several crucial fronts.

At the heart of the EU’s accusations is the claim that Meta’s Facebook and Instagram, as well as TikTok, have failed to provide researchers with adequate access to public data. This access is considered an "essential transparency obligation under the DSA," the European Commission stated, because it allows independent experts to evaluate the platforms’ effects on users’ physical and mental health. Without it, investigations into the prevalence of harmful content and the risks posed to vulnerable groups, especially children, become significantly more difficult.

"Allowing researchers access to platforms’ data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health," the Commission emphasized in an official statement. The lack of such access, the EU argues, not only impedes research but also undermines the DSA’s core goal of protecting internet users and society at large.

But the issues don’t stop there. The Commission found that users of Facebook and Instagram face confusing and cumbersome procedures when trying to flag illegal or harmful content. Rather than offering a straightforward process, the platforms allegedly employ so-called "dark patterns"—deceptive interface designs that make the task unnecessarily complicated. The result, according to the EU, is a system that is "confusing and dissuading" and "may therefore be ineffective." This, regulators say, can discourage users from reporting serious issues such as child sexual abuse material or terrorist content.

The DSA also stipulates that users must be able to challenge content moderation decisions easily and have the opportunity to provide explanations or evidence in appeals. Yet the Commission’s investigation found that Facebook and Instagram do not offer effective systems for users to contest moderation decisions, making the appeal process one-sided and opaque. The platforms have also reportedly failed to adequately explain why certain content is removed or left online, further fueling concerns about accountability and fairness.

Henna Virkkunen, the EU’s executive vice president for tech sovereignty, security, and democracy, took to X (formerly Twitter) to underscore the stakes: "We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society. Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice."

The investigation, which began in 2024 in collaboration with Ireland’s Digital Services Coordinator, has produced what the Commission calls "preliminary findings." Both Meta and TikTok now have the opportunity to examine the results, respond in writing, and propose corrective actions. If the Commission remains unsatisfied, it can issue a non-compliance decision and levy fines of up to 6% of the companies’ total worldwide annual turnover—a penalty that, given the size of these tech giants, could easily climb into the billions.

Meta, for its part, disputes the EU’s conclusions. "We disagree with any suggestion that we have breached the DSA," a Meta spokesperson said, adding, "We have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law in the EU." The company maintains it is engaged in ongoing negotiations with the Commission and stands by the improvements it has made.

TikTok, owned by China’s ByteDance, is also reviewing the Commission’s findings. However, the company has raised a thorny legal dilemma: a potential conflict between the DSA’s transparency obligations and the EU’s General Data Protection Regulation (GDPR), which sets strict privacy standards. "If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled," said Paolo Ganino, a TikTok spokesperson. The company insists it is "committed to transparency," but wants guidance on navigating the sometimes competing demands of transparency and privacy.

These latest accusations come against a backdrop of heightened scrutiny of Big Tech in Europe. TikTok has already faced a €530 million ($600 million) fine for failing to adequately protect EU users’ personal data under the GDPR. Both Meta and TikTok are subject to ongoing EU investigations, including probes into whether they are doing enough to combat the addictive nature of their platforms, particularly for children.

The stakes are high, not just for the companies involved but for the entire digital ecosystem. The DSA is regarded as a trailblazing law, with the potential to reshape how tech giants operate in Europe and, by extension, the world. U.S. officials, including President Donald Trump, have criticized the DSA as a tool of censorship and threatened retaliatory tariffs against countries that target American technology companies. But EU officials have pushed back, insisting that the law is designed to protect free speech by giving users the ability to challenge unilateral decisions by Big Tech—not to silence voices.

"When accused of censorship, we prove that the DSA is doing the opposite. It is protecting free speech, allowing citizens in the EU to fight back against unilateral content moderation decisions taken by Big Tech," said EU digital spokesman Thomas Regnier.

As the process unfolds, Meta and TikTok have the chance to address the Commission’s concerns and stave off potentially massive penalties. The outcome will be closely watched by regulators, tech companies, and users worldwide, as it could set a precedent for how digital platforms are governed in the years to come.

For now, the future of online transparency and accountability in Europe hangs in the balance, with users, researchers, and policymakers all waiting to see whether the world’s largest social networks will step up—or face the consequences.