The European Commission has escalated its regulatory campaign against Big Tech, issuing preliminary findings on October 24, 2025, that accuse Meta and TikTok of breaching the European Union’s Digital Services Act (DSA). The charges, which strike at the heart of how these social media giants handle transparency, user empowerment, and data access, could result in fines of up to 6% of each company’s global annual turnover—a sanction that for Meta alone could reach several billion euros.
According to Reuters, the Commission’s investigation found that Meta’s flagship platforms, Facebook and Instagram, as well as TikTok, failed to grant researchers adequate access to public data. This is a fundamental requirement under the DSA, designed to let independent experts scrutinize how platforms manage content, disinformation, and the amplification effects of their algorithms. The Commission stated, “Allowing researchers access to platforms’ data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health.”
But the alleged breaches don’t stop at transparency. The Commission has also zeroed in on Meta’s user-facing tools, finding that Facebook and Instagram use systems for reporting illegal content and appealing moderation decisions that are confusing, difficult to use, and potentially deceptive. As reported by WinBuzzer, these platforms impose unnecessary steps and employ what regulators term “dark patterns”—interface designs that may actively dissuade users from flagging harmful material, such as child sexual abuse imagery or terrorist propaganda. The Commission’s report is blunt: “Such practices can be confusing and dissuading. Meta’s mechanisms to flag and remove illegal content may therefore be ineffective.”
Moreover, users whose content is removed or whose accounts are suspended are reportedly not given a proper chance to provide explanations or evidence to support their appeals. The Commission argues that this limitation undermines users’ rights to challenge a platform’s decision, a key pillar of the DSA’s user protection framework.
TikTok, while not accused of the same user-reporting failures as Meta, faces similar allegations regarding data access for researchers. The company has raised a significant legal challenge, claiming that the DSA’s transparency requirements may conflict with the European Union’s General Data Protection Regulation (GDPR), which imposes strict data privacy rules. As a TikTok spokesperson told Reuters, “Requirements to ease data safeguards place the DSA and GDPR in direct tension. If it is not possible to fully comply with both, we urge regulators to provide clarity.” TikTok also noted that it has already given nearly 1,000 research teams access to data through its research tools, but the Commission’s preliminary findings suggest this does not meet the DSA’s standards.
Meta, for its part, has firmly denied any wrongdoing. A spokesperson told Reuters, “We disagree with any suggestion that we have breached the DSA, and we continue to negotiate with the European Commission on these matters.” Meta spokesperson Ben Walters added, “We have introduced changes to our content reporting options, appeals process and data access tools since the DSA came into force and are confident that these solutions match what is required under the law in the EU.”
The Commission’s preliminary findings do not represent a final ruling. Both Meta and TikTok now have the opportunity to respond and propose remedies. Should they fail to satisfy the Commission, the threatened penalty of up to 6% of annual global revenue would, for a company of Meta’s size, put billions of euros at stake.
This regulatory standoff comes as part of a much broader EU effort to rein in the power of digital platforms. The DSA, which came into force in 2024, aims to compel large tech firms to take more responsibility for illegal and harmful content, increase transparency around their algorithms, and ensure that vetted researchers can access both public and, soon, certain non-public data. A new delegated act on data access under the DSA will come into force on October 29, 2025, further expanding researchers’ rights to examine the workings of very large online platforms.
The current charges also follow a string of high-profile disputes between Brussels and major tech companies. Meta recently announced it would ban all political advertising in the EU, citing the complexity of complying with the bloc’s new Transparency and Targeting of Political Advertising (TTPA) regulation. This move, as WinBuzzer reported, mirrors a similar withdrawal by Google and effectively shuts down the two largest digital avenues for political outreach in Europe. Meta has previously argued that such regulations “effectively remove popular products and services from the market, reducing choice and competition.”
Meta’s regulatory headaches don’t end there. In April 2025, the company was fined €200 million under the Digital Markets Act (DMA) for its controversial “pay or consent” subscription model, which regulators said failed to provide users with a genuine choice regarding data use. The company has repeatedly framed these enforcement actions as anti-competitive and overly burdensome.
Geopolitical tensions add another layer of complexity. The Trump administration in the United States has sharply criticized the EU’s digital regulations, describing them as “Orwellian” and warning that such measures could inflame transatlantic relations. Washington claims the DSA unfairly targets American companies and amounts to censorship. The outcome of these cases could set a precedent for how Europe intends to police global tech giants—and how far its regulatory reach may extend.
For now, the Commission remains resolute. Henna Virkkunen, the Commission’s executive vice-president for tech, previously stated her commitment to “ensuring that every platform operating in the EU respects our legislation, which aims to make the online environment fair, safe, and democratic for all European citizens.” She added pointedly, “Our democracies depend on trust. That means platforms must empower users, respect their rights and open their systems to scrutiny.”
As both Meta and TikTok weigh their next moves, the eyes of the tech world—and indeed, of regulators and lawmakers globally—are fixed on Brussels. The stakes are high, not just for the companies involved but for the future of digital governance itself. The coming months will reveal whether the EU’s ambitious digital rulebook can truly hold the world’s most powerful platforms to account.