The European Union has ramped up its oversight of major technology companies, as the European Commission released preliminary findings on October 27 and 28, 2025, accusing Meta and TikTok of failing to meet key transparency and user-rights obligations under the Digital Services Act (DSA). This landmark law, which has applied in full since early 2024, is designed to rein in the power of Very Large Online Platforms (VLOPs) and ensure they operate with greater openness, accountability, and respect for user rights.
According to the Commission’s findings, reported by ExchangeWire and other outlets, both Meta—parent company of Facebook and Instagram—and TikTok have set up overly complex and restrictive procedures that make it tough for researchers to access public data. This access is not a trivial matter; it’s central to the DSA’s mission to allow independent scrutiny of how platforms influence everything from mental health to civic discourse. The Commission stated bluntly, “Providing researchers access to platform data is an important transparency obligation under the DSA, as this allows public scrutiny of potential platform impacts on people's physical and mental health.”
The DSA doesn’t just focus on research, though. It also requires platforms to provide easy and effective tools for users to report illegal content—think child sexual abuse material or terrorism-related posts—and to challenge moderation decisions they think are unfair. Here, too, the Commission found Meta wanting. Its Notice and Action system, the mechanism for reporting illegal content, was criticized for being neither user-friendly nor accessible. Investigators flagged the use of so-called “dark patterns”—misleading interface designs that can confuse users or deter them from completing reports. Such practices, the Commission warned, may render Meta’s reporting systems ineffective, potentially leaving illegal content online far longer than it should be.
For researchers, the hurdles are equally daunting. The Commission’s review, as covered by MediaNama, found that TikTok and Meta’s platforms alike often left researchers with only partial or unreliable data, making it hard to study crucial issues like the exposure of minors to harmful or illegal material. The procedures for requesting data were described as “complicated” and “burdensome,” with unnecessary additional steps and confusing interface designs that could easily discourage even seasoned academics. The Commission’s preliminary findings suggest that Facebook, Instagram, and TikTok may have created these barriers, whether intentionally or inadvertently, undermining efforts to keep digital ecosystems accountable.
Meta, for its part, has pushed back against the allegations. A company spokesperson said in response to the findings, “We have introduced changes in content reporting options, appeals, and data access tools since the DSA came into effect. We believe these measures have met the legal requirements in the European Union.” Meta also emphasized that it disagrees with the suggestion that it has violated the DSA, and said it will continue discussions with the Commission. TikTok, meanwhile, affirmed its commitment to transparency but pointed out a thorny issue: tensions between the DSA and the EU’s General Data Protection Regulation (GDPR), which governs data privacy. A TikTok spokesperson said, “The demand to relax data protection has actually caused tensions between DSA and GDPR. If the two regulations cannot be fully complied with simultaneously, we urge regulators to provide clarity on how to reconcile these obligations.”
These aren’t just technical squabbles. At stake are billions of euros and the future of digital regulation across Europe and, potentially, the world. If the Commission’s findings are upheld after further consultation and review, both Meta and TikTok could face fines of up to 6% of their total annual worldwide turnover—a penalty that could run into the billions. As ExchangeWire noted, “Should the investigation confirm the breaches, Meta and TikTok could face fines of up to 6% of their annual global revenue.”
The process, however, is far from over. Both companies have been given the opportunity to review the investigation files, respond in writing, and propose corrective steps before the Commission makes a final decision. The findings released in late October are preliminary, not final, and the companies’ responses could influence the outcome. The Commission also confirmed that new data access rules for researchers will come into effect on October 29, 2025, expanding the scope of data available from VLOPs and search engines. The goal is to enhance transparency, promote independent research, and strengthen accountability across the digital ecosystem.
This isn’t the first time the EU has flexed its regulatory muscles against tech giants. The investigation into Meta and TikTok follows similar proceedings against X (formerly Twitter), which was found in July 2024 to have misled users with its paid verification system and to have blocked independent researchers from accessing public data. In that case, as in the current one, the Commission highlighted the importance of transparency and the dangers of design choices that obscure key information or discourage user participation. The DSA, as the Commission sees it, is not just about ticking boxes or issuing policy statements—it’s about real, measurable change in how platforms operate.
For users, the stakes are high. Without genuine access to data, transparent reporting tools, and fair appeals processes, there’s little hope of holding platforms accountable for their decisions. The Commission has made clear that it expects more than lip service from the companies it regulates. “Allowing researchers to access platform data is essential for public scrutiny of how online platforms affect mental health, civic discourse, and information integrity,” the Commission noted. It added that such transparency is vital for understanding how algorithms and recommendation systems shape online experiences.
Industry observers see the EU’s actions as a crucial test of the DSA’s effectiveness. Several tech giants, including Apple, have previously objected to the DSA, but most have ultimately complied with the rules. The final outcome of the Meta and TikTok cases could set a precedent for how the DSA is enforced in the future—and how global tech companies approach transparency, user rights, and data sharing worldwide.
As the dust settles, one thing is clear: the European Union is determined to ensure that the digital public square is governed by rules that prioritize openness, fairness, and accountability. Whether Meta and TikTok will rise to meet these standards remains to be seen, but the message from Brussels is unmistakable: the era of opaque algorithms and inaccessible data is coming to an end.