27 October 2025

EU Accuses Meta And TikTok Of Breaking Digital Law

European regulators allege major social media platforms have failed to provide data access and user protections required by the Digital Services Act, raising the threat of hefty fines.

On October 27, 2025, the European Commission took a decisive step in its ongoing scrutiny of major social media platforms, accusing Meta and TikTok of breaching the European Union’s Digital Services Act (DSA). The preliminary findings, announced in Brussels, have sent ripples through the tech industry, raising questions about the responsibilities of online giants and the future of digital oversight in Europe.

The Digital Services Act, adopted on October 19, 2022, was designed to bring sweeping changes to how large online platforms operate within the EU. It places special obligations on so-called "Very Large Online Platforms"—a category that includes Meta’s Facebook and Instagram, as well as TikTok. According to the European Commission, both companies are now under formal investigation for failing to meet key transparency and user protection requirements.

At the heart of the Commission’s concerns is the allegation that Meta and TikTok have not granted researchers adequate access to public data, a critical transparency measure under the DSA. As reported by Mobile World Live, the Commission’s preliminary findings state that Facebook, Instagram, and TikTok "may have put in place burdensome procedures and tools for researchers to request access to public data. This often leaves them with partial or unreliable data, impacting their ability to conduct research, such as whether users, including minors, are exposed to illegal or harmful content."

The DSA mandates that platforms must allow researchers to scrutinize their operations, with the goal of providing public oversight into the societal impacts of these digital services. Without meaningful access to data, researchers argue, it becomes nearly impossible to assess risks such as the spread of disinformation, exposure of minors to harmful material, or the amplification of addictive content.

But the Commission’s criticism does not stop there. Meta faces additional accusations regarding its user complaint systems. The preliminary findings, as detailed by the European Commission and corroborated by European Sting, reveal that Facebook and Instagram fail to provide simple, user-friendly mechanisms for flagging illegal content—such as child sexual abuse material or terrorist propaganda. Instead, users encounter a labyrinth of unnecessary steps and deceptive interface designs, known as "dark patterns," that can confuse or dissuade them from reporting problematic content. "Meta’s mechanisms to flag and remove illegal content may therefore be ineffective," the Commission noted in its official statement.

The DSA’s "Notice and Action" requirement is meant to empower EU users and trusted flaggers to inform platforms when content violates EU or national laws. If platforms do not act expeditiously after being alerted, they lose their liability exemption under the law. However, the Commission’s investigation found that Meta’s current reporting tools fall short, potentially leaving illegal content unaddressed.

Content moderation appeals are another sticking point. Under the DSA, European users have the right to challenge platform decisions when their content is removed or their accounts are suspended. Yet, according to the Commission’s findings, the appeal mechanisms on both Facebook and Instagram "do not appear to allow users to provide explanations or supporting evidence to substantiate their appeals." This limitation, the Commission argues, undermines the effectiveness of the entire appeal process, leaving users with little recourse if they believe a moderation decision was made in error.

Meta, for its part, disputes the Commission’s assessment. In a statement provided to Mobile World Live, a Meta representative pushed back: "In the European Union, we have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law in the EU." The company maintains that it continues to engage constructively with the European Commission and the Irish Digital Services Coordinator, which oversees Meta’s EU operations.

TikTok, meanwhile, faces scrutiny over additional issues, including its mechanisms to prevent minors from accessing inappropriate content and its efforts to mitigate addiction risks among users. The Commission is also probing whether TikTok’s advertising transparency measures meet DSA standards. Both companies have been designated "Very Large Online Platforms"—a status that brings heightened regulatory scrutiny and greater expectations for risk mitigation and accountability.

The stakes are high. If the Commission’s preliminary findings are confirmed, Meta and TikTok could face penalties of up to 6% of their total worldwide annual turnover—a potentially massive sum for companies operating at a global scale. The DSA also allows for periodic penalty payments to compel compliance, and the Commission has not ruled out further action as its investigations continue. As of October 27, 2025, 14 DSA-related proceedings have been launched, though none have yet concluded.

Looking ahead, new possibilities for researchers are set to open up on October 29, 2025, when a delegated act on data access comes into force. This regulation will grant researchers access to non-public data from very large online platforms and search engines, further enhancing public scrutiny and accountability. The move is widely seen as a response to persistent calls from academics and civil society groups for greater transparency in how social networks shape public discourse and influence mental health.

While the Commission’s findings are still preliminary, both Meta and TikTok have the opportunity to examine the documents in the investigation files and reply in writing. They may also take remedial measures to address the alleged breaches. The European Board for Digital Services will be consulted before any final decisions are made.

The DSA, together with the Digital Markets Act, forms the backbone of the EU’s ambitious digital regulatory framework. The goal, as articulated by EU tech policy lead Henna Virkkunen, is clear: "We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society." The outcome of these investigations will be closely watched, not only by the companies involved but by lawmakers, researchers, and digital rights advocates across the continent.

As the EU tests the limits of its new digital rulebook, the coming months will reveal whether the world’s largest social media platforms can adapt to a future where transparency and user empowerment are not just buzzwords, but legal obligations with real teeth.