World News
14 October 2025

Europe Pressures Tech Giants Over Child Online Safety

Regulators, privacy advocates, and tech firms clash as the EU weighs sweeping new rules to protect minors online, with encryption and surveillance at the heart of the debate.

European regulators are ramping up scrutiny of major tech companies—Apple, Google, YouTube, and Snap—over how they protect minors from illegal and harmful content online. On October 13, 2025, these digital giants were summoned by the European Commission to answer tough questions about their child safety measures, marking a pivotal moment in the enforcement of Europe’s ambitious Digital Services Act (DSA). This sweeping legislation, which became fully applicable in early 2024, is designed to set new global standards for online child protection, but its rollout has exposed a tangled web of technical, legal, and ethical dilemmas that are far from resolved.

The Commission’s immediate concerns focus on issues such as underage access to drugs, vaping products, and content linked to eating disorders. Apple’s App Store and Google Play were specifically asked to clarify how they prevent minors from downloading harmful or illegal apps. Meanwhile, Snapchat was pressed to demonstrate that it enforces its under-13 ban, and YouTube faced pointed questions about its recommendation algorithm after reports surfaced of inappropriate content reaching young users. According to MT Newswires, Google has responded by highlighting its progress in adding parental controls and age protections, but Apple has remained silent on the matter.

These inquiries are more than bureaucratic box-ticking—they are a high-profile test of the DSA’s new child safety standards. The legislation demands not only transparency from tech platforms, but also robust, proactive steps to keep children safe online. How these companies respond could have far-reaching implications for the tech industry, not just in Europe, but around the world.

As the DSA throws a spotlight on potential regulatory risks, investors are watching closely. Stricter child safety rules could mean higher compliance costs, the threat of hefty fines, and the need for costly new product features. For companies like Snap and Alphabet, which rely heavily on young users and app revenue, this regulatory pressure could sway investor sentiment and reshape the future of digital advertising and app sales.

The stakes are high—and not just for Big Tech’s bottom line. Europe’s DSA is setting the pace for online child safety worldwide. Policymakers in other regions are likely to follow suit, making the EU’s approach a bellwether for global standards. As digital risks multiply, the push for greater transparency and accountability in handling young users online is only expected to intensify.

Yet even as Europe tries to lead on child protection, its efforts have become mired in controversy. On October 14, 2025, the European Union delayed decisions on a set of even more contentious online child protection laws that would require tech companies to scan images, videos, and links for evidence of child sexual abuse material (CSAM). The delay followed fierce privacy concerns raised by member states such as Germany, which opposes mass scanning of private messages. The planned laws would apply even to encrypted messaging platforms like WhatsApp, raising the specter of what many critics describe as mass surveillance.

The personal stories that drive this debate are harrowing. Take Iris, an eleven-year-old whose experience of online sexual extortion was documented during a rally by children’s rights campaigners outside the EU headquarters in Brussels. Iris’s case, like many others, is a reminder of what’s at stake. According to Fabiola Bas Palomares, policy lead at Eurochild, “In these three years of delay—that’s over 1200 days of negotiations—a lot of children have fallen into the hands of perpetrators.”

Globally, the scale of the problem is staggering. More than 60 million pictures and videos linked to the sexual exploitation of minors were flagged online last year, and around two-thirds of all child sexual abuse webpages detected by the Internet Watch Foundation were traced to an EU country. Children’s rights advocates argue that every day of delay in passing new regulations leaves more children vulnerable to abuse.

But privacy campaigners are adamant that the proposed laws go too far. Germany’s Justice Minister Stefanie Hubig stated, “Private communication must never be subject to general suspicion. Nor may the state force messenger services to scan messages massively for suspicious content before they are sent.” Under the latest proposal, penned by Denmark—the current holder of the EU’s rotating presidency—tech firms deemed high risk could be ordered to scan all links, images, and videos (though not texts) shared on their platforms, and report instances of suspected CSAM to law enforcement. This would include encrypted content, which is designed to be visible only to the sender and recipient.

The debate has exposed deep divisions. Some experts, like cryptographer Bart Preneel of the Catholic University of Leuven, warn that the technology simply isn’t up to the task. “It’s already very hard for humans to distinguish between CSAM and legitimate content,” he told DW. “We’re very skeptical that AI can learn this. We believe that there is no technology—and there will be no technology in the next 10 years—that can do this.” Preneel also cautions that such scanning technology could leave users and authorities more vulnerable to hackers.

Signal, a leading encrypted messaging provider, has threatened to leave the EU market altogether if the proposals go ahead, echoing concerns that backdoors for law enforcement would undermine the privacy and security of all users. “It is a misguided belief that encrypted services can be weakened solely for good guys,” a coalition of privacy experts warned in an open letter to the Irish government, which is among the 12 EU member states backing the Child Sexual Abuse Regulation (CSAR) bill.

In Ireland, the debate is especially charged. As host to the EU headquarters of major tech companies like Apple and Meta, Ireland’s stance carries particular weight. Minister for Justice Jim O’Callaghan has argued that law enforcement needs access to encrypted communications to catch criminals and guarantee citizens’ security. But privacy advocates counter that “weakening encryption would put both individuals and businesses at greater risk of scams, fraud, identity theft, and other cybercrime. It would also make sensitive data more vulnerable to foreign cyberattacks and undermine national security.”

The tension between privacy, security, and child protection is not unique to Ireland. According to Callum Voge, Director of Government Affairs and Advocacy at the Internet Society, “This looks like a pattern we are seeing in Europe,” referencing similar pushes for encryption backdoors in France and Sweden that ultimately failed. The pattern, he says, confirms experts’ worries about “scope creep” behind these laws—that measures introduced for one purpose could be expanded to others, eroding privacy for all.

For now, the EU’s efforts to forge a consensus are ongoing. Denmark, which holds the presidency until 2026, remains committed to finding a compromise. “We will continue, of course, the ongoing and constructive negotiations towards a sustainable compromise,” Danish Justice Minister Peter Hummelgaard told reporters. Parliamentary negotiations are the next step, and even if the laws eventually garner enough backing, they would likely take years to implement.

As Europe grapples with how to balance child protection and digital privacy, the outcome of these debates will shape not only the future of online safety for minors, but the very fabric of digital rights and freedoms across the continent. The choices made now will echo far beyond Brussels, setting precedents that could define the internet for a generation.