Technology
22 September 2025

Roblox And Reddit Face Scrutiny Over Child Safety

Major online platforms introduce new age verification measures amid mounting pressure from regulators, parents, and privacy advocates over child protection and digital privacy risks.

When it comes to keeping children safe online, the stakes have never been higher. Two of the world’s largest digital platforms, Roblox and Reddit, are now facing mounting scrutiny and regulatory pressure to address child safety and age verification, following years of disturbing reports and the introduction of tough new laws in the UK and Australia. Their responses signal a dramatic shift in how tech giants are being forced to reckon with the darker corners of the internet, where predators and inappropriate content lurk just beneath the surface.

For millions of children, Roblox is more than a game—it’s a sprawling metaverse of over 44 million experiences, drawing 111 million daily users with its blocky, Lego-inspired worlds. But according to a September 21, 2025, investigation by 7NEWS, the platform’s very popularity has made it a magnet for predators. The eSafety Commissioner of Australia, Julie Inman Grant, didn’t mince words, calling Roblox a “popular target for paedophiles.” Her comments came as Roblox finally announced a suite of new safety measures, after years of relentless advocacy from parents, educators, and online safety experts.

Joyce McDonald, an 18-year-old gamer from Melbourne, has seen the problem firsthand. She’s reported predatory behavior on Roblox “maybe 20 to 50 times” in just the past few months, telling 7NEWS.com.au, “They have done nothing about it.” McDonald’s story is hardly unique. She described how predators use the platform’s open chat and private messaging features to befriend children, then coax them onto external messaging apps like Discord or WhatsApp. “If you go into a game that is very popular among young people, it’s pretty easy to find people trying to make friends and then, further on, you can see that they’re an adult. You can tell the signs, like when they say, ‘Hey, can we go off platform’.”

Some of the most alarming incidents have occurred in so-called “roleplay servers.” McDonald reported a “public bathroom roleplay” server that had been live for two months and had 300 people online when she last checked. She provided screenshots and video evidence of avatars engaging in sexual acts. Despite repeated reports, she said Roblox failed to take meaningful action.

Kirra Pendergast, founder of Safe on Social and an online safety educator, has spent years warning about these dangers. She likens letting a child play Roblox unsupervised to “dropping their child off outside a Westfield shopping centre, and saying: ‘Off you go, darling. You can go in there and be whoever you want to be for the next hour’.” Pendergast’s classroom sessions reveal just how normalized inappropriate requests have become. “If I ask a room full of kids who’s been asked to be someone’s boyfriend or girlfriend (on Roblox), most of the room will put their hand up, and they’re all giggling because it’s quite normal,” she told 7NEWS. Offers of Robux, Roblox’s in-game currency, in exchange for inappropriate actions are also disturbingly common.

The consequences can be devastating. In one tragic case, a lawsuit filed in September 2025 by Rebecca Dallas in San Francisco County Superior Court alleges that Roblox and Discord “recklessly and deceptively” operated their platforms in a way that led to the sexual exploitation and suicide of her 15-year-old son, Ethan Dallas. The suit claims Ethan was groomed for years by an adult posing as a child, coerced into sending explicit images, and then blackmailed. After months of escalating threats, Ethan died by suicide in April 2024. “These companies are raking in billions. Children are paying the price,” said Alexandra Walsh, an attorney representing the family in the case.

Regulators are now demanding change. Under Australia’s Online Safety Act and new industry codes, Roblox faces heavy fines if it fails to implement robust protections against grooming and abuse. The company has responded with what it calls “an ambitious plan” centered on age estimation technology, combining facial recognition, ID verification, and verified parental consent. Chat features will be disabled until a user’s age is confirmed. Those under 16 will need parental consent to communicate with adults, and voice chat will be banned between users aged 13 to 15 and adults. Roblox also plans to refine its avatar detection model to better police inappropriate behavior.

Yet experts caution that these measures are no panacea. “They will work around it. Creeps work around everything. But it’s a really good start,” Pendergast said. Commissioner Inman Grant echoed the need for continued vigilance: “I would also urge parents and carers to remain vigilant and actively support children in navigating online environments safely.”

Meanwhile, across the globe, Reddit is grappling with similar challenges. On September 22, 2025, Mashable reported that Reddit began verifying the ages of its UK users on July 14, 2025, to comply with the UK Online Safety Act. This law, passed in 2023, mandates that platforms hosting restricted content implement age verification systems by July 24, 2025. Reddit’s solution is a partnership with Persona, a third-party provider, requiring users to upload a selfie or government ID to access mature content. Persona will not retain images for more than seven days, and Reddit will store only a user’s verification status and birthdate, neither of which is visible to other users or advertisers.

Reddit’s chief legal officer, Ben Lee, assured users that the system is solely for age verification, not identity tracking. “Reddit will not have access to the uploaded photo, and Reddit will only store your verification status along with the birthdate you provided so you won’t have to re-enter it each time you try to access restricted content,” Lee explained. The company is also rolling out optional birthdate submission worldwide to tailor content and ads by age, and hinted at future updates to distinguish humans from AI bots.

Yet the new requirements have sparked a fierce debate over digital privacy. Free speech and privacy advocates argue that such laws are ineffective, hard to enforce, and risk exposing sensitive personal data. Many users are uneasy about uploading government IDs to access online content, fearing potential misuse or data breaches. Ben Lee acknowledged these concerns, stating, “We’re carefully watching how the law evolves… And we continue to advocate for alternative approaches that don’t require platforms to ask for IDs.”

Some experts and industry players, like adult content platform Pornhub, have pushed for device-based verification systems that would store age data on a user’s device rather than on company servers. This, they argue, could offer better enforcement without the same privacy risks. Governments worldwide are watching closely—Australia is set to require age checks for users logged into Google or Microsoft accounts by the end of this year.

Despite the technological advances and regulatory muscle, one thing remains clear: there’s no magic bullet for online child safety. As platforms like Roblox and Reddit roll out new protections, the onus will remain on parents, educators, lawmakers, and the companies themselves to stay one step ahead of those who would do harm. For now, the digital world remains a place of both wonder and peril—especially for its youngest explorers.