Roblox, the wildly popular online gaming platform, is rolling out significant updates aimed at enhancing the safety of its younger users, particularly those under the age of 13. The updates, set to be fully implemented by March 2025, will address concerns raised by parents and child safety advocates about the platform's ability to protect its young user base from potential predatory behavior and inappropriate content.
One of the major changes is the prohibition of direct messaging for users under 13. Instead of sending private messages to other players, these young users will only have access to public chat features within games. The restriction is intended to prevent children from communicating privately with strangers outside their established friend lists without parental oversight.
The move follows heightened scrutiny of Roblox, particularly after investigations exposed its vulnerabilities to online predators. A prominent report highlighted incidents in which adults had exploited the platform, prompting calls for stricter safety measures. "Roblox has been widely criticized for insufficient safety protocols, including cases where predators were able to locate and groom minors," said TechCrunch journalist Aisha Malik, underscoring the urgency behind these updates.
To facilitate parental engagement, Roblox is also introducing revamped account management tools. Parents will be able to remotely manage their child's account settings, including monitoring their friends list, adjusting spending limits, and setting daily time restrictions. Previously, such management could only be done from the child's device, which made it difficult for many parents to keep track of their children's gaming habits.
“These changes were developed after extensive research and consultations with experts,” explained Roblox's chief safety officer, Matt Kaufman. “Our goal is to make Roblox the safest online platform by continuously adapting our safety measures as our community grows.”
The initiative has been met with positive feedback from child safety organizations. Richard Collard from the NSPCC remarked, “This is certainly a positive step forward, but it should be accompanied by rigorous age verification mechanisms to truly translate these policies to safer user experiences.”
Roblox's user base skews young: of its roughly 80 million daily users, around 32 million are under the age of 13. This makeup has prompted the company to rethink and refine its safety strategies. The platform has been attracting increasingly young audiences, which has also raised alarm about what content its youngest players can access.
The safety measures go beyond messaging limitations. Roblox will replace age recommendations with content labels, giving parents clearer guidance on which types of games are appropriate for their children based on descriptors such as "minimal" or "restricted." For example, users under the age of nine will only be able to access games labeled "minimal" or "mild" by default, with stricter parental controls required for any higher content ratings.
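To make the tiered defaults concrete, here is a minimal sketch of how such an age-based content gate might work. It is a hypothetical illustration only: the "minimal," "mild," and "restricted" labels follow the article's description, while the "moderate" label, the age brackets, and the function names are assumptions rather than Roblox's actual implementation.

```python
# Hypothetical sketch of tiered content-label defaults (not Roblox's actual code).
# Labels "minimal", "mild", "restricted" come from the article; "moderate",
# the age brackets, and the consent logic are illustrative assumptions.

# Content labels ordered from least to most mature.
CONTENT_LABELS = ["minimal", "mild", "moderate", "restricted"]

# Assumed default ceiling per age bracket: under-9s default to "minimal"/"mild".
DEFAULT_MAX_LABEL = {
    "under_9": "mild",
    "9_to_12": "moderate",  # assumed bracket for older pre-teens
}

def can_access(age: int, game_label: str, parental_consent: bool = False) -> bool:
    """Return True if a child may access a game with the given content label."""
    if game_label not in CONTENT_LABELS:
        # Unlabeled experiences are blocked for younger users by default.
        return False
    bracket = "under_9" if age < 9 else "9_to_12"
    ceiling = DEFAULT_MAX_LABEL[bracket]
    within_default = CONTENT_LABELS.index(game_label) <= CONTENT_LABELS.index(ceiling)
    # Anything above the default ceiling requires explicit parental consent.
    return within_default or parental_consent

# Example: an 8-year-old can play a "mild" game by default,
# but needs parental consent for anything rated higher.
assert can_access(8, "mild") is True
assert can_access(8, "moderate") is False
assert can_access(8, "moderate", parental_consent=True) is True
```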
Beginning this week, the company is updating its default settings to prevent younger users from playing unlabeled experiences, underscoring its commitment to safeguarding children from unsuitable content. The platform will also automatically lift certain restrictions as children grow older, with parents notified 30 days before any changes take effect.
These updates are part of Roblox's broader strategy to evolve its safety measures. The platform has announced the formation of its first Teen Council, composed of users aged 14 to 17 who will provide insights on making Roblox safer and more welcoming for younger audiences. The input from the council is expected to guide future developments, emphasizing the importance of community feedback.
Despite these measures, experts acknowledge the need for parents to remain actively involved, encouraging open dialogue between parents and children about online safety. Carmi Levy, a Canadian tech expert, advised, “It’s important for parents to create an environment where children feel comfortable discussing any issues they encounter online without fear of judgement.”
Roblox's rollout of these new features marks a significant step in its efforts to regain the trust of parents and safety advocates, especially amid broader discussions about child protection on social platforms. The changes aim not only to comply with upcoming regulations but also to reflect Roblox's commitment to providing children with a secure gaming environment.
Overall, these initiatives form part of Roblox's proactive effort to address the scrutiny surrounding the platform and to meet the online safety expectations of users and guardians alike.