On Tuesday, September 9, 2025, the halls of the U.S. Senate echoed with allegations that could shake the very core of Silicon Valley’s approach to child safety. Former Meta Platforms researchers Jason Sattizahn and Cayce Savage appeared before the Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law, leveling explosive accusations against their former employer. According to testimony and supporting documents reported by Roll Call and The Washington Post, Meta—the parent company of Facebook, Instagram, and a dominant force in the virtual reality (VR) sector—stands accused of systematically suppressing internal research that exposed serious child safety risks on its VR platforms.
For years, Meta has positioned itself as a pioneer in the digital world, especially with its Quest VR device lineup. But behind the innovation, whistleblowers allege, lurked a corporate culture more concerned with profit and public image than with the well-being of its youngest users. Sattizahn, who worked at Meta from 2018 to 2024 in integrity research, recounted how his team uncovered disturbing evidence: underage children using Meta VR in Germany were subjected to demands for sex acts, requests for nude photos, and exposure to other dangers no child should face. "Meta demanded that we erase any evidence of such dangers that we saw," Sattizahn told the Senate panel, as reported by Roll Call.
But the problems, according to the whistleblowers, ran deeper than isolated incidents. Savage, who worked at Meta from 2019 to 2023 as a user experience researcher, testified that Meta was fully aware of the prevalence of children on its VR platforms—despite terms of service requiring users to be 13 or older. "Meta purposely turns a blind eye to underage children using its VR platform," she said, adding that the company avoided removing young users because it would decrease the active user numbers presented to shareholders.
In one particularly chilling exchange, Senator Josh Hawley, R-Mo., pressed Savage on how widespread exposure to sexual content or abuse was among young users. Savage replied, "I would estimate that any child that is in a social space in VR will come into contact with or directly [be exposed to] something very inappropriate." She explained that the immersive nature of VR makes negative experiences especially damaging. "Because VR is immersive and embodied, negative experiences cause greater psychological harm than similar experiences on an iPad or an Xbox," Savage said. "In VR, someone can stand behind your child and whisper in their ear and your child will feel their presence as if it is real."
According to The Washington Post, Sattizahn and five other researchers described a pattern of legal intervention following the high-profile leaks by Frances Haugen in 2021. Meta, they alleged, deployed lawyers to screen, edit, and sometimes veto sensitive safety studies. Internal documents revealed that after the 2021 congressional scrutiny, Meta imposed new restrictions on research topics related to children, gender, race, and harassment. Researchers were even advised to avoid words like “illegal” or “violates” in their studies, a move the whistleblowers say was designed to create plausible deniability about the negative effects of Meta’s products on young users.
Employees repeatedly warned that children under 13 were finding ways to bypass age restrictions and join VR spaces. One staffer estimated as early as 2017 that 80 to 90 percent of users in some virtual rooms were underage, predicting this would eventually spark public outrage and negative headlines. Yet, according to the testimony, little was done. Sattizahn concluded, "Meta is incapable of change without being forced by Congress after unearned opportunities to correct behaviour." Despite repeated chances to address these problems properly, he said, the company has prioritized engagement and profit at any cost.
Meta, for its part, has strenuously denied the allegations. In an emailed statement, the company called the claims “nonsense,” arguing they were “based on selectively leaked internal documents that were picked specifically to craft a false narrative.” Meta spokeswoman Dani Lever echoed this sentiment, saying the company stands by its research team’s “excellent work” and has developed various safety protections for young users. Meta also asserted that, since the start of 2022, it had approved nearly 180 Reality Labs-related studies on issues including youth safety and well-being. “The truth is there was never any blanket prohibition on conducting research with young people,” Meta said.
Despite Meta’s denials, the allegations resonated with lawmakers and advocacy groups. Senator Marsha Blackburn, R-Tenn., who chairs the subcommittee, called the researchers’ accounts "shocking." She urged her colleagues to pass the Kids’ Online Safety Act, a bill she sponsors that would require social media platforms to implement design changes aimed at preventing harms to children, such as sexual exploitation and the marketing of narcotics. "They were hired to purportedly make the platform safer for children, and what they found was a company that knew their products were unsafe. And they just did not care," Blackburn said during the hearing.
Senator Amy Klobuchar, D-Minn., the panel’s ranking member, placed the testimony in a broader context. “For too long, these companies have worked to attract kids to their platforms. They do so knowing that their platforms use algorithms that increase the risk of sexual exploitation, push harmful content, facilitate bullying and provide venues, sadly, for dealers to sell deadly drugs like fentanyl,” Klobuchar said. She, too, pressed for the passage of the Kids’ Online Safety Act, though she acknowledged the difficulties in overcoming industry lobbying. “No matter what you seem to do, you get lobbied against and millions of dollars against you.”
The legislative push comes as courts and lawmakers grapple with how best to regulate social media and VR platforms. The Kids’ Online Safety Act passed the Senate last year but awaits further committee action. Other proposed bills would raise the age at which platforms can collect user data without parental consent from 13 to 17 and ban users under 13 from social media entirely. Meanwhile, a federal court in California recently blocked enforcement of the California Age-Appropriate Design Code Act—legislation that would have required platforms to design their systems in an age-appropriate manner—on free speech grounds.
Advocacy groups have seized on the whistleblower revelations. The youth-led coalition Design it for Us said in a statement, "Meta has spent years touting its efficacy in protecting young people online, but all the while, they were deleting evidence and research that confirmed countless instances of harm to young people on their platforms. These allegations are not isolated—they’re part of a well-documented, consistent pattern of negligence and deception from Meta."
Senators are now demanding answers. Last week, Senate Judiciary Chair Charles E. Grassley, R-Iowa, along with Blackburn and Hawley, sent a letter to Meta requesting detailed information on safeguards for young users and protection against sexual exploitation in VR. The company has until September 16 to respond.
As the debate over digital safety intensifies, the Senate hearing has laid bare the deep divisions between whistleblowers, lawmakers, and one of the world’s most powerful tech companies. Whether Congress will act decisively—and whether Meta will change its ways under pressure—remains a question with profound implications for the future of online childhood.