U.S. News
05 November 2025

States And Congress Intensify Children’s Online Privacy Push

A surge of new laws, federal reforms, and lawsuits is rapidly transforming the rules for protecting kids and teens online, challenging companies and regulators to keep pace.

Children’s privacy and online safety have taken center stage in the United States’ ever-shifting legal landscape, as lawmakers, regulators, and industry leaders grapple with the realities of kids growing up in a digital-first world. From sweeping state laws to major federal amendments and high-profile lawsuits, the movement to protect children online has become a defining issue for policymakers and legal professionals alike. The stakes are high: with companion chatbots, social media, and gaming platforms woven into the fabric of daily life, experts warn of the psychological and developmental risks associated with being “terminally online.”

For more than two decades, the Children’s Online Privacy Protection Act (COPPA) has set the federal standard, restricting the collection and use of personal information from children under 13. But as digital platforms have evolved, so too have the threats—and the regulatory response. According to a recent analysis by Wolters Kluwer, the Federal Trade Commission’s (FTC) 2025 amendments to COPPA represent a pivotal shift. The new rules expand the definition of personal information to include biometric identifiers and government-issued IDs, clarify data retention limits, and require companies to implement formal information security programs. Operators must also adopt new parental consent mechanisms, designed to make compliance less of a headache for both companies and families.

These changes aren’t just theoretical. The FTC has already taken action, filing a complaint against the anonymous messaging app “Send It” for failing to notify parents or obtain proper consent before collecting data from children under 13. According to Wolters Kluwer, this signals that “the regulatory framework is clearly tightening,” even if broader enforcement could be slowed by political factors.

Yet as Congress debates further reforms, states are charging ahead with their own rules, often moving faster than Washington can keep up with. Since 2020, a bipartisan mix of states has extended protections beyond age 13, requiring teens to opt in before their information can be sold or used for targeted ads, or outright banning the profiling of teen data. This growing patchwork of laws means companies must navigate a maze of compliance obligations, often making tough decisions about whether a state-by-state approach is even viable.

California has been at the forefront of this movement. Its Age-Appropriate Design Code (CA AADC), enacted in 2022 and modeled after the UK’s approach, aimed to require businesses to put children’s interests first and reduce online harms through thoughtful design. Federal courts have twice enjoined enforcement, but the idea has caught fire: Maryland, Connecticut, Nebraska, and Vermont are among the states pressing forward with their own age-appropriate design codes. These laws commonly demand that companies conduct data protection impact assessments, set high-privacy default settings, and prioritize children’s best interests in product design.

Legal challenges—especially First Amendment claims—have delayed enforcement in California and Maryland, but the movement is gaining momentum. “Legal professionals must prepare for a landscape that is both fragmented and fast-moving,” Wolters Kluwer notes, warning that tracking, interpreting, and responding to a patchwork of laws across jurisdictions is an increasingly complex challenge.

Social media platforms are also under the microscope. New laws in Florida, Georgia, Louisiana, Mississippi, Nebraska, Tennessee, and Utah restrict minors’ access to social media accounts by mandating the collection of age information and requiring parental consent for minors to open or maintain accounts. While these laws face a wave of litigation that could take years to resolve, other states—like California, Colorado, and Minnesota—are tackling concerns with prominent warnings and further restrictions.

Perhaps the most notable recent development is California’s Digital Age Assurance Act (DAAA), signed into law on October 13, 2025. This law shifts responsibility onto operating system developers to collect age information and generate an “age signal” that app developers must honor. Unlike earlier efforts in Utah and Texas, where private rights of action were included, the DAAA won support from major tech players like Google and Meta. The California model requires operating systems to provide developers with a user’s age range based solely on information provided by the account holder. Developers must treat the age signal as the “primary indicator” of a user’s age unless clear and convincing evidence suggests otherwise. This shift gives developers actual knowledge of a user’s age, triggering state-law privacy and COPPA obligations.
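On the developer side, the DAAA's age-signal flow amounts to a gating check: honor the OS-provided age range as the primary indicator, unless clear and convincing evidence points the other way. The sketch below illustrates that logic in Python; the band labels, field names, and function are illustrative assumptions, not anything specified by the statute or by any operating system vendor's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignal:
    """Hypothetical age signal as an OS might pass it to an app.

    The band labels here ("under_13", "13_15", "16_17", "18_plus") are
    placeholders for whatever ranges the statute and OS vendors define.
    """
    band: str

def needs_parental_consent(signal: AgeSignal,
                           contrary_evidence: bool = False) -> Optional[bool]:
    """Treat the OS age signal as the primary indicator of a user's age.

    Returns True or False based on the signal alone, or None when the
    developer holds clear and convincing evidence that the signal is
    wrong and age must be resolved through a separate verification step.
    """
    if contrary_evidence:
        return None  # signal overridden; re-verify age out of band
    # A signal of "under 13" gives the developer actual knowledge,
    # triggering COPPA-style verifiable parental consent obligations.
    return signal.band == "under_13"
```

The key design point the law encodes is that the developer does not collect age data itself; it consumes the account holder's self-reported range relayed by the operating system, which is why the sketch branches only on the signal and an override flag.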

Meanwhile, Congress is considering two major bills: COPPA 2.0 (S.836) would extend affirmative consent requirements to teens aged 13–16, ban targeted advertising to minors, and require platforms to offer an “eraser button” for deleting data. The Kids Online Safety Act (KOSA, S.1748) would impose a duty of care on platforms to protect minors from harms like addiction, anxiety, and suicidal ideation. Both bills passed the Senate last year with overwhelming bipartisan support but stalled in the House. Their reintroduction this session signals a growing consensus around the need for stronger protections, even if actual implementation remains uncertain. The House Energy and Commerce Committee is reportedly finalizing a package of bills, with hearings expected in the coming weeks, but resolution during this legislative session seems increasingly unlikely as Congress remains mired in other political battles.

Litigation is playing a major role in shaping the regulatory environment. Roblox is currently facing lawsuits from families and state attorneys general in Kentucky and Louisiana, alleging that the platform’s design made children vulnerable to abuse, despite assurances to parents about its safety. Roku is also being sued by attorneys general in Florida and Michigan, accused of improperly collecting and sharing children’s personal information and facilitating access to inappropriate content.

On the enforcement side, regulators are joining forces. The California Privacy Protection Agency and its international counterparts are scrutinizing how companies protect children online, underlining the urgency of the issue. The Attorney General Alliance recently launched its “Partnership for Youth Online Safety,” an initiative that brings together state attorneys general, industry participants, civil society, families, and academics. The goal: to identify practical, design-based safeguards for children’s online safety, establish rapid information-sharing frameworks between platforms and law enforcement, and promote greater parental awareness through education and new tools.

With the legal landscape evolving so quickly, compliance is no small feat. Experts recommend that companies stay current with both state and federal rules, regularly reassess internal practices, and leverage technology like age signals, parental controls, and safety-by-design features. Coordination across operating systems, platforms, and apps is increasingly expected to ensure consistent protection for minors. Legal professionals, for their part, are urged to review and update privacy policies, enhance security programs, and closely monitor legislative and regulatory developments using reliable tools such as VitalLaw, which provides daily-updated trackers and expert analysis to help legal teams stay ahead of the curve.

Despite the complexity and ongoing litigation, one thing is clear: children’s privacy now enjoys rare bipartisan support. Legal challenges may slow the rollout of new protections, but the movement shows no signs of stopping. For families, companies, and policymakers alike, the message is unmistakable—children’s online safety is no longer just a priority, it’s an imperative that’s reshaping the rules of the digital age.