Technology
15 January 2026

Google Settles Lawsuit Over Child Data Collection

A new $8.25 million settlement and workplace privacy reforms highlight growing scrutiny of data collection practices by tech giants and employers worldwide.

On January 15, 2026, Google found itself in the spotlight once again, agreeing to an $8.25 million settlement to resolve a class-action lawsuit that accused the tech giant of illegally collecting data from children under 13. The case, which has unfolded over two and a half years, was brought by the parents of six minors who downloaded popular children’s games like Fun Kid Racing and GummyBear and Friends Speed Racing from the Google Play Store. These games were part of Google’s much-promoted “Designed for Families” (DFF) program, which, at least on paper, required developers to comply with the federal Children’s Online Privacy Protection Act (COPPA). Yet, the parents alleged, the reality was far different.

According to the complaint detailed by The Record, Google’s AdMob software development kit continued to collect data from children’s devices, even after the company supposedly banned the offending apps from its store. The DFF program was intended to be a safeguard, with developers pledging to protect the privacy of young users. Under COPPA, companies are forbidden from knowingly collecting personal data from children under 13 without parental consent. Despite these assurances, the lawsuit claimed, Google was “surreptitiously exfiltrating the personal information of the children under the age of 13 who were playing the games.”

The legal action, brought by the parents, centered on the argument that Google knowingly violated COPPA, collecting personal information from children without the required parental consent. The complaint asserted that Google misled the public, presenting DFF apps as COPPA-compliant while quietly gathering sensitive data from its youngest users. The company, for its part, did not immediately respond to requests for comment as the news of the settlement broke.

Interestingly, the $8.25 million proposed settlement surfaced on the very same day that a different federal judge approved a much larger $30 million settlement in a separate case involving Google’s YouTube division. That lawsuit, dating back to 2019, alleged that YouTube illegally collected data from children, including IP addresses, geolocation information, and device serial numbers, which were then used for targeted advertising. This parallel case underscored the mounting legal and regulatory challenges facing technology companies as they navigate the complex terrain of children’s online privacy.

These high-profile settlements arrive at a time when data privacy is under unprecedented scrutiny—not just for consumers, but in the workplace as well. On January 15, 2026, Lewis Silkin published its Workplace Data Privacy Update, highlighting a global surge in regulatory and enforcement activity around data privacy. The report points to heightened scrutiny of employee monitoring and strengthened enforcement across the European Union, signaling a broader trend toward tighter controls and greater accountability in how organizations handle data.

According to the update, major legislative reforms are underway not only in the EU but also in countries like New Zealand, Chile, and India. These reforms are reshaping the landscape of workplace privacy, with new expectations for cybersecurity, data governance, and lawful data handling. Developments tied to the EU AI Act and the Digital Omnibus are particularly noteworthy, as they reflect the region’s commitment to safeguarding both individual and collective digital rights.

But what does this mean for employers and employees alike? The Lewis Silkin report emphasizes the importance of transparency, proportionality, and minimization in all aspects of workplace data management. From background checks to the use of biometric data, organizations are being called upon to ensure that their data practices are not just compliant with the letter of the law, but also with its spirit. The update offers practical guidance, urging employers to understand and meet their emerging obligations around privacy, data governance, and ethical use of technology.

The Google settlements and the broader regulatory developments detailed by Lewis Silkin are part of a larger story about the growing pains of the digital age. As technology becomes ever more embedded in daily life—from the games children play to the tools employees use at work—questions about privacy, consent, and accountability have moved to center stage. The stakes are high: for parents, the fear is that their children’s most personal details could be exploited without their knowledge; for workers, it’s about how much of their professional (and sometimes personal) lives can be monitored and analyzed by employers.

It’s worth noting that these issues aren’t confined to the United States or Europe. The legislative reforms in New Zealand, Chile, and India highlighted by Lewis Silkin demonstrate that concerns about privacy and data protection are truly global. Each jurisdiction faces its own unique challenges, but the common thread is a push for stronger protections and clearer rules. The EU’s moves around AI and digital regulation, for example, could set new standards that ripple far beyond its borders, influencing how multinational companies design their products and services.

For Google, the recent settlements underscore the risks of failing to live up to public commitments around privacy—especially when it comes to children. The company’s DFF program was supposed to be a model of responsible data stewardship, but the allegations suggest a troubling gap between policy and practice. The fact that AdMob continued to collect data even after the offending apps were removed from the Play Store raises uncomfortable questions about oversight and enforcement within one of the world’s most powerful tech companies.

Meanwhile, the YouTube settlement serves as a reminder that the business of targeted advertising—one of the engines of the modern internet—can run afoul of privacy laws when vulnerable populations like children are involved. Collecting data such as IP addresses and geolocation information for advertising purposes, without adequate consent, is increasingly seen as unacceptable by both regulators and the public.

As regulators ramp up enforcement and new laws come into effect, companies large and small are being forced to rethink their approaches to data privacy. The emphasis on transparency and minimization in the workplace, as outlined by Lewis Silkin, is likely to become a standard expectation across industries. Employers are being urged to review their monitoring practices, limit the use of intrusive technologies, and ensure that any data collection is justified, lawful, and respectful of individual rights.

The events of January 15, 2026, mark a turning point in the ongoing struggle to balance innovation with privacy. For families, workers, and businesses, the message is clear: data stewardship is no longer optional. Whether it’s a child playing a game on a smartphone or an employee logging into a company system, the right to privacy—and the obligation to protect it—has never been more important.