On September 25, 2025, Canadian privacy authorities delivered a stern message to TikTok, the wildly popular video-sharing platform, after concluding a months-long investigation into its handling of children’s data. The Office of the Privacy Commissioner of Canada (OPC), alongside its provincial counterparts in Quebec, British Columbia, and Alberta, found TikTok’s efforts to keep children under 13 off its platform—and to protect their personal information—were simply not up to snuff. The findings have sent ripples through the tech world, highlighting both the challenges of regulating social media and the persistent gaps in protecting young users online.
According to the OPC’s report, “hundreds of thousands” of Canadian children under the age of 13 access TikTok each year, despite the platform’s stated policy that it’s not intended for users in that age group. The investigation revealed that TikTok not only failed to keep underage users off its service, but also collected and used their personal information for online marketing and content targeting. This, the OPC said, violated both the spirit and the letter of Canadian privacy laws, which require companies to obtain meaningful consent before collecting or using personal data—especially from minors.
The joint investigation, which wrapped up this week, paints a troubling picture of the digital landscape for Canadian youth. TikTok’s data collection practices were found to be both extensive and opaque: the platform gathers biometric and other sensitive data to estimate users’ ages for its own business purposes, yet its safeguards against underage access fell short. As Privacy Commissioner Philippe Dufresne put it, “Despite the fact that the application uses the information that it collects, including biometric information, to estimate users’ ages for its own business purposes, our investigation found that the measures that TikTok had in place to keep children off the popular video-sharing platform and to prevent the collection and use of their sensitive personal information for profiling and content targeting purposes were inadequate.”
It wasn’t just the collection of data that raised eyebrows. The OPC found that TikTok failed to adequately explain its data practices to teens and adults alike, and did not obtain proper consent as required under Canadian law. Many users, especially the youngest ones, were left in the dark about how their personal information was being used. This lack of transparency, according to the investigators, put children at risk of exploitation and manipulation—concerns that have grown more acute as social media’s influence on youth has exploded in recent years.
In response to the findings, TikTok has promised to strengthen its privacy practices. The company agreed to improve its privacy communications, particularly for younger users, so they better understand how their data is used for targeted advertising and content personalization. TikTok also committed to enhancing its age-assurance methods to prevent minors from accessing the platform in the first place. Additionally, the company said it would provide more privacy information in French, reflecting Canada’s bilingual requirements.
However, TikTok was quick to push back on some of the report’s conclusions. In a statement to Reuters, a company spokesperson said, “While we disagree with some of the findings, we remain committed to maintaining strong transparency and privacy practices.” The spokesperson did not specify which aspects of the investigation TikTok contested, but the company’s response underscores the ongoing tension between tech giants and regulators worldwide.
The Canadian probe is just the latest chapter in a growing global story. TikTok, which is owned by Chinese company ByteDance, has faced mounting scrutiny over its privacy and security practices in the United States, the European Union, and elsewhere. Concerns about the app’s potential to collect sensitive information and its national security implications have fueled legislative and regulatory actions on multiple continents. The Canadian investigation, then, is part of a broader reckoning with social media’s role in society—and the urgent need to protect children in a digital world that often seems to move faster than the law can keep up.
One especially thorny challenge highlighted by the OPC’s report is the difficulty of policing so-called “lurkers” or passive users. These are individuals—often children—who consume content on TikTok without posting videos or text. Because age-verification tools tend to focus on active account creation, many of these passive users slip through the cracks, exposed to the same targeted content and data collection as more engaged users. The investigation concluded that TikTok’s tools for keeping children off its platform were “largely ineffective,” particularly for these lurkers. Ultimately, the OPC found that TikTok was collecting and using the personal information of children with “no legitimate need or bona fide interest,” a practice it deemed “inappropriate” under Canadian law.
Dufresne emphasized the stakes in a statement: “This [investigation] underscores important considerations for any organization subject to Canadian privacy laws that designs and develops services, particularly for younger users. As technology plays an increasingly central role in the lives of young people in Canada, we must put their best interests at the forefront so that they are enabled to safely navigate the digital world.” He added that the ultimate goal is “to create a safer, more transparent online environment for children, where they feel empowered to exercise their privacy rights and where they can safely explore, learn, and grow without compromising their privacy or security.”
The investigation’s recommendations are clear: TikTok must do more to keep underage children off its platform, better explain its data collection and consent processes, and ensure that its privacy safeguards are not just promises on paper but realities in practice. The OPC and its provincial partners have signaled that they will be watching closely to see whether TikTok follows through on its commitments—and that other tech companies should take note.
Meanwhile, the broader debate over social media’s impact on society shows no signs of slowing down. From age-verification laws in Australia to high-profile regulatory battles in the UK and the US, lawmakers and advocates are increasingly calling for stronger protections for children online. The Canadian findings add fresh urgency to those calls, reminding everyone that the digital playground is still fraught with risk.
For now, the spotlight remains squarely on TikTok. Whether the company’s promised reforms will be enough to satisfy regulators—and, more importantly, to keep children safe—remains to be seen. But as the OPC’s investigation makes clear, the days of tech giants operating with minimal oversight are coming to an end. The conversation about privacy, data, and the rights of young users is only just beginning.