The UK’s Information Commissioner’s Office (ICO) has launched investigations into TikTok, Reddit, and Imgur over the protection of children’s privacy online. Announced on March 3, 2025, the inquiry focuses on how these platforms use the personal information of users aged 13 to 17, amid growing concern about minors’ exposure to inappropriate or harmful content.
The ICO’s scrutiny of TikTok is particularly notable. The app has faced legal trouble before, including a £12.7 million fine for processing the personal information of children under 13 without parental consent, in breach of data protection law. The current investigation examines how the platform uses minors’ data to curate content recommendations in its feeds. “Our investigation considers how the platform uses personal information of 13–17-year-olds to deliver suggested content,” the ICO stated, emphasizing its commitment to ensuring substantial safeguards for young users.
Reddit and Imgur are also under the microscope. The inquiry will assess how these platforms verify user ages and how they shield minors from harmful material, questions that have prompted extensive debate among regulators about the responsibilities of social media companies. A Reddit spokesperson said the company is committed to cooperating with the ICO and plans to make the changes needed to comply with updated UK rules on age assurance. “Although most of our users are adults, we prioritize compliance with privacy laws,” the spokesperson noted.
These investigations form part of the ICO’s broader push for improved practices around children’s privacy and safety online. John Edwards, the UK Information Commissioner, stressed that social media platforms must adhere strictly to privacy regulations: “If social media and video sharing platforms want to benefit from operating in the UK, they must comply with data protection law,” he declared. He placed the onus squarely on the technology sector, stating, “The responsibility to keep children safe online lies firmly at the door of the companies offering these services.” The message carries added weight as the ICO ramps up its investigations.
The UK’s Online Safety Act, passed in 2023, imposes stringent obligations on digital platforms to protect children, including effective age-verification mechanisms and privacy-preserving measures. By law, these platforms must prevent minors from encountering harmful or age-inappropriate content, which has prompted platforms like TikTok to rethink their data processing practices.
Age-assurance mechanisms, which include age verification tools and age estimation methods, are under close examination. The ICO is investigating how these companies handle age-sensitive data, including potentially sensitive information such as identity documents and data obtained from mobile operators, to protect children from inappropriate online interactions. These inquiries are not isolated; they form part of wider governmental concern about how social media platforms process minors’ data.
Despite these regulations and pressures to reform, challenges remain. While TikTok insists it has implemented industry-leading safety features and content restrictions for minors, concerns persist about whether these measures go far enough to safeguard young users. “We’re deeply committed to ensuring a positive experience for young people on TikTok, just like the ICO,” stated TikTok’s spokesperson, who pointed to the company’s current practices and technologies aimed at protecting child users.
The ICO has made clear it will act on any violations of the law uncovered by these investigations, and the responses from TikTok, Reddit, and Imgur will be closely watched. If evidence of breaches emerges, the ICO will present its findings to the companies before reaching any definitive conclusions, a process that could lead to substantial repercussions, including fines or enforced changes to operational practices.
This investigation builds on the ICO’s past efforts, in which it raised concerns about various social media platforms and their ability to protect children’s data. After its Tech Lab reviewed over 30 social media and video-sharing platforms, many were found to fall short of the expectations set by the ICO’s Children’s Code. The regulator has been vocal about its commitment to reinforcing children’s information rights online, and this latest initiative marks another step toward safer digital spaces.
For many observers, the outcome of these investigations will significantly shape the future of social media operations in the UK and beyond. The heightened scrutiny of data practices involving minors reflects shifting expectations of the responsibilities social media platforms have toward their youngest audiences. The ICO says it will provide updates as the investigations develop, reinforcing its commitment to children’s data privacy.