On October 3, 2025, Global Witness, a respected investigative research nonprofit, released a report that sent shockwaves through the digital world: TikTok’s algorithm, it claimed, isn’t just failing to shield minors from sexually explicit content—it’s actively steering them toward it. The findings, which quickly caught the attention of major media outlets like The New York Post and Channel 9 Eyewitness News, have reignited a fierce debate about the responsibilities of social media platforms in protecting their youngest users.
Researchers at Global Witness designed a straightforward, if unsettling, experiment. They created TikTok accounts posing as 13-year-olds (TikTok’s minimum age for users). The accounts were set up in the United Kingdom on clean devices with no prior search history, and every safety setting for minors was enabled. Yet, as The New York Post reported, the researchers were quickly bombarded with sexually explicit search suggestions. TikTok’s search bar offered eyebrow-raising phrases such as “hardcore pawn clips” and “very very rude skimpy outfits.” Following these suggestions led to videos depicting women simulating masturbation, flashing underwear, and even exposing their breasts.
It didn’t stop there. The Global Witness report, as cited by The Wall Street Journal, noted that the most extreme search suggestions led to full pornographic films of penetrative sex. In some cases, explicit content appeared to have been spliced into otherwise innocuous videos—an apparent attempt to slip past TikTok’s moderation algorithms. According to the report, “For one of the users, there was pornographic content just two clicks away after logging into the app—one click in the search bar and then one click on the suggested search.”
Perhaps most troubling, the Global Witness team hadn’t actually set out to study children’s online safety. They stumbled upon this issue while conducting unrelated research earlier in the year. But once they realized the gravity of what they’d found, they immediately alerted TikTok. In response, a TikTok spokesperson told The New York Post, “As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies and launch improvements to our search suggestion feature.” The company also emphasized that it removes roughly nine out of every ten videos that break its community guidelines—guidelines that include a strict ban on nudity and sexual content—before users ever see them.
Despite these assurances, Global Witness didn’t see much change. When they repeated their experiment in July and August 2025, using new accounts with no search history, they continued to receive explicit and sexualized search suggestions. This persistence, even after TikTok’s intervention, led Henry Peck, campaign strategy lead for digital threats at Global Witness, to call the findings “a huge shock.” Peck told The New York Post, “TikTok claims to have guardrails in place to make children and young people safe on its platform, yet we’ve discovered that moments after creating an account, they serve kids pornographic content. Now it’s time for regulators to step in.”
The report’s release comes at a pivotal moment in the broader debate over children’s safety online. According to Channel 9 Eyewitness News, governments in the United States and other countries are ramping up their scrutiny of social media platforms, demanding stronger protections and safer online spaces for minors. The timing is no coincidence: just days before the report’s publication, President Donald Trump signed an executive order approving the transfer of TikTok’s U.S. operations to a consortium of American-based investors. The move is widely seen as an effort to bring more American oversight, and perhaps accountability, to the platform.
The stakes are high. Pew Research Center data cited by Newsmax reveals that as of 2025, a staggering 63% of American teens use TikTok daily, with 16% reporting near-constant use. For many young people, TikTok isn’t just a pastime; it’s a primary channel for entertainment, communication, and self-expression. But as this report makes clear, the very algorithms that keep teens glued to their screens may also be putting them at risk.
The Global Witness report isn’t the first to raise alarms about TikTok’s content recommendations. Back in September 2021, The Wall Street Journal documented how TikTok’s algorithm could draw young users into an “unending loop” of adult content. What’s new—and chilling—about the latest findings is the apparent speed and directness with which the platform’s search suggestions push minors toward explicit material, even when every possible safety feature is enabled.
And it’s not just researchers who are noticing. The report highlights complaints from TikTok users themselves, who have taken to social media to share screenshots of inappropriate search suggestions. One user captioned their screenshot, “can someone explain to me what is up w my search recs pls,” while others replied with comments like, “I THOUGHT I WAS THE ONLY ONE,” and “same what’s wrong with this app.” Some users expressed frustration at being unable to rid their accounts of explicit suggestions, despite never having searched for such content in the first place.
In its public statements, TikTok has repeatedly emphasized its commitment to user safety. After the Global Witness report, the company reiterated its efforts to protect young users, stating that it had both removed violating content and improved its search suggestion feature. Yet, as the Global Witness researchers demonstrated, explicit recommendations continued to surface on new, supposedly protected accounts months after TikTok’s intervention.
This disconnect between TikTok’s assurances and the lived experience of its users—especially minors—has fueled calls for stronger regulation. Henry Peck’s call for regulators to step in echoes a growing sentiment among parents, educators, and policymakers that self-policing by tech giants may not be enough. As governments worldwide consider new legislation and oversight mechanisms, the TikTok case is likely to become a touchstone in the ongoing debate over how to balance free expression, innovation, and the imperative to protect children online.
For now, the Global Witness findings serve as a stark reminder of the risks that lurk behind the fun and creativity of social media. As more teens flock to TikTok and other platforms, the question of who is responsible for their safety—and how that responsibility should be enforced—remains as urgent as ever.