In a move that has sent shockwaves through London’s technology sector, TikTok’s Chinese parent company ByteDance announced on August 22, 2025, that it would lay off hundreds of staff from its London office, primarily those working in content moderation. The abrupt decision, which came just one week before employees were scheduled to vote on unionisation, has drawn fierce criticism from trade unions and reignited debate over the role of artificial intelligence in keeping social media safe.
According to The Independent, ByteDance’s restructuring plan will see many of these roles relocated to other European offices or outsourced to third-party providers, with Lisbon cited as one destination for offshored jobs. The company’s spokesperson explained the move as part of an ongoing effort to “concentrate our operations in fewer locations globally to ensure that we maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements.”
But the timing of the layoffs has raised eyebrows and inflamed tempers. Just days before the redundancies, TikTok’s London-based content moderation team had been preparing to participate in a voluntary ballot to establish a branch of the Communication Workers Union (CWU). In a letter to the union, a senior ByteDance employee wrote, “Given these exceptional circumstances, we have decided that it is necessary for us to suspend the planned voluntary ballot process with immediate effect.” The redundancy consultation process, the letter stated, made it impossible to proceed with the union vote as planned.
CWU national officer John Chadfield minced no words in his response, telling The Independent, “The timing is deliberate… and it is deliberately cruel. It is bare-faced union busting, leaves the members who have organised facing massive uncertainty and, from what we can see, they are just going to be offshoring these jobs to a third-party in Lisbon.”
The union’s frustration is rooted not only in the timing of the layoffs but also in the nature of the work performed by TikTok’s moderators. Chadfield described content moderation as “the most dangerous job on the internet,” explaining that moderators are routinely exposed to the most disturbing material imaginable. “The stuff they have to see is literally the stuff of nightmares,” he said.
ByteDance, for its part, maintains that the restructuring is necessary to strengthen its global “Trust and Safety” department. A company spokesperson told Zamin that the changes are intended to “improve speed and efficiency,” with a growing reliance on artificial intelligence to automate the removal of harmful content. TikTok claims that more than 85% of content taken down for violating its community guidelines is now identified and removed by automation.
Yet the CWU and other critics argue that the technology is not yet ready to replace the human element in content moderation. Chadfield warned, “While ByteDance has said it plans to adopt AI to take up some content moderation responsibilities, the technology is not yet ready and human moderators will be essential.” The union contends that relying solely on technology is dangerous and puts corporate interests above people’s safety.
Dismissed staff, according to Zamin, will have the right to apply for other positions within TikTok, though the union remains sceptical about the sincerity and sufficiency of these reassurances. The redundancy consultation process is still ongoing, but Chadfield insists that the union’s campaign is far from over. “The unionisation of TikTok is inevitable. They might want to delay it in the most spiteful way possible, but it is inevitable,” he said.
The layoffs and the suspension of the union ballot come at a particularly sensitive time for online platforms operating in the UK. Just last month, the country’s landmark Online Safety Act came into force, with enforcement in the hands of the regulator Ofcom. The law requires platforms like TikTok to protect UK viewers from illegal material, such as child sexual abuse and extreme pornography, and to prevent children from accessing harmful and age-inappropriate content. Non-compliance can now result in hefty fines.
To meet these new obligations, TikTok has stated that it is introducing additional control measures. The company’s safety moderation teams are trained to spot signs that an account may be operated by a child, and they can suspend such accounts. AI-based systems are also used to identify potentially underage users, relying on keywords and in-app reports from the user community.
Despite these assurances, the CWU fears that shedding hundreds of content moderators will make TikTok less safe for its users. “As the government clamps down on harmful content online, ByteDance’s decision to lay off hundreds of content moderators would leave TikTok a more dangerous platform,” Chadfield said. The union’s criticism is echoed by other voices in the tech and labour sectors, who warn that automation, while powerful, still cannot match the nuance and judgment of human moderators, especially when it comes to the most complex and distressing content.
ByteDance has emphasised that it has engaged voluntarily with the union, despite having no legal requirement to do so. The company’s UK head office is currently located in Farringdon, London, with plans to open a new office in the Barbican early next year. These moves, a TikTok spokesperson explained, are part of a broader reorganisation that began last year, aimed at strengthening the company’s global operating model for trust and safety by concentrating operations in fewer locations worldwide.
For TikTok’s London-based content moderation staff, the future remains uncertain. As the redundancy consultation process unfolds, many are left weighing their options: apply for other roles within the company, seek new employment elsewhere, or continue to fight for union representation in the hope of securing better protections and working conditions for themselves and their colleagues.
Meanwhile, the broader debate over the use of artificial intelligence in content moderation is unlikely to subside. While automation has proven effective at removing vast swathes of harmful material, critics argue that the technology is still no substitute for human judgment. The stakes are high: with the UK’s Online Safety Act now in force, platforms that fail to adequately protect their users could face not just public backlash but significant financial penalties.
The coming months will test whether ByteDance’s gamble on automation and consolidation will pay off—or whether the concerns of trade unions, regulators, and safety advocates will force the company to rethink its approach. For now, all eyes are on TikTok’s London office, where hundreds of workers wait anxiously to see what the future holds.