London’s Metropolitan Police are once again at the center of a heated debate as they prepare to deploy live facial recognition (LFR) technology at the Notting Hill Carnival, one of Europe’s largest street festivals, expected to draw around 2 million people to West London. The move, announced on August 20, 2025, has reignited national and international concerns over privacy, racial discrimination, and the adequacy of legal safeguards—while also exposing a striking lack of concrete evidence about how such technology performs in the unpredictable conditions of real-world public events.
At the heart of the controversy is the Metropolitan Police’s decision to use real-time face-matching cameras at the Carnival, a step that places the UK among a handful of countries where police can deploy such powerful surveillance tools without an active emergency. The force’s stated intent is to target a "small minority" responsible for serious crimes, including violence and sexual offenses, according to Commissioner Mark Rowley. However, critics argue that the risks to civil liberties and human rights far outweigh the potential benefits, especially in the absence of robust, transparent research demonstrating LFR’s effectiveness and fairness in live settings.
Eleven anti-racist and civil liberties organizations have jointly written to Commissioner Rowley, urging him to reconsider the deployment. Their letter, reported by The Guardian, warns that LFR could inflame longstanding public concerns about police overreach and racial profiling. The Equality and Human Rights Commission (EHRC), the UK’s statutory equalities watchdog, has also stepped into the fray, declaring that the Met’s current LFR policy is "unlawful" and "incompatible" with Articles 8, 10, and 11 of the European Convention on Human Rights, which guarantee the rights to privacy, freedom of expression, and freedom of assembly and association.
The EHRC’s intervention comes as part of a judicial review brought by anti-knife crime campaigner Shaun Thompson and privacy advocate Silkie Carlo, director of Big Brother Watch. Thompson, a Black British man, was wrongly identified as a suspect by an LFR system, detained by police, and pressured to provide his fingerprints, a case that has become emblematic of the technology’s potential for harm. Data reviewed by the EHRC shows that Black men are flagged by LFR alerts at a rate disproportionate to their share of London’s population, adding weight to allegations of systemic bias.
Rebecca Vincent, Big Brother Watch’s interim director, welcomed the EHRC’s involvement, calling it "hugely welcome" in this "landmark legal challenge." She warned, "The rapid proliferation of invasive live facial recognition technology without any legislation governing its use is one of the most pressing human rights concerns in the UK today. Live facial recognition surveillance turns our faces into barcodes and makes us a nation of suspects who, as we’ve seen in Shaun’s case, can be falsely accused, grossly mistreated and forced to prove our innocence to authorities."
The Met, for its part, insists that its use of LFR is both lawful and proportionate, emphasizing the presence of safeguards such as the automatic deletion of biometric data unless a match is found. A spokesperson told The Guardian, "We believe our use of LFR is both lawful and proportionate, playing a key role in keeping Londoners safe. We welcome the Equality and Human Rights Commission’s (EHRC) recognition of LFR’s potential in policing. The Court of Appeal has confirmed the police can use LFR under Common Law powers, with the Met carefully developing a policy to operate the technology in a way which protects people’s rights and privacy." The force also cited independent testing by the National Physical Laboratory, which it says informed how the technology is configured to avoid discriminatory performance.
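The Met has not published technical details of this safeguard, but the pattern it describes, comparing each detected face against a fixed watchlist and discarding the biometric immediately when no match is found, can be sketched in outline. The following Python fragment is a minimal illustration under assumptions of our own: the cosine-similarity matcher, the 0.6 threshold, and all function names are hypothetical, not the Met's actual system.

```python
import numpy as np

# Illustrative sketch only: the Met has not published its pipeline, and every
# name and number here is an assumption. Assumed design: each face detected in
# the live feed becomes an embedding vector, is compared against a fixed
# watchlist, and the biometric is discarded unless the similarity clears a
# threshold.

MATCH_THRESHOLD = 0.6  # hypothetical value; real systems tune this per deployment


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_detection(embedding: np.ndarray, watchlist: dict[str, np.ndarray]):
    """Return (person_id, score) for the best match above threshold, else None.

    On a non-match the embedding simply goes out of scope and is never
    persisted, mirroring the 'automatic deletion unless a match is found'
    safeguard described by the Met.
    """
    best_id, best_score = None, 0.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_id is not None and best_score >= MATCH_THRESHOLD:
        return best_id, best_score  # alert raised for a human officer to review
    return None  # no match: nothing retained
```

The branch that matters for the privacy argument is the last one: on a non-match, the embedding is never written to storage, so nothing about a passer-by persists beyond the comparison itself.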
Yet, critics remain unconvinced. Alba Kapoor, racial justice lead at Amnesty International UK, stated, "Facial recognition technology has been proven repeatedly to be discriminatory, being less accurate in scanning the faces of people of colour and leading to wrongful arrest, causing distress and harm. These systems violate our right to privacy, our rights to freedom of peaceful assembly and expression, and to equality and non-discrimination. No matter who we are, we should all feel safe to move freely without risk of harassment or false arrest. The Met’s plans to use this dangerous and discriminatory tool of surveillance should be immediately scrapped."
Adding to the complexity is the lack of specific domestic legislation regulating police use of LFR. The Met currently relies on common law powers and obligations under the Equality Act 2010, but the EHRC and other legal experts argue that this framework is insufficient. John Kirkpatrick, chief executive of the EHRC, stressed, "There must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan police’s current policy falls short of this standard."
Beyond the legal and ethical debates lies a more fundamental issue: the absence of reliable, real-world data on how LFR performs in the chaos of crowded public events. Researchers from the Oxford Internet Institute have pointed out that impressive laboratory results, such as those from NIST’s Face Recognition Technology Evaluation, fail to capture the realities of live deployments. Factors like weather, lighting, and crowd density can dramatically affect accuracy, yet are not reflected in controlled testing environments. The UK’s own National Physical Laboratory dataset, for instance, includes too few teenagers and no children, raising questions about the technology’s performance across age groups.
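The scale problem can be made concrete with back-of-envelope arithmetic: at an event of roughly 2 million people, even an apparently tiny false-match rate produces hundreds of false alerts, and false positives can dominate the alert stream. All figures in the sketch below are illustrative assumptions, not published Met or NPL statistics.

```python
# Back-of-envelope illustration of why lab accuracy figures can mislead at
# crowd scale. Every number below is an assumption for illustration; none is
# the Met's published configuration or a measured error rate.

attendees = 2_000_000          # rough Carnival attendance from reporting
scans_per_person = 1           # conservative: each face scanned only once
false_match_rate = 1 / 6000    # assumed per-scan false-match rate
watchlist_hits = 50            # assumed number of genuine watchlist matches

false_alerts = attendees * scans_per_person * false_match_rate
total_alerts = false_alerts + watchlist_hits

print(f"Expected false alerts: {false_alerts:.0f}")            # ~333
print(f"Share of alerts that are false: {false_alerts / total_alerts:.0%}")  # ~87%
```

Under these assumed numbers, hundreds of innocent attendees would be flagged and most alerts would be false positives. This base-rate effect is exactly what controlled benchmarks, which report error rates rather than absolute alert volumes, do not surface.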
Moreover, while critics often cite incidents of bias and misidentification, only one of the three major cases referenced by the researchers involved an LFR system deployed within the last five years, a period marked by rapid advances in deep learning algorithms. Still, as the three Oxford Internet Institute researchers concluded in Tech Policy Press, "Without dedicated and transparent research into these critical areas, decisions about the deployment of facial recognition systems will continue to be based on out-of-context lab results, rather than a clear understanding of their real-world impacts and inherent limitations."
Despite these unresolved questions, the Met is forging ahead. Commissioner Rowley has argued that the technology, law, and oversight have all improved since earlier deployments at the Notting Hill Carnival in 2016 and 2017. "Since then, we’ve made considerable progress. The current version of the algorithm is significantly improved, has undergone independent testing and validation, and now performs to a much higher standard," he wrote, as quoted by The Guardian. "The algorithm the Met now uses does not perform in a way which exhibits bias."
London Mayor Sadiq Khan has pledged to maintain transparency and oversight, stating through a spokesperson, "The mayor is committed to ensuring the Met is open and transparent around where, when, why and how live facial recognition technology is used. Sadiq and his Office for Policing and Crime will scrutinise and oversee its use to ensure there’s trust and confidence in policing as we continue to build a safer London for everyone."
Some voices within policing have also backed the move. Lord Paddick, a former senior Met officer once in charge of policing in Brixton, wrote, "The Met doesn’t get everything right, but I trust those responsible for the development and deployment of live facial recognition, and compared with other police crime prevention measures, such as blanket, no-suspicion stop and search, it is much less intrusive."
Meanwhile, Home Secretary Yvette Cooper has defended plans to expand LFR nationwide to catch "high-harm" offenders, and the Met has announced an increase in LFR deployments across the capital, up to 10 per week from the previous four. Yet as the Carnival approaches, the debate shows no sign of abating, with legal challenges pending and public trust hanging in the balance.
As London prepares for the vibrant spectacle of Notting Hill Carnival, the city finds itself at a crossroads: between the promise of technological innovation in policing and the enduring imperatives of human rights and social justice. The outcome of this debate—and the legal challenges now underway—will likely shape the future of surveillance in Britain for years to come.