U.S. News
21 August 2025

UK Watchdog Challenges Met Police Facial Recognition

The Equality and Human Rights Commission joins a pivotal legal case, warning that the Metropolitan Police’s expanded use of live facial recognition technology risks violating fundamental rights and lacks sufficient safeguards.

On August 20, 2025, the debate over police surveillance in the United Kingdom reached a new peak as the Equality and Human Rights Commission (EHRC) was granted permission to intervene in a landmark judicial review examining the Metropolitan Police’s use of live facial recognition technology (LFRT). The review, R (Thompson and Carlo) v The Commissioner of Police of the Metropolis, is poised to shape the future of policing and privacy in Britain, with implications that could ripple far beyond London’s city limits.

LFRT, a form of artificial intelligence, scans the faces of passersby in real time against police “watchlists” containing thousands of individuals. According to the EHRC, this technology captures and analyses the unique biometric data of anyone passing in front of specially equipped CCTV cameras. The data is then rapidly compared to databases of people sought by law enforcement. While police tout the system as a tool for combating serious crime and keeping the public safe, critics worry about its potential for overreach and error.
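The matching step at the heart of such systems can be sketched in a few lines. This is a purely illustrative toy, not the Met’s actual system: the face “embeddings”, watchlist names, and the 0.9 similarity threshold below are all invented for the example, and real deployments use proprietary neural-network models rather than hand-written vectors.

```python
import math

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match(probe, watchlist, threshold=0.9):
    """Return watchlist entries whose similarity to the probe face exceeds the threshold."""
    return [name for name, emb in watchlist.items() if cosine(probe, emb) >= threshold]

# Hypothetical watchlist: each entry maps a name to a face embedding.
watchlist = {
    "suspect_a": [0.1, 0.9, 0.2],
    "suspect_b": [0.8, 0.1, 0.3],
}

# A camera frame produces a probe embedding; similar vectors trigger an alert.
print(match([0.1, 0.88, 0.21], watchlist))  # prints ['suspect_a']
```

The threshold is the key tuning knob: set it too low and innocent passersby trigger alerts (false positives); set it too high and wanted individuals pass unnoticed (false negatives).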

The EHRC’s intervention comes at a moment when the Metropolitan Police are expanding their use of LFRT, including plans to deploy the technology at the Notting Hill Carnival over the August bank holiday weekend—a move that has drawn sharp criticism from civil liberties groups. As reported by The Telegraph, the Met has increased LFRT deployments from four times per week over two days to as many as ten times per week over five days. This ramp-up is happening despite the absence of specific domestic legislation regulating LFRT; instead, police currently rely on common law powers.

John Kirkpatrick, chief executive of the EHRC, summarized the dilemma succinctly: “Live facial recognition technology is a tool which, when used responsibly, can help to combat serious crime and keep people safe. But the data this technology processes is biometric data, which is deeply personal.” He continued, “The law is clear: everyone has the right to privacy, to freedom of expression and to freedom of assembly. These rights are vital for any democratic society. As such, there must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan Police’s current policy falls short of this standard. The Met, and other forces using this technology, need to ensure they deploy it in ways which are consistent with the law and with human rights.”

The EHRC’s concerns are not merely theoretical. The regulator points to real-world consequences—such as the case of Shaun Thompson, an anti-knife crime community worker who was wrongly identified by LFRT in 2024. According to The Guardian, Thompson’s experience led to significant disruption and distress, highlighting the very real risks of false positives. Data further shows that black men are disproportionately likely to trigger an “alert” compared to their representation in London’s population, raising urgent questions about bias and discrimination.

Rebecca Vincent, interim director of Big Brother Watch, a civil liberties group, welcomed the EHRC’s involvement, calling the rapid proliferation of LFRT “one of the most pressing human rights concerns in the UK today.” She argued, “Live facial recognition surveillance turns our faces into barcodes and makes us a nation of suspects who, as we’ve seen in Shaun’s case, can be falsely accused, grossly mistreated and forced to prove our innocence to authorities. Given this crucial ongoing legal action, the Home Office and police’s investment in this dangerous and discriminatory technology is wholly inappropriate and must stop.”

The EHRC’s legal submissions draw on international developments, notably the European Union’s AI Act, which classifies law enforcement use of LFRT as “high risk” and demands strict safeguards. The EU’s approach contrasts with the UK’s ongoing expansion of the technology, including the government’s recent announcement of more LFRT-equipped police vans rolling out across England. The EHRC argues that the UK’s policy is out of step with these international standards, particularly regarding Articles 8, 10, and 11 of the European Convention on Human Rights—covering the rights to privacy, freedom of expression, and freedom of assembly and association.

The Metropolitan Police, for their part, maintain that their use of LFRT is both lawful and proportionate. A spokesperson told The Telegraph, “As part of this model, we have strong safeguards in place, with biometric data automatically deleted unless there is a match. Independent research from the National Physical Laboratory has also helped us configure the technology in a way that avoids discrimination.” Commissioner Sir Mark Rowley has sought to reassure campaign groups that the technology will be used without bias, emphasizing its role in tackling “high-harm” offenders.

Still, the EHRC is not convinced. The commission warns that the sheer scale of LFRT deployments—where thousands of faces can be scanned in a single operation—means even low error rates can lead to significant numbers of false identifications. The consequences of these errors, as the Thompson case illustrates, can be deeply disruptive for individuals. The EHRC also stresses that the use of LFRT at protests or public gatherings could have a “chilling effect” on the exercise of fundamental rights, deterring people from expressing themselves or assembling freely in public spaces.
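The EHRC’s scale argument is simple base-rate arithmetic. As a rough illustration only, with hypothetical numbers that do not come from the Met or any published evaluation: if an operation scans tens of thousands of faces, even an error rate of one in ten thousand yields multiple wrongful alerts.

```python
def expected_false_matches(faces_scanned: int, false_positive_rate: float) -> float:
    """Expected number of incorrect alerts for a given scan volume."""
    return faces_scanned * false_positive_rate

# Hypothetical 0.01% false-positive rate applied to crowds of increasing size.
for crowd in (10_000, 100_000, 1_000_000):
    print(crowd, expected_false_matches(crowd, 0.0001))
# prints:
# 10000 1.0
# 100000 10.0
# 1000000 100.0
```

This is why the EHRC argues that a low per-scan error rate is not, by itself, a safeguard: the expected number of false identifications grows linearly with the number of faces scanned.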

In response to criticism about accuracy and discrimination, the Met has adopted a minimum accuracy threshold for LFRT systems since July 25, 2024, aiming to limit adverse impacts on protected groups. However, the EHRC notes that accuracy alone is not enough—robust legal safeguards and oversight are essential to prevent misuse and protect civil liberties.

The 2020 case against South Wales Police, where the use of LFRT was found to be unlawful at the time, looms large over the current debate. That ruling underscored the importance of transparency, accountability, and legal clarity in the deployment of new surveillance technologies. Yet, as the UK government continues to expand LFRT’s reach, campaigners argue that lessons from the past risk being ignored.

Yvette Cooper, the Home Secretary, has defended the expansion of LFRT, framing it as a necessary step to catch dangerous offenders and keep communities safe. But opponents counter that without clear legislation and strong safeguards, the technology’s risks may outweigh its benefits.

As the judicial review unfolds, all eyes will be on the courts to provide the clarity and guidance that have so far been lacking. The outcome will not only determine the future of LFRT in London but could set a precedent for how emerging technologies are balanced against fundamental rights in the UK and beyond.

For now, the debate remains as heated as ever, with both sides digging in. What’s at stake, ultimately, is the delicate balance between security and liberty—a question that has never been more urgent in the age of artificial intelligence.