05 February 2026

Sainsbury’s Apologizes After Facial Recognition Error Sparks Outcry

A longtime customer was wrongly ejected from a London store, raising concerns about privacy, technology, and the human cost of automated security systems.

It was an ordinary shopping trip that turned into a public ordeal for Warren Rajah, a 42-year-old data professional and long-time customer of Sainsbury’s in Elephant and Castle, south London. On February 3, 2026, Rajah was abruptly approached by two staff members and a security guard, asked to show his “bar code” (a request he initially took to mean his Nectar loyalty card) and then instructed to leave the store. The reason? Staff believed he had been flagged as an offender by the store’s newly implemented facial recognition system, Facewatch.

Rajah, who has shopped at the Elephant and Castle branch for a decade, was left stunned and humiliated. “It was the most humiliating moment of my life, being escorted out the place I have shopped in for 10 years in front of my community,” he told Metro, in comments reported by the Daily Mail. He described the experience as “Orwellian,” raising concerns not only about his own treatment but also about the broader implications of such technology for customers, especially those who are vulnerable.

Sainsbury’s, the UK’s second-largest supermarket chain, recently rolled out Facewatch technology in six London stores, including Dalston, Ladbroke Grove, Camden, Whitechapel, and Elephant and Castle, following pilot programs in Bath and Sydenham. The move came in response to a worrying rise in theft and violence against staff. According to BBC News, the pilot stores saw a 92% drop in repeat offenders and a 46% reduction in theft, aggression, and antisocial incidents. Sainsbury’s stated that this led to “fewer frightening moments for colleagues and a more reassuring experience for customers.”

Facewatch, which describes itself as “the UK’s leading facial recognition company providing a cloud-based facial recognition security system to safeguard businesses against crime,” works by scanning customers’ faces and comparing them to a database of known offenders. When a match is detected, store managers are alerted and tasked with verifying the match before taking action.
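Facewatch does not publish its internals, but the workflow it describes maps onto a familiar pattern in biometric security: compare each camera capture against a watchlist of enrolled faces, raise an alert on a strong match, and gate any real-world action behind a human check. The Python sketch below is purely illustrative; the function names, the similarity threshold, the embedding format, and the verification callback are assumptions for exposition, not Facewatch’s actual design.

```python
import math
from dataclasses import dataclass

# Illustrative sketch of a watchlist-alert flow with a human verification
# gate. Nothing here reflects Facewatch's real implementation; the names,
# threshold, and data shapes are assumptions made for this example.

@dataclass
class WatchlistEntry:
    subject_id: str
    embedding: list[float]  # face embedding computed from an enrollment image

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_against_watchlist(probe, watchlist, threshold=0.85):
    """Return the best watchlist entry scoring above the threshold, or None."""
    scored = [(cosine_similarity(probe, e.embedding), e) for e in watchlist]
    if not scored:
        return None
    score, entry = max(scored, key=lambda pair: pair[0])
    return entry if score >= threshold else None

def handle_camera_frame(probe, watchlist, confirm_visually):
    """Alert on a match, but act only after a person confirms it.

    `confirm_visually` stands in for the staff member who must check that
    the shopper flagged on screen really is the enrolled subject, i.e. the
    step that failed in Rajah's case.
    """
    entry = match_against_watchlist(probe, watchlist)
    if entry is None:
        return "no action"
    if confirm_visually(entry.subject_id):
        return f"approach subject {entry.subject_id}"
    return "dismiss alert"  # the human reviewer overrides the machine match
```

The structural point the sketch makes is the one both companies leaned on in their statements: the software only raises an alert, and the decision to approach someone rests with a person, which is precisely where the Elephant and Castle error occurred.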

Yet, as Rajah’s case illustrates, the human element in this process can be a weak link. Both Sainsbury’s and Facewatch were quick to clarify that the technology itself did not misidentify Rajah. Instead, the error occurred during the human verification stage—staff simply approached the wrong person. “This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store,” a Sainsbury’s spokesperson told The Standard. Facewatch echoed this, stating, “This incident arose from a case of human error in-store, where a member of staff approached the wrong customer.”

For Rajah, the aftermath was as troubling as the incident itself. To clear his name, he was directed to a poster in the shop window about the store’s use of facial recognition and told to contact Facewatch. To confirm that he was not on the system, Rajah had to submit a copy of his passport and a photograph of himself, a process he described as “a massive invasion of my privacy,” according to the Daily Mail. Facewatch explained that, by law, it must verify the identity of anyone requesting data before disclosing information. After completing this process, Facewatch confirmed there were “no incidents or alerts associated with [him]” and that he had not been subject to any alerts generated by its system.

Sainsbury’s subsequently apologized for the incident, offered Rajah a £75 shopping voucher, and stated that management at the Elephant and Castle store would receive additional training to prevent similar occurrences. “We have been in contact with Mr Rajah to sincerely apologise for his experience in our Elephant and Castle store,” a spokesperson said. “This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.”

Despite these assurances, Rajah remains concerned about the potential for future errors and the lack of clarity around customers’ rights. “Am I supposed to walk around fearful that I might be misidentified as a criminal?” he asked, speaking to BBC News. He also questioned the impact such incidents could have on more vulnerable customers: “Imagine how mentally debilitating this could be to someone vulnerable, after that kind of public humiliation.”

Rajah’s skepticism extends to the technology itself. While Facewatch and Sainsbury’s tout the software’s “99.98% accuracy” and emphasize that all matches are reviewed by trained colleagues before action is taken, Rajah questioned the data behind these claims. “What’s the data behind those figures? How can we verify the accuracy of those claims?” he asked The Standard. As someone who works in the tech industry, he acknowledged the potential benefits of artificial intelligence and facial recognition, but added, “The caveat is that it’s only ever as good as the people behind it.”
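Rajah’s question about the data is sharper than it may sound, because a headline accuracy figure means little without base rates. As a back-of-the-envelope illustration (the footfall figures below are assumptions made for the sake of arithmetic, not numbers from the article), reading “99.98% accuracy” as a 0.02% false-match rate still implies a steady trickle of wrong alerts once thousands of faces are scanned every day:

```python
# Back-of-the-envelope false-alert arithmetic. The accuracy figure is the
# one quoted by Facewatch; everything else is an assumed illustration.
accuracy = 0.9998                  # "99.98% accuracy" as quoted
false_match_rate = 1 - accuracy    # 0.02% of scans, on this reading
daily_shoppers_per_store = 4_000   # assumption for a busy urban supermarket
stores = 6                         # the London rollout reported above

false_alerts_per_day = false_match_rate * daily_shoppers_per_store * stores
print(f"Expected false alerts per day across {stores} stores: "
      f"{false_alerts_per_day:.1f}")
# Prints roughly 4.8, i.e. dozens of wrong alerts per week, each relying
# on the human verification step to catch before anyone is approached.
```

The point is not that the quoted figure is wrong, but that at retail scale even a tiny error rate produces routine alerts, so the quality of the human review step largely determines how many shoppers end up sharing Rajah’s experience.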

His experience is not unique. Other customers have reported similar incidents in UK stores using facial recognition technology. In one case, a B&M customer named Jenny was placed on a watchlist and barred from her local store after being wrongly accused of stealing a bottle of wine. She told BBC Radio 4’s Today programme, “It’s like we’ve made retail managers and technology companies judge, jury and executioner, with no legal due process.” Another woman was wrongly accused of stealing less than £1 worth of paracetamol from Home Bargains, while Danielle Horan from Manchester was ordered out of two separate shops after being falsely accused of theft.

These incidents have sparked criticism from privacy advocates and legal experts. Jasleen Chaggar, legal and policy officer at Big Brother Watch, said her organization “regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance.” The Information Commissioner’s Office weighed in as well, stating, “Retailers should carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process.”

For retailers, the stakes are high. The retail trade union Usdaw’s latest survey found that 71% of staff had experienced verbal abuse, 48% had been threatened by a customer, and 9% had been assaulted—a sobering context that explains the drive to adopt new security technologies. In November 2025 alone, Facewatch reported sending 49,589 positive alerts of known offenders to its retail partners.

But as Sainsbury’s pushes forward with its expansion of facial recognition systems, the balance between safety, privacy, and fairness remains a contentious issue. Rajah’s ordeal serves as a stark reminder that, for all their promise, these technologies are not infallible—and that human error can have very real, very public consequences. As more stores consider similar systems, the debate over how to safeguard both staff and shoppers—without sacrificing dignity or due process—will only intensify.

For now, Sainsbury’s has pledged additional training for its staff and a renewed commitment to transparency. But for Rajah and others who have been caught in the crosshairs of technology and human fallibility, the experience lingers as a cautionary tale for a society increasingly watched by machines.