Technology
05 February 2026

Sainsbury’s Facial Recognition Error Sparks Privacy Outcry

A longtime London shopper was wrongly ejected from his local Sainsbury’s after staff misinterpreted facial recognition alerts, fueling debate over biometric surveillance and retail security.

Warren Rajah had been a loyal customer of his local Sainsbury’s in Elephant and Castle, south London, for a decade. On the afternoon of February 3, 2026, that routine was upended in a way he could never have anticipated. As he browsed the aisles, Rajah was approached by two staff members and a security guard who, without warning or explanation, asked him to leave immediately. Confused, Rajah asked for a reason. Instead of a clear answer, staff simply pointed to a sign announcing the store’s use of facial recognition technology.

Rajah’s experience, as reported by BBC News, the Daily Mail, and other outlets, highlights the risks and realities of a growing trend in UK retail: the use of facial recognition software to combat theft and violence. Sainsbury’s, the country’s second-largest supermarket chain, had recently rolled out the Facewatch system in six London stores, including Rajah’s local branch, to address rising incidents of shoplifting and aggression toward staff. According to the company’s early figures, 92% of identified offenders did not return, and pilot locations saw a 46% drop in theft, aggression, and antisocial behavior. Justifying the expansion, Sainsbury’s said the system meant “fewer frightening moments for colleagues and a more reassuring experience for customers.”

But for Rajah, a 42-year-old data professional, the technology delivered neither reassurance nor fairness. Instead, he faced public humiliation. “It was the most humiliating moment of my life, being escorted out of the place I have shopped in for 10 years in front of my community,” he told Metro, as quoted by the Daily Mail. Rajah described the ordeal as “Orwellian” and traumatic, expressing concern about the potential impact on more vulnerable customers. “Imagine how mentally debilitating this could be to someone vulnerable, after that kind of public humiliation,” he told BBC News.

The confusion didn’t end at the store’s exit. Rajah, desperate to clear his name, contacted Facewatch directly. To verify he was not on their database, he was required to submit a photo of himself and a copy of his passport—a process he described as “a massive invasion of my privacy.” “I started panicking massively because I don’t know anything about this company or what they do. Do they record crimes as they happen? Are they linked to law enforcement? Would this impact my career?” he wondered aloud, as reported by the Daily Mail.

Facewatch, which describes itself as “the UK’s leading facial recognition company providing a cloud-based facial recognition security system to safeguard businesses against crime,” confirmed that there were no incidents or alerts associated with Rajah. The company explained that, by law, it must conduct “appropriate identity checks” before disclosing sensitive information. “Our Data Protection team followed the usual process to confirm his identity and verified that he was not on our database and had not been subject to any alerts generated by Facewatch,” the company said in a statement quoted by the BBC.

Ultimately, both Facewatch and Sainsbury’s concluded that Rajah’s ejection was not the result of a technological failure, but rather a “human error” in which staff had misinterpreted the system’s alert and approached the wrong person. Sainsbury’s apologized to Rajah, offered him a £75 shopping voucher, and stated, “This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.” The company also announced that management at the Elephant and Castle store would receive additional training to prevent similar incidents in the future.

Still, Rajah remained unconvinced by assurances about the technology’s safety and accuracy. “Am I supposed to walk around fearful that I might be misidentified as a criminal?” he asked, echoing broader public anxieties about biometric surveillance. He criticized the system’s reliance on human interpretation and questioned whether staff were sufficiently trained to handle such sensitive technology. “The Facewatch system relies on humans to interpret the alerts and I see Sainsbury’s staff as being insufficiently trained,” he told the BBC.

Rajah’s case is not unique. Previous incidents involving Facewatch have been reported across the UK. In one case, a B&M customer in Cardiff was wrongly added to a watchlist and accused of shoplifting. Another customer, Jenny, was banned from a Birmingham store after being falsely accused of stealing wine. A 64-year-old woman was put on a watchlist for allegedly stealing less than £1 worth of paracetamol from Home Bargains, and Danielle Horan from Manchester was ordered out of two shops after being mistakenly identified as a toilet roll thief. As Jenny told BBC Radio 4’s Today programme, “It’s like we’ve made retail managers and technology companies judge, jury and executioner, with no legal due process.”

These stories have drawn the attention of privacy advocates and legal experts. Jasleen Chaggar, legal and policy officer at Big Brother Watch, told the BBC that her organization “regularly hears from members of the public who are left traumatised after being wrongly caught in this net of privatised biometric surveillance.” The Information Commissioner’s Office weighed in as well, urging retailers to “carefully consider the risks of misidentification and have robust procedures in place to ensure the accuracy and integrity of the personal information they collect and process.”

Retailers, for their part, argue that such measures are necessary in the face of rising violence and abuse. According to the retail trade union Usdaw, in the past year 71% of shop staff reported verbal abuse from customers, 48% had been threatened, and 9% had been assaulted. Sainsbury’s and other chains, including Budgens, Sports Direct, B&M, and Home Bargains, have turned to Facewatch in hopes of curbing these trends. The company touts its “99.98% accuracy” and claims to have sent over 49,000 positive alerts in November 2025 alone.

Yet even as the technology’s defenders highlight its benefits, critics like Rajah demand greater transparency and accountability. “What’s the data behind those figures? How can we verify the accuracy of those claims?” he asked in an interview with The Standard. As someone who works in data himself, Rajah believes in the promise of artificial intelligence and technology—“The caveat is that it’s only ever as good as the people behind it.”
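Rajah’s question has a concrete statistical edge. A headline accuracy figure, taken alone, does not determine how many alerts are wrong: that depends on how many faces are scanned and how rare genuine watchlist matches are. The short Python sketch below works through the base-rate arithmetic using the 99.98% figure Facewatch cites, read as a 0.02% per-scan false match rate, which is one plausible interpretation of that claim. The monthly scan volume and watchlist prevalence are purely hypothetical assumptions chosen for illustration, not drawn from any reported data.

```python
# Back-of-envelope base-rate arithmetic for facial recognition alerts.
# The accuracy figure is the vendor's public claim quoted above; the scan
# volume and watchlist prevalence are HYPOTHETICAL assumptions for illustration.

accuracy = 0.9998                 # claimed accuracy, read as 0.02% of scans misfiring
false_match_rate = 1 - accuracy

monthly_scans = 5_000_000         # assumption: faces scanned per month across stores
watchlist_prevalence = 0.001      # assumption: 1 in 1,000 shoppers is a true match

# Expected alerts per month under these assumptions.
true_alerts = monthly_scans * watchlist_prevalence * accuracy
false_alerts = monthly_scans * (1 - watchlist_prevalence) * false_match_rate

precision = true_alerts / (true_alerts + false_alerts)

print(f"Expected true alerts:  {true_alerts:,.0f}")          # ~5,000
print(f"Expected false alerts: {false_alerts:,.0f}")         # ~1,000
print(f"Share of alerts that are correct: {precision:.1%}")  # ~83%
```

Under those illustrative assumptions, roughly one alert in six would point at an innocent shopper even with the claimed accuracy taken at face value, which is why critics argue that an accuracy percentage means little without the underlying scan volumes, match thresholds, and error definitions.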

For now, Sainsbury’s says it will give its staff additional training and continue reviewing its deployment of facial recognition systems. The company maintains that Facewatch technology has not wrongly identified anyone else in its stores, and that this was the first time a manager had approached the wrong person. But for Rajah, and for many concerned customers, the incident serves as a stark reminder that even the most advanced security tools are only as effective, and as fair, as the people who use them.