U.S. News
13 January 2026

Charlottesville Ends License Plate Surveillance As Connecticut Debates Facial Recognition

City officials in Virginia halt Flock Safety camera use over privacy concerns while Connecticut stores face scrutiny for deploying facial recognition to deter theft.

In recent months, the debate over surveillance technology and privacy has intensified across the United States, with two high-profile cases in Virginia and Connecticut highlighting both the potential benefits and profound concerns surrounding automated monitoring systems. From the streets of Charlottesville to the aisles of Connecticut grocery stores, communities are grappling with the trade-offs between security, civil liberties, and the reach of private technology companies.

On December 15, 2025, the Charlottesville City Council voted to end its contract with Flock Safety, the company behind the Automatic License Plate Reader (ALPR) cameras that had been deployed across the city in a one-year pilot program. According to reporting from The Cavalier Daily, ten Flock cameras were installed in late 2024 at strategic locations, including the busy intersection of University Avenue and 14th Street. Local police initially championed the cameras as a way to alert officers in real time when stolen vehicles, vehicles linked to missing persons, or other vehicles of interest entered the city, supporting criminal investigations and broader public safety efforts.

Charlottesville City Manager Sam Sanders acknowledged the system’s success in aiding law enforcement, noting during the December council meeting, “Council has received a performance briefing and learned of the positive results in solving cases.” Indeed, the cameras were credited with helping to locate the suspect in the Brown University shooting on December 13, 2025. But positive results were not enough to outweigh mounting concerns about data protection, potential misuse, and a lack of local control over the system. Sanders explained, “Because of ongoing concerns … [including] an inability to guarantee our local parameters or protect information … Council has requested that we not move forward with this system.”

These concerns are far from hypothetical. The Virginia Center for Investigative Journalism at WHRO reported that between June 2024 and June 2025, law enforcement ran nearly 3,000 searches of Virginia’s Flock camera data in connection with immigration enforcement. A May 2025 law signed by Governor Glenn Youngkin, effective July 2025, now restricts ALPR data use to specific criminal investigations involving reasonable suspicion, missing or endangered persons, or alerts regarding stolen vehicles or plates. The law also prohibits law enforcement agencies from searching or downloading Flock system data unless it directly relates to one of those purposes.

Flock Safety, for its part, insists it does not automatically share data with federal agencies. In an October 22, 2025 press release, the company stated, “Local public safety agencies collaborate with federal agencies on a wide variety of serious crimes, including human and narcotics trafficking and multi-jurisdictional cases. If agencies choose to collaborate with federal agencies, that is wholly up to them.”

Yet skepticism remains. Charlottesville City Council Member Michael Payne, who opposed the cameras, raised alarms about the creation of a privately controlled, nationwide surveillance network capable of tracking vehicle movements across jurisdictions. He wrote in a statement to The Cavalier Daily, “Such a dataset can be connected with other data points and create a nationwide, privately owned system of mass surveillance. Once such capabilities exist, threats to civil liberties and privacy become inevitable.” Payne also cited troubling examples from other states, including the tracking of women visiting abortion clinics in areas where abortion has been outlawed, the targeting of people for deportation, and the use of the technology by abusers to locate partners at risk of domestic violence.

Adding fuel to the fire, an October 21, 2025 report from the University of Washington’s Center for Human Rights found that Flock surveillance systems were accessed by U.S. Border Patrol through what the report described as “back door access” in jurisdictions that had not explicitly granted permission. The report echoed Payne’s concerns, warning that such surveillance could be used to criminalize those seeking gender-affirming or reproductive healthcare. Flock Safety disputed these findings, stating in a press release, “These are slippery slope arguments that are not in line with the thousands of use cases of Flock technology being used to solve violent and property crime every week. We are unaware of any credible case of Flock technology being used to prosecute a woman for reproductive healthcare or anyone for gender affirming healthcare.” The company also said it had introduced “keyword filters” to prevent users from searching license plate data using terms related to immigration or reproductive healthcare in states where such searches are prohibited.

Despite the controversy, Payne argued the removal of Flock cameras would not significantly hamper police response during emergencies, pointing out that agencies rely on established protocols rather than ALPR data during active threats. “In the absence of Flock cameras, the agencies would respond to an active threat as they did before Flock cameras existed,” he said. “Flock cameras would not be used during an active threat, but rather would be utilized during investigations.” Charlottesville will hold a work session in 2026 to consider alternatives to the Flock system.

Meanwhile, hundreds of miles north in Connecticut, a different surveillance technology is sparking its own debate. As reported by WFSB, grocery stores such as ShopRite and Wegmans have begun using facial recognition technology to combat theft. ShopRite posts signs on its doors notifying customers that facial recognition is in use, and the cameras snap pictures of shoppers as part of the store’s security system. Wakefern, the parent company of ShopRite, claims the technology is used solely for security, with video footage regularly deleted, never sold, and only shared with law enforcement if a crime occurs.

Still, customers like Kelly Schuchardt of Avon are uneasy. “I am one of those moms who runs in and out, and I didn’t even take notice. But I guess it does concern me,” she told WFSB. Others, like Ralph Zimbouski, feel resigned: “Do you have a choice anymore? That’s a problem, you don’t have a choice no matter where you go. It’s their property; they are going to do what they want.”

Lawmakers in Connecticut are now considering new consumer protections, such as requiring consent or mandatory signage for facial recognition use. “There is something about your overall understanding of privacy—you don’t want your behavior turned over, and certainly sold, without your consent to other companies,” said Sen. Sujata Gadkar-Wilcox, a member of the General Law Committee. Republican Sen. Paul Cicarella, while recognizing privacy concerns, argued that facial recognition is a valuable tool for stores and should not be banned outright. “I understand it’s a concern, and we need to address that, but we should not ban this tool altogether,” Cicarella said.

Unlike New York, which already regulates stores’ use of facial recognition, Connecticut currently has no such laws. The ACLU has weighed in as well, warning that the technology creates serious privacy risks and raises concerns about racial justice.

As cities and states wrestle with these thorny questions, one thing is clear: the balance between public safety and individual privacy is more contested than ever, with technology companies, lawmakers, and citizens all vying to shape the rules of the road.