When Windsor, Connecticut agreed in 2022 to install 16 Automated License Plate Readers (ALPRs) from Flock Safety, town officials likely believed they were taking a step forward in public safety. But just two years later, the quiet suburb found itself at the center of a heated debate about surveillance, privacy, and the unintended consequences of new security technologies.
At the heart of the controversy was a setting labeled “Enable Nationwide Lookup.” According to CT News Junkie, this option—switched on by default—didn’t just allow Windsor police to search a national database for information about vehicles. It also gave hundreds of agencies across the country access to Windsor’s own ALPR data, all without notifying local authorities or residents. The implications of this revelation rippled quickly through the community. Police Chief Donald Melanson told the Town Council that the setting had since been disabled, restricting access to Connecticut agencies only, each of which must now request specific permission.
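To see why a single default matters so much, consider a minimal sketch of the logic such a setting implies. The Python below is purely illustrative; the class, field names, and query check are assumptions made for explanation, not Flock Safety's actual software or API.

```python
from dataclasses import dataclass, field

@dataclass
class AlprSharingConfig:
    """Hypothetical per-town ALPR sharing settings (illustrative only)."""
    town: str
    # Defaulting to True mirrors the reported behavior: unless someone
    # opts out, outside agencies can query the town's data unnoticed.
    nationwide_lookup: bool = True
    approved_agencies: set[str] = field(default_factory=set)

def may_query(config: AlprSharingConfig, requesting_agency: str) -> bool:
    """Decide whether an outside agency may search this town's ALPR data."""
    if config.nationwide_lookup:
        return True  # any participating agency, with no notice to the town
    return requesting_agency in config.approved_agencies

windsor = AlprSharingConfig(town="Windsor, CT")
print(may_query(windsor, "Anytown PD, TX"))   # True: the default lets it through

# The change Melanson described: default off, in-state agencies by request.
windsor.nationwide_lookup = False
windsor.approved_agencies.add("Hartford PD, CT")
print(may_query(windsor, "Anytown PD, TX"))   # False
print(may_query(windsor, "Hartford PD, CT"))  # True
```

The design point is the one residents objected to: an opt-out default means the permissive behavior holds until an administrator notices the setting exists.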
Still, for some Windsor residents, these changes were not enough. Davida Crabtree, former acting senior minister of First Church of Windsor, voiced the fears of many: “These cameras are having an impact—a negative impact—on people’s lives.” Crabtree described how families, particularly those with immigrant backgrounds, felt a “chilling effect” on their participation in church and community life. “Let us all understand what we’ve gotten ourselves into, and how we can get out of it,” she urged.
Despite these concerns, on January 19, 2026, the Windsor Town Council unanimously approved a new policy designed to balance privacy and security. The policy, as reported by CT News Junkie, aims to “protect the privacy, dignity, civil rights, and personal information of all residents” while still supporting legitimate law enforcement functions. Notably, it prohibits the use of facial recognition with the ALPRs—a feature not currently available, but one that could be added in the future, Melanson noted.
“ALPR use in the Town of Windsor shall reflect the balance between safety and privacy that residents expect,” the policy states. Council members and residents alike acknowledged that the new rules are “better than nothing,” but many called for further discussion and greater transparency. Liz Dupont-Diehl, a Windsor resident, emphasized, “It is imperative that we in Windsor and Connecticut do not knowingly put our residents at risk.”
Windsor’s experience isn’t unique. Across Connecticut, more than 40 police departments have adopted some form of license plate reader, with Flock Safety the most widely used provider. These devices capture and temporarily store images of the rear license plate of every passing car—never the driver’s image, officials say—with the data kept for only 30 days. According to Flock Safety, more than 5,000 law enforcement agencies nationwide now use its systems.
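The capture-and-expire behavior officials describe amounts to a rolling 30-day window. As a rough sketch, assuming a simple list of timestamped plate reads (the record shape here is invented for illustration), the purge job such a policy implies looks like this:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # the retention period officials cite

def purge_expired(records: list[dict], now: datetime | None = None) -> list[dict]:
    """Keep only plate reads captured within the retention window.

    Each record is assumed to be shaped like:
        {"plate_image": b"...", "captured_at": datetime, "camera_id": "..."}
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["captured_at"] <= RETENTION]
```

Run on a schedule, a job like this is what “kept for only 30 days” means in practice: anything older simply stops existing in the store.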
Supporters of the technology, like Chief Melanson, argue that ALPRs have been instrumental in solving crimes such as burglaries, robberies, and assaults, and in finding missing people. “We do know that crime travels across town lines,” Melanson explained. “Something happens in Manchester and they jump on I-291 and come into Windsor… Being able to access all that data at once is really important.” The department audits the system monthly, tracking which officers access the database and for what purpose, to ensure it’s used appropriately.
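The monthly audit Melanson describes follows a standard access-logging pattern: record who searched, what, and why, then review the log on a schedule. Here is a minimal sketch with hypothetical field names, not the department's actual system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class QueryLogEntry:
    officer_id: str
    plate_searched: str
    purpose: str          # e.g., "stolen vehicle", "missing person"
    case_number: str
    timestamp: datetime

AUDIT_LOG: list[QueryLogEntry] = []

def log_query(officer_id: str, plate: str, purpose: str, case_number: str) -> None:
    """Record every database search so a monthly audit can review it."""
    AUDIT_LOG.append(QueryLogEntry(
        officer_id=officer_id,
        plate_searched=plate,
        purpose=purpose,
        case_number=case_number,
        timestamp=datetime.now(timezone.utc),
    ))

def monthly_audit(log: list[QueryLogEntry]) -> dict[str, int]:
    """Summarize searches per officer; entries lacking a case number get flagged."""
    counts: dict[str, int] = {}
    for entry in log:
        counts[entry.officer_id] = counts.get(entry.officer_id, 0) + 1
        if not entry.case_number:
            print(f"FLAG: {entry.officer_id} searched {entry.plate_searched} "
                  f"with no case number on {entry.timestamp:%Y-%m-%d}")
    return counts
```

The value of the pattern is that misuse leaves a trace: a search with no stated purpose or case number stands out at review time.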
Yet, privacy advocates remain wary. The American Civil Liberties Union of Connecticut last year called for a statewide moratorium on ALPR use until comprehensive legislation is in place to prevent the misuse, sharing, or selling of driver-location data. Gus Marks-Hamilton of the ACLU voiced concerns during public comment: “There’s evidence ALPR data is used for immigration and abortion and gender affirming care, in violation of the Trust Act. The problem with mass surveillance is that it always expands beyond the use it was justified for.”
Other Connecticut towns are grappling with similar dilemmas. In Putnam, authorities temporarily covered five license plate readers while they held public hearings on their use and debated which agencies should have access to the data, The New London Day reported. Several police departments have also launched transparency portals on the Flock website, listing the number of cameras in use, which departments can access the data, and how many searches have been conducted. These portals also specify prohibited uses, such as immigration enforcement.
Windsor’s ALPR debate is part of a much broader national conversation about surveillance and biometric data. On January 19, 2026, the Shared Security Podcast discussed Amazon Ring’s new Familiar Faces feature, which uses AI-powered facial recognition to identify people at your doorstep. While this offers convenience—automatically recognizing friends, family, or delivery drivers—it also raises thorny questions about privacy, data accuracy, and the legal implications of collecting biometric information.
The podcast, as summarized by Shared Security, delved into the feature’s potential inaccuracies, the patchwork of privacy laws across U.S. states, and the broader concerns over AI-driven surveillance. Hosts provided practical advice for responsible use, but the underlying tension was clear: how do we balance technological convenience with the fundamental right to privacy?
Illinois has been at the forefront of this debate since 2008, when it enacted the Biometric Information Privacy Act (BIPA)—widely regarded as one of the nation’s strongest consumer privacy laws. BIPA requires companies to publish clear policies and obtain explicit consent before collecting or using biometric data, such as fingerprints, retina scans, or facial geometry. The intent was to protect consumers, but as the Chicago Tribune editorialized, the law’s implementation has led to a wave of costly litigation and legal uncertainty.
On December 17, 2025, the 7th U.S. Circuit Court of Appeals in Chicago upheld class certification in a lawsuit alleging that Amazon failed to meet BIPA’s notice and consent requirements for a virtual try-on feature that analyzes customers’ facial geometry. The law’s strict provisions have already resulted in massive settlements—Facebook paid $650 million over its facial tagging tool, and Google settled for $100 million regarding its photo grouping technology.
In 2023, the Illinois Supreme Court called on lawmakers to revisit BIPA after a case involving fingerprint scanners at White Castle restaurants highlighted the risk of astronomical damages. The court ruled that each instance of collecting biometric data without informed consent constituted a separate violation, potentially leading to billions in liability. Although the General Assembly amended the law in 2024 to limit excessive penalties, debate continues over whether these changes apply retroactively to pending cases like Amazon’s.
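The arithmetic behind the court’s warning is simple to reproduce. BIPA sets statutory damages at $1,000 per negligent violation ($5,000 if intentional or reckless), and under the per-scan ruling every unconsented fingerprint scan counts separately. The headcounts and scan rates below are invented for illustration, not figures from the White Castle record:

```python
# Illustrative only: the class size and scan rates are assumptions,
# not figures from the White Castle case.
NEGLIGENT_DAMAGES = 1_000   # BIPA statutory damages per negligent violation ($)

employees = 9_000           # hypothetical class size
scans_per_shift = 2         # e.g., clock-in and clock-out
shifts_per_year = 250
years = 10

violations = employees * scans_per_shift * shifts_per_year * years
liability = violations * NEGLIGENT_DAMAGES
print(f"{violations:,} violations -> ${liability:,} potential liability")
# 45,000,000 violations -> $45,000,000,000 potential liability
```

At that scale, the 2024 amendment’s shift to counting a single violation per person per collection method, rather than one per scan, changes exposure by several orders of magnitude, which is exactly why its retroactivity is being fought over in cases like Amazon’s.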
Critics argue that while strong privacy protections are necessary, BIPA in its current form creates a legal minefield that stifles innovation and invites opportunistic lawsuits. The Chicago Tribune editorial suggested that lawmakers should consider repealing the law and starting anew, warning that “if the state can’t fix this issue after years of evidence that the original law has gone wrong, then…give it a fresh shot.”
As surveillance technologies—from license plate readers to AI-powered doorbells and virtual try-on apps—become increasingly woven into daily life, communities across the country are wrestling with the same fundamental question: how do we harness the benefits of innovation without sacrificing privacy and civil rights? Windsor’s journey, and the experiences of states like Illinois, offer a cautionary tale and a call for thoughtful, transparent policymaking that keeps pace with technology’s relentless advance.