After six months of enforcing Oregon’s Consumer Privacy Act (OCPA), a new report from Oregon Attorney General Dan Rayfield indicates strong consumer engagement with the law’s privacy rights, notable business compliance efforts, and key areas where businesses are falling short. Since the OCPA took effect in July 2024, the Privacy Unit at the Oregon DOJ has received 110 consumer complaints.
The most common complaints involve data brokers, particularly background check websites selling personal information; social media and tech companies collecting and sharing user data; and denials of consumer rights requests, with the right to delete personal data being the most frequently requested yet denied right.
Under the OCPA, businesses failing to comply receive “cure notices,” which provide 30 days to resolve violations. Over the past six months, the Privacy Unit has initiated and closed 21 privacy enforcement matters. Common compliance deficiencies flagged include lack of required disclosures, confusing or incomplete privacy notices, and difficult or hidden opt-out mechanisms. Many businesses failed to inform consumers adequately about their rights under the OCPA, sometimes listing rights for other states but omitting Oregon.
The report highlights the largely positive response from businesses to initial enforcement. After receiving cure notices, companies quickly updated their privacy policies and rights mechanisms to comply with the DOJ’s requests. Upcoming compliance deadlines will extend the OCPA’s reach: nonprofits become subject to the law on July 1, 2025, and stricter enforcement begins January 1, 2026, when the 30-day cure period sunsets. Further provisions, such as the requirement to honor universal opt-out mechanisms, take effect in 2026, obligating businesses to respect automated opt-out requests from consumers.
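The OCPA does not mandate a specific technology for universal opt-out, but the signal most often cited in practice is Global Privacy Control (GPC), which participating browsers send as a `Sec-GPC: 1` request header. As a rough illustration only, a server-side check might look like the sketch below; the function names and handling policy are hypothetical, not drawn from the statute.

```python
# Minimal sketch: honoring a universal opt-out signal such as
# Global Privacy Control (GPC). Assumes the signal arrives as the
# "Sec-GPC: 1" request header; names and handling policy are illustrative.

def should_opt_out_of_sale(headers: dict[str, str]) -> bool:
    """Return True if the request carries a recognized opt-out signal."""
    # GPC-enabled browsers attach "Sec-GPC: 1" to outgoing requests.
    return headers.get("Sec-GPC", "").strip() == "1"

def handle_request(headers: dict[str, str], user_id: str) -> None:
    if should_opt_out_of_sale(headers):
        # Record the opt-out so downstream ad and data-sharing
        # pipelines exclude this user.
        print(f"Opt-out of sale/targeted advertising recorded for {user_id}")

# Example: a request from a GPC-enabled browser.
handle_request({"Sec-GPC": "1"}, "user-123")
```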
Meanwhile, Gal Ringel explores how organizations are transforming approaches to data subject requests (DSRs) from costly manual processes to strategic privacy initiatives. Under laws like the GDPR and CCPA, consumers can file DSRs to understand what personal data organizations collect. With approximately 6 billion people protected under nearly 150 data regulations, the challenge of fulfilling these requests intensifies, especially as awareness of data rights continues to rise.
The UK’s Information Commissioner’s Office recorded a 66% increase in complaints following the GDPR's implementation. Organizations must adapt to this patchwork of privacy rules across local, national, and global jurisdictions, handling rising volumes of DSRs within statutory deadlines.
Unfortunately, traditional methods of managing DSRs remain resource-intensive. Many companies are bogged down by sprawling networks of data sources, making compliance complex and burdensome, and large tech firms face overwhelming scale, fulfilling enormous volumes of user requests each year. To ease the burden, AI is being considered, particularly to help identify potential risks and automate routine parts of the workflow.
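As a purely illustrative sketch of the deadline-tracking side of this problem, the snippet below models a DSR with a jurisdiction-specific response window (the GDPR's one month is approximated as 30 days; the CCPA allows 45 days). All class and field names are hypothetical and not tied to any particular vendor's tooling.

```python
# Minimal sketch of a DSR intake record with jurisdiction-specific
# response deadlines. Window values reflect the statutes; everything
# else (names, fields) is illustrative.

from dataclasses import dataclass, field
from datetime import date, timedelta

RESPONSE_WINDOWS = {
    "GDPR": timedelta(days=30),  # Art. 12(3): one month, approximated as 30 days
    "CCPA": timedelta(days=45),  # Cal. Civ. Code 1798.130: 45 days, extendable
}

@dataclass
class DataSubjectRequest:
    requester_id: str
    request_type: str            # e.g. "access", "delete", "correct"
    jurisdiction: str            # e.g. "GDPR", "CCPA"
    received: date = field(default_factory=date.today)

    @property
    def due(self) -> date:
        """Statutory response deadline for this request."""
        return self.received + RESPONSE_WINDOWS[self.jurisdiction]

# Usage: flag a deletion request approaching its deadline.
dsr = DataSubjectRequest("user-123", "delete", "CCPA")
print(f"Respond by {dsr.due}")
```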
Among other concerns, California lawmakers proposed legislation addressing privacy intrusions affecting women, transgender individuals, and immigrants. These efforts respond to alarming incidents where personal data has been misused by anti-abortion groups and law enforcement, compelling lawmakers to strengthen privacy protections amid growing political challenges.
Responding to the misuse of healthcare data, proposed bills include Assemblymember Rebecca Bauer-Kahan’s AB 45, which would outlaw geofencing around healthcare facilities to prevent data brokers from targeting vulnerable communities seeking reproductive health services.
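To make concrete what AB 45 would prohibit, the sketch below shows the basic mechanics of a geofence: a point-in-radius test using great-circle distance. The coordinates, radius, and function names are illustrative only and are not drawn from the bill.

```python
# Minimal sketch of a point-in-radius geofence check, the kind of
# location targeting AB 45 would prohibit around healthcare facilities.
# Coordinates and radius are illustrative.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def inside_geofence(device: tuple[float, float],
                    center: tuple[float, float],
                    radius_m: float) -> bool:
    """True if the device's location falls within the fenced radius."""
    return haversine_m(*device, *center) <= radius_m

# Example: is a device within 500 m of a (hypothetical) clinic location?
print(inside_geofence((45.5201, -122.6750), (45.5190, -122.6800), 500.0))
```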
Senate leaders acknowledge the financial stakes for California, which depends on the federal government for more than 60% of its Medicaid funding. Against that backdrop, Gov. Gavin Newsom’s nuanced stance raises questions about the balance between protecting rights and maintaining federal cooperation. While Newsom has supported abortion, transgender, and immigrant rights, he has also expressed concerns about politically contentious issues such as sports participation by transgender athletes.
Controversially, Trump has rolled back protections against federal immigration enforcement near healthcare facilities, heightening fears about accessing care. California’s reforms aim to establish more stringent barriers surrounding personal information, particularly as law enforcement agencies face rising scrutiny for accessing private health data.
Recent research on popular AI tools such as DeepSeek sheds light on privacy issues. Despite the frenzy around DeepSeek since its launch, the data suggest some US-based chatbots collect more user information. Researchers at Surfshark found Google Gemini to be the most data-intensive option, gathering 22 of 35 possible data types, compared with DeepSeek's 11 and ChatGPT's 10. Concerns about these platforms center on their data collection practices, especially as AI increasingly draws legislative scrutiny.
DeepSeek’s privacy policy raises further alarms by stating that user information may be stored on servers outside the user's country, stoking fears of surveillance and data misuse. More broadly, organizations face significant challenges in building scalable, efficient systems that manage DSRs responsibly and ethically.
Legislation like the OCPA signals both advances and challenges within privacy advocacy. Lawmakers increasingly must weigh consumer rights, public safety, and the necessity of data access—all against the backdrop of powerful technologies continuously reshaping personal information landscapes. How America navigates these turbulent waters will largely influence the future of privacy and consumer rights.