The world of privacy and cybersecurity is in a state of constant flux, and as 2026 dawns, the challenges facing companies are only increasing in complexity. Between rapidly evolving technologies, a tangle of new state and federal laws, and the ever-present risks of data breaches and regulatory scrutiny, organizations are scrambling to keep up. The push and pull between innovation and compliance has never been more pronounced, and industry experts say the coming year is shaping up to be one of the most challenging yet for those tasked with protecting sensitive information.
According to Dark Reading, 2025 was a watershed year for privacy and cybersecurity legislation in the United States. The Department of Justice (DOJ) announced a new Data Security Program, the Federal Trade Commission (FTC) updated the Children’s Online Privacy Protection Act (COPPA), and the Department of Health and Human Services (HHS) proposed amendments to the Health Insurance Portability and Accountability Act (HIPAA) Security Rule. Each of these regulatory moves highlighted just how much the privacy landscape has changed in the past decade—and how challenging it is for organizations to stay compliant.
"It's made more challenging by the frequency of how quickly things change in the environment," said David Saunders, a privacy and cybersecurity partner at McDermott, Will and Schulte, in an interview with Dark Reading. "I get it, but it's hard to expect compliance from companies when it's constantly changing. At some point, it has a deterrent effect on compliance."
One of the most hotly debated issues heading into 2026 is the implementation of minimum age requirements for app downloads and purchases. State laws requiring app stores like Google and Apple—and app developers themselves—to verify users' ages have been a flashpoint. In late December 2025, a federal judge temporarily blocked Texas' App Store Accountability Act, which was set to take effect on January 1, 2026. Meanwhile, a similar law in Louisiana was struck down by the state supreme court, though the attorney general has vowed to appeal. Utah, on the other hand, enacted its own age verification law in mid-2025, adding yet another layer of complexity.
"It's still front of mind because the laws were first of their kind," Saunders explained. Many companies, especially those with products aimed at children, found themselves racing to adapt to new frameworks and standards, only to have the courts intervene at the last minute. The uncertainty has left businesses in a holding pattern, unsure whether to invest in compliance measures that could be rendered moot by ongoing legal battles.
The challenges don’t end with age verification. The California Consumer Privacy Act (CCPA) is set to introduce new requirements in 2026, including mandatory cybersecurity audits and risk assessments. These changes will require companies to enhance their data collection practices, consent notices, and handling of sensitive information. For many organizations, the prep work has already begun, but the scope of the task is daunting.
Artificial intelligence (AI) is another area where the regulatory ground is shifting beneath companies’ feet. Human resources departments are increasingly using AI for resume screening and performance evaluations—a move that promises efficiency but also raises the risk of discrimination and bias. Illinois responded by amending its Human Rights Act to regulate AI use in employment decisions, with the law taking effect on January 1, 2026. Several other states are following suit, and companies are being forced to rethink how they deploy AI in the workplace.
"I think this year companies are catching up to the fact that these laws now exist," Saunders noted. The rapid pace of change means that many businesses are only now realizing the full extent of their compliance obligations.
At the federal level, the picture is even murkier. Demian Ahn, a partner at Wilson Sonsini specializing in data, cybersecurity, and privacy, told Dark Reading that the Trump administration’s approach to cybersecurity has been "inconsistent and a work in progress." While some officials and members of Congress have pushed for harmonization of rules, others have simply let proposed regulations languish. As a result, Ahn expects that enforcement will focus on existing laws and new requirements for industries with national security implications, such as those covered by the Cyber Incident Reporting for Critical Infrastructure Act (CIRCIA), which is due to be implemented in May 2026.
With federal action uncertain, states are stepping up. Attorneys general across the country are preparing to fill what they see as a void in federal enforcement. "I think it's going to continue to be on the state level, and frankly that's more complicated and introduces more burdensome compliance rubrics for companies," Saunders observed. Companies would prefer a single federal standard, but for now, they must navigate a patchwork of state laws—each with its own quirks and requirements.
Against this backdrop, the role of experienced privacy attorneys has never been more important. Jacqueline (Jackie) Cooney, who recently joined Nixon Peabody LLP as a partner on the Cybersecurity & Privacy team, brings more than 30 years of experience to the table. Cooney’s journey began in the early days of the internet, working in the Senate as lawmakers grappled with the implications of online data sharing. "It was the beginning of the government recognizing that personal information and privacy required protection," she recalled in an interview with Nixon Peabody’s "A Little Privacy, Please!" podcast.
Cooney specializes in helping clients build sustainable, flexible privacy programs that integrate legal compliance with business operations. Her work involves mapping out where data is stored and how it’s used, and ensuring that documentation and processes are in place to meet regulatory expectations. "That’s the bulk of what I do: digging into their programs; figuring out what needs fixing; and determining how to do it from an operational, practical perspective," Cooney explained.
Clients are increasingly concerned about the regulatory risks associated with AI, not just in the U.S. but also abroad. The EU AI Act and a flurry of state-level initiatives have left many companies unsure of which rules apply to them—and how to comply. "Clients are becoming really concerned from a couple of perspectives. One is regulatory. The landscape around AI regulations is shaky at the moment. Companies don’t know what laws apply to them now or what will apply," Cooney said.
In recent years, many organizations have tried to limit employee use of AI tools like ChatGPT, fearing that sensitive company or personal information could be inadvertently disclosed. But with AI becoming integral to business operations, the focus is shifting to governance—establishing clear rules for how and when AI can be used, and what safeguards need to be in place.
Cooney finds the most satisfaction in conducting comprehensive privacy program assessments and remediations. "So much of it relates to business operations. How do you integrate these rules and requirements without stymying business or getting in the way of profit?" she said. Her approach is pragmatic, helping clients develop risk-based programs that balance regulatory demands with the need to innovate and grow.
For companies looking ahead to 2026, the advice from experts is clear: stay vigilant, focus on the big-ticket compliance items, and be prepared for surprises. As Saunders put it, "The fun thing about privacy in my world is there's going to be something this year that I didn't expect." With new laws, evolving technologies, and an unpredictable enforcement landscape, one thing is certain—organizations will need to remain nimble and proactive if they hope to stay ahead of the curve.