Cyberscoop

U.S. companies hit with record privacy fines in 2025


U.S. states issued $3.45 billion in privacy-related fines to companies in 2025, a total larger than the previous five years combined, according to research and advisory firm Gartner.

The increase is driven in part by stronger, more established privacy laws in states like California, new interstate partnerships built around enforcing laws across state lines, and a renewed focus on how AI and automation affect privacy.

The data indicates that “regulators are shifting their efforts away from awareness to full scale enforcement,” marking a significant change from even a few years ago in how aggressively states are investigating and penalizing companies for privacy law violations.

“This is increasingly becoming the standard in 2026 and for the coming two years,” Gartner’s analysis concludes.

Privacy-related fines have gone up significantly in recent years. (Source: Gartner)

Consumer privacy provisions of the California Consumer Privacy Act went live in 2023, but for years enforcement was largely dormant. According to Nader Heinen, a data protection and AI analyst at Gartner and co-author of the research, that enforcement lag mirrors how other major privacy laws, like Europe’s General Data Protection Regulation, have been rolled out: regulators “lead with a bit of guidance” for companies while using enforcement sparingly.

But that era appears to be over. In 2025, the California Privacy Protection Agency used the law to pursue violators across a wide range of industries: not just large conglomerates, but smaller and mid-sized companies in tech, the auto industry, and consumer products, including off-the-shelf goods and apparel.

Heinen said some businesses “weren’t paying attention” and may have been lulled into complacency as regulators spun up their enforcement teams, leading to a harsh 2025.

“Unfortunately what happens when so much time passes between the legislation and starting enforcement regularly, is a lot of organizations let their privacy program atrophy,” he said.

States have also sought to combine their resources to target and penalize privacy violators across state lines. Last year, ten states came together to form the Consortium of Privacy Regulators, pledging to coordinate investigations and enforcement of common privacy laws around accessing, deleting and preventing the sale of personal information.

Beyond laws like the CCPA, states have been updating existing privacy and data-protection laws to more directly address harms from automated decision-making technologies, including AI. State privacy regulators are especially focused on how personal or private data is used to train AI systems and help them make inferences.

Gartner expects privacy fines to increase further in the coming years. Heinen said states will likely again lead the way in building the legal infrastructure to enforce data privacy in the AI age, as they become the main conduit for lingering anxiety about the technology’s potential negative impacts.

“You have to put yourself in the position of these state legislatures,” Heinen said. “Their constituencies – the voting public – are telling them we’re worried about AI. AI anxiety is a thing. Everybody’s worried about whether AI is going to take their job or impact their capacity to find a job, so they want to see legislation in place to protect them.”

This past month, House Republicans unveiled their latest attempt to pass comprehensive federal privacy legislation with a bill that would preempt tougher state laws like those in California. In particular, the CCPA gives residents a private right of action – the legal right to sue companies directly – for violations of privacy laws.

On Monday, Tom Kemp, executive director of the California Privacy Protection Agency, wrote to House Energy and Commerce Chair Brett Guthrie, R-Ky., to oppose the bill, arguing it would provide “a ceiling” for Americans’ data privacy protections rather than a “floor” to build on.

“Preemption would strip away important existing state privacy provisions that protect tens of millions of Americans now,” Kemp wrote. “That would be a significant step backward in privacy protection at a time when individuals are increasingly concerned about their privacy and security online, and when challenges from data-intensive new technologies such as AI are developing quickly.”

Written by Derek B. Johnson

Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. Prior to that, he provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.


