The New Frontlines of Digital Litigation: What U.S. Courts, Companies, and Regulators Are Facing Now

The volume of privacy and cybersecurity litigation working its way through U.S. courts has reached levels that are straining legal teams, taxing judges, and forcing regulators to act with increasing urgency. Layered on top of that is the private sector’s rapid — and often legally untested — adoption of artificial intelligence, creating a tangle of novel issues that existing law was never written to address.

The picture that emerged from conversations at the IAPP Global Summit 2026 was one of an ecosystem under significant pressure from multiple directions at once.

Class Actions Are No Longer an “If” — They’re a “When”

For companies that process personal data at scale, the calculus around litigation risk has fundamentally changed. Hannah Levin, a partner at Morgan, Lewis & Bockius focusing on cyber, incident response, and privacy matters, put it in stark terms: more than 3,000 data breach class-action lawsuits were filed in 2025 alone, making privacy-related class actions one of the fastest-growing categories of complex litigation in the country.

What’s changed isn’t just volume — it’s predictability. Levin noted that companies are also dealing with a shrinking cyber insurance market, with more carriers exiting the space as coverage costs become financially untenable. The result is that many organizations now carry more legal exposure than they did just a few years ago, even as threats have grown more frequent.

The shift in how attorneys now evaluate breach risk is telling. Levin described how the conversation with clients has evolved:

“Maybe five years ago, a client would ask me what is the risk of being sued for this data breach, this privacy violation, and I would say, ‘Well, how many people are involved?'” she said. “I no longer do that analysis anymore because it has become more of a question of, is this going to be a public breach? Are we going to notify an attorney general? And if so, the likelihood of a class-action lawsuit is extremely high.”

The speed of litigation has also accelerated in ways that catch many organizations off guard. Class-action complaints are frequently being filed before the breached company has even completed its formal incident response — compressing the window between breach discovery and active litigation to a degree that was rare even a few years ago.

While no panelist at the IAPP event mentioned Swigart Law, Tauler Smith, Vivek Shah, or Morgan & Morgan by name, numerous attorneys and clients approached the Captain Compliance team to thank them for their work defending clients against claims from plaintiffs' firms. It was a notable moment for Captain Compliance as the only proactive software solution for handling wrongful-collection claims from these litigation firms.

The Expanding Circle of Defendants

Cari Laufenberg, a partner at Keller Rohrback, offered data that underscores just how dramatically the plaintiff landscape has shifted: privacy-related class-action complaints have increased by 200% since 2022. And increasingly, the primary target of a lawsuit isn’t the only target.

Third-party technology vendors — the infrastructure providers, cloud platforms, and data processors that sit behind consumer-facing products — are now regularly being pulled into litigation based on their role in how a breach occurred. The 2024 Snowflake breach became a widely cited example of how that dynamic plays out.

“Quite a bit now we’re seeing vendors being implicated, and these cases are what we call hub-and-spoke lawsuits,” Laufenberg explained. “You’ll have an entity that is the hub in the middle that is a vendor that has provided a service to all kinds of other organizations, and they get implicated in all of them.”

The types of claims being filed have also expanded well beyond traditional data breach theories. Plaintiffs’ attorneys are increasingly pursuing cases involving tracking pixels, biometric data, and the use of consumer data to train AI models — often doing so by applying older statutes like the Video Privacy Protection Act, the Electronic Communications Privacy Act, and California’s Invasion of Privacy Act to contexts those laws were never designed to govern.

The legal outcomes have been inconsistent. Laufenberg described a courtroom environment where there’s genuine doctrinal uncertainty about how legacy statutes apply to modern technology:

“It’s becoming more of an uncertain environment because it’s broadening the array of types of violations, and broadening the array of defendants who could be at issue. We’re really seeing divergent rulings on all of these hosts of issues, because there isn’t really well-developed law about how these statutes apply to the contexts in which they’re being applied.”

For businesses, that uncertainty cuts both ways — the law isn’t settled enough to predict outcomes with confidence, which can make settlement decisions both more difficult and more expensive.

What Judges Are Thinking About

Federal judges are grappling with their own set of novel challenges as AI capabilities make it increasingly difficult to authenticate evidence. U.S. District Court Chief Judge James Boasberg of the District of Columbia acknowledged that AI-manipulated photos and documents entering the evidentiary record is a real and growing concern — but was clear that the adversarial structure of the legal system, not judicial oversight, is the appropriate mechanism for catching it.

“It’s not up to judges to look at (evidence) and say, ‘I think this is a fake photo,'” Boasberg said. “The issue of flagging it at first glance is what you have the adversarial system for. I would hope the other side would say, ‘This isn’t legitimate, this is AI-generated,’ and then maybe you as a judge will have to make a ruling on it.”

That framing puts additional responsibility on attorneys on both sides of a case to scrutinize the authenticity of evidence and raise challenges when something looks off — a burden that will only grow as AI-generated content becomes more sophisticated and more difficult to distinguish from authentic materials.

The FTC’s Enforcement Priorities

Erik Jones, a senior attorney in the FTC’s Division of Privacy and Identity Protection, detailed several enforcement areas the agency is actively focused on. Recent actions include enforcement under the Protecting Americans’ Data from Foreign Adversaries Act, where the FTC sent letters to 13 data brokers highlighting their new obligations. Those letters specifically noted that the agency had found instances in which none of the recipients were adequately protecting the data of active-duty military service members.

The agency is also preparing to begin enforcement under the TAKE IT DOWN Act, and has been conducting workshops focused on children’s privacy and age verification. Jones flagged that age assurance technology providers are likely to come under increasing scrutiny — both for how their products function and for how they handle the sensitive personal information collected in the process. He pointed to recent enforcement actions as illustrations of the consequences of falling short: a $7.5 million action against education technology company Illuminate Education, and a $10 million COPPA settlement with Disney.

On the question of how organizations can avoid landing on the agency’s radar, Jones was direct about the two most common failure points he sees: inadequate data security for what is collected, and excessive data retention:

“Failing to implement adequate data security for what is being collected is something to be aware of, and also keeping data for longer than you need (can lead to compliance issues),” he said. “When you make promises to consumers on privacy and security, you have to ensure that those promises are being kept, because if they are not, it could be considered a deceptive trade practice and trigger a violation of Section 5 of the FTC Act.”

A Converging Risk Environment

The convergence of rising class-action volume, expanding defendant pools, legally unsettled technology claims, and active FTC enforcement creates a risk environment that rewards proactive preparation. Companies that have invested in genuine data minimization, transparent privacy practices, and defensible security programs are in a materially better position — both to avoid litigation and to defend against it when it comes.

The days of treating privacy compliance as a back-office function are long past. What’s unfolding in U.S. courts right now reflects something more fundamental: data handling decisions made today are increasingly litigation decisions, whether organizations think about them that way or not.
