California’s privacy enforcement landscape is poised for significant evolution in 2026, with state regulators signaling both increased financial penalties and a strategic pivot toward examining how data practices actually impact consumers’ daily lives. These developments, disclosed at the recent IAPP Global Summit 2026, suggest that organizations operating in California should prepare for more rigorous scrutiny and potentially more expensive consequences for violations.
The California Privacy Protection Agency (CPPA) and the state Attorney General’s office—which share enforcement authority under the California Consumer Privacy Act (CCPA)—are charting courses that could fundamentally change the cost-benefit calculus companies use when evaluating privacy compliance investments.
The Fine Adjustment Problem
Since the CPPA began issuing penalties in 2024, enforcement actions have resulted in relatively modest financial consequences. The agency’s first five CCPA settlements collectively generated less than $4 million in fines, while individual data broker registration violations brought penalties in the five-figure range. The Attorney General’s office has issued more CCPA fines, but none have exceeded $2.75 million.
These figures have raised questions about whether current penalty levels provide sufficient deterrence, particularly for large technology companies and data brokers where violation-related revenues could dwarf enforcement costs.
Michael Macko, the CPPA’s Deputy Director of Enforcement, addressed this concern directly at the Summit: “I do think that fines under the CCPA could become a cost of doing business if they’re not higher. And I think as we mature and grow as an agency, one aspect of that is ensuring that fines are appropriate. I do think that’s an area to watch for us.”
This acknowledgment suggests the CPPA is actively considering how to calibrate penalties to achieve genuine deterrence rather than simply imposing nominal costs on violators.
What Higher Fines Could Look Like
While Macko didn’t specify target penalty levels, examining California’s statutory framework provides context for potential increases. The CCPA authorizes civil penalties up to $2,500 per violation, or $7,500 per intentional violation. For violations affecting thousands or millions of consumers, these per-violation caps create theoretical exposure running into hundreds of millions of dollars.
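To make the per-violation arithmetic concrete, here is a back-of-the-envelope sketch. The consumer count is hypothetical, and treating each affected consumer as a separate violation is an assumption for illustration, not settled enforcement practice:

```python
# Back-of-the-envelope CCPA exposure math. Illustrative only: the consumer
# count is hypothetical, and treating each affected consumer as a separate
# violation is an assumption, not settled enforcement practice.
BASE_CAP = 2_500          # statutory cap per violation
INTENTIONAL_CAP = 7_500   # statutory cap per intentional violation

def max_exposure(consumers_affected: int, intentional: bool = False) -> int:
    cap = INTENTIONAL_CAP if intentional else BASE_CAP
    return consumers_affected * cap

print(f"${max_exposure(100_000):,}")                     # $250,000,000
print(f"${max_exposure(100_000, intentional=True):,}")   # $750,000,000
```

Even at these hypothetical scales, theoretical exposure dwarfs the sub-$4 million settlements seen to date.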
Current settlements have stayed well below these theoretical maximums, suggesting significant room for the CPPA to escalate penalties while remaining within statutory boundaries. The agency’s maturation process Macko referenced likely involves developing methodologies for calculating penalties that better reflect violation scope, duration, revenue impact, and corporate size.
Organizations should anticipate that settlements reached in 2024-2025 may not provide reliable benchmarks for penalties imposed in 2026 and beyond. The CPPA’s early enforcement actions established its capability to identify and prosecute violations; subsequent actions may focus on establishing credible deterrence through financial consequences.
The Attorney General’s Posture
Stacey Schesser, Supervising Deputy Attorney General and head of California’s Privacy Section, did not indicate whether the Department of Justice is similarly considering increased fines. However, the AG’s office has historically taken aggressive positions on consumer protection enforcement, suggesting parallel escalation is plausible.
The division of enforcement authority between the CPPA and Attorney General creates potential for divergent approaches, and higher penalties imposed by either agency could pressure the other toward similar positions. Companies facing investigations from either authority should not assume that settlement precedents from one necessarily predict outcomes with the other.
Beyond Opt-Outs: The Data Minimization Turn
Perhaps more significant than fine adjustments is the CPPA’s signaled shift in enforcement priorities. Macko acknowledged that the agency devoted substantial attention to user opt-out violations throughout 2025—addressing failures to honor consumer choices about data sales and sharing. While this work will continue, the agency is preparing to emphasize different CCPA provisions.
“A priority for us going forward is making sure when we look at opt outs and other aspects of California law, are we also looking at data minimization? Are we asking the right questions about purpose limitation?” Macko explained.
This represents a fundamental evolution from enforcing procedural rights (like opt-out mechanisms) to examining substantive data practices (whether data collection is necessary and appropriately limited).
Data Minimization and Purpose Limitation Explained
Data minimization requires that organizations collect only personal information that is adequate, relevant, and limited to what’s necessary for the disclosed purposes. It challenges the “collect everything we might someday use” mentality that characterized early digital business models.
Purpose limitation requires that personal information collected for one purpose not be used for incompatible purposes without new consent or legal basis. It prevents mission creep where data collected for one legitimate reason gradually gets repurposed for increasingly tangential uses.
Macko positioned these principles as fundamental to California privacy law, drawing parallels to the EU’s General Data Protection Regulation (GDPR) and principle-based U.S. securities regulation from the 1930s.
What This Means for Enforcement Targets
The shift toward data minimization and purpose limitation scrutiny has profound implications for how investigations will be conducted and violations identified.
Opt-out enforcement is relatively straightforward: either a company provides required mechanisms and honors requests, or it doesn’t. Evidence is often clear-cut, and remediation is typically procedural.
Data minimization enforcement is inherently more complex and fact-intensive. It requires examining:
- Why specific data elements are collected
- Whether those elements are genuinely necessary for stated purposes
- Whether less invasive alternatives could achieve the same purposes
- How long data is retained and why
- Whether collection scope has expanded beyond original justifications
This type of enforcement demands deeper investigation into business operations, product design decisions, and data lifecycle management. It also involves more subjective judgments about what constitutes “necessary” data collection—creating both uncertainty for businesses and broader discretion for regulators.
Purpose limitation enforcement similarly requires detailed examination of:
- Original purposes for which data was collected
- Current uses of that data
- Whether current uses are compatible with original purposes
- Whether new consent was obtained for incompatible uses
- Data sharing arrangements and recipients’ uses
Organizations should expect that CPPA investigations increasingly will involve detailed questioning about data flows, internal data use approvals, purpose documentation, and compatibility assessments. Companies that have treated purpose specifications in privacy notices as boilerplate language rather than operational constraints may face challenges demonstrating compliance.
Industry-Specific Vulnerabilities
Certain business models and practices appear particularly vulnerable to data minimization and purpose limitation scrutiny:
Advertising technology and data brokers: These industries are built on collecting maximum data for flexible future uses—exactly what minimization and purpose limitation principles constrain. Companies that collect consumer data for one purpose (e.g., website functionality) then broker it to third parties for unrelated purposes (e.g., employment screening) face obvious purpose limitation questions.
Loyalty and rewards programs: These programs often collect extensive behavioral and demographic data ostensibly for program administration, then use that data for detailed consumer profiling, predictive analytics, and third-party marketing. Whether such secondary uses are compatible with the “rewards program” purpose will likely face scrutiny.
Connected devices and IoT: Smart home devices, wearables, and connected vehicles often collect far more data than necessary for their primary functions. A smart thermostat that collects occupancy patterns might claim this optimizes heating efficiency, but if that data is also used for consumer profiling or sold to insurers, purpose limitation issues arise.
Mobile apps with extensive permissions: Applications that request access to contacts, location, camera, microphone, and other device features “just in case” rather than for specific, necessary functions exemplify data minimization concerns.
SaaS platforms with broad data retention: Cloud software that retains all customer data indefinitely “for potential future features” rather than deleting data after purposes are fulfilled faces both minimization and retention questions.
Age Assurance Rulemaking on the Horizon
While the CPPA focuses on CCPA enforcement evolution, the Attorney General’s office is preparing to tackle one of digital privacy’s most contentious frontiers: age verification and parental consent for minors’ social media use.
Schesser indicated that official rulemaking on age assurance requirements under California’s Protecting Our Kids from Social Media Addiction Act could begin “pretty soon” following the AG’s consideration of stakeholder comments from a November 2025 public meeting.
The Social Media Addiction Act Framework
This legislation prohibits platforms from providing “addictive feeds” to users under 18. Covered platforms can avoid violations through two mechanisms:
- Demonstrating lack of knowledge: Proving they had no actual knowledge about users’ ages
- Obtaining verifiable parental consent: Securing authenticated parental permission for minors’ use
The law’s age verification provisions take effect January 1, 2027, requiring the Attorney General to develop implementation rules governing how platforms verify users’ ages and authenticate parental consent.
The Age Verification Dilemma
Age verification for online services presents a fundamental tension between child protection and privacy. Effective age verification typically requires collecting additional personal information (government IDs, biometric data, financial account information, etc.) that creates new privacy risks and potential security vulnerabilities.
Methods platforms might use include:
Document verification: Users submit driver’s licenses, passports, or other government IDs. This provides high confidence in age verification but requires users to share sensitive documents with platforms (and potentially third-party verification services), creating identity theft risks.
Credit card verification: Because credit cards are generally issued only to adults, payment-method verification can serve as an age proxy. However, this excludes adults without credit cards and incentivizes platforms to require paid accounts.
Biometric analysis: Facial recognition or other biometric techniques can estimate age from uploaded selfies. This avoids sharing documents but raises accuracy concerns (particularly across different ethnicities) and creates biometric data privacy issues.
Third-party verification services: Specialized services claim to verify ages without platforms seeing underlying personal information. However, this shifts rather than eliminates privacy risks and creates questions about verification service oversight.
Social vouching: Existing verified adult users could vouch for new users’ ages. This approach minimizes data collection but is vulnerable to fraud and creates incentives for deception.
Each method involves tradeoffs between effectiveness, privacy protection, user friction, cost, and potential for circumvention.
What California’s Rules Might Require
The Attorney General’s rulemaking could take several approaches:
Prescriptive requirements: Specifying particular verification methods platforms must use, potentially with different options for different assurance levels.
Performance standards: Establishing accuracy and privacy protection requirements without mandating specific technologies, allowing platforms flexibility in meeting standards.
Tiered approaches: Different verification requirements based on platform type, user age, or feature access levels.
Privacy guardrails: Requirements limiting verification data collection, restricting retention and use, and mandating security measures regardless of verification method chosen.
Schesser noted that lessons from California’s stakeholder engagement and rulemaking could inform similar efforts in other states, including Connecticut, where legislators are considering comparable social media restrictions. This raises the prospect of California’s age assurance rules becoming a template for multi-state approaches, amplifying their national significance.
Beyond Facial Compliance: Examining Real-World Consumer Harms
Perhaps the most philosophically significant shift in California’s enforcement approach is the move by the Attorney General’s office away from “facial compliance” toward “ways everyday consumers are being impacted by certain data practices,” according to Schesser.
“We’re trying to think more critically about potential harms, and with that, we are going deeper in both the technology and how the technology is used,” she explained. “That may shift a lot the work that we’re doing in terms of what resolutions are going to look like.”
What “Beyond Facial Compliance” Means
Traditional privacy enforcement often focuses on whether companies have required policies, post required notices, maintain required processes, and honor required requests. A company with comprehensive privacy policies, clear notices, and responsive opt-out mechanisms could achieve “facial compliance” while nonetheless engaging in practices that harm consumers.
The AG’s harm-focused approach asks different questions:
- Are companies collecting data in ways that exploit psychological vulnerabilities?
- Do data practices disproportionately impact vulnerable populations?
- Are algorithms producing systematically unfair or discriminatory outcomes?
- Does data collection or use create safety risks beyond privacy concerns?
- Are business models fundamentally predicated on practices that harm consumer welfare?
This aligns California’s enforcement philosophy with how privacy regulators in the EU and UK increasingly operate: examining not just regulatory box-checking but actual impacts on individuals and society.
Case Studies in Harm-Focused Enforcement
Schesser highlighted several ongoing Attorney General investigations that exemplify this approach:
Nonconsensual explicit deepfakes investigation: The AG’s probe into allegedly nonconsensual sexually explicit deepfakes produced by social platform X’s AI chatbot Grok goes beyond whether X’s privacy policy adequately disclosed Grok’s capabilities. It examines whether the technology creates harms (nonconsensual intimate images) that privacy frameworks should prevent.
Surveillance pricing sweep: The January 2026 investigation into surveillance pricing examines the purpose-limitation implications of data collected ostensibly for service delivery but used for individualized price discrimination. The harm focus is on whether consumers are disadvantaged by opaque personalized pricing based on data they didn’t knowingly provide for that purpose.
Both investigations suggest the AG’s office is willing to pursue novel theories connecting privacy violations to consumer harms that extend beyond traditional privacy injury frameworks.
Implications for Settlement Structure
Schesser’s comment that this approach “may shift a lot the work that we’re doing in terms of what resolutions are going to look like” suggests that future settlements might include:
Algorithmic audits and modifications: Requirements to assess and remediate algorithmic systems that produce harmful outcomes, not just post notices about their existence.
Product design changes: Mandates to redesign features or interfaces that exploit consumer psychology or create disproportionate risks, rather than simply disclosing those features.
Business model modifications: In extreme cases, requirements to fundamentally alter revenue models or data practices that regulators determine create unavoidable harms.
Enhanced monitoring and reporting: Ongoing obligations to track consumer harm metrics and report to regulators, creating continuing oversight beyond one-time penalties.
Community benefit provisions: Requirements to fund consumer education, privacy research, or other public benefit initiatives addressing harms the violations created.
This represents a more interventionist regulatory posture than early CCPA enforcement, which largely resulted in financial penalties and procedural remediation.
Multi-State Coordination and Emerging Patterns
California’s enforcement evolution is occurring alongside privacy enforcement maturation in other states, creating opportunities for coordination and potential for inconsistent requirements.
Connecticut’s AI and Chatbot Focus
Connecticut Deputy Associate Attorney General Michele Lucan revealed that her office is actively investigating a major AI chatbot’s compliance with the Connecticut Data Privacy Act (CTDPA), noting: “I don’t think this should be surprising. I know we’ve all seen the reports of really serious harms related to chatbots, and especially with kids.”
This investigation parallels California’s Grok probe, suggesting multi-state attention to AI chatbot risks. Connecticut’s 2026 legislative session is exploring chatbot-specific regulations, including child interaction protections and chat history retention requirements.
Connecticut is also prioritizing enforcement of new CTDPA provisions around AI training data transparency and consumer rights to contest profiling decisions. Lucan indicated privacy notice reviews will specifically examine compliance with AI transparency provisions.
Delaware and Indiana: Early Days of Comprehensive Enforcement
Delaware and Indiana’s comprehensive privacy laws became enforceable January 1, 2026, following multi-year transition periods. Both states’ attorneys general are developing their enforcement approaches, drawing lessons from states with more mature programs.
Indiana’s transparency emphasis: Assistant Section Chief Jennifer Van Dame indicated transparency will be a significant theme, with particular focus on whether privacy notices are comprehensible to ordinary consumers, not just lawyers. “Give your privacy policy to your mom. If she can’t understand it then you might have a problem,” she advised.
Indiana’s 30-day cure provision will create distinctions between violations remediated during cure periods and those proceeding to settlements with fines.
Delaware’s operational focus: Deputy Attorney General John Eakins said Delaware will move “beyond just what a privacy notice says and actually looking at how the business is operationalizing data flows.” This includes examining organizational structure, board-level oversight of privacy practices, and senior executive accountability.
Delaware’s 60-day cure provision sunsets December 31, 2026, after which settlements with financial penalties become possible—suggesting 2027 could bring Delaware’s first fined enforcement actions.
Coordination Mechanisms
While state enforcers operate independently, several coordination mechanisms are emerging:
Information sharing: Attorneys general regularly share enforcement intelligence, investigative findings, and case theories, allowing states to learn from each other’s experiences.
Multi-state investigations: Some privacy investigations involve coordination between multiple states, particularly when violations affect consumers across state lines.
Settlement consistency: While not formally binding, settlement terms in one state often influence others’ expectations, creating de facto standardization pressure.
Model approaches: California’s large market and regulatory sophistication often make its approaches templates for other states, though this isn’t automatic.
Companies operating nationally should anticipate that innovative enforcement approaches in California or other leading states may spread to additional jurisdictions relatively quickly.
What Organizations Should Do Now
These enforcement developments create clear action items for organizations subject to California privacy law and similar state regimes:
Immediate Priority: Data Minimization and Purpose Audits
With the CPPA signaling increased focus on data minimization and purpose limitation:
- Inventory data collection: Document all personal information categories collected, why each is collected, and whether collection is genuinely necessary for stated purposes (a minimal sketch of such an inventory follows this list).
- Assess necessity: For each data element, rigorously evaluate whether it’s truly necessary or merely convenient, potentially valuable, or collected “just in case.”
- Eliminate unnecessary collection: Where data isn’t necessary for clearly identified purposes, stop collecting it. This not only reduces compliance risk but also decreases breach exposure and storage costs.
- Review purpose specifications: Examine privacy notice purpose descriptions and compare against actual data uses. Identify and remediate misalignments.
- Document purpose limitation controls: Establish systems ensuring data collected for one purpose isn’t used for incompatible purposes without new legal basis.
- Evaluate retention practices: Confirm data is deleted or anonymized when purposes are fulfilled rather than retained indefinitely.
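To illustrate what an operationalized version of this audit might look like, here is a minimal sketch of a machine-readable data inventory with a purpose-limitation check. Every element name, purpose label, and retention figure is a hypothetical placeholder, not a recommended schema:

```python
# Minimal sketch of a machine-readable data inventory with a purpose-limitation
# check. All element names, purpose labels, and retention figures are
# hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class DataElement:
    name: str                     # e.g., "email_address"
    disclosed_purposes: set[str]  # purposes stated in the privacy notice
    actual_uses: set[str]         # purposes observed in practice
    retention_days: int           # how long the element is kept
    necessary: bool               # outcome of the necessity assessment

    def purpose_gaps(self) -> set[str]:
        """Actual uses not covered by any disclosed purpose."""
        return self.actual_uses - self.disclosed_purposes

inventory = [
    DataElement("email_address", {"account_management"},
                {"account_management", "third_party_marketing"}, 365, True),
    DataElement("precise_location", {"store_locator"},
                {"store_locator", "ad_profiling"}, 1825, False),
]

for element in inventory:
    if element.purpose_gaps() or not element.necessary:
        print(f"REVIEW {element.name}: gaps={element.purpose_gaps()}, "
              f"necessary={element.necessary}")
```

The point of a structure like this is that purpose-limitation gaps become queryable facts rather than buried assumptions: any use not traceable to a disclosed purpose is flagged for review.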
Prepare for Increased Penalties
Recognizing that fines are likely to increase:
- Assess exposure: Model potential penalty exposure under higher fine scenarios. What would violations potentially cost if penalties increased 5x? 10x? (A toy model follows this list.)
- Cost-benefit recalibration: Re-evaluate compliance investment decisions with updated penalty assumptions. Compliance measures that seemed expensive relative to historical fines may be cost-effective against higher penalties.
- Prioritize high-risk areas: Focus compliance resources on practices most likely to attract enforcement attention: opt-out mechanisms, data minimization, purpose limitation, sensitive data handling.
- Strengthen governance: Ensure privacy compliance has adequate executive attention, budget, and organizational authority to prevent violations before they occur.
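As a rough illustration of this recalibration, the toy model below compares an assumed remediation cost against penalty exposure under multiplier scenarios. Every dollar figure is an invented assumption, not a prediction of actual penalty levels:

```python
# Toy cost-benefit model under higher-penalty scenarios. Every dollar figure
# below is an invented assumption, not a prediction of actual penalties.
historical_settlement = 1_500_000   # assumed penalty at 2024-2025 levels
remediation_cost = 4_000_000        # assumed cost of fixing the practice now

for multiplier in (1, 5, 10):
    exposure = historical_settlement * multiplier
    decision = "remediate now" if remediation_cost < exposure else "risk looked 'cheaper'"
    print(f"{multiplier:>2}x penalties: exposure ${exposure:>12,} -> {decision}")
# At 1x, remediation looks expensive relative to the fine; at 5x and 10x,
# the calculus flips toward fixing the practice.
```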
If You’re a Social Media Platform
With age assurance rulemaking approaching:
- Monitor rulemaking: Track the Attorney General’s age assurance rule development closely. Submit comments if stakeholder input opportunities arise.
- Evaluate verification options: Assess available age verification technologies and approaches against likely regulatory requirements, cost, user experience impact, and privacy protection.
- Prepare technical infrastructure: Develop capability to implement age verification regardless of which approach regulators ultimately require.
- Consider voluntary measures: Platforms that proactively implement reasonable age assurance may gain regulatory goodwill and shape rulemaking through demonstrated approaches.
- Address addictive design: Beyond age verification, evaluate whether platform features could be characterized as “addictive feeds” and consider design modifications.
For Organizations Using AI and Automated Decision-Making
Given multi-state attention to AI and chatbots:
- Transparency review: Ensure privacy notices adequately disclose AI use, particularly for consequential decisions or content generation.
- Chatbot guardrails: If operating chatbots, implement content filters and safeguards preventing generation of harmful content, with particular attention to minor interactions (see the sketch after this list).
- Training data documentation: Prepare to explain what data is used to train AI systems and whether appropriate rights exist for that use.
- Profiling contestability: For systems that profile consumers, establish mechanisms for individuals to contest profiling decisions.
- Harm assessment: Evaluate potential harms AI systems could create beyond privacy violations per se—discrimination, manipulation, safety risks, etc.
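As a sketch of the guardrail layering described above, the snippet below gates chatbot prompts by topic, with a stricter policy for minors. The topic lists and keyword “classifier” are crude placeholders; a production system would rely on vetted safety models rather than keyword matching:

```python
# Minimal sketch of pre-generation chatbot guardrails with a stricter policy
# for minors. The topic lists and keyword "classifier" are crude placeholders;
# production systems would use vetted safety models, not keyword matching.
BLOCKED_TOPICS = {"self_harm", "sexual_content", "weapons"}       # illustrative
MINOR_BLOCKED_TOPICS = BLOCKED_TOPICS | {"gambling", "dating"}    # illustrative

def classify_topics(prompt: str) -> set[str]:
    """Placeholder classifier mapping keywords to policy topics."""
    keywords = {"bet": "gambling", "date me": "dating"}
    return {topic for kw, topic in keywords.items() if kw in prompt.lower()}

def may_generate(prompt: str, user_is_minor: bool) -> bool:
    """Return True if the prompt may proceed to the generation step."""
    blocked = MINOR_BLOCKED_TOPICS if user_is_minor else BLOCKED_TOPICS
    return not (classify_topics(prompt) & blocked)

assert may_generate("what's the weather", user_is_minor=True)
assert not may_generate("can I bet on the game?", user_is_minor=True)
assert may_generate("can I bet on the game?", user_is_minor=False)
```

The design choice worth noting is the split between a general policy and a minor-specific policy, which mirrors the regulatory expectation that platforms apply heightened safeguards when they know a user is under 18.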
General Governance Strengthening
Regardless of specific business model:
- Enhance documentation: Maintain thorough records of data practices, purpose assessments, minimization analyses, and compliance decisions. These will be valuable if investigations occur.
- Board-level engagement: Ensure board of directors receives regular privacy compliance updates and approves privacy strategies. Delaware’s emphasis on executive accountability suggests regulators will examine governance structures.
- Privacy by design: Integrate privacy considerations into product development, business strategy, and operational decisions from inception rather than retrofitting compliance.
- Cross-functional collaboration: Privacy compliance requires coordination between legal, product, engineering, marketing, and other functions. Break down silos that prevent holistic privacy management.
- Continuous monitoring: Implement systems for ongoing compliance monitoring rather than periodic audits. Enforcement focused on real-world harms requires understanding actual practices, not just policies.
- Consumer-centric mindset: Adopt Schesser’s framing: consider whether data practices harm everyday consumers, not just whether they technically comply with regulations.
The Broader Trajectory: Where Privacy Enforcement Is Heading
California’s enforcement evolution reflects broader trends in privacy regulation globally:
From procedural to substantive: Regulators increasingly examine not just whether required procedures exist but whether data practices are fundamentally fair, necessary, and beneficial rather than harmful.
From notice-and-choice to protection: While transparency and consent remain important, regulators recognize their limitations and are willing to prohibit or constrain practices regardless of notice quality.
From privacy-specific to consumer protection integration: Privacy violations are increasingly viewed through consumer protection lenses, asking whether they constitute unfair, deceptive, or harmful business practices.
From reactive to proactive: Rather than waiting for consumer complaints to trigger investigations, enforcers are proactively identifying problematic practices and business models.
From fines to remediation: While financial penalties matter, regulators increasingly emphasize operational changes that prevent ongoing harms rather than simply imposing retrospective costs.
From U.S. exceptionalism to global alignment: American privacy enforcement is gradually converging with approaches long established in Europe, even as legal frameworks differ.
Organizations that understand these trajectories can anticipate regulatory evolution rather than merely reacting to announced priorities. The enforcement priorities California regulators discussed at the IAPP Summit aren’t isolated developments but manifestations of fundamental shifts in how privacy regulation operates.

The Maturation of California Privacy Enforcement
California’s privacy enforcement apparatus is entering a new phase of maturity characterized by:
- Financial consequences calibrated for deterrence rather than symbolism
- Substantive examination of data necessity and purpose alignment rather than procedural compliance
- Focus on consumer harms rather than technical violations
- Willingness to challenge business models and product designs, not just require disclosures
- Coordination with other states creating multi-jurisdictional enforcement patterns
For organizations, this maturation means the cost and risk calculus around privacy compliance is changing. Practices that seemed acceptable when enforcement was nascent and penalties were modest may no longer be defensible as regulators become more sophisticated and assertive.
The organizations best positioned for this environment are those that view privacy not as a compliance checkbox but as a dimension of product quality, consumer trust, and responsible business practice. When regulators ask whether data practices harm everyday consumers, companies genuinely designed around consumer welfare have clear answers.
California’s privacy enforcement evolution is ultimately about translating statutory rights and principles into practical protection for individuals in an increasingly data-intensive digital economy. Organizations that align their practices with that objective will find compliance manageable; those that treat privacy as an obstacle to overcome will find the regulatory environment increasingly inhospitable.
The signals from the IAPP Summit are clear: 2026 marks a turning point where California privacy enforcement transitions from establishment to operationalization, from experimentation to standardization, and from tolerance to expectations of genuine compliance. Organizations would be wise to prepare accordingly.