Understanding California’s New ADMT Requirements


The California Privacy Protection Agency’s approval of updated CCPA regulations at the end of 2025 introduced a fundamental shift in how businesses must approach automated decision-making. For the first time, California has established comprehensive requirements governing the use of automated decision-making technologies (ADMT) when those systems make or substantially influence significant decisions affecting consumers’ lives.

These requirements represent more than incremental privacy compliance updates—they signal a regulatory philosophy that automated systems processing personal data for consequential decisions require heightened transparency, oversight, and consumer control. Organizations using or considering ADMT for hiring, lending, healthcare access, housing, or educational decisions now face substantive operational changes to remain compliant.

What Qualifies as ADMT Under the New Rules

The regulations define automated decision-making technology as any computational system that processes personal information and replaces or substantially replaces human judgment in making decisions. The critical distinction is whether the technology’s output drives the final decision without meaningful human intervention.

This definition encompasses a broad spectrum of technologies: traditional algorithms, machine learning models, artificial intelligence systems, and even simpler automated processing systems that meet the “substantial replacement” threshold. The regulatory focus is not on the sophistication of the technology but rather on whether it functionally removes human decision-making from consequential choices.

The “Substantial Replacement” Standard

A technology “substantially replaces” human decision-making when its output is used to make decisions without human involvement. This doesn’t require complete automation—it captures scenarios where human review is nominal or where humans lack meaningful authority or information to overturn automated outputs.

Consider these scenarios:

Scenario A: An AI system screens loan applications and assigns risk scores. A human reviews each score and makes the final approval decision with full authority to override the system and access to all relevant information. Result: Likely NOT substantial replacement

Scenario B: The same AI system generates decisions that humans approve by clicking “confirm” without reviewing underlying data or having practical ability to override. Result: Likely IS substantial replacement

The distinction matters significantly for compliance obligations.
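The two scenarios can be sketched as a simple review gate. This is an illustrative Python sketch, not language from the regulations: the class and field names (LoanApplication, HumanReview, saw_full_record, can_override) are hypothetical, and the point is only that a workflow where the reviewer lacks information or authority collapses into Scenario B.

```python
from dataclasses import dataclass

@dataclass
class LoanApplication:
    applicant_id: str
    income: float
    debt: float

def risk_score(app: LoanApplication) -> float:
    # Hypothetical automated output: a simple debt-to-income ratio.
    return app.debt / app.income if app.income else 1.0

@dataclass
class HumanReview:
    reviewer_id: str
    saw_full_record: bool   # reviewer had access to all relevant information
    can_override: bool      # reviewer has real authority to overturn the score
    decision: str           # "approve" or "deny", made by the human

def final_decision(app: LoanApplication, review: HumanReview) -> str:
    score = risk_score(app)  # advisory input only; it does not decide
    # Scenario A: the score informs, but the human decides.
    # Without full information and real authority, review is nominal
    # and the ADMT output effectively drives the outcome (Scenario B).
    if not (review.saw_full_record and review.can_override):
        raise ValueError("Review is nominal; ADMT likely substantially "
                         "replaces human decision-making here")
    return review.decision
```

The gate is deliberately strict: a "confirm" click that fails either check is treated as no review at all, which mirrors how the standard looks past formal sign-off to practical authority.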

Which Decisions Trigger ADMT Requirements

Not all automated decisions trigger these heightened requirements. The regulations specifically target “significant decisions”—those that materially affect consumers’ access to fundamental opportunities and services.

Significant decisions include determinations related to:

  • Financial and lending services: Credit approvals, loan terms, account access, payment plans
  • Housing: Rental applications, lease terms, housing assistance eligibility
  • Education: Admissions decisions, scholarship awards, program placement
  • Employment: Hiring decisions, contractor selection, compensation determinations, promotion decisions
  • Healthcare services: Treatment approvals, coverage determinations, care access (for non-HIPAA covered entities)

Notably, the regulations explicitly exclude advertising from the definition of significant decisions, though targeted advertising remains subject to other CCPA requirements.

Industry-Specific Scenarios

Healthcare Technology Companies: A digital health platform that uses algorithms to approve or deny users’ requests for mental health counseling sessions would be making significant decisions about healthcare access. If the system automatically denies requests based on risk profiles without clinical review, it triggers ADMT obligations.

Educational Institutions: Universities using AI to automatically score and rank scholarship applications, then awarding scholarships based solely on those rankings without human evaluation, are making significant decisions through ADMT.

Financial Services: Buy-now-pay-later services that use automated systems to instantly approve or deny consumer credit applications at checkout are making significant lending decisions through ADMT.

Employment Platforms: Gig economy platforms that use algorithms to automatically assign work opportunities or set pay rates based on worker profiles and behavior patterns are making significant employment decisions through ADMT.

Property Management: Tenant screening services that automatically reject rental applications based on algorithmic analysis of credit, criminal, or eviction records are making significant housing decisions through ADMT.

Explicit Exceptions

The regulations carve out specific technologies that don’t constitute ADMT even if they involve automation: web hosting, domain registration, content delivery networks, data storage, security tools (firewalls, antivirus, anti-malware), spam filtering, spellchecking, calculators, and basic database or spreadsheet functions—provided these tools don’t replace human decision-making in significant matters.

The Five Core Compliance Obligations

Organizations using ADMT for significant decisions face five interconnected compliance requirements, each with distinct implementation challenges.

1. Pre-Use Notice Requirements

Before collecting personal information that will be processed through ADMT for significant decisions, businesses must provide conspicuous notice that includes:

Purpose specification: The specific reason ADMT will be used, described with sufficient detail that consumers understand the decision being automated.

Opt-out rights: Clear explanation of consumers’ right to refuse ADMT processing and practical instructions for exercising this right.

Access rights: Notification that consumers can request information about how ADMT affected decisions about them.

System explanation: A comprehensible description of how the ADMT functions, including:

  • What categories of personal information influence its outputs
  • What types of outputs it generates
  • How those outputs are used in decision-making

Alternative processes: Explanation of how decisions will be made if consumers opt out of ADMT.

Anti-retaliation assurance: Statement that exercising privacy rights will not result in adverse treatment.

Implementation Considerations

This notice can be integrated into general privacy policies, but it must be provided at or before the point of data collection whenever ADMT may be involved. This creates timing challenges for businesses that don’t determine whether to use ADMT until after data collection.

For employment contexts, this means job applicants must receive ADMT notices before submitting applications if automated screening might be used. For lending, consumers must receive notices before providing information for credit decisions. For healthcare, patients must be informed before intake if automated systems might affect their care access.

The “comprehensible description” requirement poses particular challenges for complex AI systems. Businesses must balance technical accuracy with accessibility, explaining sophisticated algorithms in plain language without oversimplifying to the point of misrepresentation.

2. Opt-Out Rights and Exemptions

Consumers must have the ability to refuse ADMT processing for significant decisions affecting them, with two important nuances:

Dual method requirement: Businesses must provide at least two methods for opting out, one of which must match the primary communication channel the business uses with that consumer. If you primarily communicate via email, email must be an opt-out method; if you primarily communicate through a mobile app, the app must offer opt-out functionality.

Exemption through appeals: Businesses can avoid providing opt-out rights by implementing an appeals process that includes human review with genuine authority to overturn automated decisions. This is not merely rubber-stamping automation outputs—reviewers must have access to complete information, decision-making authority, and organizational support to disagree with system outputs when warranted.

Specialized Employment and Education Exemptions

Additional narrow exemptions exist for ADMT used to assess job performance, allocate work, or evaluate academic performance, but these require businesses to demonstrate:

  • The ADMT functions as intended for its specific purpose
  • The system does not result in unlawful discrimination
  • Regular validation of system accuracy and fairness

These exemptions don’t eliminate all obligations—transparency and access rights still apply.

Strategic Decision Points

Organizations must choose between three approaches:

  1. Provide broad opt-out rights: Accept that some percentage of consumers will opt out and maintain parallel manual processes
  2. Implement qualifying appeals processes: Build robust human review systems that satisfy exemption requirements
  3. Eliminate ADMT for significant decisions: Redesign processes to keep meaningful human judgment in the loop

Each approach has distinct cost, operational, and customer experience implications.

3. Access Request Response Requirements

When consumers request information about ADMT use affecting them, businesses must provide plain-language explanations covering:

Purpose clarity: Why ADMT was used for their specific situation

System logic: How the ADMT works, including parameters and factors that influenced outputs related to them

Decision outcome: What decision was made and specifically how the ADMT output was used in reaching that decision

Future use: Whether the ADMT output will be used for future significant decisions about them

This goes beyond general system descriptions to personalized explanations of how ADMT affected individual consumers. A loan applicant who was denied credit should receive not just general information about the credit scoring algorithm, but specific insight into which factors influenced their particular denial.

Technical and Operational Challenges

Providing meaningful access responses requires:

  • Systems capable of logging and retrieving individual processing records
  • Translation capabilities that convert technical system operations into consumer-comprehensible explanations
  • Staff training to handle access requests that may involve sensitive decision outcomes
  • Response templates that balance standardization with personalization

Organizations should anticipate that access requests may come from consumers who disagree with decisions made about them, requiring careful communication to provide transparency without creating additional liability.
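The logging and translation steps above can be sketched in a few lines. This is a minimal illustration, not a compliance-grade implementation: the field names (inputs, key_factors) and the wording of the rendered explanation are assumptions, and a real system would persist records durably rather than in a dictionary.

```python
import datetime

def log_decision(store: dict, consumer_id: str, inputs: dict,
                 output: str, factors: list) -> None:
    """Record what the ADMT saw and produced for one consumer,
    so the record can be retrieved later for an access request."""
    store[consumer_id] = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "inputs": inputs,        # categories of personal information used
        "output": output,        # decision or score produced
        "key_factors": factors,  # factors that most influenced the output
    }

def access_response(store: dict, consumer_id: str) -> str:
    """Render a plain-language, personalized explanation from the log."""
    rec = store[consumer_id]
    factors = ", ".join(rec["key_factors"])
    return (f"Our automated system produced the outcome '{rec['output']}' "
            f"for you on {rec['timestamp'][:10]}. The factors that most "
            f"influenced this outcome were: {factors}.")
```

The design choice worth noting is that the explanation is generated from the logged record of that consumer's decision, not from a generic system description, which is what the personalization requirement demands.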

4. Privacy Risk Assessments Before ADMT Deployment

Before using ADMT to make significant decisions, businesses must conduct formal privacy risk assessments. This requirement became effective January 1, 2026—earlier than most other ADMT obligations.

Required Assessment Components

Privacy risk assessments for ADMT must document:

Processing fundamentals:

  • Specific purpose for processing personal information through ADMT
  • Categories of personal information processed, limited to what’s necessary
  • Collection, use, disclosure, retention methods and data sources
  • How consumers will interact with systems and purposes of those interactions
  • Approximate number of consumers affected
  • Notice provided to consumers and notification methods
  • Recipients of personal information

ADMT-specific analysis:

  • System logic, including underlying assumptions and known limitations
  • Outputs generated and how they’re used in decision-making
  • Benefits to consumers, stakeholders, and the business
  • Potential negative impacts to consumer privacy
  • Safeguards implemented to mitigate identified negative impacts
  • Determination of whether processing should proceed based on risk assessment
  • Individuals who contributed to the assessment (excluding legal counsel)
  • Approval dates and approvers (excluding legal counsel)
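One way to keep assessments from becoming check-box exercises is to encode the required elements as a template and flag gaps before approval. The field names below are a hypothetical shorthand for the elements listed above, not regulatory terminology.

```python
# Hypothetical shorthand keys for the required assessment elements.
REQUIRED_FIELDS = [
    "purpose", "pi_categories", "data_sources", "consumers_affected",
    "notice_method", "recipients", "system_logic", "outputs_and_use",
    "benefits", "negative_impacts", "safeguards", "proceed_decision",
    "contributors", "approvals",
]

def assessment_gaps(assessment: dict) -> list:
    """Return required elements that are missing or left blank
    in a draft privacy risk assessment."""
    return [field for field in REQUIRED_FIELDS
            if not str(assessment.get(field, "")).strip()]
```

A draft that returns a non-empty gap list is not ready for the approval step, which gives reviewers a concrete checkpoint before the proceed/don't-proceed determination.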

Assessment as Decision-Making Tool

Unlike check-box compliance exercises, these assessments should function as genuine decision-making tools. The requirement to evaluate whether processing should proceed based on assessment findings suggests organizations may need to modify or abandon ADMT implementations that pose unacceptable privacy risks.

This creates potential tension between business objectives and privacy protection, requiring senior leadership involvement in assessment outcomes.

Integration with Existing Processes

Most organizations already conduct some form of privacy impact assessment. The challenge is ensuring existing processes capture ADMT-specific requirements, which may involve:

  • Updating assessment templates with ADMT-specific questions
  • Training assessment teams on ADMT considerations
  • Establishing approval thresholds for ADMT deployments
  • Creating remediation processes when assessments identify concerning risks

Organizations should maintain assessment records systematically, as they’ll be required for upcoming reporting obligations.

5. Annual Reporting to the California Privacy Protection Agency

Beginning April 1, 2028, businesses must submit annual reports to the CPPA detailing their privacy risk assessment activities. While specific reporting format and content requirements haven’t been finalized, organizations should expect to report:

  • Number of privacy risk assessments conducted
  • Categories of processing assessed
  • Assessment outcomes and decisions made based on assessments
  • Changes to ADMT systems based on assessment findings

This reporting requirement transforms privacy risk assessments from internal compliance activities to regulatory disclosures subject to agency review.
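Since final reporting formats are not yet specified, any aggregation logic is speculative, but the expected elements above suggest a simple roll-up of assessment records. The record keys (category, proceed_decision) are assumptions carried over from a hypothetical assessment template.

```python
from collections import Counter

def annual_report(assessments: list) -> dict:
    """Summarize a year of assessment records into the kinds of
    counts an annual regulator submission is expected to cover."""
    return {
        "assessments_conducted": len(assessments),
        "by_category": dict(Counter(a["category"] for a in assessments)),
        "outcomes": dict(Counter(a["proceed_decision"] for a in assessments)),
    }
```

Keeping assessments in a structured, queryable form from the start makes this roll-up trivial when the reporting deadline arrives.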

Implementation Timeline

Understanding when each obligation takes effect is critical for compliance planning:

  • Privacy risk assessments: January 1, 2026 (currently in effect)
  • Pre-use notice: January 1, 2027 (upcoming)
  • Opt-out rights: January 1, 2027 (upcoming)
  • Access rights: January 1, 2027 (upcoming)
  • Annual reporting to CPPA: April 1, 2028 (future)

Organizations should be conducting privacy risk assessments now for any current ADMT use in significant decisions, while preparing operational systems for the 2027 effective date of consumer-facing rights.

Strategic Implementation Roadmap

Achieving compliance requires coordinated action across multiple organizational functions. A systematic approach might include:

Phase 1: Discovery and Assessment (Immediate)

Map existing ADMT use: Inventory where automated decision-making currently exists or is planned across the organization. Don’t rely on assumptions—engage with business units that handle hiring, lending, customer service, risk management, and other functions that make significant decisions.

Evaluate scope: Determine which automated systems meet the regulatory definition of ADMT for significant decisions. Not every automated process will qualify.

Conduct privacy risk assessments: For in-scope ADMT, complete required privacy risk assessments immediately (this obligation is already in effect).

Identify gaps: Compare current practices against regulatory requirements to identify specific compliance gaps in notices, opt-out mechanisms, access processes, and documentation.

Phase 2: Policy and Procedure Development (Q2-Q3 2026)

Revise privacy notices: Update consumer-facing privacy policies, employee handbooks, candidate notices, and other disclosures to incorporate ADMT-specific information.

Develop opt-out mechanisms: Design and implement systems for consumers to refuse ADMT processing, or establish qualifying appeals processes if pursuing that exemption.

Update access request processes: Modify data subject access request (DSAR) procedures, templates, and response capabilities to address ADMT-specific information requirements.

Create risk assessment templates: Develop standardized but flexible templates that capture all required ADMT assessment elements.

Establish governance frameworks: Define approval processes, decision criteria, and accountability for ADMT deployments and modifications.

Phase 3: Technical Implementation (Q3-Q4 2026)

Build logging capabilities: Ensure systems can capture and retrieve data necessary to respond to access requests about ADMT processing.

Develop explanation tools: Create mechanisms to translate technical ADMT operations into plain-language explanations for consumers.

Implement preference management: Build technical infrastructure to honor opt-out requests and maintain alternative decision-making processes.

Establish reporting systems: Develop data collection and reporting capabilities for annual CPPA submissions.
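The preference-management piece reduces to a routing decision: honor the opt-out by dispatching to the manual process. A minimal sketch, with all names hypothetical:

```python
def route_decision(consumer_id: str, opted_out: set,
                   automated_process, manual_process):
    """Dispatch to the manual decision process when the consumer has
    opted out of ADMT; otherwise use the automated path."""
    if consumer_id in opted_out:
        return manual_process(consumer_id)
    return automated_process(consumer_id)
```

In practice the opted_out set would be a persisted preference store checked at every decision point, and the manual path must exist and be staffed, not just be a code branch.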

Phase 4: Training and Operationalization (Q4 2026 – Q1 2027)

Train affected teams: Ensure personnel in HR, lending, customer service, product development, and other relevant functions understand ADMT requirements and their responsibilities.

Test processes: Conduct dry runs of notice delivery, opt-out handling, access request responses, and risk assessments before the January 2027 effective date.

Refine workflows: Identify friction points and streamline processes based on testing results.

Prepare for launch: Final preparations for January 1, 2027 effective date of consumer-facing obligations.

Phase 5: Ongoing Compliance (2027 and beyond)

Monitor and adjust: Track opt-out rates, access requests, and system performance; adjust processes as needed.

Maintain assessments: Conduct new risk assessments as ADMT systems change or new applications are deployed.

Prepare for reporting: Collect and organize data needed for April 2028 reporting deadline and subsequent annual reports.

Cross-Functional Collaboration Requirements

ADMT compliance cannot be siloed within privacy or legal teams. Effective implementation requires engagement from:

Privacy/Compliance: Policy development, risk assessment oversight, regulatory interpretation

Legal: Contract review with vendors, employment law considerations, regulatory risk evaluation

Human Resources: Hiring process modifications, employee communications, applicant notices

Product/Engineering: System design changes, logging capabilities, explanation tools

IT/Security: Infrastructure for preference management, data retention, access request fulfillment

Customer Service: Handling consumer inquiries about ADMT, processing opt-out requests

Business Leadership: Resource allocation, risk appetite decisions, strategic choices about ADMT use

Organizations that treat ADMT compliance as primarily a legal or privacy issue will likely struggle with implementation. This requires business process redesign, not just policy updates.

Vendor and Third-Party Considerations

Many organizations don’t build ADMT systems internally but instead procure them from technology vendors or work with service providers who operate such systems. This creates additional compliance complexity.

Contract Requirements

Agreements with vendors providing ADMT or service providers using ADMT on your behalf should address:

  • Whether the vendor/provider’s system constitutes ADMT for significant decisions
  • Which party is responsible for each compliance obligation (notices, opt-outs, access requests, assessments)
  • Technical capabilities needed to support compliance (logging, explanation generation, preference management)
  • Data access rights to support access request responses
  • Cooperation with privacy risk assessments
  • Audit rights to verify compliance
  • Liability allocation for regulatory violations

Due Diligence for New Vendors

When evaluating new vendors whose solutions might involve ADMT:

  • Request detailed information about how their systems make decisions
  • Assess whether they replace or substantially replace human judgment
  • Evaluate their capabilities to support your compliance obligations
  • Review their own privacy practices and assessments
  • Understand limitations of their explainability features

Existing Vendor Remediation

For current vendor relationships involving ADMT:

  • Conduct compliance review of existing contracts
  • Engage vendors to negotiate necessary amendments
  • Establish clear responsibility allocation
  • Implement technical integrations needed for compliance
  • Document vendor commitments and capabilities

Some vendors may be unprepared for these requirements, potentially requiring you to seek alternative solutions or bring capabilities in-house.

Practical Compliance Challenges

Beyond the formal requirements, organizations will encounter several practical challenges:

Explainability Limitations

Some AI and machine learning systems function as “black boxes” where even their developers cannot fully explain how specific inputs produce specific outputs. This creates tension with requirements to explain ADMT logic to consumers.

Organizations may need to:

  • Prioritize interpretable models over slightly more accurate but unexplainable ones
  • Invest in explainability tools and techniques
  • Provide the best available explanations while acknowledging system limitations
  • In some cases, reconsider whether opaque systems should be used for significant decisions
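For interpretable models, explanations can be faithful by construction. The sketch below assumes a linear scoring model (a deliberate simplification): each feature's contribution is weight times value, and ranking by absolute contribution yields the factors that most influenced a particular output.

```python
def linear_contributions(weights: dict, features: dict) -> list:
    """For a linear score, each feature contributes weight * value.
    Ranking by absolute contribution gives a simple, faithful account
    of which inputs drove this consumer's output."""
    contribs = {name: weights[name] * features[name] for name in weights}
    return sorted(contribs.items(), key=lambda kv: -abs(kv[1]))
```

For non-linear models this exactness is lost, which is precisely the trade-off the list above describes: a slightly less accurate but interpretable model may be easier to defend than a black box whose explanations are themselves approximations.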

Dynamic Systems

Machine learning models that continuously learn and adapt create assessment and explanation challenges. If a system’s logic changes between the time of a privacy risk assessment and a consumer’s access request, which version should be explained?

Organizations should:

  • Document system versioning and changes
  • Conduct periodic reassessments of evolving systems
  • Maintain records of system states at decision points
  • Consider whether continuously learning systems are appropriate for significant decisions
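Recording the system state at each decision point can be as simple as hashing the model parameters into a version fingerprint and storing it alongside the outcome. This is an illustrative sketch; the record fields are assumptions.

```python
import hashlib
import json

def model_fingerprint(params: dict) -> str:
    """Stable hash of model parameters, so each decision can be tied
    to the exact system state that produced it."""
    blob = json.dumps(params, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()[:12]

def record_decision(log: list, consumer_id: str, outcome: str,
                    params: dict) -> None:
    # Store the fingerprint with the outcome; when an access request
    # arrives later, the explanation can reference the version that
    # actually made the decision, not the model as it exists today.
    log.append({"consumer": consumer_id, "outcome": outcome,
                "model_version": model_fingerprint(params)})
```

Because the hash is computed over sorted keys, the fingerprint is stable across serialization order, and any retraining that changes parameters produces a new version automatically.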

Opt-Out Operational Impact

If significant numbers of consumers opt out of ADMT, organizations need alternative decision-making processes that may be more costly and slower than automated approaches. This creates pressure to:

  • Design appeals processes that qualify for exemptions
  • Improve ADMT transparency to reduce opt-out rates
  • Maintain hybrid workflows that accommodate both automated and manual processing
  • Price services accounting for potential efficiency losses

Competitive Concerns

Different interpretations of these requirements or varying enforcement priorities could create competitive imbalances. An organization that invests heavily in gold-standard compliance may face cost disadvantages against competitors taking more aggressive interpretive positions or betting on limited enforcement.

Industry associations and collaborative guidance development may help establish common compliance baselines.

The Broader Context: Why These Requirements Emerged

Understanding the policy motivations behind ADMT requirements provides insight into how they may be interpreted and enforced.

Documented Harms from Automated Decision-Making

These regulations respond to documented instances where automated systems produced discriminatory, erroneous, or unjust outcomes in high-stakes contexts:

  • Hiring algorithms that systematically disadvantaged certain demographic groups
  • Credit scoring models that denied loans based on factors unrelated to creditworthiness
  • Healthcare algorithms that allocated resources inequitably
  • Criminal justice risk assessments that perpetuated historical biases

California regulators view transparency, consumer control, and mandatory risk assessment as mechanisms to prevent such harms.

The Profiling Connection

The regulations define “profiling” as automated processing to evaluate personal aspects like intelligence, aptitude, economic situation, health, preferences, or behavior. While not explicitly prohibited, profiling that drives significant decisions triggers heightened scrutiny through risk assessments.

This reflects concerns that data-driven decision-making may penalize individuals for characteristics or predictions rather than demonstrated facts, potentially reinforcing existing inequalities.

Balancing Innovation and Protection

The regulations attempt to permit beneficial ADMT use while establishing guardrails against harmful applications. This balance is reflected in:

  • Allowing appeals processes as an alternative to opt-out rights (recognizing some ADMT may benefit consumers)
  • Excluding advertising from “significant decisions” (preserving business models while focusing on high-stakes choices)
  • Requiring assessments but not prohibiting risky systems (enabling risk-aware decisions rather than blanket restrictions)

How this balance evolves will depend partly on enforcement actions and refinements based on early implementation experience.

Enforcement and Penalties

The California Privacy Protection Agency has broad enforcement authority for ADMT violations. Potential consequences include:

  • Administrative fines up to $7,500 per intentional violation
  • Orders to cease processing activities
  • Required audits at business expense
  • Mandated corrective action plans

Additionally, some ADMT violations may give rise to private rights of action if they involve unauthorized access or disclosure of personal information.

Early enforcement priorities remain unclear, but organizations should expect the CPPA will make examples of clear violations, particularly those causing demonstrable consumer harm.

Looking Ahead: Potential Regulatory Evolution

These ADMT requirements won’t be static. Likely developments include:

Federal action: Multiple federal AI regulation proposals include provisions related to automated decision-making. Federal law could eventually preempt, supplement, or harmonize with California’s approach.

Other state adoption: States including Colorado, Connecticut, and Virginia have privacy laws with some ADMT-related provisions. California’s detailed requirements may influence other states’ regulatory development.

Guidance and interpretation: The CPPA will likely issue guidance addressing common compliance questions, interpretations of ambiguous terms, and industry-specific applications.

Amendment based on experience: As businesses implement these requirements and challenges emerge, California may refine the regulations to address unforeseen issues or close loopholes.

Organizations should monitor regulatory developments and remain engaged with industry groups tracking ADMT compliance evolution.

Conclusion: Moving from Compliance to Competitive Advantage

While these requirements create compliance obligations, they also present opportunities for organizations that approach ADMT thoughtfully.

Businesses that excel at ADMT transparency and governance can differentiate themselves in markets where consumers increasingly care about how automated systems affect them. Clear communication about automated decision-making, robust fairness safeguards, and genuine human oversight can become trust-building advantages rather than mere compliance costs.

The organizations most likely to succeed with these requirements are those that:

  • Begin with the assumption that transparency about ADMT is valuable, not just required
  • Involve diverse perspectives in assessing ADMT risks, not just technical and legal viewpoints
  • Design systems with explainability as a core feature, not an afterthought
  • View consumer opt-out and access rights as feedback mechanisms for improving systems, not obstacles to business operations
  • Commit to ongoing evaluation and improvement of ADMT rather than one-time compliance exercises

California’s ADMT requirements signal a fundamental shift toward treating automated consequential decision-making as requiring special accountability. Organizations that embrace this shift strategically, rather than simply reacting to compliance deadlines, position themselves advantageously for an increasingly privacy-conscious and regulation-forward future.

Need Expert Guidance on ADMT Compliance?

Navigating these complex requirements requires specialized privacy expertise combined with operational knowledge of how automated systems function in business contexts. If your organization is struggling to interpret how ADMT requirements apply to your specific systems, develop compliant processes, or prepare for upcoming deadlines, consulting with privacy professionals who understand both the regulatory landscape and practical implementation challenges can accelerate your compliance journey while minimizing business disruption.

Written by: Captain Compliance