California has a well-earned reputation as America’s de facto privacy regulator. When California moves, the country eventually follows. The CCPA reshaped how every U.S. business approaches consumer data. The Delete Act created the first centralized data broker deletion mechanism in the nation. The California Privacy Protection Agency is the only independent privacy enforcement body of its kind in the United States. So when California introduces a bill promoted as landmark student privacy legislation, the privacy community pays attention — not just because of what the bill says, but because of what it reveals about the gaps that have existed for a decade and the political forces that have made closing them so difficult.
Assembly Bill 1159, introduced by Assemblymember Dawn Addis of San Luis Obispo and sponsored by Privacy Rights Clearinghouse, is the most ambitious attempt yet to modernize California’s student privacy framework for the AI era. It has passed the Assembly Privacy and Consumer Protection Committee, cleared the Assembly Judiciary Committee, and is now heading to Assembly Appropriations. It has attracted powerful supporters — the California Labor Federation, the California Federation of Teachers, the California Faculty Association — and equally powerful opponents, including the California Chamber of Commerce, TechNet, the College Board, and the ACT Education Corporation. The political stakes are significant. The legal stakes are higher. And for privacy professionals trying to understand what AB 1159 actually does, what it does not do, and what the opposition’s arguments reveal about the industry’s real priorities, the details demand careful examination.
The Problem AB 1159 Is Trying to Solve: A Decade of Loopholes
To understand AB 1159, you need to understand the framework it is trying to fix — and why that framework, despite being passed twelve years ago, has produced exactly one enforcement action.
California’s Student Online Personal Information Protection Act, known as SOPIPA, was enacted in 2014 as the first law in the nation to directly regulate education technology companies. It prohibited ed-tech operators from selling student data, using student data for targeted advertising, building profiles for non-educational purposes, and disclosing student personal information without authorization. At the time it was passed, it was genuinely groundbreaking.
The problem is structural. SOPIPA — now referred to as KOPIPA in its current form — only applies to products that are primarily designed and marketed for K-12 school purposes. That “primarily designed and marketed” language was intentional when the law was written: legislators did not want to sweep in general-purpose consumer technology that happened to be used occasionally in schools. But what seemed like a reasonable carve-out in 2014 became a loophole in the years that followed: a company whose product served both schools and the general consumer market could argue it was not “primarily” a school product and escape the statute entirely.
The result is a compliance landscape that has produced, by the bill’s own legislative history, a single enforcement action in more than a decade. The November 2025 Illuminate Education case — involving a 2021 data breach that exposed sensitive information from over 434,000 California students — was the first time California had ever successfully gone after a company for violating KOPIPA. One enforcement action in more than a decade is not a compliance regime. It is a suggestion.
The reasons for that enforcement vacuum are not mysterious. Exclusive reliance on the Attorney General for enforcement means that the entire weight of student privacy protection rests on a single office with finite resources and an enormous caseload. When the cost of violation is essentially zero — no private right of action, no civil penalty unless the AG decides to prioritize your case — the rational calculation for a technology company handling student data is straightforward: proceed unless stopped. Most companies have not been stopped.
What the Bill Actually Does: Three Core Changes
AB 1159 makes three substantive changes to California’s student privacy architecture, each targeting a specific gap in the current framework.
First: Expanding the Operator Definition
The bill replaces the “primarily designed and marketed” standard with a broader definition covering any operator with actual knowledge that its site, service, or application is used for school purposes and was designed or marketed for those purposes. The shift from “primarily” to “actual knowledge” is designed to capture the significant category of technology companies that serve both student and non-student populations but know perfectly well that their product is being used in educational contexts.
This is the provision most directly aimed at the TeamSnap problem — the category of apps and platforms that serve extracurricular programs, sports teams, school clubs, and other educational-adjacent contexts while arguing they are not primarily school-focused products. As the legislative analysis notes, the new standard should cover apps or services that are marketed to schools, directed by schools, and used in school-sponsored programs, even if the same app is also used by adult recreational sports leagues.
But the “murky” characterization in the original reporting is accurate. The line between a school-sponsored sports team using TeamSnap and a club sports team using TeamSnap — where both teams may have students from the same school — is not clearly drawn in the bill’s language. Addis’ own communications director could only say that TeamSnap would “most likely” fall under the bill in certain configurations, which is not the kind of legal certainty that compliance programs require.
Second: Prohibiting AI Training on Student Data
The bill adds an explicit prohibition on using covered student information — including persistent unique identifiers — to train generative AI systems or develop AI models, unless the use is strictly in furtherance of an educational purpose and for the benefit of the relevant educational institution.
This provision addresses a gap that the current framework does not touch at all. KOPIPA prohibits selling student data and using it for advertising. It says nothing about using it to train AI models, because the drafters in 2014 could not have anticipated that language model training would become one of the primary uses of large-scale data collection. By the time AI training became a standard industry practice, the law had no answer for whether feeding student behavioral data, content creation patterns, and assessment responses into a training pipeline constituted a prohibited use.
The legislative record makes clear that this gap has already been noticed and selectively addressed by individual contracts: last year, the California State University system signed a nearly $17 million contract with OpenAI, including an agreement that the company will not train its models on student data. Advocates for Addis’ bill say the same privacy restrictions should apply to any AI company with access to California student data, regardless of whether the company has an agreement with the student’s school district or college.
That framing is important. The CSU-OpenAI contract demonstrates that the technical infrastructure for segregating training data from user data exists and is operationally feasible. Major AI providers including Anthropic, Microsoft, and Google already offer no-training commitments in their educational products. What AB 1159 does is convert that voluntary commitment into a legal requirement — preventing a race to the bottom in which the protections available to a student depend entirely on whether their school district had the negotiating leverage to demand them.
Third: Extending Protection to Higher Education
AB 1159 extends comprehensive student data privacy protections beyond K-12 and early learning by enacting the Higher Education Student Information Protection Act (HESIPA), thereby creating a unified privacy framework that applies from preschool through postsecondary education. While ELPIPA extended KOPIPA-like safeguards to preschool and prekindergarten students, higher education students — numbering approximately 2.9 million in California — remain largely outside any comparable state-level statutory regime governing the conduct of educational technology vendors.
The higher education gap is real and significant. A student who benefits from KOPIPA’s prohibitions on data sale, targeted advertising, and profile-building throughout their K-12 career effectively loses those protections the day they enroll in a California college or university. The ed-tech products used in higher education — learning management systems, proctoring software, AI tutoring tools, mental health apps required for campus housing access — collect data that is at least as sensitive as anything collected in K-12 contexts, and in many cases more so.
The higher education provisions include prohibitions on targeted advertising, profiling, sale of student data, and improper disclosure, along with heightened restrictions on the collection and use of sensitive categories including immigration status, reproductive or sexual health, and sexual orientation or gender identity. That last element is not incidental. The bill strengthens protections for sensitive categories of student data, including information relating to immigration status, reproductive or sexual health, and sexual orientation or gender identity, reflecting the heightened risks posed by advanced data aggregation and inference.
In the current political environment — where the Trump administration is actively attempting to collect data about California residents’ immigration status and gender identity from federal systems — building state-level protection against the aggregation of this data by private ed-tech companies is not merely a privacy compliance consideration. It is a direct response to a documented federal threat.
The Private Right of Action: The Provision the Industry Fears Most
AB 1159 significantly strengthens enforcement of California’s student data privacy laws by establishing a limited private right of action for students, pupils, or their parents or guardians who suffer actual harm as a result of an operator’s noncompliance with KOPIPA, ELPIPA, or the new higher education provisions. At the same time, it incorporates a structured notice-and-cure framework designed to promote compliance and avoid unnecessary litigation.
The notice-and-cure mechanism is a deliberate compromise with the business community’s concerns: before filing suit, plaintiffs must provide the alleged violator with notice and an opportunity to remedy the violation. The framework limits liability to companies that receive notice and fail to cure, meaning organizations that promptly address identified violations have a meaningful path to avoiding litigation. It is not a BIPA-style regime in which technical violations automatically generate statutory damages regardless of any attempt at remediation.
But it is still a private right of action, and that is precisely why TechNet, the College Board, and the Chamber of Commerce are opposing it so vigorously. The College Board’s written opposition is characteristically candid: the express addition of the broad private right of action subjects College Board and other operators to class action and litigation exposure, which is an extraordinary expense that can limit our ability to dedicate nonprofit resources to our educational mission and to students.
Strip away the nonprofit framing and the argument is straightforward: we prefer a world where enforcement depends on the Attorney General’s caseload decisions, because in that world the probability of accountability is vanishingly small. One enforcement action in a decade supports that preference empirically. The private right of action threatens to change the math in a way that purely AG-dependent enforcement never has.
For privacy professionals, this enforcement architecture question is more significant than any specific substantive provision. A privacy law without effective enforcement is not a privacy law — it is a statement of aspiration. The difference between KOPIPA’s current posture, which produced one enforcement action in eleven years, and a regime with a functional private right of action is not a marginal compliance consideration. It is the difference between a law that changes behavior and a law that generates policy papers.
The Loopholes That Remain: Where AB 1159 Falls Short
For all its ambition, AB 1159 has meaningful limitations that privacy professionals should understand clearly, because they define the boundaries of what the bill achieves even if it passes in its current form.
The extracurricular ambiguity is the most immediate operational problem. The bill’s expansion of the operator definition is designed to capture TeamSnap-type platforms, but the conditions under which a platform that is not primarily school-focused falls within scope — marketed to schools, directed by schools, school-sponsored program — leave significant gray area for exactly the fact patterns where protection is most needed. A coach who recommends TeamSnap without district approval, a university that informally suggests a wellness app for student use, a college club that requires members to use a communication platform not contracted by the institution — all of these scenarios involve student data in contexts that AB 1159 may or may not reach, and the legislative analysis acknowledges the ambiguity rather than resolving it.
The college student consent question is equally unresolved. For adult students at higher education institutions, the bill’s protections are meaningfully weaker than those for K-12 students, reflecting the tension between privacy protection and adult autonomy. The legislative analysis notes that adult students have increased control over their own data under the bill — they can share information with businesses outside the covered framework if they choose. But the reality of the higher education environment — where students are routinely required to use specific platforms for coursework, residential life, and administrative functions — makes the concept of meaningful voluntary consent to those platforms structurally problematic in ways the bill does not fully resolve. The power dynamics in a mandatory course platform are not materially different from those in a mandatory K-12 tool, but the legal treatment is.
The federal context creates a third limitation that no state law can fully address. The bill’s heightened protections for immigration status and gender identity data are a direct and appropriate response to the federal threat environment. But a state law prohibiting ed-tech companies from selling or disclosing this data provides protection against private commercial exploitation — it provides no protection against federal compelled disclosure through legal process. If federal authorities subpoena an ed-tech company for student data, California’s student privacy law does not prevent compliance. That gap is not a failure of AB 1159 specifically; it is a structural limitation of state-level privacy protection in a federal system where the executive branch is actively treating state-collected data as a federal resource.
The Opposition’s Arguments: What They Reveal
The organized opposition to AB 1159 is a coalition of organizations whose arguments deserve examination not just for their surface claims but for what those claims reveal about the industry’s actual interests.
TechNet’s argument — that the bill “remains overly broad and would significantly chill responsible AI development, particularly in education technology” — follows a pattern that privacy professionals have seen repeatedly in every significant privacy legislation battle of the past decade. The “chilling innovation” argument has been deployed against the CCPA, against BIPA, against the GDPR, and against every other significant privacy framework. It has been consistently wrong as a prediction: GDPR has not prevented European technology development; BIPA has not prevented Illinois employers from using biometric systems; the CCPA has not caused California’s technology sector to relocate. The argument functions not as a factual prediction but as a political pressure tool.
The College Board’s opposition is more substantively specific and deserves more careful attention. The College Board’s concerns center on the bill’s disclosure provisions — specifically, whether the bill’s language adequately preserves the ability of students to send SAT and AP scores to colleges and scholarship programs. The legislative analysis addresses this concern with a specific carve-out allowing disclosure to educational institutions for assessment, admissions, and other K-12 or higher education purposes. Whether that carve-out fully addresses the College Board’s operational concerns is a legitimate technical question.
But the College Board’s opposition sits uneasily alongside its own documented record. The author points to a Consumer Reports investigation reporting that the College Board uses its role as gatekeeper to higher education to collect and share information on students despite apparent promises to the contrary, and notes that the organization paid a $750,000 penalty to settle claims that it shared and sold student data. An organization with that record arguing that new student privacy protections are too burdensome is an organization that merits regulatory skepticism, not deference.
What This Means for Compliance Programs
For privacy professionals advising ed-tech companies, school districts, and higher education institutions on AB 1159, the practical compliance implications fall into several categories.
Ed-tech operators need to conduct an honest assessment of whether their products meet the “actual knowledge” standard under the revised operator definition. The shift from “primarily designed and marketed” to “actual knowledge” is not a technical change — it is a substantive expansion of scope that will capture products that have been operating outside KOPIPA’s reach by relying on the primary-use argument. If your platform is being used in school-sponsored extracurricular programs and you know it, the bill’s language is designed to reach you regardless of whether you market to the general consumer market as well.
For AI-powered ed-tech products specifically, the training prohibition requires an immediate audit of data pipelines. If student behavioral data, content creation, or assessment responses are flowing into model training datasets — even under contractual arrangements with schools that do not specifically address training — that use will be prohibited if AB 1159 passes. The CSU-OpenAI model is the operational template: a documented, auditable commitment to data segregation between user data and training data, with contractual teeth.
For higher education institutions, AB 1159 creates a new vendor due diligence obligation that currently does not exist at the state statutory level. The ed-tech vendors serving California’s 2.9 million higher education students — learning management systems, proctoring platforms, AI tutoring tools, mental health apps, housing management systems — will become subject to KOPIPA-equivalent prohibitions under HESIPA. Institutions should be reviewing their existing vendor agreements now, before the bill passes, to identify contracts that would need to be amended to comply with the new framework.
For school districts navigating the extracurricular gray area, the compliance answer is straightforward even where the legal answer is uncertain: apply the substantive protections of KOPIPA to all ed-tech vendors that interact with student data in any school-adjacent context, regardless of whether those vendors would technically fall within or outside the bill’s formal scope. A district that voluntarily requires KOPIPA-equivalent contract terms from TeamSnap-type platforms is both better protected than the law currently requires and better positioned if the scope question is later resolved against the narrow interpretation.
The Legislative Trajectory
AB 1159 has garnered support from education groups and co-sponsors and now heads to the Assembly Appropriations Committee. Appropriations is where fiscally significant bills can stall or be amended to reduce their cost impact, and the bill’s private right of action creates potential fiscal implications for state court operations: the Governor’s January budget proposal includes $70 million in ongoing General Fund support for court operations, partly in anticipation of increased litigation from privacy-related private rights of action.
The political configuration — California Labor Federation supporting, Chamber of Commerce opposing, Governor’s office watching — is familiar from every major California privacy bill of the past decade. Governor Newsom’s track record on privacy legislation is mixed: he has signed privacy bills such as the Delete Act, but his office pressured CalPrivacy to weaken its automated decision-making rules. Whether AB 1159 reaches his desk in a form he will sign, or arrives there with the private right of action intact, will tell privacy professionals a great deal about where California’s political will on student privacy actually sits in 2026.
What is not in question is that the problem AB 1159 addresses is real, documented, and growing. The technology that California’s students are required to use every day — from classroom AI tools to sports team management apps to college health platforms — is generating personal data at a scale and sensitivity level that the 2014 framework was never designed to govern. One enforcement action in more than a decade is not a compliance regime for a state with 6 million K-12 students and 2.9 million higher education students. AB 1159 is an imperfect but necessary attempt to close a gap that should not have been allowed to persist this long.
The question for privacy professionals — and for the California legislators voting on this bill — is whether an imperfect solution is better than the status quo. The answer to that question, measured by eleven years of near-zero enforcement and a data broker ecosystem that has been profiting from student data without legal consequence, seems clear.