State privacy trackers are useful, but they have a habit of treating each new bill the way a stock ticker treats each new trade — as an isolated movement, scrolling past with no narrative. The week ending 19 April 2026 deserves better treatment than that. Nine discrete legislative actions across eight states landed in seven days, and they fit together into four trends that any privacy or compliance lead should be reading as a single signal: the US patchwork is no longer adding states; it is now sharpening teeth.
Below is a structured read of what moved, why these specific moves are happening together, and what your program should be doing about it before the next quarterly compliance review.
What Happened, Quickly
For readers who want the news at the top:
- Alabama enacted the Alabama Personal Data Protection Act (HB 351), becoming the 21st state with a comprehensive consumer privacy law and the second to cross the line in 2026 (after Oklahoma).
- Virginia signed SB 338, prohibiting controllers from selling or offering for sale precise geolocation data — making Virginia the third state to draw that specific line.
- Kentucky signed HB 692, adding “automatic content recognition” (ACR) to the categories of sensitive data under its consumer privacy law.
- Nebraska signed LB 838, broadening the applicability of its Age-Appropriate Design Code Act and tightening several other children’s-privacy provisions.
- Maryland’s HB 711 cleared the legislature before adjournment. It expands sensitive data to include inferred data that “indicates” a sensitive category, and restricts the sale of personal data to government entities involved in immigration enforcement.
- California’s AB 1542 cleared the Assembly Privacy and Consumer Protection Committee 9–5. It would amend the CCPA to prohibit the sale or sharing of sensitive personal information outright.
- Delaware’s HB 380 was introduced by 32 Democratic members to significantly amend the state’s young consumer privacy law.
- Maine’s legislature adjourned without enacting a comprehensive privacy bill.
- California’s AB 2246 unanimously cleared committee; it would codify AADC-style provisions in the state’s business code.
That is a lot of motion in one week, but it is not random motion. Four threads run through it.
Trend One: Precise Geolocation Has Become a Defined Regulatory Category
Virginia’s SB 338 is the most consequential individual signing of the week, even though it sounds narrow. With this amendment, Virginia joins Maryland and Washington (under the My Health My Data Act for the health-data slice) in treating the sale of precise geolocation as categorically off-limits, rather than just as a sensitive-data category that can be processed with consent.
That is a meaningful doctrinal shift. For most of the post-CCPA era, sensitive data has been governed by a consent-gated structure: processing, including sale, is permitted once the consumer opts in. The new pattern — pioneered by Washington for health-linked geo and now generalised by Virginia — is prohibition rather than gating. Consent does not unlock the activity.
Compliance implication: any monetisation flow that involves the sale or sharing of latitude/longitude or radius data tied to identifiers needs a state-by-state map immediately. The legal basis you relied on in 2024 (consent, opt-in, contractual) is no longer sufficient in a growing number of jurisdictions. The ad-tech ecosystem that depends on bidstream geolocation is the most exposed; the more pedestrian risk is consumer-facing apps that quietly sell SDK-collected location to data brokers and analytics partners.
Trend Two: Sensitive Data Is No Longer a Closed List
Three of the week’s nine moves expand the definition of sensitive data — and they do it in three different directions, which is the part to pay attention to.
Kentucky added “automatic content recognition” (ACR). ACR is the technology your smart TV uses to identify what is on the screen, and which connected-device manufacturers use to serve targeted ads against viewing data. It is a category most privacy notices do not describe by name, even though many programs are subject to it through analytics or measurement vendors. Treating ACR as sensitive data forces opt-in (under Kentucky’s existing structure) and pulls smart-device data flows into a tier of obligations they have largely escaped.
Maryland expanded sensitive data to inferences. Under HB 711, sensitive data now includes “data inferred by a controller based on personal data that, alone or in combination with other data, is used to indicate” any sensitive category. This is the inference doctrine that European regulators have been applying for years, finally written into US state law. It captures the data scientist’s favorite pattern: do not collect the sensitive attribute, infer it from a hundred non-sensitive ones. If your model output indicates protected class, health status, immigration status, sexual orientation, or any other Article 9-style category, Maryland will now treat it as sensitive regardless of how it was derived.
California’s AB 1542 would prohibit the sale or sharing of sensitive personal information entirely. This is the most aggressive of the three moves: rather than expanding the definition of sensitive PI, it forecloses a use case (sale and sharing) for the existing definition. If enacted, it pushes California closer to the European model in which special-category data simply does not move to third parties via commercial transactions.
The combined effect is that “sensitive data” is becoming a multi-layer concept: more categories, more attribution methods (including inference), and stricter use restrictions. A single, harmonised definition across your privacy program is no longer realistic. You need state-specific decision logic.
Trend Three: Children’s Privacy Is the Most Active Legislative Front
Two of the week’s nine moves are children’s-privacy specific (Nebraska LB 838 and California AB 2246), and they represent two different legislative postures that are both gaining traction.
The “broadened applicability” model (Nebraska). LB 838 expands which entities are covered by the state’s Age-Appropriate Design Code Act. AADC-style laws have historically used either a “directed to children” trigger (narrow) or a “likely to be accessed by children” trigger (much broader, modeled on the UK’s AADC). When a state broadens applicability, it usually means moving toward the “likely to be accessed” standard, which can pull in general-audience services that are not marketing themselves to minors at all.
The “additional venue” model (California). AB 2246 would add AADC provisions to the business code, in addition to whatever exists in privacy law and the existing AADC. Layering matters because it creates additional enforcement hooks — different regulators, different remedies, different statutes of limitation. AADC compliance has been complicated by the federal court injunction history around California’s original AADC; codifying provisions in the business code is, in part, an attempt to give the substantive obligations more legal redundancy.
The pattern across both: states are not waiting for the federal kids’ privacy debate (KOSA, COPPA 2.0, the FTC’s COPPA Rule revisions) to settle. They are stacking additional state-level obligations now, on the assumption that the federal floor will eventually rise.
Compliance implication: any service plausibly accessible by minors — which, post-AADC, includes most general-audience consumer services — needs an actual age-assurance posture, a live design review against AADC principles, and a documented assessment of which states’ rules apply. “We aren’t directed at kids” is not a sufficient legal defense in a growing number of jurisdictions.
Trend Four: Privacy Law Is Becoming a Vehicle for Other Policy Fights
Maryland’s HB 711 contains a provision that, on its face, sits oddly inside a privacy bill: it restricts the sale of personal data to government entities that have engaged in or supported immigration enforcement within the last six months.
Set aside the policy debate. The structural point is that state privacy statutes are now being used to restrict commercial data flows for non-privacy purposes — civil rights, immigration, law enforcement, abortion access, gender-affirming care. Washington’s My Health My Data Act was an early version of this pattern (privacy law deployed to address abortion-related surveillance concerns). Maryland is extending it.
For data brokers, ad-tech firms, and any business that sells or licenses personal information to government buyers, this means privacy compliance now has an end-user vetting dimension. It is no longer enough to know that your buyer is contractually a “controller” or a “service provider.” You need to know whether the buyer falls into a category your states of operation have decided is off-limits, and whether the relevant six-month or twelve-month window applies. That is a meaningfully different procurement and customer-due-diligence problem.
Expect more of this. The next legislative cycles will almost certainly add similar carve-outs around law enforcement use of consumer data, ICE data sharing, and possibly bounty-style enforcement around health-data sales.
What This Means for Your Program This Quarter
A short, action-oriented list. Not exhaustive — and consciously not generic.
Run a precise-geolocation audit before July. Identify every flow where lat/long, radius, or place-of-presence data tied to identifiers leaves your environment under any commercial arrangement. Map the receiving entities, classify each flow as “sale,” “sharing,” or “service-provider processing” under each applicable state’s definitions, and confirm a compliant lawful basis exists in every state — including the new “no-sell” jurisdictions.
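The audit above is, at bottom, a classification exercise, and it can be scaffolded in code. The sketch below is a minimal illustration, not an implementation of any statute: the `DataFlow` schema, the field names, and the membership of the no-sell set (VA and MD per this article, WA for health-linked geolocation) are assumptions for the example, and the key design point is that consent is deliberately not an input, because in the no-sell jurisdictions consent does not unlock the sale.

```python
from dataclasses import dataclass

# Illustrative rule table: states treated here as prohibiting the sale
# of precise geolocation outright (per the article: VA, MD, and WA for
# the health-linked slice). Not legal advice; verify before relying on it.
NO_SELL_GEO_STATES = {"VA", "MD", "WA"}


@dataclass
class DataFlow:
    recipient: str          # receiving entity (hypothetical field)
    data_type: str          # e.g. "precise_geolocation"
    arrangement: str        # "sale", "sharing", or "service_provider"
    consumer_state: str     # two-letter state code of the data subject


def flag_geo_flows(flows):
    """Return the flows that need remediation under no-sell geo rules.

    Note that consent status is not consulted: in the no-sell states,
    a consented sale of precise geolocation is still a prohibited sale.
    """
    return [
        f for f in flows
        if f.data_type == "precise_geolocation"
        and f.arrangement in {"sale", "sharing"}
        and f.consumer_state in NO_SELL_GEO_STATES
    ]
```

A service-provider flow in Virginia passes through unflagged, while an identical flow classified as a "sale" is flagged — which is exactly the classification distinction the audit needs to surface for every receiving entity.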
Re-derive your sensitive-data inventory. Add ACR data (smart-TV, smart-speaker, content-identification) if it touches your stack. Add inferred sensitive attributes — your ML team’s pipeline outputs, scoring outputs, segmentation outputs that indicate health, immigration status, sexual orientation, religion, race, or political views. The Maryland-style “inferred sensitive” definition is going to spread; you do not want to discover this category through an investigation.
Pressure-test your AADC posture against “likely to be accessed.” If your defense rests on “we are not directed at children,” reconsider. Run an honest assessment of teen accessibility, perform the design review, and document the conclusion. The legislative trajectory says this question will be asked of you formally, in a regulator inquiry, within the next eighteen months.
Add a customer-vetting layer to data-broker and licensing operations. If you sell, license, or share personal information with government buyers — directly or via resellers — start tracking buyer category, use restrictions, and any state-specific carve-outs. Maryland’s immigration-enforcement restriction is the first; it will not be the last.
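A Maryland-style restriction reduces to a lookback-window check at transaction time. The sketch below is a simplified illustration under stated assumptions: the function shape and field names are invented for the example, and "six months" is approximated as 183 days — the statute's actual window computation should come from counsel, not from this code.

```python
from datetime import date, timedelta
from typing import Optional

# Approximation of the six-month lookback described in HB 711.
# The 183-day figure is an assumption for illustration only.
LOOKBACK = timedelta(days=183)


def buyer_is_restricted(is_government: bool,
                        last_enforcement_activity: Optional[date],
                        as_of: date) -> bool:
    """True if a Maryland-style rule would block a sale to this buyer.

    A buyer is restricted only if it is a government entity AND its most
    recent immigration-enforcement activity falls inside the lookback.
    """
    if not is_government or last_enforcement_activity is None:
        return False
    return as_of - last_enforcement_activity <= LOOKBACK
```

The operational burden is not the check itself but the inputs: someone has to maintain `last_enforcement_activity` per buyer, which is the customer-due-diligence dimension the paragraph above describes.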
Build state-specific sensitive-data routing into your DSAR and consent tooling. Treating sensitive PI as a single category across all states is becoming actively wrong. Your tooling needs to know that a California user’s sensitive PI may not be sold or shared at all (under AB 1542 if enacted), a Maryland user’s inferred sensitive data carries the same restrictions as collected sensitive data, a Kentucky user’s ACR data requires opt-in, and a Virginia user’s geolocation is off-limits for sale regardless of consent.
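The four per-state rules just listed can be encoded as a small decision table. The following is a minimal sketch, not a compliance engine: the rule keys, category names, and function signature are assumptions for the example, and the California row reflects AB 1542 as if enacted, which it has not been.

```python
# Illustrative per-state rule table. Statute references follow the
# article; everything else here is an assumed schema, not legal advice.
STATE_RULES = {
    "CA": {"sensitive_sale_prohibited": True},   # AB 1542, if enacted
    "MD": {"inferred_is_sensitive": True},       # HB 711
    "KY": {"acr_requires_opt_in": True},         # HB 692
    "VA": {"geo_sale_prohibited": True},         # SB 338
}


def may_sell(state: str, category: str, opt_in: bool = False) -> bool:
    """Decide whether selling a category of data can proceed for a state."""
    rules = STATE_RULES.get(state, {})

    # Maryland-style elevation: inferred sensitive attributes are
    # treated exactly like collected sensitive data.
    if category == "inferred_sensitive" and rules.get("inferred_is_sensitive"):
        category = "sensitive"

    if category == "sensitive":
        # AB 1542 would bar the sale outright; elsewhere it is opt-in gated.
        return False if rules.get("sensitive_sale_prohibited") else opt_in
    if category == "acr":
        # Kentucky pulls ACR data into the opt-in tier.
        return opt_in if rules.get("acr_requires_opt_in") else True
    if category == "precise_geolocation":
        # Virginia: prohibited regardless of consent.
        return not rules.get("geo_sale_prohibited")
    return True
```

The structural point the sketch makes: consent (`opt_in`) is an input to some branches and irrelevant to others, which is precisely why a single nationwide consent flag can no longer drive the decision.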
Track Delaware HB 380 and California AB 1542 closely. Delaware’s bill, if it passes substantively, will be the first significant amendment to a relatively new state law — a signal of how quickly second-generation state privacy laws are appearing. AB 1542 is the bigger story of the two; its passage would meaningfully change the calculus for any business selling or sharing sensitive PI in California, which is to say, most US-facing businesses.
The Bigger Picture
Twenty-one states is a milestone, but it is no longer the most useful number. The more revealing numbers are these: three states with absolute prohibitions on selling precise geolocation. Two states with inference-based sensitive data definitions, with more on the way. A growing list of states moving from “consent-gated” to “categorically prohibited” for specific data types. A first state — Maryland — pulling immigration-enforcement-end-buyer restrictions into a privacy law. A children’s-privacy front that is moving faster than every other privacy front combined.
The US privacy landscape spent five years building out laterally — adding states, harmonising structurally around the Virginia model, with Colorado and California as outliers. It is now building vertically — sharpening within categories, adding prohibitions on top of consent regimes, expanding sensitive data definitions, and using privacy law as a vehicle for adjacent policy goals.
For compliance teams, the operational consequence is clear and uncomfortable. You can no longer maintain a single nationwide consumer privacy posture and call it sufficient. The cost of not running state-specific decision logic is going up every quarter, and one of those quarters will be the one in which a regulator, an attorney general’s office, or a class-action plaintiff decides to make an example.
The good news is that this is fixable with discipline rather than headcount. The bad news is that the discipline has to start now, not after the next state crosses the line.
Captain Compliance covers US state privacy legislation, federal regulatory developments, and the operational backbone in-house teams need to keep pace. This article is provided for general information and is not legal advice.