Picture this: Over a third of Americans strap on a wearable, whether it's an Apple Watch tracking heart rhythms or an Oura ring monitoring sleep cycles, chasing smarter wellness routines. These gadgets pull in intimate details: blood oxygen readings, fall alerts, respiration rates, VO2 max scores, and even EEG brain waves from advanced models. But here's the rub: Unlike traditional healthcare providers, the tech firms behind these devices aren't bound by HIPAA. That means no mandatory de-identification before data gets shared or sold to advertisers and partners.
Federal lawmakers have talked a big game, but gridlock has stalled comprehensive reform. Enter the states, racing to plug the holes with rules that treat health data as ultra-sensitive. If you're in the health-tech space, understanding how this patchwork is nudging businesses toward HIPAA-inspired de-identification practices is key to building a solid data governance posture. So, whether you're a wearable maker, app developer, or ad platform, here's the lay of the land and how to gear up for privacy compliance with our help.
Federal Foot-Dragging Meets State Action
Congress has mulled bills to rope in consumer tech under HIPAA-like rules, but nothing’s stuck—until now. On November 4, 2025, Sen. Bill Cassidy (R-La.) dropped the Health Information Privacy Reform Act (S.3097), a bipartisan push to extend federal privacy baselines to wearables and health apps. It would mandate consent for data shares, beef up breach notifications, and require de-identification for secondary uses, mirroring HIPAA’s safe harbor or expert methods. Still early days: The bill’s in committee, with hearings eyed for early 2026.
Without a national fix, states are leading the charge. All 19 with comprehensive privacy laws flag “consumer health data” as sensitive, triggering extras like opt-ins or sale bans. (That’s up from 18 earlier this year, with Nevada’s tweaks.) These split into two camps: narrow scopes in nine states and broader ones in ten, plus Washington’s outlier standalone act. The variance? It could mean the difference between routine opt-outs and full-throated consents for your data playbook.
Narrow Definitions: Room to Maneuver for Wearables
On the lighter-touch side, nine states—Indiana, Iowa, Kentucky, Minnesota, Montana, Nebraska, Tennessee, Texas, and Virginia—keep “consumer health data” tight: basically, formal diagnoses from medical pros. If your wearable sticks to fitness metrics without “diagnosing” (say, flagging arrhythmia as a condition), you might dodge sensitive status. Heart rate spikes? Still personal data, so users get access, deletion, and targeted ad opt-outs—but no opt-in wall.
This carve-out buys breathing room for non-medical apps, but tread carefully: Regulators could stretch "diagnosis" to cover inferred insights, like AI spotting sleep apnea from patterns. Recent enforcement in Texas (a 2025 AG probe into a fitness tracker) rejected claims of overreach, but appeals loom as of this writing.
Broad Strokes: Heart Rates and Beyond as Sensitive
Flip to the ten broader states—California, Colorado, Connecticut, Delaware, Maryland, New Hampshire, New Jersey, Oregon, Rhode Island, and Utah—and the net widens to any “status, condition, or health-related info.” That sweeps in wearable staples: wrist temps, VO2 trends, even stress inferred from variability. Except in California (opt-out model), these demand explicit opt-in consent for collection, processing, or sharing.
Compliance hit: For cross-state ops, default to opt-in banners at signup, with granular toggles. A 2025 Colorado case against a meditation app ended in a $2.3 million fine for burying consents in fine print. Lesson learned.
The Neural Data Puzzle: Brain Waves Under the Microscope
Edgier still: Neural signals like EEG from sleep headbands. Do they count as “health data”? It’s murky, but three states say yes outright—California, Colorado, and Connecticut explicitly list neural data in their definitions. Montana folds it into its Genetic Information Privacy Act, though likely just for DNA-tied firms.
For the broad-definition crew, AGs could argue EEG reveals mental states (anxiety patterns, say), demanding opt-ins. Jameson Spivak’s 2025 analysis flags Colorado’s biometric tie-in: It kicks in only if the device IDs the user. Open bet: In a test case, New Jersey’s AG might probe a neurofeedback wearable, citing “condition” coverage. Businesses: Flag neural features early, layer consents, and pseudonymize where possible.
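To make "pseudonymize where possible" concrete, here's a minimal Python sketch assuming a simple keyed-hash approach; the record fields, environment variable, and function names are hypothetical, and this only reduces exposure rather than fully de-identifying (pseudonymized data still counts as personal under regimes like the GDPR).

```python
import hashlib
import hmac
import os

# Illustrative only: field names, env var, and schema are hypothetical.
# A keyed hash (HMAC) links a user's sessions internally without storing
# the raw account ID alongside neural readings; keep the key out of the dataset.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "dev-only-key").encode()

def pseudonymize_user_id(user_id: str) -> str:
    """Derive a stable pseudonym; linkable only by whoever holds the key."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def pseudonymize_eeg_record(record: dict) -> dict:
    """Drop direct identifiers, keep signal features keyed to the pseudonym."""
    return {
        "pseudonym": pseudonymize_user_id(record["user_id"]),
        "session_start": record["session_start"],
        "eeg_band_powers": record["eeg_band_powers"],  # e.g., alpha/beta/theta
        # name, email, device serial, and IP address are intentionally omitted
    }
```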
The Heavy Hitters: Washington and Maryland Lead the Charge
Then come the strictest: Washington’s My Health My Data Act (MHMDA, effective 2024) casts the widest net, deeming any “consumer health data” (from wellness tips to symptom logs) regulable, with private rights of action for violations. No de minimis exemptions—opt-in for all, plus 18-month data minimization.
Maryland’s Online Data Privacy Act (ODPA, 2025) dials it to 11: It outright bans selling or targeted ads based on sensitive health data, consent be damned. “Sale” includes any monetized share, so adtech tie-ins? Off-limits. Both likely snag neural data; Washington’s 2025 enforcement against a fertility tracker racked up $5 million in settlements for unconsented shares.
Pending Power Plays: Eyes on Massachusetts and New York
The momentum’s building. Massachusetts’ H.4746 (Massachusetts Consumer Data Privacy Act), released November 17, 2025, by the House Committee, echoes Maryland’s toughness—no sales of sensitive data, plus civil suits. It’s advanced to Ways and Means, with floor votes possible by spring 2026; if passed, it’d be among the nation’s strictest.
New York’s Senate Bill 929 (Health Information Privacy Act) cleared both chambers in March 2025, mirroring Washington’s MHMDA with consent mandates and breach alerts. As of late November 2025, it’s on Gov. Hochul’s desk; signing odds look high given bipartisan backing, with an effective date expected in mid-2026. This would vault NY into the standalone health privacy club, hitting wearables hard.
De-Identification: Your HIPAA-Inspired Shield
For ad-driven models, the endgame? De-identify before sharing. Every U.S. privacy law so far carves out truly anonymized data: no re-identification risk, no rules apply. Contrast that with the GDPR, where pseudonymized data still counts as personal and often requires DPIAs.
Borrow HIPAA’s toolkit: the safe harbor method strips 18 identifiers (names, IPs, etc.), while expert determination has a statistician certify that re-identification risk is very small. For wearables, share aggregate trends (e.g., city-wide VO2 max averages) without fine-grained location. Washington’s MHMDA nods to this, exempting de-identified sets. Pro: It scales nationally. Con: Re-identification techniques evolve fast, so audit your methods yearly; adding noise to published aggregates (differential privacy, the technique Apple uses for its telemetry) hardens them further.
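For the safe-harbor-plus-aggregation route, a minimal sketch might look like the following; the field names are hypothetical, the identifier list is truncated (the real safe harbor enumerates 18 identifier types), and the small-cell threshold is an illustrative guardrail rather than a legal standard.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical field names; HIPAA's safe harbor covers 18 identifier types,
# only a few of which are shown here.
DIRECT_IDENTIFIERS = {"name", "email", "ip_address", "device_serial",
                      "street_address", "birth_date", "phone"}
MIN_CELL_SIZE = 20  # suppress small groups to lower re-identification risk

def strip_identifiers(record: dict) -> dict:
    """Drop direct identifiers; keep coarse location plus the metric."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

def citywide_vo2_averages(records: list[dict]) -> dict[str, float]:
    """Aggregate VO2 max by city, suppressing cells below the threshold."""
    by_city = defaultdict(list)
    for raw in records:
        cleaned = strip_identifiers(raw)
        by_city[cleaned["city"]].append(cleaned["vo2_max"])
    return {city: round(mean(values), 1)
            for city, values in by_city.items()
            if len(values) >= MIN_CELL_SIZE}
```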
Steering Through the Maze: Action Items for Your Team
As states harmonize toward HIPAA-lite, non-compliance risks fines (up to $7,500 per violation in CA) or class actions. Quick wins:
- Map Data Flows: Tag health metrics by state, narrow vs. broad (see the sketch after this list).
- Consent Overhaul: Granular opt-ins for broad states; test UI for clarity.
- De-ID Playbook: Adopt HIPAA methods; third-party certify.
- Monitor Feds: Track Cassidy’s HIPRA for preemptive alignment.
- Audit Partners: Vet adtech for downstream compliance, especially neural adds.
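As referenced in the first item above, here's a minimal sketch of a state-by-state tagging pass, assuming the narrow/broad/standalone groupings described earlier; the mapping, function names, and consent labels are illustrative and should be validated against current statutes before use.

```python
from enum import Enum

class Regime(Enum):
    NARROW = "narrow"          # diagnosis-centric definitions
    BROAD = "broad"            # any health-related status or condition
    STANDALONE = "standalone"  # MHMDA-style consumer health data acts

# Groupings taken from the discussion above; state lists shift year to year,
# so confirm against current statutes before relying on this.
STATE_REGIME = {
    **dict.fromkeys(["IN", "IA", "KY", "MN", "MT", "NE", "TN", "TX", "VA"], Regime.NARROW),
    **dict.fromkeys(["CA", "CO", "CT", "DE", "MD", "NH", "NJ", "OR", "RI", "UT"], Regime.BROAD),
    "WA": Regime.STANDALONE,
}

def consent_posture(state: str, is_diagnostic_metric: bool) -> str:
    """Rough first-pass tag for a health metric collected from a given state."""
    regime = STATE_REGIME.get(state)
    if regime is Regime.STANDALONE:
        return "opt-in for all consumer health data; data minimization"
    if regime is Regime.BROAD:
        if state == "CA":
            return "opt-out / right-to-limit model"
        if state == "MD":
            return "opt-in, and no sale or targeted ads on this data"
        return "explicit opt-in for collection, processing, sharing"
    if regime is Regime.NARROW:
        return "opt-in if diagnostic" if is_diagnostic_metric else "opt-out rights only"
    return "baseline personal-data rights (access, deletion, ad opt-out)"
```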
These shifts aren’t just hurdles—they’re chances to build trust and sidestep 23andMe-style breaches (that 2023 hack still echoes in suits). At Captain Compliance, we craft bespoke roadmaps and privacy solutions to keep your business compliant. Let’s chat: Book a privacy software demo and future-proof your health tech edge with the help of Captain Compliance today.