Privacy Pitfalls in the Push for Wearables

U.S. Health Secretary Robert F. Kennedy Jr. announced an ambitious plan to equip every American with a health-tracking wearable within four years, backed by one of the largest advertising campaigns in the history of the Department of Health and Human Services (HHS). Touted as a cornerstone of the “Make America Healthy Again” agenda, the initiative centers on devices like smartwatches, fitness bands, and continuous glucose monitors (CGMs) that promise to empower individuals to monitor metrics like heart rate, sleep, and glucose levels. But as these devices become ubiquitous, they raise serious privacy concerns, echoing issues seen in recent health data lawsuits. This article explores the privacy risks of the initiative, its parallels with other health data privacy cases, and the need for robust protections in an AI-driven world.

As we’ve covered in depth, there has been a sharp increase in privacy litigation, especially in the healthcare space. Because HIPAA itself offers no private right of action, plaintiffs increasingly bring these suits under wiretapping statutes such as the Electronic Communications Privacy Act, which does allow private lawsuits, alleging that a business collected sensitive health data without permission.

The Promise and Peril of Health Wearables

Health trackers offer undeniable benefits. Devices like the Oura Ring and Apple Watch can detect early signs of illness, encourage healthier habits, and provide personalized insights, such as how diet affects glucose levels. Kennedy emphasized their affordability, noting that a $50-$90 CGM is a cost-effective alternative to medications like Ozempic, which costs roughly $1,300 per month. The HHS envisions subsidies to make these devices accessible, potentially transforming how Americans approach preventive care. Yet mass adoption of wearables introduces significant risks. Unlike medical records protected by HIPAA, data from most health trackers, which are typically classified as “wellness” devices, falls outside federal privacy regulations, leaving it vulnerable to misuse, breaches, or sale to third parties. Other privacy-related healthtech suits filed over the last few years have already produced hundreds of millions of dollars in settlements.

Privacy Risks in Focus

The push for universal health trackers amplifies several privacy concerns:

  • Data Breaches and Hacking: Wearables collect sensitive data, from heart rate to location, which can be exposed in breaches. A 2021 CVS Health database leak exposed over one billion records, including search queries for medications and vaccines, highlighting the vulnerability of health-related data.
  • Third-Party Sharing: Many wellness brands share data with advertisers or data brokers, often without clear user consent. A 2025 investigation by The Markup and CalMatters found state-run health exchanges in Nevada, Maine, Massachusetts, and Rhode Island sending sensitive user data to Google and LinkedIn, triggering lawsuits and federal scrutiny.
  • Insurance and Discrimination Risks: Health data can be used to adjust insurance premiums or deny coverage. Experts warn that compromised wearable data could lead to higher rates for less healthy individuals or even job discrimination, as seen in concerns raised by Privacy Rights Clearinghouse about fitness tracker data.
  • AI-Driven Surveillance: AI-powered wearables can infer detailed health profiles, but their opacity raises risks of misuse. For example, a location tracker might incorrectly flag a clinic visit as evidence of specific medical procedures, a concern echoed in a 2020 study on digital health footprints; a minimal sketch of this failure mode follows this list.
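
To make the surveillance risk concrete, here is a minimal Python sketch of the kind of naive geofence inference described in the last bullet. All coordinates, place names, and the 50-meter radius are made up for illustration and are not drawn from any real product: because a clinic and a pharmacy sit within the same radius of a single location ping, the user is flagged for a “clinic visit” they never made.

```python
import math

# Hypothetical points of interest; names and coordinates are illustrative only.
PLACES = [
    {"name": "Downtown Fertility Clinic", "lat": 40.7411, "lon": -73.9897, "category": "clinic"},
    {"name": "Corner Pharmacy",           "lat": 40.7413, "lon": -73.9899, "category": "pharmacy"},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate distance in meters between two nearby coordinates."""
    dlat = (lat2 - lat1) * 111_000                                  # ~meters per degree of latitude
    dlon = (lon2 - lon1) * 111_000 * math.cos(math.radians(lat1))   # scale longitude by latitude
    return math.hypot(dlat, dlon)

def label_visits(pings, radius_m=50):
    """Naively tag each location ping with every 'sensitive' place within radius_m."""
    inferences = []
    for ping in pings:
        for place in PLACES:
            if distance_m(ping["lat"], ping["lon"], place["lat"], place["lon"]) <= radius_m:
                inferences.append((ping["time"], place["name"], place["category"]))
    return inferences

# A single ping from a wearable's companion app. The user actually went to the pharmacy,
# but both places fall inside the 50 m radius, so a clinic visit is inferred as well.
pings = [{"time": "2025-03-02T10:15", "lat": 40.7412, "lon": -73.9898}]
for when, name, category in label_visits(pings):
    print(f"{when}: inferred visit to {name} ({category})")
```

The point is not that such code is sophisticated; it is that low-quality inferences like this are cheap to produce at scale once location data leaves the device.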

Chart: Health Data Privacy Incidents and Legal Actions

Case/Incident | Year | Issue | Outcome
CVS Health Data Leak | 2021 | Exposed 1B+ records, including medication searches | Database taken offline after disclosure; no fines reported
Covered California Tracker Scandal | 2025 | Shared health data with LinkedIn, Google | Class-action lawsuit filed; data practices under review
GoodRx/Premom FTC Lawsuits | 2023 | Shared health data without consent | FTC fines; mandated stricter privacy policies
Mozilla Report on Wearables | 2024 | Weak legal protections for wearable data | Called for expanded national privacy laws

Lessons from Health Data Lawsuits

The privacy risks of health trackers are not new. Recent lawsuits highlight the dangers of lax oversight. In 2023, the FTC sued GoodRx and Premom for sharing sensitive health data, like prescription and ovulation records, without consent, leading to fines and stricter data policies. Similarly, the 2025 Covered California scandal revealed state health exchanges leaking data to tech giants, prompting a class-action lawsuit and calls for reform. These cases underscore a key issue: without comprehensive federal privacy laws, health data from wearables can be exploited, sold, or hacked, mirroring concerns with the HHS wearable initiative. Connecticut’s SB 1295 (2025), which expands consumer rights to challenge AI profiling and access third-party data-sharing lists, offers a model for addressing these risks, but it’s state-specific and doesn’t cover wearables explicitly.

The AI Connection

AI amplifies both the potential and peril of health trackers. Devices like Ultrahuman’s Blood Vision use AI to predict disease markers, while Oura’s on-device AI aims to enhance privacy by reducing cloud reliance. However, AI’s ability to aggregate and analyze vast datasets increases the risk of deanonymization, where “anonymous” data is reverse-engineered to identify individuals. A 2023 report confirmed that U.S. government agencies purchase such data from brokers, raising fears of surveillance. The HHS initiative, with its ties to AI-driven devices like the CGMs promoted by Levels (a company co-founded by Surgeon General nominee Casey Means), must ensure that data remains secure and isn’t misused for profit or profiling.
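
The mechanics of deanonymization can be surprisingly mundane. The sketch below uses fabricated records (every name, ZIP code, and reading is invented for illustration) to show how an “anonymized” wearable export can be re-identified simply by joining it to a named dataset on shared quasi-identifiers such as ZIP code, birth year, and sex.

```python
# Toy re-identification by linking quasi-identifiers. All records are fabricated.

# An "anonymized" wearable export: names stripped, quasi-identifiers kept.
wearable_export = [
    {"zip": "30301", "birth_year": 1984, "sex": "F", "avg_resting_hr": 58, "cgm_user": True},
    {"zip": "30301", "birth_year": 1992, "sex": "M", "avg_resting_hr": 72, "cgm_user": False},
]

# Named records of the kind data brokers sell (voter-roll or marketing-list style).
named_records = [
    {"name": "Jane Doe", "zip": "30301", "birth_year": 1984, "sex": "F"},
    {"name": "John Roe", "zip": "30301", "birth_year": 1992, "sex": "M"},
]

def reidentify(anon_rows, named_rows):
    """Match rows whose (zip, birth_year, sex) combination is unique among the named rows."""
    key = lambda row: (row["zip"], row["birth_year"], row["sex"])
    by_key = {}
    for row in named_rows:
        by_key.setdefault(key(row), []).append(row)
    matches = []
    for row in anon_rows:
        candidates = by_key.get(key(row), [])
        if len(candidates) == 1:          # a unique combination means re-identification
            matches.append((candidates[0]["name"], row))
    return matches

for name, record in reidentify(wearable_export, named_records):
    print(f"{name} re-identified: resting HR {record['avg_resting_hr']}, CGM user: {record['cgm_user']}")
```

Real linkage attacks use richer quasi-identifiers and far larger datasets, but the join itself is rarely more complicated than this.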

A Call for Robust Protections

The HHS wearable campaign could revolutionize healthcare but demands stringent safeguards. Oura’s CEO, Tom Hale, emphasized privacy-first design, moving to on-device AI to limit data exposure. Yet, most wearable companies lack such commitments, and federal laws lag behind. Mozilla’s 2024 report calls for expanding HIPAA to cover wearable data and simplifying opt-out processes, a sentiment echoed by privacy experts like Emory Roane. Businesses must adopt transparent data practices to build trust, as seen in Connecticut’s push for profiling transparency. Policymakers should learn from global models, like the EU’s GDPR, which restricts data sharing, and ensure subsidies don’t favor companies with weak privacy records.
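
What privacy-first, on-device processing can look like in practice is sketched below; the function and payload are illustrative assumptions rather than Oura’s or any vendor’s actual design. Raw per-minute readings never leave the device, and only a coarse nightly summary is prepared for upload.

```python
from statistics import mean

def nightly_summary(raw_samples_bpm):
    """Reduce a night's per-minute heart-rate samples to a coarse on-device summary.

    Only this summary is queued for upload; the raw time series stays on the device.
    """
    return {
        "avg_bpm": round(mean(raw_samples_bpm)),
        "min_bpm": min(raw_samples_bpm),
        "max_bpm": max(raw_samples_bpm),
        "samples_kept_local": len(raw_samples_bpm),   # a count only, never the raw values
    }

# Illustrative overnight readings (values are made up).
raw_overnight = [62, 60, 58, 57, 59, 61, 64, 66, 63, 60]

payload_for_cloud = nightly_summary(raw_overnight)
print(payload_for_cloud)
# {'avg_bpm': 61, 'min_bpm': 57, 'max_bpm': 66, 'samples_kept_local': 10}
```

The design choice is simple but meaningful: if the cloud dataset is breached, what leaks is a handful of averages rather than a minute-by-minute physiological record.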

The Path Forward

As health trackers become a cornerstone of public health, the stakes for privacy are immense. The HHS must prioritize consumer control, mandating clear consent mechanisms and data minimization, as seen in privacy-focused apps like Drip and Euki. Without these, the initiative risks repeating the mistakes of past data scandals, eroding trust in both technology and government. Consumers, too, can take action, using privacy browsers like DuckDuckGo or extensions like uBlock Origin to block trackers, but the burden shouldn’t fall solely on individuals.
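
Below is a minimal sketch of the consent-and-minimization pattern described above, assuming a hypothetical per-field consent structure (the field names and flags are not drawn from any specific product): every field is withheld by default and is shared only when the user has explicitly opted in.

```python
# Consent-gated sharing sketch: default deny, explicit per-field opt-in.
# The field names and consent flags are illustrative assumptions.

user_consent = {
    "share_step_count": True,     # the user opted in to sharing daily steps
    "share_heart_rate": False,    # heart-rate data stays on the device
    "share_location": False,      # location history is never shared
}

FIELD_TO_CONSENT = {
    "step_count": "share_step_count",
    "heart_rate_series": "share_heart_rate",
    "location_history": "share_location",
}

def build_outbound_payload(device_data, consent):
    """Include a field only when its consent flag is explicitly True; drop everything else."""
    return {
        field: value
        for field, value in device_data.items()
        if consent.get(FIELD_TO_CONSENT.get(field, ""), False)
    }

device_data = {
    "step_count": 8421,
    "heart_rate_series": [61, 63, 72, 90],
    "location_history": [("2025-03-02T10:15", 40.7412, -73.9898)],
}

print(build_outbound_payload(device_data, user_consent))
# {'step_count': 8421}  <- fields without an explicit opt-in are withheld
```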

Ultimately, the vision of a healthier America through wearables is compelling, but it must not come at the cost of privacy. By learning from lawsuits like Covered California and GoodRx, and adopting robust protections like those in Connecticut’s SB 1295, the HHS can ensure that health trackers empower, not exploit, Americans. In an AI-driven world, privacy isn’t just a right; it’s a prerequisite for trust.

Online Privacy Compliance Made Easy

Captain Compliance makes it easy to develop, oversee, and expand your privacy program. Book a demo or start a trial now.