Ready for a Comprehensive Analysis of America’s Latest Attempt at a Federal Privacy Law? Every couple of years, a new federal privacy bill emerges and ultimately stalls out. Will this one follow the same path, or is it the one that finally goes the distance?
Introduction: A Long-Awaited Opening Move
On April 22, 2026, U.S. Representative John Joyce (R-PA), Vice Chairman of the House Energy and Commerce Committee, introduced HR 8413 — formally titled the Securing and Establishing Consumer Uniform Rights and Enforcement over Data Act, or the SECURE Data Act. The bill represents the first major attempt in the 119th Congress to establish a comprehensive federal consumer privacy framework, and the first significant federal privacy legislation proposed in years.
The bill is the product of the Privacy Working Group established in February 2025 by Energy and Commerce Committee Chairman Brett Guthrie (R-KY), which spent over a year soliciting input from at least 45 stakeholders — including civil society organizations, industry groups, state legislators, small businesses, and state-level enforcement agencies — before producing the current draft. Nine Republican members of Congress were directly involved in crafting the legislation.
Make no mistake: this is an opening salvo, not a finished product. The bill’s authors openly acknowledge it will be “refined significantly” through negotiations in the months ahead, and Working Group staffers have invited feedback from the broader privacy community. But what it establishes — in structure, philosophy, and ambition — tells us a great deal about where federal privacy law may be heading.
Why This Bill Matters Now
The United States remains one of the few major democracies without a comprehensive national consumer privacy law. In the absence of federal action, states have moved aggressively to fill the void. As of 2026, 21 states have enacted comprehensive consumer privacy legislation, creating an increasingly complex patchwork of obligations that vary by jurisdiction.
This fragmentation has become a source of growing frustration for both businesses and privacy advocates. Businesses must navigate a tangle of differing requirements. Consumers receive inconsistent protections depending on where they live. The SECURE Data Act is explicitly designed to address this — by establishing a single national standard and, crucially, preempting state laws in the process.
Structure and Legislative Origins
The SECURE Data Act closely follows what privacy professionals call the “Washington state model” — a framework that has become the de facto template for U.S. state privacy legislation since Virginia enacted its Consumer Data Protection Act in 2021. The bill’s structure and language align most closely with Virginia’s law, but also incorporate elements from Kentucky’s version, reflecting Chairman Guthrie’s home state influence.
This lineage is significant. The Washington/Virginia model is generally considered more business-friendly than California’s approach, and the SECURE Data Act’s partisan origins as a Republican bill reinforce that orientation. It lacks several features favored by privacy advocates, while offering a streamlined compliance regime that industry groups have tended to prefer.
Who Is Covered? Thresholds and Exemptions
Applicability Thresholds
The bill establishes two primary coverage thresholds:
1. Volume-based threshold: Any company that processes the personal data of more than 200,000 U.S. consumers annually is subject to the law. This mirrors the threshold used in the prior American Privacy Rights Act (APRA) and, when adjusted for population, is more inclusive than any state law except Texas and Nebraska, which impose no population-based threshold at all.
2. Revenue-from-data-sales threshold: Any company that processes data on more than 100,000 U.S. consumers AND derives more than 25% of its gross revenue from selling personal data is also covered. This dual test mirrors approaches taken in many state laws. The 25% revenue prong matches the threshold used in most states, though Connecticut (no percentage threshold), Delaware, Maryland, and Rhode Island (each at 20%) are more demanding on this point.
The Small Business Exemption
Perhaps the most significant threshold-related provision is the small business revenue exemption: companies with less than $25 million in adjusted gross annual revenue are exempt from the law’s requirements. This carve-out is more generous than what most states allow — the typical state threshold follows Small Business Administration guidelines, exempting businesses with under $7.5 million in revenue — but less generous than APRA, which would have exempted businesses under $40 million.
To encourage smaller businesses that fall outside mandatory coverage to nevertheless adopt good privacy practices, the bill instructs the Secretary of Commerce to develop a voluntary code of conduct that small businesses can use as a guide.
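The coverage logic described above can be sketched as a simple decision function. This is a hypothetical illustration based only on the thresholds summarized in this article, not on statutory text, and it is not legal advice:

```python
def is_covered(consumers_processed: int,
               gross_revenue: float,
               revenue_from_data_sales: float) -> bool:
    """Rough sketch of the SECURE Data Act's applicability tests,
    as summarized above. Hypothetical helper for illustration only."""
    # Small business exemption: under $25M adjusted gross annual revenue
    if gross_revenue < 25_000_000:
        return False
    # Volume-based threshold: more than 200,000 U.S. consumers annually
    if consumers_processed > 200_000:
        return True
    # Dual test: more than 100,000 consumers AND more than 25% of
    # gross revenue derived from selling personal data
    if (consumers_processed > 100_000
            and revenue_from_data_sales > 0.25 * gross_revenue):
        return True
    return False
```

Note how the small business exemption short-circuits both thresholds: a data broker with 500,000 consumers but $10 million in revenue would fall outside mandatory coverage entirely, which is precisely why the voluntary code of conduct discussed above exists.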
Entity-Level Exemptions
The bill carves out several categories of entities entirely:
- All government entities and processors acting on their behalf
- Entities covered by existing federal financial privacy laws (such as GLBA)
- Other federally regulated sectors
Data-Level Exemptions
Beyond entity-level exemptions, the SECURE Data Act also excludes certain categories of data from coverage, regardless of who holds them:
- Employee data
- Health records (presumably data already covered by HIPAA)
- Credit report data (covered by FCRA)
- Personal data intermingled with any of the above exempt data types
This last point — the intermingling carve-out — is particularly notable. It could allow organizations to argue that significant quantities of personal data fall outside the law’s reach simply by virtue of being stored or processed alongside exempt categories.
Consumer Rights: The Core Entitlements
The SECURE Data Act grants consumers a robust set of data rights, largely consistent with what has become standard across state frameworks:
Access, Correction, and Deletion
Consumers have the right to:
- Know what personal data a business holds about them
- Correct inaccurate personal data
- Delete their personal data
These three foundational rights are the bedrock of modern privacy law globally, reflected in the EU’s GDPR and virtually every U.S. state privacy statute.
Data Portability
Consumers can request a copy of their personal data in a portable, usable format — enabling them to take their data elsewhere. This right encourages market competition by reducing switching costs and data lock-in.
Opt-Out Rights
Consumers have the right to opt out of:
- Sales of their personal data
- Targeted advertising based on their personal data
- Profiling used to make decisions with legal or similarly significant effects on them
This package of opt-out rights is consistent with the approach taken in approximately 20 states. Iowa is the notable exception — it does not include a right to correct data or an opt-out right.
Controllers and Processors: Shared but Distinct Obligations
Like most modern privacy frameworks, the SECURE Data Act distinguishes between controllers (entities that determine the purposes and means of data processing) and processors (entities that process data on behalf of controllers). Each bears distinct legal obligations, though both are subject to the law’s requirements.
This structure aligns with the GDPR framework and is standard in U.S. state laws, providing a familiar compliance map for organizations already operating under state or international privacy regimes.
Sensitive Data: Heightened Protections and a Unique Teen Provision
What Counts as Sensitive?
The bill adopts Kentucky’s sensitive data definition verbatim, limiting the category to data types reflected across all 21 state privacy laws — the consensus baseline. This likely includes data such as:
- Race and ethnicity
- Religious beliefs
- Health and medical information
- Sexual orientation and gender identity
- Biometric data
- Precise geolocation data
- Financial data
Processing sensitive data requires opt-in consent from consumers — a higher bar than the opt-out standard that applies to most other categories of personal data.
Teenagers as a Sensitive Class
One of the bill’s most notable provisions treats personal data of consumers under age 16 as inherently sensitive data. This means that processing such data requires opt-in consent, and crucially, that consent must be provided by parents or guardians through a verified parental consent mechanism.
This approach extends COPPA — the existing federal children’s privacy law — by three years, from its current threshold of age 13 to age 16. It places the SECURE Data Act among the more protective federal frameworks for teen data, though its approach is distinctive:
- Florida applies similar parental consent requirements to all consumers under 18, but allows teens to consent themselves
- Maryland and Oregon take a more restrictive approach, banning certain types of processing outright for minors
- Colorado, Connecticut, and Montana impose a duty of care on businesses handling teen data
The SECURE Data Act’s approach — requiring parental consent and treating teen data as categorically sensitive — is unique at the federal level and reflects the Energy and Commerce Committee’s strong legislative interest in youth privacy.
A Notable Silence: Knowledge Standards
The bill conspicuously drops the word “known” from before “child,” and is entirely silent on knowledge standards generally — meaning it provides no guidance on how a business should handle situations where it doesn’t know a user’s age. This omission is expected to draw significant feedback from the privacy community, as it creates ambiguity about when obligations are triggered and how companies should implement age verification.
Enforcement: The FTC, State AGs, and the PRA Question
No Private Right of Action
As a Republican-authored bill, the SECURE Data Act does not include a private right of action — meaning individual consumers cannot sue companies directly for violations. This is the single most contentious dividing line in federal privacy negotiations: Democrats and privacy advocates have typically demanded a private right of action as a prerequisite for meaningful enforcement; Republicans and the business community have consistently opposed it.
The bill’s authors point out that the so-called “consensus framework” across states also largely does not include private rights of action — only California’s CPRA provides one, and even that is limited in scope.
Federal and State Enforcement
In place of private suits, the SECURE Data Act empowers two enforcement actors:
- The U.S. Federal Trade Commission (FTC): The primary federal enforcement authority, with rulemaking and enforcement powers.
- State Attorneys General: Each state’s AG would have authority to enforce the federal law within their jurisdiction.
This dual enforcement model is common across federal privacy proposals and mirrors the structure of many existing state laws.
Preemption: The Defining Policy Choice
The SECURE Data Act embraces a strong federal preemption regime — one of the most consequential and contested aspects of any federal privacy bill. The draft would preempt any state law or provision that “relates to” the subjects covered by the SECURE Data Act.
The practical implications are sweeping:
- State comprehensive privacy laws (Virginia, Colorado, Connecticut, Texas, California, and the 16 others) would likely be preempted
- State data broker registration requirements would likely be preempted
- Some state sectoral laws could be affected, depending on their scope
Strong preemption is typically favored by businesses seeking uniform national compliance rules, and opposed by privacy advocates who argue that state laws — particularly California’s — offer stronger protections that would be eroded by federal floors. It is also a flashpoint in bipartisan negotiations, as states’-rights-conscious lawmakers of both parties have historically resisted broad federal preemption of state privacy authority.
The bill’s provisions would take effect within one to two years of enactment.
Cross-Border Data Flows and Codes of Conduct
International Data Flow Policy
One of the more striking departures from the state framework is the bill’s treatment of cross-border data flows. The SECURE Data Act would enshrine in statute the Secretary of Commerce’s role as the U.S. government’s representative on international data flows and the protection of personal data in international commerce.
More significantly, it would codify the longstanding U.S. policy position — one that has come under pressure in recent years — in favor of facilitating commercial data flows while ensuring personal data protection. This represents a reaffirmation of the U.S. approach to data governance that emphasizes openness and trade-friendliness, in contrast to more restrictive frameworks like the EU’s GDPR adequacy regime.
Codes of Conduct and Compliance Safe Harbors
The bill grants the Secretary of Commerce new authority to recognize voluntary industry codes of conduct that encourage privacy best practices. To qualify, a code must:
- Be voluntary
- Be enforceable by an independent organization
- Include a referral mechanism to enforcement authorities for violations
- Meet or exceed the compliance obligations in the SECURE Data Act itself
Companies that adhere to a recognized code of conduct would receive a rebuttable presumption of compliance with the law — a significant incentive. The bill specifically recognizes the existing CBPR (Cross-Border Privacy Rules) and Privacy Rules for Processors multilateral codes as automatically qualifying for this presumption upon enactment.
What’s Missing: Notable Absences
No Data Protection Impact Assessments
The SECURE Data Act does not require data protection impact assessments (DPIAs) — a tool that requires businesses to systematically evaluate the privacy risks of high-risk processing activities before undertaking them. DPIAs are mandatory under the GDPR and are required in nearly all U.S. states with comprehensive privacy laws. Even Kentucky, the bill’s closest model state, recently amended its law to include a DPIA requirement for profiling.
Instead, the bill follows the approach of Alabama, Iowa, and Utah — the most business-friendly state frameworks — by making DPIAs entirely optional.
No Explicit AI or ADMT Rules
The bill contains no explicit reference to artificial intelligence and no comprehensive rules governing automated decision-making technologies (ADMT). This is a significant omission given the rapid expansion of AI in commercial contexts, and one that the Working Group anticipated — it had specifically sought stakeholder comments on ADMT inclusion, and found that many commenters expressed reservations.
The absence likely also reflects the White House’s separate legislative plans to address AI governance — and potentially preempt state AI standards — through distinct legislation.
That said, the bill does include a narrow ADMT-adjacent provision: businesses that make decisions through fully automated profiling — with no human review, involvement, oversight, or intervention — must disclose this and provide consumers an opportunity to opt out. This is a meaningful but limited protection compared to the more comprehensive ADMT rules seen in states like Colorado and Connecticut.
No Universal Opt-Out Mechanism Requirement
At least 11 of the 21 states with comprehensive privacy laws now require businesses to recognize universal opt-out mechanisms (UOOMs) — browser signals or device settings that automatically communicate a consumer’s preference not to have their data sold or used for targeted advertising.
The SECURE Data Act does not require businesses to honor these signals. Instead, it instructs the Secretary of Commerce to conduct a study on UOOMs and publish a report within three years of enactment — evaluating available technologies, their feasibility, and their potential for beneficial use. While this provision may still have preemptive effect on state UOOM requirements (given the broad preemption language), it does not replicate their consumer-protective function.
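State UOOM requirements typically center on signals such as Global Privacy Control (GPC), which participating browsers transmit as a `Sec-GPC: 1` request header. As a point of reference for what the bill declines to mandate, here is a minimal sketch of detecting such a signal server-side (a hypothetical helper; the SECURE Data Act, as drafted, would not require businesses to honor it):

```python
def carries_opt_out_signal(headers: dict) -> bool:
    """Check whether an incoming HTTP request carries a universal
    opt-out signal such as Global Privacy Control (Sec-GPC: 1).
    Hypothetical illustration; not required by the SECURE Data Act."""
    # HTTP header field names are case-insensitive, so normalize first.
    normalized = {name.lower(): value.strip()
                  for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

Under the state laws that mandate UOOM recognition, a `True` result here would have to be treated as an exercise of the consumer’s opt-out rights for data sales and targeted advertising; under the SECURE Data Act, it would carry no legal effect pending the Commerce Department study.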
What Comes Next: The Legislative Road Ahead
With the bill introduced, it enters the formal legislative process:
- Subcommittee Hearing: The House Subcommittee for Commerce, Manufacturing and Trade will hold a hearing at which members and witnesses can offer public testimony.
- Subcommittee Markup: Members will introduce and vote on amendments to the bill text.
- Full Committee Markup: The process repeats at the full Energy and Commerce Committee level.
- Floor Consideration: If the full committee passes an amended bill, it proceeds to a House floor vote.
- Senate: The bill would need to pass the Senate, where bipartisan agreement — particularly on preemption and private rights of action — has historically been the primary obstacle to federal privacy legislation.
The bill’s explicitly partisan origin as a Republican measure, and its lack of provisions typically demanded by Democrats (private right of action, stronger ADMT rules, less aggressive preemption), suggest that significant negotiation lies ahead before any version could attract the bipartisan support needed to become law.
Key Tensions and Likely Flashpoints
| Issue | SECURE Data Act Position | Likely Opposition View |
|---|---|---|
| Private right of action | Absent | Democrats and privacy advocates demand inclusion |
| Federal preemption | Strong (“relates to” standard) | States’ rights advocates and California coalition oppose |
| DPIA requirements | Not required | Privacy advocates favor mandatory assessments |
| AI/ADMT rules | Absent | Tech accountability groups and some states demand rules |
| Universal opt-out signals | Study only | States with UOOM requirements and privacy advocates oppose rollback |
| Teen privacy | Parental consent for under-16 | Strong bipartisan support likely |
| Small business exemption | $25M revenue floor | Advocates say too generous; industry may want higher |
A Serious Attempt, But a Long Road Ahead
The SECURE Data Act is the most substantive federal consumer privacy proposal in years, and it deserves to be taken seriously. It establishes a coherent, recognizable framework built on widely accepted privacy principles. It reflects genuine engagement with stakeholders. And it addresses several issues — particularly teen privacy — that have broad political appeal.
But its limitations are real. The absence of a private right of action, the weak stance on ADMT and AI, the lack of DPIA requirements, and the aggressive preemption of stronger state laws will generate significant opposition. The bill’s purely Republican authorship, while not disqualifying, means that bipartisan negotiation — the true test of any federal privacy bill — has not yet begun.
What the SECURE Data Act does most clearly is start a conversation. After years of stalled federal privacy efforts, that is not nothing. Whether that conversation produces a law — and what shape that law takes — will depend on whether the difficult compromises on preemption, enforcement, and AI governance can be brokered in the months ahead.
The privacy community, businesses, and consumers across the country have reason to watch closely.