There is a particular kind of corporate lawbreaking that is, in some ways, more troubling than the brazen kind. It is the kind that happens in plain sight, at enormous scale, by companies with armies of lawyers who are fully aware of their legal obligations — companies that have simply calculated that the cost of compliance exceeds the cost of getting caught, or that the probability of meaningful enforcement is low enough to make non-compliance the rational business decision.
A new audit from researchers at webXray suggests that is precisely what is happening with California’s Global Privacy Control requirement. And the companies allegedly doing it are not shadowy data brokers operating in regulatory gray zones. They are Google, Microsoft, and Meta — three of the most legally sophisticated corporations on earth, with combined market capitalizations measured in the trillions, and privacy compliance teams that almost certainly employ more lawyers than most state regulators have total staff.
The findings, if accurate, do not describe an industry struggling in good faith with a technically complex requirement. They describe an industry that has made a collective decision to ignore a law it finds inconvenient — and has been getting away with it.
What the Global Privacy Control Actually Is
Before examining what the audit found, it is worth being precise about what the Global Privacy Control is and what California law actually requires of businesses that receive it.
The GPC is a technical signal — a standardized browser-level or tool-level setting that a consumer can activate to communicate, automatically and persistently, that they do not want their personal data sold or shared with third parties. When a user enables GPC, every website they visit receives that signal. The California Consumer Privacy Act, as amended by Proposition 24’s enactment of the California Privacy Rights Act, requires covered businesses to treat a GPC signal as a legally valid opt-out of the sale or sharing of personal information.
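Concretely, the GPC specification defines two surfaces: a browser with the setting enabled attaches a `Sec-GPC: 1` header to every HTTP request, and exposes a `navigator.globalPrivacyControl` property to page scripts. A minimal sketch of server-side detection follows; the function name is illustrative and not taken from any particular framework.

```typescript
// Detect the Global Privacy Control signal on an incoming request.
// Per the GPC specification, participating browsers send the header
// `Sec-GPC: 1` with every request; "1" is the only defined opt-out value.

function gpcOptOutRequested(headers: Record<string, string>): boolean {
  // Header names are case-insensitive in HTTP, so normalize before lookup.
  const normalized = Object.fromEntries(
    Object.entries(headers).map(([k, v]) => [k.toLowerCase(), v.trim()])
  );
  return normalized["sec-gpc"] === "1";
}
```

Because the header rides along with every request automatically, a covered business cannot plausibly claim it never received the signal: it arrives on each page load, whether or not the site chooses to look.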
This is not a voluntary standard. It is not a suggested best practice. It is a legal mandate backed by enforcement authority. The California Privacy Protection Agency — the dedicated state agency established specifically to enforce CPRA — has already demonstrated its willingness to use that authority, having fined companies millions of dollars for GPC-related violations in previous enforcement actions.
The requirement exists because the alternative — requiring consumers to navigate to each website’s privacy settings, find the opt-out mechanism, and manually submit their preferences — was recognized as a practical nullity. Dark patterns, buried menus, and deliberately friction-heavy opt-out flows had rendered the theoretical right to opt out largely meaningless in practice. The GPC was designed to solve that problem by making the opt-out signal automatic, universal, and technically unambiguous.
When a website receives a GPC signal, the required response is not complicated. The tracking code needs to check for the signal and, if present, not fire the trackers that would otherwise collect and share the user’s data. This is, as the webXray researchers note, a “slight change to the tracking code.” It is not a technically demanding requirement. The difficulty is not engineering. It is will.
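To make the "slight change" concrete, here is a hedged sketch of what the client-side gate could look like. The name `shouldFireTrackers` is hypothetical, and the real decision points inside a large platform's tracking stack are more elaborate, but the check itself is this small: read one boolean before doing anything else.

```typescript
// Hypothetical gate in front of tracker initialization. The GPC spec's
// client-side surface is the boolean property navigator.globalPrivacyControl.

type NavigatorLike = { globalPrivacyControl?: boolean };

// Only an explicit `true` signals an opt-out; an absent property means the
// browser is not sending the signal at all.
function shouldFireTrackers(nav: NavigatorLike): boolean {
  return nav.globalPrivacyControl !== true;
}

// In real tracking code the call site would look roughly like:
//   if (shouldFireTrackers(navigator)) { loadTrackingPixel(); }
```

The audit's finding about Meta, discussed below, amounts to the claim that no equivalent of this check exists anywhere in the code path.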
What the Audit Found — and How Bad It Actually Is
The webXray team — headed by Tim Libert, a former Google privacy engineer who built cookie policy infrastructure at the company and, by his own account, understood its systems more deeply than almost anyone there — visited more than 7,000 popular websites from a California internet address with the GPC signal active.
The results should not be surprising given everything we know about how these companies operate. But stated plainly, they are still striking.
Google continued to track users via cookies despite receiving the GPC signal in 86% of cases. Not some cases. Not edge cases involving unusual browser configurations or ambiguous technical scenarios. Eighty-six percent. In nearly nine out of ten instances, Google’s tracking infrastructure ignored a legally mandated opt-out signal and continued doing exactly what the law says it cannot do.
Meta’s trackers failed to honor the GPC in 69% of cases — but the mechanism of failure is arguably more egregious than Google’s. According to the report, Meta’s trackers don’t just ignore the signal. They fail to check for it at all. The code does not look for the GPC signal, evaluate it, and then decide not to comply. It simply proceeds as if the signal does not exist. This is not a compliance failure in the technical sense of a system that tries and falls short. It is the complete absence of a compliance attempt.
Microsoft failed to honor the signal in 50% of instances — which, in a world where the baseline for compliance is 100%, represents a significant violation across an enormous volume of user interactions. Microsoft’s response is particularly revealing of the defense the industry will mount: that certain cookies are “necessary for operational purposes” and may therefore be placed regardless of the GPC signal. This is a reading of the law’s exceptions that, at best, requires serious legal scrutiny and, at worst, is a deliberate mischaracterization of narrow exceptions as blanket carve-outs.
The audit also examined the third-party compliance tools — the consent management platforms and ad-tech compliance products that businesses purchase specifically to help them comply with privacy regulations like CCPA and CPRA. These products exist in a market that generates substantial revenue precisely by promising to keep clients on the right side of the law. The finding that at least one such product failed to honor GPC requests more than 90% of the time raises a separate and uncomfortable question: are companies being sold compliance theater rather than actual compliance? And if so, is that theater being used — consciously or not — as a liability shield?
The Responses — and What They Actually Mean
Google, Microsoft, and Meta have all pushed back on the findings in ways that deserve careful examination, because the defenses they are mounting are likely to be the defenses they deploy in any enforcement proceeding.
Google’s spokesperson called the audit “based on a fundamental misunderstanding of how our products work.” This is a bold claim to make against a researcher who, by every available account, spent years at Google building and maintaining its cookie compliance infrastructure. Tim Libert was not a junior engineer who briefly encountered cookie policy on one project. He was, in his own words, the person who knew more about it than anyone else while he was there. Google’s claim that he fundamentally misunderstands how its products work is not self-evidently credible — and in litigation or a regulatory proceeding, it would need to be demonstrated, not asserted.
Microsoft’s response is more technically sophisticated and more legally interesting. The company says it opts users out of sharing personal data with third parties for personalized advertising when it receives a GPC signal, and that certain cookies are “necessary for operational purposes” and may therefore be placed even when the signal is detected. This invokes a genuine legal question: what does “necessary for operational purposes” mean under CPRA, and how broad is that exception? If Microsoft’s interpretation of operational necessity is narrower than the audit’s assumption about what counts as impermissible tracking, some portion of the apparent non-compliance may be defensible. If Microsoft’s interpretation is expansive — if it has defined operational necessity broadly enough to encompass activities that the legislature intended the GPC to cover — then it is not a compliance posture but a legal gambit.
The distinction matters because it determines whether this is a good-faith compliance dispute about interpretation or something more troubling. A company that genuinely believes its reading of “operational necessity” is correct and is prepared to defend it before the California Privacy Protection Agency or in court is engaged in the normal process of legal interpretation under an ambiguous statute. A company that has adopted an expansive definition of “operational necessity” primarily because it allows continued data collection while maintaining a surface-level compliance claim is doing something different.
Meta’s silence in response to the initial request for comment is consistent with the company’s general posture toward privacy enforcement, and the finding that its trackers don’t check for the GPC signal at all is the hardest to defend on any interpretation. There is no “operational necessity” argument that explains why a tracker would not even look for a legally required opt-out signal. The failure to check is a design decision, and design decisions are made by people with knowledge of legal obligations.
The Enforcement Gap: Why This Has Been Allowed to Continue
The question that any serious observer of this audit will eventually ask is the same one that hangs over most of California’s privacy enforcement efforts: if the law is clear, the violations are documented, and the companies are identifiable — why is this still happening?
The answer is multifactorial, but it starts with resources. The California Privacy Protection Agency, despite being the first dedicated state privacy enforcement agency in the United States and one of the most significant privacy regulatory bodies in the country, operates with a budget and staffing level that is genuinely inadequate to the scale of the market it is attempting to regulate. It oversees privacy compliance for an economy that, if it were a country, would be among the largest in the world. It does so with a fraction of the enforcement capacity of, for example, the Irish Data Protection Commission — which itself has been repeatedly criticized for insufficient enforcement of GDPR against large tech companies.
The result is an enforcement pattern that is, by necessity, selective. The Agency can pursue high-profile cases and signal its priorities through targeted actions, but it cannot systematically audit 7,000 websites and pursue enforcement actions against every company that the webXray report found to be non-compliant. The companies know this. The calculation embedded in their compliance postures — or lack thereof — reflects that knowledge.
The webXray team’s estimate that full enforcement against all non-compliant websites could result in billions of dollars in penalties is striking not because it suggests enormous regulatory risk but because it suggests the opposite. If billions in potential penalties have been available to California regulators for an extended period and remain uncollected, it demonstrates that the enforcement threat, in practice, is far more limited than the statutory text would imply. Companies are not being irrational by treating GPC compliance as optional. They are reading the enforcement environment accurately.
This is the core of what the webXray researchers mean when they describe “industrial-scale noncompliance.” It is not merely that many companies are violating the law. It is that the violation has become the industry default — the rational equilibrium position given the low probability of enforcement relative to the commercial value of the data being collected. The law has been neutralized not by being repealed or amended but by being systematically ignored at a scale that overwhelms available enforcement capacity.
The Compliance Tool Problem
The finding that third-party compliance products — the consent management platforms and privacy compliance tools that businesses purchase specifically to handle their CCPA and CPRA obligations — are themselves failing to honor GPC signals deserves its own analysis, because it reveals a market failure that compounds the enforcement gap.
A substantial industry has grown up around privacy compliance. Consent management platforms, privacy-by-design consultants, compliance audit tools, and data governance software are all marketed to businesses as mechanisms for meeting their legal obligations. Many businesses purchase these products in good faith, implement them according to vendor specifications, and genuinely believe they are compliant.
If those products are failing to honor GPC signals more than 90% of the time — as the audit found in at least one case — then businesses relying on them are in a difficult position. They have done what they were supposed to do: identified a legal obligation, sought out purpose-built tools to meet it, implemented those tools, and paid for ongoing service. The resulting non-compliance is, at least in a practical sense, not attributable to their own bad faith. It is attributable to a vendor’s product failure.
This creates both a legal question — what is a business’s liability when a compliance product it reasonably relied upon fails? — and a market question: why is the compliance technology market producing products that don’t work? The latter may have a depressingly simple answer. Consent management platforms are often also in the advertising technology business, or are owned by companies that are. The financial incentives for a compliance tool to rigorously enforce opt-out signals are not necessarily aligned with those of the broader data economy in which those tools operate.
Regulators who are serious about GPC enforcement will need to consider whether enforcement that targets only the businesses using non-compliant tools is sufficient, or whether the tool providers themselves — as processors or service providers under CPRA — bear independent responsibility for the compliance failures their products enable.
The Former Insider Dimension
There is something worth sitting with in the identity of the person who produced this audit. Tim Libert is not an academic researcher working from the outside with imperfect visibility into how large platforms operate. He is a former Google engineer who built cookie policy infrastructure. He understood, from the inside, how Google’s tracking systems work, how compliance decisions are made, and what it takes — technically and organizationally — to honor a user opt-out signal.
When he says that these companies “don’t make any substantive effort to comply,” he is not speaking from ignorance of their technical constraints or organizational complexities. He is speaking from a position of specific, detailed, experiential knowledge about what compliance would require and what its absence therefore represents.
Google’s suggestion that he “fundamentally misunderstands” how its products work is, in this context, not just a rebuttal. It is an assertion that the person who built the thing does not understand the thing he built — a claim that invites exactly the kind of scrutiny that discovery in a regulatory proceeding would apply to it. If Google is willing to make that claim publicly, it will presumably be willing to defend it in detail, with documentation, before a regulator or a court. That is a test worth applying.
What Enforcement Would Actually Require
The CPPA’s carefully worded response — appreciating that the report “brings visibility to the importance of opt out rights” without commenting on specific findings — is the statement of an agency that is aware of the problem, aware of its own resource limitations, and choosing its battles carefully. That is not a criticism of the agency. It is a description of its constraints.
Genuine enforcement of GPC requirements at scale would require several things that current California law and enforcement infrastructure do not fully provide:
Automated technical monitoring. The CPPA cannot manually audit 7,000 websites. It needs technical tools capable of continuously monitoring GPC compliance across the ecosystem — essentially the same capability that webXray demonstrated in producing its report, deployed as an ongoing regulatory function rather than a one-time audit.
Meaningful per-violation penalties at scale. Current penalty structures are meaningful for individual companies that face targeted enforcement. They are not large enough, relative to the commercial value of the data being collected, to change the underlying cost-benefit calculation for an industry that has concluded systematic non-compliance is rational.
Vendor liability. Compliance tool providers whose products fail to honor GPC signals need to face regulatory consequences, both to realign incentives in the compliance technology market and to address the systemic failures their products enable.
Federal coordination. California’s law is the strongest in the country, but it only covers California residents, and its enforcement is limited by state resources. Federal privacy legislation with GPC recognition requirements and federal enforcement capacity would change the calculus in ways that state action alone cannot.
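Of these, automated monitoring is the most technically tractable, because the core of such a monitor is simple: visit a site with the GPC signal active, record which cookies get set anyway, and flag those belonging to known tracking domains. The sketch below shows only that classification step; the domain list is purely illustrative and is not webXray's actual methodology or tracker list.

```typescript
// Core classification step of a hypothetical automated GPC compliance
// monitor. TRACKER_DOMAINS is illustrative only, NOT webXray's real list.

const TRACKER_DOMAINS = ["doubleclick.net", "facebook.net", "clarity.ms"];

interface ObservedCookie {
  name: string;
  domain: string; // the domain that set the cookie while GPC was active
}

// Cookies set by known tracking domains despite an active GPC signal are
// flagged as potential violations for legal review.
function flagPotentialViolations(cookies: ObservedCookie[]): ObservedCookie[] {
  return cookies.filter((c) =>
    TRACKER_DOMAINS.some((t) => c.domain === t || c.domain.endsWith("." + t))
  );
}
```

Wrapped around a headless browser that crawls sites from a California vantage point, a loop like this is essentially what the webXray audit demonstrated once; a regulator would need it running continuously.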
None of this is easily achieved, and some of it requires legislative action that is not currently on the near-term horizon. But the gap between where enforcement capacity is today and where it would need to be to make GPC compliance genuinely mandatory for large platforms is not a technical gap. It is a political and resource allocation gap — a question of whether the public and their elected representatives are prepared to fund and demand the regulatory infrastructure that effective privacy enforcement requires.
The Larger Stakes
Step back from the specific findings of the webXray audit, and what you see is something that matters beyond the specific question of GPC compliance. You see a test of whether privacy law in the United States can be made to work against large technology companies.
California has the strongest privacy statute in the country. It has a dedicated enforcement agency. It has a specific, technically implementable requirement that companies honor user opt-out signals. The signal is not ambiguous. The obligation is not subject to reasonable interpretive dispute in its core application. And the result — according to an audit conducted by someone with deep relevant expertise — is systematic violation at an industrial scale by some of the world’s most legally sophisticated and financially powerful corporations.
If this law, with these specifications, enforced by this agency, cannot produce compliance from Google, Microsoft, and Meta, then the question being posed is not just about GPC. It is about the entire project of privacy regulation through legislation. It is about whether the gap between the rights that privacy law promises consumers and the practices that companies actually follow can be closed by the instruments available to regulators operating within current political and resource constraints.
The answer to that question is not encoded in this audit. But the audit makes the question impossible to avoid.