New Mexico Delivers Historic $375 Million Blow to Meta: First State to Win Jury Trial Over Child Safety and Misleading Privacy Claims

Privacy, compliance, and data governance professionals tracking Big Tech accountability just witnessed a major shift. In a landmark verdict handed down on March 25, 2026, a New Mexico jury found Meta Platforms, Inc. liable under the state’s Unfair Practices Act for deliberately misleading consumers about the safety of its platforms and for design choices that put children at risk. The jury ordered Meta to pay the maximum statutory penalty of $5,000 per violation, resulting in a total of $375 million in civil penalties.

New Mexico has become the first state in the nation to win a jury trial against a major technology company for harms caused to young users through social media. For privacy teams, this verdict goes well beyond a headline. It shows that courts are increasingly willing to hold platforms accountable when their internal data practices, algorithmic design, and public statements about child safety do not match reality.

The case, titled State of New Mexico v. Meta Platforms, Inc., focused on allegations that Meta’s platforms — Facebook, Instagram, and WhatsApp — put profits and user engagement ahead of child protection. New Mexico Attorney General Raúl Torrez argued that Meta knew its products exposed minors to sexual exploitation, eating disorders, self-harm content, and predatory behavior, yet continued to tell parents and the public that the platforms were safe.

Evidence presented at trial included internal documents and testimony from former Meta employees and child safety experts. Witnesses described how the company received repeated warnings about predators targeting children on its platforms. Testimony also showed that Meta’s algorithms actively pushed harmful content to young users, while the company publicly promoted safety features that the jury determined were not effectively implemented at scale.

The jury unanimously found Meta liable on two key counts under New Mexico’s consumer protection law:

– Deceptive trade practices: Meta made false or misleading statements to New Mexico consumers about the level of safety and protections provided to children on its platforms.
– Unconscionable trade practices: Meta knowingly took unfair advantage of children’s inexperience and vulnerability by designing its platforms to maximize engagement at the expense of safety.

The $375 million penalty reflects the jury’s finding that Meta committed thousands of individual violations. This outcome is the result of a full seven-week jury trial, not a settlement or consent decree. Everyday New Mexicans reviewed the evidence and sided with the state.
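The violation count implied by those figures follows directly from the statute's per-violation cap: $375 million divided by $5,000 per violation works out to 75,000 violations. A quick sanity check of that arithmetic (the derived count is our calculation, not a figure stated in the jury's findings):

```python
# Figures reported with the verdict (USD).
PER_VIOLATION_PENALTY = 5_000      # maximum statutory penalty per violation
TOTAL_PENALTY = 375_000_000        # total civil penalty awarded by the jury

# Derived, not taken from the jury's findings: the number of
# violations implied if every violation drew the maximum penalty.
implied_violations = TOTAL_PENALTY // PER_VIOLATION_PENALTY
print(implied_violations)  # 75000
```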

What This Means for Privacy and Child Data Governance

Privacy professionals have watched Meta rely on Section 230 of the Communications Decency Act and First Amendment arguments for years. Those defenses did not persuade the New Mexico jury. The verdict highlights a growing judicial view that when platforms use children’s personal data — including location, browsing behavior, interests, and social connections — to fuel addictive engagement, they move from protected speech into actionable consumer harm.

This decision comes at a pivotal time. While the federal Kids Online Safety Act remains pending, states are rapidly expanding their own child privacy and safety laws. The New Mexico verdict sets a strong precedent because it demonstrates that:

– Public statements about “safety” and “protections” can create real legal obligations under state unfair trade practices statutes.
– Internal knowledge of harms, documented in employee warnings, research reports, or safety team memos, can prove willfulness.
– Algorithmic amplification of harmful content to minors can support findings of unconscionable conduct.

A separate public nuisance claim is still pending and will be decided by the judge in a bench trial scheduled to begin on May 4, 2026. In that phase, New Mexico will seek additional penalties and court orders requiring Meta to improve age verification, more effectively remove known predators, and address risks created by encrypted messaging features.

Immediate Takeaways for Privacy, Compliance, and Risk Teams

Digital platforms, apps, and services used by minors should treat this verdict as an urgent signal to review current practices. Key steps include:

– Review all public-facing safety and privacy claims to ensure they are accurate, verifiable, and supported by effective controls. Vague or aspirational language now carries significant litigation risk.
– Map data flows involving minors, with close attention to how behavioral data, inferred interests, and algorithmic recommendations affect underage users.
– Strengthen internal escalation processes so that safety and privacy teams can clearly document and escalate known risks to senior leadership.
– Prepare for increased state-level enforcement. More attorneys general are likely to pursue similar cases, so compliance programs must withstand discovery of internal research and unreported incidents.
– Evaluate age assurance and verification methods. Courts and regulators are becoming skeptical of self-reported age or weak signals. Stronger, privacy-preserving age gating may become the expected standard.
– Update vendor and platform contracts to require partners to meet heightened child safety standards and reduce potential vicarious liability.
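The first step above — verifying that every public safety claim is backed by an effective control — lends itself to a simple, repeatable audit. Here is a minimal sketch of that idea; all claim text, control names, and the `audit` helper are hypothetical illustrations, not references to any real registry or tool:

```python
from dataclasses import dataclass

@dataclass
class SafetyClaim:
    statement: str                 # public-facing claim as published
    required_controls: list[str]   # controls the claim implies exist

# Hypothetical register of controls the organization has actually
# implemented and can evidence; names are illustrative only.
implemented_controls = {"teen_dm_restrictions", "reporting_pipeline"}

claims = [
    SafetyClaim("Teens are protected from unsolicited DMs",
                ["teen_dm_restrictions"]),
    SafetyClaim("Harmful content is removed at scale",
                ["reporting_pipeline", "proactive_detection"]),
]

def audit(claims: list[SafetyClaim], controls: set[str]) -> list[SafetyClaim]:
    """Return claims whose implied controls are not all implemented."""
    return [c for c in claims if not set(c.required_controls) <= controls]

for claim in audit(claims, implemented_controls):
    print(f"Unsupported public claim: {claim.statement}")
```

The point of the sketch is the discipline, not the code: every published "safety" statement maps to named, verifiable controls, and any claim without full coverage is flagged before it becomes litigation evidence.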

Meta has indicated it will appeal the verdict, arguing that it threatens core internet protections. Privacy professionals should follow the appeal closely, as the outcome could reshape the balance between platform immunity and accountability for data-driven harms to children.

A Watershed Moment for Child Privacy

Attorney General Torrez stated that Meta executives knew their products harmed children, ignored warnings from their own employees, and misled the public about what they knew. The jury returned its verdict after less than a day of deliberations, a sign of how compelling it found the evidence.

For privacy practitioners, this verdict makes clear that courts will hold companies responsible when profit-driven data practices endanger young users. The days of “move fast and apologize later” when it comes to children’s data are coming to an end. What follows is a stronger emphasis on verifiable safety by design, transparent data governance, and real accountability.

Privacy teams that view this only as a compliance item will miss the larger message. Those who use the ruling to drive meaningful changes in product architecture, risk assessment, and executive oversight will be far better prepared as states and regulators continue to raise the bar for protecting children online.
