Meta and YouTube Found Negligent in Landmark Social Media Addiction Case

In a verdict with potential implications for hundreds of similar lawsuits across the country, a Los Angeles County Superior Court jury on Wednesday found that Meta and YouTube’s parent company Google were negligent in designing addictive platform features that harmed a young user’s mental health.

The jury determined that design elements such as infinite scroll and algorithmic recommendations on Instagram (owned by Meta) and YouTube were a substantial factor in causing anxiety, depression, body dysmorphia, and other mental health distress for the plaintiff, a now 20-year-old California woman identified in court as K.G.M. or Kaley. She began using the platforms as a child — YouTube at age six and Instagram at around age nine to eleven, according to testimony.

After nearly nine days of deliberations spanning more than 40 hours, jurors found both companies liable for negligence and for failing to adequately warn users about the potential dangers of their products. The jury apportioned responsibility with Meta bearing 70% of the harm and YouTube (Google) 30%.

The plaintiff was awarded $3 million in compensatory damages. The jury also assessed punitive damages after determining the companies “acted with malice, oppression, or fraud,” bringing the total judgment to approximately $6 million — with Meta responsible for about $4.2 million and Google/YouTube for about $1.8 million.

This case served as a bellwether trial — the first of its kind to reach a jury verdict — among thousands of pending lawsuits accusing social media companies of deliberately engineering addictive products that contribute to youth mental health crises. Snap and TikTok settled with the plaintiff prior to trial.

Lawyers for the young woman argued that the platforms were designed to be as habit-forming as cigarettes or digital casinos, prioritizing user engagement over safety. They presented evidence that company executives knew the risks to young users but failed to protect them.

Meta and Google maintained that their services cannot be held solely responsible for complex mental health issues and that users and families share responsibility. The companies are expected to appeal the verdict.

Parents, advocacy groups, and lawmakers pushing for stricter regulations on social media welcomed the decision as a significant step toward greater accountability. It comes amid growing national concern over the impact of platforms on children and teens, including ongoing debates in state legislatures — such as Maine’s consideration of the Online Data Privacy Act (LD 1822) — about protecting personal data and limiting harmful algorithmic practices.

The landmark verdict could influence how social media companies approach product design, age-appropriate safeguards, and user warnings going forward. Hundreds of similar cases remain pending in courts nationwide.
