Two juries in three days have done what Congress, regulators, and shareholders collectively failed to do: hold Meta and Alphabet liable for designing products that harm children. The financial penalties matter less than the legal precedent. New Mexico awarded $375 million on Tuesday. Los Angeles awarded $6 million on Wednesday. For the first time, juries have found that social media platforms can be treated as defective products, and that Section 230 immunity does not extend to design decisions that addict and exploit minors.
New Mexico: $375 Million for Enabling Predators
The New Mexico verdict arrived first. The jury found that Meta violated the state's Unfair Practices Act by misleading consumers about platform safety and enabling child sexual exploitation across Facebook and Instagram. The penalty: $5,000 per violation, totaling $375 million. Raúl Torrez, the state's attorney general, told reporters the jury heard evidence Meta executives "knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew."
The Guardian's 2023 investigation showed Facebook and Instagram had become marketplaces for child sex trafficking. Internal Meta documents revealed company employees and external safety experts repeatedly warned about risks on the platforms. Operation MetaPhile, a state sting operation, led to arrests of three men who used Meta's platforms to groom minors. Law enforcement testified that Meta's 2023 decision to encrypt Messenger blocked access to evidence of these crimes. The encryption protected predators, not children.
Los Angeles: $6 Million for Addictive Design
Los Angeles delivered the second verdict less than 24 hours later. A jury awarded $6 million to a 20-year-old woman who testified she developed depression and anxiety from compulsive use of Instagram and YouTube as a child. Meta carries 70% of the liability. Google carries 30%. Compensatory damages: $3 million. Punitive damages: $3 million.
The plaintiff's lawyers shifted the legal target. Rather than arguing over what users see (content moderation, where Section 230 provides immunity), they attacked how platforms are engineered. Meta and Google deliberately built apps to be addictive. Executives knew this. They failed to protect young users anyway. That is what the jury concluded.
Mark Lanier, the lead trial lawyer, brought a jar of M&Ms to court during punitive damage arguments. Each piece represented $1 billion of the companies' market value. "You've got to talk to Meta in Meta money," he told the jury. They awarded $6 million. For companies worth trillions, the amount is a rounding error. The precedent is not.
Joseph VanZandt, co-lead lawyer for families suing social media companies in consolidated federal litigation, called Wednesday's verdict "a referendum from a jury to an entire industry that accountability has arrived." Thousands of similar cases are pending. The Los Angeles decision provides a roadmap. Focus on design, not content. Prove knowledge of harm. Show failure to act.
The Legal Shift: From Content to Design
Meta has announced it will appeal both verdicts. A spokesperson said the company "respectfully disagrees" and accused New Mexico of making "sensationalist, irrelevant arguments by cherry-picking select documents." That defense failed twice in three days.
These verdicts expose the gap between what Meta says publicly and what it does privately. The New Mexico case revealed that Meta generates high volumes of what investigators called "junk" reports by over-relying on AI moderation. Law enforcement cannot use these reports. Crimes go uninvestigated. The National Center for Missing and Exploited Children sent witnesses who testified to deficiencies in how Meta reports child sexual abuse material. The system is designed to create the appearance of compliance, not actual protection.
The legal comparison being drawn is to 1990s tobacco litigation, which forced the industry to stop targeting minors with advertising. Whether social media faces similar industry-wide changes remains uncertain. The dam has cracked, regardless. Section 230, the liability shield that has protected tech platforms for nearly three decades, does not immunize product design. Juries now understand the distinction.
FTC Commissioner Alvaro Bedoya wrote on X after the verdict: "A jury of regular people has managed to do what Congress and even state legislatures have not: Hold Meta and Google accountable for addicting young people to their products."
The next phase of litigation will determine whether $381 million in damages across two cases becomes $38 billion across two thousand. Meta's market capitalization is $1.4 trillion; Alphabet's is $2.1 trillion. The financial exposure is manageable. The reputational and regulatory exposure is not. One wonders whether either board has considered this properly.
Both companies have announced appeals. Both will argue the verdicts are outliers. Neither can argue they were unaware of the harm. The juries saw the documents.
The Governance Question
The governance failures here are instructive. Meta's board includes members drawn from government, academia, and technology. All presumably aware of child safety concerns. All presumably briefed on internal warnings. All apparently content to let management prioritize engagement metrics over user protection until juries forced accountability.
This is what happens when boards treat risk committees as compliance theatre rather than genuine oversight. The New Mexico evidence showed employees warning executives repeatedly. Those warnings reached the board level, or should have. If directors claim ignorance, they are admitting negligence. If they knew and did nothing, they are admitting complicity. Either way, the governance architecture failed.
The question now is whether shareholders will hold boards accountable the way juries held the companies accountable. Governance watchdogs such as the Australian Council of Superannuation Investors track failures like this. One imagines proxy advisors will have questions about board oversight at the next annual meeting. They should.
TLDR
Two juries in three days found Meta and Google liable for harm to children. New Mexico awarded $375 million after Meta failed to protect users from predators. Los Angeles awarded $6 million finding both companies designed addictive platforms that damaged mental health. The verdicts bypass Section 230 immunity by targeting product design rather than content moderation.