
Meta and Google Lose $381M in Child Safety Verdicts

Two juries in three days find social platforms liable for harm to minors

Families outside Los Angeles Superior Court after the jury found Meta and Google liable for harm to children through platform design
By Lachlan Voss · Mar 31, 2026 · 7 min read

Two juries in three days have done what Congress, regulators, and shareholders collectively failed to do: hold Meta and Alphabet liable for designing products that harm children. The financial penalties matter less than the legal precedent. New Mexico awarded $375 million on Tuesday. Los Angeles awarded $6 million on Wednesday. For the first time, juries have found that social media platforms can be treated as defective products, and that Section 230 immunity does not extend to design decisions that addict and exploit minors.

KEY TAKEAWAYS

1. New Mexico jury ordered Meta to pay $375 million for violating consumer protection laws by failing to protect children from predators.
2. Los Angeles jury awarded $6 million (70% Meta, 30% Google) for designing addictive platforms that harmed mental health.
3. Verdicts bypass Section 230 by targeting product design rather than content moderation.
4. Evidence showed Meta executives knew their platforms harmed children and disregarded internal warnings.
5. Thousands of consolidated cases await similar trials, with these early verdicts signaling industry-wide exposure.

New Mexico: $375 Million for Enabling Predators

The New Mexico verdict arrived first. The jury found Meta violated the state's Unfair Practices Act by misleading consumers about platform safety and enabling child sexual exploitation across Facebook and Instagram. The penalty: $5,000 per violation, totaling $375 million, implying roughly 75,000 violations. Raúl Torrez, the state's attorney general, told reporters the jury heard evidence Meta executives "knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew."

The Guardian's 2023 investigation showed Facebook and Instagram had become marketplaces for child sex trafficking. Internal Meta documents revealed that company employees and external safety experts had repeatedly warned about risks on the platforms. Operation MetaPhile, a state sting operation, led to the arrests of three men who used Meta's platforms to groom minors. Law enforcement testified that Meta's 2023 decision to encrypt Messenger blocked access to evidence of these crimes. The encryption protected predators, not children.

Los Angeles: $6 Million for Addictive Design

Los Angeles delivered the second verdict less than 24 hours later. A jury awarded $6 million to a 20-year-old woman who testified she developed depression and anxiety from compulsive use of Instagram and YouTube as a child. Meta carries 70% of the liability. Google carries 30%. Compensatory damages: $3 million. Punitive damages: $3 million.

The plaintiff's lawyers shifted the legal target. Rather than arguing over what users see (content moderation, where Section 230 provides immunity), they attacked how platforms are engineered. Meta and Google deliberately built apps to be addictive. Executives knew this. They failed to protect young users anyway. That is what the jury concluded.

Mark Lanier, the lead trial lawyer, brought a jar of M&Ms to court during arguments over punitive damages. Each piece represented $1 billion of the companies' market value. "You've got to talk to Meta in Meta money," he told the jury. They awarded $6 million. For companies worth trillions, the amount is a rounding error. The precedent is not.

Joseph VanZandt, co-lead lawyer for families suing social media companies in consolidated federal litigation, called Wednesday's verdict "a referendum from a jury to an entire industry that accountability has arrived." Thousands of similar cases are pending. The Los Angeles decision provides a roadmap. Focus on design, not content. Prove knowledge of harm. Show failure to act.

Meta has announced it will appeal both verdicts. A spokesperson said the company "respectfully disagrees" and accused New Mexico of making "sensationalist, irrelevant arguments by cherry-picking select documents." That defense failed twice in three days.

These verdicts expose the gap between what Meta says publicly and what it does privately. The New Mexico case revealed that Meta generates high volumes of what investigators called "junk" reports by over-relying on AI moderation. Law enforcement cannot use these reports. Crimes go uninvestigated. The National Center for Missing and Exploited Children sent witnesses who testified to deficiencies in how Meta reports child sexual abuse material. The system is designed to create the appearance of compliance, not actual protection.

The legal comparison being drawn is to 1990s tobacco litigation, which forced the industry to stop targeting minors with advertising. Whether social media faces similar industry-wide changes remains uncertain. The dam has cracked, regardless. Section 230, the liability shield that has protected tech platforms for nearly three decades, does not immunize product design. Juries now understand the distinction.

FTC Commissioner Alvaro Bedoya wrote on X after the verdict: "A jury of regular people has managed to do what Congress and even state legislatures have not: Hold Meta and Google accountable for addicting young people to their products."

The next phase of litigation will determine whether $381 million in damages across two cases becomes $38 billion across two thousand. Meta's market capitalization is $1.4 trillion. Alphabet is worth $2.1 trillion. The financial exposure is manageable. The reputational and regulatory exposure is not. One wonders whether the board has considered this properly.

Both companies have announced appeals. Both will argue the verdicts are outliers. Neither can argue they were unaware of the harm. The juries saw the documents.

The Governance Question

The governance failures here are instructive. Meta's board includes members drawn from government, academia, and technology. All presumably aware of child safety concerns. All presumably briefed on internal warnings. All apparently content to let management prioritize engagement metrics over user protection until juries forced accountability.

This is what happens when boards treat risk committees as compliance theatre rather than genuine oversight. The New Mexico evidence showed employees warning executives repeatedly. Those warnings reached the board level, or should have. If directors claim ignorance, they are admitting negligence. If they knew and did nothing, they are admitting complicity. Either way, the governance architecture failed.

The question now is whether shareholders will hold boards accountable the way juries held the companies accountable. The Australian Council of Superannuation Investors tracks governance failures like this. One imagines proxy advisors will have questions about board oversight at the next annual meeting. They should.

TLDR

Two juries in three days found Meta and Google liable for harm to children. New Mexico awarded $375 million after Meta failed to protect users from predators. Los Angeles awarded $6 million finding both companies designed addictive platforms that damaged mental health. The verdicts bypass Section 230 immunity by targeting product design rather than content moderation.

FREQUENTLY ASKED QUESTIONS

What is Section 230 and why does it matter for these cases?
Section 230 of the Communications Decency Act generally shields online platforms from liability for user-generated content. However, these verdicts targeted platform design and safety failures, not content moderation, allowing juries to hold Meta and Google liable despite Section 230 protections.
Will Meta and Google actually pay these damages?
Both companies have announced appeals, which could take years to resolve. Even if the verdicts are upheld, the financial impact is minimal for companies worth over $3.5 trillion combined. The legal precedent and potential for thousands more cases pose greater risk than the immediate financial penalties.
How many similar cases are pending against social media companies?
Thousands of cases have been consolidated in federal multidistrict litigation. The Los Angeles and New Mexico verdicts are among the first to go to trial, providing a template for future cases targeting platform design rather than content.
What evidence showed Meta knew its platforms harmed children?
Internal Meta documents and testimony revealed that company employees and external child safety experts repeatedly warned executives about risks. The New Mexico case also showed Meta's encryption of Messenger in 2023 blocked law enforcement access to evidence of crimes involving minors.
Could these verdicts force changes to how social media platforms operate?
The legal comparison being drawn is to 1990s tobacco litigation, which reshaped that industry's practices regarding minors. If verdicts like these multiply across thousands of pending cases, platforms may face pressure to fundamentally redesign features that drive engagement among young users.
Editor

The Bushletter editorial team. Independent business journalism covering markets, technology, policy, and culture.
