Meta ordered to pay $375m in New Mexico child exploitation case
Consensus Summary
A New Mexico jury ruled Meta liable for violating consumer protection laws by misleading users about the safety of Facebook, Instagram, and WhatsApp while enabling child sexual exploitation and mental health harms. The company was ordered to pay $375 million in penalties, the maximum allowed under state law, after a six-week trial. Evidence included undercover investigations exposing predators targeting minors and internal documents showing Meta’s awareness of platform risks. Both sources agree on the verdict amount, the state’s allegations, and Meta’s intent to appeal, but differ in how they frame Meta’s knowledge and the severity of its failures. The Guardian emphasizes Meta’s role in enabling trafficking and obstacles to law enforcement, while ABC highlights the company’s legal defenses and safeguards. The case marks the first jury ruling against Meta on such claims and sets a precedent for future lawsuits over tech companies’ responsibility for harm on their platforms.
✓ Verified by 2+ sources
Key details reported by multiple sources:
- A New Mexico jury found Meta liable for violating New Mexico’s consumer protection law in a case alleging misleading claims about child safety on Facebook, Instagram, and WhatsApp
- $375 million in civil penalties was awarded to New Mexico under the state’s Unfair Practices Act (one source’s headline cites a $538 million total)
- The trial lasted six weeks, with jury deliberation taking less than one day before the verdict
- New Mexico Attorney General Raúl Torrez accused Meta of enabling child sexual exploitation and prioritizing profits over child safety
- Meta plans to appeal the verdict and stated it works to keep users safe despite challenges in identifying harmful content
- The lawsuit included evidence from an undercover operation (Operation MetaPhile) where underage accounts received explicit material and predator contact
- Meta’s internal documents and whistleblower testimony were cited as evidence of awareness of platform harms
- The case focused on Meta’s platform design, including features like infinite scroll and encryption, which allegedly harmed children’s mental health and safety
- The jury rejected Meta’s attempts to invoke Section 230 of the Communications Decency Act and First Amendment defenses
- The trial began in December 2023, with the verdict announced on [date inferred: likely early 2025]
- Meta shares rose 0.8% in after-hours trading following the verdict
Points of Difference
Details reported by only one source:
- The state sought over $2 billion in damages, but the jury awarded $375 million (reflecting the statutory maximum penalty of $5,000 per violation)
- Meta’s lawyer Kevin Huff argued Meta’s disclosures were clear and the company did not knowingly lie to the public
- Linda Singer, the state’s attorney, stated Meta failed to protect young people over a decade and repeatedly ignored warnings
- The second phase of the trial in May will seek platform changes and additional financial penalties
- The lawsuit grew from an undercover operation in 2023 where investigators created accounts for users under 14
- Meta’s lawyer argued the company’s safeguards for younger users were robust and well-documented
- The Guardian’s 2023 investigation into child sex trafficking on Facebook/Instagram was cited in the complaint
- Meta’s encryption of Facebook Messenger in 2023 was highlighted as blocking law enforcement access to evidence of crimes
- Witnesses testified Meta’s AI-generated ‘junk’ reports hindered law enforcement investigations of CSAM
- Mark Zuckerberg and Adam Mosseri testified harms to children were inevitable due to platform scale, despite billions spent on safety
- The trial included testimony from child safety experts, law enforcement, and NCMEC (National Center for Missing and Exploited Children)
- The Guardian quoted former New Mexico deputy DA John W. Day calling the verdict a ‘huge win’ opening floodgates for further litigation
- The case was the first jury trial to find Meta liable for acts on its platform
- Meta’s attempt to invoke Section 230 was denied due to focus on platform design, not user-generated content
- The next phase (May 4) will seek court-mandated changes like age verification and predator removal
Contradictions
Conflicting information between sources:
- ABC states the jury deliberated for less than one day; The Guardian does not specify the deliberation time but implies it was brief
- ABC reports Meta’s lawyer argued the company’s disclosures were clear and no lies were intended, while The Guardian emphasizes that Meta’s executives knew of harms and lied to the public
- The Guardian highlights Meta’s encryption of Messenger as blocking law enforcement; ABC does not mention this detail
- ABC notes Meta’s lawyer argued the company had extensive safeguards for younger users, while The Guardian focuses on internal warnings and Meta’s failures to act on them
- The Guardian explicitly states the verdict was a ‘historic victory’ and a ‘huge win,’ while ABC frames it as a ‘historic verdict’ without such strong language
Source Articles
Meta ordered to pay $538m in US trial over child exploitation claims
The verdict marks the first time a jury has ruled on such claims against Meta, as the company faces a wave of lawsuits over how its platforms affect young people's mental health....
Meta ordered to pay $375m after being found liable in child exploitation case
New Mexico hails ‘historic’ win after jury finds firm misled consumers over safety and enabled harm against users A New Mexico jury on Tuesday ordered Meta to pay $375m in civil penalties after it fou...