Meta ordered to pay $375m in New Mexico child exploitation case
Consensus Summary
A New Mexico jury found Meta liable for violating consumer protection laws by misleading users about the safety of Facebook, Instagram, and WhatsApp while enabling child sexual exploitation and harm to children’s mental health. The company was ordered to pay $375 million, the maximum penalty under New Mexico’s Unfair Practices Act, following a six-week trial in which evidence included undercover investigations, internal documents, and testimony from whistleblowers and law enforcement. Both sources agree on the verdict’s core details, including the $375 million penalty, the state’s allegations of predatory platform design, and Meta’s intent to appeal. However, The Guardian provides additional context on Meta’s encryption policies blocking crime evidence and the overwhelming volume of AI-generated reports hindering law enforcement, while ABC highlights the state’s broader request for $2 billion in damages and Meta’s stock reaction. The case marks the first jury verdict against Meta on such claims and could set a precedent for future lawsuits, with a second phase seeking platform reforms such as age verification and stronger child protections.
Key details reported by multiple sources:
- A New Mexico jury found Meta liable for violating New Mexico’s consumer protection law in a case alleging misleading claims about child safety on Facebook, Instagram, and WhatsApp
- The jury ordered Meta to pay $375 million in civil penalties, the maximum under New Mexico’s Unfair Practices Act ($5,000 per violation); one source’s headline puts the total at $538 million including additional penalties
- The trial lasted roughly six weeks (ABC reports six; The Guardian nearly seven), with jury deliberation taking less than a day
- New Mexico Attorney General Raúl Torrez accused Meta of enabling child sexual exploitation, failing to implement age verification, and designing platforms to maximize engagement despite harming children’s mental health
- Meta’s spokesperson said the company will appeal the verdict, maintaining that it works to keep users safe and citing challenges in identifying harmful content
- The lawsuit stemmed from an undercover operation (Operation MetaPhile) in which investigators created decoy accounts posing as users under 14, which received sexually explicit material and contact from adults
- Meta’s internal documents and whistleblower testimony were cited as evidence of the company’s awareness of risks but failure to act transparently
- The trial focused on Meta’s platform design, including features like infinite scroll and auto-play, which the state claims foster addictive behavior in children
- In June 2024, the judge rejected Meta’s Section 230 (Communications Decency Act) defense, ruling that the case centered on non-speech issues such as content curation and internal decisions
Points of Difference
Details reported by only one source:
- The state had asked the jury to award more than $2 billion in damages, but the jury capped penalties at $375 million
- Meta shares rose 0.8% in after-hours trade following the verdict
- The lawsuit grew out of an undercover operation in 2023 where investigators created accounts on Facebook and Instagram posing as users younger than 14
- Linda Singer, an attorney for the state, accused Meta of failing to act to protect young people in New Mexico for over a decade
- Meta’s lawyer Kevin Huff argued Meta’s disclosures were clear and the company did not knowingly lie to the public
- The trial was held in Santa Fe, New Mexico
- The Guardian’s 2023 investigation into Facebook and Instagram as marketplaces for child sex trafficking was cited multiple times in the complaint
- Meta’s encryption of Facebook Messenger in 2023 was highlighted as blocking access to crucial evidence of crimes, including child sexual abuse material (CSAM)
- Witnesses testified Meta’s AI-generated ‘junk’ reports overwhelmed law enforcement, hindering investigations into crimes on its platforms
- The next phase of proceedings (starting May 4) will seek additional financial penalties and court-mandated platform changes, including age verification and removing predators
- Mark Zuckerberg and Adam Mosseri testified that harms to children were inevitable due to the platforms’ vast user bases, despite billions invested in safety measures
- The trial included testimony from child safety experts, current/former Meta employees, and law enforcement, including NCMEC (National Center for Missing and Exploited Children)
- Former New Mexico deputy district attorney John W Day commented that the verdict was a ‘huge win’ and would ‘open the floodgates’ to further litigation and reforms
- The Guardian mentioned a separate Los Angeles lawsuit where Meta, Snap, TikTok, and YouTube are accused of designing platforms to be addictive for children, with Snap and TikTok reaching settlements
Contradictions
Conflicting information between sources:
- The trial’s length: ABC reports six weeks, while The Guardian reports nearly seven
- The headline figure: one source’s headline cites a $375m award, the other $538m
The remaining differences are gaps in coverage rather than direct conflicts:
- ABC states Meta shares rose 0.8% after the verdict, while The Guardian does not mention stock performance
- The Guardian highlights Meta’s 2023 encryption of Facebook Messenger as a major barrier to crime evidence, a detail ABC does not emphasize
- The Guardian mentions Meta’s ‘junk’ reports overwhelming law enforcement, a claim not present in ABC’s coverage
- ABC reports the state asked for over $2 billion in damages, while The Guardian does not specify the requested amount beyond the $375 million awarded
- The Guardian includes a quote from former prosecutor John W Day about the verdict opening litigation floodgates, which ABC does not reference
Source Articles
Meta ordered to pay $375m after being found liable in child exploitation case
New Mexico hails ‘historic’ win after jury finds firm misled consumers over safety and enabled harm against users A New Mexico jury on Tuesday ordered Meta to pay $375m in civil penalties after it fou...
Meta ordered to pay $538m in US trial over child exploitation claims
The verdict marks the first time a jury has ruled on such claims against Meta, as the company faces a wave of lawsuits over how its platforms affect young people's mental health....