New Mexico Jury Orders Meta to Pay $375 Million Penalty
A New Mexico jury found Meta liable for unfair practices and unconscionable acts tied to alleged teen mental health harms and assessed $375 million in civil penalties.
A New Mexico state-court jury has ordered Meta Platforms Inc. to pay $375 million in civil penalties after finding that the company engaged in unfair practices and unconscionable acts related to alleged mental health harms to underage users of Facebook and Instagram. The verdict, reached after less than a day of deliberations following a six-week trial, is a bellwether outcome in the New Mexico attorney general's broader, closely watched litigation over social media platform design and youth safety. While the state sought up to $2 billion, the jury applied the statutory maximum per-violation penalty and multiplied it across a violation count tied to estimates of teen users in New Mexico.
Verdict and Statutory Penalty Findings
The jury awarded $187.5 million on each of two theories, one based on unfair practices and the other on unconscionable acts, for a combined $375 million. It also found the civil penalty should be $5,000 per violation for each claim, and that there were 37,500 violations per claim, producing the $187.5 million figure for each cause of action. The violation count appeared to track trial evidence discussing ranges for the number of teen users of Facebook and Instagram in New Mexico, providing a measurable basis for applying a per-violation framework rather than a single aggregate penalty.
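As a rough check of the arithmetic implied by those findings (the precise structure of the verdict form is not detailed here): $5,000 per violation × 37,500 violations = $187.5 million per claim, and $187.5 million × 2 claims (unfair practices and unconscionable acts) = $375 million in total penalties.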
The outcome is significant because it illustrates how consumer-protection statutes can convert platform-wide allegations into a numeric penalty model anchored in user counts and maximum statutory amounts. In practical terms, the jury’s approach suggests acceptance of the state’s framing that each affected minor user, or each instance of exposure to the challenged practices, can constitute a discrete violation. Meta has stated it will appeal, setting up further litigation over how the statute’s penalty provisions should be applied in a case involving alleged omissions and design choices rather than a single, transaction-based misrepresentation.
Allegations Focused on Teen Safety and Disclosures
According to the state’s presentation, Meta allegedly obscured the full scope of mental health harms to minors while publicly presenting its policies and safeguards in a more favorable light than internal discussions reflected. The attorney general contended the platforms did not adequately protect teenagers from sexual predation, bullying, and exposure to content involving suicide and self-injury. The state also asserted that Meta failed to prevent children under 13 from accessing the services, a point that positioned age-gating and enforcement as central factual questions rather than ancillary compliance issues.
The state’s legal theory tied those alleged failures to claims under New Mexico’s consumer-protection framework, emphasizing the asserted gap between what the company knew internally and what it disclosed to users, parents, and the public. Court filings identified outside counsel on the trial team: the state was represented by Motley Rice LLC along with the New Mexico Office of the Attorney General. The case posture, including the requested damages and the focus on internal communications, reflects a strategy aimed at proving both what the company disclosed and whether its safeguards for minors were reasonable as implemented.
Platform Design, “Problematic Use,” and Meta’s Defense
At trial, the state argued that product design choices increased engagement in ways that amplified risks for minors. One example highlighted was Meta’s shift from a reverse-chronological feed to an algorithmically ranked feed designed to optimize engagement, which the state linked to what the company has described as “problematic use” and the state characterized as addiction-like behavior. The state’s narrative framed this design evolution as a foreseeable driver of extended use and intensified exposure to harmful material, thereby supporting claims that the company’s practices were unfair or unconscionable when applied to underage users.
Meta disputed that it concealed risks or failed to act, emphasizing safety initiatives and disclosures directed at minors and parents. In closing arguments, Meta’s counsel pointed to the company’s content moderation and safety efforts, including what it described as a large workforce dedicated to safety functions, and argued that the evidence showed extensive public-facing disclosures over time. After the verdict, Meta stated it “respectfully disagree[s] with the verdict and will appeal,” while maintaining it has worked to keep users safe and to communicate the challenges of detecting and removing harmful content and bad actors.


