Meta Ordered to Pay $375 Million After New Mexico Jury Finds Company Misled Public on Child Safety

Representational AI-Generated Image

The state’s case alleged that Meta used its recommendation algorithms to “steer” young users toward harmful content, including child sexual abuse material and solicitations for sex trafficking.

RMN News Legal Desk
New Delhi | March 25, 2026

A New Mexico court has ordered Meta to pay $375 million (£279 million) after a jury found the company liable for misleading users about the safety of its platforms for children. The verdict concluded that Meta, which owns Facebook, Instagram, and WhatsApp, endangered minors by exposing them to sexually explicit material and contact with sexual predators.

New Mexico Attorney General Raúl Torrez characterized the outcome as “historic,” marking the first instance of a state successfully suing Meta over child safety issues. The jury found Meta responsible for violating New Mexico’s Unfair Practices Act, determining that the company had misrepresented the security of its platforms for young users. The total civil penalty was reached after the jury identified thousands of individual violations, each carrying a maximum fine of $5,000.


During the seven-week trial, jurors were presented with internal company documents and testimony from former employees. Arturo Béjar, a former Meta engineering leader turned whistleblower, testified about experiments he conducted on Instagram that showed underage users were frequently served sexualized content.

Béjar also shared the personal account of his own daughter being propositioned for sex by a stranger on the platform. Additionally, state prosecutors highlighted internal research indicating that, at one point, 16% of all Instagram users reported encountering unwanted nudity or sexual activity in a single week.

The state’s case alleged that Meta used its recommendation algorithms to “steer” young users toward harmful content, including child sexual abuse material and solicitations for sex trafficking. Attorney General Torrez stated that Meta executives were aware of these risks, disregarded internal warnings, and lied to the public regarding the safety of their products.

According to a March 25 BBC report, Meta strongly disagrees with the verdict and intends to appeal. A company spokesperson defended Meta’s record, saying the company remains confident in its efforts to protect teens and pointing to recent safety features such as the 2024 launch of Teen Accounts and parental alerts for self-harm content. The case is one of thousands of similar lawsuits moving through U.S. courts over the impact of social media design on minors.


Rakesh Raman

Rakesh Raman is a journalist and tech management expert.

https://www.rmnnews.com
