New Mexico gets a historic win after jury finds firm misled consumers over safety and enabled harm against users.
A New Mexico jury on Tuesday ordered Meta to pay $375m in civil penalties after finding the company misled consumers about the safety of its platforms and enabled harm, including child sexual exploitation, against its users. It is the first jury trial to find Meta liable for acts committed on its platforms. The jury found Meta liable on both claims brought by the state under New Mexico’s Unfair Practices Act and imposed the maximum penalty allowed under the law, $5,000 per violation. The lawsuit was brought by the office of the New Mexico attorney general, Raúl Torrez, in December 2023. It followed a two-year Guardian investigation, published in April of that year, revealing how Facebook and Instagram had become marketplaces for child sex trafficking; that investigation was cited several times in the complaint. Meta has said it will appeal against the ruling, and accused Torrez of making “sensationalist, irrelevant arguments by cherrypicking select documents”.
Internal Meta documents and testimony obtained by the New Mexico department of justice during the litigation revealed that both company employees and external child safety experts had repeatedly warned about risks and harmful conditions on Meta’s platforms. Evidence presented to the jury included details of the 2024 arrests of three men charged with using Meta’s platforms to sexually prey on children and attempting to meet up with them. The arrests were part of a sting operation run by undercover agents and dubbed “Operation MetaPhile” by the attorney general’s office. The New Mexico court heard how Meta’s 2023 decision to encrypt Facebook Messenger – its direct messaging platform, which predators have used as a tool to groom minors and exchange child abuse imagery – blocked access to crucial evidence of these crimes.
Witnesses from law enforcement and the National Center for Missing and Exploited Children (NCMEC) testified about deficiencies in Meta’s reporting of crimes taking place on its platforms, including the exchange of child sexual abuse material (CSAM). Meta has generated high volumes of “junk” reports by relying too heavily on AI to moderate its platforms, investigators said. These reports were useless to law enforcement and meant crimes went uninvestigated, they said.
In taped depositions played at the trial, the Meta chief executive, Mark Zuckerberg, and the head of Instagram, Adam Mosseri, said harms to children, such as sexual exploitation and damage to mental health, were inevitable on the company’s platforms because of their vast user bases. Company executives also testified that Meta had invested billions of dollars in technology updates to keep children safe on its platforms. These include Instagram Teen Accounts, which debuted in 2024 and set default protections for users aged 13 to 17. Social media companies have long maintained they are not responsible for crimes committed via their networks, citing a US federal law that generally shields platforms from legal liability for content created by their users: section 230 of the Communications Decency Act. Meta’s attempts to invoke section 230 and the first amendment to have the case dismissed were denied in a judge’s ruling in June 2024, because the lawsuit focused on Meta’s product design and other non-speech issues, such as internal decisions about content and curation.