Meta faces another hurdle over child abuse and safety on its social media platforms. In a landmark ruling that could reshape how Big Tech is held accountable, a New Mexico jury has ordered Meta to pay $375 million in civil penalties after finding that the company misled consumers about the safety of Facebook and Instagram and knowingly endangered children on its platforms. While Meta has faced similar legal battles elsewhere, this verdict marks the company’s first jury loss in any child safety lawsuit.
After nearly seven weeks of trial under New Mexico’s Unfair Practices Act, the jury determined that Meta violated the state’s consumer protection law through “unfair and deceptive” and “unconscionable” trade practices that took advantage of children’s vulnerabilities.
New Mexico Attorney General Raúl Torrez hailed the decision as a “watershed moment.”
“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough,” Torrez said.
Undercover investigation exposes Meta’s failures
The case against Meta emerged from a 2023 undercover investigation in which state investigators created fake accounts posing as children under 14. These profiles were reportedly flooded with sexually explicit messages and solicitations within days. The operation led to several arrests, including two men who arrived at a motel expecting to meet a 12-year-old girl.
Prosecutors argued that Meta’s sophisticated recommendation algorithms, which are highly effective at delivering targeted ads, were equally capable of connecting predators with vulnerable minors.
Former Meta employee Arturo Béjar testified during the trial that the platform excels at connecting users with shared interests, including harmful ones. “If your interest is little girls, it will be really good at connecting you with little girls,” he stated.
Another ex-executive, Brian Boland, told the jury that when he left Meta in 2020, “safety was absolutely not a priority” for CEO Mark Zuckerberg and the then-COO, Sheryl Sandberg. Internal documents revealed that researchers had studied features designed to trigger dopamine responses and boost engagement time.
Zuckerberg and Meta’s response
While Zuckerberg did not attend the court proceedings, he provided a recorded deposition. When asked whether he had a right to know if a product was addictive before allowing his own children to use it, he replied that there was “a lot to unpack in that.” He added that he and his wife personally research products and monitor their kids’ usage.
Meta, however, pushed back on the verdict. A company spokesperson said Meta “respectfully disagrees” with the decision, arguing that the company “works hard to keep people safe,” including through age restrictions and AI tools that detect harmful content. The company plans to appeal.
A second phase of the trial, addressing public nuisance claims, is scheduled to begin on May 4 and could result in additional penalties or court-ordered platform changes.
