Meta Hit with $375 Million Fine for Child Safety Failures and Deceptive Practices

A jury in New Mexico has delivered a landmark verdict against Meta Platforms, ordering the social media giant to pay a $375 million penalty for violating consumer protection laws. The ruling marks the first time a jury has found Meta liable for misleading users about the safety of its platforms while enabling child sexual exploitation through inadequate content moderation systems.

Deceptive Practices and Child Exploitation Concerns

The case, brought by New Mexico Attorney General Raúl Torrez, centered on allegations that Meta falsely presented Facebook, Instagram, and WhatsApp as safe environments for children despite internal evidence of significant risks. The jury determined that Meta engaged in unfair and deceptive trade practices under state consumer protection law, finding that the company committed 75,000 violations at $5,000 each, for a total of $375 million.

The attorney general's office conducted an undercover investigation in 2023 that revealed serious deficiencies in Meta's content moderation approach. Investigators created fake accounts posing as users under 14 years old and documented how these accounts were quickly exposed to explicit material and contacted by adults seeking similar content. This investigation formed the foundation of the state's legal arguments against the technology company.

Platform Design and Engagement Features Under Scrutiny

The lawsuit further alleged that Meta deliberately designed certain platform features to maximize user engagement, even when these features contributed to addictive behaviors among younger users. Specific design elements mentioned in the case included:

  • Infinite scroll functionality that encourages prolonged platform usage
  • Auto-play video features that create continuous content consumption
  • Algorithmic recommendations that may expose users to harmful material

According to the state's arguments, Meta continued to publicly assure families about platform safety while internal documentation highlighted risks related to both exploitation and mental health impacts on younger users.

Meta's Response and Global Implications

Meta has stated it disagrees with the jury's verdict and plans to appeal the decision. The company maintains that it continues to invest significantly in safety measures and works diligently to remove harmful content from its platforms. A Meta spokesperson emphasized ongoing efforts to improve content moderation systems and protect younger users across all company services.

This ruling arrives amid increasing global scrutiny of social media platforms and growing governmental efforts to strengthen online protections for children. Countries including Australia, Indonesia, and Denmark are already moving toward implementing stricter age restrictions on social media use, reflecting a broader international push to address concerns around child safety, mental health impacts, and online exploitation risks.

Attorney General Raúl Torrez described the New Mexico verdict as a significant victory for families affected by Meta's practices and emphasized that the ruling sends a powerful message to major technology firms about their responsibilities toward younger users. The case highlights ongoing tensions between social media platforms' business models and their obligations to protect vulnerable populations from harm.
