Meta Hit With $375 Million Fine Over Child Safety Violations


Meta Platforms Inc., one of the world’s leading technology companies, has been ordered to pay $375 million after a jury found the tech giant violated consumer protection laws by misleading users about the safety of its platforms and failing to prevent child sexual exploitation.

The verdict, delivered last Tuesday, marks the first time a jury has ruled against the company on claims of this nature.

The case was filed by New Mexico’s attorney general, Raúl Torrez, who accused Meta of falsely portraying Facebook, Instagram, and WhatsApp as safe for children while failing to address risks tied to harmful content and exploitation.

Jurors found Meta liable for engaging in unfair and deceptive trade practices under the state’s consumer protection laws, reinforcing growing scrutiny of Big Tech’s responsibility to safeguard users, especially minors.

Torrez described the outcome as a landmark moment in holding tech companies accountable.

“This is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” he said, adding that the scale of the damages should serve as a warning to other firms in the industry.

“The substantial damages the jury ordered Meta to pay should send a clear message to big tech executives that no company is beyond the reach of the law.”

Meta, however, has rejected the ruling and confirmed plans to appeal.

“We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the company said in a statement.

The lawsuit stems from a 2023 undercover investigation by the attorney general’s office, which uncovered serious gaps in Meta’s content moderation systems. Investigators created fake accounts posing as users under 14 on Facebook and Instagram. These accounts were quickly exposed to explicit material and contacted by adults seeking similar content, resulting in several criminal charges.

According to the state, Meta continued to publicly assure users that its platforms were safe for children despite internal evidence highlighting risks related to sexual exploitation and mental health harm.

The suit also alleged that the company deliberately designed features such as infinite scroll and auto-play to maximise engagement—even when those features contributed to addictive behaviour among younger users.

In total, the jury found Meta committed 75,000 violations of state law, awarding $5,000 per violation, which led to the $375 million penalty.

The ruling comes amid intensifying global efforts to regulate social media and improve child safety online. In Nigeria, the federal government is currently consulting on potential age restrictions for social media use as part of broader measures to protect minors.

Elsewhere, countries are already taking action. Australia has introduced a ban on social media access for users under 16, requiring platforms to enforce stricter controls. Indonesia is considering similar restrictions, while Denmark is moving toward banning social media use for children under 15 with strong political backing.

Together, these developments signal a global shift toward tighter regulation of digital platforms, as governments respond to rising concerns around child safety, mental health, and online exploitation.
