Meta Found Liable for Harming Children’s Mental Health
A New Mexico jury has found Meta, the parent company of Facebook and Instagram, liable for knowingly harming children's mental health and concealing its knowledge of child exploitation. The company was fined $375 million, a verdict seen as a landmark decision that could challenge the legal protections afforded to tech giants.
Meta Faces Landmark Jury Verdict on Child Harm
In a significant legal development, a jury in New Mexico has found social media giant Meta, the parent company of Facebook, Instagram, and WhatsApp, liable for knowingly harming the mental health of children. The jury also determined that Meta concealed its knowledge of child sexual exploitation occurring on its platforms. The decision, which followed a seven-week civil trial, came with a $375 million penalty against the tech company.
This marks the first time a jury has ruled against Meta on claims involving child exploitation. Legal experts are calling it a major victory in the ongoing effort to hold social media companies accountable for their impact on young users.
Jury’s Findings: Profits Over Protection
The New Mexico jury concluded that Meta engaged in an “unconscionable trade practice” by knowingly harming children’s mental health, finding thousands of violations across Meta’s platforms. Prosecutors had sought a larger sum, arguing that Meta prioritized profits over child safety; the $375 million penalty is a modest figure for a company valued at roughly $1.5 trillion.
During the trial, prosecutors presented evidence that Meta’s conduct was repeated over a decade, that the company knew serious harm to children would result, and that such harm did in fact occur. Meta acknowledged the presence of harmful material on its sites but maintained it was doing everything possible to minimize risks.
Meta’s defense stated that the company discloses the risks associated with its platforms and that the evidence showed Meta works diligently to protect its users, especially teenagers. Mark Zuckerberg, Meta’s CEO, has consistently defended the company’s policies, noting that even with billions of users, a small percentage may be criminals, and the company strives to prevent such activity.
Legal Precedent and Section 230
The New Mexico verdict is seen as significant because it begins to pierce the legal shield known as Section 230. This law has historically protected tech companies from liability for content posted by their users. “This verdict is one of the first that really does not abide by the Section 230 architecture,” explained David Dean, executive editor of The American Prospect. “It enables some liability to pierce through for these tech platforms.”
Unlike previous cases that focused on user-posted content, this trial and a similar one in California centered on the design of the platforms themselves. Arguments included whether Meta deliberately made its platforms addictive and how its algorithms affected users. The remedies sought, such as changes to the platforms’ design and algorithms, could have far-reaching effects beyond monetary penalties.
Potential for Systemic Change
The New Mexico Attorney General plans to seek injunctive relief, which could force Meta to change its platforms’ design and algorithms. Experts suggest this could lead to measures such as stricter age verification and less addictive recommendation algorithms. If upheld, the ruling could set a precedent for other states and countries looking to regulate social media’s impact on young people.
“If New Mexico is successful, it’s not going to end in New Mexico,” Dean noted. “Other states are going to adopt this. Other countries could look to this precedent.” This case is part of a broader trend of increased scrutiny on social media platforms worldwide, with countries like Australia already implementing measures like banning platforms for users under 16.
A Turning Point for Social Media Regulation?
While Meta plans to appeal the verdict, potentially taking the case to the Supreme Court, the ruling has increased pressure on the company to reconsider its business model and platform design. The accumulation of years of negative stories and a growing awareness of the harms caused by social media are contributing to this moment.
Some observers see this as a turning point, influenced partly by a desire among regulators to avoid repeating past mistakes with new technologies like artificial intelligence. The push for regulation is also evident at the state level in the U.S., where lawmakers are actively pursuing policies related to AI and online platforms, even as federal efforts sometimes lean towards deregulation.
Broader Implications and Future Outlook
A separate case in California awarded over $3 million to a young user who suffered mental distress attributed to the addictive design of Meta’s and YouTube’s platforms. Douglas Farah, former director of public affairs at the Federal Trade Commission, highlighted the significance of these rulings, stating, “for the first time, a jury of Americans did what the federal government… have been unable to do so far, and that’s find these companies liable and guilty of basically turning their platforms into an incredibly addictive and dangerous place for children.”
Farah compared the current situation to the fight against the tobacco industry, where state-level actions eventually led to significant changes and accountability. He believes these social media cases represent a “major inflection point.” While the financial penalties may seem small for these tech giants, the cost and complexity of facing numerous lawsuits, coupled with reputational damage, could drive significant behavioral changes.
The success of these cases, which focus on platform design rather than user content, could also worry other tech sectors, including artificial intelligence. If algorithms and platform concepts are found liable for harm, it could impact the development of future technologies. As more lawsuits emerge and regulatory pressure mounts, the way social media companies operate may be on the verge of a significant transformation.
Other countries are also exploring ways to address internet addiction and its effects. Australia has banned many social media platforms for users under 16, and Europe has seen experiments like a three-week smartphone-free challenge for thousands of schoolchildren, which revealed positive outcomes in terms of well-being and engagement.
Source: Meta found guilty of harming children's mental health — can this change Meta's way? | DW News (YouTube)