
US Courts Hold Meta and YouTube Liable for Addictive Social Media Design and Harm to Minors
US courts have ordered Meta and YouTube to pay millions in damages for intentionally addictive platform designs and failure to protect minors, setting a precedent for future litigation and industry regulation
Key Points
- A Los Angeles jury found Meta and YouTube liable for mental health harm caused by addictive social media design
- Meta and YouTube were ordered to share $3 million in damages, and Meta faces a separate $375 million judgment in New Mexico
- Court findings cited features like infinite scroll and autoplay as deliberate mechanisms to encourage compulsive use
- Evidence showed that fake child profiles on Meta platforms were quickly targeted by predators, highlighting safety failures
- Approximately 1,500 families are preparing lawsuits against Meta, signaling a rising wave of litigation challenging social media business models
In a landmark legal development, courts in the United States have found social media giants Meta and YouTube liable for intentionally addictive platform designs that contribute to mental health issues among minors. The decisions, delivered by juries in Los Angeles and New Mexico, mark a turning point in how the legal system addresses the impact of digital technologies on young users. The Los Angeles ruling ordered Meta and YouTube to pay $3 million in damages to a young woman who developed a social media addiction as a child. Meanwhile, a separate New Mexico verdict directed Meta to pay an additional $375 million for failing to protect minors from harmful content and exploitation.
The cases center on the structural elements of platforms like Instagram and YouTube, including features such as infinite scroll, autoplay, and persistent notifications. According to court findings, these features are "mechanisms that encourage compulsive use" rather than accidental byproducts of modern technology. Evidence presented in court revealed that Meta and YouTube deliberately engineered their platforms to maximize engagement, with internal communications highlighting goals to increase user time on site. Mark Zuckerberg himself acknowledged in court that teams had previously been given targets to increase user engagement, although he described such directives as outdated.
The consequences of these design choices are far-reaching. Recent studies show that global screen time averages more than six hours per day, with social media use alone accounting for over two hours daily. In countries like Brazil, Chile, and South Africa, social media consumption can exceed 20% of waking hours, with teenagers as the heaviest users. Research indexed in SciELO and referenced by the Pew Research Center draws a direct line between increased screen time and rising levels of anxiety and depression among adolescents, especially those spending more than four hours online daily.
The litigation also brought to light significant safety concerns for minors. Prosecutors demonstrated that fake profiles posing as children on Meta platforms quickly attracted predatory attention and sexually explicit messages, exposing critical gaps in platform safeguards. As prosecutors argued, "the company allowed predators to access and contact underage users for years, enabling situations that could escalate into real-world abuse." The New Mexico jury found that Meta had violated consumer protection laws by withholding information about these risks and failing to implement effective protections or provide transparent warnings to users.
These rulings have set off a wave of further legal action, with approximately 1,500 families reportedly preparing lawsuits against Meta this year alone. Legal experts have drawn parallels between these cases and historic litigation against the tobacco industry, signaling a potential shift in the regulatory landscape for tech companies. For the first time, courts are scrutinizing not just the content on social media platforms, but the very architecture designed to keep users engaged and, in some cases, addicted. From the OG Lab newsroom perspective, this marks a pivotal moment for digital accountability, with the outcomes likely to influence technology regulation, platform design standards, and the global conversation around digital well-being for years to come.