LOS ANGELES — A U.S. jury on March 25, 2026, found Meta Platforms and Google liable for designing social media platforms that harm children and teenagers, a landmark verdict that could reshape how tech companies address safety concerns. The ruling marks a significant moment in the growing scrutiny of social media’s impact on young users.
The case centered on a woman, now 20, identified in court as Kaley, who argued that she became addicted to YouTube and Instagram as a minor because of features designed to maximize user engagement. The jury concluded that both companies were negligent in the design of their platforms and failed to adequately warn users about potential risks.
Meta was ordered to pay $4.2 million in damages, while Google faces $1.8 million in penalties. Although the sums are small relative to the companies’ massive revenues, the verdict carries broader implications for the technology industry.
“This verdict is a referendum … that accountability has arrived,” the plaintiff’s lead attorney said in a statement, emphasizing the significance of the ruling beyond the financial penalties.
The lawsuit focused on platform design rather than user-generated content, a strategy that made it more difficult for the companies to rely on traditional legal protections. By targeting the mechanics of engagement—such as algorithms and user interface design—the case challenged the core business models of social media platforms.
Meta said it disagrees with the decision and is reviewing its legal options, while Google confirmed it plans to appeal. Despite the outcome, shares of both companies rose slightly following the announcement, reflecting investor confidence that the financial impact will remain limited in the short term.
Industry analysts say the case could have lasting consequences. Gil Luria of D.A. Davidson described the verdict as a “setback” that may lead to increased regulatory and legal pressure on major tech firms. He noted that future cases and appeals could push companies to introduce stronger consumer protections, even if such measures affect user growth.
The case is part of a broader global trend, as governments and regulators increasingly examine the effects of social media on mental health, particularly among younger users. Concerns over screen time, addictive design features and exposure to harmful content have prompted calls for stricter oversight.
Other companies were also involved in the case. Snap and TikTok were initially named as defendants but reached settlements with the plaintiff before the trial began. The terms of those agreements were not disclosed.
The verdict could encourage similar lawsuits and influence regulatory approaches in the United States and beyond. While the legal battle is likely to continue through appeals, the decision signals a shift toward holding technology companies accountable for how their platforms are designed and used.
As scrutiny intensifies, the tech industry may face growing pressure to balance engagement-driven business models with stronger safeguards for vulnerable users, particularly children and teenagers.