Meta faces first of many lawsuits over allegedly intentional addictive design, harm to kids
Social media companies bear responsibility for willful negligence in growing addictive platforms that directly harm children, a lawsuit argues, in a case facing jury selection Tuesday in Los Angeles.
“It is the first case where a jury will hear the evidence that the social media companies intentionally designed their platforms to addict children for the purpose of selling the children’s minutes to advertisers,” the plaintiffs’ attorney, Josh Autry of the firm Morgan & Morgan, told Courthouse News. “And the consequences of that – we now have a generation of children that have been struggling with this social media addiction.”
The lawsuit filed against Meta – parent company of both Facebook and Instagram – focuses on the case of a 19-year-old girl, identified as “K.G.M.,” who says her use of social media from a young age created an addiction to the platform and magnified her struggle with depression and suicidal thoughts, Fox 59 reports.
“Plaintiffs are not merely the collateral damage of Defendants’ products,” the lawsuit says, quoted by AP News. “They are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.”
TikTok and Snapchat were both originally included in the lawsuit, but each reached separate, undisclosed settlements before the case entered jury selection Tuesday, Politico reports.
The lawsuit focuses on the platforms’ product design itself, seeking to sidestep the usual protection social media companies claim under Section 230 of the Communications Decency Act, which shields companies from responsibility for third-party content displayed on their apps, according to Courthouse News.
“The evidence is quite strong that the decisions in these cases were not made by rogue level employees,” Autry said. “When you look at the highest level executives – they were directing these decisions to choose business growth over safety features. To choose growing the amount of time that’s being consumed by children and teenagers, as opposed to putting reasonable limits on that time.”
These “highest-level executives,” including Meta CEO Mark Zuckerberg and Head of Instagram Adam Mosseri, may be called to testify before the jury, Politico reports. Meta claims mental health is a “deeply complex and multifaceted issue” and argues lawyers paint a “misleading narrative” by taking “snippets of conversations out of context,” according to the company’s statement. In fact, Meta claims social media brings children great benefits.
“We strongly disagree with these allegations and are confident the evidence will show our longstanding commitment to supporting young people,” a Meta spokesperson told The Lion in a statement. “For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most. We use these insights to make meaningful changes – like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences.”
Meta and other social media companies face a slew of lawsuits in 2026 accusing them of pursuing profit to the detriment of children’s wellbeing.
Additional federal cases, filed in the Northern District of California, include an alliance of school districts claiming social media companies negligently failed to warn parents and teachers about the dangers and time demands of their platforms. More than 30 state attorneys general have also sued Meta for deliberately designing addictive features that capture kids, according to Politico. Trials for these cases are set to begin mid-June.
For now, K.G.M.’s suit serves as a test case on whether social media companies are culpable for their addictive design.