Meta CEO Mark Zuckerberg is set to testify in a landmark California trial examining whether design features on Meta’s platforms – including Instagram – contribute to addiction and harm among young users. The case, unfolding in Los Angeles Superior Court, pits a young plaintiff against the tech giant, raising critical questions about social media companies’ responsibility for the mental health of their youngest users.
The Core of the Lawsuit
The lawsuit centers on the claims of KGM, a 20-year-old who alleges that prolonged exposure to Meta and Google’s platforms as a child exacerbated her depression and suicidal thoughts. The plaintiffs argue that these companies deliberately engineered their platforms using addictive techniques, akin to those employed in casinos, to maximize engagement and profits at the expense of children’s well-being.
Initially filed against Meta, Google, TikTok, and Snap Inc., the case reached trial only against Meta and Google after TikTok and Snap settled. This bellwether trial – meaning its outcome could influence thousands of similar lawsuits – is testing the legal boundaries of tech accountability.
Zuckerberg’s Testimony and Key Arguments
Zuckerberg’s testimony will likely focus on Instagram’s algorithms and in-app features, which plaintiffs contend are designed to keep young users hooked. Meta’s defense, led by attorney Paul Schmidt, suggests that KGM’s mental health struggles stemmed from a difficult home life, not the platform itself, framing social media as a coping mechanism rather than a cause.
However, the judge has allowed the case to proceed, finding sufficient evidence that Instagram’s engagement-driven features may have contributed to KGM’s mental health decline. Meta counters that features like “infinite scroll” cannot be blamed because users willingly choose to keep consuming content – an argument the court has left for the jury to weigh.
Broader Legal and Regulatory Context
This trial occurs as European regulators consider age-related restrictions on social media platforms, reflecting growing global concern over children’s safety online. In the United States, Section 230 of the Communications Decency Act has historically shielded social media companies from liability for third-party content.
However, that protection is now being tested: the court allowed this case to move forward on the premise that platform design choices – not just user-generated content – can be a basis for liability for harm.
Expert Testimony and Internal Findings
Instagram’s head, Adam Mosseri, testified last week, dismissing the idea of clinical addiction to social media but acknowledging the existence of “problematic use.” He asserted Meta’s commitment to protecting young users, arguing that the company’s long-term profitability depends on their well-being.
Yet a 2025 study by Meta whistleblower Arturo Béjar and academic researchers found that two-thirds of Meta’s safety tools were ineffective, leaving teen accounts exposed to harmful content, including sexual material, self-harm themes, and content related to body image issues.
The trial raises fundamental questions about the extent to which tech companies can be held responsible for the psychological effects of their products. The outcome will likely reshape the legal landscape for social media accountability, potentially forcing companies to re-evaluate their design choices and put user safety ahead of the pursuit of engagement at all costs.