Meta CEO Mark Zuckerberg appeared in a Los Angeles courtroom on Wednesday to defend Instagram against allegations it harms young users.
Facing evidence of internal strategies aimed at “tweens,” he maintained the platform does not target children under 13.
Zuckerberg was pressed by attorney Mark Lanier, who represents a woman suing Instagram and YouTube over mental health harms she says she suffered as a child.
Lanier cited Meta documents showing Instagram once considered strategies to attract users younger than 13, including statements such as, “If we want to win big with teens, we must bring them in as tweens.”
Zuckerberg responded that these discussions never resulted in a product, saying Meta has explored “different versions of services that kids can safely use” but never launched a platform for users under 13.
Allegations and legal context
The plaintiff claims she developed depression and suicidal thoughts from early use of Instagram and YouTube. Meta and Google deny wrongdoing, highlighting safety features and parental controls.
Zuckerberg’s testimony marks his first public court appearance on the issue of Instagram’s impact on youth mental health. The trial could set a precedent for other lawsuits targeting social media companies over their influence on young users.
Internal Meta documents
Jurors were shown internal Meta documents outlining goals to increase daily Instagram usage, with milestones rising from 40 minutes in 2023 to 46 minutes in 2026. Zuckerberg characterized these figures as “gut checks” rather than formal employee targets, intended to gauge engagement with the service rather than to deliberately increase screen time.
Earlier emails also revealed discussions acknowledging difficulty in enforcing age limits, with Meta’s VP of global affairs, Nick Clegg, noting that differing policies between Instagram and Facebook complicated efforts. Zuckerberg said verifying user age is challenging and that responsibility lies partly with mobile device makers.
The trial is part of a growing wave of litigation in the U.S. against Meta, Google, Snap, and TikTok over youth mental health. Families, school districts, and states have filed thousands of lawsuits claiming social media platforms exacerbate mental health issues among children and teens.
A verdict against Meta could challenge long-standing legal protections shielding tech companies from liability for content decisions. Investigative reporting has revealed that internal research at Meta identified risks, including “eating disorder adjacent content” impacting teen users.
Global context
Countries worldwide are considering or implementing stricter regulations on social media use by children. Australia bars social media accounts for users under 16, while Florida prohibits accounts for children under 14. Tech industry groups are challenging some of these laws in court, signaling a global reckoning over youth digital safety.