Instagram Chief Denies Platform Is 'Clinically Addictive' in Landmark Youth Harm Trial

Instagram head Adam Mosseri testified on Wednesday, February 12, that he does not believe users can be "clinically addicted" to the social media platform, as he took the stand in a closely watched trial that could reshape how tech companies are held accountable for alleged harm to young users. This case is the first of more than 1,500 similar lawsuits to reach trial and is widely viewed as a test of whether social media platforms can be held legally responsible for youth addiction and psychological harm.

Testimony on Addiction and Problematic Use

During questioning by plaintiff's attorney Mark Lanier, Mosseri rejected the notion that Instagram is inherently addictive. "I don't think it's possible to be addicted to Instagram," Mosseri said, though he acknowledged that "problematic use" can occur. He compared excessive scrolling to "watching TV for longer than you feel good about," while conceding he is not a medical professional. "It's relative," he said. "Yes, for an individual, there's such a thing as using Instagram more than you feel good about."

Lanier pressed Mosseri on whether Instagram prioritizes growth and profits over the well-being of minors. Mosseri denied that the company targets teens for financial gain. "We make less money from teens than any other demographic on the platform," he said, noting that younger users generally click on fewer ads and have less disposable income.

Allegations of Intentional Design and Harmful Features

Attorneys for Kaley, the plaintiff in the case, argue that features such as infinite scroll, autoplay, and the "like" button were intentionally engineered to keep young users engaged. Lanier described the "like" feature as delivering a "chemical hit" that teens seeking validation can become dependent on. Kaley began using Instagram at age nine, despite the platform's minimum age requirement of 13. Her lawsuit also claims Instagram's "beauty filters" contributed to body image issues and that she experienced bullying and sextortion on the app.

When told that Kaley once spent more than 16 hours on Instagram in a single day, Mosseri responded: "That sounds like problematic use." A significant portion of Wednesday's testimony focused on Instagram's appearance-altering filters. Lanier cited internal 2019 emails in which Meta executives discussed concerns that certain filters could fuel body dysmorphia.

One internal message referenced experts who were "unanimous on the harm there," while another warned of encouraging "young girls into body dysmorphia." Mosseri testified that Instagram initially moved to ban filters that distorted facial features but later revised its approach. Filters that explicitly promoted cosmetic surgery—such as adding scars typical of surgical procedures—were removed. Others that subtly altered facial features, like enlarging lips or slimming noses, were allowed but no longer actively recommended.

Growth Targets and Competitive Pressures

Lanier questioned whether such decisions were influenced by growth targets and competition in international markets. An internal email presented in court suggested banning certain filters could "limit our ability to be competitive in Asian markets." "I was never concerned with any of these things affecting our stock price," Mosseri said. Mosseri also confirmed his base salary is about $900,000 annually, though his total compensation, including bonuses and stock awards, can exceed $10 million in some years.

Background and Broader Implications

The trial comes years after whistleblower Frances Haugen released internal documents in 2021 indicating Meta was aware Instagram could negatively impact teen girls' mental health. That same year, Mosseri told lawmakers he supported stronger online safety regulations. Meta has rolled out new safeguards in recent years, including "teen accounts" with default privacy settings and content restrictions. The company has said it "strongly disagrees" with the allegations in Kaley's case.

Meta attorney Paul Schmidt argued in opening statements that Kaley's mental health challenges stemmed from difficulties in her personal life rather than Instagram use. He cited pretrial testimony from therapists who, he said, did not consider the platform central to her struggles. In a statement Wednesday, a Meta spokesperson said the evidence would show that Instagram "was not a substantial factor" in the plaintiff's mental health issues.

Kaley's legal team disputes that characterization. Attorney Matthew Bergman said Mosseri's testimony demonstrates executives "made a conscious decision to put growth over the safety of minors." The case is unfolding under the constraints of Section 230, the federal law that shields tech companies from liability for user-generated content. Judge Carolyn Kuhl has directed attorneys not to question Mosseri about specific content Kaley encountered on the platform, limiting the scope of testimony.

The outcome of the trial could have sweeping implications for the tech industry, as hundreds of similar lawsuits await their turn in court. This landmark case highlights ongoing debates about social media's role in youth mental health and corporate accountability in the digital age.