WASHINGTON (TNND) — Newly unsealed court documents are raising serious questions about what Instagram knew about harmful content on its platform, and when it knew it, particularly when it came to teens being exposed to posts related to suicide and eating disorders.
The documents were revealed as part of ongoing litigation in In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, a massive federal case involving several technology companies. Portions of the filings were recently made public, offering a rare look inside internal conversations at Meta, the parent company of Instagram.
According to a 56-page opposition brief filed in the case, the internal materials reveal a gap between Instagram's public safety commitments and its private discussions about how harmful content appeared on the platform.
Internal documents highlight teen exposure to harmful content
The newly released documents reference internal data indicating hundreds of thousands of mentions of suicide on Instagram, and acknowledge that certain types of harmful content had an outsized teen audience.
LOS ANGELES, CALIFORNIA – FEBRUARY 18: Lori Schott, wearing a picture of her daughter Annalee, who died by suicide after consuming social media content on depression, anxiety and suicide, stands outside the Los Angeles Superior Court at United States Court House on February 18, 2026, in Los Angeles, California. A 20-year-old California woman sued Meta and YouTube, accusing them of building addictive platforms that cause harm to children. Schott is not part of this case but has a separate social media case and came to advocate and raise awareness. (Photo by Jill Connelly/Getty Images)
In one company PowerPoint presentation cited in the filing and posted on CourtListener, Instagram employees wrote:
“Teens’ behavior on IG suggests a need for more support. We know that SSI (suicidal ideation) and ED (eating disorders) have a significantly disproportionate large teen audience.”
The presentation also reportedly noted that parents had asked the company for stronger tools to block harmful content from reaching teenagers, and suggested that competitors like TikTok were “seen as offering more safety measures.”
These internal findings contrast with statements made publicly by Instagram leadership in previous years. In 2019, Instagram head Adam Mosseri said the platform would block graphic self-harm content from appearing in searches, hashtags, and recommendations as part of safety improvements for young users.
Emails show concerns about media scrutiny
Other internal conversations that surfaced in the filings appear to show company employees discussing the potential public relations impact of media reporting on the issue.
According to an internal message referenced in court records, employees were discussing how harmful content surfaced in Instagram searches after a reporter from The Telegraph contacted the company in September 2020.
One internal comment cited in the filing reads: “On search we’re exposed with nowhere to hide.”
The email discussion also reportedly weighed whether restricting certain content in search results could conflict with other product priorities such as search functionality or shopping features.
A landmark social media trial underway in Los Angeles
The newly public documents come as a landmark civil trial unfolds in Los Angeles, where social media companies are facing allegations that their platforms were intentionally designed to be addictive to children and teenagers.
The lawsuit centers on claims brought by a 20-year-old woman named Kaley and her mother, who argue that several social media companies designed their platforms in ways that fueled compulsive use and contributed to serious mental health struggles, including an eating disorder, anxiety, and depression.
The companies named in the broader litigation include Meta, Snap, TikTok, and YouTube.
Snap and TikTok have settled some claims outside of court, while Meta and YouTube continue to contest the allegations.
Zuckerberg and Mosseri testify
Executives from major tech companies have been called to testify during the proceedings.
Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri have both appeared in court, defending the company’s efforts to address teen safety on its platforms, according to NPR.
During testimony reported by CNN, Mosseri acknowledged that extremely heavy social media use could be problematic for teens.
He said scrolling for as much as 16 hours per day could be “problematic,” but he argued it should not necessarily be considered “clinically addictive.”
Meta and other tech companies have repeatedly argued that scientific research has not conclusively proven that social media causes addiction or mental health disorders, though critics say platform design can intensify harmful behaviors.
What the case could mean for social media regulation
The trial could ultimately shape how social media platforms are regulated when it comes to younger users.
Some companies have already begun rolling out new safety measures, including age-based content filtering systems similar to movie ratings, aimed at limiting what types of posts can be recommended to minors.
But the central question now before the court remains the same one raised by many critics of the industry: Did social media companies fail to fix harmful algorithms — or did they choose not to?
—
If you or someone you know is struggling with thoughts of suicide, call or text the 988 Suicide & Crisis Lifeline.