WASHINGTON (TNND) — Some of the world’s biggest tech companies are facing landmark trials that could reshape the social media industry as juries decide whether their platforms are addictive and harmful to young people.
The first in a series of trials began Tuesday in a state court in Los Angeles with jury selection in a case claiming a young woman developed serious mental health problems because of social media platforms designed to foster addiction. It is a first-of-its-kind trial that could pave the way for thousands of similar lawsuits brought by other individuals, school districts and state attorneys general.
The trial will be a key test for social media companies, which have largely escaped legal scrutiny thanks to Section 230, a federal law that shields platforms from liability for content generated by their users. The lawsuit, brought by a young woman identified as KGM, says she became addicted to social media as a child and suffered mental health problems as a result.
“This is more about the design patterns and the features, that’s what makes this really novel,” said Adam Peruta, an associate professor and director of Syracuse University’s Advanced Media Management program. “The question is — do platforms genuinely reduce those patterns, or are they just superficial friction?”
Meta CEO Mark Zuckerberg and Instagram chief Adam Mosseri are expected to testify during the trial, which is projected to run six to eight weeks. TikTok reached a settlement to avoid the trial just hours before jury selection began Tuesday in California. Snap, the parent company of Snapchat, settled last week, leaving Meta and YouTube as the only defendants.
Judges have bundled additional cases from the thousands of lawsuits already filed, to be heard once KGM’s trial concludes. A second set of federal cases will go to trial in Oakland, California, where school districts and states argue social media is a public nuisance that passes the costs of treating addictive use onto the public.
KGM’s lawsuit claims social media companies created features like infinite scrolling, algorithmic recommendations and auto-play on videos to increase time spent on platforms. Those features have led to mental health issues like depression, anxiety, eating disorders and self-harm, according to the lawsuit.
Endless content loops and similar features have also become more common across platforms, particularly since TikTok’s surge in popularity during the COVID-19 pandemic. Recommendation algorithms are engineered to identify a particular user’s interests and keep feeding them matching content to watch.
“We have seen the social media platforms heavily lean into this over the past five to seven years, with infinite scrolling and tailoring the algorithm specifically to you,” Peruta said. “They’re definitely part of the user experience that makes people spend more time with those products.”
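The pattern Peruta describes, inferring a user’s interests and continuously serving matching content, can be illustrated with a minimal sketch. The Python below is a hypothetical simplification written for this story, not any platform’s actual ranking code; every name here (Post, score_post, next_page) and every weight is invented for illustration.

# Hypothetical illustration of an engagement-ranked "infinite" feed.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    topic: str
    predicted_watch_seconds: float  # model-estimated time on screen

def score_post(post: Post, interests: dict[str, float]) -> float:
    # Weight each post by the user's inferred affinity for its topic,
    # multiplied by how long the model predicts they will watch it.
    return interests.get(post.topic, 0.01) * post.predicted_watch_seconds

def next_page(pool: list[Post], interests: dict[str, float], size: int = 5) -> list[Post]:
    # An "infinite scroll" feed: every call returns a fresh page of the
    # highest-scoring posts, so there is no built-in stopping point.
    return sorted(pool, key=lambda p: score_post(p, interests), reverse=True)[:size]

interests = {"gaming": 0.9, "cooking": 0.3}  # inferred from past behavior
pool = [Post(1, "gaming", 45.0), Post(2, "news", 30.0), Post(3, "cooking", 60.0)]
for post in next_page(pool, interests):
    print(post.post_id, post.topic)

Tuned at scale with real engagement data, a loop like this surfaces whatever keeps each user watching longest, which is the dynamic the lawsuits target.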
The lawsuit has drawn comparisons to the Big Tobacco and opioid litigation that produced massive settlements, requiring companies to pay out billions and accept marketing restrictions.
Lawyers for the plaintiffs argue their clients are victims of deliberate choices the companies made to prioritize profits over the wellbeing of their users. Big Tech companies have faced those accusations for years as outrage among lawmakers and parents has grown, but despite repeated pushes to get legislation signed into law, no new regulations restrict how teens can interact with their products.
“Plaintiffs are not merely the collateral damage of Defendants’ products,” the lawsuit says. “They are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.”
Tech companies have disputed those claims for years, pointing to splashy investments in parental controls, content moderation and other tools added to their platforms as pressure has mounted. But parents and researchers have questioned the effectiveness of tools that can be easily bypassed or that go unused by parents who aren’t aware they exist.
Scrutiny of Big Tech’s practices toward underage users has steadily picked up steam in recent years, as internal documents uncovered in lawsuits and congressional investigations indicated companies were aware of the harms their products were causing but opted not to act. Whistleblowers have also come forward and told Congress about a pattern of prioritizing profits over safety.
Those concerns are gaining new momentum with the rise of AI chatbots, many of them built by the same tech companies, and with questions over whether federal limits should govern how young people can use them.
In a blog post published last week, Meta highlighted its child protection policies and said it has made numerous changes that cost it revenue and were unpopular among its teenage users for the sake of protecting them. It also disputed claims that social media companies are directly to blame for teen mental health struggles, arguing “this oversimplifies a very serious issue.”
“The plaintiffs’ lawyers will try to paint an intentionally misleading picture of Meta with cherry-picked quotes and snippets of conversations taken out of context. The full record will show a company that has consistently put teen safety ahead of growth for over a decade,” the company wrote.
YouTube has sought to distance itself from the case by arguing that it is not a social media platform but, at its core, a streaming service.
Individual plaintiffs are seeking compensation for pain and suffering they say was caused by the platforms. School districts and state attorneys general are also seeking monetary damages, along with court orders requiring the platforms to change their business practices and delete certain data and the algorithms that power feed recommendations.