
A jury just ruled that addictive design harmed a child

Published on March 26, 2026

Kaley started watching YouTube at 6. She opened an Instagram account at 9. By her early teens, she was dealing with severe depression, body dysmorphia, and suicidal thoughts. She testified in court that social media had become so consuming that she stopped spending time with her family altogether. She described constantly comparing herself to other users, relying on beauty filters, and seeking out ways to buy likes.

Last week, a Los Angeles jury of five men and seven women delivered its verdict after more than forty hours of deliberation: Meta and Google-owned YouTube were negligent in designing their platforms. The companies were ordered to pay $6 million in total damages, split equally between compensatory and punitive awards, with Meta bearing 70% of the responsibility.

It is the first time a jury has decided that tech companies bear liability for the mental health harms their platforms cause to young users.

The design, not the content

What made this case legally unusual was the instruction the jury received before deliberating: ignore the content. The posts, videos, and feeds Kaley encountered were not up for debate, partly because Section 230 of the 1996 Communications Decency Act shields platforms from liability for user-generated content. So the plaintiff's team built their case entirely around the architecture of the apps themselves.

Their argument, led by attorney Mark Lanier, centered on what he called "the engineering of addiction." Four specific product features were placed under the microscope; a brief sketch of how the first two work in code follows the list.

Infinite scroll. Feeds designed to never naturally end, removing any moment where a user might simply choose to stop.

Autoplay. Content that advances on its own, bypassing the small friction of actively deciding to keep watching.

Algorithmic recommendations. Systems optimized for time-on-platform rather than user wellbeing, particularly dangerous for children whose sense of identity is still forming.

Constant notifications. A persistent stream of alerts pulling users back to the app, even after they'd put it down.
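None of these mechanics is exotic engineering. As an illustration only (not Meta's or YouTube's actual code), here is a minimal TypeScript sketch of how the first two, infinite scroll and autoplay, are commonly wired into a web client. The element IDs, the /api/feed endpoint, and the nextVideoUrl helper are hypothetical:

```typescript
// Illustrative sketch: typical infinite-scroll and autoplay wiring in a web
// client. All names here are hypothetical, not any platform's real code.

// Infinite scroll: a sentinel element sits at the bottom of the feed. When it
// scrolls into view, another page is fetched and appended, so the feed never
// reaches a natural end where the user might choose to stop.
const sentinel = document.querySelector<HTMLElement>("#feed-sentinel")!;
let nextCursor: string | null = "start";

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting || nextCursor === null) return;
  const res = await fetch(`/api/feed?cursor=${nextCursor}`); // hypothetical endpoint
  const page: { items: string[]; nextCursor: string | null } = await res.json();
  for (const itemHtml of page.items) {
    sentinel.insertAdjacentHTML("beforebegin", itemHtml); // grow the feed in place
  }
  nextCursor = page.nextCursor; // the server can always point at one more page
});
observer.observe(sentinel);

// Autoplay: when one video ends, the next starts without any user input,
// removing the small friction of actively deciding to keep watching.
declare function nextVideoUrl(): string; // hypothetical: supplied by a recommender

const player = document.querySelector<HTMLVideoElement>("#player")!;
player.addEventListener("ended", () => {
  player.src = nextVideoUrl();
  void player.play();
});
```

Notice what is absent: neither handler waits for the user to ask for more. That absence is precisely what the plaintiff's team framed as a design choice.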

The jury was asked to evaluate whether these features, taken together, constituted a defective design that harmed the plaintiff. It concluded that they did.

What the companies argued

Meta and Google pushed back hard. Meta pointed to Kaley's turbulent home life and to mental health struggles that predated her social media use, and its attorneys noted that none of her therapists had ever documented social media as a cause of her condition. YouTube took a different angle, arguing that it functions more like a television platform than a social media site, and cited internal data showing Kaley spent only about a minute per day on YouTube Shorts.

The plaintiff's legal team didn't have to prove that social media was the sole cause of her suffering. They only had to show it was a "substantial factor." The jury agreed that it was.

One juror, speaking to reporters outside the courtroom, said Meta CEO Mark Zuckerberg's shifting testimony hadn't sat well with the group.

Why the dollar amount isn't really the point

Six million dollars is not a number that frightens companies of this scale. Meta's market capitalization alone exceeds a trillion dollars. The verdict's significance lies in what it establishes legally, and in the roughly 2,000 pending lawsuits now watching from the wings.

This case was selected as a bellwether, chosen specifically to calibrate how similar cases across California might proceed. A federal trial involving claims from school districts and parents nationwide is scheduled to begin this summer in Oakland. Legal experts have drawn comparisons to the tobacco litigation of the 1990s, which didn't kill the industry but did fundamentally change how it operated and what it was permitted to do with young audiences.

The timing adds weight to all of this. The day before the Los Angeles verdict, a New Mexico jury ordered Meta to pay $375 million for enabling child sexual exploitation on Instagram and Facebook. Two juries, two states, one week. The legal exposure for social media companies around the treatment of minors is accumulating faster than at any point since these platforms launched.

The broader question

For those working on platform design, digital safety, and the ethics of user experience, this verdict confirms something researchers and advocates have argued for years: certain design patterns are not neutral. Infinite scroll and autoplay weren't built for the user's benefit. They were built to reduce the likelihood that users would leave, and they work. Whether that constitutes negligence when the user is a child whose brain is still developing is a question a jury just answered.
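To make that distinction concrete, a deliberately simplified, hypothetical feed ranker might look like the sketch below. No real platform's ranker is this crude, but the shape of the objective, maximize predicted watch time and nothing else, is what "optimized for time-on-platform rather than user wellbeing" means in practice:

```typescript
// Illustrative only: an engagement-maximizing feed ranker. The types and
// fields are hypothetical; the point is what the objective does and does
// not reward.
interface Candidate {
  id: string;
  predictedWatchSeconds: number; // a model's guess at how long this item holds attention
}

// Rank purely by predicted time-on-platform. The user's age, wellbeing, and
// intent to stop appear nowhere in this objective.
function rankFeed(candidates: Candidate[]): Candidate[] {
  return [...candidates].sort(
    (a, b) => b.predictedWatchSeconds - a.predictedWatchSeconds,
  );
}

// Example: the item predicted to hold attention longest always surfaces first.
const feed = rankFeed([
  { id: "a", predictedWatchSeconds: 12 },
  { id: "b", predictedWatchSeconds: 95 },
]);
console.log(feed[0].id); // "b"
```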

Meta and YouTube both plan to appeal, and that process will take time. But the legal landscape for platforms built around engagement mechanics targeting young users has shifted in a way that will be hard to reverse.

If the design was the problem, that's also where any real solution has to begin.
