
Designed to Addict Children: What the Meta and YouTube Trial Actually Revealed

Published on March 27, 2026
A California jury awarded $6 million in damages against Meta and YouTube after finding that both companies deliberately built their platforms to addict children. What made this case different from years of prior criticism was that the internal documents came out in court.

Social media companies have spent years fielding criticism about what their platforms do to young people, and for most of that time the response has been fairly consistent: parents are responsible, the platforms are neutral, the research is inconclusive. This trial was harder to dismiss. Through the discovery process, Meta and YouTube were required to hand over internal memos, retention data, and records of their own internal debates. A California jury spent nine days hearing that material read back in a courtroom.

This was the companies' own thinking, documented at the time decisions were being made.

What the documents showed

Document 1: The strategy memo

One internal Meta document read: "If we wanna win big with teens, we must bring them in as tweens." The framing was striking in its plainness. The goal was not to build something useful for young people; it was to reach them before they developed preferences, or the judgment to form them.

Document 2: The retention data

A separate internal memo showed that 11-year-olds were four times as likely to return to Instagram as to competing apps. The platform's own minimum age requirement was 13 at the time. Someone had run the numbers and noticed that the youngest users were the stickiest, and the platform carried on as it was.

Document 3: The beauty filter decision

When Meta was deciding whether to allow filters that alter users' physical appearance, its own employees raised concerns, as did 18 external experts who were brought in to assess the risks. The company considered that input and launched the filters regardless. The trial put the paper trail of that decision in front of the jury.

Document 4: What leadership knew

Senior executives took the stand, and documents were produced showing that people at the top of both companies had been briefed on evidence that their platforms were causing harm to children. The question the trial posed was not whether they knew, but what they chose to do with that knowledge.

The architecture of compulsion

Running through all of it is a set of product decisions that researchers and designers have written about extensively. Individually, each one has a plausible product rationale. Taken together, they describe something more deliberate: a platform built to make stopping harder than continuing.

  • Infinite scroll removes the natural stopping point that pagination once provided. There is no page 2 to decide whether to click.
  • Autoplay eliminates the small moment of friction where a user might decide to stop. The next video begins before the choice to pause can form.
  • Algorithmic recommendations are optimised for engagement, not for the user's wellbeing. The system learns what keeps people watching, not what leaves them feeling good.
  • Constant notifications create a continuous pull back to the platform. For a child, they function as a behavioural drip-feed.

The trial did not introduce these features as novel discoveries. What it did was ask, with documents in hand, whether the companies understood the effect these design choices had on younger users specifically. The answer, based on the retention data and the internal debates that came out in discovery, appeared to be yes.
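The engagement-versus-wellbeing distinction in the list above can be made concrete with a small sketch. The Python below is purely illustrative, not any company's actual ranking code; the field names, the "regret" signal, and the penalty weight are all assumptions. It shows only how the choice of objective function decides what gets surfaced next.

```python
# Hypothetical illustration only; not Meta's or YouTube's actual ranking code.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_seconds: float  # model's estimate of how long the user will keep watching
    predicted_regret: float         # hypothetical wellbeing signal, 0.0 (fine) to 1.0 (harmful)

def rank_for_engagement(candidates):
    """Pure engagement objective: surface whatever keeps the user watching longest."""
    return sorted(candidates, key=lambda c: c.predicted_watch_seconds, reverse=True)

def rank_with_wellbeing(candidates, regret_weight=300.0):
    """Same ranker with a wellbeing penalty; the weight is an arbitrary illustrative choice."""
    return sorted(
        candidates,
        key=lambda c: c.predicted_watch_seconds - regret_weight * c.predicted_regret,
        reverse=True,
    )

if __name__ == "__main__":
    feed = [
        Candidate("calm_tutorial", predicted_watch_seconds=90, predicted_regret=0.1),
        Candidate("outrage_clip", predicted_watch_seconds=240, predicted_regret=0.8),
    ]
    # The engagement-only objective puts the stickier, higher-regret clip first;
    # the penalised version flips the order.
    print([c.video_id for c in rank_for_engagement(feed)])   # ['outrage_clip', 'calm_tutorial']
    print([c.video_id for c in rank_with_wellbeing(feed)])   # ['calm_tutorial', 'outrage_clip']
```

Nothing in the trial record suggests either company's ranking systems look like this; the point is simply that "what keeps people watching" and "what leaves them feeling good" are different objectives, and only one of them shows up in retention metrics.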

The verdict

After 44 hours of deliberation across nine days, the jury found negligence, malice, and foreseeable harm, and awarded $6 million in damages. The dollar figure drew attention, but the more significant part of the outcome was the legal theory that held. Platforms have historically been shielded from liability for third-party content under Section 230. This case was built around design decisions rather than content, and the jury accepted that framing.

"They buried their own research showing children were being harmed, and used kids and society as guinea pigs in massive, uncontrolled, and wildly profitable experiments." CEO, Common Sense Media, speaking after the verdict

What comes next

One verdict does not change an industry, and Meta and YouTube have the resources to appeal. But the legal theory that worked here is worth paying attention to. Previous attempts to hold platforms accountable tended to get blocked because they involved content. This case was about the product itself: the scroll behaviour, the notification systems, the recommendation engine. Those are engineering decisions, made by people, documented in writing.

That documentation is now part of the public record. Companies building consumer products for young audiences will have seen this trial. Whether that changes how those products get designed, or simply changes how carefully those design discussions get written down, probably depends on what follows.
