The Market That Ate Humans, and Itself: On predatory design, systemic exploitation, and the way out

Somewhere in Meta's servers, a slide deck sat marked confidential. It was written in 2019, and its conclusion was blunt: "Teens can't switch off from Instagram even if they want to." Seven years later, a Los Angeles jury read it into the record and found Meta and YouTube liable for designing addictive products.
It is perhaps one of the most striking examples of systemic, predatory design. When a market starts feeding on its users, we need systemic remedies.
When I practised competition law in BigLaw in the early 2000s, the race between competitors could turn ugly in familiar ways. Predatory pricing. Foreclosure. Killer acquisitions. Pay-for-delay… Practices that were prosecuted, eventually, and resolved with fines running into the hundreds of millions and the occasional structural remedy. The market, in theory, would be restored. Consumers would benefit: better products, more innovation, lower prices. That is the promise a market economy makes.
The internal documents produced through discovery in the K.G.M. v. Meta Platforms, Inc. and YouTube LLC case describe a different race entirely.
Predatory by design.
In 2016, Meta was losing ground against TikTok and Snapchat. That year, executives set the "overall company goal" as "total teen time spent" and made teens the "top priority in H1 2017". Why? An internal memo showed that 11-year-olds were four times as likely to keep coming back to Instagram as users of competing apps, despite the platform requiring users to be at least 13 years old. The logic was ruthlessly simple: the children who arrived youngest were the "stickiest" users. Hence the formulation that would later be read aloud in a Los Angeles courtroom: "If we wanna win big with teens, we must bring them in as tweens."
The company knew exactly what it was building. Internal research, later produced in discovery, found that teens described Instagram in terms of what the documents called an "addict's narrative": spending too much time indulging in a compulsive behaviour they knew was negative but felt powerless to resist. One set of slides, dated 2019, was blunter still: "Teens can't switch off from Instagram even if they want to" (Docket #2651, filed 1/20/26).
Employees knew it too. One internal message read: "Oh my gosh yall IG is a drug. I mean, all social media. We're basically pushers".
Meta had even commissioned a study called Project Myst, which surveyed a thousand teenagers and their parents about social media use. Its findings were unambiguous: children who had experienced adverse events, such as trauma or chronic stress, were the most vulnerable to addiction, and parental supervision and controls made little difference. The study was never published.
At its core, this "competition race" deliberately exploited children's cognitive weaknesses: the prefrontal cortex, the part of the brain responsible for impulse control and the weighing of long-term consequences, is not fully formed until the age of twenty-five. Children are therefore unable to resist impulses, especially when the stimuli are engineered.
Not exploiting raw material, infrastructure or even its own dominance. Exploiting humans. Users. Kids.
It is not, in fact, the first time we have internal evidence of exploitative design.
The Amazon case tells the same story at a different scale and with a different prey. In 2023, the Federal Trade Commission sued Amazon over its Prime subscription programme, alleging that the company had deliberately engineered its interface to trap consumers into memberships they had not chosen and could not easily escape. The complaint was not about misleading advertising or hidden fees. It was about design. Amazon had built its checkout process to exploit the way human attention works: a large, prominent button reading "Get FREE Two-Day Shipping" that enrolled users in Prime, and a small grey text link, easy to miss, to decline. On mobile, the price and auto-renewal terms were buried at the bottom of the page, visible only to those who scrolled. The company's own internal documents called this "misdirection" (source: FTC).
What makes the case remarkable is the paper trail on intent. Employees had been raising the alarm since 2016, warning leadership in emails, meetings and presentations that users were signing up for Prime without realising it. Those concerns were heard, considered, and set aside. An internal memo recorded the reasoning with striking candour: "clarifying" the process was not the "right approach" because it would cause a "shock" to business performance. The company had calculated, with precision, the revenue generated by each additional hurdle it placed in the user's path.
Cancellation was designed with the same logic applied in reverse. The process, which Amazon internally named the "Iliad Flow" after Homer's epic of the long and gruelling Trojan War, required users to navigate four pages, six clicks and fifteen separate options before reaching the exit. Customer service agents who answered cancellation calls had the technical ability to cancel subscriptions immediately. They were instructed instead to route callers back to the Iliad Flow online.
By Amazon's own internal accounting, thirty-five million consumers had been enrolled in Prime without their meaningful consent over seven years. The settlement reached in September 2025 cost the company $2.5 billion: a record to date. The mechanism it punished was straightforward: a company had studied how human attention, decision fatigue and cognitive friction work, and had built those vulnerabilities into the architecture of a product used by hundreds of millions of people. Predatory by design. Exploitative by design.
When human exploitation is systemic.
Exploitation of humans through online design is not new at all. The term "dark pattern" was coined by Dr. Harry Brignull (one of our senior advisors) back in 2010, to describe "tricks online that make you do things that you did not mean to". Since then, the field has grown into one of the most active research communities, producing hundreds of peer-reviewed articles, more than sixteen distinct taxonomies, and a formal ontology, all of which I patiently analysed in our small R&D Lab over three years. New scientific papers appear at the rate of several each week.
What that body of research has progressively mapped is the systematic weaponisation of cognitive science against the people interfaces are supposed to serve.
Online, all humans are vulnerable. Simply because we all have hundreds of cognitive biases, "mental shortcuts" that can lead us to make irrational decisions (Kahneman & Tversky, 1979). These biases make us predictable: all humans, for example, react in much the same way when faced with information overload or a risk of loss, which in turn makes us manipulable (Susser, Roessler, and Nissenbaum, 2019).
Dark patterns exploit these weaknesses and engineer around them: pre-ticking consent boxes to exploit our tendency to accept defaults; burying cancellation options behind multiple screens to exploit decision fatigue; using false scarcity and countdown timers to exploit our aversion to loss. The interface becomes an adversary, designed with greater knowledge of the user's psychology than the user has of their own.
And that is really not just theory: 97% of Europeans' favourite e-commerce sites contain dark patterns according to the European Commission, 76% in the US according to the FTC, 90% in Japan… UK consumers unwittingly paid £38.3 Bn in "Payment Protection Insurance" because of a pre-ticked box (source), and the European Commission estimates that online manipulation causes €7.9 Bn of harm to European consumers each year. That figure is necessarily understated, as the UK insurance example alone shows. The "dark patterns tax" is probably closer to €40 Bn at European level.
Addictive design takes this exploitation a step further, targeting not the moment of decision but the brain's reward architecture itself.
The foundational mechanism was established in neuroscience. Dopamine neurons respond not primarily to rewards received but to the uncertainty of whether a reward will arrive at all: the more unpredictable the outcome, the stronger the dopamine signal (Fiorillo, Tobler & Schultz, 2003). This is the same principle that makes slot machines the most addictive gambling device ever studied. The gambling industry had long weaponised variable reward timing to produce what researchers call the "ludic loop": a dissociative, self-reinforcing state of compulsive engagement from which disengagement becomes progressively harder (Schüll, 2012).
Digital platforms have replicated this architecture with greater precision and at incomparably larger scale.
Infinite scroll removes the natural stopping points that allow disengagement; algorithmic feeds withhold and then deliver content in unpredictable sequences; notifications arrive on the platform's schedule rather than the user's; and the pull-to-refresh gesture replicates, almost exactly, the physical act of pulling a slot machine lever.
None of these features arrived by accident. Adolescents are particularly vulnerable, caught in a dopamine cycle of desire induced by endless feeds, seeking and anticipating rewards in the form of likes and comments, which continuously reinstate the desired behaviour. Over time, the overactivation of the dopamine system reduces the capacity to experience pleasure from ordinary life, a hallmark of dependency that addiction medicine has documented across both substance and behavioural addictions (Lembke, 2021).
The distinction between deceptive design and addictive design matters, because the cognitive mechanisms being exploited are different in kind.
Deceptive design manipulates the moment of choice, inducing a decision the user would not freely make with full information and adequate time (Brignull, 2010). Addictive design goes further: it gradually erodes the capacity to choose at all, manufacturing a compulsion that overrides preference and eventually operates below the threshold of conscious awareness.
Both are human exploitation systems. Both currently affect roughly three quarters of humanity: the six billion people online.
The corruption of markets themselves.
I left the bar and competition law years ago. The lawyer, it turns out, never quite left me. And the lawyer in me cannot help but wonder: is there still such a thing as a market, as competition itself, when some of the largest companies in the world prey on the very people they are supposed to serve?
Markets require four conditions to function as such: consumers must be able to attend to alternatives deliberately rather than being captured by engineered stimuli; compare them on undistorted dimensions; form preferences that reflect their actual interests rather than manufactured compulsions; and switch freely rather than being locked in through biological dependency.
Digital platform exploitation, at its most extreme, systematically undermines all four. Infinite scroll and algorithmic amplification capture attention. Dark patterns distort comparison. Dopaminergic engagement loops manufacture compulsion. Addiction engineering eliminates effective switching.
The standard competition analysis reads market outcomes as signals: high prices signal market power; rising output signals competitive health; consumer switching signals competitive pressure. These signals are the epistemic instruments by which competition authorities detect harm and evaluate conduct.
Cognitive exploitation corrupts these signals systematically. When preferences are manufactured rather than revealed, output does not measure welfare. When addiction creates lock-in that mimics loyalty, switching rates do not measure competitive pressure. When compulsion overrides deliberation, revealed preference data does not reflect actual interests. The market signal mechanism that competition law relies on no longer functions in these markets.
Securities regulation offers a close analogy, and it is an instructive one. When a trader manipulates a financial market, the law does not treat the harm as a series of individual wrongs visited on the counterparties who transacted at an artificial price. It treats the harm as something done to the market itself, as a social institution on which everyone depends. The corrupted price no longer tells the truth. Everyone who relies on price signals to make decisions, including those who never transacted at all, is affected. The damage is structural, not transactional. That is why we have securities regulators with systemic powers, not just contract claims between aggrieved buyers and sellers.
Cognitive exploitation works by the same logic, at a much larger scale. When digital platforms systematically manufacture the preferences of billions of users, the harm is not simply that individual users make choices they would not otherwise have made. The harm is that the market itself loses its navigational instrument. Consumer preferences are the signals that are supposed to guide competition toward better products, lower prices, and genuine innovation. Corrupt those signals at the source, and the market no longer points anywhere useful. Instead, it optimises for the extraction of whatever value can be harvested from the gap between what users are made to "want" and what they would freely choose.
It is a structural failure of the competitive process, and it calls for the same systemic response that financial regulators developed when they recognised that price manipulation was a crime against the architecture of markets.
Now, dark patterns and addictive design breach an impressive number of laws, pretty much around the world. We have data protection law, consumer protection law, competition law, product liability law, the Digital Services Act, the Digital Markets Act. The Los Angeles verdict was won on products liability and negligence. The EU has fined Meta and Google billions. The regulatory apparatus is moving.
And yet the harm continues, at scale, by design, because none of these instruments offers an adequate, systemic response. While I truly hope that the rule of law will prevail and that our legal systems are robust enough to fight systemic threats to human freedom, agency, dignity and health, I believe we also need human safety tech.
The premise is simple, and it is the same premise that underlies every safety regime we have ever built for powerful technologies. Products that interact directly with human cognitive and psychological systems, with the capacity to alter mood, attention, belief, desire, and self-perception at population scale, are not neutral utilities. They are interventions in human experience. And interventions powerful enough to cause systematic harm have always required, in every domain where we have taken them seriously, a framework of minimum safety standards, independent assessment, and ongoing monitoring that sits outside the control of the people who profit from the intervention.
The corollary follows directly. The burden of proof falls on the platform, not on the victim. The question is not whether a harmed user can demonstrate that a specific product caused specific damage. The question is whether the company can demonstrate, before deploying its product to billions of people, that the product is safe. That is the standard we apply to drugs. It is the standard we apply to medical devices. It is the standard we apply to the structural integrity of aircraft. It should apply to systems that have been deliberately engineered to rewire the brain's reward architecture and human decision-making.
The tobacco litigation took forty years to produce its master settlement agreement. We do not have forty years. The experiment is ruining the developing nervous systems of billions of children, simultaneously, at the speed of AI.
The harm dark patterns and addictive design are causing is neurological. It is epistemic. It is social and democratic. And it is, the evidence now shows, by design. We need an antidote, now.
Marie Potel-Saville is the Co-Founder of FairPatterns, a member of the Support Pool of Experts on Dark Patterns at the European Data Protection Board and the Paris Chair of Women in AI Governance. She practised competition law for 10 years (Freshfields, Allen & Overy) before becoming VP Legal EMEA in a US-listed group and then a human-centric AI entrepreneur.

