Concept
Dark Patterns
UI designs deliberately engineered to trick users into actions they did not intend — subscribing to services, sharing more data, or making purchases they would have declined if presented clearly. The term was coined by UX designer Harry Brignull in 2010. Research has found dark patterns on 1,254 of the 11,000 most-visited websites, and in 95% of popular apps under a broader definition that includes misleading framing and artificial urgency. They are not design failures; they are design successes optimised for the wrong objective. The EU Digital Services Act and the US FTC have begun treating them as a regulatory category, marking the first serious legislative acknowledgement that interface design can constitute consumer harm.
Dark patterns are user interface designs deliberately engineered to trick users into taking actions they did not consciously choose — signing up for recurring subscriptions, surrendering more personal data than intended, making accidental purchases, or finding it impossible to cancel services they no longer want. The term was coined in 2010 by UX designer Harry Brignull, who built a taxonomy and public archive of documented instances. The framing was precise and intentional: these are not design failures. They are design successes optimised for the wrong objective.
The scale of deployment is not marginal. A 2019 Princeton and University of Chicago study found dark patterns present in 11% of the 11,000 most-visited websites — and in 95% of the most popular apps when the definition was expanded to include softer manipulations such as misleading framing and artificial urgency. A parallel EU study reached similar figures. Dark patterns are not edge-case malpractice; they are a dominant design paradigm.
Brignull's original taxonomy identified a dozen recurring types; a few illustrate the range. Roach motels make it easy to enter a commitment and nearly impossible to leave — subscription cancellation flows that require a phone call, a specific time window, and several confirmation screens are the canonical example. Confirmshaming labels the opt-out option with language engineered to produce discomfort: "No thanks, I prefer to pay full price." Disguised ads render paid placements indistinguishable from editorial content or search results. Trick questions use double negatives or ambiguous phrasing so that the default action — doing nothing, or answering the question as naturally read — produces the outcome the company wants. Misdirection draws attention to one element of a screen precisely to prevent attention to another.
What unites these patterns is an inversion of the foundational purpose of user interface design. Good UX removes friction between a user's intention and its execution. Dark patterns insert friction strategically — or remove it — not in the user's interest but against it. The craft skills are identical; the objective function has been reversed.
The psychological mechanisms being exploited are well-documented. Default effects — the tendency of people to accept pre-selected options — are among the most replicated findings in behavioural economics. Cognitive load effects mean that a sufficiently complex cancellation flow will cause most users to abandon the attempt regardless of their preference. Framing effects mean that "95% fat free" and "5% fat" describe the same product but produce different purchasing decisions. Dark patterns are applied behavioural economics, deployed commercially against the very people the field set out to understand.
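The cognitive-load claim can be made concrete with a toy model (all numbers here are hypothetical, chosen only for illustration): if each additional screen in a cancellation flow independently loses a fixed fraction of users, completion decays geometrically with the number of steps — a flow need not block anyone outright to defeat most of them.

```python
# Toy model (hypothetical numbers): each screen in a flow independently
# loses a fixed fraction of the users who genuinely want to complete it.

def completion_rate(steps: int, drop_per_step: float) -> float:
    """Fraction of users finishing a flow of `steps` screens, assuming
    each screen loses `drop_per_step` of those who reached it."""
    return (1.0 - drop_per_step) ** steps

# One-click cancellation vs. a six-screen "roach motel", assuming
# (hypothetically) 15% of users abandon at each screen.
for steps in (1, 3, 6):
    print(f"{steps} step(s): {completion_rate(steps, 0.15):.0%} complete")
```

The point of the sketch is the shape of the curve, not the specific numbers: under any per-step abandonment rate, added screens compound, which is why friction is inserted as steps rather than as a single hard barrier.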
The regulatory response has accelerated. The EU's Digital Services Act, which became fully applicable in 2024, explicitly prohibits dark patterns for large platforms, requiring interfaces to be "not designed, organised or operated in a way that deceives or manipulates recipients of the service." The US Federal Trade Commission issued enforcement guidance treating certain dark patterns as unfair or deceptive practices under existing consumer protection law, and brought an action against Amazon's Prime subscription cancellation flow — internally codenamed "Iliad" after the epic poem, a name that itself acknowledged the journey's designed difficulty. Norway's Consumer Council published a landmark 2018 report documenting Facebook's privacy settings as a systematic dark pattern operation, which contributed directly to subsequent GDPR enforcement.
The individual-level implications mirror those of the attention economy: the problem is structural, not personal. A user who fails to notice a pre-checked box, or who gives up on a cancellation flow, is not exhibiting a character defect — they are responding predictably to an interface engineered to produce that response. The practical counter-strategy is the same one that applies to any asymmetric environment: change what you expose yourself to rather than trying to resist in the moment. Subscribing through channels that make cancellation observable (credit card virtual numbers with spending controls, or services that send renewal reminders) moves the friction back to a moment when deliberate choice is possible.
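The renewal-reminder idea can be sketched minimally (all subscription names and dates below are hypothetical): keep renewal dates in one place and surface the ones due within a warning window, so the keep-or-cancel decision happens at a deliberate moment rather than by silent default.

```python
# Minimal sketch (hypothetical subscriptions and dates): flag renewals
# that fall within a warning window, moving the keep/cancel decision
# to a moment when deliberate choice is possible.

from datetime import date, timedelta

subscriptions = {
    "news-site": date(2025, 7, 1),
    "streaming": date(2025, 6, 10),
    "cloud-storage": date(2026, 1, 15),
}

def due_soon(renewals: dict, today: date, days: int = 14) -> list:
    """Names of subscriptions renewing within `days` of `today`."""
    cutoff = today + timedelta(days=days)
    return sorted(name for name, when in renewals.items()
                  if today <= when <= cutoff)

print(due_soon(subscriptions, today=date(2025, 6, 1)))
```

A virtual card number with a spending cap serves the same structural purpose: it converts a renewal from something that happens unless resisted into something that happens only if approved.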
Key Figures
Harry Brignull
UX designer, coined the term and built the original dark patterns taxonomy in 2010
Morten Machholm
Norwegian Consumer Council, led the 2018 Deceived By Design report on Facebook and Google
Lior Strahilevitz
Legal scholar, developed the framework for treating dark patterns as consumer law violations
Further Reading