Design ethics

Dark patterns at scale: what Amazon Prime teaches us about deception in design

Amazon will pay $2.5 billion after the FTC said it tricked people into signing up for Prime and made cancellation difficult. The case highlights dark patterns: design choices that benefit the business but work against users on a massive scale.
Jess Eddy 9 min read
TL;DR
Dark patterns are design tricks that steer people into choices they didn’t intend and make it difficult to opt out or cancel. The FTC’s $2.5B settlement with Amazon put them in the spotlight. They’ve been around for years, from Microsoft’s Windows 10 upgrade to LinkedIn’s “friend spam,” Ticketmaster’s hidden fees, Facebook’s privacy traps, and TurboTax’s fake “free” ads. The takeaway: design has real power, and it should be used to build trust, not erode it.

As reported by The New York Times, Amazon just agreed to pay up to $2.5 billion to settle claims that it tricked millions of people into signing up for Prime, then made it hard for them to cancel. The FTC called the practices deceptive. Designers know them by another name: dark patterns. The settlement is one of the largest in the history of the Federal Trade Commission, which sued Amazon two years ago.

The Federal Trade Commission has formally identified dark patterns as deceptive practices. In recent years, it has pursued cases against some of the world’s largest tech companies, making clear that manipulative design isn’t just poor user experience, it’s a consumer protection issue.

The case highlighted the tactics Amazon used to encourage people to sign up for Prime and the hurdles it put in their way when they attempted to cancel.

That’s the heart of dark patterns: intentional design choices that make it harder to do anything that cuts into revenue.

The FTC highlighted several of these tactics in Amazon’s Prime flows. The first is misdirection.

Tactic 1: Misdirection

Amazon leaned on a tactic as old as advertising itself: misdirection. When someone wanted to buy an item, the page presented a giant orange button promising “Get FREE Same-Day Delivery.” What most people didn’t realize was that clicking it also enrolled them in Prime. If you didn’t want Prime, the only way forward was a small, low-contrast text link buried underneath: “No thanks, I do not want FREE delivery.” By making the appealing option loud and obvious, and the real alternative quiet and easy to miss, Amazon nudged people into a choice they hadn’t intended to make.

Key characteristics
  • Goal: Steer attention toward actions that benefit the company while distracting from less profitable options.
  • Mechanism: Use color, size, placement, or animation to emphasize one choice and downplay another. Critical options, such as “opt out” or “cancel,” are made less visible or less intuitive.
  • Common use cases: Subscription sign-ups, cancellations, downgrades, software installations, and checkout flows.
Other examples
  • Software installation: Installers hide unrelated programs behind menus and checkboxes. Users who click out of habit often end up with extra software they never intended to have.
  • Airlines: Advertised fares are highlighted in bold, while taxes and fees are hidden until checkout.
  • Service downgrades: Companies may color downgrade or cancel buttons in bland, neutral tones while highlighting “keep” or “upgrade” options in bright, inviting colors.

Tactic 2: Roach motel

The “roach motel” pattern is simple: easy to get in, hard to get out. Signing up for Prime took a single click. Canceling, on the other hand, was a gauntlet of steps that slowed people down and introduced friction at every turn. What should have been a straightforward process became confusing and time-consuming, making it far more likely that people would give up and keep paying.

Key characteristics
  • Goal: Maximize retention by prolonging unwanted subscriptions.
  • Mechanism: Increase friction in the exit process, with more steps, distractions, and emotional nudges.
  • Common use cases: Subscription cancellations, downgrading services, deleting accounts, or unsubscribing from a mailing list.
Other examples
  • Spotify makes users press “cancel” four times.
  • The Boston Globe required calling a number hidden in the FAQ.
  • Ticketmaster once required customers to print and mail a form to cancel a Rolling Stone magazine subscription.

Tactic 3: Confirmshaming

Language matters. In this case, the opt-out was phrased as “No thanks, I do not want FREE delivery.” That copy isn’t neutral, it’s designed to make you feel foolish for saying no. It framed the user’s decision as irrational. Who doesn’t want something for free? This kind of subtle guilt-tripping is known as confirmshaming. It doesn’t remove choice, but it colors the choice with judgment, making people second-guess their instincts.

Key characteristics
  • Goal: Encourage users to comply by making the decline option seem shameful, irrational, or self-defeating.
  • Mechanism: Frame the negative choice using guilt-inducing or mocking language (often as a text link, or “manipulink”), while presenting the affirmative choice as enthusiastic or rewarding.
  • Common use cases: Pop-ups, subscription flows, cookie banners, newsletter sign-ups, and cancellation processes.
Other examples
  • Financial subscriptions: “Yes! Count me in!” vs. “No, I love being poor.”
  • Health products: Pop-ups with rejection phrased as “No, I’d rather bleed to death.”
  • Retail newsletters: “No thanks, I prefer to stay uninformed.”
  • Discount offers: “That’s okay, I like paying full price.”
  • Recipes/newsletters: “No thanks, I’m not interested in delicious recipes.”
  • Data privacy: Cookie prompts worded so that rejecting feels hurtful or antisocial, nudging users toward accepting.

Tactic 4: Obstruction

Even when customers reached the cancellation flow, the process didn’t end there. The FTC found that Amazon attempted to discourage people from canceling by presenting offers during the cancellation process. Instead of a clean exit, customers were diverted and slowed down, faced with choices designed to make them reconsider. This is obstruction in practice: adding friction so the path out feels harder than staying in.

Key characteristics
  • Goal: Encourage users to avoid canceling, opting out, or taking other actions that reduce revenue, data intake, or retention.
  • Mechanism: Add friction, hide or obscure options, and slow down the process through unnecessary steps or diversions.
  • Common use cases: Subscription cancellations, data privacy opt-outs, checkout flows, and account deletions.
Other examples
  • Prime enrollment declines: The FTC also alleged that Amazon made it difficult for shoppers to find the less-prominent “decline” link to avoid enrolling in Prime, and that the company had been aware of this issue since at least 2018.
  • Privacy opt-outs: Apps may require users to complete a survey before opting out of data sharing, which increases the likelihood that they will abandon the process.
  • Service downgrades/cancellations: Instead of showing clear cancel buttons, companies insert intermediate offers (e.g., “Switch to monthly payments” instead of canceling outright), which prolongs the process and confuses the user’s intent.

Deception at scale

It’s easy to shrug dark patterns off as “just business,” but the scale matters. Hundreds of millions of people shop on Amazon. When design works against users at that scale, even small deceptive nudges compound into enormous gains for the company at its customers’ expense. That’s why the FTC went after it, and why designers should pay attention.

The lesson isn’t just about Amazon, it’s about the power of design choices. A button label, a link, or a flow might seem small, but each can chip away at customer trust. To build products that people truly trust, we must design for them, not against them.

When companies lean on dark patterns, they also erode confidence in digital services more broadly. Over time, these tactics undermine the credibility of entire industries and cast doubt on UX itself as a discipline that should serve people.

A short history of dark patterns

The concept of “dark patterns,” also referred to as “deceptive patterns,” has a clear origin in the design community, although the practices themselves date back decades. These are user-interface patterns designed to trick people into doing things they might not otherwise choose, such as signing up for recurring bills or adding insurance to a purchase.

Inventor of the term

The term dark patterns was coined in 2010 by British UX designer Harry Brignull. Brignull, a London-based designer with a PhD in Cognitive Science and a background in psychology and human–computer interaction, coined the phrase on July 28, 2010, when he registered DarkPatterns.org. His idea was to build a “pattern library with the specific goal of naming and shaming deceptive user interfaces.”

https://www.deceptive.design

He later explained that the phrase originated from a practical need: he was preparing a conference talk. He wanted a catchy name for the library of deceptive design practices he was assembling, sitting right at the intersection of psychology and design.

Brignull defines a dark pattern as:

“A type of user interface that appears to have been carefully crafted to trick users into doing things that are not in their interest and is usually at their expense.”

In 2023, he published a book on the subject, Deceptive Patterns.

The roots of dark patterns

While the label is relatively modern, the practices it describes grew out of three long-running trends.

1. Deception and manipulation in retail

The retail industry has always played with perception. Some tactics are legal and normalized, like psychological pricing (pricing an item at $9.99 instead of $10). Others are unlawful, like bait-and-switch advertising. These manipulative habits set the stage for digital dark patterns.

2. The origins of nudging

Starting in the 1970s, research in behavioral economics uncovered how heuristics and biases influence decision-making. This led to the idea of nudging, shaping choices in ways that guide people toward outcomes that are supposedly in their best interest.

One famous example: automatically enrolling people in organ donation programs, since most stick with the default. But businesses didn’t just use nudges for good. They adopted the same techniques adversarially, to extract money, data, and attention from users rather than to help them.

3. Growth hacking

The most direct precursor to dark patterns was the rise of growth hacking. Growth hackers combined design, programming, and marketing to drive adoption. An early example was Hotmail’s viral strategy of appending “Get your free email with Hotmail” to every outgoing email.

As the web matured, growth hacking shifted focus: from rapid growth at all costs to maximizing revenue from existing users. Growth hackers leaned on two tools:

  • Behavioral influence principles borrowed from the nudge movement
  • A/B testing, which allowed them to rapidly test, optimize, and weaponize interface tweaks at scale

The convergence into dark patterns

Dark patterns emerged at the crossroads of these three trends. Designers applied behavioral insights and A/B testing not to improve experiences but to tilt decisions in favor of business goals, whether that was revenue, data collection, or attention.

The result: interfaces that appear to be helping but instead undermine informed choice and autonomy.

The scale of the problem

By the late 2010s, dark patterns had become widespread:

  • A 2019 study of 11,000 popular e-commerce websites found that 10% used deceptive practices.
  • A 2019 study of mobile apps showed that 95% of 240 Google Play apps contained dark pattern designs.
  • By 2022, a report found that 97% of popular apps used by EU consumers displayed dark UX patterns.

These numbers reflect not only the spread of deceptive tactics but also copycat behavior; once one company uses a manipulative design to extract more money, competitors tend to follow suit.

The opposite of this race to the bottom is ethical design. Where dark patterns make cancellation harder than signup, ethical patterns keep the two equally simple. Where dark patterns manipulate, ethical design respects choice. The difference may feel small in a single interaction, but at scale it shapes whether people feel respected or tricked.


Other famous examples

Microsoft Windows 10 upgrade

In 2016, Microsoft reprogrammed the familiar “X” close button to trigger a Windows 10 upgrade, a classic bait-and-switch tactic. Public backlash forced the company to reverse course.

LinkedIn “friend spam” lawsuit

In 2015, LinkedIn tricked users into granting email access, then sent spammy invitations to their contacts that appeared to be personal. The case ended in a $13 million settlement.

Ticketing services and hidden fees

Ticketmaster and others added hidden costs late in the checkout process and used opt-out traps, such as a default Rolling Stone subscription, which required mailing a cancellation form, a “roach motel” in practice.

Facebook’s “privacy zuckering”

Facebook collected phone numbers for security but used them for ads, and obstructed account deletion. Regulators fined Meta billions for deceptive privacy practices.

TurboTax deceptive advertising

Intuit advertised “free” tax filing, but most filers didn’t qualify. The FTC sued over the ads, and a parallel settlement with state attorneys general returned $141 million in refunds to millions of customers.


Ultimately, the Amazon case is more than just a lawsuit and a settlement. It’s a reminder that design choices carry real weight—billions of dollars, millions of users, and the trust of entire industries. The FTC called these practices deceptive. Designers know them as dark patterns.

The label matters less than the lesson: design can either respect people or exploit them. And the choice is ours.
