June 24, 2022

Privacy dark patterns

Fifth in a six-part series on digital privacy, data collection, and the erosion of informed consent.

A cookie banner appears. There are two options. “Accept all” is a large button in the site’s brand color. “Manage preferences” is a small gray link, almost the same shade as the background. If you click it, you arrive at a panel with dozens of toggle switches, most of them pre-set to “on,” organized under headings like “legitimate interest” and “performance cookies” that mean nothing to a non-specialist. Turning them all off requires individual clicks, one by one, across multiple categories. Accepting everything requires a single click.
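
The asymmetry can be made concrete. The sketch below is illustrative only, a TypeScript model with invented category names, counts, and defaults rather than a reproduction of any real banner, but it tallies the interaction cost of each path:

```typescript
// Illustrative model of the two paths through a consent panel.
// Category names, counts, and defaults are invented for this sketch.

interface ConsentCategory {
  id: string;
  label: string;
  defaultOn: boolean; // state of the toggle when the panel first opens
}

const categories: ConsentCategory[] = [
  { id: "necessary", label: "Strictly necessary", defaultOn: true },
  { id: "performance", label: "Performance cookies", defaultOn: true },
  { id: "functional", label: "Functional cookies", defaultOn: true },
  { id: "targeting", label: "Targeting and advertising", defaultOn: true },
  { id: "legitimate-interest", label: "Legitimate interest", defaultOn: true },
];

// Path one: a single click on the prominent "Accept all" button.
const clicksToAcceptAll = 1;

// Path two: open the preferences panel, switch off every pre-selected
// optional toggle individually ("necessary" typically cannot be
// disabled), then confirm with a save button.
const optionalToggles = categories.filter(
  (c) => c.defaultOn && c.id !== "necessary"
);
const clicksToRejectAll = 1 + optionalToggles.length + 1;

console.log(clicksToAcceptAll); // 1
console.log(clicksToRejectAll); // 6
```

Under these invented numbers, accepting costs one click and declining costs six. The exact counts vary from site to site; the direction of the asymmetry rarely does.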

This is not an accident of design. It is a dark pattern.

What dark patterns are

Harry Brignull coined the term in 2010 to describe interface designs that nudge users toward behavior that serves the company’s interests at the expense of the user’s own. The concept has roots in behavioral economics and choice architecture: the understanding that how options are framed determines, to a significant degree, which option people choose. Dark patterns apply this understanding adversarially, structuring choices so that the path of least resistance leads to the outcome the designer wants (Brignull, 2010).

I have built interfaces. I understand the decisions that go into button placement, color contrast, and information hierarchy. Every designer knows that the primary action should be the most visible element on the screen. Every designer also knows that what counts as the “primary action” depends on whose interest you are designing for. When the primary action is “accept all cookies,” the interface is designed for the data collector, not the user. The visual hierarchy says so, even if the privacy policy does not.
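
To see how directly the hierarchy encodes that choice of interest, consider a hypothetical pair of style definitions (all values invented) for the two actions from the banner described at the top:

```typescript
// Two renderings of the same pair of choices. All values are invented,
// but the contrast pattern matches the banner described above.

const acceptAllStyle = {
  background: "#1a73e8", // the site's brand color: maximum salience
  color: "#ffffff",
  fontSize: "16px",
  fontWeight: 700,
  padding: "12px 32px",
  borderRadius: "6px",
};

const managePreferencesStyle = {
  background: "transparent",
  color: "#b0b0b0", // a shade or two away from the page background
  fontSize: "11px",
  fontWeight: 400,
  padding: "0",
  textDecoration: "underline",
};

// Both controls are technically available. Only one is designed to be seen.
```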

Where regulation meets design

In the context of privacy, dark patterns operate at the precise point where regulation was supposed to return control to the user. Cookie consent mechanisms, mandated by legislation like the GDPR, were intended to give people a genuine choice about data collection. In practice, as Hausner and Gertz (2021) document, many implementations do the opposite: they use visual hierarchy, pre-selected options, confusing language, and asymmetric effort to steer users toward accepting the maximum possible data collection. The regulation required a question. The design ensured the answer.
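
One variant of the pre-selected-options pattern deserves a sketch, because it defeats even a diligent user. In some consent panels, each processing purpose carries two independent bases, and the visible toggles govern only one of them; the “legitimate interest” headings mentioned at the top often mark the second. The fragment below is a hedged illustration with hypothetical purpose names, not a reproduction of any particular panel:

```typescript
// Hedged sketch of a pre-selected-options pattern: each purpose has two
// independent processing bases, and the toggles the user is shown
// usually govern only the first. Purpose names are hypothetical.

interface Purpose {
  name: string;
  consent: boolean;            // the toggle the user is shown
  legitimateInterest: boolean; // pre-set to on, often on a separate tab
}

const purposes: Purpose[] = [
  { name: "Personalised advertising", consent: true, legitimateInterest: true },
  { name: "Audience measurement", consent: true, legitimateInterest: true },
  { name: "Content performance", consent: true, legitimateInterest: true },
];

// A diligent user finds and switches off every visible consent toggle.
for (const p of purposes) {
  p.consent = false;
}

// Processing continues for any purpose with at least one active basis.
const stillActive = purposes.filter((p) => p.consent || p.legitimateInterest);
console.log(stillActive.length); // 3: every purpose remains active
```

In a panel built this way, saying no through the advertised mechanism changes nothing unless the user discovers the second basis and objects to it separately.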

The consent model’s failure

The underlying problem is the informed consent model itself, or rather, its implementation. Both Warner and Sloan (2012) and Nissenbaum (2011) identify issues with the transparency-and-choice framework that sits at the core of current privacy regulation. Warner and Sloan argue that the model fails because declining data collection carries real costs: loss of access to services, degraded functionality, and time spent navigating opt-out mechanisms. The harms of accepting, by contrast, remain vague and abstract. The cost of saying no is immediate and concrete. The cost of saying yes is diffuse and deferred. This asymmetry is not an oversight in the system. It is the system.

Nissenbaum goes further, questioning whether the way these choices are framed can produce anything that deserves to be called freely chosen consent. If the options are structured so that one requires a single click and the other requires twenty, if the language describing one is clear and the language describing the other is deliberately opaque, if the default is set to maximum data collection and the user must actively override it, then the form of choice exists without the substance. The user has been given a right and then placed in an environment engineered to prevent them from exercising it.

Consent without understanding

The evidence supports their skepticism. Bornschein et al. (2020) found that the design of cookie notices significantly affects user behavior, with more prominent notices leading to higher rejection rates, suggesting that most acceptance reflects default behavior rather than deliberate choice. This finding alone should give pause: if the acceptance rate changes based on the design of the notice rather than the content of the policy, then what is being measured is not consent but compliance with a visual prompt.

Kulyk et al. (2018) found widespread confusion among users about what cookie disclaimers actually mean, with many participants unable to articulate what they had consented to moments after consenting to it. The consent was given. The understanding was not. And the system depends on that gap. If every user who clicked “accept all” actually understood what they were agreeing to, and if the process of declining were as simple as the process of accepting, the data collection industry would look very different.

Web browsers allow users to reject cookies, cookie notices exist to offer a choice, and privacy policies are meant to explain what happens to collected data. The mechanisms of informed consent are formally in place. The problem is that each of these mechanisms has been designed, or redesigned, to produce a predetermined outcome: the user can say no, but the system has been built to ensure they almost never do.

The irony of empowerment

There is a particular irony here that is worth naming. The regulation that was supposed to empower users has, in practice, created a new layer of interaction design that the data collection industry has optimized as aggressively as it optimizes everything else. The cookie banner has become the consent equivalent of a parking lot laid out so that every lane funnels traffic toward the exit the owner prefers. The infrastructure of choice exists. The architecture of that infrastructure has been designed to produce a specific outcome.

Legal but manipulative

The legal status of these practices remains ambiguous. Some recent regulatory proposals in the United States have begun to address deceptive design in privacy interfaces, and the GDPR itself has been interpreted by some data protection authorities as prohibiting the most egregious dark patterns. But enforcement is uneven and slow, and the line between permissible persuasion and impermissible manipulation is not well defined in law. Dark patterns largely remain within the bounds of what is legal, occupying a space where the manipulation of autonomous choice is difficult to prove in court even when its effects are clearly measurable.

The manipulation works precisely because it is subtle. A blatant refusal to honor a user’s preferences would be illegal. A design that makes honoring those preferences so burdensome that most users give up is, for now, permissible. The distinction between these two outcomes, from the user’s perspective, is negligible. The distinction, from a legal perspective, is the entire defense.

What dark patterns reveal is that the question of privacy is not only a question of regulation or technology. It is a question of design ethics, and specifically of what happens when the people designing the interface and the people using it have opposing interests in the same interaction.

References

  • Bornschein, R., Schmidt, L., & Maier, E. (2020). The effect of consumers’ perceived power and risk in digital information privacy: The example of cookie notices. Journal of Public Policy & Marketing, 39(2), 135–154. https://doi.org/10.1177/0743915620902143
  • Brignull, H. (2010). Dark Patterns: User interfaces designed to trick people. https://darkpatterns.org/
  • Hausner, P., & Gertz, M. (2021). Dark patterns in the interaction with cookie banners. arXiv preprint arXiv:2103.14956.
  • Kulyk, O., Hilt, A., Gerber, N., & Volkamer, M. (2018). “This website uses cookies”: Users’ perceptions and reactions to the cookie disclaimer. Proceedings of the 3rd European Workshop on Usable Security. https://doi.org/10.14722/eurousec.2018.23012
  • Nissenbaum, H. (2011). A contextual approach to privacy online. Daedalus, 140(4), 32–48. https://doi.org/10.1162/DAED_a_00113
  • Warner, R., & Sloan, R. H. (2012). Behavioral advertising: From “One-Sided Chicken” to informational norms. Vanderbilt Journal of Entertainment and Technology Law, 15(1), 49.