Opinion

Dark Patterns Are Not the Right Framework for Regulating Privacy Choices

Manipulative design techniques known as “dark patterns” are a hot topic in tech and consumer protection circles. Examples of dark patterns include websites and apps that sneak extra items into a consumer’s online cart, require users to navigate a maze of screens and confusing questions to avoid being charged for unwanted products, or trick users into unnecessary purchases or providing their personal information under a misleading premise.

These are user interface techniques intentionally designed to manipulate – subverting user decision-making and resulting in a harmful outcome for the consumer. Technologists and law professors are studying dark patterns, and on April 29, the Federal Trade Commission will hold a workshop to examine their impact.

Understanding dark patterns is essential for state and federal law enforcers who must keep up with scammers’ tricks and traps in order to prosecute and stop them. And the government has a responsibility to educate the public about the ways digital scammers try to snare them. But policymakers should resist the urge to legislate around the concept of dark patterns – existing authorities make new laws unnecessary. Worse, new laws could undermine legitimate companies’ good-faith efforts to educate their customers.

The FTC has substantial authority under Section 5 of the FTC Act to protect consumers against dark patterns, which are a modern form of deceptive and unfair practices. For example, in its recent enforcement action against Age of Learning Inc., the FTC exercised its Section 5 authority, in conjunction with the Restore Online Shoppers’ Confidence Act, to enforce against practices it called an “auto-renewal trap,” in which billing and subscription-cancellation practices combined to mislead and manipulate consumers. Many FTC advertising and fraud cases, in both the online and offline contexts, implicate practices that amount to consumer “traps,” without labeling them “dark patterns.”

In their paper “Shining a Light on Dark Patterns,” University of Chicago legal scholars Jamie Luguri and Lior J. Strahilevitz conclude that existing precedents suggest that “the law restricting dark patterns does not need to be invented; to a substantial degree [it is] already present.”

The FTC’s enforcement authority against deception and unfairness extends to manipulative design for the collection of consumer data, such as using misleading prompts to convince people to give up their personal information, or hiding pre-checked boxes that deny consumers a real choice.

When it comes to privacy choices, the commission and other policymakers should not only enforce the law against deceptive and unfair choice mechanisms, but also adopt public policy alternatives that lessen reliance on the notice-and-choice model altogether. To that end, the Network Advertising Initiative urges adoption of a federal privacy law that requires data-driven businesses to implement strong data protection and privacy practices and that prohibits a set of unreasonable data practices, rather than placing the burden on the consumers from whom data is collected. That is, some uses of data should be banned, and companies that are not responsible stewards of consumer data should face strong enforcement measures.

Our years of experience reviewing member practices and engaging in discussions with researchers and privacy advocates reveal that clear, easy-to-understand user interfaces are vital to providing proper notice and choice, but they can also be quite challenging to get right. It is important not to punish companies that are trying to provide information to consumers about how their data may be used.

Providing privacy notices regarding varying consumer data uses that are sufficiently descriptive, yet succinct and timely, is not a trivial exercise. While some users actively prefer to know more about what type of data is collected from them, and control whether and how data is used, others have less interest in managing their data sharing practices.

Regulators assessing potential policies around dark patterns should refrain from dictating specific practices or limiting businesses’ ability to communicate effectively with their users. Because consumer experiences and preferences vary so widely, it can be difficult to draw a clear line between practices that harm consumers and those that simply inform users that the free digital content they consume is supported by data-driven advertising, a model consistent with the preferences of many people.

The best way for lawmakers to clarify this distinction is to give the FTC more resources to enforce the law under its existing unfairness and deception authorities and to educate consumers and businesses, not to enact legislation specifically regulating the use of digital user interfaces. Industry self-regulation can promote best practices. Ultimately, advertisers, digital publishers and policymakers are all looking for effective models that deliver efficient interfaces without impairing user decision-making.


David LeDuc is vice president of policy for the Network Advertising Initiative, the leading self-regulatory association of third-party digital advertising companies committed to consumer privacy; NAI members commit to uphold an extensive and detailed Code of Conduct.

Morning Consult welcomes op-ed submissions on policy, politics and business strategy in our coverage areas. Updated submission guidelines can be found here.