Regulators in Europe and the US are paying attention to the harmful impacts of dark patterns. But the digital design features that make it difficult for users to control and manage their privacy and other consents have proved challenging to regulate.
In response, the European Data Protection Board (EDPB) has released guidelines on “Dark patterns in social media platform interfaces”. While the guidance is targeted toward social media platform providers, there are takeaways for all organizations – particularly since this guidance categorizes and provides key examples of the different types of dark patterns.
Dark Pattern Categories
The categories outlined by the EDPB are as follows:
Overloading
This refers to the practice of burying users under a mass of requests, information, options, or possibilities in order to deter them from taking certain actions and nudge them toward keeping or accepting a certain data practice.
Examples of overloading include:
- Continuous prompting – for instance, repeatedly asking users to provide data or additional consents, often in a manner that disrupts a user's experience on the website or platform. Offering arguments as to why they should provide the requested information or consents also falls under this category.
- Privacy mazes – which refers to designs that make it difficult for users to navigate a website or platform to find relevant information or to access privacy and account controls.
- Too many options – a practice that involves providing users with so many options to choose from that they struggle to identify or act on the ones that matter.
Skipping
Dark patterns that rely on skipping are designed to cause users to forget about, or overlook, data protection.
Examples of skipping include:
- Deceptive snugness – which refers to the practice of pre-selecting the most data-invasive features and options by default.
- Look over there – a practice that puts data protection information in competition with another design element, which may or may not be related to data protection. It’s essentially a tactic that distracts the user from protecting their data.
Stirring
This practice relies on emotional tactics and visual nudges to ‘stir’ users into taking (or not taking) certain actions.
Examples of stirring include:
- Emotional steering – which uses words and visuals to play to a user’s emotions. It covers both highly positive portrayals, which may make users feel safe, and highly negative ones which may cause users to feel scared or guilty if they don’t make a certain choice.
- Hidden in plain sight – a practice that involves implementing visual design elements that nudge users towards more invasive options.
Hindering
Dark patterns that rely on hindering tactics make the process of obtaining information or managing data difficult or impossible. (We also see this tactic in subscription control options in the US – a practice that is attracting enforcement attention from the FTC.)
Examples of hindering include:
- Dead ends – a practice that prevents users from finding information or managing their settings by implementing links that don’t lead anywhere or end at a 404 error page.
- Longer than necessary – for instance, designs that require users to click through multiple (unnecessary) pages or to take unnecessary steps to manage their privacy (or subscription) settings.
Fickle
Fickle interfaces are unstable and inconsistent, which makes it difficult for users to manage their personal information.
Examples of fickle dark patterns include:
- Lacking hierarchy – which occurs when information is not structured in a logical or easily digestible way. Often, information is presented repetitively, with the same information described in different ways, leaving users unable to understand and properly manage their data.
- Decontextualizing – this practice puts important privacy information and controls in locations that aren’t logical or intuitive, which makes it difficult for users to find the settings.
Left in the Dark
Websites, apps and platforms that rely on ‘left in the dark’ tactics design their interfaces in a manner that hides information or controls related to data protection.
Common examples of left in the dark patterns include:
- Language discontinuity – which refers to the practice of including important information about data protection in languages that are not the official language(s) of the country where users live.
- Conflicting information – a practice that relies on giving users contradictory or conflicting information about their ability to control their data. This often results in users taking no action and keeping the default settings.
- Ambiguous wording or information – which involves using ambiguous or vague language to describe privacy and data handling practices.
Again, while the guidance is specifically targeted toward social media platform providers, all organizations can use it to recognize and avoid these common dark patterns. We previously outlined the pitfalls of dark patterns and the benefits of avoiding them; that blog post explains in more detail why recognizing and avoiding the patterns outlined above is worth your while.
If you need assistance with developing and implementing good privacy and data security practices, reach out. Our attorneys would love to help.
The materials available at this website are for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this website or any of the e-mail links contained within the site do not create an attorney-client relationship between CGL and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of the firm or any individual attorney.