You’ve likely seen a call to action somewhere online designed to get you to choose the less privacy-centric (or more data-intrusive) option:
- A pop-up window with two buttons: one saying “Subscribe to save 10%” (which requires you to provide your email address) and the other “No, I don’t like discounts”.
- An unsubscribe button with the note “Ten kittens will cry if you [Unsubscribe]”.
- A “Get it free” button, which transports you to a page asking for your credit card details (and, in some cases, your social security number).
This is ‘Dark UX’ in practice.
First Things First: What is UX?
UX, or “User Experience”, refers to the ease with which a person interacts with a product, system, or service. Apple products are known for their extremely good UX – meaning most end-users of Apple’s products find them easy and enjoyable to use.
Privacy UX delves deeper into the user experience by emphasizing transparent and user-friendly interactions related to a user’s personal data. It seeks to ensure that users not only understand and easily control how their data is used but also trust the entities handling their information.
Dark UX, on the other hand, refers to design strategies that prioritize business goals at the expense of the user’s best interests. These strategies can often mislead, confuse, or manipulate users into taking actions they might not have taken otherwise.
“Dark Patterns” are a part of Dark UX. They are deceptive design tricks that intentionally guide users toward decisions that benefit the business, usually at the user’s expense. For instance, a dark pattern might make it very easy to sign up for a recurring charge but convoluted and/or time-consuming to cancel that same charge. (The Federal Trade Commission has warned businesses against this practice.)
For clarity: Dark UX and Dark Patterns are intertwined in that the former often employs the latter to achieve its goals (manipulating the user).
Why Does Dark UX Exist?
Dark UX (and the use of Dark Patterns) can deliver short-term gains for businesses – these tactics have been shown to increase subscriptions and revenue. But employing them is short-sighted: they also erode trust, damage brand reputation, and frustrate users in the long term.
Moreover, as users become more aware and regulatory bodies clamp down, businesses might find themselves facing backlash or legal consequences for such designs. We’re already seeing the FTC hand out significant penalties for the use of dark patterns.
Implementing Privacy UX
The bottom line is that privacy-centric UX is an easy and relatively cheap way to reduce the risk posed by the growing number and influence of privacy laws coming into effect across the US and worldwide. It is also an effective way to build trust, promote loyalty, and avoid the negative impacts of Dark UX.
Here are some tips to implement Privacy UX:
Explain Why Consent Will Benefit the User.
Consent is a crucial part of the privacy puzzle, and a crucial element of compliance too. But we often see consents bundled together, and even lumped with terms of use (these should be separated).
Privacy UX prioritizes asking for consent at the right time, with the right information. A good example is a newsletter sign-up banner that is compelling, interesting, and privacy-focused: it explains clearly what the user gets (a subscription to the newsletter), but it lets the user assess whether they want it before handing over any personal information – for example, by offering the chance to read recent newsletters first.
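The idea of unbundled, purpose-specific consent (kept separate from terms of use) can be sketched in code. This is a minimal illustration under assumed names – not a real consent-management API: each processing purpose is recorded on its own, defaults to "not granted" (no pre-ticked boxes), and can be withdrawn as easily as it was given.

```typescript
// Illustrative sketch: each processing purpose gets its own opt-in,
// recorded separately from acceptance of the terms of use.
type Purpose = "newsletter" | "analytics" | "personalized_ads";

interface ConsentRecord {
  purpose: Purpose;
  granted: boolean;
  timestamp: Date; // when the user made the choice
}

class ConsentStore {
  private records = new Map<Purpose, ConsentRecord>();

  // Consent defaults to "not granted" — there are no pre-ticked boxes.
  grant(purpose: Purpose): void {
    this.records.set(purpose, { purpose, granted: true, timestamp: new Date() });
  }

  // Withdrawing consent is as simple as granting it.
  withdraw(purpose: Purpose): void {
    this.records.set(purpose, { purpose, granted: false, timestamp: new Date() });
  }

  hasConsent(purpose: Purpose): boolean {
    return this.records.get(purpose)?.granted ?? false;
  }
}

const consents = new ConsentStore();
consents.grant("newsletter"); // user opted in to the newsletter only
console.log(consents.hasConsent("newsletter"));       // true
console.log(consents.hasConsent("personalized_ads")); // false — never bundled in
```

The design choice worth noting: because each purpose is stored independently, agreeing to one thing (the newsletter) can never silently opt the user into another (personalized ads).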
Similarly, Introduce More Intrusive Privacy Settings at the Right Time.
This is something we see mobile developers get wrong all the time. Most mobile apps seek extremely privacy-intrusive permissions from the outset (access to contacts, the camera and microphone, and location are common culprits). This is a turn-off for many users.
Privacy UX, on the other hand, would introduce these settings (and ask for consent to collect the data) when (and only if) the user wants to access functionality those settings impact.
Imagine a flashlight app that asks for access to your location the moment you download it. Would you install the app – or would you distrust it because of the intrusive request?
What if you later found out the app had optional functionality that automatically turns the flashlight off after five minutes during daylight hours, to stop the battery from draining if the light is accidentally left on? It needs access to your location (and time zone) for this, and it asks for permission – and explains why it’s needed – only when you turn that feature on. That’s much better, right?
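The flashlight scenario above can be sketched as a "just-in-time" permission check. This is a hedged illustration (the feature name, rationale, and function are all hypothetical, not any platform’s real permissions API): the app computes which permissions to request only at the moment the user enables the feature that needs them, and pairs the request with an explanation.

```typescript
// Hypothetical sketch of just-in-time permission requests: the app asks for
// location access only when the user enables the feature that needs it.
type Permission = "location" | "camera" | "contacts";

interface Feature {
  name: string;
  requiredPermissions: Permission[];
  rationale: string; // shown to the user alongside the request
}

const autoOff: Feature = {
  name: "Daylight auto-off",
  requiredPermissions: ["location"],
  rationale:
    "We use your location and time zone to detect daylight hours and turn " +
    "the flashlight off after five minutes to save your battery.",
};

// Returns the permissions still needing a request right now — empty if the
// feature is off, or if everything it needs is already granted.
function permissionsToRequest(
  feature: Feature,
  enabled: boolean,
  granted: Set<Permission>
): Permission[] {
  if (!enabled) return []; // never ask before the user wants the feature
  return feature.requiredPermissions.filter((p) => !granted.has(p));
}

console.log(permissionsToRequest(autoOff, false, new Set())); // [] — no upfront ask
console.log(permissionsToRequest(autoOff, true, new Set()));  // ["location"]
```

The point of the sketch: at download time the feature is off, so the function returns nothing to request – the intrusive prompt simply never appears until the user asks for the functionality that justifies it.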
Prioritize Simplicity and Clarity
Finally, Privacy UX is most powerful when it empowers users to understand. There are a few elements to this:
Clear, Simple Language
The language you use to share privacy-related information is critical to user understanding. Prioritize clear, simple language, devoid of jargon, when explaining privacy settings and options.
Immediate Feedback
If a user changes a privacy setting, you should offer immediate feedback showing that the change has been made (where possible).
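As a minimal sketch of this idea (the setting name and function are illustrative, not a real API), a setting change can return a confirmation for the interface to display immediately, so the user never wonders whether their choice took effect:

```typescript
// Illustrative sketch: apply a privacy-setting change and immediately
// confirm it back to the user.
interface SettingChange {
  setting: string;
  value: boolean;
  confirmation: string; // feedback shown to the user right away
}

function applyPrivacySetting(setting: string, value: boolean): SettingChange {
  // ...persist the change here (e.g., an API call) before confirming...
  return {
    setting,
    value,
    confirmation: `"${setting}" is now ${value ? "on" : "off"}.`,
  };
}

console.log(applyPrivacySetting("Share usage data", false).confirmation);
// "Share usage data" is now off.
```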
Easy Access to Privacy and Account-Level Controls
Users should be able to easily access and change their privacy settings and account details, including the option to close their account and/or delete their data.
Reach out to discuss your company’s privacy practices. Our attorneys would love to help.
Disclaimer
The materials available at this website are for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this website or any of the e-mail links contained within the site do not create an attorney-client relationship between CGL and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of the firm or any individual attorney.