I first learned about dark UX patterns when I worked at Blizzard Entertainment, where our UX team fought endlessly to thwart any experience that could be remotely perceived as a dark pattern, since these patterns are so prevalent in the games industry. Using some of those learnings as my foundation, I led an interactive session on UX dark patterns at The Rise of Privacy Tech's Virtual Summit in June alongside my colleague Simon, the Director of Design here at Ethyca. I'll take you through the highlights here, but to see a recording of the session (with all the audience engagement!), check out their YouTube channel in the coming weeks!
In 2010, UX researcher Dr. Harry Brignull coined the term "dark patterns" to refer to:

"A user interface that has been carefully crafted to trick users into doing things… they are not mistakes, they are carefully crafted with a solid understanding of human psychology, and they do not have the user's best interests in mind."
For a simple example, consider an ecommerce platform where you decide to buy a new airhorn (I go through a few annually!). At checkout, you see that another item — a $20 airhorn insurance plan — has appeared in your basket, even though you never added it. This is a toy example, of course. But the underlying issue of manipulative design in the user’s digital experience is what drives dark patterns: a phenomenon finally getting the attention it needs.
As I'll describe in the next section, dark patterns can take a variety of shapes. Once you start seeing dark patterns, it will be difficult to stop, and that's a good thing. A decade after the term was coined, regulators are finally confronting the serious dangers of dark patterns: they can manipulate consumers into overspending, and they can infringe on consumers' right to make informed privacy decisions. The next year will be pivotal for companies to shrewdly identify and root out dark patterns in their own UX, or else face fines for non-compliance: frameworks like the CCPA (as amended by the CPRA) and the GDPR already effectively prohibit consent obtained through dark patterns.
Dark patterns are not exclusive to data privacy, but they can be especially impactful when they mislead users as they navigate privacy controls. From Simon’s and my experiences designing respectful UX, we want to share this primer on dark patterns, with actionable steps to improve your company’s privacy experience.
Dr. Brignull’s website, darkpatterns.org, identifies twelve types of dark patterns, and I encourage you to check out all of them. For just a taste, we’ll focus on three here:
This dark pattern occurs when a user encounters obscure or indirect terms that require a close reading to make sense of the actual meaning. However, this is the internet, where speed is the name of the game. Users often skim online material, and they miss important data controls. Crucially, the users are not at fault. The flaw is at the design level, where data controls are made more complex than they need to be. For instance, some dark patterns reverse the intuitive checked-box-means-opt-in convention to mean the exact opposite: you have to opt in to the opt-out. Make sense? Yeah, not really.
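To make that double negative concrete, here's a minimal sketch in TypeScript (the interface and field names are hypothetical, not taken from any real consent library) contrasting a clear opt-in flag with the inverted "opt in to the opt-out" pattern:

```typescript
// Clear design: the checkbox state maps directly onto consent.
// checked === true means the user has opted in to marketing emails.
interface ClearConsentForm {
  marketingEmailsOptIn: boolean; // checked = yes, email me
}

// Dark pattern: the checkbox controls an *opt-out*, so ticking it
// produces a double negative that reverses the intuitive meaning.
// checked === true means "do not exclude me", i.e. the user IS subscribed.
interface TrickConsentForm {
  doNotExcludeFromMarketing: boolean; // checked = you are opted in
}

// Working out what the user actually agreed to now requires the kind
// of negation bookkeeping that a skimming user will get wrong:
function isSubscribed(form: TrickConsentForm): boolean {
  // reads like an opt-out, acts like an opt-in
  return form.doNotExcludeFromMarketing;
}
```

A user who leaves the "do not exclude" box checked, assuming a checked box always means "I agreed to something I can see", ends up subscribed without ever making a deliberate choice, which is exactly the confusion this trick relies on.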
Similar to my airhorn example earlier, this dark pattern involves a manipulation of the customer journey, where the customer literally gets more than they bargained for, unless they dig into the details and correct the automatic addition to their cart. Sometimes, though, removing the unwanted item isn't even an option.
Whether in privacy controls or elsewhere, users might encounter messages dissuading them from withholding their data. In other words, the messaging tries to force users into sharing more information, even in instances where users have the legal right to choose not to.
In our session, we explored three privacy scenarios where dark patterns are particularly prevalent. We also discussed best practices to promote respectful privacy UX. Incorporating audience input, here are the highlights.
First, we looked at one of 2021's most talked-about pieces of privacy UX: mobile device tracking under Apple's new iOS 14.5 update. The update included a measure called App Tracking Transparency, which requires all third-party apps (though not first-party apps, I should add) to obtain explicit opt-in consent from a user prior to tracking that user. Abstracting away the technical details, let's focus on the UX. In our session, we discussed whether there were dark patterns in the messaging presented to users. Audience members pointed out how "Ask App Not to Track" is much more circuitous than a simple "Don't Allow". It's also unclear who is getting the better ad experience: me, the end user, or the advertiser?
Key Take-Away: If your mobile product requests a user’s consent, focus on clear and concise copywriting in the request. Make sure that you are presenting privacy controls fairly, not relying on oblique language or confirm-shaming to coerce a particular outcome.
Next, we discussed the ways in which companies implemented the CCPA’s “Do Not Sell My Personal Information” requirement. We reviewed websites that required a user to provide personal information in order to opt out of their personal information being used in a marketing campaign. A user could actually be providing more information in an effort to protect their information! In a textbook example of the “trick question” dark pattern, some websites upended user expectations with the simple configuration of a toggle.
Key Take-Away: Evaluate the process by which users need to opt out of activities like data sales, and ensure that the process does not rely on competing calls to action.
Finally, we looked at the fan favorite: the cookie banner. Don't get me wrong: a properly implemented cookie banner can be a key piece of good privacy UX. But we've all visited websites where the cookie notice lacks the logical responses to a yes/no request, offering only an "Accept" button and "More Options." Our audience made keen observations on our examples of cookie banners: the "Agree" button is more visually engaging; it's unclear if a user can actually use the service if they select "I do not agree"; and cookie banners can be plain hard to read from a font perspective.
Key Take-Away: Beyond concise copy, make sure that your UX is accessible. Instead of implementing UX tricks to coerce users, present their options in a way that allows them to make their own decisions about their data.
As with data privacy as a whole, education is vital. Defeating dark patterns begins with understanding what shapes they can take in the messaging on your website or app. The wider community is just now coming to grips with the prevalence of dark patterns, and you can do right by your users by actively assessing and improving your privacy UX. It's much better than the alternative: assuming you are compliant, then facing regulatory fines under the growing web of privacy regulations and losing your users' trust in the process.
As Simon mentioned in our session, there is a rich history of marketing and psychology research behind the most compelling UX, some of which is truly outstanding and some of which is flat-out manipulative. That same degree of intentionality and thought must go into building respectful UX that empowers users to confidently exercise control over their own privacy.