Dark Patterns and Design Policy


  • Dark Patterns and Design Policy - Data & Society : Points
    https://points.datasociety.net/dark-patterns-and-design-policy-75d1a71fbda5

    These are what designer Harry Brignull calls “dark patterns,” a term he coined in 2010 to describe design choices that trick users into making decisions they normally wouldn’t make. Dark patterns can be purposeful or accidental, though it’s often hard to determine the intent behind them. For example, when social networks bury their security settings, is that an intentional dark pattern meant to suppress privacy, or were designers and engineers simply unsure where to place the buttons so users could find them? Is the price confusion on marketplaces like Amazon intentional, or, again, just accidental bad design? (In 2019, a few US senators introduced legislation to ban some forms of dark patterns, though the bill ultimately failed.)

    What matters here is not unpacking direct intent, but rather looking at the outcomes and harms of these design decisions and then creating policy that accounts for them.

    What is design policy?

    “Design policy” looks at the role of design in products and software—how designers “make things”—and then analyzes how design terms and the structure of design operate in relation to policy and technology. User experience design is what makes technology usable and accessible. For example, iOS follows flat design principles for mobile, and Android follows material design. Most digital design follows the principles of human-centered design, a methodology created in the 1990s that focuses on user experience. These methodologies are how, for example, a person in Croatia can pick up an app designed in India and understand how to use it.

    If policy is, roughly, a set of rules that outline and/or govern how things work, then design policy is the act of using design to make policies around software and hardware understandable. It recognizes that design affects technology, including how technology looks and feels to consumers, and what consumers can ‘see’ or know about technology.

    This has changed in recent years: today, across many industries, companies use digital design practices that harm consumers, erode privacy, and undermine competition. Dark patterns have only recently received attention in the general press, even though designers have known about them for years. Rieger points out that this could be because dark patterns have started to cause more serious, widespread damage; for instance, when filing taxes or in implementations of the European Union’s General Data Protection Regulation (GDPR).

    Germany’s Network Enforcement Act, or NetzDG, which took full effect in 2018, requires platforms to let users report illegal content, specifically online hate speech. However, the interface for reporting hate speech was buried and confusing for users; a blog post had to illustrate how to access the reporting mechanism and then how to use it.

    Rieger and I argue that this is a dark pattern, since the reporting form was difficult to access and the design of the flow itself was confusing to users. User interfaces for these kinds of features need to be viewed through a lens of ‘searchability,’ or access to information: a feature that is buried or difficult to find is, in practice, inaccessible to many users.

    GDPR asks: how do we design for user consent and transparency in products and in ad tracking? GDPR poses what we could label a ‘wicked design problem’—a design problem that seems hard or impossible to solve—since it must address privacy settings and cookies, create consent-focused flows that users can understand, and provide ways to opt in and opt out, all without radically slowing users down (which would cause frustration).

    Since its passage, the GDPR has been radically misinterpreted across platforms and news media websites. Below are a few examples from the websites of Le Monde, Vice Germany, Harper’s Bazaar, The Daily Beast, and The Atlantic.

    Harper’s Bazaar’s initial GDPR settings (Figure 9), which offer only an ‘accept’ button, represent a design that some argue is likely illegal. This is not a consensual choice: users either select ‘yes’ or forgo reading the website. The May 2020 examples from Le Monde, Vice Germany, Harper’s Bazaar, and The Daily Beast could also be considered dark patterns because users lack clear options to reject cookie tracking. Harper’s Bazaar does a slightly better job once the user clicks into “cookie settings,” but it takes an extra step to get there.

    Design policy in practice

    It’s time for policy teams, think tanks, independent research institutes, and research labs to engage with designers in the same ways that technologists and engineers are being welcomed into research, policy, and academia. Design can unintentionally shift, mask, and hide policy, as well as the intentions of policy.

    When we set out to regulate tech, let’s remember that technology employees at a granular level—not just the monolith of the company—are the ones interpreting that regulation, whether or not they realize it. If researchers writing policy recommendations aren’t thinking about how a designer will take what we write and implement it in a specific way, then our research and recommendations aren’t producing the intended impact. Designers can take the directive of “tell people about cookies and give them consent options” and create many different things, which is why we see so many different implementations of GDPR. But imagine if legislation specified that “the choices need to be equal in hierarchy and legible for all audiences.” That tells a designer a different story.

    #Design #Dark_patterns #RGPD