How Dark Patterns Manipulate Internet Users Every Day

  • Interdisciplinary Centre for Security, Reliability and Trust (SnT)
    10 October 2022
  • Category: Research

As humans, we make hundreds of decisions a day – and we like to think that we make them freely and autonomously. In truth, however, we are easily influenced. Marketers and vendors have used influencing strategies for as long as they have existed, but such tactics now live on the digital platforms we interact with every day – and we might not even notice their effect on our behaviour online.

The experience we have with digital platforms is shaped by the companies that produce them, and by the designers who materialise their vision. So, when we face a decision – whether to subscribe to a service, accept or reject cookies, or share content with our network – the choice is ours, but a preferred option is often emphasised. Even the colour of a button can measurably affect whether someone presses it, so it is little wonder that one button often appears in a bright colour while the less preferred option sits in grey beside it. And this is far from the only form of influence – have you ever subscribed to a service that proved far more difficult to leave than it was to join? These practices, known as dark patterns, are being widely exposed as manipulative tactics designed to trick or mislead users into taking decisions that benefit a service rather than serve their own interests.

“The concept of dark patterns reaches far beyond the colour of buttons,” said Dr. Arianna Rossi, a research scientist in the Socio-technical Cybersecurity (IRiSC) research group at the University of Luxembourg’s Interdisciplinary Centre for Security, Reliability and Trust (SnT). Manipulation tactics are, in fact, everywhere – from overloading us with information and choices, to pre-selecting options on our behalf, to playing on our emotional heartstrings. “We often think that children or elderly people would be the most vulnerable, but none of us can be shielded from these tactics, as they’re playing on cognitive biases and bounded rationality that we’re all subject to – it’s in our human nature,” she continued.

Cookie banners, for example, are often full of dark patterns that make it easy to accept data-intrusive settings but hard to reject them. We rarely give much thought to the consequences of accepting cookies – after all, we visited the website to engage with its content, and in that moment this is our primary concern. Underlying the decision, however, are the privacy-invasive implications of extensive tracking of our online activities and of the data inferred about us – including sensitive information, such as health status and sexual orientation, and many other insights that can be derived from our navigation habits, some of which perhaps even we ourselves are not aware of.

So why hasn’t this issue been tackled sooner? It is only now that we are starting to see the real consequences of online manipulation. “In the last few years, we have seen a surge of digital platform empires, with millions or even billions of users – therefore their influence affects a great part of the global population,” said Dr. Rossi. “What is also now coming to light is how much these manipulations can be personalised to the individual internet user, and geared to their characteristics and weaknesses,” she continued. Even the very definition of dark patterns is open to interpretation. “Although some practices are illegal already, many websites and applications use them at scale – and with no current mechanisms on how to tackle them effectively, or clearcut distinctions of what constitutes illegitimate or manipulative tactics, the problem will continue to persist,” she explained.

This creates an unequal playing field online, where we are exposed both to websites that follow an ethical (and legal) path in their design and to ones that do not. The latter can benefit from the extensive collection of personal data or from aggressive marketing tactics. In a project funded by the Fonds National de la Recherche, entitled Deceptive Patterns Online (DECEPTICON), researchers from the IRiSC research group, the Human-Computer Interaction (HCI) group within the Faculty of Humanities, Education and Social Sciences (FHSE), and the Luxembourg Institute of Science and Technology (LIST) are collaborating to examine the effects of dark patterns on internet users and to establish a set of multidisciplinary criteria and tools to reliably identify and avoid dark patterns in the design of digital products.

From SnT, the team comprises Dr. Rossi, alongside Prof. Gabriele Lenzini, head of the IRiSC group and principal investigator on the project, Dr. Maria Botes, a research associate, and Emre Koçyiğit, a doctoral researcher; together they provide expertise in cybersecurity, online privacy and data-protection compliance. Meanwhile, researchers from HCI – Prof. Vincent Koenig, serving as vice-PI, Lorena Sanchez Chamorro, a doctoral researcher, and Dr. Anastasia Sergeeva and Dr. Kerstin Bongard-Blanchy, both post-doctoral researchers – are examining the effects of website designs on internet users and studying how various stakeholders in the data economy, including UX designers, perceive online manipulation. From LIST’s Reliable Distributed Systems team, Philippe Valloggia provides expertise in data-protection assessment engineering and international privacy standards.

The project is set to run for several more years, but the team has already conducted a large-scale online survey to establish why dark patterns work: is it because people are unaware of them, because they cannot recognise them when they encounter them, or because they cannot resist their influence even when they do? The researchers created a game in which participants had to spot existing manipulative designs in online interfaces, some harder to recognise than others – including some so deceptive that only experts could detect them. The study showed that people under 40 years of age and educated to Bachelor level were the most likely to identify dark patterns, which demonstrates more concretely that some segments of the population are less able to defend themselves than others. The researchers have also proposed a multi-layered intervention space with various solutions for tackling dark patterns, from raising users’ awareness to large-scale automated detection tools. The road to freeing the web from these and other manipulative practices is still long, but interdisciplinary research can give us the means to succeed.