[Quiz] Will you be able to recognize dark patterns?

Written by Alixe Peraldi (designer)


09 January 2026


The LINC published Apparences Trompeuses, a website featuring awareness tests designed to train users to recognize dark patterns and to learn how certain practices push us to share more personal data than we initially intend.

 

Could you recognize a “Look over there” pattern? Spot an invasive default setting when creating an account? Make your way through a “Privacy maze”, or resist “emotional steering” when deleting your data? These deceptive designs are embedded in everyday interfaces and user journeys. Through 20 scenarios, learn to identify interface and user journey design techniques that can affect your decisions about protecting your personal data.

 

A short awareness test (around 10 minutes), with resources to better understand dark patterns.

 

A constantly evolving topic

 

In 2024, the implementation of the European Digital Services Act intensified ongoing questions surrounding design practices and the use of digital interfaces. Article 25 states: « Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions ».

In other words, the regulation directly addresses a phenomenon that is already well known and widely discussed: dark patterns. The LINC began working on the topic as early as 2019.

 

What is a dark pattern?

 

When dark patterns are mentioned, the first things that come to mind are usually commercial practices: pre-selected options, labyrinthine purchasing paths, or misleading pricing displays. These misleading commercial practices are governed by consumer protection law. A significant portion of them also directly involves personal data, aiming to steer, or even compel, users to share information, influence their consent, and hinder privacy-protective choices. This legacy of consumer society was already outlined in our 2019 article (in French): « Dark patterns : quelle grille de lecture pour les réguler ? ».
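The pre-selected option is perhaps the simplest of these patterns to make concrete. A minimal sketch, purely illustrative (the interface and names below are hypothetical, not taken from the article or any real service), of how a pre-ticked "share my data" box turns inaction into recorded consent:

```typescript
// Hypothetical consent-form state; field names are illustrative only.
interface ConsentChoices {
  shareBrowsingData: boolean;
}

// Deceptive default: the data-sharing box arrives pre-ticked, so a user
// who skims past the form "agrees" without ever acting.
function preselectedForm(): ConsentChoices {
  return { shareBrowsingData: true };
}

// Neutral default: nothing is shared unless the user takes an
// affirmative action.
function neutralForm(): ConsentChoices {
  return { shareBrowsingData: false };
}

// Simulate a rushed user who clicks "Continue" without touching the form:
// the defaults silently become the recorded "choice".
function submitUntouched(form: ConsentChoices): ConsentChoices {
  return form;
}

const recordedDark = submitUntouched(preselectedForm());
const recordedFair = submitUntouched(neutralForm());
// recordedDark.shareBrowsingData is true despite no user action;
// recordedFair.shareBrowsingData stays false until the user opts in.
```

The only difference between the two forms is the default value, which is exactly why this pattern is so effective: the user who does nothing produces opposite outcomes depending on a single design decision.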

Since then, in its guidelines adopted in February 2023, the European Data Protection Board (EDPB) proposed defining deceptive designs as « interfaces and user journeys (…) that attempt to influence users into making unintended, unwilling and potentially harmful decisions (…) regarding the processing of their personal data ».

Indeed, in the digital world, a large portion of the revenue of platforms offering free services comes from the collection and analysis of their users’ personal data. This data allows them to personalize their services and target online advertising, and it may lead some platforms to encourage users to share as much data as possible.

Thus, in an already information-overloaded digital environment, the proliferation of dark patterns can become a real obstacle in users’ journeys. The study « Protection des données personnelles et Cookies », conducted in collaboration with the Direction interministérielle de la transformation publique (DITP) at the request of the CNIL in 2023, also demonstrated their impact on consent rates for the collection of personal data. As a result, individuals may be less able to understand the terms and purposes of the collection and use of their personal data.

Reminder: personal data refers to any information relating to an identified or identifiable person. Since it concerns individuals, they should remain in control of it.

 

Long-term negative consequences for platforms 

 

By relying on manipulative journeys or biased consent, platforms compromise the trust their users place in them. Studies are beginning to show that these practices, while effective in the short term, ultimately erode trust and, by extension, use of the service. When users realize they have been influenced, or even manipulated, into sharing personal data or making a decision they would not have made consciously, they develop a form of mistrust toward the service. This is known as a reactance effect: a psychological mechanism whereby individuals who feel deprived of their freedom of choice or action adopt an attitude opposite to what is suggested to them.

This loss of trust can not only harm the brand’s image but also reduce user loyalty, or even drive users to abandon the service altogether.

 

How to recognize a dark pattern?

 

« Apparences Trompeuses » (dark patterns) was launched precisely to raise public awareness and help people recognize these practices. This tool contributes to collective awareness, promoting digital practices that are more respectful of users’ rights and their personal data.

Through a series of 20 questions, the test allows users to practice recognizing the presence of a dark pattern. To ensure the relevance and impact of the tool, the scenarios presented were developed based on examples described by the EDPB in its guidelines.

Each screenshot was designed to illustrate the 16 categories established in the guidelines, such as “emotional steering”, “ambiguous wording or information”, and “privacy maze”.

 


 

An initiative to raise awareness

 

The result of the test is neither a skills assessment nor a measure of personal performance, but rather a way to illustrate how pervasive these practices can be and how sometimes difficult they are to detect. The main goal remains to raise awareness of the presence of deceptive designs and to help users make more informed choices regarding data protection.

Moreover, the conditions of this test do not fully reflect those of real-world browsing. In practice, we tend to adopt a more passive approach to digital interfaces: we often skim content quickly without reading it in detail. To effectively deal with deceptive designs, we would need to maintain a constant analytical mindset.

Some dark patterns seem to exploit our automatic behaviors and cognitive biases, as well as our conditioning when interacting with digital interfaces. These design techniques rely on our habits, our instinctive reactions to visual stimuli, and our tendency to choose the easiest, least restrictive path when navigating online. We have also learned to associate certain visual elements with specific actions through cues like color, contrast, icons, size, and hierarchy. These conventions can therefore be turned against us to influence our online decisions.

Alongside the test, the application offers a set of resources and links related to dark patterns. These allow users to deepen their understanding of the concepts covered, explore the applicable legal frameworks, and discover ethical design practices. The materials, including studies, guides, posters, and external resources, provide insights to better identify these practices and assess their impact. They are intended both for users who want to strengthen their vigilance and for digital professionals seeking to design interfaces that respect fundamental rights. Beyond raising awareness, this initiative aims to foster a more responsible digital culture based on transparency, fairness, and respect for individual autonomy.

 

To explore these issues further and discover practical tools promoting design that is more respectful of the GDPR and personal data, resources are available on the Données et Design website: design.cnil.fr

 

 


Illustrations: LINC


 

Document reference

Download the typology of deceptive patterns

