Feedback on the 2025 Privacy Research Day
Written by Mehdi Arfaoui and Vincent Toubiana
-
25 August 2025
On July 1, 2025, the CNIL's annual academic event took place, bringing together 164 on-site participants and 750 people online. This article reviews the organization and content of the event.

A Demanding Selection Process
Following our call for papers, which closed on March 31, we were pleased to receive 85 submissions. This enthusiasm demonstrates the vitality of research in the field of privacy protection. The international dimension of the event was confirmed with contributions from 18 countries, including France, the United States, Germany, Vietnam, China, Australia, and Canada.
Given the quality of the submissions, a rigorous two-step selection process was implemented to ensure the scientific quality and balance of the program.
- In the first stage, each paper was evaluated against two main criteria: its academic "merit" (scientific quality and recognition, methodological robustness) and its relevance to the various departments and priorities of the CNIL. This pre-selection phase mobilized the expertise of several CNIL departments (the Technology Expertise Department, the Artificial Intelligence Department, the European and International Affairs Department, the Economic Analysis Mission, and the LINC), which together conducted nearly 200 evaluations in just three weeks, ensuring a multidisciplinary analysis of the proposals. At the end of this first stage, 38 proposals were selected.
- The second selection stage involved grouping the selected proposals around common themes, with the goal of creating coherent panels and promoting interdisciplinarity. This work led to the identification of 9 themes, some of which emerged in addition to the topics initially outlined in the call for papers - an expansion that allowed us to identify and engage with new areas of concern and innovation in the privacy field. To ensure coherence and depth of discussion during the day, we had to make the difficult choice to focus on 5 main themes. This decision compelled us to keep only 16 proposals out of the 38 initially selected. The authors of some of the unselected proposals were invited to share their research findings at internal CNIL research seminars (the Research@LINC).
A Successful Event and Stimulating Exchanges
The 2025 edition of Privacy Research Day was a great success, bringing together a large community of researchers and practitioners in the digital and regulatory fields. The event gathered 164 on-site participants (not including the many CNIL staff present) and was followed by 750 people online throughout the day, with a peak audience of 316 simultaneous viewers. Here is an overview of the discussions that enlivened this rich day.
1) Perceptions of Risks and User Strategies
This session explored how individuals perceive and understand risks to their privacy. Discussions focused on the impact of "privacy scores," protection strategies on Twitter during the "Justice for Nahel" movement, and the perspective of American teachers on "deepfakes." The Q&A session notably asked what impact the Danish law on the intellectual property of images could have on protection against deepfakes.
- Christoph LUTZ, professor at BI Norwegian Business School (Norway) - Visual Privacy: The Impact of Privacy Labels on Privacy Behaviors Online - Design
- Hiba LAABADLI, PhD student at Duke University (United States) - Exploring Security and Privacy Discourse on Twitter During the ‘Justice Pour Nahel’ Movement in France - Sociology
- Miranda WEI, PhD student at the University of Washington (United States) - "We’re utterly ill-prepared to deal with something like this": Teachers’ Perspectives on Student Generation of Synthetic Nonconsensual Explicit Imagery - Computer Science
2) Protecting Data in the Age of Social Media
This session examined our practices on platforms, addressing the issue of sharing screenshots without people's knowledge, and methods for protecting data collected from social media for academic research. The debate particularly focused on a major dilemma: how to balance respect for privacy with the evidentiary value of screenshots (in cases of harassment, for example)?
- Alexis Shore INGBER, postdoctoral fellow at the University of Michigan, School of Information (United States) - Understanding screenshot collection and sharing on messaging platforms: a privacy perspective - Sociology
- Kyle BEADLE, PhD student in Computer Science at University College London (United Kingdom) - A Privacy Framework for Security Research Using Social Media Data – Computer Science
- Aleksandra KOROLOVA (video), lecturer at Princeton University (United States) - Having your Privacy Cake and Eating it Too: Platform-supported Auditing of Social Media Algorithms for Public Interest – Computer Science
3) Transnational Compliance
In this session, speakers addressed the challenges of cross-border law enforcement, examining opportunities for improvement in regulations like the GDPR and the future AI Act. The often polysemous notion of "data anonymization" was also at the heart of the discussions. Questions from the audience sparked discussion of the lessons to be learned from the German authority's decision to rely on the DSA to have the DeepSeek application removed from app stores.
- Helena KASTLOVA, researcher at Princeton University (United States) - Report on extraterritorial enforcement of GDPR - Law
- Michal CZERNIAWSKI, researcher at Vrije Universiteit Brussel (Belgium) - From the General Data Protection Regulation to the Artificial Intelligence Act: Improving Extraterritorial Enforcement Through Criminal Law - Law
- Ludovica ROBUSTELLI, postdoctoral fellow at the University of Nantes (France) - Guidance in anonymization: When Ambiguity Meets Privacy-Washing – Law and Computer Science
4) Data Protection in the Age of AI
Dedicated to the challenges of artificial intelligence, this session questioned several recent phenomena: new forms of AI-powered surveillance, detectors for AI-generated hate speech, and discrimination produced by automated recruitment systems. A fundamental question was also posed: what balance should be struck in model training between privacy and data utility, especially when models rely on the use of synthetic data?
- Clément LE LUDEC, postdoctoral fellow at Paris Panthéon Assas University (France) - Who makes images readable? Artificial intelligence as the basis for a recomposition of surveillance work – Sociology
- Xinyue SHEN, PhD student at CISPA Helmholtz Center for Information Security (Germany) - HateBench: Benchmarking Hate Speech Detectors on LLM-Generated Content and Hate Campaigns – Computer Science
- Nina BARANOWSKA, postdoctoral fellow at Radboud University (Netherlands) - Equality Monitoring Protocol: using sensitive data in post-deployment monitoring of AI hiring systems (project FINDHR) - Law
- Juliette ZACCOUR, PhD student at the Oxford Internet Institute, University of Oxford (United Kingdom) - Access Denied: Meaningful Data Access for Quantitative Algorithm Audits - Computer Science
5) Protecting Health Data
This final roundtable addressed the complexity of protecting health data. Discussions covered the European governance of health data, the role of minors in consent collection, and the strategies of users of menstrual tracking apps in the United States following the overturning of Roe v. Wade. The debate raised questions that concern us all: how can we choose our health apps wisely, and are we in Europe protected from a jurisprudential reversal like the one in the United States?
- Laurène ASSAILLY, teaching and research fellow at Sciences Po Strasbourg (France) - Governing by risk: health data captured by European law - Sociology
- Annagrazia ALTAVILLA, researcher at Espace Ethique PACA-Corse (France) - Children rights and data protection: from principles to practices in biomedical sector – Law and Computer Science
- Hiba LAABADLI, PhD student at Duke University (United States) - “I Deleted It After the Overturn of Roe v. Wade”: Understanding Women's Privacy Concerns Toward Period-Tracking Apps in the Post Roe v. Wade Era - Law and Computer Science
Numerous Projects Presented and Exhibited
Alongside the conferences, Privacy Research Day was also an opportunity to discover several projects involving the LINC. Audrey Pety presented the results of the Observatory of the Exercise of Rights (▶ Rewatch the session). This internal project aims to analyze in detail the user journeys of people wishing to exercise their right of access on major digital platforms. The objective is to identify obstacles and best practices to make this fundamental right more effective.
This event was also an opportunity for us to present the work of some of our collaborators:
- The Data AirBag (▶ Rewatch the session): Led by Corentin Loubet (PhD student at ENSAD), this "design friction" project was very thought-provoking. It materializes as a computer keyboard that gradually inflates, preventing typing as the user enters personal data. A striking way to make the volume of data we share tangible.
- The Cookie Banner Archive (▶ Rewatch the session): Led by Yana Dimova (PhD student at KU Leuven), this ambitious project analyzes web archives to study the impact of European regulation on the compliance of cookie banners over time.
The Design Spot Exhibition
Finally, participants were also able to discover the "Surveillance: design and law in perspective" exhibition, the result of a collaboration between the Jean Monnet Faculty (Paris-Saclay University) and the École nationale supérieure de création industrielle (ENSCi-Les Ateliers). Through the combined lens of law and design, this exhibition explores the tensions between protection and control, security and freedom. More than a simple presentation, it is an invitation to question established norms and to imagine a future where technology serves humanity without enslaving it.
Two Awards to Recognize Privacy Research
In parallel, Privacy Research Day was also the occasion to present two awards, each selected by an independent jury:
- The 9th edition of the CNIL-INRIA Prize (▶ Rewatch the session), which rewards the best paper in the field of computer science and privacy protection. This year, it was awarded to Nataliia Bielova, Cristiana Santos, and Colin Gray for their publication "Two worlds apart! Closing the gap between regulating EU consent and User studies".
- The 1st edition of the CNIL-EHESS Prize (▶ Rewatch the session), which distinguishes the best publication in the humanities and social sciences on the same theme. This year, it was awarded to Laurène Assailly for her thesis "The health of data: an inquiry into hospital work of producing, controlling, and providing access to health data in the face of public action strategies to enhance its value".
These two awards reflect the CNIL's commitment to encouraging and promoting multidisciplinary research, which is essential for understanding and protecting data and privacy. They are concrete illustrations of our partnerships with two prestigious teaching and research institutions: EHESS and INRIA.