[1/3] Post-quantum cryptography

Written by Monir Azraoui


25 March 2025


The discussions with cryptography experts provided insight into the transition to « post-quantum » cryptography, meaning cryptography that is robust against attacks from quantum computers, a field currently experiencing strong momentum both in academia and in industry. In this article, we present a synthesis of the ideas shared during these interviews, highlighting the challenges and opportunities ahead.

 

The threat of quantum computers to traditional cryptography

 

Introduction to quantum computing

 

Traditional computing relies on a binary system, in which the bit, taking the value 0 or 1, is the smallest unit of information processed by a computer. In quantum computing, the fundamental unit is the qubit, which can exist in a superposition of the states 0 and 1, and multiple qubits can be entangled with one another. These properties can, in some cases, enable faster computations than classical machines by exploring many possible values in parallel. Ultimately, a quantum computer[1] could perform calculations that traditional computers would never be able to complete within a reasonable timeframe.
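In standard textbook notation (an illustration added here, not taken from the interviews), the state of a single qubit is a superposition of the two basis states:

```latex
% Standard qubit notation: a superposition of |0> and |1>
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^2 + |\beta|^2 = 1
```

A register of n entangled qubits is described by 2^n such amplitudes at once, which is where the potential for massive parallelism comes from.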

 

Extensive research on quantum computing has been conducted for several decades (notably by IBM, Google, and Microsoft), and the ecosystem is highly dynamic. In France, in January 2021 the President presented a five-year, €1.8 billion investment plan for quantum technologies. Various technologies for building quantum computers exist, but they currently allow only very limited computations. However, progress is fast: several players, including IBM and startups such as the French companies Pasqal, Alice & Bob, and Quandela, are developing increasingly sophisticated quantum computer prototypes.

 

However, the cybersecurity community is concerned about the potential impact of quantum computing on cryptography. Indeed, it has been known since 1994 that a quantum computer could break most of the cryptographic algorithms currently in use. What was until recently a purely theoretical threat has, over the past decade, become a more serious concern with the development of the first quantum computers. This applies mainly to asymmetric mechanisms, but also (to a lesser extent) to symmetric mechanisms, whether they are used for encryption, hashing, or digital signatures.

 

The looming threat of quantum computers

 

Thus, what is considered secure today could be compromised with the advent of quantum computers of sufficient capacity. It is difficult to predict whether, or when, a quantum computer will reach the required power; this is likely a matter of decades. Nevertheless, this threat is already prompting cryptography researchers (including some of the experts interviewed) to anticipate the challenges posed by quantum computing. This anticipation is driven by two main reasons:

  • The potentially very serious consequences of attacks (see below); and
  • The fact that industry often takes time to react to new technologies, which means that preparations must be made as early as possible.

 

Specifically, the ANSSI has identified the particular threats posed by quantum computers and quantum algorithms to the current security of data:

  • Retroactive attacks, which consist in storing encrypted data intercepted today in the expectation of being able to decrypt it easily in the future (« store now, decrypt later »);
  • Forgery attacks on digital signatures, which allow an attacker to impersonate the signer.

The best-known quantum attacks rely on specific algorithms that theoretically enable the resolution of computationally « hard » problems (for example, factorisation for RSA or the discrete logarithm for Diffie-Hellman key exchange) that traditional computers cannot solve within a reasonable timeframe:

  • Shor’s quantum algorithm, which threatens the most widely used public-key cryptography systems today. It efficiently solves the discrete logarithm problem (Diffie-Hellman, DSA) and the integer factorisation problem (RSA) with a sufficiently large quantum computer, even though such a size is currently beyond the reach of existing technologies (a toy numerical illustration follows this list);
  • Grover’s quantum algorithm, which impacts the security of symmetric cryptographic systems and provides a so-called « quadratic » speedup (a moderate improvement, but theoretically significant for sufficiently powerful adversaries) for the problem of finding the secret key.
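To make Shor’s threat concrete, the toy sketch below (an illustration added here, with deliberately tiny numbers, not an example from the interviews) shows in Python why the security of RSA collapses as soon as the public modulus can be factored, which is exactly what Shor’s algorithm would do efficiently:

```python
# Toy RSA example with tiny, purely pedagogical numbers (real moduli have thousands of bits).
p, q = 61, 53             # secret prime factors
n = p * q                 # public modulus
e = 17                    # public exponent
phi = (p - 1) * (q - 1)   # computable only by someone who knows the factors of n
d = pow(e, -1, phi)       # private exponent recovered from the factors (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)          # encryption with the public key
assert pow(ciphertext, d, n) == message  # whoever factors n can decrypt
```

With classical computers, factoring a modulus of realistic size is believed to be out of reach; a sufficiently large quantum computer running Shor’s algorithm would change that.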

 

In the case of quantum attacks on symmetric cryptosystems, the impact of a quantum computer is generally limited. Indeed, simply doubling the key size and slightly adjusting the hash function parameters will suffice to restore a level of security equivalent to that against a classical computer.
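As a back-of-the-envelope illustration (added here, not taken from the interviews), Grover’s quadratic speedup explains why doubling the key size is enough for symmetric encryption:

```latex
% Approximate cost of exhaustive key search for an n-bit key
\text{classical: } \approx 2^{n} \text{ evaluations}
\qquad\qquad
\text{Grover: } \approx 2^{n/2} \text{ evaluations}
```

A 128-bit key therefore offers roughly 64-bit security against a quantum adversary, whereas a 256-bit key restores roughly 128-bit security, hence the recommendation to move, for example, from AES-128 to AES-256.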

For public-key cryptosystems, the impact is far more serious. One of the main approaches consists in replacing current asymmetric mechanisms with new mechanisms robust against quantum attacks: so-called post-quantum cryptography.

 

Post-quantum cryptography

 

The goal of post-quantum cryptography is to develop cryptography that is robust against both quantum and classical computers, while remaining compatible with existing architectures and protocols (such as TLS, a secure communication protocol used on the web). In other words, post-quantum cryptography is not itself quantum: it runs on today’s classical hardware.

 

In 2016, the National Institute of Standards and Technology (NIST) launched an international call for proposals for algorithms with the goal of standardising a new family of quantum-resistant cryptographic algorithms. The interviews took place a few days or weeks after NIST announced the first four selected algorithms: one for key establishment (for encryption) and three for digital signatures (for authentication). Several of the selected algorithms include French researchers among their authors. The first draft standards based on these four algorithms were published in August 2023 for public consultation. Other algorithms are still in the selection process[2]. Indeed, NIST’s goal, and more generally that of the cryptographic community, is to provide the industry, in the near future, with several standards adapted to the constraints of each domain. This will ensure the availability of alternative solutions in the event that one standard is compromised.
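For readers who want to experiment, the sketch below shows a minimal key-establishment round trip. It assumes the open-source liboqs Python bindings (the `oqs` package) are installed; the mechanism name is illustrative and depends on the library version, so this should be read as a hedged example rather than a reference implementation:

```python
# Minimal KEM (key encapsulation) round trip, assuming the liboqs Python bindings
# are available (e.g. `pip install liboqs-python`). The mechanism name below is
# illustrative; check oqs.get_enabled_kem_mechanisms() for the names your build supports.
import oqs

KEM_NAME = "Kyber768"  # assumption: the Kyber/ML-KEM variant enabled in this build

with oqs.KeyEncapsulation(KEM_NAME) as receiver:
    public_key = receiver.generate_keypair()              # receiver publishes this key
    with oqs.KeyEncapsulation(KEM_NAME) as sender:
        ciphertext, sender_secret = sender.encap_secret(public_key)
    receiver_secret = receiver.decap_secret(ciphertext)   # receiver recovers the secret
    assert sender_secret == receiver_secret                # both sides share the same key
```

The shared secret would then typically feed a key-derivation function to produce the symmetric keys used to encrypt the actual traffic.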

 

The candidates for standardisation have generally followed the same approach: building cryptographic primitives based on computational problems that a quantum computer cannot solve efficiently. These problems are generally grouped into several families: 

 

  • Lattice-based cryptography relies on the problem of finding the shortest vectors in a lattice of regularly spaced points within a multidimensional mathematical space (a more formal statement of this problem is sketched after this list). This approach has been widely studied in academia since 2005 and is also used for homomorphic encryption. Of the four algorithms selected by NIST, three are based on this approach.

 

  • Code-based cryptography relies on the problem of decoding a pseudo-random error-correcting code. This is also a well-established approach, studied since the 1970s. Among the algorithms still under consideration for standardisation, some are based on this mathematical problem.

 

  • Hash-based cryptography relies on the problem of inverting a hash function. This approach is very mature and considered highly secure by the interviewed experts, but it is only relevant for constructing digital signatures and is generally somewhat inefficient.

 

  • Isogeny-based cryptography relies on the problem of finding an isogeny between elliptic curves (simply put, an isogeny is a special function linking two elliptic curves while preserving their key properties). This approach is the most recent of the group (about ten years old) but promising. The candidate algorithm based on it was broken during the selection process. The interviewed experts believe that research on this approach is worth continuing because of its small key sizes, although it still suffers from a lack of maturity and from the computational cost of encryption. It may suit certain applications where data volume constraints are more critical.

 

  • Multivariate polynomial-based cryptography mainly relies on solving systems of multivariate polynomial equations. This approach dates back to the 1980s. While the basic scheme remains secure, many mechanisms based on this technique have been broken. Some multivariate mechanisms may nevertheless remain relevant, particularly for digital signatures.
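As announced in the first item of this list, here is a slightly more formal statement of the underlying lattice problem (standard textbook material, added here for illustration rather than drawn from the interviews):

```latex
% Shortest Vector Problem (SVP): the hard problem behind lattice-based cryptography
\mathcal{L}(B) = \Big\{ \textstyle\sum_{i=1}^{n} x_i\, b_i \;:\; x_i \in \mathbb{Z} \Big\},
\qquad B = (b_1, \dots, b_n) \text{ a basis of } \mathbb{R}^n
```

Given such a basis, the shortest vector problem asks for a non-zero lattice vector of minimal length; no efficient classical or quantum algorithm is known for this problem in high dimension, and the security of the selected lattice-based schemes ultimately rests on problems of this kind.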

 

Smooth migration to post-quantum cryptography

 

The ANSSI has commented on NIST’s decision regarding the first algorithms selected for the standardisation of post-quantum cryptographic algorithms. While satisfied from a scientific standpoint, the ANSSI does not advocate the immediate replacement of current asymmetric algorithms with post-quantum algorithms. Given that these algorithms are recent, the ANSSI recommends using them in a hybrid manner, that is, combining them with well-established pre-quantum algorithms (such as RSA encryption and elliptic curve cryptography) to ensure security at least equivalent to that of each of the two mechanisms used (a sketch of such a hybrid key establishment is given below). This approach provides long-term protection against quantum computing while preventing any security regression. The agency recommends deploying post-quantum cryptography in three phases.

  • From now on: hybridising post-quantum algorithms with classical cryptography for certain specific use cases (e.g., particularly sensitive data or products that cannot be updated before 2030);
  • In a second phase (around 2025): the use of post-quantum algorithms, still in hybrid mode, is strongly recommended whenever long-term security is required (the ANSSI may provide more detailed guidance on post-quantum cryptography and recommendations on how to implement hybrid schemes);
  • By around 2030: use of post-quantum algorithms alone, without hybridisation, provided they are by then recognised as secure.

Other European security agencies share the same position and adopt recommendations very similar to those of the ANSSI, particularly regarding hybridisation.
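To make the idea of hybridisation more tangible, the sketch below combines a classical key exchange (X25519) with a post-quantum KEM and binds the two resulting secrets together through a key-derivation function. It is a minimal illustration, assuming the `cryptography` package and the liboqs Python bindings (`oqs`); the KEM name is illustrative, and real protocols (such as the hybrid modes being tested in TLS) define this combination much more carefully:

```python
# Hedged sketch of hybrid key establishment: classical X25519 plus a post-quantum KEM,
# with the session key derived from BOTH shared secrets. Assumes the `cryptography`
# package and the liboqs Python bindings; the KEM name depends on the library version.
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical part: ephemeral Diffie-Hellman over Curve25519.
alice_ecdh = X25519PrivateKey.generate()
bob_ecdh = X25519PrivateKey.generate()
classical_secret = alice_ecdh.exchange(bob_ecdh.public_key())

# Post-quantum part: Alice encapsulates a secret to Bob's post-quantum public key.
with oqs.KeyEncapsulation("Kyber768") as bob_kem:
    pq_public_key = bob_kem.generate_keypair()
    with oqs.KeyEncapsulation("Kyber768") as alice_kem:
        kem_ciphertext, pq_secret = alice_kem.encap_secret(pq_public_key)
    assert bob_kem.decap_secret(kem_ciphertext) == pq_secret  # Bob recovers the same secret

# Hybridisation: the session key depends on both secrets, so it remains safe
# as long as at least one of the two mechanisms stays unbroken.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-handshake-demo",
).derive(classical_secret + pq_secret)
```

The design choice behind hybridisation is visible in the last line: an attacker would have to break both the elliptic-curve exchange and the post-quantum KEM to recover the session key.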

 

As an exception to the general rule, signature mechanisms based on hash functions, which are well-known and have been studied for a long time, are considered secure today and can therefore be used without hybridisation. However, these algorithms are not very efficient (notably in terms of signature size), and their practical use is limited to specific use cases.

 

Conclusion

 

The potential threat posed by quantum computers to current cryptographic systems was clearly highlighted during the interviews. This sword of Damocles is driving significant efforts to develop robust solutions through post-quantum cryptography. NIST’s work to standardise post-quantum algorithms marks a crucial milestone in this evolution.

 

However, the need for a gradual transition to these systems, as recommended by the ANSSI, underscores the importance of adopting a careful and well-thought-out approach for deploying post-quantum cryptography. While some actors handling information requiring long-term protection must begin addressing this migration today, others can still wait a few years for the ANSSI to provide guidance, although it is already possible to anticipate these changes, particularly by promoting « crypto-agility », that is, the ability to replace a cryptographic algorithm without redesigning the systems that use it (a short illustration follows this paragraph).
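The sketch below is a purely illustrative (and hypothetical) way of picturing crypto-agility in code: the signature algorithm is selected through configuration rather than hard-coded, so that a post-quantum scheme can later be slotted in without touching the calling application:

```python
# Hypothetical illustration of crypto-agility: algorithms are looked up by name,
# so swapping in a post-quantum scheme is a configuration change, not a rewrite.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class SignatureScheme:
    sign: Callable[[bytes, bytes], bytes]           # (private_key, message) -> signature
    verify: Callable[[bytes, bytes, bytes], bool]   # (public_key, message, signature) -> valid?

# Entries would wrap real library calls (RSA, ECDSA, a post-quantum signature, ...).
REGISTRY: Dict[str, SignatureScheme] = {}

def sign_document(algorithm: str, private_key: bytes, document: bytes) -> bytes:
    # The application only names the algorithm; the implementation is resolved at runtime.
    return REGISTRY[algorithm].sign(private_key, document)
```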

 

Securing personal data largely relies on the use of robust cryptographic systems, whose deployment the CNIL has long recommended in numerous contexts. The potential emergence of powerful quantum computers within the next few decades not only requires designing technologies that will remain secure in this new context but also demands proactive anticipation, as it is no longer feasible to ensure the long-term security of personal data without taking this likely evolution into account.

 

 

[1] Here, « quantum computer » refers to a device capable of performing large-scale quantum computations, potentially on a limited number of specialised tasks, rather than as a quantum equivalent of a « classical » computer.

[2] At the time of preparing this article, NIST announced the conclusion of the selection process for encryption algorithms and chose a second algorithm based on error-correcting codes. As for digital signatures, the selection process is still ongoing.

 

 


Illustration: Flickr