Post-quantum cryptography (PQC), also called quantum-resistant cryptography, is the pursuit of cryptographic systems that would remain secure against attacks from both conventional and quantum computers. In recent years, quantum computers (machines that exploit quantum mechanical phenomena to solve mathematical problems that are intractable for traditional computers) have been the subject of extensive research. As of 2021, no such quantum computer has been built, but researchers in the field agree that a sufficiently powerful one would be able to break many of the public-key cryptosystems currently in use, severely compromising the privacy and integrity of digital communications on the internet and elsewhere. Post-quantum cryptography aims to address this issue before it arrives by developing cryptographic algorithms that would withstand cryptanalytic attack by a quantum computer.
As of 2021, the time frame for building a large-scale quantum computer remains uncertain. It was long unclear whether constructing such a machine was physically possible at all. According to the US National Institute of Standards and Technology (NIST), many scientists now regard the construction of quantum computers as merely a significant engineering challenge and estimate that quantum computers capable of breaking all public-key schemes in use could be built within the next twenty years or so.
The US has taken the lead in defining PQC, whereas China has chosen to focus on quantum cryptography, which uses quantum mechanical properties themselves to secure information. In 2016, NIST launched an ongoing process to standardize one or more quantum-resistant public-key cryptographic algorithms, soliciting proposals from around the world. Over three rounds, NIST has narrowed down the candidates, announcing seven finalists and eight alternates on July 22nd, 2020. NIST standards are expected in 2024.
There are multiple ways of categorizing modern non-quantum cryptographic algorithms, including the number of keys required and the algorithm's intended use. Modern cryptographic algorithms are typically divided into three main types:
- symmetric cryptography
- asymmetric cryptography
- hash functions
In symmetric cryptography, the sender and the receiver share the same secret key and use the same cryptographic algorithm for encryption and decryption. For example, Alice encrypts a message using the shared secret key, and Bob decrypts it using the same key and algorithm. The key has to remain secret, so symmetric cryptography requires an efficient way to exchange it over public networks. Examples of symmetric algorithms include the Advanced Encryption Standard (AES) and the Triple Data Encryption Standard (3DES).
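The shared-key idea can be sketched with a toy stream cipher in Python, deriving a keystream from the shared secret with SHA-256. This is an illustrative construction only, not AES or any standardized cipher:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from the shared key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Encryption and decryption are the same XOR operation under one key."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

shared_key = b"alice-and-bob-secret"
ciphertext = xor_crypt(shared_key, b"meet at noon")
plaintext = xor_crypt(shared_key, ciphertext)  # same key recovers the message
```

Because encryption and decryption are the same XOR under one key, Alice and Bob only need to agree on `shared_key` in advance, which is exactly the key distribution problem that asymmetric cryptography solves.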
Also known as public-key cryptography (PKC), asymmetric cryptography uses a pair of private and public keys, solving the key distribution problem of symmetric cryptography. Bob encrypts a message using Alice's public key and sends it to Alice, who decrypts it using her private key.
PKC is secured by computational problems such as the difficulty of factoring large composite numbers (the product of two large primes) and the discrete logarithm problem. These are one-way functions: easy to compute in one direction, while the inverse is difficult.
Invented by Ronald Rivest, Adi Shamir, and Leonard Adleman in 1977, RSA is one of the most important PKC schemes. It is used to secure key exchange between end nodes and, due to its higher computational cost, is often used together with symmetric algorithms such as AES. RSA relies on the difficulty of factoring biprime (semiprime) numbers and is therefore theoretically vulnerable to faster factoring algorithms or a significant increase in computing power.
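A "textbook" RSA round trip with deliberately tiny primes shows the trapdoor at work. The numbers below are standard teaching values, far too small for real security:

```python
from math import gcd

# Toy "textbook RSA" with tiny primes -- illustrative only, insecure.
p, q = 61, 53
n = p * q                  # public modulus (a biprime)
phi = (p - 1) * (q - 1)    # Euler's totient, kept secret
e = 17                     # public exponent, coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent: modular inverse of e

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key d
```

Anyone can compute `ciphertext` from the public pair `(e, n)`, but recovering `d` requires knowing the factors `p` and `q` of `n`, which is what a quantum computer running Shor's algorithm would find efficiently.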
Asymmetric cryptographic systems such as Diffie-Hellman (DH) and Elliptic Curve Cryptography (ECC) are based on the discrete logarithm problem, where large parameters make it extremely difficult to solve. However, quantum computers offer new methods for solving the discrete logarithm problem, threatening its future use.
In 1994, American mathematician Peter Shor formulated a quantum algorithm for integer factorization known as "Shor's algorithm." It showed that a quantum computer with a sufficient number of fault-tolerant qubits could theoretically break public-key cryptography schemes such as RSA. Shor's algorithm became a significant motivator in the design and construction of quantum computers, as well as research into new cryptosystems secure against this threat. In 2001, a team at IBM demonstrated Shor's algorithm, factoring 15 into 3 x 5 using 7 NMR qubits.
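The classical heart of Shor's algorithm is order finding: given the period r of a^x mod N, the factors follow from a gcd computation. A brute-force Python sketch of that post-processing for N = 15 (the quantum computer's role is to find r exponentially faster than this loop can):

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    """Classically find the order r of a modulo N, i.e. the smallest r
    with a**r % N == 1 (the step Shor's algorithm accelerates)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7           # a must be coprime to N
r = find_period(a, N)  # for a = 7, N = 15 the period is 4
# An even period yields the factors via gcd:
f1 = gcd(pow(a, r // 2) - 1, N)
f2 = gcd(pow(a, r // 2) + 1, N)
```

For real RSA moduli the period-finding loop above would take longer than the age of the universe, which is exactly the gap a fault-tolerant quantum computer would close.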
In 1996, Indian-American computer scientist Lov Grover introduced a quantum algorithm for searching an unsorted database. While Grover's algorithm is generally described as "searching a database," it can equally be described as "inverting a function." Grover's algorithm poses a threat to systems not based on public-key mathematical structures, such as symmetric encryption, by halving their effective security level. For example, breaking AES-128 would take about 2^64 quantum operations instead of the 2^128 steps that current attacks require. However, this can be countered by doubling the key size.
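The arithmetic behind this is simple enough to check directly: Grover's algorithm searches N possibilities in roughly sqrt(N) quantum queries, so a k-bit key yields about 2^(k/2) operations, and doubling the key length restores the original margin:

```python
from math import isqrt

# Grover's algorithm searches N items in about sqrt(N) quantum queries,
# halving the effective security level of a k-bit key: sqrt(2^k) = 2^(k/2).
classical_ops = 2 ** 128            # brute-forcing AES-128 classically
quantum_ops = isqrt(classical_ops)  # Grover: about 2^64 operations

# Doubling the key length (AES-256) restores the original security level:
assert isqrt(2 ** 256) == 2 ** 128
```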
In the US, the first warning from the National Security Agency (NSA) of the need for the public to transition to PQC algorithms came in August 2015. In 2017, NIST launched its PQC Standardization activity, overseeing the selection of new quantum-resistant public-key algorithms.
PQC algorithms can be divided into encryption schemes and signature schemes. Encryption provides confidentiality, allowing messages to be encrypted with a public key and decrypted only with the corresponding private key, whereas signatures bind the messenger (source) to the message, ensuring data integrity, message authentication, and non-repudiation. Digital signature algorithms use a private key to produce the message's signature and the public key to verify it.
Lattice-based cryptography dates back to 1996, with the designs of Ajtai and of Hoffstein, Pipher, and Silverman. It is a form of PKC that avoids the weaknesses of RSA by relying on operations over lattices (matrices and vectors) rather than on the factoring of primes. Lattice-based schemes are built on the difficulty of lattice problems, such as recovering an original lattice vector given a perturbed one. The simplest example is the shortest vector problem, in which a lattice is represented by an arbitrary basis and the goal is to output the shortest non-zero vector in it.
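In two dimensions the shortest vector problem can be solved by exhaustive search, which makes it easy to illustrate. The skewed basis below is a made-up example; it is only in high dimensions that the problem is believed hard:

```python
# Brute-force search for the shortest non-zero vector of a 2-D lattice
# given a deliberately "bad" (skewed) basis. This is feasible only in
# tiny dimensions; in high dimensions the search is believed intractable.
b1 = (8, 9)  # skewed basis vectors (hypothetical example)
b2 = (3, 4)

best = None
for x in range(-20, 21):
    for y in range(-20, 21):
        v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
        if v == (0, 0):
            continue
        norm_sq = v[0] ** 2 + v[1] ** 2
        if best is None or norm_sq < best[1]:
            best = (v, norm_sq)

shortest, norm_sq = best  # e.g. (2, 1), squared norm 5
```

The basis (8, 9), (3, 4) generates the same lattice as the much nicer vectors (2, 1) and (-1, 2); the search recovers one of the short vectors that the skewed basis hides.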
With the increased number of parameters from utilizing lattices, lattice-based cryptography has the potential to offer solutions better adapted to a given situation while also offering more attack surface. Both encryption and signature systems are possible with lattice-based cryptography. Five of the NIST round three candidates are Lattice-based algorithms. They primarily use two basic hard problems: Module-Learning-with-Errors (Module-LWE) and Module-Learning-with-Rounding (Module-LWR).
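A toy learning-with-errors instance shows the structure these schemes rely on: the public data (A, b) hides the secret s behind small random errors. Parameters here are illustrative only and far below cryptographic size:

```python
import random

# Toy LWE instance: given A and b = A*s + e (mod q), recovering s is
# believed hard, even for quantum computers, when dimensions are large.
random.seed(1)
q, n, m = 97, 4, 8
s = [random.randrange(q) for _ in range(n)]  # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]  # small noise terms
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

# Anyone holding s can verify the samples are consistent up to small noise:
residuals = [(b[i] - sum(A[i][j] * s[j] for j in range(n))) % q
             for i in range(m)]
small = [min(r, q - r) for r in residuals]  # centered size of each error
```

Without s, the pairs (A, b) look like random noise; with s, the residuals are tiny. Module-LWE and Module-LWR used by the NIST candidates replace the integer matrix A with a small matrix of polynomials, but the principle is the same.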
First proposed by McEliece in 1978, code-based encryption systems are among the most studied PQC schemes. They use the theory of error-correcting codes. While specially constructed codes make it possible to correct many errors, decoding is a difficult problem for random linear codes. The difficulty of decoding linear codes is considered secure against quantum attacks when key sizes are increased by a factor of four. The best known way to solve the decoding problem is to transform it into a low-weight-codeword problem (LWCWP), but solving an LWCWP in large dimensions is considered infeasible.
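The error-correction principle can be illustrated with the simplest possible code, a 5-fold repetition code. Real code-based schemes use far stronger codes, such as Goppa codes, but the decode-despite-errors idea is the same:

```python
# A toy 5-fold repetition code corrects up to 2 flipped bits per data bit
# by majority vote. Code-based encryption hides which structured code is
# in use, so only the key holder can decode efficiently.
def encode(bits):
    return [b for b in bits for _ in range(5)]

def decode(codeword):
    return [int(sum(codeword[i:i + 5]) >= 3)
            for i in range(0, len(codeword), 5)]

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[0] ^= 1  # flip two bits in the first block (the "random errors")
cw[3] ^= 1
assert decode(cw) == msg  # majority vote removes the errors
```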
All classes of cryptosystems balance a trade-off between efficiency and security. McEliece's cryptosystem offers fast, low-complexity encryption and decryption but requires large public keys (100 kilobytes to several megabytes).
Code-based signature systems have been designed. They offer short signatures at the expense of very large keys. Systems based on binary Goppa codes are generally considered secure, and systems based on quasi-cyclic medium-density parity-check codes have also been shown to be secure and are gaining confidence. However, all code-based signature systems submitted to the NIST PQC project were based on new assumptions and have since been broken. The remaining code-based encryption candidates in the NIST process are Classic McEliece (finalist), BIKE (alternate), and HQC (alternate). The latter two use special codes to reduce the size of the public key, which is seen as the main drawback of code-based systems.
Multivariate cryptography dates back to the late 1980s. In these public-key schemes, security is derived from the difficulty of solving systems of multivariate polynomial equations over finite fields. Signature schemes based on systems of equations with uniformly random coefficients are possible and are considered the most secure multivariate systems. The more efficient multivariate schemes use trapdoored systems of equations that appear random to outsiders but have hidden structure allowing solutions to be found efficiently by whoever constructed the system. These are known as oil-and-vinegar schemes.
Research has shown difficulties in developing multivariate-based encryption algorithms, with most being insecure because certain quadratic forms associated with their central map have low rank. They are also not very efficient, often requiring very large public keys and significant decryption times. Multivariate systems show more promise for signatures: seven of the nineteen signature schemes submitted to NIST were multivariate-based. Two of these seven proceeded to the third round, with Rainbow selected as a finalist and GeMSS as an alternate.
Hash functions map strings of arbitrary length to strings of fixed length. Cryptographic hash functions are designed to be one-way and collision-resistant, meaning it is hard to find an element of the preimage of a given image and hard to find two inputs that map to the same output. Hash functions appear in virtually every cryptographic construction in practice; every practical signature system uses them to handle arbitrary-length messages, and they can also serve as the lone building block of a signature scheme.
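Python's standard hashlib makes the fixed-length and avalanche properties easy to observe:

```python
import hashlib

# Cryptographic hashes map inputs of any length to fixed-length digests;
# finding a preimage or a collision should be computationally infeasible.
short_digest = hashlib.sha256(b"a").hexdigest()
long_digest = hashlib.sha256(b"a" * 1_000_000).hexdigest()

assert len(short_digest) == len(long_digest) == 64  # 256 bits, hex-encoded
# A one-byte change in the input yields an unrelated digest:
assert hashlib.sha256(b"b").hexdigest() != short_digest
```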
An isogeny between elliptic curves is a non-constant map, expressible as a fraction of polynomials, that is compatible with addition on both curves: the image of the sum of two points on the first curve equals the sum of their images on the second curve. Isogeny-based schemes use isogenies between elliptic curves over finite fields. The hard problem used for cryptographic purposes is finding an isogeny between two elliptic curves that are known to be isogenous. First introduced in 2005, isogeny-based cryptography is the most recent basis for post-quantum candidates.
On December 20th, 2016, NIST made a formal call for proposals for public-key PQC algorithms. The announcement stated:
It is intended that the new public-key cryptography standards will specify one or more additional unclassified, publicly disclosed digital signature, public-key encryption, and key-establishment algorithms that are capable of protecting sensitive government information well into the foreseeable future, including after the advent of quantum computers.
The final deadline for submissions was November 30th, 2017. On December 21st, 2017, round 1 algorithms were announced with sixty-nine submissions accepted as "complete and proper." In April 2018, NIST held the first PQC standardization conference co-located with PQCrypto 2018. The conference allowed first-round candidates to publicly discuss and explain their accepted algorithm.
After over a year of evaluation, twenty-six second-round candidate algorithms were announced on January 30th, 2019. The second-round candidates include seventeen public-key encryption and key-establishment algorithms and nine digital signatures. Submission teams were given the option to provide updated specifications and implementations until March 15th, 2019. In August 2019, NIST held the second PQC standardization conference to discuss aspects of the second-round candidates and obtain feedback on the selection. Each submission team was invited to give an update on their algorithm.
On July 22nd, 2020, NIST announced seven third-round finalists and eight alternate candidates. The seven finalist algorithms will continue to be reviewed for consideration for standardization at the conclusion of the third round. Submission teams were given until October 1st, 2020, to provide updated specifications and implementations. The alternate candidates will also advance, and NIST states these algorithms still have the potential for standardization, although it is unlikely that will occur at the end of the third round.
NIST round-three finalists:
- Classic McEliece (encryption): decoding random binary Goppa codes
- CRYSTALS-KYBER (encryption): cyclotomic module-LWE
- NTRU (encryption): cyclotomic NTRU problem
- SABER (encryption): cyclotomic module-LWR
- CRYSTALS-DILITHIUM (digital signature): cyclotomic module-LWE and module-SIS
- FALCON (digital signature): SIS over NTRU lattices
- Rainbow (digital signature): multivariate oil-and-vinegar equations
NIST noted that CRYSTALS-KYBER, NTRU, and SABER are all structured lattice schemes and intends to select at most one of them for standardization. The same is true for the signature schemes CRYSTALS-DILITHIUM and FALCON. At the announcement of the round-three finalists, NIST's view was that structured lattice schemes were the most promising general-purpose algorithms for public-key encryption/key encapsulation mechanisms (KEMs) and digital signature schemes.
Classic McEliece is a code-based KEM built from the 1978 McEliece cryptosystem, which uses a hidden Goppa code, with modern improvements for efficiency and CCA (chosen-ciphertext attack) security. The scheme has a very large public key but the smallest ciphertexts of all competing KEMs. While the large key makes its use in internet protocols problematic, the very small ciphertext is beneficial in some applications.
Code-based cryptography is the oldest public-key encryption approach expected to resist attacks by quantum computers and one of the oldest public-key encryption systems in general. During round two, the Classic McEliece scheme merged with NTS-KEM, which used the same codes.
Classic McEliece uses a public key to specify a random binary Goppa code. The ciphertext is a codeword with random errors added. The private key enables decoding: extracting the codeword from the ciphertext while identifying and removing the errors.
The Classic McEliece system is designed to be OW-CPA (one-way against chosen-plaintext attack), meaning an attacker cannot efficiently recover the codeword from a ciphertext and public key when the codeword is chosen randomly. The original McEliece parameters were designed for only 2^64 security. However, the system's parameters scale to provide a security margin against advances in computing technology, including quantum computers. The two main attack vectors against code-based cryptography are information-set decoding and structural attacks.
For the NIST submission, a full KEM has been specified and implemented with improvements. The software is available on the submission team's web page. Classic McEliece has been implemented for the ARM Cortex-M4 and FPGAs, and integrated into the network protocols McTiny and Post-quantum WireGuard.
The main advantage of Classic McEliece is its long history (over forty years) of analysis without significant security concerns and its small ciphertext size. Its primary disadvantage is the large public key size, which can be more than 1MB for the highest security levels.
CRYSTALS-KYBER is an IND-CCA (indistinguishability under chosen-ciphertext attack) secure KEM based on the hardness of solving the learning-with-errors (LWE) problem over module lattices. The NIST submission lists three parameter sets for different security levels:
- Kyber-512—roughly equivalent to AES-128
- Kyber-768—roughly equivalent to AES-192
- Kyber-1024—roughly equivalent to AES-256
With roots in Regev's LWE-based encryption scheme, the practical efficiency of LWE encryption has since been improved by sampling the secret from the same distribution as the noise and by observing that "LWE-like" schemes can be built using a square (rather than rectangular) matrix as the public key. Kyber also adopts an idea from the NTRU cryptosystem, defining the Ring-LWE and Module-LWE problems over polynomial rings rather than integers. The CCA-secure KEM Kyber is built on top of a CPA-secure cryptosystem based on the hardness of Module-LWE.
CRYSTALS-Kyber has been integrated into libraries and systems by industry, including the following:
- Cloudflare integrated Kyber alongside other post-quantum algorithms into CIRCL (Cloudflare Interoperable, Reusable Cryptographic Library)
- Amazon supports hybrid modes involving Kyber in their AWS Key Management Service
- In 2019, IBM advertised the "World’s First Quantum Computing Safe Tape Drive" that uses Kyber and Dilithium.
Kyber is designed with number-theoretic transform (NTT) multiplications in mind, allowing efficient implementations on a variety of platforms. However, because some elements are generated and compressed in the NTT domain, it is impractical to use other multiplication algorithms with Kyber. Polynomial multiplications take place in the same ring for all security levels, making it easy to scale between them, and Kyber's security has strong reductions to underlying hard lattice problems.
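What the NTT computes is multiplication in the ring Z_q[x]/(x^n + 1). A schoolbook O(n^2) version for toy parameters (not Kyber's actual q = 3329, n = 256) makes the "negacyclic" wraparound explicit; the NTT produces the same result in O(n log n):

```python
# Multiplication in Z_q[x]/(x^n + 1): coefficients that overflow past
# degree n wrap around with a sign flip, because x^n = -1 in this ring.
q, n = 17, 4

def negacyclic_mul(a, b):
    res = [0] * n
    for i in range(n):
        for j in range(n):
            k = i + j
            if k < n:
                res[k] = (res[k] + a[i] * b[j]) % q
            else:  # x^n wraps to -1
                res[k - n] = (res[k - n] - a[i] * b[j]) % q
    return res

prod = negacyclic_mul([1, 2, 3, 4], [5, 6, 7, 8])
```

The toy `q` and `n` here are chosen only for readability; the structure (reduce modulo q and modulo x^n + 1) is what Kyber's NTT-based multiplier computes.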
One of the oldest encryption schemes making use of structured lattices, Nth Degree Truncated Polynomial Ring Units (NTRU) was first developed by Hoffstein, Pipher, and Silverman in 1998. The current round-three NIST submission is a merger of the original NTRU and NTRU-HRSS submission. The merger occurred after the first round, due to significant design overlaps.
A structured lattice-based KEM, NTRU's security is based on a different assumption to the Ring-LWE (RLWE) or MLWE-based approach several other lattice-based candidates utilize. NTRU lacks a formal worst-case-to-average-case reduction. However, it is a widely analyzed scheme with a long and established history. Versions of NTRU have already been standardized by other organizations.
The NTRU-HRSS part of the submission already contains a high-speed, constant-time implementation and was one of the fastest first-round submissions. NTRU is known for its speed on constrained devices, with implementations dating back to at least 2001, and its security implementation is mature. It was chosen by Cloudflare and Google for their second PQC experiment and used in connections from users running Chrome Canary to Google and Cloudflare.
NTRU has a small performance disparity compared to Kyber and Saber. However, its longer history supported NIST's decision to select NTRU as a finalist with less risk of unexpected intellectual property claims. NIST has stated that at most one of Kyber, Saber, and NTRU will be standardized at the end of the third round. Should new cryptanalytic results or intellectual property claims threaten Kyber or Saber, NTRU would become the more appealing finalist.
SABER is a family of cryptographic primitives, which includes an IND-CPA secure encryption scheme and an IND-CCA secure KEM. Its security is based on Module Learning with Rounding (MLWR), a variant of MLWE in which the addition of small error terms is instead replaced by rounding from one modulus to a second, smaller modulus. Reductions to MLWR from MLWE do exist. However, they are not directly applicable to Saber, which is a mild concern to NIST.
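The rounding idea can be sketched in a few lines: instead of adding a sampled error term, LWR keeps only the high-order part of the inner product. Parameters below are illustrative only, not Saber's:

```python
import random

# Learning-with-Rounding: b = round((p/q) * <a, s>) mod p. The discarded
# low bits play the role of LWE's small error, but no noise is sampled.
random.seed(7)
q, p, n = 1024, 256, 8  # toy moduli with p < q, both powers of two
s = [random.randrange(q) for _ in range(n)]  # secret vector
a = [random.randrange(q) for _ in range(n)]  # public vector

inner = sum(x * y for x, y in zip(a, s)) % q
b = ((inner * p + q // 2) // q) % p  # scale from mod q down to mod p
```

Determinism is the practical appeal: rounding needs no noise sampling at all, which simplifies implementations, at the cost of the less direct security reductions NIST noted.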
Saber has public key sizes of 672, 992, and 1312 bytes and ciphertext sizes of 736, 1088, and 1472 bytes for security levels 1, 3, and 5 respectively. Saber has been implemented on embedded processors, including the Cortex-M4 and Cortex-M0. The scheme has also been integrated into the network protocol Post-quantum WireGuard for exchanging ephemeral keys.
Dilithium is a signature scheme that follows the approach of building signatures from identification schemes via Fiat-Shamir with aborts. Its security relies on the hardness of the Module Learning With Errors (MLWE) and Module Short Integer Solution (MSIS) problems. Its design allows fast multiplications using the NTT and avoids generating randomness from a discrete Gaussian distribution, instead sampling from a uniform distribution, resulting in a simpler implementation than its main competitor, Falcon. Dilithium has balanced performance in key and signature sizes and in the efficiency of the key generation, signing, and verification algorithms, and it performs well in real-world experiments.
Falcon is a signature scheme based on the Gentry-Peikert-Vaikuntanathan (GPV) blueprint for lattice-based signatures using (preimage-samplable) trapdoor functions. Falcon instantiates that framework over NTRU lattices, with a trapdoor sampler called "fast Fourier sampling". Security is derived from the underlying short integer solution problem (SIS) over NTRU lattices. There are no efficient solving algorithms currently known in the general case, even with the advent of quantum computers. This problem benefits from the history of cryptanalysis for the NTRU cryptosystem.
Falcon is a compact (smallest combined size of public key and signature among all NIST candidates) and efficient post-quantum signature scheme with security based on well-established assumptions. Its chosen ring structure and error distribution facilitate efficient FFT-based (Fast Fourier Transform) implementation, partially counteracting the less beneficial effects of using a Gaussian error distribution. A drawback of the Falcon scheme is the complexity of its construction and ensuring correct implementation.
Rainbow is a multivariate signature scheme designed in 2004 by Jintai Ding and Dieter Schmidt, based on the oil-and-vinegar signature scheme originally invented by Jacques Patarin. Rainbow's theoretical security rests on the NP-hardness of solving a random system of multivariate quadratic equations. Its trapdoor function is best described as a composition of two or more oil-and-vinegar trapdoors. The design philosophy is that iterating the OV trapdoor increases resistance to attacks and allows more efficient parameter choices. However, the additional complexity also opens up some new attack strategies.
Rainbow has small signatures of only a few hundred bits (528 bits, or 66 bytes, at NIST level 1 security), much shorter than other post-quantum signature schemes. Rainbow also uses only simple operations over small finite fields, so signature generation and verification are extremely efficient and suitable for implementation on low-cost devices. However, Rainbow's public keys are relatively large (e.g. 158 kB at NIST level 1 security). The public key can be shrunk by almost a factor of three, at the expense of slower signing.
NIST round-three alternate candidates include:
- BIKE (encryption): decoding quasi-cyclic codes
- HQC (encryption): coding variant of Ring-LWE
- NTRU Prime (encryption): non-cyclotomic NTRU problem or Ring-LWE
- GeMSS (digital signature): multivariate equations
NIST expects to hold a fourth round of evaluation for some of the alternate candidates. While several alternates may perform worse than the finalists, they may still be selected for standardization due to high confidence in their security. Others have acceptable performance but require further analysis to establish confidence in their security.
Other PQC standardization projects are underway, including work from:
- The Chinese Association of Cryptographic Research
- The NSA, which plans to add lattice-based algorithms and stateful hash-based signatures to its Commercial National Security Algorithm Suite
- Consortiums of academic and industrial entities, such as Open Quantum Safe, Safecrypto, PQCRYPTO, PROMETHEUS, and RISQ
Future PQC standards will likely specify multiple algorithms suited to different applications and implementation constraints (e.g. sensitivity to large signatures or large keys). For some applications a large signature or key size is no problem, while for others it is untenable, so NIST standards may recognize the need for multiple algorithm deployments depending on the application.
Replacing a cryptographic algorithm generally requires changing or replacing:
- cryptographic libraries
- implementation validation tools
- hardware that implements or accelerates algorithm performance
- dependent operating system and application code
- communications devices and protocols
- user and administrative procedures
- security standards, procedures, and best practice documentation
- installation, configuration, and administration documentation
Many IT and operational technology (OT) systems are dependent on public-key cryptography. However, many organizations lack inventories of where public-key cryptography is used. This creates challenges determining where to prioritize replacing the current public-key systems with PQC algorithms. Similarly, cybersecurity standards, guidelines, and operational directives generally specify or presume the use of public-key cryptography, and no inventory of these exists to guide updates to the standards, guidelines, and regulations necessary for migration to post-quantum cryptography.
- Quantum computing: a type of computation that harnesses the collective properties of quantum states, such as superposition, interference, and entanglement, to perform calculations. The devices that perform quantum computations are known as quantum computers.
- Quantum cryptography: a method of encryption using naturally occurring quantum mechanical properties to secure and transmit data in a way that cannot be hacked.