Quantum technology and its impact on security in mobile networks

While today’s systems will remain secure for many years to come, crypto-breaking quantum computers do present a serious potential risk further into the future. To address this risk, new post-quantum algorithms that can easily be added to existing equipment and protocols are already in the final stages of standardization.

Ericsson CTO Erik Ekudden’s view on quantum technology

The quantum technology of the future may have the potential to break some of the cryptography that provides security in today’s mobile networks. While the risk is only theoretical at present and there is no way of knowing for certain if crypto-breaking quantum computers will ever actually exist, I encourage all communication service providers to prepare for that possibility. With the ability to decrypt communication, forge certificates and install fraudulent firmware updates, a quantum attacker could do enormous damage.

This article begins with a concise overview of the risks posed by quantum technology and goes on to explore the post-quantum cryptography solutions currently being standardized by organizations such as the US National Institute of Standards and Technology and the Internet Engineering Task Force.

December 7, 2021

Authors: John Preuß Mattsson, Ben Smeets, Erik Thormarker

AES – Advanced Encryption Standard
CNSA – Commercial National Security Algorithm
CRQC – Cryptographically Relevant Quantum Computer
ECC – Elliptic Curve Cryptography
ECDH – Elliptic Curve Diffie–Hellman
ECDSA – Elliptic Curve Digital Signature Algorithm
IRTF – Internet Research Task Force
KEM – Key Encapsulation Mechanism
LMS – Leighton-Micali Signature
NIST – National Institute of Standards and Technology (US)
NSA – National Security Agency (US)
NSS – National Security System (US)
NTRU – N-th degree Truncated polynomial Ring
PQC – Post-Quantum Cryptography
QKD – Quantum Key Distribution
QRNG – Quantum Random Number Generator
RSA – Rivest-Shamir-Adleman
SHA – Secure Hash Algorithm
SIKE – Supersingular Isogeny Key Encapsulation
XMSS – eXtended Merkle Signature Scheme

Over the last 50 years, cryptography has evolved from its military and diplomatic origins into a rich and widely used toolbox for building complex cryptographic solutions for a multitude of applications. In the ICT industry, for example, an efficient combination of symmetric and public-key (asymmetric) cryptography is critical to the security of virtually every product, service and interface in use today.

Modern critical infrastructure such as 5G is implemented with zero trust principles where cryptography is used for confidentiality, integrity protection, and authentication on many of the logical layers of the network stack, often all the way from device to software in the cloud [1]. The cryptographic solutions in use today are based on well-understood primitives, provably secure protocols and state-of-the-art implementations that are secure against a variety of side-channel attacks.

The first signs of a serious quantum challenge to modern cryptography arose in 1994, when the mathematician Peter Shor proved that quantum computers can efficiently factor large integers and solve the discrete logarithm problem, which is believed to be intractable on ordinary computers. Unfortunately, Shor’s result also showed that if sufficiently large and robust quantum computers can be built, then today’s public-key cryptography – which relies on the intractability of these problems – will be broken.

Multiple public initiatives to build quantum computers are underway in industry and academia at present, but the gap between today’s quantum computers and ones that could threaten current public-key cryptography is huge. It is believed that breaking today’s public-key cryptography with Shor’s algorithm would require millions of so-called qubits – the quantum equivalents of bits in ordinary computers. Today’s quantum computers typically have a maximum of about 100 qubits, and they are not nearly as robust as they would need to be to execute Shor’s algorithm.

While the future progress of robust quantum computers is complex and uncertain, it should not be judged on simple metrics such as qubit count alone. Assuming a Moore’s law type of growth in qubit count, the scaling from 100 qubits to millions of qubits would take 25-30 years. Recent claims by researchers of reaching quantum supremacy do not tell us anything substantial about the speed at which the gap is closing between today’s quantum computers and the hypothetical machines that could threaten public-key cryptography.
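To make the arithmetic behind that 25-30 year figure concrete, here is a minimal back-of-the-envelope sketch in Python. The two-year doubling period is an assumption chosen purely for illustration, not a measured trend:

```python
import math

# Back-of-the-envelope scaling estimate (illustrative assumptions only):
# start at ~100 qubits, target ~1 million, and assume the qubit count
# doubles every two years, in analogy with Moore's law.
current_qubits = 100
target_qubits = 1_000_000
doubling_period_years = 2  # assumed for illustration, not a measured trend

doublings = math.log2(target_qubits / current_qubits)  # ~13.3 doublings
years = doublings * doubling_period_years              # ~27 years

print(f"{doublings:.1f} doublings -> roughly {years:.0f} years")
```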

1976 – Diffie-Hellman key exchange
1977 – RSA cryptosystem
1978 – Code-based cryptography
1979 – Hash-based cryptography
1980 – Realization that a quantum computer can simulate things a classical computer cannot
1984 – Quantum key distribution
1985 – Elliptic curve cryptography
1994 – Shor’s quantum algorithm factors large integers in polynomial time, compared with the sub-exponential time of the best known classical algorithms
1996 – Grover’s quantum algorithm inverts any function using only about √N evaluations of the function
1996 – Multivariate-quadratic-equations cryptography
1998 – Lattice-based cryptography
1998 – Quantum computer with two physical qubits
2001 – First quantum key distribution network
2011 – Supersingular elliptic curve isogeny cryptography
2015 – US government (NSA) announces it is planning to transition “in the not too distant future” from Suite B/CNSA to a new suite that is resistant to quantum attacks
2017 – The NIST announces the PQC standardization program
2018 – Standardization of stateful hash-based signatures (XMSS and LMS) by the IRTF Crypto Forum Research Group and the NIST
2019 – Quantum computer with 53 physical qubits
2022 – Target date for NIST to announce the first set of PQC algorithms for standardization and for the NSA to update the CNSA suite with PQC
2022-23 – Target date for draft NIST PQC standards
2024 – Target date for final NIST PQC standards

Risks presented by quantum technology

Nobody knows if large-scale, robust quantum computers capable of attacking public-key cryptography – sometimes called Cryptographically Relevant Quantum Computers (CRQCs) – will ever be built. A 2019 report by a committee of experts judged that the emergence of a CRQC during the next decade would be highly unexpected [2]. The committee also pointed out that there are no known applications for the intermediate, medium-scale quantum computers that may appear in the coming years.

For most types of problem solving, quantum computers are much slower than ordinary computers, as quantum error correction reduces the effective clock speed and the number of usable qubits by several orders of magnitude, as shown in Figure 1. As a result, quantum computers are not general-purpose supercomputers, but rather potential special-purpose machines for physics simulations and certain problems that require clever quantum algorithms.

Figure 1: Envisioned structure of future quantum computers

Some commentators have argued that the development of quantum computing could lose momentum due to a lack of short-term applications or if its progress is too slow [3]. Nonetheless, as the consequences of success would be so severe from a security point of view, anyone who uses public-key cryptography such as RSA and elliptic curve cryptography (ECC) should start preparing now for the possibility that such large-scale machines could someday be built.

After all, a quantum attacker could not only decrypt communication, but also forge certificates and install fraudulent firmware updates. This would completely break the security of most consumer electronics, enterprise networks, the industrial Internet of Things and critical infrastructure. Even worse, information encrypted using public-key cryptography today could be recorded by attackers and used for attacks in the future when large-scale robust quantum computers potentially exist.

Fortunately, an alternative is already available for very long-lived signature keys such as those used in firmware updates. Stateful hash-based signatures have well-understood security, and have already been standardized by the Internet Engineering Task Force (IETF) and the US National Institute of Standards and Technology (NIST) [4]. There is a serious limitation to stateful hash-based signatures, however. Because they are stateful, they are only suitable for very specific applications.
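To see why hash-based signatures are stateful and one-time by nature, consider the classic Lamport scheme, the conceptual ancestor of LMS and XMSS. The following is a minimal Python sketch using only the standard library – an illustration of the principle, not a production implementation:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of secret 32-byte preimages; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits_of(digest: bytes):
    # The 256 bits of a SHA-256 digest, most significant bit first.
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one preimage per bit of H(msg). A key pair must be used for
    # ONE message only -- reuse reveals preimages for both bit values.
    return [sk[i][bit] for i, bit in enumerate(bits_of(H(msg)))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits_of(H(msg))))

sk, pk = keygen()
signature = sign(sk, b"firmware image v1.2")
assert verify(pk, b"firmware image v1.2", signature)
```

Schemes such as LMS and XMSS make this idea practical by authenticating many one-time keys under a single Merkle-tree root. The state that must be tracked is the index of the next unused one-time key, which is why a signer that loses or repeats its state loses security – the limitation noted above.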

Migration toward post-quantum cryptography

The NIST’s post-quantum cryptography (PQC) standardization [5] is the most important ongoing project aimed at securing public-key cryptography against the threat of quantum computers. The purpose of the project is to standardize new algorithms that are believed to be secure against quantum computers. When standardized, these new primitives can replace today’s public-key cryptography used for key exchange, public-key encryption and digital signatures. The new algorithms are typically as fast as today’s ECC, but with significantly larger public keys, key encapsulations and signatures. The NIST aims to release draft standards for the first new PQC algorithms in 2022-23.

Lattice-based algorithms

The most important new class of post-quantum algorithms is lattice-based. These have public keys, key encapsulations and signatures starting in the 600-900 byte range, whereas the corresponding quantities for current ECC are typically 32-64 bytes. There have been no significant new attacks against the lattice-based proposals during the standardization process, and the related mathematical problems have been studied extensively for the past two decades. Lattice-based proposals such as Kyber/Dilithium [6] offer a good middle way for PQC, with efficient running times and average-sized communication overhead.

Potential key encapsulation mechanism and digital signature candidates

The two tables in Figure 2 list performance and communication overhead for some of the key encapsulation mechanism (KEM) and digital signature candidates (finalists and alternates) in the NIST PQC standardization, at their smallest parameter sets [7, 8, 9]. The LMS algorithm is a stateful hash-based signature scheme with slow key generation; its signing and verification take at most a few milliseconds on a platform comparable to those used for the other algorithms in the table. Being stateful, LMS is not in scope in the NIST PQC standardization. We have included it in the tables for comparison purposes, along with today’s most important public-key cryptography algorithms.

Figure 2: Tables showing performance and communication overhead for some of the KEM and digital signature candidates in the NIST standardization

Ericsson’s role

Ericsson is engaged in the NIST PQC standardization and in the PQC discussions in the IETF, 3GPP and ETSI, and will remain active when standards used in 5G such as TLS (Transport Layer Security), IKEv2 (Internet Key Exchange version 2), X.509, JOSE (JavaScript Object Signing & Encryption) and 5G SUCI (Subscription Concealed Identifier) are updated with the finalized NIST algorithms. While standards may be updated to support the new NIST PQC algorithms, it remains to be seen how quickly current public-key cryptography will be deprecated. This may, in part, depend on the progress in building quantum computers in the coming years. There is a balance to strike between making prudent preparations for switching to PQC and making sure that the investment in implementing PQC will be a secure and good long-term choice.

One way in which we are preparing Ericsson’s products is by aligning with practices in the NIST Migration to Post-Quantum Cryptography project [10]. One key practice is crypto-agility – the ability to upgrade cryptography and, for example, be prepared for the larger public keys used in PQC. The US National Security Agency’s (NSA’s) Commercial National Security Algorithm (CNSA) cryptography suite is used to protect information in national security systems (NSSs) [11]. The CNSA suite is still not quantum-resistant, even though information in NSSs may need protection for decades. This indicates that the NSA feels confident that large-scale robust quantum computers will not be a threat for decades to come.

For the most part, standardization organizations, governments and industries are waiting for the final outcome of the NIST PQC standardization before they take action. The NSA became the exception recently when it announced its plans to add support in the CNSA suite for some of the lattice-based proposals at the end of the third round of the NIST standardization, planned for early 2022.

Post-quantum cryptography algorithm deployment

The initial deployment of the new PQC algorithms may be done in combination with current public-key cryptography so that, for example, an attacker would need to break both a conventional elliptic curve Diffie-Hellman KEM and one of the new PQC KEMs to learn an established session key in a communication protocol. For the most part, the migration to PQC is an algorithm update just like the previous updates from DES (Data Encryption Standard) to AES (Advanced Encryption Standard) and from SHA-1 (Secure Hash Algorithm) to SHA-2, but the larger sizes and slightly limited properties may require changes in protocols and application programming interfaces. The communication overhead of the new algorithms could lead to packet fragmentation in network communication, for example.
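As an illustration of the hybrid approach, the sketch below combines two shared secrets with HKDF (RFC 5869) so that the session key remains secure as long as at least one of the inputs is unbroken. The two input secrets are random placeholder bytes standing in for the outputs of a real ECDH exchange and a real PQC KEM; actual protocol integrations, such as the hybrid key exchange work in the IETF, define their own exact constructions:

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract step (RFC 5869) with HMAC-SHA-256.
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF-Expand step (RFC 5869), enough for a 32-byte session key.
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Placeholder shared secrets standing in for real protocol output:
ss_classical = os.urandom(32)  # e.g., from an ECDH exchange
ss_pq = os.urandom(32)         # e.g., from a PQC KEM such as Kyber

# Concatenate-then-KDF: the derived session key stays secure as long
# as at least one of the two input secrets remains unbroken.
prk = hkdf_extract(salt=b"\x00" * 32, ikm=ss_classical + ss_pq)
session_key = hkdf_expand(prk, info=b"hybrid key exchange demo")
```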

Quantum impact on symmetric cryptography

In 1996, Shor’s result was complemented by an algorithm developed by the computer scientist Lov Grover, which showed that quantum computers could search through the possible inputs to a black-box function to find an input that gives a sought output. While Grover’s algorithm can do this in far fewer evaluations of the black-box function than any ordinary algorithm, it is still very slow compared with Shor’s quantum algorithm. (The meaning of black box in this context is that Grover’s algorithm does not rely on any internal structure of the function – it is a generic method.)

In theory, an attacker with a quantum computer can use Grover’s algorithm to break the symmetric cipher AES-128 through a quantum computation that consists of 2⁶⁴ serial AES-128 encryptions. Each such AES-128 encryption in turn consists of approximately 2¹¹ serial quantum gates. This gives a total serial computation of length 2⁷⁵ quantum gates. However, the quantum gates can introduce errors, and further overhead piles up from quantum error correction. What all this means in practice is that the attacker must split up the computation over multiple quantum computers. Since Grover’s algorithm does not parallelize efficiently, as illustrated in Figure 3, the use of 100 quantum computers would only speed up the computation by a factor of 10.
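The cost figures above can be checked with a few lines of arithmetic, as in this small Python sketch:

```python
import math

# Reproduce the rough cost figures from the text.
grover_iterations = 2 ** 64   # ~sqrt(2^128) evaluations for AES-128
gates_per_aes = 2 ** 11       # approximate serial gates per AES-128 encryption
total_serial_gates = grover_iterations * gates_per_aes
print(math.log2(total_serial_gates))  # 75.0 -> 2^75 serial quantum gates

# Grover parallelizes poorly: splitting the search over K machines only
# reduces the serial depth per machine by a factor of sqrt(K).
K = 100
print(math.sqrt(K))  # 10.0 -> 100 quantum computers give only a 10x speedup
```

The square-root behavior in the last step is the crux: a classical exhaustive search gains a factor of 100 from 100 machines, whereas Grover’s search gains only a factor of 10.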

Figure 3: Parallelization of key search using (a) ordinary computers and (b) quantum computers and Grover’s algorithm

Considering all this, Grover’s algorithm does not pose any apparent threat to symmetric cryptography. Some years ago, it was commonly believed that Grover’s algorithm required symmetric key sizes to be doubled – requiring the use of AES-256 instead of AES-128. This is today considered a misconception: NIST, for example, now states that AES-128 will likely remain secure for decades to come, despite Grover’s algorithm [5].

In fact, one of the security levels in the NIST PQC standardization is equivalent to that of AES-128. This means that NIST thinks it is relevant to standardize parameters for PQC that are as strong under quantum attacks as AES-128. There could, of course, be other reasons why a longer key is needed, such as compliance, and using a longer key only has a marginal effect on performance.

In summary, our most important symmetric cryptographic tools (AES, SNOW 3G, SHA-2, SHA-3 and so on) remain secure against quantum computers as they are. The same applies to the authentication, key generation, encryption and integrity protection in 3G, 4G and 5G, all of which rely purely on symmetric cryptography.

Quantum cryptography

The idea of quantum cryptography is to leverage quantum mechanics to build cryptography. This is very different from, for example, the post-quantum cryptography that is being standardized by NIST, which can run completely in software like any other conventional cryptography. While quantum cryptography is an exciting academic research topic, its practical security applications are as yet uncertain. So far, quantum key distribution (QKD) and quantum random number generators (QRNGs) are the two types of quantum cryptography that have sparked the most interest. However, current implementations still have a long way to go before they are hardened and certified for practical use.

Quantum key distribution

QKD is a quantum-resistant mechanism for key distribution in which two parties agree on a secret key by sending photons between them with the help of a second (ordinary) authenticated communication channel, as shown in the bottom half of Figure 4. An idealized mathematical abstraction of QKD is famously unconditionally secure. While security proofs for theoretical constructions are an important building block in conventional cryptography as well, it is important to understand that the most important threat surface of cryptography is consistently found to be in the implementation details. The main principle for managing this threat in conventional cryptography is to use well-reviewed implementations that build on collective implementation knowledge that has been gained over decades.

Figure 4: Differences between PQC and QKD when applied to network infrastructure

In contrast to conventional cryptography and PQC, the security of QKD is inherently tied to the physical layer, which makes the threat surfaces of QKD and conventional cryptography quite different. QKD implementations have already been subjected to publicized attacks [12] and the NSA notes that the risk profile of conventional cryptography is better understood [13]. The fact that conventional cryptography and PQC are implemented at a higher layer than the physical one means PQC can be used to securely send protected information through untrusted relays, as illustrated in the top half of Figure 4. This is in stark contrast with QKD, which relies on hop-by-hop security between intermediate trusted nodes. The PQC approach is better aligned with the modern technology environment, in which more applications are moving toward end-to-end security and zero-trust principles. It is also important to note that while PQC can be deployed as a software update, QKD requires new hardware.

Regarding QKD implementation details, the NSA states that communication needs and security requirements physically conflict in QKD and that the engineering required to balance them has extremely low tolerance for error. While conventional cryptography can be implemented in hardware in some cases for performance or other reasons, QKD is inherently tied to hardware. The NSA points out that this makes QKD less flexible with regard to upgrades or security patches. As QKD is fundamentally a point-to-point protocol, the NSA also notes that QKD networks often require the use of trusted relays, which increases the security risk from insider threats.

As QKD requires external authentication through conventional cryptography, the UK’s National Cyber Security Centre cautions against sole reliance on it, especially in critical national infrastructure sectors, and suggests that PQC as standardized by the NIST is a better solution [14]. Meanwhile, the National Cybersecurity Agency of France has decided that QKD could be considered as a defense-in-depth measure complementing conventional cryptography, as long as the cost incurred does not adversely affect the mitigation of current threats to IT systems [15].

Quantum random number generators

Secure randomness is critical in cryptography – if the quality of the randomness generator is poor, numerous cryptographic protocols will fail to deliver security. Although conventional hardware randomness generator technology is robust and secure against quantum computers, QRNGs have nonetheless attracted some attention in recent years. QRNGs derive their randomness from a physical realization of a quantum-mechanical process, instead of the other physical processes used in conventional hardware randomness generators.

QRNGs are sometimes advertised as generating perfect unbiased random bits in contrast to the biased bits that come from conventional generators. In reality, though, any bias in the bits output by conventional generators is smoothed out in post-processing through the application of pseudo-random number generators, which work according to the same mechanism that enables a single 128-bit AES key to produce many gigabytes of random-looking encrypted data.
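The following Python sketch illustrates this post-processing idea: a conditioning step compresses raw, possibly biased samples into a short seed, and a deterministic expansion step stretches that seed into arbitrarily long random-looking output. Both functions are simplified stand-ins; real systems use vetted designs such as those in the NIST SP 800-90 series:

```python
import hashlib
import hmac

def condition(raw_samples: bytes) -> bytes:
    # Conditioning step: compress raw, possibly biased entropy into a
    # uniform-looking 32-byte seed. Real designs use vetted constructions
    # (e.g., the conditioning functions of NIST SP 800-90B).
    return hashlib.sha256(raw_samples).digest()

def expand(seed: bytes, n_blocks: int) -> bytes:
    # Deterministic expansion, HMAC-based for illustration: one seed yields
    # a long stream of output, analogous to how a single 128-bit AES key
    # can produce gigabytes of random-looking encrypted data.
    return b"".join(
        hmac.new(seed, i.to_bytes(4, "big"), hashlib.sha256).digest()
        for i in range(n_blocks)
    )

# Toy stand-in for biased raw samples: mostly zeros with a little noise.
raw = bytes([0, 0, 1, 0, 0, 3, 0, 0] * 64)
stream = expand(condition(raw), n_blocks=4)  # 128 bytes of output
```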

If QRNG technology becomes as well understood in the future as our current hardware randomness generator technology, then it could, in principle, be certified, validated and evaluated on the same grounds.

Conclusion

While we do not expect quantum computers with the ability to attack current cryptography to emerge for many years to come, we strongly encourage communication service providers to start planning the process of migrating to post-quantum cryptography. With the support of vendors including Ericsson, standards-developing organizations such as the US National Institute of Standards and Technology, the Internet Engineering Task Force and the 3GPP are working on new, post-quantum algorithms and updated protocols that can easily be added to existing equipment and interfaces. Currently in the final stages of standardization, these algorithms will be available in the next couple of years to help our industry mitigate potential future threats against mobile infrastructure and services.

References

Further reading

The evolution of cryptography in mobile networks and how to secure them in the future

Mobile network security has come a long way in the decades since cryptography was first introduced into the GSM standard. We take an in-depth look back over the algorithms and protocols that brought us this far – and see how well prepared we are for the quantum challenges of the future.

The security implications of quantum cryptography and quantum computing

Classical cryptography relies on the assumption that nobody can solve a certain difficult mathematical problem in a realistic amount of time or rely on information theory arguments. Quantum cryptography relies instead on fundamental quantum physics laws. Using large quantum computers, one could break all classical asymmetric algorithms currently used for key distribution and digital signatures.

An introduction to quantum computer technology


Authors

John Preuß Mattsson

is a senior specialist in internet security protocols. He joined Ericsson in 2007 and has been active in many standardization organizations such as the 3GPP, the GSMA, the IETF, the IRTF (Internet Research Task Force) and the NIST. His work focuses primarily on cryptography, security protocols, the Internet of Things and trade compliance. Mattsson holds an M.Sc. in engineering physics from KTH Royal Institute of Technology, Stockholm, Sweden, and an M.Sc. in business administration and economics from Stockholm University.

Ben Smeets

is a senior expert in trusted computing at Ericsson Research. He joined Ericsson in 1998 and started out working on security solutions for mobile phone platforms. He is currently working on trusted computing technologies in connection with containers and secure enclaves. Smeets holds a Ph.D. in information theory from Lund University, Sweden, where he also serves as a professor.

Erik Thormarker

joined Ericsson in 2018 as an experienced researcher. His research interests include post-quantum cryptography, cryptographic protocols and cryptanalysis. Thormarker holds an M.Sc. from the joint master’s program in mathematics at KTH Royal Institute of Technology and Stockholm University.