
NIST Issues New Quantum Crypto Standards for Cyberspace

NIST, the US National Institute of Standards and Technology, has finally published a trio of new standards for post-quantum cryptography (PQC) in an attempt to get ahead of the coming cryptography crisis that’s forecast for the time when quantum computers become powerful enough to crack current RSA (Rivest-Shamir-Adleman) public-key encryption standards. Although the RSA algorithm was published in 1977 and predates the Internet by a decade or so, today’s Internet services, including the Secure Shell (SSH), OpenPGP, S/MIME, and SSL/TLS protocols, rely on RSA for encryption and digital signature functions. Other public-key crypto standards, including the Digital Signature Algorithm (DSA) and elliptic-curve cryptography, are similarly at risk. Once RSA and these other standards can be easily broken, all data flowing through the Internet’s pipes will be at risk.

RSA’s security rests on the difficulty of factoring very large integers into their prime factors, which is a time-consuming task for conventional computers based on the von Neumann architecture. As conventional computers became faster, we simply increased the size of the RSA modulus, from 512 bits to 1024, 2048, and 4096 bits, and beyond. We thought that these bigger and bigger numbers would keep the crypto wolves away from our doors.
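For readers who want to see that dependency on factoring spelled out, here is a minimal Python sketch with toy numbers (the primes, exponent, and message are purely illustrative and far too small to be secure): anyone who can factor the public modulus can rederive the private key and read the traffic.

```python
# Toy illustration (numbers chosen for readability, not security): factoring the
# public modulus n lets an attacker rebuild the RSA private key.
# Real RSA moduli are 2,048 bits or more precisely because factoring them is hard.

# Key generation with deliberately tiny primes
p, q = 61, 53
n = p * q                       # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                          # public exponent, coprime to phi
d = pow(e, -1, phi)             # private exponent (modular inverse, Python 3.8+)

# Encrypt a small message m < n with the public key (n, e)
m = 42
c = pow(m, e, n)                # ciphertext = m^e mod n

# The "attack": factor n by trial division (trivial here, infeasible at real sizes),
# then rebuild the private exponent and decrypt without ever holding the private key.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_found = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(c, d_found, n) == m  # plaintext recovered
```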

Then, in 1994, Peter Shor published an algorithm that would allow quantum computers to quickly find the prime factors of large integers. Given enough error-corrected qubits, a quantum computer could break RSA encryption in hours, and the entire Internet would be vulnerable. Back in 1994, this situation didn’t seem dire. Now, it does. A commonly cited estimate is that quantum computers capable of decrypting RSA-encrypted messages could arrive around 2030. That’s only six years from now.
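Shor’s insight reduces factoring to finding the multiplicative order (period) of a random base modulo N, the one step a quantum computer can do exponentially faster than a classical machine. The sketch below shows only that classical reduction; the brute-force order() function stands in for the quantum period-finding step and is feasible only for toy moduli.

```python
# Number-theoretic reduction behind Shor's algorithm: given the order r of a
# random base a modulo N, nontrivial factors of N fall out of a gcd computation.
from math import gcd
from random import randrange

def order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n); brute force stands in for the quantum step."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n):
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g, n // g                  # lucky guess: a already shares a factor
        r = order(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            p = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < p < n:
                return p, n // p              # (a^(r/2) - 1)(a^(r/2) + 1) = 0 mod n

print(shor_classical(3233))                   # e.g. (53, 61) for the toy modulus above
```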

NIST issued its original call for PQC proposals back in 2016, drew 82 submissions – all based on math problems believed to be hard for quantum computers as well as conventional ones – and winnowed them through four rounds of evaluation. In 2022, NIST announced the four PQC algorithm finalists:

  • CRYSTALS-Kyber
  • CRYSTALS-Dilithium
  • FALCON
  • SPHINCS+

Now, NIST has published three of these finalists as Federal Information Processing Standards (FIPS); a brief, code-level sketch of how the first two are used appears after the list:

  • FIPS 203: ML-KEM (derived from CRYSTALS-Kyber) — a key encapsulation mechanism selected for general encryption, such as for securing websites
  • FIPS 204: ML-DSA (derived from CRYSTALS-Dilithium) — a lattice-based algorithm chosen for general-purpose digital signature protocols
  • FIPS 205: SLH-DSA (derived from SPHINCS+) — a stateless hash-based digital signature scheme
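To give a rough sense of how these standards are used in practice, here is a hedged, API-level sketch using the Open Quantum Safe project’s liboqs-python bindings. The algorithm strings ("ML-KEM-768", "ML-DSA-65") and method names reflect recent releases and may differ in older versions, which exposed the same schemes under their pre-standard names (Kyber, Dilithium).

```python
# Sketch of ML-KEM (FIPS 203) and ML-DSA (FIPS 204) usage via liboqs-python.
# Assumes: pip install liboqs-python; exact names may vary by library version.
import oqs

# FIPS 203 / ML-KEM: the server encapsulates a shared secret to the client's public key.
with oqs.KeyEncapsulation("ML-KEM-768") as client, \
     oqs.KeyEncapsulation("ML-KEM-768") as server:
    client_public_key = client.generate_keypair()
    ciphertext, server_secret = server.encap_secret(client_public_key)
    client_secret = client.decap_secret(ciphertext)
    assert client_secret == server_secret     # both sides now hold the same symmetric key

# FIPS 204 / ML-DSA: sign and verify a message.
message = b"post-quantum signatures, standardized"
with oqs.Signature("ML-DSA-65") as signer, oqs.Signature("ML-DSA-65") as verifier:
    signer_public_key = signer.generate_keypair()
    signature = signer.sign(message)
    assert verifier.verify(message, signature, signer_public_key)
```

The division of labor is the point to notice: ML-KEM establishes a shared symmetric key for encryption, while ML-DSA (and SLH-DSA) authenticate messages and software.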

NIST expects to publish the proposed FALCON algorithm as the FIPS 206 standard in late 2024.

Notably, cryptography researchers at IBM Research in Zurich developed two of these standards – ML-KEM and ML-DSA – with external collaborators. The third standard – SLH-DSA – was co-developed by a scientist who has since joined IBM Research. Clearly, this is a topic that greatly interests IBM. Considering that IBM is also a leading developer of quantum computers, you could say that the company is working on both the problem and its solution, with respect to cryptography in cyberspace.

With standard algorithms in place, the next hurdle is implementing these algorithms and deploying them throughout cyberspace. That seems like a task similar in concept to the Y2K exercise, but cyberspace is much bigger now. In May 2022, the US White House issued a National Security Memorandum (NSM-10) outlining how US agencies will migrate to the new standards. The US Congress then passed the Quantum Computing Cybersecurity Preparedness Act, which directs federal agencies to prepare an inventory of quantum-vulnerable cryptosystems that need to be upgraded to the new crypto standards. The US National Security Agency (NSA) then issued its Commercial National Security Algorithm Suite 2.0, which sets a deadline for National Security Systems to complete the quantum-safe migration by the year 2035. Protecting commercial and enterprise systems will be an even bigger task, as will making sure that converted systems can handle both the old and new crypto standards.

Initial implementations of these PQC standards will be in software. For example, Google began testing PQC in its Chrome Web browser in 2016, and the company has been using PQC to protect its internal communications since 2022. In May 2024, the company enabled an ML-KEM-based key exchange by default in its desktop browser and on Google’s servers. Connections between Chrome Desktop and Google’s products, such as Cloud Console or Gmail, are already experimentally protected with a hybrid post-quantum key exchange that pairs ML-KEM with a classical elliptic-curve exchange.
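Deployments like Chrome’s are hybrid rather than pure-PQC: the session key is derived from both a classical elliptic-curve exchange and an ML-KEM encapsulation, so a break of either primitive alone does not expose traffic. The sketch below illustrates that idea conceptually; it is not Chrome’s or TLS’s actual construction, and it assumes the cryptography and liboqs-python packages are available.

```python
# Conceptual hybrid key exchange: derive one session key from BOTH a classical
# X25519 secret and an ML-KEM secret. Illustrative only; real protocols use a
# standardized KDF and transcript binding rather than a bare hash.
import hashlib
import oqs
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Classical half: ephemeral X25519 Diffie-Hellman
client_x = X25519PrivateKey.generate()
server_x = X25519PrivateKey.generate()
classical_secret = client_x.exchange(server_x.public_key())

# Post-quantum half: ML-KEM encapsulation (see the earlier KEM sketch)
with oqs.KeyEncapsulation("ML-KEM-768") as client_kem, \
     oqs.KeyEncapsulation("ML-KEM-768") as server_kem:
    kem_public_key = client_kem.generate_keypair()
    kem_ciphertext, pq_secret = server_kem.encap_secret(kem_public_key)
    assert client_kem.decap_secret(kem_ciphertext) == pq_secret

# Combine: hash both secrets (plus a context label) into a single session key
session_key = hashlib.sha256(b"hybrid-kex-demo" + classical_secret + pq_secret).digest()
```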

Eventually, these PQC standards will migrate into hardware implementations because that’s the most power-efficient and least expensive way to implement crypto standards. IBM started developing PQC implementations for its z15 mainframe computers in 2017. In 2022, IBM started advertising its z16 mainframe as the industry’s first quantum-safe system. The crypto algorithms are built into the z16 computer’s hardware security module (HSM), a stack of crypto processors that includes a dedicated crypto acceleration processor. IBM’s work on the z16 mainframe computer is a start.

It’s going to take a while for more advanced implementations of these new FIPS PQC standards to become reality. I can recall working with Michael Ji, a developer at Tensilica (now part of Cadence), in the early 2000s; he was accelerating RSA algorithms by adding Galois-field instructions to the Xtensa processor core’s instruction set. Back then, this was a new concept. A quick Google search shows that this is now common practice when accelerating crypto algorithms.

We’re going to need a lot more people to look at the math behind these algorithms to develop similar acceleration methods for the three (soon to be four) FIPS PQC algorithms, and, even then, it will be years before these ideas gel to the point that they can be implemented in hardware. What I expect to see is the usual migration from software, to instruction extensions for configurable processor cores (like ARC, RISC-V, and Xtensa), and finally to dedicated hardware. That’s how these sorts of adaptations have gone in the past, and it’s how I expect them to progress for PQC.

References

Francis Sideco, “Major Quantum Safe Milestone Reached As NIST Publishes PQC Standards,” Forbes.com, August 13, 2024

Michael Osborne, Katia Moskvitch, and Jennifer Janechek, “NIST’s post-quantum cryptography standards are here,” IBM Research, August 13, 2024

Silvio Dragone, “How we quantum-proofed IBM z16,” IBM Research, October 11, 2022

Dina Genkina, “NIST Announces Post-Quantum Cryptography Standards,” IEEE Spectrum, August 13, 2024
