Maria Korolov
Contributing writer

NIST finally settles on quantum-safe crypto standards

News
Aug 13, 2024
7 mins
Network Security

After years of review, the National Institute of Standards and Technology has chosen three encryption algorithms as the basis for its post-quantum security strategy.


After years of review, the National Institute of Standards and Technology has officially picked the world’s first three finalized post-quantum encryption standards as the basis of its quantum-safe security strategy: ML-KEM, ML-DSA, and SLH-DSA.

NIST first asked cryptographers to develop these new standards in 2016, when the threat of quantum computers started becoming a reality. Quantum computers are expected to be able to break common encryption algorithms used today, such as RSA.

By 2022, 69 such algorithms had been submitted to NIST, out of which the agency chose four for further review. Three of those became today’s standards; the fourth, Falcon, was not selected as an initial standard, but its evaluation will continue.

NIST is also continuing to identify and evaluate other algorithms. In today’s announcement, the agency said it expects to name about 15 more in the near future that will proceed to the next round of testing, evaluation, and analysis.

But enterprises should start switching to post-quantum encryption now, without waiting, says NIST mathematician Dustin Moody, who heads the PQC standardization project. The new algorithms will just be backups to the three that NIST announced today.

“There is no need to wait for future standards,” Moody says in today’s announcement. “Go ahead and start using these three. We need to be prepared in case of an attack that defeats the algorithms in these three standards, and we will continue working on backup plans to keep our data safe. But for most applications, these new standards are the main event.”

Lattice-based cryptography

The three new algorithms are all designed for asymmetric encryption, in which the key used to encode the message is different from the key used to decode it. You keep the decoding key secret, just to yourself, and publish the encoding key. Now anyone can send you a secret message that only you can read.

This is called public key encryption and serves as the basis for basically all online communications, for securing websites, for financial transactions, and for key management systems and other specialized applications.

At the heart of today’s public-key systems is the idea that multiplying two large numbers together is relatively easy, but factoring a very large number back into its components is extremely difficult.
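To make that asymmetry concrete, here is a toy Python sketch; the primes are minuscule compared with real RSA moduli, and brute-force trial division stands in for what, at cryptographic sizes, would take classical computers an impractically long time:

```python
# Toy illustration of the asymmetry behind RSA-style public-key crypto:
# multiplying two primes is instant, recovering them from the product is slow.
# Real keys use primes hundreds of digits long; these are tiny and illustrative.
import time

p, q = 15485863, 32452843      # two small known primes
n = p * q                      # the easy direction

def smallest_factor(n):
    """Brute-force search for the smallest nontrivial factor of n."""
    if n % 2 == 0:
        return 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d
        d += 2
    return n

start = time.perf_counter()
factor = smallest_factor(n)
elapsed = time.perf_counter() - start

print(f"n = {n}")
print(f"recovered factor {factor} after {elapsed:.2f} s of trial division")
```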

The new lattice-based encryption methods rely on a different mathematical mechanism, one that is difficult not just for traditional computers but for quantum computers as well.

It’s based on something called the knapsack problem, says Gregor Seiler, a cryptography researcher at IBM. You have a collection of very large numbers. Then you take some of these numbers and add them up. The total is another large number. Adding up numbers is very easy. But figuring out which numbers were used to add up to this total is very difficult.

“This is a very hard problem when the set is really big and the integers are really long,” says Seiler.
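A toy Python sketch of that one-way behavior: producing the total is a single line, while recovering the chosen numbers means searching through an exponential number of subsets. The set sizes here are deliberately tiny so the brute-force search actually finishes:

```python
# Toy subset-sum (knapsack) demo: producing the total is trivial, but finding
# which numbers were added requires searching an exponential number of subsets.
import random
from itertools import combinations

random.seed(1)
pool = [random.getrandbits(64) for _ in range(20)]   # the public collection
chosen = random.sample(pool, 7)                      # the hidden selection
total = sum(chosen)                                  # the easy direction

# The hard direction, by brute force. Even at this toy size there are over
# a million subsets; real schemes use sets far too large to search.
recovered = None
for size in range(len(pool) + 1):
    for combo in combinations(pool, size):
        if sum(combo) == total:
            recovered = combo
            break
    if recovered:
        break

print("recovered a subset of size", len(recovered))
```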

Lattice-based cryptography takes this idea and ramps up the difficulty. Instead of the knapsack being full of numbers, it’s now full of vectors. If you think of a single number as a dot on a line, a vector is an arrow pointing to a dot floating in space. And you don’t just add up a selection of these vectors; you add up multiples of them.
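The following toy sketch (using NumPy, with parameters chosen only for illustration) shows the lattice flavor of the same idea: small secret multipliers combine public vectors, a little noise is mixed in as learning-with-errors schemes do, and recovering the multipliers from the published result is the hard problem. It illustrates the principle, not the actual ML-KEM construction:

```python
# Toy lattice-style version: combine public vectors using small secret
# multipliers (plus a little noise, as learning-with-errors schemes do).
# Publishing A and b is easy; recovering s from them is the hard problem.
import numpy as np

rng = np.random.default_rng(0)

dim = 8                                  # real schemes work in hundreds of dimensions
q = 3329                                 # modulus used by ML-KEM, for flavor
A = rng.integers(0, q, size=(dim, dim))  # public random vectors
s = rng.integers(-2, 3, size=dim)        # small secret multipliers
e = rng.integers(-2, 3, size=dim)        # small random noise

b = (A @ s + e) % q                      # the easy direction

print("published b =", b)
```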

ML-KEM

This algorithm, originally known as CRYSTALS-Kyber, is a standard for module-lattice-based key encapsulation. It was developed by IBM researchers in collaboration with other institutions and is designed for general encryption, such as accessing websites securely, because it is fast to use.
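As a sketch of how a key encapsulation mechanism like ML-KEM is used, here is the basic flow with the open-source liboqs-python bindings: the receiver publishes a public key, the sender encapsulates a fresh shared secret against it, and the receiver decapsulates the same secret, which then keys fast symmetric encryption. The "ML-KEM-768" mechanism name is an assumption about the installed library version (older builds exposed it as "Kyber768"):

```python
# Sketch of ML-KEM key encapsulation using the liboqs-python bindings.
# Assumes the installed liboqs build exposes the "ML-KEM-768" mechanism name.
import oqs

alg = "ML-KEM-768"

with oqs.KeyEncapsulation(alg) as receiver, oqs.KeyEncapsulation(alg) as sender:
    public_key = receiver.generate_keypair()         # receiver publishes this
    ciphertext, sender_secret = sender.encap_secret(public_key)
    receiver_secret = receiver.decap_secret(ciphertext)
    assert sender_secret == receiver_secret           # both sides share one key
    print("shared secret established:", receiver_secret.hex()[:16], "...")
```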

ML-DSA

This algorithm, originally known as CRYSTALS-Dilithium, was also developed by IBM. It is the second-fastest of the three and was designed for digital signatures.

According to Seiler, the trick to this algorithm is that decoding the message requires knowing all the multipliers of the vectors that had been added up.

Digital signatures are used to authenticate documents or software, “helping make sure that those aren’t modified or tampered with,” says Seiler. “Since they are used in sensitive industries such as healthcare, finance, and manufacturing but also by government agencies, there is a palpable urgency to migrate to quantum-safe digital signature methods.”
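To make the signing flow concrete, here is a sketch using the open-source liboqs-python bindings. The "ML-DSA-65" mechanism name and the example message are assumptions (earlier library releases exposed the scheme as "Dilithium3"), and a production system would manage keys rather than generate them inline:

```python
# Sketch of signing and verifying with ML-DSA via the liboqs-python bindings.
# Assumes the installed liboqs build exposes the "ML-DSA-65" mechanism name.
import oqs

alg = "ML-DSA-65"
message = b"device firmware build 2.4.1"   # purely hypothetical document

with oqs.Signature(alg) as signer, oqs.Signature(alg) as verifier:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)
    assert verifier.verify(message, signature, public_key)
    # Any tampering with the signed data makes verification fail.
    assert not verifier.verify(message + b"!", signature, public_key)
    print("signature verified; tampered message rejected")
```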

SLH-DSA

This is another digital signature standard, but it is more secure than the other two – at a cost. According to Seiler, depending on which variant is implemented, it either has a larger signature or requires more time to create the signature.

NIST says that this algorithm is intended to serve as a backup in case ML-DSA proves vulnerable.
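One way to see the trade-off Seiler describes is to print the parameter details the liboqs-python bindings report for whichever ML-DSA and SLH-DSA (SPHINCS+) variants a given build enables; the helper function, the details attribute, and the substring matching below are assumptions about that library rather than anything NIST specifies:

```python
# List reported signature sizes for ML-DSA and SLH-DSA/SPHINCS+ variants
# enabled in the installed liboqs build; mechanism names vary by version,
# so match by substring. Hash-based signatures are noticeably larger.
import oqs

for name in oqs.get_enabled_sig_mechanisms():
    if any(tag in name for tag in ("ML-DSA", "Dilithium", "SLH-DSA", "SPHINCS")):
        with oqs.Signature(name) as sig:
            details = sig.details
            print(f"{name}: signature {details['length_signature']} bytes, "
                  f"public key {details['length_public_key']} bytes")
```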

More than algorithms

In addition to the mathematical encryption algorithms, NIST also released the relevant implementation details.

“These finalized standards include instructions for incorporating them into products and encryption systems,” says Moody. “We encourage system administrators to start integrating them into their systems immediately, because full integration will take time.”

The change to quantum-safe cryptography is going to be more complicated than previous cryptographic transitions because the algorithms are very different from classical encryption, because multiple algorithms will be used for different use cases, and because the software supply chain is more complex than ever before.

The solution is to become flexible about which specific standard is being used, says Tom Patterson, emerging technology security lead at Accenture. That will allow enterprises to integrate with vendors and partners that might be using a different encryption standard or that still rely on classical encryption. It will also let enterprises switch to new, better, or more efficient standards that may arrive in the future.
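In code, that flexibility mostly means never hard-wiring a single algorithm: keep the mechanism name in configuration, behind a thin wrapper, so it can be swapped as standards and guidance evolve. A minimal sketch, reusing the assumed liboqs-python bindings from the earlier examples:

```python
# Minimal crypto-agility sketch: the algorithm name lives in configuration,
# behind a thin wrapper, so it can be changed without touching calling code.
import oqs

SIGNATURE_ALG = "ML-DSA-65"   # in a real system, read this from config or env

def sign_document(data: bytes, alg: str = SIGNATURE_ALG):
    """Return (public_key, signature); a real system would persist the key pair."""
    with oqs.Signature(alg) as signer:
        public_key = signer.generate_keypair()
        return public_key, signer.sign(data)

def verify_document(data: bytes, signature: bytes, public_key: bytes,
                    alg: str = SIGNATURE_ALG) -> bool:
    with oqs.Signature(alg) as verifier:
        return verifier.verify(data, signature, public_key)
```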

These three standards aren’t the last we’re going to see when it comes to quantum-safe encryption, says Patterson. “There’s going to be, for the next few years, a series of different algorithms that will be available and standardized,” he says.

“This is the opening bell for most CISOs around the world,” says Patterson. “Now they know what algorithms they’re going to work with.”
