Return to Security and Privacy

NIST’s Quantum Cryptography Update 20200815

The National Institute of Standards and Technology (NIST) has been running a program to find ways to securely encrypt messages in a world with quantum computers. I previously wrote about this in Encryption and Quantum Computing in April 2019, and there has been progress on this front since then.

The Issue

Encryption is central to how the Internet operates. Web sites that take credit card transactions have to create secure, encrypted connections to keep your credit card number safe. In 2020 it is rare to find any Web site that does not offer a secure connection (https) by default, since search engines have downgraded ordinary http connections in their search results. Let’s Encrypt has come through with free TLS certificates that anyone can use, and hosting companies now offer this type of encryption to customers. My web sites, for instance, have it simply because I checked a box on my control panel. It couldn’t be easier.

Businesses of course need to have secure connections to suppliers and other business partners, and in these days of the Covid-19 pandemic when so many people are working from home, secure VPN connections are a necessity to link employees with their companies and their work assets like e-mail and file access. With identity theft and spoofing, ordinary users need protection as well. And of course the many people fighting against repressive governments have an obvious need.

For most of the last few decades the answer was public key cryptography, which we have discussed previously, based on the work of people like Whitfield Diffie, Martin Hellman, and Ralph Merkle, and made easily available by Phil Zimmermann’s Pretty Good Privacy (PGP). This technology enabled encryption that could not easily be broken if implemented correctly. The term of art is that breaking such encryption is computationally infeasible, and for the last few decades this has been true. The only change needed was to increase the key length from time to time to keep up with improvements in computing.
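The core idea of public key agreement can be sketched with a toy Diffie-Hellman exchange. The numbers below are classroom-sized examples of my own choosing; real deployments use primes of 2048 bits or more (or elliptic-curve groups), so this is only a sketch of the mechanism:

```python
# Toy Diffie-Hellman key agreement over a tiny prime field.
# Illustrative only: real use requires a much larger prime.
p, g = 23, 5                 # public modulus and generator

a = 6                        # Alice's secret exponent
b = 15                       # Bob's secret exponent

A = pow(g, a, p)             # Alice publishes A = g^a mod p
B = pow(g, b, p)             # Bob publishes B = g^b mod p

shared_alice = pow(B, a, p)  # Alice computes B^a = g^(ab) mod p
shared_bob = pow(A, b, p)    # Bob computes A^b = g^(ab) mod p
assert shared_alice == shared_bob == 2   # both arrive at the same secret
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which is computationally infeasible at real key sizes on classical computers.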

Then along came quantum computing, which seemed to usher in an age where encryption would be useless, because what had been infeasible with classical computing would suddenly be feasible with quantum computing. For example, a sufficiently large quantum computer running Shor’s algorithm could easily factor very large numbers into their component primes, and the difficulty of that factoring is the backbone of the RSA algorithm. But it is a poor sword indeed that cannot cut both ways. It soon became clear that in the arms race between encryption and decryption, there would be new post-quantum encryption schemes that would make quantum decryption infeasible. (Note: infeasible generally means something along the lines of requiring every computer on Earth to run for millions of years to decrypt the message.)
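To see concretely why factoring matters, here is a deliberately tiny RSA example in which an attacker who can factor the public modulus recovers the private key. Trial division stands in for Shor’s algorithm, and all the numbers are illustrative; real RSA moduli are hundreds of digits long:

```python
# Toy RSA with tiny primes, showing that factoring n breaks the scheme.
p, q = 61, 53            # secret primes
n = p * q                # public modulus
e = 17                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)            # encrypt: c = m^e mod n
assert pow(cipher, d, n) == msg    # legitimate decryption

# An attacker who factors n can rebuild the private key directly.
# Here trial division suffices; for real key sizes only a quantum
# computer running Shor's algorithm could do this step quickly.
fp = next(k for k in range(2, n) if n % k == 0)
fq = n // fp
d_attacker = pow(e, -1, (fp - 1) * (fq - 1))
assert pow(cipher, d_attacker, n) == msg   # message recovered
```

The entire security of RSA rests on that factoring step staying infeasible, which is exactly the assumption quantum computing threatens.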

NIST Challenge

This is where NIST comes into the picture. They are the government agency charged with setting standards for encryption, and they have in general done a good job, though they did get burned by the Dual_EC_DRBG elliptic curve standard, when NSA people (probably) got them to approve a standard with a “back door” vulnerability. The experience does seem to have made them more careful, and I am not aware of any better candidate for stewardship of the standards.

They set up a competition for teams, primarily from industry and academia, to submit proposed encryption methods for review. The process began in December 2016 with a call for submissions, and within a year 69 submissions had been received that met all of the requirements and the minimum acceptance criteria. These submissions were then tested by researchers, and some vulnerabilities were found. While it would be nice to say that all vulnerabilities would be found, it doesn’t work that way. Even approved standards can have vulnerabilities that are discovered later, which is why ongoing testing and research are necessary, and why standards will change over time.

In January 2019 that group was winnowed down to 26. And because the future capabilities of quantum computers were unclear, this group reflected a variety of approaches. As I said then:

He (Matthew Scholl, Chief of the Computer Security Division at NIST,) did make clear that NIST is not looking for a single algorithm, or even a specific number of algorithms, which may be a good thing. One thing we know from experience is that monocultures can fall to a single vulnerability. And it looks like they expect that different needs will lead to different algorithms being used.

“This is to ensure that we have some resilience so that when a quantum machine actually comes around — not being able to fully understand the capability or the effect of those machines — having more than one algorithm with some different genetic mathematical foundations will ensure that we have a little more resiliency in that kit going forward,” Scholl said.

Now NIST has concluded that second round of testing with a notice dated July 2020 called Status Report on the Second Round of the NIST Post-Quantum Cryptography Standardization Process. This officially kicks off Round Three with 15 candidates. Of these, 7 are far enough along that if they survive this round of testing they will become approved standards. The other 8 will continue the process, but they are not expected to receive approval after Round Three. Of the 7 front-runners, 4 are proposed standards for Public-Key Encryption and Key Establishment: Classic McEliece, CRYSTALS-KYBER, NTRU, and SABER. The other 3 are proposed standards for Digital Signatures: CRYSTALS-DILITHIUM, FALCON, and Rainbow. These 7 proposals have in common that they are more general purpose and could find wide adoption fairly easily, so it made sense to put them in the high-priority group.

The other 8 algorithms are ones that either need more work, or are targeted to more specific applications, so their participation in Round Three is expected to be additional development not leading to final approval. They are: BIKE, FrodoKEM, HQC, NTRU Prime, SIKE, GeMSS, Picnic, and SPHINCS+.
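Several of the leading candidates (CRYSTALS-KYBER, SABER, NTRU, FrodoKEM) are lattice-based. As a rough illustration of that family of ideas, here is a toy single-bit encryption scheme built on the learning-with-errors (LWE) problem. This is my own simplification, not any candidate’s actual construction, and the parameters are far too small to be secure:

```python
# Toy LWE encryption: security rests on the hardness of finding s from
# noisy linear equations, a problem believed hard even for quantum computers.
import random

q = 3329      # modulus (Kyber happens to use this value)
n = 8         # secret dimension (far too small for real security)
m = 16        # number of public-key samples

def keygen():
    s = [random.randrange(q) for _ in range(n)]          # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]    # small noise
    b = [(sum(ai * si for ai, si in zip(row, s)) + ei) % q
         for row, ei in zip(A, e)]
    return s, (A, b)          # secret key, public key

def encrypt(pk, bit):
    A, b = pk
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(ui * si for ui, si in zip(u, s))) % q
    # leftover noise is small, so d sits near 0 for bit 0, near q/2 for bit 1
    return 1 if q // 4 < d < 3 * q // 4 else 0

s, pk = keygen()
assert all(decrypt(s, encrypt(pk, b)) == b for b in (0, 1))
```

The “genetic mathematical foundations” Scholl mentions show up here: an attack on factoring (RSA) says nothing about noisy lattice problems like this one, which is exactly why NIST wants algorithms from several different mathematical families.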

The goal, according to NIST mathematician Dustin Moody, is this:

“The likely outcome is that at the end of this third round, we will standardize one or two algorithms for encryption and key establishment, and one or two others for digital signatures,” he said. “But by the time we are finished, the review process will have been going on for five or six years, and someone may have had a good idea in the interim. So we’ll find a way to look at newer approaches too.”

So this research is ongoing as it should be. I consistently refer to this as an arms race because that is the most accurate description. As Bruce Schneier likes to quote:

But there’s an old saying inside the NSA: “Attacks always get better; they never get worse.”

So whenever clever hackers find ways to break encryption, other clever hackers find ways to prevent the exploits. But this leads to a few conclusions worth pointing out:

  1. Encryption is never eternal. Just because something was secure when you first encrypted it does not mean it is secure 20 years later, when the technology has changed. If you need to keep something secure for a long period of time, you should not rely on encryption alone. The NSA facility in Utah (the Utah Data Center) will eventually be able to decrypt all of the messages stored there. In fact, they have probably decrypted a good many already.
  2. For most purposes, digital signatures are perfectly safe. Needing to verify the sender 20 years later is a very rare case; mostly you are verifying the sender and message at the time they are sent.
  3. Similarly, for most purposes the encryption used for TLS and VPNs is fine. Again, you are trying to achieve security at the moment. If someone hacks my login to Amazon in 20 years, I doubt it will matter much.
  4. If someone tries to tell you that “Quantum computing will kill encryption for all of us,” you can confidently assure them it is not true. We are building the tools to assure encryption in the post-quantum world.

Listen to the audio version of this post on Hacker Public Radio!