If you have been paying attention to encryption technology, you probably know that the safety of encryption relies on the concept of “computational infeasibility”. This is a fancy way of saying that any encryption can be broken given enough time and resources, but if the time and resources required are simply impractical, you can regard the encryption as “safe enough”. In previous articles I have worked through examples showing that a good long password with high entropy can resist cracking for a very long time, such as:
Passwords, Entropy, and Good Password Practices
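To make “computationally infeasible” concrete, here is a back-of-the-envelope sketch in Python. The guess rate is an assumption I have picked to stand in for a well-funded attacker; the point is the shape of the numbers, not the exact figures.

```python
# Back-of-the-envelope brute-force time estimate.
# GUESSES_PER_SECOND is an assumed figure standing in for a
# well-funded attacker with a large GPU cluster.
GUESSES_PER_SECOND = 1e12

def crack_time_years(alphabet_size: int, length: int) -> float:
    """Worst-case years to exhaust every possible random password."""
    keyspace = alphabet_size ** length          # total combinations
    seconds = keyspace / GUESSES_PER_SECOND     # time to try them all
    return seconds / (60 * 60 * 24 * 365)

# 8 random printable-ASCII characters (95 symbols): gone in hours.
print(f"8 chars:  {crack_time_years(95, 8):.6f} years")
# 16 random characters: astronomically beyond any practical timescale.
print(f"16 chars: {crack_time_years(95, 16):.2e} years")
```

Doubling the password length does not double the work for the attacker; it squares the keyspace, which is why length and entropy matter so much more than cleverness.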
But we have also noted that this is an arms race: attackers keep getting better, and defenders need to improve as well. So far that has worked reasonably well. As computing has become cheaper and more powerful (making it easier to crack encryption), the defenders have responded by improving encryption through superior algorithms, longer key lengths, and so on. In this kind of arms race, a reasonable general view is that anything encrypted properly today will remain safe for at least a period of decades before technical advances make it unsafe.
That is not to deny that some older encrypted data may become vulnerable over time, if anyone cares enough to save it and attack it once the technology has matured that far. For example, there is speculation that an NSA facility in Utah, the Intelligence Community Comprehensive National Cybersecurity Initiative Data Center, was constructed for precisely this purpose, and I suspect that GCHQ is either participating or has similar plans in mind. The facility can store immense amounts of data, is near sources of low-cost hydroelectricity, and is very favorably situated on Internet trunk lines. All of this makes a plausible case for what they are doing, at the very least.
Personally, I have not worried too much, because this is not the threat model I need to defend against, and I always start by defining the threats I care about. If the NSA can decrypt my e-mails 20 years from now, I doubt they would find anything terribly interesting; when I read my own e-mails from long ago, I am usually totally puzzled by what they are about. But there are people who have very legitimate reasons to be concerned, such as democracy activists in totalitarian countries like Russia, China, Turkey, etc. They should indeed be paying attention to the capabilities of the spy agencies and taking steps to protect themselves. And for everyone who is concerned, the biggest wildcard has been quantum computing.
Quantum computing differs from the traditional computing we are used to in how bits work. In traditional computing, a bit is either a 0 or a 1, and encryption in that environment is simply manipulation of those bits, such as by XOR. The quantum difference is that each quantum bit (qubit) can be in a superposition, which allows both values to exist simultaneously (kind of like Schrödinger’s Cat, which is both alive and dead until you look). A single qubit can be in two states at once, two qubits in four states, three qubits in eight states, and in general n qubits in 2^n states at once. I do not propose to give a detailed description of quantum computing here (which I am completely unqualified to deliver, and which makes my brain hurt in any case). The point to keep in mind is that quantum computing has the power to make feasible those decryptions that were previously considered infeasible.

That said, we are not there yet. So far the quantum computers that have been built are limited and finicky things. But given the intense interest, it is only a matter of time until they are developed to the point that they are practical. And when that happens, those messages the NSA has stored in Utah will be decrypted; that is unavoidable at this point. I am not sure that is all that much different from the march of cryptanalysis we have witnessed until now. Cryptographic standards we once relied on, such as MD5, are now considered useless for any security purpose. (MD5 still lives on as a checksum for verifying that files have not been accidentally corrupted, so you still see it with downloads of some kinds, such as Linux ISOs, where file integrity matters a lot; for detecting deliberate tampering, stronger hashes such as SHA-256 are preferred.) So will files encrypted today using something like Elliptic Curve Cryptography be broken in 20 years? I would consider that highly likely. So if you are going to overthrow the government, you might want to get a move on.
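As a concrete illustration of the file-integrity use just mentioned, here is a minimal sketch using Python's standard hashlib module. The filename and expected digest in the usage comment are hypothetical; in practice you would compare against the checksum published on the download page.

```python
import hashlib

def file_digest(path: str, algorithm: str = "md5") -> str:
    """Hash a file in chunks so a large ISO never has to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read 1 MiB at a time until EOF (read() returns b"" at the end).
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage, comparing against a published checksum:
# expected = "d41d8cd98f00b204e9800998ecf8427e"
# ok = file_digest("linux-distro.iso") == expected
```

Passing `algorithm="sha256"` gives the same workflow with a hash that is still considered secure against deliberate tampering, which is why many distributions now publish SHA-256 sums instead of MD5.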
But I have seen some people claim that quantum computing means the end of the age of encryption, and that is nonsense. The arms race will continue, and quantum computing will be used to create new forms of encryption that have equivalent safety in the quantum age to what we have had over the last 30 years. In fact, it is happening right now.
In the U.S., the National Institute of Standards and Technology (NIST) drives encryption standards, and in practice does so for most of the world, not just the U.S. Its Post-Quantum Cryptography project issued a Request for Nominations for the proposed new standards in December 2016. As NIST states:
“If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of digital communications on the Internet and elsewhere. The goal of post-quantum cryptography (also called quantum-resistant cryptography) is to develop cryptographic systems that are secure against both quantum and classical computers, and can interoperate with existing communications protocols and networks.”
NIST did receive a number of submissions, and on March 20, 2019 delivered a briefing to the Information Security and Privacy Advisory Board, a Congressionally established board within NIST. There Matthew Scholl, Chief of the Computer Security Division at NIST, said that they had spent most of the previous year evaluating 69 submissions, and had selected the 26 most promising for further investigation, with an eye to whittling the list down further later in 2019.
He did make clear that NIST is not looking for a single algorithm, or even a specific number of algorithms, which may be a good thing. One thing we know from experience is that monocultures can fall to a single vulnerability. And it looks like they expect that different needs will lead to different algorithms being used.
“This is to ensure that we have some resilience so that when a quantum machine actually comes around — not being able to fully understand the capability or the effect of those machines — having more than one algorithm with some different genetic mathematical foundations will ensure that we have a little more resiliency in that kit going forward,” Scholl said.
So it is clear that we will have encryption around for the foreseeable future, even if the precise methods change. I will keep an eye on this and report back when I hear of any new developments.
Listen to the audio version of this post on Hacker Public Radio!