A new era of unimaginably fast quantum computers is only a few years away, with machines set to completely transform the way we solve problems, communicate and compute. However, in the immortal words of Voltaire and Spider-Man, 'with great power comes great responsibility', and many computing experts fear functional quantum computers could break even the strongest encryption we have today, and with it, the internet as we know it.
But are those fears unfounded? Ian Farquhar, director, Worldwide Security Architecture Team at Gigamon, says information and communications security isn't about absolutes of yes or no, but about risk. “The risk here is that at some point in the future, possibly secretly and without any public disclosure, organisations may be able to build quantum computers that can practically break crypto systems, which classical computers can't. That is what has people concerned, and that is how it could impact current algorithms.”
For Farquhar, the big question is, how long will this take? Will it be 10 to 15 years? Will it be 30 years? Will it be never? Or will there be some major breakthrough in the design of quantum computers that doubles their capacity overnight? “Nobody really knows, which is why planning is happening now.”
Taking us back to the beginning, he says that in 1994, mathematician Peter Shor created something called Shor's algorithm, which runs on a quantum computer and can perform integer factorisation - the decomposition of a composite number into a product of smaller integers - the very problem many public key cryptography systems are based on. In 1996, another mathematician, Lov Grover, produced Grover's algorithm, which could potentially attack symmetric (non-public-key) algorithms by speeding up brute-force key search.
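Grover's speedup is quadratic rather than exponential: searching an n-bit keyspace takes roughly 2^(n/2) quantum steps instead of 2^n classical guesses. A minimal arithmetic sketch (the function name is illustrative, not from any library) of why doubling symmetric key lengths is the usual mitigation:

```python
def grover_effective_security_bits(key_bits: int) -> int:
    """Grover search over 2**key_bits candidates takes roughly
    sqrt(2**key_bits) = 2**(key_bits // 2) steps, halving the
    effective security level of a symmetric key."""
    return key_bits // 2

# A 128-bit key drops to ~64-bit effective security under Grover;
# a 256-bit key retains a comfortable ~128 bits.
print(grover_effective_security_bits(128))  # 64
print(grover_effective_security_bits(256))  # 128
```

This is why post-quantum guidance for symmetric ciphers is usually just "use longer keys", while public key algorithms need wholesale replacement.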
He says modern cryptography is based on algorithms, which are designed to be as hard to break as possible. For example, the public key algorithms we use today, such as RSA, Diffie-Hellman, and Elliptic Curve, are used in cryptography to allow communicating parties to agree on cryptographic keys, or to create and verify digital signatures. “All of these are vital functions in creating secure communications,” says Farquhar.
Break the code
Most public key algorithms are based on mathematical problems that are easy to compute in one direction but hard to reverse. “As an analogy, if I gave you two prime numbers - let’s say 13 and 17 - it's really easy to multiply them together and get 221. But if I gave you the number 221 and said it was the product of two prime numbers, it would take longer to identify 13 and 17 as the prime factors. This might seem esoteric, and it's a little more complex in practice, but this is the basis for the RSA public key algorithm, which has been in widespread use since it was developed by Ron Rivest, Adi Shamir, and Len Adleman in 1977. Except in real terms, the factors (numbers) we use are 2048-bits (~617 digits) long or more. Other algorithms use different mathematical structures.”
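The asymmetry in Farquhar's analogy can be sketched in a few lines: multiplication is a single operation, while recovering the factors (here by naive trial division, far cruder than the factoring algorithms real attackers would use) requires a search that grows rapidly with the size of the number:

```python
import math

def multiply(p: int, q: int) -> int:
    return p * q  # one machine operation, however large the primes

def factor(n: int) -> tuple[int, int]:
    """Naive trial division: search upwards for the smallest factor.
    The work grows with sqrt(n), which is hopeless at 2048 bits."""
    for candidate in range(2, math.isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    return n, 1  # n is prime

print(multiply(13, 17))  # 221
print(factor(221))       # (13, 17)
```

At toy sizes both directions are instant; the point is the scaling, since the search loop's cost explodes as the modulus grows while the multiplication stays trivial.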
Now, Farquhar says, for traditional computers to break these cryptographic algorithms would mean breaking the mathematics, and while such algorithms exist for the classical computers we use today, they're slow. “In fact, we usually make the numbers so huge that even massive computers running for billions of years shouldn’t be able to break it.”
In 1977, the US government released the Data Encryption Algorithm (DEA), more widely known as the Data Encryption Standard (DES). At the time, no one expected DES to be used for very long, with experts predicting a lifespan of a few years at most. DES was only withdrawn in May 2005. It remained in use for 28 years, despite huge concerns that its small 56-bit keys made it theoretically susceptible to attack.
The lesson here, says Farquhar, is that cryptographic algorithms stick around for a long time, and become much harder to change once deployed than when they first appear, which is why people are getting worried about quantum computing.
Step in post-quantum cryptography (PQC) – an initiative that recognises the risk posed by existing algorithms. The aim, he says, is to create new algorithms based on mathematical structures that aren't amenable to efficient attack on quantum computers.
The US National Institute of Standards and Technology (NIST) is running a selection process to evaluate and select PQC algorithms. “This is a fantastic and laudable idea, following the same process that gave us AES and SHA-3. Researchers submit candidate algorithms, against publicly defined criteria, and these are evaluated by cryptographers all over the world, as well as cryptographers working for the US government. AES is the poster-child for how this works well, and it's resulted in what appears to be a very solid algorithm.”
He says this isn't as trivial as just proposing a new algorithm and everyone implementing it. “For example, most cryptographic libraries allow algorithm selection, but many were created before PQC, when supporting a key over 16K in size didn't seem necessary. Certain PQC algorithms have very large keys indeed, so we will need to change that.”
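To illustrate the scale problem, a rough comparison using approximate public-key sizes from the NIST submission documents (parameter sets chosen purely for illustration; exact figures vary by scheme and parameter set): a lattice-based scheme stays small, but a code-based one blows straight past a legacy 16K buffer.

```python
# Approximate public-key sizes in bytes (figures from the NIST PQC
# submission documents; parameter sets picked for illustration only).
PUBLIC_KEY_BYTES = {
    "RSA-2048": 256,                                # classical baseline (modulus)
    "Kyber-768 (lattice-based)": 1184,
    "Classic McEliece 348864 (code-based)": 261120,
}

LEGACY_BUFFER = 16 * 1024  # a hypothetical pre-PQC library's 16K key buffer

for name, size in PUBLIC_KEY_BYTES.items():
    status = "fits" if size <= LEGACY_BUFFER else "overflows"
    print(f"{name}: {size:,} bytes -> {status} a 16K buffer")
```

Any library, wire protocol or certificate format that hard-codes assumptions like that buffer size has to be revisited before such algorithms can be deployed.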
No small feat
The reality is that changing algorithms, and even more so changing the APIs that are used, isn't that easy. This is why some organisations are already asking vendors questions about their PQC plans. The other big challenge, he says, is that many organisations have encrypted data that must remain secure for decades. “It's easy to think about files, or even encrypted servers. But think about archived data, often on magnetic tape, stored in long-term storage. They all have to be pulled out, read and decrypted, and then written to new media and re-archived. Some of this data must be maintained by law. This simple problem quickly turns very complex.”
The important thing to understand here is that at the moment, we aren't aware of a quantum computer big enough to attack real, fielded algorithms. “In fact, they're not even close. Quantum computers use qubits (quantum bits), and the biggest one publicly announced contains 53. Nor would 2048 qubits be enough to break 2048-bit RSA: estimates put the requirement at just over twice that number.”
In addition, we don't yet know if it’s even practical to build quantum computers at that scale. “These things need to be cooled to a fraction of a degree above absolute zero. It may be that a quantum computer capable of attacking real-world cryptosystems is simply impractical to build. But is anyone willing to say it's impossible? No. Organisations with very smart people, such as IBM, Google, Microsoft, and probably most major governments, are putting a lot of money into this.”
Keep calm and plan
So in a nutshell, he says, we have the possibility that quantum computers, at some indefinite time in the future, might be able to practically break cryptographic systems that are deemed secure against classical computers. Moreover, the organisations that do it - particularly governments - might not tell us when they have the capability.
“Teams of researchers all around the world, coordinated by NIST, are trying to develop new algorithms that are resistant to quantum computers breaking them, and many of these algorithms are very unusual, some are quite slow, and many have huge keys, which might make them difficult to integrate into existing cryptographic implementations. Fortunately, we ‘probably’ have time.”
This article was originally published in the March 2020 issue of ITWeb Brainstorm magazine.