Distant, yet urgent
Any organisation that handles sensitive data should start preparing now for the arrival of quantum computing. The technology is unlikely to be ready for widespread use for years – perhaps another couple of decades – but it has long been known that when it is, it will crack the encryption used by governments and armies, banks and hospitals. Messages sent today will become insecure overnight.
A tough subject
Quantum computing is a tough subject to explain. As Niels Bohr put it, “Anyone who is not shocked by quantum theory has not understood it”. Richard Feynman helpfully added “I think I can safely say that nobody understands quantum mechanics”.
In a nutshell, quantum computing exploits the weird properties of quantum mechanics, such as superposition and entanglement. Classical computing uses binary digits, or bits, which are either on or off. Quantum computing uses qubits, which can be in a superposition – on and off at the same time – and this property makes quantum machines enormously more powerful for certain computations.
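As a purely illustrative sketch – a real qubit is a physical system, not a pair of numbers – a qubit's state can be modelled as two complex amplitudes, whose squared magnitudes give the probabilities of measuring "off" or "on". A Hadamard gate turns a definite "off" into an equal superposition:

```python
import math

# Toy model: a qubit is a pair of complex amplitudes (alpha, beta).
# The probabilities of measuring 0 or 1 are |alpha|^2 and |beta|^2,
# which must sum to 1.

def hadamard(state):
    """Apply a Hadamard gate, which maps a definite 0 into an
    equal superposition of 0 and 1."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    """Return the measurement probabilities (P(0), P(1))."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

zero = (1 + 0j, 0 + 0j)          # a qubit that is definitely "off"
superposed = hadamard(zero)      # now "on" and "off" at the same time
print(probabilities(superposed)) # roughly 0.5 each
```

Measuring the superposed qubit gives each outcome with probability one half – which is exactly what no classical bit can do.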
Dr Ignacio Cirac
To understand this properly we need a deep expert who is capable of explaining quantum computing in lay terms. Someone like Dr Ignacio Cirac. He is director of the Max Planck Institute of Quantum Optics in Germany, and holds honorary and visiting professorships pretty much everywhere that serious work is done on quantum physics. He has done seminal work on the trapped-ion approach to quantum computing and several other aspects of the field, and has published almost 500 papers in prestigious journals. He is spoken of as a possible Nobel Prize winner. Most recently he has been working on quantum simulation, a specialised area within the field which lets scientists study quantum systems that classical computers cannot model.
Quantum computing is not just classical computing on steroids. It is something quite different. Dr Cirac compares it to the difference between sending a message with a rider on horseback, and sending a message via telephone. It does take much less time, but it is also qualitatively different – it uses different laws of physics. And just as telephones are hard to understand for people who haven’t experienced them before, so quantum computing is hard for people to understand if they have only experienced classical computers. Telephones didn’t replace horses, but instead complemented them. In the same way, quantum computing won’t replace classical computing, but will complement it.
Quantum computing is like nuclear fusion in that it is about fifty years away, and has been that far away for many years. But real, tangible progress is being made. When Dr Cirac entered the field in 1995, the best machine had one qubit. A couple of years later the record was two qubits. Today there are machines with hundreds of qubits. What’s more, the qubits in 1995 were of poor quality, whereas today they are much improved: they are successfully isolated, and don’t interact with their environment, which would cause errors.
However, the field has a long way to go before quantum computers fulfil their enormous promise. The progress in classical computers known as Moore’s Law, which observes that they get twice as powerful every 18 months or so, has been going on for decades. In quantum computing there is now an analogous rate of improvement, but it has only been going on for a single decade. Also, there is no similar rate of improvement in the quality of the qubits, and just increasing the number of qubits does not enable us to make the machines that we want. The more qubits you have, the more opportunities for the interactions which cause errors. Quantum computing is very hard to scale.
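The scaling problem can be seen with a back-of-envelope model. Assuming – purely for illustration – that each qubit independently errs with some fixed probability per step, the chance of an error-free computation shrinks exponentially as qubits are added:

```python
def error_free_probability(n_qubits, p_error=0.001):
    """Toy model: the probability that none of n independent qubits
    errs during one step, given a per-qubit error rate p_error.
    Real devices have correlated errors, so this is only a sketch."""
    return (1 - p_error) ** n_qubits

# Even a modest per-qubit error rate becomes crippling at scale.
for n in (10, 100, 1000, 10000):
    print(n, error_free_probability(n))
```

With a 0.1% per-qubit error rate, a ten-qubit machine almost always succeeds, while a ten-thousand-qubit machine almost never does – which is why simply adding qubits, without improving their quality or correcting their errors, does not deliver the machines we want.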
Another point of difference between quantum computing and classical computing is that there are many different types of quantum computing. Some involve very low temperatures, some involve trapped ions, and others involve exotic things like Majorana particles, which are their own antiparticles! A single isolated hydrogen atom can function as a qubit because the orbit of its electron can be in more than one state. This atom would need to be in a vacuum so that no other particle can disturb it, and keeping a single atom isolated in a vacuum is hard.
Superconducting materials can function as qubits by treating the direction of the current as the qubit: clockwise current means “one” and anticlockwise means “zero”. To make this work the material must be kept colder than outer space.
Another approach is to use photons. These are plentiful, but they behave strangely, and they travel very fast – the speed of light, obviously – and that makes them tricky to work with.
Dr Cirac thinks these different approaches will each turn out to be suitable for different applications. Solid-state systems like superconductivity may turn out to be more scalable, with millions of qubits and lots of error correction, and hence better suited to industrial applications. Quantum simulation does not require such robust error correction, so platforms using atoms may be more useful there.
Big Tech vs academia
Google, IBM, and Microsoft are all investing heavily in quantum computing, and trying to scale up their machines. It is not possible for most academic institutions to compete with the hardware these firms can afford to buy or build, so they often focus on more basic research rather than solving engineering problems. Academics can also work more with software than with hardware, for instance by attempting to prove what quantum computers will and will not be able to do.
Quantum supremacy, or quantum advantage, is achieved when a quantum computer performs a calculation which cannot be done with a classical computer – or at least, cannot be done within a reasonable amount of time, say the amount of time the universe has existed. This milestone was achieved by Google researchers in 2019, but the calculation they performed was specially tailored to their machine, and had no practical use beyond demonstrating the milestone.
The science of artificial intelligence went through a couple of “winters” in the 1970s and 1980s, as hype ran ahead of performance and led to disappointed expectations. Both times, funding dried up for a few years until a new, more promising approach emerged. Dr Cirac will not be surprised if the same thing happens in quantum computing. At the moment, a lot of investor money is flowing into the field, but he expects that much of it will not yield the hoped-for returns. He is therefore very careful when he predicts the outcome of any project because he does not want to contribute to this problem.
One of the widely anticipated effects of quantum computing is that machines running something called Shor’s algorithm could render useless much of today’s public-key encryption, the security systems which keep our data safe. Dr Cirac says this will probably not happen in the next decade because it requires fault-tolerant quantum computers using millions of qubits. In fact, while he acknowledges that breakthroughs can occur suddenly and unexpectedly, he does not expect to see good, fault-tolerant quantum computing at scale for twenty years or so.
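The threat rests on period-finding. Shor’s algorithm factors a number N by finding the period r of the sequence a, a², a³, … modulo N; a quantum computer finds that period exponentially faster than the brute-force search sketched below. A toy classical skeleton of the reduction (illustrative only – real key sizes make the period-finding step infeasible classically) factors small numbers like 15:

```python
from math import gcd

def find_period(a, n):
    """Brute-force the period r of a^x mod n -- the one step that
    Shor's algorithm performs efficiently on a quantum computer."""
    x, value = 1, a % n
    while value != 1:
        x += 1
        value = (value * a) % n
    return x

def shor_classical(n, a=2):
    """Classical skeleton of Shor's algorithm: reduce factoring to
    period-finding, then extract factors of n from the period."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess: a shares a factor with n
    r = find_period(a, n)
    if r % 2 == 1:
        raise ValueError("odd period; retry with a different base a")
    candidate = pow(a, r // 2, n)
    return gcd(candidate - 1, n), gcd(candidate + 1, n)

print(shor_classical(15))  # recovers the factors 3 and 5
```

Everything here runs on a laptop; the point is that only `find_period` needs a quantum computer, and only for the enormous numbers used in real cryptographic keys.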
But that does not mean we should be complacent. Once such machines do exist, they will be able to crack the encryption of all databases and all messages sent years earlier. This means that any organisation where data security is mission-critical should be working on its defences now. Fortunately, there are already two possible lines of defence, known as quantum cryptography and post-quantum cryptography.
Data security is mission-critical for most large organisations, so most large organisations should have a team which is going up the learning curve about quantum computing. It is not a good area to be naive about.