Thorsten Strassel, ABB Research Switzerland, Baden-Daettwil, Switzerland, firstname.lastname@example.org; Elsi-Mari Borrelli, ABB Research Switzerland, Baden-Daettwil, Switzerland, email@example.com
Quantum computing is currently one of the most talked-about topics in technology. Yet as it is far from our daily experiences in the classical world, its disruptive power remains somewhat elusive to most. Each new day brings fresh headlines, hype and headway, crowned in late 2019 by the spectacular news of Google beating its rivals in the race for “quantum supremacy.” The breakthrough stirred wild excitement in the technology community, with comparisons being made between this major engineering achievement and the Apollo 11 moon landing →01–02.
Back on Earth, it is of benefit to question and evaluate the significance of quantum computing for future industrial applications: What kind of technologies would quantum computing enable for industrial applications? What are the obstacles preventing quantum computers from being used today? Are there other technologies emerging that could have a similar impact?
Can I buy a quantum computer now?
A good starting point is to establish what quantum computers are and who is in the race to build them.
Quantum computers are not anticipated to be a turbo-boosted replacement for laptops – not even in the long term – nor are they universal supercomputers replacing large-cluster computing. Quantum computers are, in fact, large special-purpose machines that aim to gain an edge over conventional hardware (and its successors) in dedicated computational problems – of which, more below →03. Quantum computers exploit the principles of quantum mechanics to crack, in minutes, problems for which the fastest supercomputers would need thousands of years or longer. Importantly, this speed-up can only be achieved with special algorithms (the design of which is a science in itself) that harness the laws of quantum physics; conventional algorithms are of no use.
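As a heavily simplified illustration of what such a special algorithm manipulates, a single qubit can be simulated classically as a two-component complex amplitude vector. The toy sketch below (plain Python; all names are hypothetical, and real hardware does not work by storing amplitudes like this) puts a qubit into superposition with a Hadamard gate and shows the probabilistic measurement outcomes:

```python
import math, random

# Toy single-qubit simulator -- illustration only; real quantum hardware
# does not store amplitudes explicitly.

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng):
    """Collapse the qubit: returns 0 or 1 with probability |amplitude|^2."""
    p0 = abs(state[0]) ** 2
    return 0 if rng.random() < p0 else 1

rng = random.Random(42)
counts = {0: 0, 1: 0}
for _ in range(10000):
    qubit = (1.0, 0.0)            # start in the definite state |0>
    qubit = hadamard(qubit)       # equal superposition of |0> and |1>
    counts[measure(qubit, rng)] += 1
print(counts)                     # roughly 50/50 between 0 and 1
```

The exponential cost of tracking such amplitudes classically for many entangled qubits is precisely where the hoped-for quantum advantage lies.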
The idea of quantum computing was put forward by Nobel Laureate Richard Feynman in a now-famous lecture at IBM in 1982. Experimental efforts followed, resulting, in the late 1990s, in the first quantum computational devices that realized the so-called qubit, the quantum relative of the bit in classical computing. Recent theoretical progress has advanced the field significantly. Today, 54-qubit chips represent the cutting edge, and corporations such as Google, IBM and Honeywell have taken over hardware leadership with high-profile research departments, while a handful of hardware startups are trying to catch up. (These efforts refer to a universal quantum computer architecture; for hardware optimized for the specific algorithm of quantum annealing, the company D-Wave is best known.) Simultaneously, governmental agencies and the Wallenberg Foundation in Sweden have assisted knowledge development and commercialization. So far, despite all these efforts, the technology remains behind laboratory walls and without commercial application. But for how long?
What is under the hood?
Since the idea of quantum computing was first introduced and the first qubit implementations were realized, multiple candidates for hardware implementation have been investigated. Today, ion traps and superconducting loops are the most mature technologies used to implement quantum computers →04–05.
The most advanced demonstrator chips for superconducting universal quantum computing have 54 qubits, with each qubit connected to four neighbors. In →04, microwave cables for control signals are connected to the quantum chip, which is the dark square in the lower part of the image. For ion traps, the largest reported device has 11 qubits with full connectivity. Qubits can be implemented on various technology platforms, each with pros and cons for the quantum computer design. To run an algorithm on the quantum hardware, the qubits are made to interact via various types of logic gates. Besides the properties of the qubits themselves, their specific arrangement and the characteristics of the gate operations determine how successfully quantum algorithms can be implemented. Because the qubits, and thus the information stored in them, are very sensitive to noise introduced by outside sources and by the gate operations themselves, achieving a faithful implementation of the algorithms is very challenging. To reduce noise from interaction with their surroundings, today’s quantum computers must be operated at temperatures close to absolute zero or in a high-quality vacuum →06. A future universal quantum computer would, therefore, require efficient error correction algorithms to counteract the detrimental effects of noise. Various error correction strategies have been proposed, but these use up qubits, reducing the number left on the chip for the computational problem itself.
05 Qubit technologies used for quantum computing. The qubits are typically manipulated with microwave or laser technology.
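The qubit overhead of error correction can be illustrated with its simplest classical analogue, a three-bit repetition code. The sketch below is illustrative only – real quantum codes, such as the surface code, must also handle phase errors and measure error syndromes without collapsing the state, which is why their overhead is far larger – but it shows the basic trade: redundancy buys a lower logical error rate at the cost of extra (qu)bits.

```python
import random

# Toy bit-flip repetition code: one logical bit encoded in three physical bits.
# Classical sketch of the redundancy idea behind quantum error correction.

def encode(bit):
    return [bit, bit, bit]                  # three physical copies per logical bit

def noisy_channel(bits, p_flip, rng):
    """Flip each bit independently with probability p_flip."""
    return [b ^ (rng.random() < p_flip) for b in bits]

def decode(bits):
    return 1 if sum(bits) >= 2 else 0       # majority vote corrects single flips

rng = random.Random(0)
p = 0.05                                    # physical error rate
trials = 100000
raw_errors = sum(noisy_channel([0], p, rng)[0] for _ in range(trials))
enc_errors = sum(decode(noisy_channel(encode(0), p, rng)) for _ in range(trials))
print(raw_errors / trials)   # close to p = 0.05
print(enc_errors / trials)   # close to 3*p**2 -- lower, at triple the bit cost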
Today’s quantum chips are limited to some tens of qubits, with a still significant number of errors in the gate operations. But how is the hardware expected to scale up in the medium and long term? An equivalent of the well-known Moore’s law has been proposed that forecasts the doubling of quantum computing power (“quantum volume”) every year [1]. However, significant technical breakthroughs are needed to scale the technology beyond a few thousand qubits. While waiting for the technology to scale up to the promised immense computing power, it is practical to ask if there may be some benefit already in the medium term.
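Taking the proposed yearly doubling at face value, a back-of-the-envelope projection shows how quickly such exponential growth would compound. The baseline figures below are illustrative assumptions, not vendor numbers:

```python
# If quantum volume doubles every year, project when an (arbitrary) large
# target scale would be reached. Baseline year and value are assumptions.
base_year, base_qv = 2020, 32
target_qv = 2 ** 20            # an arbitrary "useful scale" target

years, qv = 0, base_qv
while qv < target_qv:
    qv *= 2
    years += 1
print(base_year + years, qv)   # 15 doublings: 2035, quantum volume 1048576
```

As with Moore’s law itself, the interesting question is not the arithmetic but whether the engineering can sustain the doubling.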
Let the algorithms play
Quantum computers are probably best known for the theoretical threat they pose to existing encryption systems. However, as factoring cryptographically significant numbers requires computation with around 1 million high-quality qubits, even today’s best noisy small-scale quantum chips fall far short of the capabilities required. The same outlook applies to all the “Mars mission” quantum algorithms – ie, those that promise an astonishing speed-up for a vast number of applications (as opposed to the “moon shot” algorithms, which promise a less impressive speed-up but require only limited hardware).
Such a noticeable gap between the existing hardware and the requirements for running meaningful algorithms has naturally offered an optimal breeding ground for skepticism: Might the engineering challenge involved in gaining a real advantage with quantum hardware simply be too great? To give the skeptics a better argument than just “wait and see,” in 2012 physicist John Preskill suggested another kind of landmark for quantum computers: quantum supremacy. To achieve this landmark, one would demonstrate a quantum computer performing any non-trivial task (which would not even have to be useful) much faster than the best classical hardware. This landmark is what, in October 2019, Google’s quantum computing team achieved [2].
The Google scientists used an artificial computational problem: sampling random quantum circuits. Sequences of quantum gates were randomly generated, ie, without any intention of forming actual algorithms, and the 54-qubit quantum computer (of which 53 qubits were operational) was asked to send back the result of the “computation” as a bit string output (eg, 0101…). Due to the quantum nature of the gates, for each sequence there is a statistical variation with inherent quantum properties in the obtained outputs. The quantum computer repeats the computation, sampling bit strings to build up the output distribution of each quantum circuit as fast as possible. Reproducing the output of such a random quantum circuit on a classical computer becomes exponentially more difficult as the number of qubits and implemented gates grows. The scientists calculated it would take a classical computer 10,000 years to perform this task; the quantum computer took a few minutes.
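A miniature version of the sampling task can be simulated classically for a handful of qubits. The sketch below (illustrative only – these are not Google’s actual circuits) applies random rotations and entangling CZ gates, then samples bit strings from the 2**n stored amplitudes – the very object whose size explodes as qubits are added:

```python
import math, random

# Toy random-circuit sampler. Classical simulation must track 2**n complex
# amplitudes, which is what becomes intractable for larger chips.

def apply_1q(state, n, t, g):
    """Apply a 2x2 gate g to qubit t of an n-qubit statevector."""
    mask = 1 << t
    out = state[:]
    for i in range(1 << n):
        if not i & mask:
            j = i | mask
            out[i] = g[0][0] * state[i] + g[0][1] * state[j]
            out[j] = g[1][0] * state[i] + g[1][1] * state[j]
    return out

def apply_cz(state, a, b):
    """Controlled-Z: flip the sign of amplitudes where both qubits are 1."""
    return [-amp if (i >> a) & 1 and (i >> b) & 1 else amp
            for i, amp in enumerate(state)]

def run_random_circuit(n, depth, rng):
    state = [0j] * (1 << n)
    state[0] = 1 + 0j                         # start in |00...0>
    for _ in range(depth):
        for t in range(n):                    # random Y-rotation on each qubit
            h = rng.uniform(0, 2 * math.pi) / 2
            ry = [[math.cos(h), -math.sin(h)], [math.sin(h), math.cos(h)]]
            state = apply_1q(state, n, t, ry)
        for t in range(n - 1):                # entangle neighboring qubits
            state = apply_cz(state, t, t + 1)
    return state

rng = random.Random(7)
n = 4
state = run_random_circuit(n, depth=5, rng=rng)
probs = [abs(a) ** 2 for a in state]
samples = rng.choices(range(1 << n), weights=probs, k=5)
print([format(s, f"0{n}b") for s in samples])  # sampled bit strings
```

At n = 4 this needs 16 amplitudes; at the 53 qubits of the supremacy experiment it would need 2**53, beyond any conventional memory.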
The task in which supremacy was achieved is not useful for any application, and there is no suggestion that any useful task could be solved any time soon with the current chips. The demonstration “merely” showed that even a small-scale quantum computer can do one particular non-trivial task better than a classical computer. In this light, a term like quantum “advantage,” rather than “supremacy,” might describe the achievement more appropriately.
There is, however, also evidence that even such noisy intermediate-scale quantum (NISQ) computers with some hundreds or thousands of qubits could help solve complicated combinatorial optimization problems with quantum heuristics such as quantum annealing or a quantum approximate optimization algorithm (QAOA) →07. Such combinatorial optimization problems have a vast number of applications, and quantum chips of the relevant scale are already in the short- and intermediate-term roadmaps of all the relevant quantum hardware players. In September 2020, IBM released their quantum computing roadmap, which foresees a 1,000-plus qubit device, called IBM Quantum Condor, by 2023 [3]. For sectors ranging from finance to automotive, there have been recent indications of how such advances in quantum chips could bring benefits [4,5].
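Such combinatorial problems are commonly cast as QUBO (quadratic unconstrained binary optimization) instances, the native input format of quantum annealers and of QAOA. A brute-force classical solution of a tiny, arbitrary instance (the matrix below is purely illustrative) makes the 2**n search space explicit – the space that heuristics, quantum or classical, try to shortcut:

```python
import itertools

# QUBO: minimize x^T Q x over binary vectors x. Brute force works for tiny n
# but scales as 2**n -- the regime NISQ heuristics hope to improve on.
# Q is an arbitrary illustrative instance (upper-triangular convention).
Q = [
    [-3,  2,  0,  1],
    [ 0, -2,  2,  0],
    [ 0,  0, -1,  2],
    [ 0,  0,  0, -2],
]

def qubo_energy(x, Q):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Exhaustively check all 2**4 = 16 assignments.
best = min(itertools.product([0, 1], repeat=4), key=lambda x: qubo_energy(x, Q))
print(best, qubo_energy(best, Q))   # minimum energy is -4 for this instance
```

With 4 variables, 16 evaluations suffice; with 100 variables the same loop would need about 10**30, which is why heuristic solvers matter.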
In the industrial autonomy driving seat
ABB has looked into quantum computing technology to map its potential in improving the optimization of large autonomous fleets, energy systems, supply chains, or manufacturing processes. Many of the applications involve extremely complex optimization problems that cannot currently be solved efficiently, or at all. Will quantum computing be the key to solving these problems?
Amid all the recent hype surrounding quantum computing, it is good to keep in mind that it is not only hardware that can revolutionize computational approaches to future autonomous systems. Innovations in optimization algorithms themselves may, in the end, present a better return on investment, even with the same old classical hardware. Some of these innovations have been inspired by quantum computing algorithms and have shortened solution times greatly for some problems. These algorithmic developments have not hit the headlines as quantum computing has done, but they could have a bigger impact in the near term.
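Simulated annealing – the classical heuristic from which quantum annealing takes its name – illustrates the family of methods that such algorithmic innovations build on. The sketch below uses an arbitrary illustrative cost function and cooling schedule, not any specific published algorithm, and runs on plain classical hardware:

```python
import math, random

# Minimal simulated annealing over bit strings: accept worsening moves with a
# temperature-dependent probability to escape local minima, then cool down.

def anneal(cost, n_bits, steps, rng):
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    cur = cost(x)
    best, best_c = x[:], cur
    for step in range(steps):
        temp = max(1e-6, 1.0 - step / steps)   # linear cooling schedule
        i = rng.randrange(n_bits)
        x[i] ^= 1                              # propose flipping one bit
        new = cost(x)
        if new <= cur or rng.random() < math.exp((cur - new) / temp):
            cur = new                          # accept the move
            if new < best_c:
                best, best_c = x[:], new
        else:
            x[i] ^= 1                          # reject: undo the flip
    return best, best_c

# Illustrative cost: number of bits differing from a hidden target pattern.
target = [1, 0, 1, 1, 0, 0, 1, 0]
cost = lambda x: sum(a != b for a, b in zip(x, target))
rng = random.Random(1)
best, best_c = anneal(cost, 8, 2000, rng)
print(best, best_c)   # should recover the hidden pattern (cost 0)
```

Quantum annealing replaces the thermal escape mechanism with quantum tunneling; whether that yields a practical advantage on real problem instances is exactly the open question discussed above.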
In the driver’s seat of industrial autonomy, ABB is steering the adoption of emerging computational technologies for novel industrial applications. As technological advances unfold in software and quantum hardware, it is crucial to be prepared to unlock any potential to redraw the industrial automation landscape.
[1] J. Gambetta and S. Sheldon, “Cramming More Power Into a Quantum Device,” IBM Research Blog, 2019. Available: https://www.ibm.com/blogs/research/2019/03/power-quantum-device/. [Accessed September 2, 2020].
[2] F. Arute et al., “Quantum supremacy using a programmable superconducting processor,” Nature, vol. 574, pp. 505–510, 2019. Available: https://doi.org/10.1038/s41586-019-1666-5. [Accessed September 2, 2020].
[3] J. Gambetta, “IBM’s Roadmap For Scaling Quantum Technology,” IBM Research Blog, 2020. Available: https://www.ibm.com/blogs/research/2020/09/ibm-quantum-roadmap/. [Accessed September 21, 2020].
[4] F. Neukart et al., “Traffic Flow Optimization Using a Quantum Annealer,” Frontiers in ICT, December 20, 2017. Available: https://www.frontiersin.org/articles/10.3389/fict.2017.00029/full. [Accessed September 2, 2020].
[5] M. Kühn et al., “Accuracy and Resource Estimations for Quantum Chemistry on a Near-Term Quantum Computer,” Journal of Chemical Theory and Computation, vol. 15, no. 9, pp. 4764–4780, 2019. Available: https://pubs.acs.org/doi/abs/10.1021/acs.jctc.9b00236. [Accessed September 2, 2020].
[6] A. Montanaro, “Quantum algorithms: an overview,” npj Quantum Information, vol. 2, article 15023, 2016. Available: https://doi.org/10.1038/npjqi.2015.23. [Accessed September 2, 2020].