Elsi-Mari Borelli, Theoretical and Computational Methods, Baden-Dättwil, Switzerland; Thorsten Strassel, Switchgear, Baden-Dättwil, Switzerland
Anybody following tech news will have encountered the topic of quantum computing. Whether in IT magazines, science news, or the business reports of mainstream media, the promising new technology is regularly explained, hyped, or debunked. The spectacular news in 2019 of Google achieving “quantum supremacy” stirred wild excitement, with the achievement even being compared to Apollo 11’s landing on the moon.
From the moon, it is worth bringing one’s feet back to the ground and taking a closer look at the “nuts-and-bolts” impact of quantum computing on future industrial applications from a present-day perspective.
First, quantum computers are not anticipated to be a turbo-boosted replacement for one’s laptop, not even in the long term. Second, they are not universal supercomputers replacing all big-cluster computing. They are large, special-purpose machines that aim to gain an edge over conventional hardware (and its successors) on dedicated computational problems. They use principles of quantum mechanics to crack problems in minutes for which the fastest supercomputers would need thousands of years or longer. But today’s algorithms will not deliver this speed-up; special algorithms are required that harness the laws of quantum physics.
The idea of quantum computing is not new in itself: it was proposed by the Nobel Laureate in Physics Richard Feynman in 1982. Unlike a digital bit, which has a value of either zero or one, a qubit can hold both values simultaneously, each with its own probability, until it is measured. The logical operations performed on qubits therefore differ fundamentally from those on classical computers.
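The measurement behavior described above can be illustrated with a minimal simulation. The sketch below (an illustration by this article’s editors, not part of the original text) represents a single qubit by two amplitudes and shows that repeated measurements reveal the underlying probabilities:

```python
import math
import random

# A single qubit in equal superposition of 0 and 1, written as
# amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)

def measure(alpha, beta):
    """Collapse the qubit: return 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Repeated measurements on identically prepared qubits reveal the
# probabilities hidden in the amplitudes.
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
# For this state, both outcomes occur roughly half the time.
```

A single measurement yields only one bit of information; the superposition itself can never be read out directly, which is one reason quantum algorithms must be structured very differently from classical ones.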
Quantum computing has received widespread attention due to the theoretical threat it poses to all present-day encryption systems (Shor’s algorithm). However, factoring cryptographically significant numbers requires computation with around one million high-quality qubits. Today’s leading quantum chips, such as Google’s 54-qubit Sycamore design, are still light years away from the size and low noise needed for such applications →01. In September 2020, IBM released their quantum computing roadmap, which foresees a 1,000-plus qubit device, called IBM Quantum Condor, by 2023.
There is, however, evidence that even so-called noisy intermediate-scale quantum (NISQ) computers can help solve complicated combinatorial optimization problems with quantum heuristics. Quantum chips of this scale already appear in the medium-term roadmaps of many companies. Even if the speed-up on NISQ hardware is less pronounced, when tailored to the right application the technology may have a major commercial impact. Therefore, many companies and government agencies are advancing research with the aim of having solutions ready in the medium term.
Quantum computing may therefore play a key role in building autonomous systems in the future. But it is not the only technology that has the power to revolutionize computational achievements.
Innovations in optimization algorithms themselves can deliver much more bang for the buck, even on classical hardware. In fact, algorithms developed for quantum hardware have recently inspired a new breed of optimization algorithms that provide great speedups for some problems.
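To make the notion of a classical heuristic for combinatorial optimization concrete, the sketch below (an editorial illustration, not drawn from the article) applies simulated annealing, a long-established classical heuristic, to a toy Max-Cut instance; quantum and quantum-inspired optimizers target the same class of problem formulations:

```python
import math
import random

# Toy Max-Cut instance: a 4-node cycle plus one diagonal edge.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

def cut_size(assignment):
    """Number of edges crossing the partition defined by 0/1 labels."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

state = [random.randint(0, 1) for _ in range(n)]
for step in range(2000):
    temp = max(0.01, 1.0 - step / 2000)  # simple linear cooling schedule
    candidate = state[:]
    candidate[random.randrange(n)] ^= 1  # flip one node to the other side
    delta = cut_size(candidate) - cut_size(state)
    # Always accept improvements; accept worse moves with a probability
    # that shrinks as the temperature falls.
    if delta >= 0 or random.random() < math.exp(delta / temp):
        state = candidate
# The best possible cut for this graph crosses 4 of the 5 edges.
```

The same energy-landscape view of a problem underlies quantum annealers and the quantum-inspired classical solvers mentioned above, which differ mainly in how they escape local optima.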
The topic of quantum computing (and its potential role in industrial automation) will be covered in greater detail in an article in an upcoming issue of ABB Review.
J. Gambetta, “IBM’s Roadmap For Scaling Quantum Technology.” Available: https://www.ibm.com/blogs/research/2020/09/ibm-quantum-roadmap/. [Accessed September 21, 2020].