Once considered a pie-in-the-sky technology, quantum computing is emerging as a way for enterprises to tackle machine learning (ML), optimization, search, and other challenges that classical computing models can’t touch. CIOs must begin exploring the technology now or risk falling behind rivals, according to Gartner.
Quantum machines will crunch in seconds data that would take supercomputers years to process. That could be game-changing for enterprises that figure out how to apply them to significant computing challenges, says Gartner analyst Matthew Brisse. Machine learning may be the perfect use case: quantum machines will run ML algorithms faster, accelerating enterprises’ ability to process information and derive insights, according to Brisse. “If you can speed up the machine learning aspect of quantum computing you will accelerate the adoption of AI and make it more efficient,” he says.
Gartner estimates that 20 percent of Fortune 500 companies will be budgeting for quantum computing projects by 2021. Brisse says IT leaders ask him what quantum computing is, what they can do with it and where to find engineers to work with the technology. Most importantly, CIOs want to know how to apply quantum computing to their business and identify opportunities for innovation.
All great questions, Brisse says, because CIOs don’t want to repeat the mistake they made by ignoring ML and AI. When ML and AI hit the mainstream, CIOs suddenly couldn’t find enough data scientists or other engineers, Brisse says. For many enterprises, bringing quantum machines into the IT shop will be an uphill, likely fruitless affair, considering that keeping qubits stable is a challenge in itself. To that end, here are the facts CIOs should know about quantum computing.
What is quantum computing?
Quantum computing is a powerful approach to processing data using quantum bits, or qubits, which can represent a value of 0, a value of 1, or, in superposition, both 0 and 1 simultaneously. This superposition of states lets quantum computers evaluate many possibilities at once, far outpacing classical computers, which store information in binary bits as either 0 or 1.
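To make superposition concrete, here is a minimal Python sketch (not vendor code, and only a classical simulation): a qubit is described by two complex amplitudes, and the squared magnitude of each amplitude gives the probability of measuring 0 or 1.

```python
import math

# A qubit's state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalized so that |a|^2 + |b|^2 = 1.
a = 1 / math.sqrt(2)   # amplitude of |0>
b = 1 / math.sqrt(2)   # amplitude of |1>  -> an equal superposition

# Measurement collapses the superposition; each outcome's probability
# is the squared magnitude of its amplitude.
p0 = abs(a) ** 2
p1 = abs(b) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- either outcome equally likely
```

In an equal superposition, a measurement yields 0 or 1 with equal probability; a classical bit, by contrast, is always definitely one or the other.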
How does quantum computing work?
To perform a computation with many qubits, all must be sustained in interdependent superpositions of states, also known as quantum-coherent states, in which the qubits are said to be entangled. While qubits are entangled, a tweak to one can influence all the others. But qubits are fragile; fluctuations in temperature, noise, frequency and motion can trigger a condition physicists call decoherence, a leakage of energy that derails calculations. Today’s qubits can maintain their quantum state for only about 100 microseconds before decoherence sets in, according to IBM researchers.
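Entanglement can also be simulated classically at small scale. The sketch below (illustrative only, pure Python) builds the standard Bell state by applying a Hadamard gate and then a CNOT gate to two qubits: afterwards, only the outcomes 00 and 11 are possible, so measuring one qubit fully determines the other.

```python
import math

# Two-qubit state as 4 amplitudes, indexed |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]           # start in |00>

h = 1 / math.sqrt(2)

def hadamard_q0(s):
    # Hadamard on the first qubit mixes the |0x> and |1x> amplitude pairs.
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    # CNOT with qubit 0 as control flips qubit 1 when qubit 0 is 1,
    # i.e. it swaps the |10> and |11> amplitudes.
    return [s[0], s[1], s[3], s[2]]

state = cnot(hadamard_q0(state))       # Bell state (|00> + |11>) / sqrt(2)
probs = [abs(amp) ** 2 for amp in state]
print([round(p, 2) for p in probs])    # [0.5, 0.0, 0.0, 0.5]
```

The middle two probabilities are zero: the qubits’ fates are correlated, which is exactly the entanglement a real machine must protect from decoherence.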
The operating conditions for quantum computers are extreme. The machines must be cooled to 0.015 Kelvin, or 180 times colder than interstellar space, says Brisse. They must also be shielded to 50 times less than Earth’s magnetic field, held in a high vacuum at a pressure 10 billion times lower than atmospheric pressure, and placed on a low-vibration floor.
How fast is a quantum computer?
A much sought-after benchmark, quantum supremacy is the point at which a quantum computer can perform a calculation faster than today’s fastest supercomputer. The generally accepted number at which this occurs is 50 qubits. But the challenge of maintaining coherence grows ever greater as the number of qubits increases. As a result, Brisse says, quantum performance ultimately hinges on minimizing quantum errors.
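The 50-qubit figure has an intuitive back-of-the-envelope justification: simulating n qubits classically requires tracking 2^n complex amplitudes, and at roughly 50 qubits that state vector outgrows any existing supercomputer’s memory. A quick stdlib-only calculation (assuming 16 bytes per complex amplitude):

```python
# A classical simulator must track 2**n complex amplitudes.
# Assume 16 bytes per amplitude (two 64-bit floats).
for n in (10, 30, 50):
    amplitudes = 2 ** n
    petabytes = amplitudes * 16 / 1e15
    print(f"{n} qubits: {amplitudes:.3e} amplitudes, ~{petabytes:.4g} PB")
```

At 50 qubits the state vector alone needs roughly 18 petabytes of memory, which is why 50 qubits is the commonly cited threshold at which classical simulation becomes impractical.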
“What really matters is how long you can keep a qubit in superposition and entangled,” Brisse says. “That’s the real metric.” He estimates that true quantum supremacy, in which systems can maintain coherence long enough to fulfill their quantum promise, is a year or two away.
Quantum computing early adopters
CIOs interested in the potential of quantum computing should take stock of early adopters. In March 2017, Volkswagen began using quantum machines from D-Wave Systems to optimize traffic flow for 10,000 taxis in Beijing, China. Volkswagen CIO Martin Hofmann says his team had to program a quantum chip to address every qubit on the chip. “Quantum computing in the next five years will be a dominating technology,” Hofmann says. Accenture and 1QBit are working with Biogen to speed up drug discovery by accelerating the rate at which they can simulate molecules and chemical reactions. J.P. Morgan Chase is working with IBM to use quantum computers for risk analysis and trading strategies.
Quantum computing applications
In looking at where to apply quantum computing, Brisse says CIOs should identify problems involving large data sets that can’t be solved by classical computers, including NP-hard problems such as the travelling salesman optimization problem. Analyzing mortality tables in insurance and calculating risk in securities are two common problems for which Brisse fields inquiries. Once a quantum-worthy problem has been defined, CIOs can begin looking at vendors to work with, Brisse says.
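The travelling salesman problem shows why such problems outrun classical machines: exhaustive search over n cities means checking (n-1)! tours, a number that explodes long before n gets large. A toy brute-force solver (with a small, hypothetical distance matrix) makes the combinatorics visible:

```python
from itertools import permutations

# Toy symmetric distance matrix for 5 cities (hypothetical values).
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def tour_length(tour):
    # Total distance of a round trip visiting cities in the given order.
    return sum(dist[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

n = len(dist)
# Fix city 0 as the start and try every ordering of the rest: (n-1)! tours.
best = min((list((0,) + p) for p in permutations(range(1, n))),
           key=tour_length)
print(best, tour_length(best))
```

With 5 cities there are only 24 tours to check; at 20 cities there are about 1.2 * 10^17, which is the kind of combinatorial wall that makes a problem a candidate for quantum approaches.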
Quantum computing vendors and platforms
IBM and Intel have developed quantum computers with 50 and 49 qubits, respectively, and Google is working on one of similar scale. Brisse, who tracks 50 vendors in this space, says IBM, 1QBit, D-Wave, Microsoft and Rigetti Computing are among those that have developed programming interfaces for quantum machines. CIOs should familiarize themselves with these vendors and their offerings, and download a quantum computing framework, including a software development kit and APIs.
Brisse says CIOs shouldn’t rush to purchase quantum machines, especially before they have the talent and coding expertise to build the software that will run on them. Instead, test computationally intensive stages of ML workflows, including training, inference and optimization, on a quantum computing-as-a-service offering, such as IBM’s Quantum Experience.
“The key advice is: Don’t panic,” Brisse says. “You have time, but don’t get behind like you did with ML and AI.”