How much should CIOs care about quantum today?

Quantum computing may not be a cybersecurity concern yet, but CIOs should hope their vendors are already developing software with quantum in mind.


 [This article was co-written by Michael Hickins.]

In an industry constantly looking over the horizon for the “next big thing,” quantum computing (QC) is hands-down the winner when it comes to magazine covers and breathless reporting. But, unlike many overhyped technologies, QC already exists – albeit in very limited scope – and it may surpass current computing paradigms by many orders of magnitude for some types of computation. While many researchers are looking into how QC can be applied to real problems in the future, some areas are already in the crosshairs and should be on the radar of CIOs.

One of those areas is in the field of cryptography, where QC may be capable of breaking modern cryptography by 2030, rendering existing encryption protocols obsolete. While even that may seem like a long way off, “a decade is not a long time,” Sir Peter Knight, chair of the Quantum Metrology Institute at the United Kingdom’s National Physical Laboratory, told the Wall Street Journal last year.

Or as the CIO of one multinational chemical company put it, “it's no longer science fiction… rather, it’s ‘Big Bang Theory’ moving towards ‘Modern Family.’”

Not waiting for NIST standards

The National Institute of Standards and Technology (NIST) has already instituted a formal working group charged with developing new cryptographic standards that will withstand QC’s charge.

NIST is expected to issue draft standards for post-quantum cryptography that will be available for public comment sometime between 2023 and 2025, says Daniel Southern, Information Security Senior Manager, Global Information Security, at Oracle, who is participating in the working group. “But that doesn’t mean we’re just going to wait seven years until these new algorithms become available,” says Southern.

In the meantime, he says, Oracle is already working on security measures that help strengthen its position on QC. “Proactive solutions are always the goal when it comes to information security, and in order to prevent issues you have to understand them… Our board has considerable experience dealing with these types of risks, and while modern cryptography is severely impacted, there are ways it can be used to bolster our resistance in the meantime.”

Even academics are suggesting companies start taking action. “By investing in the proper defenses now, those who plan ahead can keep their cool and protect their critical data when everybody else is either panicking or drowning in a sea of quantum disruption,” writes Alan Usas, program director and adjunct professor of computer science for the Executive Master in Cybersecurity program at Brown University.

Quantum computing will impact more than just cybersecurity

QC will also have an impact in fields where extremely large data sets can be used to predict future outcomes – think weather forecasting and the health sciences. It’s unclear, however, where else QC will have an impact on business, which is why vendors are in many cases partnering with academic and other specialists.

Oracle, for example, has begun working with USC’s Daniel Lidar, one of the leading QC experts in North America. Lidar has directed research and development at the USC-Lockheed Martin Quantum Computation Center (QCC) since 2011.

The strategy is to “leverage the work of a specialized partner in ways that benefit Oracle’s customers,” says Alan Wood, senior research director of Oracle Labs.

Lidar will be responsible for studying aspects of QC scalability and “the time frame in which quantum computing may become relevant to business applications,” says Kresimir Mihic, who along with Michael Delorimier, is Oracle’s principal investigator for the project. Mihic and Delorimier, meanwhile, will try to determine which of Oracle’s applications could be enhanced by QC.

This is very much the position most CIOs are in today.

“I think we are really trying to understand what this really is from a layperson’s perspective and will then look for business application, but in the world of AI, AR, VR this tech can move us forward quickly,” noted the chemical company CIO.

But a significant number of hurdles remain before QC can become a factor, even in the hands of very large companies or even nations.

QC will become valuable for predicting future outcomes from enormous data sets, but doing so means harnessing significantly larger numbers of qubits (the quantum equivalent of classical binary bits) than anyone has managed to date.
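The scale gap between bits and qubits can be sketched with simple arithmetic: n classical bits hold just one of 2^n values at a time, while the state of n qubits is described by 2^n complex amplitudes at once, which is why simulating even modest qubit counts on classical hardware quickly becomes infeasible. A minimal illustration (the qubit counts and 16-bytes-per-amplitude figure are illustrative assumptions, not figures from the article):

```python
# Rough illustration: the state of n qubits is described by 2**n
# complex amplitudes, while n classical bits hold only one of 2**n
# values at a time. Simulating qubits classically therefore blows
# up exponentially in memory.

def amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (10, 50, 100):
    # Assume 16 bytes per complex amplitude (two 64-bit floats).
    mem_bytes = amplitudes(n) * 16
    print(f"{n} qubits -> {amplitudes(n):.3e} amplitudes, "
          f"~{mem_bytes:.3e} bytes to simulate classically")
```

Even at 50 qubits the memory needed for a full classical simulation runs into the tens of petabytes, which is the intuition behind the “many orders of magnitude” speedups the article describes.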

To put this into perspective, consider that it would take the world’s fastest supercomputer more than a thousand years to perform the number of calculations needed to break modern cryptography; a quantum computer could do it in under a minute, provided, however, that it had over 4,000 qubits. The most anyone has harnessed as of this writing is still under 100 qubits.

For most computing functions, traditional computers will work just fine. The way Mihic sees it, the Space Shuttle can be driven by the Commodore 64. “You need quantum computing to go into warp drive,” he says.

Essentially, the way QC works is that all the qubits in a computer perform the same calculation at the same time and reconcile the “correct” answer among themselves. While any single run could be considered nothing more than an educated guess, the number of qubits, the speed with which the calculations are performed (microseconds), and the number of repetitions possible in a very short interval of time practically guarantee a correct outcome.

“When NIST says it'll take a few hours to solve for an encryption key in 2030, they aren't necessarily saying it'll be a single attempt that will take a few hours to run,” Southern says. “Rather, that they might have run several attempts and will average one correct solution in a few hours. Having more qubits increases your probability of being right, so once you have a few hundred thousand [qubits], it's more likely I'll teleport home today after work rather than giving you a wrong answer,” he says.
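Southern’s point about averaging over many attempts follows from basic probability: if a single probabilistic run succeeds with probability p, the chance of at least one success in k independent runs is 1 − (1 − p)^k, and the expected number of runs until the first success is 1/p. A hedged sketch (the values of p and k below are made-up illustrations, not NIST or Oracle estimates):

```python
# Probability of at least one correct answer after k independent
# probabilistic runs, each succeeding with probability p.
# p = 0.001 and the values of k are illustrative, not NIST figures.

def p_at_least_one_success(p: float, k: int) -> float:
    """Chance that k independent runs yield at least one success."""
    return 1.0 - (1.0 - p) ** k

def expected_runs(p: float) -> float:
    """Expected number of runs until the first success (geometric)."""
    return 1.0 / p

p = 0.001  # assumed per-run success probability
for k in (100, 1000, 5000):
    print(f"{k} runs -> P(at least one success) = "
          f"{p_at_least_one_success(p, k):.3f}")
print(f"expected runs to first success: {expected_runs(p):.0f}")
```

With fast microsecond-scale runs, even a tiny per-attempt success probability yields a near-certain correct answer within a short wall-clock interval, which is the sense in which repetition “practically guarantees” the outcome.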

The challenge of harnessing that many qubits is that “in order to keep a superposition over large numbers of states we need error correction,” Delorimier says. Error correction offsets interference from external factors, which becomes inevitable when dealing with very large numbers of qubits. “Without error correction, whenever a quantum computer interacts with the outside world, its quantum state collapses. That is, the exponentially large state that it uses to go so fast is no longer exponentially large.”
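Delorimier’s point about why error correction becomes essential at scale can be made concrete: if each qubit independently survives an operation uncorrupted with probability 1 − ε, the whole register survives with probability (1 − ε)^n, which collapses toward zero as n grows. A minimal sketch (the per-qubit error rate ε below is an illustrative assumption, not a measured figure):

```python
# Why error correction matters at scale: if each of n qubits survives
# a step uncorrupted with probability (1 - eps), the whole register
# survives with probability (1 - eps)**n, which falls toward zero as
# n grows. eps = 0.001 is an illustrative per-qubit error rate.

def survival_probability(eps: float, n_qubits: int) -> float:
    """Chance an n-qubit register passes one step with no errors."""
    return (1.0 - eps) ** n_qubits

eps = 0.001
for n in (50, 1000, 100_000):
    print(f"{n:>7} qubits -> P(no error) = "
          f"{survival_probability(eps, n):.4f}")
```

Even a 0.1% per-qubit error rate leaves a hundred-thousand-qubit register with essentially zero chance of completing a step cleanly, which is why the large machines needed to threaten cryptography depend on error correction rather than on perfect hardware.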

More hurdles to QC loom

Another challenge is power and cooling – familiar to engineers of contemporary computing, but at a much greater scale. Running a quantum computer today requires keeping the qubits at temperatures near absolute zero.

Other challenges that must be solved before QC can be brought into the computing mainstream include: quantum circuits and control logic, problem and algorithm reformulation to fit the quantum paradigm, and manufacturability.

Costs will eventually come down and the cooling requirements will probably become somewhat less onerous, but QC will still probably only be useful for arriving at answers to questions involving enormous data sets, such as the huge number of variables involved in weather forecasting or simulating the human brain.

Michael Hickins is director of strategic communications at Oracle. Previously, he was the founding editor of The Wall Street Journal’s CIO Journal, the editor of WSJ.com’s Digits blog, and executive editor at eWEEK. He is also a noted fiction writer.

This article is published as part of the IDG Contributor Network.
