IBM to Invest $3 Billion in Chips for Cloud, Big Data Systems

To address the physical limitations of silicon transistors, IBM is investing heavily in research of new chip architectures and materials to power new generations of cloud, big data and cognitive systems.

IBM tomorrow will announce a $3 billion research and development investment to create a new generation of semiconductors geared for cloud, big data and cognitive systems. Over the next five years, Big Blue says it will use the money to fund two broad research and early-stage development programs to overcome the challenges involved in making ever-more-powerful chips.

"The simple scaling of semiconductors died almost 10 years ago and people have been quietly able to hide that," says Bernie Meyerson, IBM Fellow and vice president of Innovation.

Meyerson notes that it's not that engineers haven't been able to continue scaling down semiconductors over the past few years; it's that individual components and materials have reached their physical limits.


"There have been fundamental changes in the materials used," Meyerson says. "Almost 10 years ago, it was very clear that the gate oxide in the semiconductor, an insulator, became so small that it became a conductor. That's useless."

While engineers have developed workarounds and continued increasing the density of chips, it hasn't been without cost, Meyerson says. Because some of the individual materials are not scaling well, he says, the development of new chips in recent years has become an "exercise in better economics rather than increased performance."

Meyerson says he hopes that IBM's research efforts will pay off in new semiconductors that will transform the very nature of computing, allowing powerful big data analytics to become as pervasive in the future as the hand calculator is in the present.

"It could be as much of a revolution as when engineers put down their slide rules and started using the hand calculator," he says.

The first IBM research program will aim at scaling semiconductors to 7 nanometers and below from today's 22 nanometers. The company notes that doing so will require significant investment and innovation in semiconductor architectures as well as invention of new tools and manufacturing techniques.
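As a rough sense of what that jump means: if transistor density scaled with the inverse square of the node dimension, moving from 22 nm to 7 nm would fit roughly 10 times as many transistors in the same area. This is an idealized, back-of-the-envelope sketch (modern node names are partly marketing labels and no longer map directly to a single feature size), not a figure from IBM:

```python
# First-order estimate of the density gain from a 22 nm to a 7 nm
# process, assuming transistor density scales with the inverse square
# of the node dimension. An idealization only: actual density gains
# depend on cell design, interconnect, and what the node name measures.
def density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(round(density_gain(22, 7), 1))  # ~9.9x more transistors per unit area
```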


"The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when and at what cost?" says John Kelly, senior vice president, IBM Research.

"IBM engineers and scientists, along with our partners, are well-suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data and cognitive systems," Kelly says. "This new investment will ensure that we produce the necessary innovations to meet these challenges."

The second program will focus on developing alternative technologies for post-silicon era chips to get around the physical limitations of silicon-based semiconductors. These new materials will require entirely different approaches, IBM says.

Is There a Future for Moore's Law Past 7 nm?

Moore's Law, named after Intel co-founder Gordon Moore, states that the number of transistors in a dense integrated circuit doubles approximately every two years — in practice it's been roughly every 18 to 24 months. Moore made his initial observation in a 1965 paper published in Electronics Magazine (he revised the doubling period to two years in 1975), and it has held true since that time. But physics suggests that eventually there must come a point where it is no longer possible to scale the integrated circuit.
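The compounding behind that observation is easy to miss. A simple sketch of the doubling curve (purely illustrative, with a hypothetical starting count and the two-year period quoted above):

```python
# Rough Moore's Law projection: transistor count doubling every
# `period` years. Illustrative only; the real doubling period has
# varied between roughly 18 and 24 months, and the starting count
# here is a made-up example.
def transistors(start_count: float, years: float, period: float = 2.0) -> float:
    return start_count * 2 ** (years / period)

# A hypothetical 1-billion-transistor chip after a decade of 2-year doublings:
print(f"{transistors(1e9, 10):.0f}")  # 32 billion, i.e. 32x the start
```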


Many experts believe the end will come somewhere between 10 nm and 5 nm on silicon. IBM believes there is a future beyond 7 nm, but it won't be found in silicon. Instead, it points to potential silicon alternatives like carbon nanotubes and computational approaches such as neuromorphic computing and quantum computing.

Specific areas of research will include quantum computing, neurosynaptic computing, silicon photonics, III-V technologies, carbon nanotubes, graphene, and next-generation low-power transistors, such as tunnel field-effect transistors, which use the quantum-mechanical effect of band-to-band tunneling to drive current through the device.

"In the next 10 years computing hardware systems will be fundamentally different as our scientists and engineers push the limits of semiconductor innovations to explore a post-silicon future," says Tom Rosamilia, senior vice president, IBM Systems and Technology Group. "IBM Research and Development teams are creating breakthrough innovations that will fuel our next era of computing systems."

The new research efforts will bring together IBM Research scientists and engineers from Albany and Yorktown, NY; Almaden, Calif.; and Zurich, Switzerland. IBM plans to "hire significantly" in emerging areas of research that are already underway at IBM, including carbon nanoelectronics, silicon photonics, new memory technologies and architectures that support quantum and cognitive computing.

IBM says the teams will focus on delivering an order-of-magnitude improvement in system-level performance and energy efficiency. The company also plans to continue funding and collaborating with university researchers to explore and develop semiconductor technologies. It notes that it will continue to support and fund university research through private-public partnerships such as the NanoElectronics Research Initiative (NRI), the Semiconductor Advanced Research Network (STARnet) and the Global Research Consortium (GRC) of the Semiconductor Research Corporation.
