by Roger Kay

IBM promotes IT infrastructure for cognitive workloads

Opinion
Jun 28, 2016
Analytics | Big Data | Data Center

In its latest transformation, IBM has designed open infrastructure elements from the ground up to deliver cognitive insights in the cloud, on premises, or both.

Over the past century, IBM has reinvented itself again and again. This evolution is in stark contrast to one-trick ponies in the information technology business. Companies like LinkedIn do one thing, maybe quite well, until they reach the end of their run, which in LinkedIn's case was a purchase by Microsoft. IBM, on the other hand, has been forced to remake its business a number of times — moving from typewriters to mainframes, from mainframes to PCs, from PCs to infrastructure, and now, building on that last round, to cloud computing, analytics, and infrastructure for the cognitive era.

Each time, the change has been painful but effective. Thousands of people lose their jobs in the older businesses, and thousands more are hired to staff the new ones. The company is such a veteran of this type of move that it has even developed a template for self-regeneration. Few other companies have that.

The philosophy behind this continuous upheaval is to migrate away from commodity businesses and toward high-value, high-margin sectors. This latest approach is all about enabling customers to get business value out of the vast streams of data being generated every day from both inside and outside their companies.

IBM’s current focus is on IT infrastructure for cognitive workloads, an idea that combines two company strengths: its growing business in intelligent analytics and its mature position in the powerful computing hardware needed to run that type of intensive activity effectively. They are interdependent and mutually reinforcing. And at the moment, they are being co-developed, each with the other in mind.

Some of the principles IBM has adopted to realize this new take on infrastructure include designing it from the ground up to deliver cognitive insights, working in an open environment that lets partners help create the solution and customers choose how to deploy it, and tying powerful on-premises equipment securely to valuable cloud services.

As this phase of IBM’s business evolves, the company expects to build intelligence right into the solution stack itself. An early example of this type of capability is already available on the company’s mainframe, z Systems. Called zAware, it is a form of operational analytics: it detects anomalies in sensor data, using machine learning to recognize patterns that precede a failure, so problems can be mitigated before they affect users.
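
In greatly simplified form, this kind of operational anomaly detection amounts to flagging readings that stray far from a rolling baseline. The Python sketch below illustrates that idea only; the window size, threshold, and z-score method are assumptions for the example, not IBM's actual zAware model.

```python
from collections import deque
import statistics

def rolling_anomalies(readings, window=60, threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline.

    A toy stand-in for the pattern recognition zAware applies to
    operational telemetry; window and threshold are illustrative.
    """
    history = deque(maxlen=window)
    flagged = []
    for i, value in enumerate(readings):
        if len(history) >= 2:
            mean = statistics.mean(history)
            spread = statistics.stdev(history) or 1e-9  # avoid divide-by-zero
            if abs(value - mean) / spread > threshold:
                flagged.append((i, value))  # candidate precursor to a failure
        history.append(value)
    return flagged

# A stable signal with one wild reading: only the outlier is flagged.
print(rolling_anomalies([10.0, 10.1, 9.9, 10.0, 47.0, 10.1]))
```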

Another example on the storage side is an offering that detects patterns in file usage and moves data to the appropriate storage tier. So, a spike in viewing of a particular video tips the system to move that file to fast flash memory. When the video goes cold, it’s moved back down to a lower tier automatically. In general, the hot data all sits on flash drives, while the cold data goes to disk, cloud, or tape archive.
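
A toy version of such a usage-driven tiering policy might look like the following; the tier names and access-rate thresholds are illustrative assumptions, not the logic of any IBM storage product.

```python
# Hypothetical tiers, ordered fastest to slowest, with the minimum
# reads-per-hour a file needs to qualify for each (illustrative values).
TIERS = [("flash", 100), ("disk", 1), ("archive", 0)]

def choose_tier(reads_last_hour: int) -> str:
    """Map a file's recent access rate to the storage tier it belongs on."""
    for tier, min_rate in TIERS:
        if reads_last_hour >= min_rate:
            return tier
    return TIERS[-1][0]

def rebalance(files: dict) -> dict:
    """Given {path: reads_last_hour}, decide where each file should live."""
    return {path: choose_tier(rate) for path, rate in files.items()}

# A viral video gets promoted to flash; a cold one drops to the archive tier.
print(rebalance({"/video/viral.mp4": 5000, "/video/stale.mp4": 0}))
# -> {'/video/viral.mp4': 'flash', '/video/stale.mp4': 'archive'}
```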

Complex analytical queries are resource intensive, and response times need to be fast — measured in microseconds — so that customers can make use of intelligence in real time. IBM’s open architecture is designed to work with complementary technology, like hardware accelerators for rapid data crunching. With partner technology and IBM’s own optimized hardware, workloads can be managed intelligently. Areas where IBM brings in partners include I/O, memory, and specialized accelerators. For example, Mellanox supplies networking and NVIDIA provides hardware accelerators for POWER8 systems.
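
The offload pattern itself is simple to show. The sketch below uses the cupy library as a stand-in for whatever accelerator stack a deployment actually runs; it assumes a CUDA-capable GPU is present and falls back to the CPU otherwise.

```python
import numpy as np

try:
    # cupy mirrors the NumPy API on NVIDIA GPUs (assumption: a CUDA
    # device and the cupy package are available in this environment).
    import cupy as xp
except ImportError:
    xp = np  # graceful CPU fallback

def crunch(n: int = 2048):
    """Multiply two large random matrices on whichever device is available."""
    a = xp.random.rand(n, n)
    b = xp.random.rand(n, n)
    return a @ b  # dispatched to the GPU when cupy is loaded
```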

With a combination of structured corporate files, unstructured social media activity, and multimedia sensor information thrown off by mobile devices and the Internet of Things (IoT), the system has to ingest a flood of heterogeneous data, which can create a processing bottleneck. To avoid such bottlenecks, the data needs to be placed intelligently and tiered efficiently.

Processing also needs to be designed around the data: analytics must run as close to the data as possible. Within z Systems, for example, data can be moved to a hotter tier inside the mainframe itself, allowing analytics to run during a transaction. Fraud detection software can issue a threat score in 2 microseconds, and transactions that exceed a preset threshold can be funneled to a fraud prevention policy — all while the transaction is still underway.
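
Stripped of the model itself, that in-flight check reduces to a score-and-threshold decision like the sketch below. The scoring rules, field names, and cutoff are invented for illustration; a production system would use a trained model and the institution's own policy.

```python
THREAT_THRESHOLD = 0.6  # illustrative preset level, not a real policy

def score_transaction(txn: dict) -> float:
    """Hypothetical threat scorer; a real system would use a trained model."""
    score = 0.0
    if txn.get("amount", 0) > 10_000:
        score += 0.4
    if txn.get("country") != txn.get("home_country"):
        score += 0.3
    if txn.get("attempts_last_hour", 0) > 5:
        score += 0.3
    return score

def handle(txn: dict) -> str:
    # The score is computed while the transaction is still in flight,
    # so a risky payment can be diverted before it completes.
    if score_transaction(txn) >= THREAT_THRESHOLD:
        return "divert to fraud-prevention policy"
    return "approve"

print(handle({"amount": 25_000, "country": "RO", "home_country": "US"}))
# -> divert to fraud-prevention policy  (score 0.7 >= 0.6)
```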

IBM delivers this infrastructure through a hybrid cloud model, as a set of composable services consumed through standard application programming interfaces (APIs). This hybrid environment — which may involve company-proprietary assets and internal services, external partner assets, and other cloud services — is designed to be open yet secure. IBM balances openness with a secure architecture that allows multiple classes of users and applications (potentially internal and external) to touch systems of record without exposing them to threats. High-speed encryption — which can be managed down to the file level, potentially limiting a breach to a single instance — can be accomplished with a minimal tax on system performance. Such granularity gives IT managers enormous control while allowing great flexibility in policy implementation and user experience.
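
Per-file key management is what makes the "breach limited to a single instance" property possible: if every file is sealed under its own key, one compromised key exposes one file. Below is a minimal sketch using the third-party Python cryptography package's Fernet API; the in-memory key store is a placeholder for a real HSM or key-management service, and none of this reflects IBM's implementation.

```python
from cryptography.fernet import Fernet

# One key per file: compromising a single key exposes only that file.
key_store = {}  # placeholder; in practice keys live in an HSM or KMS

def encrypt_file(path: str, plaintext: bytes) -> bytes:
    key = Fernet.generate_key()
    key_store[path] = key
    return Fernet(key).encrypt(plaintext)

def decrypt_file(path: str, ciphertext: bytes) -> bytes:
    return Fernet(key_store[path]).decrypt(ciphertext)

token = encrypt_file("/records/account-123.json", b'{"balance": 42}')
print(decrypt_file("/records/account-123.json", token))
```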

An infrastructure designed for cognitive workloads can run services that scale all the way from development to live worldwide deployment. As applications are revised, they can be redeployed with no unplanned downtime, eliminating disruption of production services.
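
One common way to achieve that kind of zero-downtime redeployment is a blue-green cutover, sketched in toy form below; the pool names and versions are invented for illustration, and the article does not specify which technique IBM uses.

```python
# Toy blue-green switch: two identical service pools; traffic flips to the
# freshly updated pool atomically, so a revision never interrupts production.
pools = {"blue": "v1.4", "green": "v1.4"}  # illustrative versions
live = "blue"

def redeploy(new_version: str) -> None:
    global live
    standby = "green" if live == "blue" else "blue"
    pools[standby] = new_version  # update the idle pool first
    live = standby                # atomic cutover; old pool remains as rollback

redeploy("v1.5")
print(live, pools)  # -> green {'blue': 'v1.4', 'green': 'v1.5'}
```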

IBM has evolved its infrastructure offerings to power online, real-time marketplaces; mission-critical healthcare systems; heavily loaded financial transaction systems; delicate, expensive scientific installations; and a host of other tough data-processing environments. Infrastructure for cognitive workloads is aimed at the computing problems of tomorrow.