The AI Data Bottleneck

BrandPostBy Beth Stackpole
Jan 22, 2019
Data Center IT Leadership

Collecting and managing huge volumes of data means nothing if the compute and storage infrastructure can’t keep up.


While big data has become synonymous with artificial intelligence (AI), other elements are equally important to squeezing optimal mileage out of the 21st-century analytics driving more intelligent decision-making.

It’s been said that organizations run at the speed of their data. However, if that data is hampered by throughput and latency challenges resulting from legacy compute and storage architectures, organizations will be hard pressed to achieve the desired insights leading to competitive advantage. A survey of 2,300 global business and IT leaders by MIT Technology Review Insights, in association with Pure Storage®, found that 83% of data-rich companies are prioritizing data analysis as much as possible in order to gain such an edge.

Yet because market and customer preferences move so quickly, data analysis and interpretation must keep pace. Eighty-four percent of survey respondents confirmed that the speed at which data can be received, analyzed, interpreted, and acted upon is central to their analytics initiatives. At the same time, 87% said data must be analyzed for meaning and context in order to derive the most business value from AI and data-driven decision-making.

Pick Up the Pace

While many aspects of the big data challenge are slowly being addressed, velocity remains a hurdle. The MITTR/Pure Storage survey found that 78% of respondents are having trouble digesting, analyzing, and interpreting data at scale, and 81% flagged the need to analyze more data at greater speeds as a problem going forward.

In addition, 43% of survey respondents said data infrastructure was a real barrier with the potential to hinder AI adoption. Specifically, the increasing volume and velocity of data, coupled with the real-time processing demands of Internet of Things (IoT) solutions and AI-enabled applications such as image processing, are putting stress on existing architectures.

What happens if current infrastructure can't keep up with the processing and storage demands of big data at scale? Organizations will have to take a hard look at their existing platforms and rethink their architecture to support best-of-breed performance, addressing any lingering latency, capacity, and throughput issues that create bottlenecks for AI and data-driven analytics. Older storage systems that can't operate easily at scale will hold those companies back.

To change the game, organizations should consider transforming their existing infrastructure by adding elastic, scale-out systems that can handle petabyte-level data sets at real-time speeds and deliver all-flash performance. Such systems can handle the massively parallel needs of AI because they are designed from the ground up to support modern intelligent analytics.

Collecting and managing all the data in the world means nothing if the compute and storage infrastructure can’t keep up with the deluge. Only by embracing new architectures that are built specifically for speed and scale will organizations be able to fully capitalize on a future fueled by AI and data-driven transformation.

For more information about how Pure Storage can help your organization speed up your data processing, visit