BrandPosts are written and edited by members of our sponsor community. BrandPosts create an opportunity for an individual sponsor to provide insight and commentary from their point-of-view directly to our audience. The editorial team does not participate in the writing or editing of BrandPosts.
By Chris Purcell
IDC’s recent Data Age Study predicts the global data sphere will reach 175 zettabytes (ZB) by 2025. Exactly how much data is that? According to the accompanying IDC video, “If one were able to store 175ZB onto Blu-ray discs, then you’d have a stack of discs that can get you to the moon 23 times.”
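A quick back-of-the-envelope check makes that scale concrete. Assuming single-layer 25 GB Blu-ray discs, a 1.2 mm disc thickness, and an average Earth-moon distance of about 384,400 km (these figures are assumptions for illustration, not taken from IDC):

```python
# Rough sanity check of IDC's Blu-ray stack illustration.
# Assumptions (not from the article): 25 GB single-layer discs,
# 1.2 mm disc thickness, 384,400 km average Earth-moon distance.
ZB = 10**21                   # bytes in a zettabyte
GB = 10**9                    # bytes in a gigabyte

data_bytes = 175 * ZB         # projected 2025 global data sphere
disc_capacity = 25 * GB       # one single-layer Blu-ray disc
disc_thickness_m = 1.2e-3     # 1.2 mm per disc
moon_distance_m = 384_400e3   # average Earth-moon distance

discs = data_bytes / disc_capacity
stack_height_m = discs * disc_thickness_m
trips_to_moon = stack_height_m / moon_distance_m

print(f"{discs:.2e} discs, stack reaches the moon about {trips_to_moon:.0f} times")
```

With these assumptions the stack works out to roughly seven trillion discs reaching the moon about 22 times, in the same ballpark as IDC's figure of 23; slightly different disc-capacity or distance assumptions account for the gap.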
How businesses will organize and extract value from all that data using artificial intelligence (AI) is the topic of a recent BriefingsDirect Voice of the Innovator podcast. Dana Gardner, Principal Analyst at Interarbor Solutions, spoke with Rebecca Lewington, Senior Manager of Innovation Marketing at Hewlett Packard Enterprise (HPE), to hear her views on AI and how it will automate IT operations.
Exploding data combined with shrinking time to act
Lewington starts the interview by reminding listeners that exploding data is not something new – we’ve had this problem for a while. Yet now, things are different.
“It’s not just the amount of data; it’s the number of sources the data comes from and what you need to do with it that is challenging,” Lewington explains. “The data is coming from a variety of sources, and the time to act on that data is shrinking. We expect everything to be real-time. If a business can’t extract and analyze information quickly, they could very well miss a market or competitive intelligence opportunity.”
That’s where AI comes in – a term originally coined by computer scientist John McCarthy in 1956. He defined AI as “the science and engineering of making intelligent machines.”
Lewington thinks that the definition of AI is tricky and malleable, depending on who you talk to. “For some people, it’s anything that a human can do. To others, it means sophisticated techniques, like reinforcement learning and deep learning. One useful definition is that artificial intelligence is what you use when you know what the answer looks like, but not how to get there.”
No matter what definition you use, AI seems to be everywhere. Although McCarthy and others invented many of the key AI algorithms in the 1950s, the computers at that time were not powerful enough to take advantage of them. The industry has now hit an inflection point in terms of computational performance, and as a result, AI has taken off.
Applying AI to IT
According to Lewington, the heart of what makes AI work is good data: the right data, in the right place, with the right properties. “It’s a circular thing. You can use good data to train a model, which you can then feed new data into in order to get results that you couldn’t get otherwise. So we are using AI to make the data better — to make AI better.”
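The circular workflow Lewington describes — train a model on good historical data, then feed new data through it to get answers you couldn’t get otherwise — can be sketched with a tiny least-squares fit. This is purely an illustration; the data and model here are made up, and real AI Ops models are far more complex:

```python
# Minimal illustration of "train on good data, then feed in new data":
# fit a 1-D least-squares line, then apply it to unseen inputs.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# "Good data": historical observations used for training.
train_x = [1, 2, 3, 4, 5]
train_y = [2.1, 4.0, 6.2, 7.9, 10.1]

a, b = fit_line(train_x, train_y)

# New data flows through the trained model to produce a prediction
# that wasn't available before.
predict = lambda x: a * x + b
print(predict(6))
```

The point of the sketch is the shape of the loop, not the model: training consumes curated data once, and the resulting model then serves answers on a stream of fresh inputs.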
When businesses apply AI to IT operations (AI Ops), they can use it to manage their exploding data better. For example, IT could do a particular task the traditional way: import data, spin up clusters, and install applications. These are all things IT could do manually.
But once IT tasks reach a certain scale, manual work becomes overwhelming. A team would need to repeat those processes hundreds, thousands, even millions of times, and a typical business doesn’t have the staff or time to do that.
AI gives IT a way to augment what humans can do. Using AI, the mundane stuff can be taken away, so humans can get straight to what they want to do. Instead of spending weeks and months preparing an IT environment to start to work out an answer, AI automates the preparation and users can get to work immediately.
This type of data manipulation using AI is what Lewington is working on at Hewlett Packard Labs. Her team is using AI to automate the process of creating more powerful, more flexible, more secure, and more efficient computing and data architectures. For example, the Deep Learning Cookbook allows customers to find out ahead of time exactly what kind of hardware and software they are going to need to get to a desired outcome.
Deploying AI at the edge
In the future, businesses will increasingly need to analyze and act on data from devices at the edge of the network. Often there isn’t time to move data to the cloud, reach a decision there, and wait for the required action to return.
“This means we need a data strategy to get the data in the right place together with the AI algorithm at the edge,” continues Lewington. “Once you begin doing that, once you start moving from a few clouds to thousands and millions of endpoints, how do you handle multiple deployments? How do you maintain security and data integrity across all of those devices?”
Using AI at the edge will allow businesses to act quickly on a continuous stream of data. Researchers are also investigating something called swarm learning, where devices learn from their environment and each other using a distributed model that doesn’t use a central cloud at all.
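The core idea behind swarm learning — devices improving a shared model by exchanging parameters with each other rather than with a central cloud — can be loosely sketched as decentralized parameter averaging. This is an illustration of the general concept only, with invented device names and weights, not HPE’s actual swarm-learning protocol:

```python
# Loose sketch of the swarm-learning idea: edge devices keep locally
# trained model parameters and periodically average them with peers,
# with no central server involved. Device names and weights are invented.

def average_with_peers(params):
    """One all-to-all averaging round across every device's parameter list."""
    devices = list(params)
    dim = len(params[devices[0]])
    avg = [sum(params[d][i] for d in devices) / len(devices) for i in range(dim)]
    # Every device adopts the same averaged parameters.
    return {d: avg[:] for d in devices}

# Each edge device has trained locally and holds different weights.
local = {
    "camera-1": [0.2, 0.9],
    "camera-2": [0.4, 0.7],
    "gateway":  [0.6, 0.8],
}

synced = average_with_peers(local)
print(synced["camera-1"])
```

In a real deployment each device would average only with reachable neighbors and repeat the exchange over many rounds, so knowledge diffuses through the swarm without any single point that holds all the data.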
Lewington explained that this type of AI is complex, but researchers at HPE are exploring the concept. “Because AI is changing so fast, businesses need a partner who has the expertise they need today, yet they also want to know that their partner is still going to be an expert in innovative technologies five, ten, or even twenty years from now.”
The very near future of AI
In the days to come, AI will impact everything within an enterprise in one way or another. Because so much data is flowing from so many sources, manual processes will be overwhelmed. All things that are rule-based can be more powerful, more flexible, and more responsive using AI. Anywhere there is an equation, AI can manipulate data to run the processes more effectively.
“Businesses need to look at AI as another class of analytics tools to help run their IT systems,” Lewington explains. “It’s not magic, it’s just a different and better way of doing IT analytics. AI lets you harness more difficult datasets, more complicated datasets, and more distributed datasets.”
For example, an IT operations team may have 1,000 people starting a factory on the other side of the world, and they need to provision their datacenter. Using AI, IT Ops could simply put on a headset and say, “Hey, computer, set up all the stuff we need in our new factory.”
As mankind rushes headlong into the zettabyte era, AI can make everything simpler by automating the analysis of ever-growing data. And if AI is done well, it is not only powerful, it is invisible.
Chris Purcell drives analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. The Software-Defined and Cloud Group organization is responsible for marketing for HPE Synergy, HPE OneView, and HPE SimpliVity hyperconverged solutions. To read more from Chris Purcell, please visit the HPE Shifting to Software-Defined blog.