Two IT industry analysts discuss taming multi-cloud complexity

BrandPost By Chris Purcell
Feb 13, 2019
IT Leadership

How a new era of artificial intelligence, orchestration, and automation is helping the enterprise manage diverse systems, multi-clouds, and growing IT complexity


Much has changed for businesses in the last 40 years. In the 1980s, personal computer growth led to microcomputers (servers), and by the 1990s, data centers were commonplace. Then, virtualization and the need to process an explosion of data fueled data center growth in the early 2000s. When Amazon launched its commercial web service (EC2) in 2006, cloud computing dramatically changed how businesses handle their data – and how they run their businesses.

As an IT industry analyst, Martin Hingley, President and Market Analyst at IT Candor Limited, based in Oxford, UK, had a front row seat to all of this change. In a recent BriefingsDirect podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, discusses some of these changes with Hingley. The two analysts examine how artificial intelligence, orchestration, and automation are helping tame complexity brought about by continuous change.

After 30 years of IT data center evolution – are we closer to simplicity?

Gardner began the interview by asking Hingley if a new era with new technology is helping organizations better manage IT complexity. Hingley responded, “I have been an IT industry analyst for 35 years, and it’s always been the same. Each generation of systems comes in and takes over from the last, which has always left operators with the problem of trying to manage the new with the old.”

Hingley recalled the shift to the client/server model in the late 1980s and early 1990s with the influx of PC servers. “At that point, admins had to manage all of these new systems, and they couldn’t manage them under the same structure. Of course, this problem has continued over time.”

Management complexity is especially difficult for larger organizations because they have such a huge mix of resources. “Cloud hasn’t helped,” Hingley explained. “Cloud is very different from your internal IT stuff — the way you program it, the way you develop applications. It has a wonderful cost proposition; at least initially. But now, of course, these companies have to deal with all of this complexity.” Managing multi-cloud resources (private and public) combined with traditional IT is much more difficult.  

Massive amounts of data: get your house in order using AI

Additionally, consumers and businesses create massive amounts of data, which is not being filtered properly. According to Hingley, “Every jetliner flying across the Atlantic creates 5TB of data; and how many of these fly across the Atlantic every day?” To analyze this amount of data properly, we need better techniques to pick out the valuable bits. “You can’t do it with people. You have to use artificial intelligence (AI) and machine learning (ML).”
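
To put that scale in rough perspective, here is a back-of-the-envelope sketch in Python. The 5TB-per-flight figure is Hingley’s; the daily flight count below is purely an illustrative assumption, not a number from the interview.

```python
# Back-of-the-envelope sketch of daily transatlantic flight data volume.
# 5 TB per flight is the figure quoted above; the flight count is an assumed,
# illustrative value and not a statistic from the interview.
TB_PER_FLIGHT = 5
FLIGHTS_PER_DAY = 2_000  # hypothetical assumption

daily_tb = TB_PER_FLIGHT * FLIGHTS_PER_DAY
print(f"~{daily_tb:,} TB per day, or roughly {daily_tb / 1_000:.0f} PB")
# ~10,000 TB per day, or roughly 10 PB
```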

Hingley emphasized how important it is to get a handle on your data – not only for simplicity, but also for better governance. For example, the European Union (EU) General Data Protection Regulation (GDPR) reshapes how organizations must handle data, which has far-reaching consequences for all businesses.

“The challenge is that you need a single version of the truth,” explained Hingley. “Lots of IT organizations don’t have that. If they are subpoenaed to supply every email that has the word Monte Carlo in it, they couldn’t do it. There are probably 25 copies of all the emails. There’s no way of organizing it. Data governance is hugely important; it’s not nice to have, it’s essential to have. These regulations are coming – not just in the EU; GDPR is being adopted in lots of countries.”
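
As a purely illustrative sketch (nothing here comes from the interview or from any HPE product), the kind of content-level de-duplication and search implied by such a request might start like this; the email schema and helper name are hypothetical.

```python
# Hypothetical sketch: de-duplicate email copies by content hash, then search
# the unique set -- the sort of groundwork an e-discovery request assumes.
import hashlib
from collections import defaultdict

def dedupe_by_content(emails):
    """Group email dicts (hypothetical schema) by a hash of subject + body."""
    groups = defaultdict(list)
    for msg in emails:
        digest = hashlib.sha256((msg["subject"] + msg["body"]).encode("utf-8")).hexdigest()
        groups[digest].append(msg)
    return groups

emails = [
    {"subject": "Offsite", "body": "Notes from the Monte Carlo meeting."},
    {"subject": "Offsite", "body": "Notes from the Monte Carlo meeting."},  # duplicate copy
    {"subject": "Budget", "body": "Q3 numbers attached."},
]

unique = [copies[0] for copies in dedupe_by_content(emails).values()]
hits = [msg for msg in unique if "Monte Carlo" in msg["body"]]
print(f"{len(emails)} stored messages, {len(unique)} unique, {len(hits)} mention 'Monte Carlo'")
# 3 stored messages, 2 unique, 1 mention 'Monte Carlo'
```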

Software-defined and composable cloud

Along with AI, organizations will also need to create a common approach to the deployment of cloud, multi-cloud, and hybrid-cloud, thereby simplifying management of diverse resources. As an example of such a solution, Gardner mentioned the latest composable news from Hewlett Packard Enterprise (HPE).

Announced in November 2018, the HPE Composable Cloud is the first integrated software stack built for composable environments. Optimized for applications running in VMs, containers, clouds, or on bare metal, this hybrid cloud platform gives customers the speed, efficiency, scale, and economics of the public cloud providers. These benefits are enabled through built-in AI-driven operations with HPE InfoSight, intelligent storage features, and an innovative fabric built for composable environments.

“I like what HPE is doing, in particular the mixing of the different resources,” agreed Hingley. “You also have the HPE GreenLake model underneath, so you can pay for only what you use. You have to be able to mix all of these together, as HPE is doing. Moreover, in terms of the architecture, the network fabric approach, the software-defined approach, the API connections, these are essential to move forward.”

Automation and optimization across all of IT

New levels of maturity and composability are helping organizations attain better IT management amidst constantly changing and ever-growing complex IT environments. Gaining an uber-view of IT might finally lead to automation and optimization across multi-cloud, hybrid cloud, and legacy IT assets. Once this challenge is conquered, businesses will be better prepared to take on the next one.

To hear the full interview, click here. To learn more about the latest insights, trends and challenges of delivering IT services in the new hybrid cloud world, check out the IDC white paper, Delivering IT Services in the New Hybrid Cloud: Extending the Cloud Experience Across the Enterprise.

__________________________________________

About Chris Purcell

Chris Purcell drives analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. The Software-Defined and Cloud Group is responsible for marketing HPE Synergy, HPE OneView, HPE SimpliVity hyperconverged solutions, and HPE OneSphere. To read more from Chris Purcell, please visit the HPE Shifting to Software-Defined blog.