Much has changed for businesses in the last 40 years. In the 1980s, personal computer growth led to microcomputers (servers), and by the 1990s, data centers were commonplace. Then, virtualization and the need to process an explosion of data fueled data center growth in the early 2000s. When Amazon launched its commercial web service (EC2) in 2006, cloud computing dramatically changed how businesses handle their data, and how they run their businesses.

As an IT industry analyst, Martin Hingley, President and Market Analyst at IT Candor Limited, based in Oxford, UK, had a front-row seat to all of this change. In a recent BriefingsDirect podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, discusses some of these changes with Hingley. The two analysts examine how artificial intelligence, orchestration, and automation are helping tame the complexity brought about by continuous change.

After 30 years of IT data center evolution, are we closer to simplicity?

Gardner began the interview by asking Hingley whether a new era of technology is helping organizations better manage IT complexity. Hingley responded, “I have been an IT industry analyst for 35 years, and it’s always been the same. Each generation of systems comes in and takes over from the last, which has always left operators with the problem of trying to manage the new with the old.”

Hingley recalled the shift to the client/server model in the late 1980s and early 1990s with the influx of PC servers. “At that point, admins had to manage all of these new systems, and they couldn’t manage them under the same structure. Of course, this problem has continued over time.”

Management complexity is especially difficult for larger organizations because they have such a huge mix of resources. “Cloud hasn’t helped,” Hingley explained. “Cloud is very different from your internal IT: the way you program it, the way you develop applications. It has a wonderful cost proposition, at least initially. But now, of course, these companies have to deal with all of this complexity.” Managing multi-cloud resources (private and public) combined with traditional IT is much more difficult.

Massive amounts of data: get your house in order using AI

Additionally, consumers and businesses create massive amounts of data, most of which is never filtered properly. According to Hingley, “Every jetliner flying across the Atlantic creates 5 TB of data; and how many of these fly across the Atlantic every day?” To analyze that much data properly, we need better techniques for picking out the valuable bits. “You can’t do it with people. You have to use artificial intelligence (AI) and machine learning (ML).”

Hingley emphasized how important it is to get a handle on your data, not only for simplicity, but also for better governance. For example, the European Union (EU) General Data Protection Regulation (GDPR) reshapes how organizations must handle data, which has far-reaching consequences for all businesses.

“The challenge is that you need a single version of the truth,” explained Hingley. “Lots of IT organizations don’t have that. If they are subpoenaed to supply every email that has the words Monte Carlo in it, they couldn’t do it. There are probably 25 copies of all the emails. There’s no way of organizing it.”
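To make the email problem concrete, here is a minimal sketch of the kind of deduplication a “single version of the truth” implies. It is not drawn from the interview, and the message store and data are invented for illustration: hashing each message body collapses duplicate copies into one canonical record, so a keyword search covers every message exactly once.

```python
import hashlib

# Hypothetical message store: two of the three emails are identical copies,
# like the "25 copies of all the emails" Hingley describes.
emails = [
    {"id": 1, "body": "Q3 results for the Monte Carlo project attached."},
    {"id": 2, "body": "Q3 results for the Monte Carlo project attached."},  # forwarded copy
    {"id": 3, "body": "Lunch on Friday?"},
]

# Map each unique body to a single canonical message via a content hash.
canonical = {}
for msg in emails:
    digest = hashlib.sha256(msg["body"].encode("utf-8")).hexdigest()
    canonical.setdefault(digest, msg)  # keep only the first copy seen

# A subpoena-style query now runs against single copies, not duplicates.
hits = [m for m in canonical.values() if "monte carlo" in m["body"].lower()]
print(f"{len(emails)} stored, {len(canonical)} unique, {len(hits)} matching")
```

Real e-discovery systems layer indexing, retention policies, and access controls on top, but the core idea is the same: one governed copy of each record, rather than untracked duplicates scattered across mailboxes.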
Hingley continued, “Data governance is hugely important; it’s not nice to have, it’s essential to have. These regulations are coming, and not just in the EU; GDPR is being adopted in lots of countries.”

Software-defined and composable cloud

Along with AI, organizations will also need a common approach to deploying cloud, multi-cloud, and hybrid cloud, thereby simplifying the management of diverse resources. As an example of such a solution, Gardner mentioned the latest composable news from Hewlett Packard Enterprise (HPE).

Announced in November 2018, HPE Composable Cloud is the first integrated software stack built for composable environments. Optimized for applications running in VMs, containers, clouds, or on bare metal, this hybrid cloud platform gives customers the speed, efficiency, scale, and economics of the public cloud providers. These benefits are enabled through built-in AI-driven operations with HPE InfoSight, intelligent storage features, and an innovative fabric built for composable environments.

“I like what HPE is doing, in particular the mixing of the different resources,” agreed Hingley. “You also have the HPE GreenLake model underneath, so you can pay for only what you use. You have to be able to mix all of these together, as HPE is doing. Moreover, in terms of the architecture, the network fabric approach, the software-defined approach, and the API connections are essential to move forward.”

Automation and optimization across all of IT

New levels of maturity and composability are helping organizations manage IT better amid constantly changing and ever more complex environments. Gaining a unified view of IT might finally lead to automation and optimization across multi-cloud, hybrid cloud, and legacy IT assets. Once this challenge is conquered, businesses will be better prepared to take on the next one.

To hear the full interview, click here. To learn more about the latest insights, trends, and challenges of delivering IT services in the new hybrid cloud world, check out the IDC white paper, Delivering IT Services in the New Hybrid Cloud: Extending the Cloud Experience Across the Enterprise.

__________________________________________

About Chris Purcell

Chris Purcell drives analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. The group is responsible for marketing for HPE Synergy, HPE OneView, HPE SimpliVity hyperconverged solutions, and HPE OneSphere. To read more from Chris Purcell, please visit the HPE Shifting to Software-Defined blog.