BrandPosts are written and edited by members of our sponsor community. BrandPosts create an opportunity for an individual sponsor to provide insight and commentary from their point-of-view directly to our audience. The editorial team does not participate in the writing or editing of BrandPosts.
By Patrick Osborne
In recent years, the popularity of hyperconverged infrastructure, or HCI, has grown rapidly. Customers have loved the simplicity of deployment, the ease of management, and the readily available, uniform scale. Traditional HCI made Day 0 and Day 1 much easier — but for Day 2 operations and beyond, challenges persisted. IT admins leveraging early HCI technology continued to fight fires and maintain a reactive posture toward infrastructure. And while they may have loved the ease of use of hyperconvergence, they were also limited by the need to keep more demanding workloads on three-tier infrastructure.
Organizations have found the entire experience of HCI to be far simpler than legacy infrastructure. That simplicity has translated to better resource efficiency, increased productivity, faster time to market, and budget savings. HCI has delivered software-defined infrastructure that functions as a pool of centrally-orchestrated resources, making deployments fast and easy. Unified management and policy-based automation drive simplicity by eliminating silos and manual processes. And simply adding a node provides predictable scale and automatic balancing of both compute and storage, enabling organizations to start small and grow easily.
HCI has matured considerably in the last decade, incorporating better data protection and increased resiliency. In recent months, however, two key developments in the hyperconverged space have transformed the HCI experience for IT professionals. One is artificial intelligence (AI); the other is a choice of architecture that delivers the hyperconverged experience for every workload. Together, these two innovations are reframing the industry's view of HCI.
Evolving HCI from software-defined to AI-driven
Across industries, AI-driven operations and insights are transforming IT's approach to infrastructure management. The introduction of AI in the form of cloud-based machine learning and deep telemetry provides global intelligence across IT environments. In HPE SimpliVity, a hyper-efficient HCI platform, AI removes the limitations of reactive, basic software-defined architectures and evolves HCI into a model that is self-managing, self-optimizing, and self-healing — with predictive analytics that save customers millions of hours of lost productivity. That's a lot of Day 2 challenges that never reach the business.
HPE has been leading the move to intelligent IT operations for a decade with HPE InfoSight, the industry’s most advanced AI for infrastructure. With its predictive resource planning, global visibility and analytics, and support case automation, HPE InfoSight predicts and prevents problems across the IT stack and delivers prescriptive recommendations that take the guesswork out of managing infrastructure. That machine intelligence has redefined the industry’s conception of what HCI can be.
Expanding the HCI experience to every workload
The other half of this new definition of HCI involves extending the simplicity and ease-of-use of the hyperconverged model to all workloads. Having had great results with general-purpose apps in edge deployments and distributed environments, organizations now want to leverage the HCI experience for scale-up workloads in the data center.
Conventionally defined HCI is ideal for workloads with predictable rates of storage and compute growth. However, for business-critical apps requiring high storage IO performance, six-nines availability, and/or the resource efficiency of independent scale, IT has typically resorted to three-tier infrastructure. But this move comes at the cost of introducing unwanted complexity into IT management. There's a better way.
Last year, HPE introduced a new, complementary hyperconverged architecture called disaggregated HCI, or dHCI. By combining HPE Nimble Storage with industry-leading HPE ProLiant servers, dHCI delivers the consistent HCI experience that organizations know and love in an architecture that’s fully capable of handling Tier 1 workloads. This is disaggregated infrastructure that doesn’t require a storage admin and doesn’t introduce the complexity of silos, but does deliver six-nines of availability and the high performance that business-critical apps demand.
The combination of dHCI and intelligent HCI delivers the power to run every VM with a hyperconverged experience optimized to the needs of each workload. Both HPE SimpliVity and HPE Nimble Storage dHCI operate seamlessly through VMware vCenter and extend data mobility out to the cloud — so your hyperconverged management experience is consistent across all your workloads, and you're free to unlock the value in your data wherever it resides.
The evolution of HCI is now
New models of hyperconvergence deliver an intelligent HCI experience that can address every VM and workload — from edge/distributed environments to general-purpose to business-critical and large-scale consolidation — with two workload-optimized architectures. This infrastructure is orchestrated via a hyperconverged control plane providing pooled resources, accelerated app delivery via automation, and VM-centric management and data services. Above that, global intelligence powers effortless, autonomous IT operations across your HCI environment.
Intelligence and the extension of the HCI experience to workloads once reserved for three-tier architecture have truly transformed HCI, simplifying and automating your infrastructure — so you can focus on driving the business forward.
Download this technical validation report to get the results of hands-on evaluation and testing of HPE Nimble Storage disaggregated hyperconverged infrastructure (dHCI).
About Patrick Osborne
Patrick Osborne is the Vice President & GM of the Hyperconverged Infrastructure team in the HPE Storage & Big Data group. In this role, Patrick is responsible for the technology strategy, development, and product management of the HPE SimpliVity hyperconverged platform and for deploying hybrid cloud solutions across the Intelligent Data Platform. Patrick joined HPE in 2009 via the IBRIX acquisition and has held VP of Product and GM positions for Secondary Storage, Big Data & Analytics, and Scale-out Data Platforms.