BrandPosts are written and edited by members of our sponsor community. BrandPosts create an opportunity for an individual sponsor to provide insight and commentary from their point-of-view directly to our audience. The editorial team does not participate in the writing or editing of BrandPosts.
By Gary Thome
It wasn’t that long ago that industry experts were quick to declare on-premises computing a thing of the past. It seemed like everyone was moving toward the public cloud, and pundits were declaring the on-premises model dead and buried.
Fast forward a few years, and those very same experts were beginning to backtrack. Maybe on-premises infrastructure wasn’t as dead as we’d thought. An interesting headline appeared on Forbes.com: Reports of the Data Center’s Death Are Greatly Exaggerated. The author explains that although public cloud is pervasive, the data center is actually thriving. Industry analysts seem to agree. Even public cloud vendors recognized the market is not satisfied with only public cloud – and they began pursuing on-premises opportunities. With the announcement of AWS Outposts, AWS, the largest public cloud vendor, admitted that not everything is moving to the public cloud.
The future is all about choice
Previously, technology limited organizations to two choices – either go to the public cloud or stay on premises. In the future, businesses large and small will have a wide range of options of locations and consumption models. And the lines separating all of those options will blur as IT embraces the cloud experience.
As organizations evaluate their best options, many understand that cloud is not a destination. Instead, it is a new way of doing business focused on speed, scalability, simplicity, and economics. This new business model allows IT to distribute data and apps across a wide spectrum of options. It also shifts the focus of IT away from optimizing infrastructure and toward positively impacting applications. Rather than tuning hardware, IT will manage applications to deliver the best experience wherever the infrastructure is located.
Choose what’s best for each individual workload
In the future, organizations will place data and applications where each performs best. Placement will be dictated by constantly changing requirements for service delivery, operational simplicity, security, and cost optimization. For example, the General Data Protection Regulation brought far-reaching changes to how global businesses secure customer data, and those changes led organizations to adjust how they deployed applications.
Organizations that have done their homework will deploy applications and data where it makes the most sense – at any of myriad points along the spectrum between public cloud and on premises. Some may choose colocation because it provides the infrastructure and security of a dedicated data center without the costs of maintaining a facility. Other workloads are best served in a private cloud built on traditional on-premises infrastructure with consumption-based pricing. Still another application may demand high security and control as well as flexibility and scalability, making an on-premises private cloud the best alternative.
Having choices clearly gives organizations better deployment and consumption options for each individual application. And as needs change, deployment and consumption models will also change. The beauty of having numerous choices is that it gives organizations more flexibility to manage costs, security, and technical needs.
More choices may equal more complexity
The downside to all of this choice is that it can mean more complexity, as each deployment model is different. And as new technologies are introduced, the lines between all of these options are often obscured. For example, consumption-based pricing gives customers the flexibility to pay for what they use but still manage the infrastructure themselves, which fits neither the traditional on-premises nor the public cloud model.
As technology advances and choices continue to expand, organizations often find it difficult to adapt. To meet this challenge, they need a new, agile mindset, one that adjusts to IT changes quickly. Too often, they are constrained by legacy thinking, infrastructure, and tools. Better training and tools from industry experts can address these issues.
Required: Expertise and a trusted advisor
To succeed in this agile yet often complex environment, many businesses will need outside expertise. They should seek out partners that provide more than just the two options of on-premises or public cloud. Instead, savvy organizations will choose experts who provide solutions along the full spectrum of deployment options. For instance, a vendor such as Hewlett Packard Enterprise (HPE) provides a wide range of solutions: offerings include traditional on-premises infrastructure, private cloud (both owned and rented with consumption-based pricing), and colocation.
A successful organization will also need tools and professional services to help support as many options as possible. HPE advisory services can help identify the best place to run applications, while HPE GreenLake Hybrid Cloud delivers operational managed services for your cloud, allowing you to focus on your applications and deliver business outcomes while HPE takes care of the rest.
It’s time to stop limiting choices to only on-premises versus public cloud. Instead, consider all the options available for new opportunities and long-term success. Compliance, cost, performance, control, complexity of migration – all of these factors will determine the right mix for deployment of data and applications.
About Gary Thome
Gary Thome is the Vice President and Chief Technology Officer for the Software-Defined and Cloud Group at Hewlett Packard Enterprise (HPE). He is responsible for the technical and architectural directions of converged datacenter products and technologies. Over his extensive career in the computer industry, Gary has authored 50 patents. To read more articles from Gary, check out the HPE Shifting to Software-Defined blog.