BrandPosts are written and edited by members of our sponsor community. BrandPosts create an opportunity for an individual sponsor to provide insight and commentary from their point-of-view directly to our audience. The editorial team does not participate in the writing or editing of BrandPosts.
By Chris Purcell
Getting different teams to collaborate is never an easy feat, but no two groups are more notoriously difficult to bring together than developers and IT operators. After all, the two groups have had a contentious relationship for quite some time. Historically, developers want to release features as quickly and efficiently as possible, while IT operators want to ensure things are done reliably and securely, and in line with corporate and compliance policies.
In a recent BriefingsDirect podcast, Dana Gardner, Principal Analyst at Interarbor Solutions, talks with Daniel Newman, Principal Analyst and Founding Partner at Futurum Research and Analysis, about how developers and IT operators can find newfound common ground around making hybrid cloud the best long-term economic value for their organizations.
The great divide
Gardner opens the interview by asking Newman to give his thoughts on the ever-increasing separation between DevOps and IT Ops. “We now have two worlds colliding. You have a world of strict, confined policies. That’s the ‘ops’ side of DevOps,” Newman explains. “You also have the developers who have been given free rein to do what they need to do; to get what they need to get done, done.” But, Newman explains, the industry is now experiencing a massive shift that requires more orchestration and coordination between these groups.
With the introduction of new cloud options, a lack of collaboration between developers and IT Ops leaves businesses with out-of-control expenses, weakened governance and security, and difficulty taking full advantage of private, public, or hybrid cloud.
“There is a big opportunity [for better cloud use economics] through better orchestration and collaboration, but it comes down to the age-old challenges inside of any IT organization: having Dev and IT Ops share the same goals,” Newman explains. But he does offer some good news, “New tools may give them more of a reason to start working in that way.”
It’s not just about DevOps
Gardner brings up two other areas that could benefit from collaboration — data placement and data analytics. According to Gardner, “We talked about trying to bridge the gap between development and Ops, but I think there are other gaps, too.”
Newman agrees. In terms of data placement, Newman says, “Developers are usually worried about data from the sense of what can they do with that data to improve and enhance the applications.” But when you add in elements like machine learning and artificial intelligence (AI), it ups the compute and storage requirements. “With all of these complexities, you have to ask, ‘Who really owns this data?’” Newman adds.
For IT Ops, according to Newman, data placement typically comes down to capacity and resource performance.
Newman goes on to explain that businesses can’t leave the data lifecycle to developers and IT operators — business leadership should be asking, ‘We have all this data. What are we doing with it? How are we managing it? Where does it live? How do we pour it between different clouds? What stays on-premises and what goes off? How do we govern it? How can we have governance over privacy and compliance?’
“So your DevOps group just got bigger, because the data deluge is going to be the most valuable resource any company has. It will be, if it isn’t already today, the most influential variable in what your company becomes,” Newman explains.
And, he says, it all comes back to shared tools, shared visibility, and shared goals.
Data analytics is another thing altogether, Newman shares. First, the data from running applications is managed through pure orchestration in DevOps, he explains. “And that works fine through composability tools. Those tools provide IT the ability to add guardrails for the developers, so they are not doing things in the shadows, but instead do things in coordination.” The disconnect comes from the bigger analytical data. “It’s a gold mine of information. Now we have to figure out an extract process and incorporate that data into almost every enterprise-level application that developers are building,” he says.
Is hybrid cloud the answer?
The bottom line is that there are many moving parts of IT that, in their current state, remain disjointed. “But we are at the point now with composability and automation of getting an uber-view over services and processes to start making these new connections — technically, culturally, and organizationally,” explains Gardner.
One way the industry is moving toward unity is through multi-cloud or hybrid cloud. And according to Newman, this is a welcome shift. Multi-cloud brings together the best components of each cloud model and allows businesses to choose where each application should live based on its unique needs. “However, companies right now still struggle with the resources to run multi-cloud,” he says. They don’t know which cloud approach is best for their workloads because they are not getting all of the information delivered as a total, cohesive picture. “It depends on all of the relationships, the disparate resources they have across Dev and Ops, and the data can change on a week-to-week basis. One cloud may have been perfect a month ago, yet all of a sudden you change the way an application is running and consuming data, and it’s now in a different cloud.”
What is needed is a unified view that allows everyone, including developers and operations (and beyond), to make informed decisions that take each part of cloud deployment into account.
A move in the right direction
Newman and Gardner both agree that HPE Composable Cloud is a step in the right direction. HPE Composable Cloud is a hybrid cloud platform that delivers composability across the data center with an open, integrated software stack, giving businesses the speed, scale, and economics of public cloud providers. As a turnkey cloud platform with built-in compliance and security, it offers enhanced capabilities such as end-to-end automation, built-in AI operations, an innovative fabric designed for composable environments, and hybrid cloud management ready to scale.
According to Newman, “What HPE is doing with Composable Cloud takes the cloud plus composable infrastructure and, working through HPE OneSphere and HPE OneView, brings them all into a single view.” This type of unified view delivers the most usable and valuable dashboard-style cloud use data. And, Newman thinks, this type of single view can bridge the gap between IT groups that seem to have trouble collaborating. “Give me one view, give me one screen to look at, and I think your Dev and Ops — and everybody in between — and all your new data and data science friends will all appreciate that view,” he explains.
Gardner concludes the interview by reiterating Newman’s view. “What I have seen from HPE around the Composable Cloud vision moves a big step [in the right] direction. It might be geared toward operators, but ultimately it’s geared toward the entire enterprise, and gives the business an ability to coordinate, manage, and gain insights into all these different facets of a digital business.”
Chris Purcell drives analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. The Software-Defined and Cloud Group organization is responsible for marketing for HPE Synergy, HPE OneView, HPE SimpliVity hyperconverged solutions, and HPE OneSphere. To read more from Chris Purcell, please visit the HPE Shifting to Software-Defined blog.