Cloud 2.0 is coming – and it’s what you thought Cloud 1.0 would be

The cloud has changed the way we think about data management and distributed workloads. With one big gotcha. Thankfully, that’s about to change.

Ask any CIO for their thoughts on multi-cloud – where data and applications flit seamlessly across all the public clouds, private cloud and on-premises infrastructure, for lowest cost and best performance – and you’re likely to get one of two distinctly divergent responses.

One typically comes from the uninitiated, who haven’t yet moved computing to a public cloud like AWS, Microsoft Azure or Google Cloud. They are wholly unimpressed with the concept. Because they were under the impression that that’s how the cloud is supposed to work anyway.

And the other? It’s from those who once had the same impression as the others. Then they moved data and workloads to public platforms. Now, they are left wondering if true multi-cloud is even possible.

Actually, it is possible. In fact, we’re now starting to see a new stable of products crop up to address today’s multi-cloud shortcomings. In January, Gartner coined the term “Cloud Data Ecosystems” for this emerging class of products. Analyst firm 451 Research late last year knighted them “Enterprise Intelligence Platforms.” And Cloudera, which boasts arguably the most comprehensive platform in the category, last summer dubbed the category “Enterprise Data Cloud.”

I call it Cloud 2.0. And when all is said and done, it will turn out to be everything Cloud 1.0 was supposed to be.

What goes up…

Lured by the promise of seemingly boundless on-demand storage and compute capability, many early adopters anxious to shed datacenter capitalization decisions jumped into the cloud model with both feet. But many soon found their feet stuck where they’d landed.

That’s for two reasons. First, they were locked into long-term contracts, signed to secure preferential pricing before they knew how much capability they’d actually need. So even though they were paying a fixed rate, many ended up paying more in practice. Some used only a fraction of the capability they reserved, while those who underestimated their storage and compute needs faced eye-opening overage charges.

The second reason is they learned how difficult it can be to bring data home from cloud services with all associated metadata intact. Abandoning historical records and other descriptive, contextual associations dramatically diminishes data’s value for future analysis.
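To make the metadata point concrete, here is a minimal sketch – with hypothetical file names and fields – of exporting a dataset together with a metadata “sidecar” file, so that provenance and context travel with the data instead of being stranded in the cloud service:

```python
import csv
import json
from datetime import datetime, timezone

# Hypothetical example: export records plus a metadata "sidecar" file,
# so descriptive context survives a move between platforms.
records = [
    {"id": 1, "region": "us-east", "revenue": 1200},
    {"id": 2, "region": "eu-west", "revenue": 950},
]

# Write the data itself.
with open("sales.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "region", "revenue"])
    writer.writeheader()
    writer.writerows(records)

# Write descriptive metadata alongside it: source system, schema,
# classification tags and export time -- exactly the context that is
# easy to lose when repatriating data from a cloud service.
metadata = {
    "source_system": "crm-prod",  # assumed system name
    "schema": {"id": "int", "region": "string", "revenue": "int"},
    "classification": ["internal", "finance"],
    "exported_at": datetime.now(timezone.utc).isoformat(),
    "record_count": len(records),
}
with open("sales.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

A migration target that ingests both files together can preserve lineage and access rules; one that accepts only the CSV silently strips them away.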

In part as an effort to wriggle free, enterprises increasingly invested in their own private-cloud capability, building out either by outsourcing to a third-party datacenter, or by investing in good old-fashioned, back-down-to-earth, on-site capitalization. The goal was to max out their own resources, and then burst up to the cloud only when absolutely necessary.

“There’s definitely a lot of repatriation activity going on,” Henry Vail, Technical Director for the Software-Defined Infrastructure business at Lenovo’s Data Center Group, told me. “Customers really like the concept of hybrid cloud. So, they’re building or outsourcing their own private cloud.”

It’s important to understand that enterprises that have ventured into the cloud are just as enthusiastic as ever about the cloud model and remain committed to it. Even some that haven’t yet taken the plunge into the actual cloud are investing in – and benefiting from – the flexibility and efficiency of virtual machines, containers and microservices.

Cloud providers are all responding with hybrid-cloud options of their own. Late last year, in fact, the big three public cloud providers each unveiled or enhanced programs to extend their services to private cloud deployments:

  • AWS announced its own infrastructure-as-a-service offering, AWS Outposts
  • Microsoft, which first introduced Azure Stack to extend its cloud services into customers’ datacenter in 2017, released Azure Arc, which extended Azure Stack’s umbrella to include a wider range of hardware and services, and
  • Google disclosed its own hybrid-cloud platform, Google Anthos.

Enter multi-cloud

The services are all welcome extensions, as far as they go. That is to say that while they’re all extending their platforms from their own public clouds down to hosted private clouds and on-premises datacenters, none are doing much to help their customers wander outside their own domain and into competitors’ cloud infrastructures.

You’ll encounter much the same with other cloud providers, whether it be IBM and Red Hat, HPE or Oracle. Platforms like Nutanix and Dell’s VMware stand out in that they support multiple cloud platforms. But first you must pick a horse. Sliding from one to the other can still be challenging.

It is hard to blame public cloud providers for not wanting to pave the way out the door for paying customers, though they may have to someday. Because customers do want multi-cloud.

Indeed, the ability to move left and right between the cloud platforms as well as up and down from the public cloud to privately owned or leased assets is what we all hear when we listen to cloud pitches. Unfortunately, it’s not what the cloud providers are actually saying. At least, not yet.

“Customers really want that single pane of glass to manage it all,” Cindy Maike, Vice President of Industry Solutions at Cloudera, told me. “They want the flexibility to access data and place workloads where they make the most sense, either for cost or capability reasons. Today, customers want to know how quickly they can move.”

The answer, of course, is that it depends – mostly on how well you developed your cloud architecture. If you’ve prepared your data and workloads from the start, you’ll have a much easier time.

And there’s more help on the way.

On the business analytics and machine-learning side, for example, companies like Databricks, Looker and Rancher help orchestrate ML projects across diverse deployments. And firms like Panoply, Qubole and Snowflake coordinate data across hybrid cloud and multi-cloud deployments. Cloudera is unique in that it provides both data and ML management with its new Cloudera Data Platform, the result of pooling capabilities with Hortonworks, which it acquired a little over a year ago.

A key piece of Cloudera’s platform is its governance capabilities, which help operators set and maintain metadata parameters for security, regulatory compliance and data analytics – even across cloud platforms. Naturally, other independent data management platforms are chasing multi-cloud governance as well. Ultimately, everyone will need to. Because we can’t achieve true multi-cloud data without effortless data and application portability.

And that’s the heart and soul of Cloud 2.0: exactly what we wanted from Cloud 1.0.

Copyright © 2020 IDG Communications, Inc.