The impact the cloud is having on big data and advanced analytics is shocking. We’ve hit a go public or go home situation – and while many enterprises I’ve spoken to about this migration are still on the fence, they understand they need to invest in more public cloud to engage with empowered customers. The problem is many are struggling with organizational momentum and regulatory issues that often manifest in technical objections that don’t hold water.
Public cloud was the number one priority for big data in 2016. Why? Because firms are running into a cost wall as they scale out their on-premises infrastructures. They want to go bigger and faster, but on-premises configurations, including the on-premises portion of hybrid, can’t keep pace. The consensus in the industry is that hybrid is the best most can do – I disagree. Firms should have a public-first policy and rely on hybrid or on-premises deployments as interim measures only when necessary.
In new research, I found a startling amount of evidence that led me to this conclusion. Most importantly, some leading firms believe that their hard-won big data know-how in the public cloud is their new competitive advantage. They realize that they will be able to understand customers more deeply and adapt more quickly to accelerating customer expectations and ever-changing customer needs. Here is why I believe they are right to think so:
- The cloud plus big data creates exponential changes – think Moore’s law. Google has publicly committed to a Moore’s law philosophy for cloud pricing. Consider what will happen if cloud infrastructure prices continue getting cut in half while big data processing and analytic power doubles every 18-24 months. Cloud vendors can leverage scale to deliver new capabilities, updated versions, and fixes to all their customers far faster than on-premises or hybrid competitors can.
- Exponential changes are driving a blinding pace of innovation in the cloud. For example: serverless innovations like AWS Athena for SQL analytics, new AI services from Google, and container-based support for running multiple versions of open source tools like Spark side by side.
- Firms that have transitioned to a public-cloud-first policy are positioned to take advantage of exponential change and the pace of innovation – and they will win. For example, firms building their insight applications on PaaS and managed services will be able to absorb new capabilities and new versions of open source tools more quickly as well.
These trends spell doom for companies that are spending time on Hadoop and Spark hardware and software upgrades to modernize their data architecture. It will only take one or two doubling cycles until the anchor of on-premises infrastructure drags laggards under.
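To make the compounding concrete, here is a minimal sketch of the arithmetic behind those doubling cycles. The halving and doubling rates are illustrative assumptions drawn from the Moore’s-law framing above, not vendor pricing data, and they measure a cloud platform against a static on-premises baseline:

```python
# Illustrative sketch: the compounding gap between a cloud platform whose
# price/performance improves each 18-24 month cycle and a flat on-premises
# baseline. The halving/doubling rates are assumptions, not vendor data.

def cloud_advantage(cycles: int) -> float:
    """Relative performance-per-dollar advantage after `cycles` doubling cycles.

    Each cycle: cloud unit price is cut in half AND processing power doubles,
    so performance per dollar grows 4x per cycle versus a static baseline.
    """
    price_factor = 0.5 ** cycles       # unit price halves each cycle
    perf_factor = 2.0 ** cycles        # processing power doubles each cycle
    return perf_factor / price_factor  # performance per dollar vs. baseline

for n in range(1, 4):
    print(f"After {n} cycle(s): {cloud_advantage(n):.0f}x performance per dollar")
# → After 1 cycle(s): 4x, after 2: 16x, after 3: 64x
```

Under these assumptions, one or two cycles already open a 4x-16x gap – which is why waiting out even a single hardware refresh is so costly.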
So, what should you do? First you need a plan, and Forrester thinks there are four steps to build one:

- Select an initial basic cloud strategy for your big data analytics focus (SaaS, PaaS, or IaaS).
- Identify candidate cloud platform services that meet your highest-priority system-of-insight needs.
- Adjust your big data analytics road map to include evolving your cloud management strategy.
- Finally, rinse and repeat for the other basic cloud strategies as appropriate.

I’ll be writing a lot more research on the “how to execute” in the coming months, so stay tuned.
I think the big data migration to the public cloud has started, but it’s going farther and happening faster than you think. Leaders who have the architecture and know-how to take advantage of the new exponential pace will win. Don’t be a laggard – go cloud or go home.