The impact the cloud is having on big data and advanced analytics is striking. We've hit a go-public-cloud-or-go-home moment – and while many of the enterprises I've spoken to about this migration are still on the fence, they understand they need to invest in more public cloud to engage with empowered customers. The problem is that many are struggling with organizational inertia and regulatory concerns, which often manifest as technical objections that don't hold water.

Public cloud was the number one priority for big data in 2016. Why? Because firms are running into a cost wall as they scale out their on-premise infrastructures. They want to go bigger and faster, but on-premise configurations – including the on-premise portion of hybrid deployments – can't keep pace. The industry consensus is that hybrid is the best most firms can do. I disagree. Firms should adopt a public-first policy and rely on hybrid or on-premise only as interim measures where necessary.

In new research, I found a startling amount of evidence that led me to this conclusion. Most importantly, some leading firms believe that their hard-won big data know-how in the public cloud is their new competitive advantage. They realize that it lets them understand customers more deeply and adapt more quickly to accelerating customer expectations and ever-changing customer needs. Here is why I believe they are right:

The cloud plus big data creates exponential change – think Moore's law. Google has publicly committed to a Moore's-law philosophy for cloud pricing. Consider what happens if cloud infrastructure prices keep getting cut in half while big data processing and analytic power doubles every 18-24 months.
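To make the compounding concrete, here is a back-of-the-envelope sketch of the math the paragraph describes. The numbers are illustrative assumptions, not figures from any vendor: it assumes prices halve and processing power per dollar doubles every 24-month cycle, so effective cost per unit of analytic work falls 4x per cycle.

```python
# Illustrative model only: assumes cloud unit prices halve AND analytic
# processing power per dollar doubles each 24-month cycle, while an
# on-premise deployment's cost per unit of work stays roughly flat.

def cloud_cost_per_unit_of_work(cycles: int, start_price: float = 1.0) -> float:
    """Effective price per unit of analytic work after N doubling cycles.

    Each cycle the price halves and the work per dollar doubles,
    so cost per unit of work falls by a factor of 4 per cycle.
    """
    return start_price / (4 ** cycles)

if __name__ == "__main__":
    for cycles in range(4):
        print(f"after {cycles} cycle(s) (~{cycles * 2} years): "
              f"{cloud_cost_per_unit_of_work(cycles):.4f}x the starting cost")
```

Even at these rough rates, two cycles – roughly four years – leave the cloud at about 1/16th of the starting cost per unit of work, which is the gap an on-premise laggard would have to close.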
Cloud vendors can leverage scale to deliver new capabilities, updated versions, and fixes to all of their customers faster than on-premise or hybrid competitors can.

Exponential change is driving a blinding pace of innovation in the cloud – for example, serverless offerings like AWS Athena for SQL analytics, new AI services from Google, and container-based support for multiple versions of open source tools like Spark.

Firms that transition to a public-cloud-first policy and position themselves to take advantage of exponential change and this pace of innovation will win. For example, firms that build their insight applications on PaaS and managed services will be able to absorb new capabilities and new versions of open source tools more quickly.

These trends spell doom for companies that are still spending time on Hadoop and Spark hardware and software upgrades to modernize their data architecture. It will take only one or two doubling cycles for the anchor of on-premise to drag laggards under.

So, what should you do? First you need a plan, and Forrester thinks there are four steps to building one: select an initial basic cloud strategy for your big data analytics focus (SaaS, PaaS, or IaaS); identify candidate cloud platform services that meet your highest-priority systems-of-insight needs; adjust your big data analytics road map to include evolving your cloud management strategy; and finally, rinse and repeat for the other basic cloud strategies as appropriate. I'll be writing a lot more research on the "how to execute" in the coming months, so stay tuned.

I think the big data migration to the public cloud has started, but it's going farther and happening faster than you think. Leaders who have the architecture and know-how to take advantage of the new exponential pace will win. Don't be a laggard – go cloud or go home.