by Mark Settle

Cloud report 2017

Feature
Jul 16, 2012 | 4 mins
Cloud Computing

Mark Settle, Personal diary… July 2017

Just returned from an anniversary party to commemorate ten years of public cloud computing.

I joined several other CIOs who were pioneers in using Amazon’s Elastic Compute Cloud (EC2) when large enterprise instances initially became available in 2007.

Who would have ever thought that things could have changed so much in ten years?

Even as little as five years ago, large enterprise CIOs thought that public clouds such as EC2 would only be used on a limited basis, primarily for the development of non-business critical applications.

We responded to the threat posed by public cloud vendors by virtualising our own data centres, automating our provisioning processes, procuring capacity in advance of demand, and upgrading our system monitoring capabilities.

We thought that we could provide the same services as the public vendors and would only utilise public clouds on an as-needed basis for application prototyping or scalability testing.

We referred to this as the bursting model for hybrid cloud computing, in which one bursts out to public platforms on a temporary basis for very specialised purposes.

Big wall

Five years ago, we were smugly entrenched behind our firewalls believing that the gravitational forces exerted by our corporate data warehouses and the information security concerns of our corporate executives would ultimately deter any wholesale movement of enterprise computing to the cloud.

None of us really saw Big Data coming. We thought Big Data was a trendy buzzword, never fully understanding its true implications.

Network and database engineers found a new lease of life with the advent of Big Data.

The desire to move terabyte- and even petabyte-sized databases around the Internet spawned a whole new generation of data management technologies.

Content delivery networks (CDNs), initially developed to stream video across the Internet, were hijacked by large enterprises to transport business data.

CDNs triggered investments in data compression, data caching, dynamic latency routing and database virtualisation that revolutionised data transport capabilities.

Similarly, investments in data encryption, data aliasing, key management, field-level access controls and session segmentation technologies considerably reduced information security concerns.

While there will always be computing workloads that will never be performed outside the corporate firewall, these technologies have liberated a considerable cross-section of business-related data from the security constraints that existed five years ago.

Process sophistication has also played a key role in improving information security.

Public cloud users now understand the precautions they need to take on their side of the corporate firewall before data is transmitted to a public cloud vendor.
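
The kind of client-side precaution described above might look something like the sketch below: encrypting a record locally so that only ciphertext ever crosses the firewall. This is a hypothetical illustration, not a description of any particular vendor's tooling; the library choice, the inline key handling and the upload_to_cloud placeholder are all assumptions.

```python
# Hypothetical sketch: encrypt a record on our side of the firewall
# before handing it to a public cloud API.
import json
from cryptography.fernet import Fernet

# In practice the key would come from an internal key-management service;
# generating one inline just keeps the sketch self-contained.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"customer_id": "12345", "balance": 1024.50}

# Serialise and encrypt locally; only the ciphertext leaves the network.
ciphertext = cipher.encrypt(json.dumps(record).encode("utf-8"))

# upload_to_cloud(ciphertext)  # placeholder for whichever vendor API is in use

# On retrieval, the same key decrypts the payload back inside the firewall.
restored = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
assert restored == record
```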

The revolutionary changes introduced by Big Data initially transformed the way in which business intelligence (BI) and data mining activities were performed within large enterprises.

We got out of the business of building elaborate data stores within our firewalls and made increasing use of cloud-based BI platforms, especially for complex data analysis and dashboard development activities.

Once we perfected the ability to move data in and out of cloud-based BI platforms securely, the data-anchors that tied many corporate applications to our internal data centres were broken.

Cloud-based application development was finally adopted on a wholesale basis by most large enterprises.

The Big Data revolution ended up turning our bursting model inside out.

We initially thought we would continue to develop most business-critical applications on our own private cloud infrastructures and burst out to the public clouds when we needed temporary access to large scale computing resources.

That’s not how things turned out.

Opportunity mist

Now we perform the vast majority of our application development activities on the public clouds, including the development of business-critical applications.

While development and test are ongoing activities, their computing needs are highly variable, and public clouds are the perfect means of satisfying this variability in demand.

On the other hand, with the internal improvements we made in capacity planning, resource provisioning and availability management within our own data centres, large enterprises can now manage the computing infrastructures required to support transaction-intensive business processes.

So instead of bursting out to the public clouds to gain access to large pools of computing resources, we now develop critical applications on the public clouds on an ongoing basis; validate their computing needs on the public platforms; and then burst in, hosting production versions of transaction-intensive applications within our own data centres.

Large enterprises have taken a page from Facebook, Google and Zynga, building and optimising large scale computing infrastructures that are uniquely suited to the transactional needs of their individual businesses.

Learning is what happens when things don’t go exactly as you planned. We sure have learned a lot the past ten years!

Mark Settle is CIO at BMC Software and a former CIO for Visa International

Pic: Orange Tuesday, CC 2.0