To enable IT decision makers to embrace the agility, flexibility, and cost advantages of multi-cloud, some security changes are in order.
The transformative power of multi-cloud has moved from cutting-edge to mainstream. New IDG research commissioned by Dell Technologies and VMware shows that 86% of IT leaders are already using multiple clouds, and the vast majority of the remainder are looking at multi-cloud adoption in the next year or two. Businesses thrive on the agility, flexibility and cost-effectiveness of moving workloads to the cloud, and mainstream organisations now find that the added flexibility and power of multi-cloud adoption is well within their reach.
At the same time, 43% of respondents cite security as a top challenge in moving workloads to the cloud, a figure that rises to 56% for smaller firms. Approximately 50% of IT leaders report not having migrated workloads to the cloud because of security concerns. While there is no simple path past that obstacle, consensus is developing around best practices for improving security in a multi-cloud world.
Decoupling cloud-resident workloads from the underlying hardware fundamentally changes the calculus involved in protecting them. In a multi-cloud world where containers and virtual machines pass freely among on-premises and public infrastructures, the notion of a network perimeter breaks down. Richard Bennett, Head of Industry Solutions & Strategy, EMEA at VMware, illustrates the point: “The castle wall doesn’t work anymore. You can have a moat with crocodiles, but what happens if you get a big pole and jump over?”
Indeed, even within a conventional data centre, the majority of traffic is East-West (internal), putting it beyond the view and control of traditional perimeter defences, such as firewalls and intrusion prevention systems (IPS). Multi-cloud scenarios take that reality further, with workloads sharing physical hosts and internal network resources with unknown third parties.
To operate safely under this type of architecture, protections cannot simply be arranged to create a border around the environment. And because the specific hardware you are using is subject to constant change, hardware-oriented protections must be reconsidered. An updated approach depends on software-defined security. Mike van Vliet, Consulting Pursuit Lead for EMEA with Dell Technologies, recommends, “All the security rules … should be put as close as possible to the data and the application: literally in the VM or container.”
In this vision, protection measures such as application firewalls and intrusion prevention systems are defined in software, spun up and down at the same time as the workloads themselves. More importantly, these resources are defined at the workload level, within the container or virtual machine. Thus, they remain operational and co-located with the workloads as they traverse internal and external systems.
This paradigm is essential to decouple security from the hardware in the same way that the workloads themselves are decoupled, enabling data protections to be fundamentally independent of the operating environment. Because security operates at the per-workload level, it helps to protect against lateral movement of threats that perimeter defences cannot.
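To make the idea concrete, the following sketch models a default-deny ingress policy declared as part of the workload definition itself, so the same rules travel with the container or VM wherever it runs. All names, rules and the policy model are illustrative assumptions, not any specific vendor's API:

```python
from dataclasses import dataclass, field

# Illustrative model only: security rules declared in the workload spec,
# not at the network perimeter, so they move with the workload.
@dataclass(frozen=True)
class Rule:
    protocol: str   # e.g. "tcp"
    port: int       # destination port
    source: str     # required source tag, or "any" for unrestricted

@dataclass
class WorkloadPolicy:
    workload: str
    ingress: list = field(default_factory=list)  # empty list = default deny

    def allows(self, protocol: str, port: int, source: str) -> bool:
        # Default-deny: traffic passes only if an ingress rule matches.
        return any(
            r.protocol == protocol and r.port == port and r.source in ("any", source)
            for r in self.ingress
        )

# Hypothetical workloads: a public web tier and a database reachable
# only from that web tier, regardless of which host either runs on.
web = WorkloadPolicy("web-frontend", ingress=[Rule("tcp", 443, "any")])
db = WorkloadPolicy("orders-db", ingress=[Rule("tcp", 5432, "tag:web-frontend")])

print(web.allows("tcp", 443, "203.0.113.7"))       # public ingress permitted
print(db.allows("tcp", 5432, "tag:web-frontend"))  # app-to-db permitted
print(db.allows("tcp", 5432, "tag:file-sync"))     # lateral probe denied
```

Because the policy is attached to the workload rather than to a firewall at the network edge, East-West traffic is checked at every hop, which is what limits the lateral movement that perimeter defences cannot see.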
The long-standing compromise between security and ease of use is familiar to anyone working in cybersecurity. Classic examples include shutting down TCP ports that are not explicitly needed. While doing so objectively helps block illicit traffic, it also has the potential effect of blocking desirable traffic. Likewise, locking down a corporate desktop image and denying users admin rights prevents actions that could compromise security, but at the same time, it can get in the way of those users doing their jobs most efficiently and effectively.
The difficulty of striking that balance reflects the fact that static measures can never precisely accommodate the dynamic needs of the moment. That makes them not only prone to interfering with the user experience but also insufficient for unforeseen threats. Multi-cloud operations add to this shortcoming by creating a more complex environment, part of which lies outside the control of the organisation that owns the workload.
Flexible security calls for measures that adapt in the moment. Contrast that with a locked-down policy that simply disallows a SaaS resource because it carries potential for disruption or security exposure: the organisation is forced into a difficult trade-off between obtaining the value of the service and accepting the risk that comes with it.
Instead, it is critical to deliver a cyber hygiene-oriented approach: ongoing monitoring and assessment of all technology services, along with the overall IT environment. If an application is identified as behaving badly, autonomous measures such as locking it down and quarantining it are taken. This approach avoids the traditional danger of inflexible IT blocking innovation, and it ensures that ceding some control doesn’t fundamentally undermine the business.
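That monitor-and-quarantine loop can be sketched in a few lines. The baseline, the deviation threshold and the service names below are all invented for illustration; a real system would derive its baselines from observed behaviour over time:

```python
# Hypothetical cyber-hygiene loop: observe per-service behaviour,
# compare against a baseline, and act autonomously on deviation
# rather than blocking services permanently in advance.
BASELINE_CONN_RATE = 100  # connections/minute considered normal (illustrative)
DEVIATION_FACTOR = 5      # how far beyond baseline counts as misbehaviour

def assess(observed_rates: dict) -> dict:
    """Return an action per service: 'allow' or 'quarantine'."""
    actions = {}
    for service, rate in observed_rates.items():
        # A service suddenly making far more connections than its baseline
        # is treated as behaving badly and is isolated for investigation.
        if rate > DEVIATION_FACTOR * BASELINE_CONN_RATE:
            actions[service] = "quarantine"
        else:
            actions[service] = "allow"
    return actions

observed = {"invoice-app": 80, "file-sync": 1200}
print(assess(observed))  # file-sync far exceeds its baseline and is quarantined
```

The point of the design is that services run unrestricted by default and are curtailed only when their observed behaviour deviates, which is the inverse of the locked-down-first posture described above.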
As organisations develop multi-cloud environments, many find that infrastructure operated by a diverse range of providers sits uneasily with their own governance efforts. Nowhere is that reality more evident than in the need to document the conformance of external providers’ operations to internal and external security standards and requirements.
In the case of internal audit, that shortcoming can be a significant black eye on IT; for external regulations, it can be a significant financial and operational liability.
The inescapable reality is that some aspects of the infrastructure and operations of public cloud providers are simply outside the realm of visibility to the businesses they serve. Accordingly, it can be difficult or impossible to characterise and document conformance to cyber hygiene requirements for internal or external audits.
While that shortcoming is somewhat inevitable, IT organisations are wise to include auditability as a primary requirement when weighing multi-cloud architectures and how they integrate with business needs. Involving audit teams from the start in the selection and design process is critical to avoiding the discovery of problematic realities later on. In addition, that forethought protects the business from factors that can compromise operational security, even as it builds goodwill within the organisation as a whole.
The measures to secure workloads in a multi-cloud world are an extension of the best practices that companies have developed over decades. To adapt to a software-defined reality, security must be applied at the workload level. For greater flexibility, especially in the multi-cloud context, a looser grip on control must be combined with an embrace of cyber hygiene as an equal partner to older notions of locked-down security. And to facilitate the smooth, secure integration of an open-ended set of public resources with a conventional on-premises environment, auditability must guide adoption from the start.
Secure adoption of multi-cloud is essential, and a structured approach to on-boarding these architectures is critical to success. Bennett provides a central insight for that goal: “The technology vendor conversation should really be over. What you need is a technology partner who begins with an organisation’s business needs and helps map those to IT imperatives.”