by Amy Newman

6 steps for a future-ready cloud storage strategy

May 17, 2017
Cloud Computing, Cloud Storage

Growing storage demands are prompting many enterprises to move to the cloud, but ensuring that cloud storage is properly architected regarding security, performance and scalability is critically important.

Blame it on poor copy data management, compliance requirements or the internet of things (IoT), but data storage volumes are growing exponentially, and that growth shows no signs of abating. Managing skyrocketing storage demands on limited resources is a top challenge for many enterprises. To alleviate the burden, many organizations have turned to the cloud.

A survey of 451 Global Digital Infrastructure Alliance members illustrates the extent to which businesses are growing their off-premises storage footprint. Thirty-five percent of the 647 enterprise technology and IT professionals surveyed said they are considering cloud-based storage for immediate purchase. While currently only 20 percent of storage is in the cloud, survey respondents estimate that within two years one-third of it will reside in either the public cloud or a SaaS environment.

There are multiple reasons for this growth. Many organizations turned to cloud storage as an efficient or less-expensive location for inactive or archived data, situations where latency and security are not of primary importance. But as cloud usage in general has grown, so too has interest in cloud storage.

However, as is typically the case with cloud, much of this growth and expansion has been on an ad hoc, as-needed basis. Many cloud migrations were a result of SaaS deployments or shadow IT, and cloud policies were formulated as an afterthought in bits and pieces. In addition, a typical SaaS deployment features little interaction with infrastructure, and the IT group’s involvement is often limited to security. With the underlying complexity abstracted away by the SaaS provider, little thought is given to policy, says Steven Hill, senior analyst for storage technologies at 451 Research.

The result is a hodge-podge of reactive approaches to cloud storage that has left many traditional businesses without a single strategy for managing growing volumes of critical data stored in the cloud.

Independently assess cloud and storage needs

Today, only 16 percent of respondents have more than half of their total storage capacity residing off-premises, according to the 451 Research survey. But that number is expected to increase to 26 percent over the next two years. Taken together with the overall rise in storage volumes, this is significant growth.

[ Related: How to pick the right cloud storage service ]

The current haphazard approach to managing storage no longer suffices, and as more mission-critical data and applications find their way to the cloud, analysts argue that firms must adopt longer-term strategies that examine cloud storage in the context of the broader business goals.

“A cloud storage strategy in and of itself wouldn’t stand alone,” says Seth Robinson, senior director of technology analysis at CompTIA, an industry trade group. He goes a step further and argues that storage and data must go hand in hand, so while a storage strategy is important, equally important is an understanding of the data being stored. That underscores the need for a strategy for data management, Robinson says, especially as companies increasingly turn to big data analytics.

Once a basic strategy is in place, it is time to evaluate cloud storage needs. Hill recommends using a cloud storage decision tree to determine whether it is optimal to locate storage in the cloud, on-premises or a combination of the two.

Start with three questions:

  • What is the application you are supporting (is it SaaS, cloud-based or on-premises?), and what is the extent of your support?
  • What are the performance requirements and other needs for the application? For example, latency is a big issue that can directly impact whether you hit your service-level agreements (SLAs).
  • What other factors are involved?

With these questions answered, you can determine whether it makes more sense to bring the application to the data, or the data to the application.
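Hill’s actual decision tree isn’t published here, but the three questions above can be sketched as a simple placement helper. The factor names and branching logic below are illustrative assumptions, not 451 Research’s tree:

```python
def recommend_placement(app_location, latency_sensitive,
                        data_gravity_cloud, compliance_requires_onprem):
    """Toy sketch of a cloud-storage decision tree.

    app_location: "saas", "cloud" or "on-premises" -- where the app runs.
    latency_sensitive: does the app have tight latency/SLA requirements?
    data_gravity_cloud: does most of the related data already live in the cloud?
    compliance_requires_onprem: the "other factors" bucket, reduced to one flag.
    """
    # Hard constraints win first: some data simply cannot leave the premises.
    if compliance_requires_onprem:
        return "on-premises"
    # A SaaS app's data is effectively already in the cloud.
    if app_location == "saas":
        return "cloud"
    if app_location == "cloud":
        # Bring the data to the application unless latency argues otherwise.
        return "cloud" if (not latency_sensitive) or data_gravity_cloud else "hybrid"
    # On-premises app: follow the data's center of gravity.
    return "hybrid" if data_gravity_cloud else "on-premises"
```

Real decision trees would weigh many more factors (cost, bandwidth, data classification), but the shape is the same: answer the questions in order, and let hard constraints prune the tree early.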

Hill recommends looking at the cloud strategy as well. Certainly, the cloud has hit all of the milestones of what makes for a mainstream technology — CompTIA’s 2016 Trends in Cloud Computing study, for example, found that well over 90 percent of respondents are turning to cloud computing to meet enterprise needs, and one-third are using the cloud in full production. However, in many respects the cloud still lags in maturity, having taken hold quickly and often organically in the enterprise.

“We should not be looking at the cloud as a binary decision. Choose it when right. Define what cloud and on-premises can do for you,” Hill says.

Robinson also notes that a mixed environment that combines multiple managed clouds, public cloud, and on-premises private cloud should not be overlooked. The key, he says, “is knowing which pieces of your architecture work best with each model.”

That determination should factor in the five key areas of security, integration and manageability, performance, cost, and scalability.

Security is paramount

Both Robinson and Hill consider security the most important aspect of any cloud storage strategy.

And as mission-critical applications and related data find their way to the cloud, security is more vital than ever, they say. Robinson emphasizes that businesses must rethink security around storage and look at it from a business continuity and disaster recovery perspective, moving away from the conventional thinking of security just in terms of backup and recovery. Not all data that is backed up is equally important, he observes. Businesses should determine which data is most critical, and treat it accordingly.

Then there is the question of safeguarding the data itself. A system, according to Hill, is “only as good as its data protection.”

Therefore, firms must know exactly what steps their cloud providers are taking to protect their data and ensure compliance. Who has access to the data? Who holds the encryption keys? How much security is in the application?

[ Related: What cloud security challenges keep CISOs up at night? ]

At a minimum, a cloud provider should provide encryption both for data in transit and at rest, but that is often not enough. Ultimately, responsibility for securing customer data resides with the business, not the cloud provider, putting the onus on the firm to ensure that there is no data leakage.
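Because the business, not the provider, owns that responsibility, it helps to make the required protections explicit and diff them against what a provider actually offers. A minimal sketch, with capability names that are hypothetical labels rather than any provider’s API:

```python
# Baseline every provider should meet, per the discussion above.
REQUIRED = {"encryption_at_rest", "encryption_in_transit"}

def security_gaps(provider_capabilities, extra_requirements=frozenset()):
    """Return the protections a provider is missing.

    Anything in the returned set is a gap the business must close itself --
    with in-house controls or a second provider.
    """
    needed = REQUIRED | set(extra_requirements)
    return needed - set(provider_capabilities)

# A provider that only encrypts in transit, evaluated against a business
# that also wants to hold its own encryption keys:
gaps = security_gaps({"encryption_in_transit"}, {"customer_managed_keys"})
# gaps contains encryption_at_rest and customer_managed_keys -- both
# become the firm's problem, not the provider's.
```

The point of the exercise is the question list from the text made machine-checkable: who holds the keys, what is encrypted, and which controls the firm must supply itself.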

In some cases, businesses may need to take actions to address how vendors protect the data. Robinson suggests finding out whether it is possible to augment what the cloud provider is offering with in-house protections, or possibly turning to another provider for additional security.

It’s also worth noting that while strong security is important, it is possible to have too much of a good thing. Too much or misplaced security can hamper performance or negatively impact the user experience, experts caution, so finding the balance is critical.

Integration and manageability

Because few organizations will shift to an all-cloud storage model, they must integrate legacy on-premises storage with the newer cloud-based systems.

On-premises storage-area network (SAN) and network-attached storage (NAS) solutions typically use block- and file-level storage, whereas cloud storage uses an object-based model. Moving data between the two introduces the potential for data loss and requires software that can integrate the two systems.
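One way to see why that integration software is needed: a file on a SAN or NAS carries its attributes implicitly in the filesystem, while an object store needs a flat key plus explicit metadata. A sketch of that mapping (the metadata fields chosen here are illustrative, not any object store’s required schema):

```python
import os
import stat

def file_to_object(path, prefix=""):
    """Map a local file to an object-store record.

    The hierarchical path flattens into a single key, and attributes the
    filesystem tracked for free (size, timestamps, permissions) must be
    carried along as explicit string metadata -- drop them and they are
    lost, which is one way data degrades in file-to-object migrations.
    """
    st = os.stat(path)
    key = prefix + path.lstrip("/").replace(os.sep, "/")
    metadata = {
        "size": str(st.st_size),
        "mtime": str(int(st.st_mtime)),
        "mode": oct(stat.S_IMODE(st.st_mode)),
    }
    return key, metadata
```

The same logic explains Hill’s cloud-to-cloud caveat: if two object stores disagree on the metadata schema, the move is an export-and-import through a mapping layer like this, not a straight copy.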

[ Related: IT is getting cloud storage security all wrong ]

Then there is the cloud-to-cloud component. Enterprises that experienced organic cloud growth are likely dealing with applications that rely on data stored in multiple clouds with multiple access points, and moving data from one cloud to another can introduce numerous risks. And even though cloud storage typically relies on an object-based model, Hill points out that unless the two clouds are using the same metadata architecture, moving data is not a simple migration — it requires exporting and importing.

Ideally, for the end user, cloud storage should look, feel and perform like local storage, and data should move seamlessly from one cloud environment to another. If the personnel maintaining the systems are constantly patching or tweaking applications, that makes for an inefficient use of resources that could increase downtime and introduce new security risks.

Performance matters: speed, latency and availability

Mention performance and the first thing that comes to mind is improving speed and reducing latency. And, indeed, those factors are critical in hitting performance benchmarks — even more so if large files are being pulled from one cloud to another.

Data must be stored and backed up in a way that minimizes latency when it is accessed. Integration and manageability also have a direct impact on performance. To the end user, accessing data should be seamless from any application on any platform.

It’s also worth considering availability. More than one-third of respondents to the 451 Research survey cite better availability and resiliency as advantages of cloud storage. That raises the question of whether the conditions of a cloud storage deployment are impacting an organization’s ability to meet its SLAs. For some data, an hour or two of downtime can have minimal impact; for other data, the cost of even seconds of downtime can be disastrous. It’s crucial to know the difference, and to ensure that measures are in place to avoid harmful downtime.
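Knowing the difference starts with arithmetic: an availability percentage in an SLA translates directly into a monthly downtime budget. A small helper makes the translation explicit (assuming a 30-day month for simplicity):

```python
def monthly_downtime_budget_minutes(availability_pct, days=30):
    """Minutes of downtime per month that an availability SLA permits."""
    minutes_in_month = days * 24 * 60          # 43,200 for a 30-day month
    return minutes_in_month * (1 - availability_pct / 100)

# A "three nines" (99.9%) SLA allows about 43 minutes of downtime a month;
# "four nines" (99.99%) allows only about 4.3 minutes.
```

Mapping each class of data to the downtime it can tolerate, and then to the SLA tier that guarantees it, is what separates “an hour or two is fine” data from the data where seconds are disastrous.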

Cost cuts both ways

Many businesses turn to the cloud to save money, and while it is likely to trim costs on equipment and other capital expenditures, cloud storage will raise other expenses, and it may not net out to an overall cost savings.

However, many of the public cloud providers reduced costs markedly in 2016, and similar price cuts are expected to follow this year. Taking the time to re-examine what the business is paying for storage — and whether it is meeting (or exceeding) its needs — and then renegotiating a contract accordingly can be a worthwhile endeavor.

[ Related: How to compare cloud costs between Amazon, Microsoft and Google ]

Hill says businesses should account for bandwidth and data movement when evaluating costs. The price of storing data may be low, but the minute it starts moving, costs can go up.
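That dynamic is easy to model: storage is billed per gigabyte-month, but egress is billed per gigabyte moved, so an actively read dataset can cost far more than a dormant one of the same size. The rates below are purely illustrative placeholders, not any provider’s price list:

```python
def monthly_cost(stored_gb, egress_gb, storage_rate=0.023, egress_rate=0.09):
    """Rough monthly bill: capacity charge plus data-movement charge.

    storage_rate and egress_rate are $/GB placeholders chosen for
    illustration only; real pricing varies by provider, region and tier.
    """
    return stored_gb * storage_rate + egress_gb * egress_rate

# Ten terabytes stored either way -- the difference is how much moves:
cold = monthly_cost(10_000, egress_gb=100)     # archive barely touched
hot = monthly_cost(10_000, egress_gb=10_000)   # dataset read out in full
```

With these placeholder rates, reading the full dataset out each month multiplies the bill several times over, which is exactly Hill’s point: price the movement, not just the resting capacity.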

Don’t overlook scalability

Although scalability is woven into the basic architecture of the cloud, it should not be overlooked. Scalability is one of the primary drivers of movement to the cloud — 62 percent of respondents to the 451 Research survey said cloud and SaaS-based services are easier to scale up or down than traditional offerings. The object-based storage model that underpins cloud systems supports their scalability, but those systems must also integrate with on-premises deployments that use a different model. The ultimate measure of scalability will in large part be a function of how smoothly that integration works. 

Scalability needs vary by organization, and the ability of storage providers to adapt to customers’ demands is key. If a firm’s storage needs are steady, the ability to increase or decrease storage on-demand may not be a key criterion for the cloud provider, and might even sweeten the appeal of a private cloud deployment.

Increasingly, having the right cloud-storage strategy in place will be essential for firms to deliver products and services, meet customer needs, and innovate and grow the business. For many enterprises, finding the right cloud-storage provider enables them to have a partner in the process. But before looking for a company to work with, the organization should have a solid understanding of the performance, integration, security and scalability needs to meet its SLAs and satisfy and grow its customer base.