by Mary Branscombe

Is tape storage dead … again?

Feature
Oct 11, 2016

Tape storage has been declared dead so many times that it has become a trope in technology journalism. The truth is more complicated and is ultimately less about tape’s demise than it is about the steady encroachment of cloud services.

Much like the mainframe, tape still has its place. It’s a veteran technology prized for being cheap, reliable and simple, and it continues to advance. Even so, once you’ve invested in tape robots for automation, plus the verification steps needed to make sure a backup actually captured your data, all that labor and infrastructure means tape isn’t as cheap as its per-megabyte cost makes it look.

Meanwhile, cloud services don’t just compete economically (you can pay just one or two cents per gigabyte per month for blob storage), they let you think differently about the things you used to need tape to achieve, like long-term archiving and disaster recovery.

“Large enterprises are saying ‘I’m done with my tape, I want to move my backups to the cloud’,” says Guru Pangal, general manager for hybrid storage and data protection at Microsoft, who admittedly has some skin in this game.

Those customers aren’t dropping their investment in tape overnight, Pangal admits, but they don’t want to spend any more money on infrastructure that doesn’t give them the storage features they want. “Tape has been declared dead about 15 times so far. Every five years someone says tapes are dead but so far, tapes are still alive. We’re not saying tapes are dead. But a lot of our customers are saying, ‘When the lease comes up for the tape I’m done with it. It’s too complex. I don’t know if I can restore it. I can’t test it. I want to go to the cloud.’”

Changing your approach

The question to start with is, “Why is my organization using tape in the first place?” suggests Phil Bindley, CTO of storage service provider The Bunker. “As with other bits of legacy infrastructure, in many organizations tape has become the default mode of backup simply because ‘we have always done it that way’. It’s easy to get stuck in a status quo mindset, but the key questions you have to ask of any IT asset, including tape, are whether it provides what the business needs to achieve its goals, and whether it enables the company to be more agile and therefore competitive.”

For small businesses, cloud offers enterprise-class IT capabilities like business continuity without upfront costs, and it’s a good fit for their cold storage and archive, which Mark Read Jones, strategy director at managed cloud service provider Timico Technology Services, says is “better managed in the cloud than by the in-house team, who should focus on the ‘crown jewels’ of the business — applications, supporting business processes, keeping live systems available and supporting digital transformation.”

Cloud services also allow self-service recovery. “Users today should not have to rely on logging a call/email/IM with the service desk to restore a file they lost or accidentally deleted,” says Read Jones.

That’s the kind of convenience that appeals to customers of cloud data management company Rubrik. Red Hawk Casinos and TotalJobs, for example, switched away from tape “because of the operational cost and complexity of maintaining tape archives,” says Rubrik CEO Bipul Sinha. “When they needed to recover data from tape archives, this required truck deliveries, multiple full server restores and manual searches. With Rubrik, they get instant search and rapid recovery of individual files.”

Storage strategy

AWS Glacier is one (appropriately named) option for storing ‘cold’ data in the cloud, but you’re unlikely simply to copy files to it or its equivalents. Instead, you’ll treat these services as a distinct storage tier within your overall strategy.
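With AWS, for example, that tiering is usually expressed as a lifecycle rule rather than a manual copy. Here is a minimal boto3 sketch, with a hypothetical bucket name and prefix, that moves objects to the Glacier tier automatically after 30 days:

```python
# A minimal sketch, assuming boto3 is configured with credentials that can
# manage the (hypothetical) bucket "corp-archive". Objects under "cold/"
# are transitioned to Glacier after 30 days, making Glacier a policy-managed
# storage tier rather than a place you copy files by hand.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="corp-archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-cold-data-to-glacier",
                "Filter": {"Prefix": "cold/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```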

Take Azure Backup, which Pangal calls a very simple service. “You just back up your data and restore your data. You back up your VMs and your data to Azure and you can restore it in Azure or you can restore it on premise. One of the biggest use cases we’re seeing for it is as a tape replacement.”

You can use Azure Backup as a target from client machines, from servers and from server applications like SQL Server; from workloads you’re running in IaaS on Azure; or from traditional backup tools like Microsoft Data Protection Manager. Retention policies can be set for periods of up to 99 years.
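Azure Backup itself is configured through the portal or PowerShell rather than in code, but the underlying idea of a tape replacement is simply blob storage as a backup target. A minimal Python sketch of that idea, assuming the azure-storage-blob SDK and hypothetical connection string, container and file names:

```python
# A minimal sketch of pushing a nightly backup file to Azure Blob Storage.
# This illustrates blob storage as a backup target in general -- it is not
# the Azure Backup service itself. Connection string, container and file
# names are hypothetical.
from azure.storage.blob import BlobServiceClient, StandardBlobTier

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="backups", blob="sql/nightly-2016-10-11.bak")

with open("nightly-2016-10-11.bak", "rb") as data:
    # Cool tier: cheaper per GB for data that is written once and rarely read.
    blob.upload_blob(data, overwrite=True, standard_blob_tier=StandardBlobTier.Cool)
```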

If you adopt SQL Server 2016, the new Stretch Database lets you stretch warm and cold data into SQL Azure automatically, or you can put an asynchronous replica in an Azure VM. If you’re not migrating to SQL Server 2016, DH2i’s Containers as a Service (which is based on DxEnterprise container management software running on Rackspace Microsoft Private Cloud) gives you cloud HA (high availability) and DR (disaster recovery) for any edition of any version of SQL Server, from 2005 onwards. Running SQL Server in a container makes it easier to snapshot and copy a running server, as well as to roll out a new install if you do need to recover.
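As an illustration of the Stretch Database step, here is a minimal Python sketch using pyodbc. The server, database and table names are hypothetical, and it assumes the instance and database have already been enabled for Stretch (the sp_configure ‘remote data archive’ setting and the database-level wizard or ALTER DATABASE step):

```python
# A minimal sketch: mark a (hypothetical) history table for SQL Server 2016
# Stretch Database. Assumes the instance and database are already
# Stretch-enabled; server, database and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=sqlprod01;DATABASE=Sales;Trusted_Connection=yes;",
    autocommit=True,
)

# MIGRATION_STATE = OUTBOUND starts trickling eligible rows to Azure;
# queries against dbo.OrderHistory transparently span local and remote data.
conn.execute(
    "ALTER TABLE dbo.OrderHistory "
    "SET (REMOTE_DATA_ARCHIVE = ON (MIGRATION_STATE = OUTBOUND));"
)
```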

But if you want to protect complex applications, Azure Site Recovery is usually a better choice, because it includes orchestration.

“Azure Site Recovery works with all the key application technologies and it makes sure your apps come up in an orchestrated fashion,” Pangal says. “The customer hands us their VMs or says ‘these are the applications I want protected’ and we replicate the data, orchestrate it, and provide the right mechanisms to restore the data and bring the application back up correctly.”

If you want to use cloud as an extra storage tier in a way that fits in seamlessly with existing storage infrastructure, Microsoft’s StorSimple range, for example, can look like a SAN or a NAS, but it also connects to Azure, AWS and other cloud storage. Applications write to local storage for performance. If you can afford the performance hit of identifying and deduplicating inactive data, StorSimple can compress it, encrypt it and tier it to Azure. For tier one workloads like Exchange and SQL Server where you don’t want the performance hit, StorSimple takes snapshots and sends those to your cloud storage. Again, you can recover to Azure or to your SAN.

StorSimple isn’t the only way to connect local storage to cloud storage, of course. Syncplicity’s file sharing platform can automatically sync its StorageVaults to AWS or Azure storage for backup. Rubrik’s software-defined storage appliance can now back up physical workloads, especially SQL Server and Linux systems, to cloud storage like AWS S3 and Azure using policies and SLAs to keep a backup local or send it to the cloud. Even familiar backup tools like Acronis can now back up to cloud storage.

Have an exit plan

Jon Tilbury, CEO of digital preservation specialist Preservica, also sees organizations increasingly shifting to the cloud for long-term storage, but he reminds CIOs to “make sure they have the flexibility to exit and migrate to other services in the future.”

Check the costs and options for getting your data out if you need to (data egress from AWS Glacier, for example, gets expensive above a certain daily limit). You might also need to think about the network connections you’re using to get data to the cloud. To move an existing archive, you may want to ship hard drives to your cloud storage provider; and if you have a lot of data to move regularly, you’ll want to look into dedicated private connections such as Azure ExpressRoute or AWS Direct Connect, typically delivered over MPLS links.
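A quick back-of-envelope calculation makes the ship-or-stream decision concrete. The archive size, link speed and per-gigabyte price below are hypothetical inputs; plug in your own numbers and current provider pricing:

```python
# Back-of-envelope arithmetic for the "ship drives or use the wire?" question.
# All inputs are hypothetical examples, not provider quotes.
archive_tb = 50                # existing tape archive to migrate
link_mbps = 200                # usable WAN bandwidth for the transfer
price_per_gb_month = 0.01      # ~1 cent/GB/month for cold blob storage

archive_gb = archive_tb * 1024
# GB -> megabits, divided by link speed, converted to days
transfer_days = (archive_gb * 8 * 1024) / link_mbps / 86_400

print(f"Transfer over the wire: ~{transfer_days:.0f} days")
print(f"Steady-state storage:   ~${archive_gb * price_per_gb_month:,.0f}/month")
```

At roughly 24 days to push 50TB over a 200Mbps link, shipping drives for the initial seed starts to look very attractive.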

Whatever service you choose, Tilbury suggests thinking as much about the format of what you’re storing as where you put it. “What is the likelihood of us actually being able to read and use the file formats of today a hundred years on? Even file formats used ten years ago are obsolete or unreadable today.” Just as you can use thumbnails to find the image you need and then download the high-resolution version to work on, you could put ‘presentation versions’ of content in a fast storage service like Amazon S3, and keep masters in formats you’ll be able to extract and upgrade later in lower-cost services like Glacier.
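A minimal boto3 sketch of that two-tier idea, with hypothetical bucket, vault and file names: the presentation rendition goes to S3 for fast access, while the preservation master goes to a Glacier vault for cheap long-term retention.

```python
# A minimal sketch of the two-tier approach. Bucket, vault and file names
# are hypothetical; in production you'd record the returned archiveId
# somewhere durable, since Glacier retrieval requires it.
import boto3

s3 = boto3.client("s3")
glacier = boto3.client("glacier")

# Fast tier: a JPEG rendition you can browse and search immediately.
with open("ledger-1923-p001.jpg", "rb") as rendition:
    s3.put_object(
        Bucket="archive-presentation",
        Key="scans/ledger-1923-p001.jpg",
        Body=rendition,
    )

# Cold tier: the uncompressed preservation master, rarely touched.
with open("ledger-1923-p001.tiff", "rb") as master:
    response = glacier.upload_archive(
        vaultName="preservation-masters",
        archiveDescription="ledger-1923-p001 master TIFF",
        body=master,
    )

print("Glacier archive id:", response["archiveId"])
```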

John Hood, CTO of Civica (an outsourcer that specializes in working with local government in the U.K.), hears similar concerns from customers about guaranteed long-term availability.

A growing number of cloud services can offer data residency, and if you have customers or partners in Europe you’ll need to make sure the service you pick complies with the EU’s upcoming General Data Protection Regulation (GDPR). If data sensitivity matters, says Hood, you need “a cost-effective solution from service providers that can meet very rigorous assurance and accreditation requirements.”

That might mean services like Mimecast’s email archiving for Office 365 and Google Apps for Work, or Archive360’s Archive2Azure, which promises long-term, secure retention of the unstructured data you need for compliance, such as journal email, file-system work files, individual .pst files and system-generated reports, as well as the usual file and email data. With connectors for email archiving services like MessageOne, AXS-One, MX Logic, ArchiveOne and Gwava, Archive2Azure can preserve the chain of custody, so moving inactive data to the cloud won’t compromise any legal discovery you need to undertake.

That’s a useful reminder that you still need to think about backup and archiving even with SaaS and other cloud services, because the cloud doesn’t remove the need for them: users might accidentally or deliberately delete files or overwrite data, ransomware is an increasing problem, admins might forget to renew a cloud subscription, or a workflow might go wrong.

It may go without saying that cloud is the logical place to back up cloud services. But while you’re choosing that backup, it’s a good time to consider whether cloud is also the right place for the rest of your backups, archives and disaster recovery sites.