by Kevin Fogarty

Cloud Computing Poses Control Issues for IT

May 12, 2010
Cloud Computing, Data Center, Virtualization

As data moves into the cloud, one of the most precious things that IT leaders will need to give up is some level of control. Will vendors be able to strike a palatable balance between IT's need for agility and control?

Though most U.S. companies still list customer and other corporate information as their most valuable assets, many keep pushing this data farther from safe lockdown in the data center—and are about to give it another strong shove in that direction.

Cloud computing, along with efforts to virtualize internal storage, servers, client hardware and even the movement of bits in and out of a server, allows IT to design systems according to the needs of end users rather than the location and limitations of on-site hardware.

One common fear about data in the cloud: What happens to it once it leaves the building?

End-user companies probably won’t completely lose track of data in the cloud, according to Chris Wolf, infrastructure and virtualization analyst at The Burton Group. They are likely to lose some level of control over not only who accesses the data, but also when and for what purpose, he says.


On a related note, companies must also be able to audit usage of sensitive data, keep restricted data in Europe or other places where regulations are tight, and comply with all the other requirements spreading out to define proper use of corporate data.
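To make those requirements concrete, a residency rule such as "restricted data stays in Europe" reduces to a policy check before any placement, with every decision logged for auditors. The sketch below is purely illustrative; the classifications, region names and audit format are assumptions, not any vendor's actual scheme.

```python
# Hypothetical sketch: enforce a data-residency policy before placing data,
# and record every decision so usage of sensitive data can be audited.
# Classifications, regions, and the log format are illustrative only.

ALLOWED_REGIONS = {
    "restricted-eu": {"eu-west", "eu-central"},                       # must stay in Europe
    "internal":      {"eu-west", "eu-central", "us-east"},
    "public":        {"eu-west", "eu-central", "us-east", "us-west"},
}

audit_log = []  # each placement decision is recorded for later audit

def may_place(classification: str, region: str) -> bool:
    """Return True only if policy allows this data class in this region."""
    allowed = region in ALLOWED_REGIONS.get(classification, set())
    audit_log.append((classification, region, "allowed" if allowed else "denied"))
    return allowed

print(may_place("restricted-eu", "eu-west"))   # True: stays in Europe
print(may_place("restricted-eu", "us-east"))   # False: violates EU residency
```

The point is not the mechanism but the prerequisite: without a central place to evaluate such rules, no cloud placement can be proven compliant.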

It’s not a disaster waiting to happen, but the control issue will keep many companies from using cloud technology, or even advanced storage technology, to its best advantage, Wolf says.


EMC, for example, is developing technology for its high-end storage products that goes far beyond current storage virtualization technology, which still associates storage volumes with specific spindles or storage arrays, according to Brian Gallagher, general manager of EMC's Symmetrix and virtual products groups.

The technology, which EMC calls "Virtual Storage," would let companies make the most cost-effective use of storage hardware and move data around at need, or at whim, Gallagher says.

“Theoretically, you could shift terabytes or petabytes of data from New York to Alaska for batch-processing overnight because the power in the Alaska data center is so much cheaper than New York,” Gallagher says. “You should be able to do that from one console, almost with a couple of mouse clicks or drag-and-drop.”

One Company’s Private Cloud Choice

Losing critical data in the cloud isn't a huge issue right now, mostly because companies that want to put data that's more critical than a Gmail account into the cloud keep that data on internal clouds, or keep the whole cloud within their own walls, according to Bill Gillis, eHealth technical director at Beth Israel Deaconess Medical Center in Boston.

“We went with a cloud approach because we weren’t sure how big it was going to be, and it gave us an environment that could expand or contract as we needed it to,” says Gillis, who built a virtual infrastructure designed to allow independent physicians’ offices to plug into BIDMC’s patient records and billing systems easily.

“Data goes from physician offices to our servers directly,” he says. “Even though it’s in a cloud, it doesn’t leave our servers or the physicians’, which is good for security and privacy.”

Few companies are putting critical data into public clouds now, partly because they're nervous about doing it and partly because neither their structured nor their unstructured data has yet been converted to function well in services-oriented environments such as the cloud, according to Mark Bowker, an analyst at Enterprise Strategy Group.

The potential cost-savings of the cloud will add to the pressure for end-user companies to convert to more SOA-friendly formats, Bowker says.

A Solution for Tracking and Securing Data

No technology yet provides that kind of centralized control, however, Wolf says. Wolf and his Burton colleagues have proposed development of an infrastructure authority, or IA—a database, directory and security application all in one, designed to create security that can travel along with specific parts of data and help its owners keep control over it no matter where it resides.

In the cloud, data from several locations could "reside" next to each other, sharing user access controls, audit and security policies, but actually be stored within a company's own data centers, co-location facilities or publicly available data centers.

Without a set of metadata that can track the real and virtual location of that data, as well as its related security information, orchestration, federation and control of that data is, at best, extremely difficult, Wolf says.

“If a tool wants to place an object somewhere within a cloud infrastructure, there needs to be a central place where it can check to make sure the physical location offers the necessary resources (compute, memory, networking and storage I/O) and security policy isn’t violated in the process, among other concerns. We don’t need to re-invent the wheel. Instead, we need to take existing virtual infrastructure management databases and evolve them so that they can act as the central authority for all infrastructure decisions,” Wolf wrote in a blog post on the topic.
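A rough sketch of the two-part check Wolf describes—does the site have the resources, and does placement violate security policy—might look like the following. The site records, names and clearance model here are assumptions made for illustration, not Burton Group's design.

```python
# Hypothetical sketch of an "infrastructure authority" placement check:
# a central registry that vets both resource capacity and security policy
# before an object lands at a physical location. All names are illustrative.

SITES = {
    "nyc-dc":  {"free_tb": 40,  "clearances": {"internal", "public"}},
    "eu-colo": {"free_tb": 120, "clearances": {"restricted-eu", "internal", "public"}},
}

def authorize_placement(obj_size_tb: int, classification: str, site: str):
    """Central authority check: enough storage AND policy not violated."""
    info = SITES[site]
    if info["free_tb"] < obj_size_tb:
        return False, "insufficient storage"
    if classification not in info["clearances"]:
        return False, "security policy violation"
    return True, "ok"

# A restricted EU object may land in the EU facility but not the NYC one.
print(authorize_placement(10, "restricted-eu", "eu-colo"))  # (True, 'ok')
print(authorize_placement(10, "restricted-eu", "nyc-dc"))   # (False, 'security policy violation')
```

As Wolf suggests, a registry like this would not be built from scratch; existing virtual infrastructure management databases already hold most of the inventory data such a check needs.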

Burton Group and its new owner, Gartner, plan a series of reports explaining and promoting the idea of the IA later this spring, Wolf says. Industry organizations such as the DMTF, the Cloud Security Alliance and the Storage Networking Industry Association's Cloud Storage Initiative are working on various aspects of the problem, but haven't cracked it yet, Wolf says.

Follow everything from CIO.com on Twitter @CIOonline.