by Soumik Ghosh

Veritas takes on dark data; helps slash storage costs

Feature
Feb 28, 2017
Analytics | Budgeting | Business

With the industry expected to churn out 44 zettabytes of data by 2020, organizations are grappling with growing concern around dark data. Here’s how Veritas battles the problem.

Vanson Bourne research revealed that 52 percent of all information currently stored and processed by organizations around the world is considered ‘dark’ data: data whose value is unknown.

What most organizations do, however, is save it all. Why? Because it’s the easiest thing to do. But is it the smart thing? The report says organizations spend millions hoarding unwanted and irrelevant data.

If left unprocessed, this dark data will needlessly cost organizations around the world a cumulative USD 3.3 trillion to manage by 2020.

A tête-à-tête with Balaji Rao, MD – India & SAARC, and Marcus Loh, Director – Technology (APJ), both of Veritas Technologies, brought to light the rising cost of data storage and the effect OpenStack is having on the enterprise.

The enormous data hoard that washed up with the digital wave

The entire industry is riding a digitization wave right now, and the government is making added efforts to boost the digital drive. With the advent of IoT, a great deal of data is being transmitted back to companies, and many ITeS companies say that a portion of their revenues is now driven by digital.

“The challenge that most customers face while transitioning from the current world to the digital world is that the information needs to be available on the fly. But today, that information, in most organizations, is not available on the fly. This is because the information sits in various silos within the organization,” says Rao.

Rao adds that it all comes down to how you classify data in an organization: there’s stale data, and then there’s relevant data.

“Unfortunately, today all organizations store really old data. Nothing is deleted and very little is classified. One of the offshoots of this is reducing risk,” he says.

Rao highlights that only 15 percent of the data residing in an organization is relevant to the business, and of that 15 percent, only 3-4 percent offers real business advantage. “Finding that 3-4 percent requires a lot of tools and technology, and that’s where we step in and de-clutter the environment for them and provide meaningful insights,” says Rao.

The ravenous appetite of enterprise data

“If you don’t know what you’re backing up, and data continues to grow, you’re potentially backing up things that haven’t changed for the last three years. So, you just end up backing up the same data over and over again,” says Marcus Loh, director – Technology (APJ), Veritas Technologies.
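
Loh’s point lends itself to a simple illustration. Below is a minimal sketch, in Python, of how an incremental backup can skip unchanged files by comparing content hashes against the previous run. It is illustrative only, not how Veritas’s backup products actually work; the manifest file and function names are hypothetical.

```python
import hashlib
import json
from pathlib import Path

MANIFEST = Path("backup_manifest.json")  # hashes recorded by the previous run (hypothetical)

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def files_needing_backup(root: Path) -> list:
    """Hash every file under root and compare against the last manifest.
    Only new or changed files are returned; unchanged data is skipped
    instead of being backed up over and over again."""
    previous = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    current, changed = {}, []
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        digest = file_digest(path)
        current[str(path)] = digest
        if previous.get(str(path)) != digest:
            changed.append(path)
    MANIFEST.write_text(json.dumps(current))
    return changed
```

In practice, enterprise backup products track changes at the block level and deduplicate across machines, but the principle is the same: know what you’re backing up before copying it again.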

“Through our Data Genomics report, we found that dark data exists in huge amounts. What are we doing about it? We provide them with the visibility to tell them that a certain chunk of information is irrelevant, or that something is non-business related,” says Loh.

There’s always pressure on CIOs to roll out more business applications, so the only place one can effectively cut costs is on the infrastructure side. And on the infrastructure side, a major chunk is spent on storage.

“The storage cost has become a big portion of the infrastructure cost, and if they don’t control this, the data explodes. What you also need to bear in mind is that IT budgets are growing only by 3-4 percent every year. You can’t cut down on business applications, so infrastructure is the place to cut,” explains Rao.

Spotting the weakest links in the security perimeter with Data Insight

“There’s sensitive data in all organizations. We have a product called Data Insight, which helps you scan the network to tell you if a set of files is crucial and is used by a certain employee. Sometimes, we have observed that the usage pattern has changed. Security officers are very interested in this sort of information,” says Rao.

So the product can tell you not only what’s irrelevant, but also what’s relevant, and warn you about possible exposure.
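
As a toy illustration of the kind of signal Rao describes (not Data Insight’s actual logic), one could flag a user whose share of accesses to a sensitive file suddenly jumps against the historical baseline. All thresholds and names here are assumptions for the sketch.

```python
from collections import Counter

def flag_unusual_access(history, recent, min_history=50):
    """history and recent are lists of user IDs, one per access event
    on a sensitive file. Returns users whose share of recent accesses
    is far above their historical share."""
    if len(history) < min_history or not recent:
        return set()  # not enough baseline (or no activity) to judge
    base, now = Counter(history), Counter(recent)
    flagged = set()
    for user, count in now.items():
        base_share = base[user] / len(history)
        new_share = count / len(recent)
        # A user who was rare before but dominates now is suspicious.
        if new_share > 0.25 and new_share > 5 * max(base_share, 0.01):
            flagged.add(user)
    return flagged

# Example: 'mallory' never touched the file before, now dominates access.
print(flag_unusual_access(["alice"] * 80 + ["bob"] * 20,
                          ["mallory"] * 10 + ["alice"] * 5))  # {'mallory'}
```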

“So, in the event of a hack, you can mitigate the risk, and avoid having to pay massive sums of money to recover from that situation,” explains Rao. “With our recent alliance with AWS, we’re bringing disaster recovery to the masses. We also offer real-time technologies to cater to this solution,” he adds.

Shutting up the noisy neighbor

“One of the platforms where we see growth is OpenStack. If our customers are using containers, we want to make sure that we’re there with the support they want. We can help them break the silos and give them the ability to store, in spite of using open source,” says Rao.

Loh explains the ‘noisy neighbor’ problem in OpenStack environments: putting everything on shared infrastructure can degrade the quality of service required in a high-compute environment. The noisy neighbor arises because the software cannot distinguish important applications from less important ones.

“So, what we do is bring the memory in-server, so that instead of going through the same infrastructure, the data goes directly to a very intelligent cache. This is because most mission-critical applications are 80 percent read and 20 percent write,” explains Loh.
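
Loh’s 80/20 read/write observation explains why an in-server cache pays off: most requests can be served from fast local memory instead of contending with noisy neighbors on shared storage. The sketch below is a generic LRU read-through cache, an assumption for illustration rather than Veritas’s implementation.

```python
from collections import OrderedDict

class ReadCache:
    """Tiny LRU read-through cache. With an 80/20 read/write mix,
    most reads hit local memory and never take the shared-storage
    path where noisy neighbors compete for bandwidth."""

    def __init__(self, backend, capacity=1024):
        self.backend = backend      # shared storage (slow path), e.g. a dict
        self.capacity = capacity
        self.cache = OrderedDict()  # ordered oldest-first for LRU eviction

    def read(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)     # cache hit: mark recently used
            return self.cache[key]
        value = self.backend[key]           # cache miss: slow shared read
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return value

    def write(self, key, value):
        self.backend[key] = value           # writes still go to storage
        self.cache[key] = value             # keep the cache coherent
        self.cache.move_to_end(key)
```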