Big data gets a lot of buzz these days and organizations are increasingly concerned about the problem of managing it, but many don’t really understand what big data is. Nor do they have the tools in place to effectively manage much of the data already at their disposal, says Mandeep Khera, chief marketing officer of LogLogic, which specializes in a scalable log and security intelligence platform (LSIP) for the enterprise and cloud.
“Most of them are concerned about big data, yet they don’t understand what it means,” Khera says. “Because there’s been so much said about big data, there’s no clear definition and everyone is confused.”
A new survey conducted by LogLogic in conjunction with IT security research consultancy Echelon One finds that 49 percent of organizations are somewhat or very concerned about managing big data, but 38 percent don’t understand what big data is and a further 27 percent say they have a partial understanding. Additionally, the survey found that 59 percent of organizations lack the tools required to manage data from their IT systems, instead turning to separate and disparate systems or even spreadsheets.
“We know that data is important from a lot of different perspectives: security, IT operations, compliance,” Khera says. “Companies need to be managing data much more effectively so they can make more intelligent decisions.”
The global survey was based on the responses of 207 individuals at director level and above across a variety of industries, including manufacturing, education, government, finance, healthcare, transportation, and media and publishing, among others.
“Big data is about many terabytes of unstructured data,” Khera explains. “Information is power, and big data, if managed properly, can provide a ton of insight to help deal with security, operational and compliance issues. Organizations of every size are collecting more data from a variety of sources within the enterprise and cloud infrastructures, and many organizations are not using the right tools and processes to manage these data. If this pattern continues, we will see enterprises falling further behind, unable to derive actionable insights which can help organizations make intelligent decisions.”
Most respondents to the survey—62 percent—said they already manage more than one terabyte of data. But more is coming. The volume of data in the world is increasing at a nearly incomprehensible rate. IBM says we create 2.5 quintillion bytes of data every day. And perhaps more astonishing, 90 percent of the data in the world today was created in the past two years according to Big Blue. The data is coming from sensors, transaction records, images and videos, social media posts, logs and all sorts of other sources.
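To put IBM's 2.5-quintillion-bytes figure in perspective, a quick unit conversion (2.5 quintillion is 2.5 × 10^18) shows the daily volume in more familiar units; the numbers below are just that arithmetic, not additional survey data.

```python
# IBM's figure: 2.5 quintillion bytes of data created per day.
bytes_per_day = 2.5e18

# Convert to decimal exabytes and terabytes for scale.
exabytes_per_day = bytes_per_day / 1e18   # 2.5 EB per day
terabytes_per_day = bytes_per_day / 1e12  # 2,500,000 TB per day

print(exabytes_per_day, "EB/day")
print(terabytes_per_day, "TB/day")
```

At that rate, the one-terabyte threshold most survey respondents have already crossed amounts to a rounding error in a single day's global output.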
That’s big data. If you can learn to drink from the fire hose, it can provide the sort of intelligence and actionable insight that business leaders dream about. On the security front it can help you protect your organization from advanced persistent threat (APT) attacks and malware by providing visibility into what’s happening in your network, and it can give forensics a huge boost as well. It can also lead to tremendous gains in operational efficiency, from optimizing your servers to optimizing your supply chain management. It can even help you get a handle on compliance issues.
But if you don’t have the tools to manage and perform analytics on that never-ending flood of data, it’s essentially garbage.
Khera says one of the keys to getting big data under control is log management that consolidates and centralizes logs from across an organization, including logs from web applications, middleware, custom backend applications and databases, backed by an indexed storage repository and a common user interface. Making sense of the data requires the ability to normalize it, correlate it, report on it and send actionable alerts.
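The consolidate-normalize-correlate-alert flow Khera describes can be sketched in miniature. This is a hypothetical illustration, not LogLogic's implementation: the log format, field names and alert threshold are invented for the example.

```python
import re
from collections import Counter

# Hypothetical common schema: "<host> <level> <message>".
LINE = re.compile(r"(?P<host>\S+) (?P<level>\w+) (?P<msg>.*)")

def normalize(raw_line):
    """Parse a raw log line into a common schema; None if unparseable."""
    m = LINE.match(raw_line)
    return m.groupdict() if m else None

def correlate(events):
    """Count ERROR events per host so repeated failures stand out."""
    return Counter(e["host"] for e in events if e["level"] == "ERROR")

def alerts(error_counts, threshold=3):
    """Emit an actionable alert for any host at or above the threshold."""
    return [f"ALERT: {host} logged {n} errors"
            for host, n in error_counts.items() if n >= threshold]

logs = [
    "web01 ERROR login failed",
    "web01 ERROR login failed",
    "web01 ERROR login failed",
    "db01 INFO checkpoint complete",
]
events = [e for e in (normalize(line) for line in logs) if e]
print(alerts(correlate(events)))  # ['ALERT: web01 logged 3 errors']
```

A production platform does the same steps at far greater scale, across thousands of sources and formats, which is exactly why the normalization and correlation layers matter.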
Earlier this month, LogLogic commissioned IANS, founded as the Institute for Applied Network Security, to perform an Information Security Investment Analysis (ISIA) of its log data management and compliance products.
After interviewing a number of LogLogic customers dealing with big data issues, IANS said, "The major differentiator with big data log management is the sheer size of the amount of log information. Trying to recreate an event after the fact is no simple matter even when only a few devices are involved. Imagine looking across thousands of devices and through petabytes of data without having an easy-to-use UI or an indexed storage repository for rapid response. Big data is characterized not just by size but also speed. Searching through massive amounts of data takes time if it's not properly indexed. If critical information about unauthorized access or other activity is not available because it hasn't been indexed, the results of a search will be inconclusive. Thus a big data management solution must be able to keep up with the onslaught of new messages. This is even more important when it comes to alerting. If the indexing takes too long, critical alert messages are delayed, causing unacceptable latency in response times."
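The point about indexing is simple to demonstrate: a search over unindexed logs must rescan every stored message, while an index built at ingest time answers the same query by lookup. The inverted-index sketch below is illustrative only; real log platforms use more sophisticated structures, but the principle is the same.

```python
from collections import defaultdict

logs = [
    "2012-05-01 sshd failed password for root",
    "2012-05-01 httpd GET /index.html 200",
    "2012-05-02 sshd accepted password for alice",
]

# Build an inverted index once, as messages arrive:
# token -> set of line numbers containing it.
index = defaultdict(set)
for i, line in enumerate(logs):
    for token in line.split():
        index[token].add(i)

def search(term):
    """Indexed lookup: no rescan of the full message store."""
    return [logs[i] for i in sorted(index.get(term, ()))]

print(search("sshd"))  # both sshd lines, without scanning every message
```

At petabyte scale the difference between a dictionary lookup and a full scan is the difference between an alert in seconds and an alert too late to act on, which is the latency problem IANS describes.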
For now, though, only 54 percent of respondents said they use a log management solution to manage their log data. Many use syslogs and spreadsheets to manage their logs, according to the survey, and 33 percent do nothing at all.
"The results show significant inconsistencies in practice," says Bob West, founder and CEO of Echelon One. "Namely, while big data, cloud needs and compliance requirements are clearly major concerns, the majority of companies are not prepared to deal with any of them adequately. It's fascinating to see the rift: an overwhelming percentage of the companies surveyed are not prepared to manage big data properly, monitor cloud environments effectively or report network and device activities accurately. These companies are leaving themselves exposed to attacks, making less-than-informed business decisions and even risking fines from federal regulatory agencies for not complying with their requirements."
Thor Olavsrud is a senior writer for CIO.com. Follow him @ThorOlavsrud.