Data is flooding into organizations at an astounding rate, whether the industry is healthcare or entertainment, and regardless of whether the company is a Fortune 500 behemoth or a small to mid-sized business. As companies consume terabytes of data culled from Internet behavior, social media, and Internet of Things (IoT)-connected assets, among other sources, they run the risk of drowning in the deluge. Properly managed and leveraged, however, that data can provide a leg up on the competition and fuel business success.
According to a survey of 2,300 global business and IT leaders conducted by MIT Technology Review Insights in association with Pure Storage, nearly 90% of respondents believe data is the key to delivering better results and future growth, especially for shaping a more personalized customer experience.
Yet those same respondents are worried about capitalizing on the bounty: 90% of responding companies voiced widespread concern about their ability to analyze the data, given surging data volumes and rising demands on quality and speed.
In fact, companies' ability to analyze this new data may be the greatest impediment. "There's a misconception that capturing the data will provide inherent value, but instead, businesses struggle to access the data as well as to understand the data they have captured," notes Diana Nolting (@DianaNolting), director of product for Anvl. "The biggest complaint we hear is that companies are simply visualizing data, not analyzing it for action."
The MIT study found that the speed at which data can be received, analyzed, interpreted, and acted upon is a key barrier for 84% of companies. At the same time, 87% agreed that data needs to be analyzed for meaning and context. So what's the best path forward?
While there is no one-size-fits-all roadmap, there are a number of steps enterprises should take immediately.
A Data Deluge Protection Plan
Building out a strategy and creating a data management foundation is the critical first step to ensuring maximum value from data assets, according to Will Wilkinson (@wawilkinson), head of infrastructure presales at CANCOM UK. As part of this early process, organizations must do the work of qualifying the data and determining which data sets are important: what is necessary to the core business objectives, and what is irrevocably out of date.
"Enterprises are still struggling with duplicated and outdated data due to the lack of a cohesive data management strategy," says Larry Larmeu (@LarryLarmeu), enterprise transformation leader. "It is important that enterprises provide the tools, processes, and guidance around storing and accessing data, allowing for ease of integration, increased data veracity, and unlocking the valuable insights available from real-time data analytics."
Proper data governance is also central to an effective data insurance plan; it should cover the full spectrum, from availability and operations to storage, retention strategies, and data security. "How much to keep, where data goes, how it's protected: these are all problems that have always existed, but they will be greatly magnified by the current technology trends," says Mark Thiele (@Mthiele10), edge computing engineer at Ericsson.
"With the availability of so much information, I have had to add a host of tasks to my schedule, such as determining what data is valid, reliable, complete, on target for my needs, and readily accessible," says David Geer (@geercom), a cybersecurity technology writer.
"I have achieved success by prioritizing the data that most quickly and easily confirms that it meets all these requirements."
Companies should pay special attention to consistent classification and labeling of data, as inconsistency here is one of the biggest hurdles to effective data governance. Setting default labels for new data (for example, dubbing it confidential) can ensure that policies and technical controls are applied consistently across the organization. It also frees data creators from having to manually label all newly created information. "In that way, a data steward only needs to review data labels when that data is crossing a security barrier, such as preparing a file to send to a client or third-party vendor," notes Kayne McGladrey (@kaynemcgladrey), director of security and information technology at Pensar Development.
Not to be overlooked: how to optimize storage for growing data stores, and how to effectively expose insights so organizations can benefit fully. For HBO, the cloud was a huge leg up. The company's already burgeoning customer video usage data was compounded by the release of the HBO GO service and the additional traffic created on social media. "The information provided us with an opportunity to better understand the customer interaction with our products, but it also provided a storage and data analytics challenge," says Michael Gabriel, former executive vice president and CIO at HBO. "Being an early adopter of the cloud over a decade ago provided us the time to determine usage patterns and storage retention requirements on an ongoing basis."
Beyond storage and security, the ability to visualize results and present them to users in an easily understandable form had Deepak Puri, founder of Skilled Analysts, scrambling. This was particularly true during a political and advocacy campaign that involved large data volumes, such as voter files and survey results.
The solution: present the data in an interactive map. The move was such a success that it was published in Newsweek, Puri says.
Creating centers of excellence and investing in data science skills and talent are all part of the process, notes Wayne Anderson (@DigitalSecArch), enterprise security architect at McAfee. In addition, getting ahead of emerging technologies in areas like machine learning and artificial intelligence (AI) will be essential for making sense of all the data. This includes an assist in automating categorization and storage, as well as decisions about what is ultimately thrown away, says Scott Schober (@ScottBVS), president and CEO of Berkeley Varitronics Systems Inc.
While establishing a formal governance plan can seem onerous, without one, data quickly turns from asset to liability. "The data deluge can become cumbersome and risky," cautions Jason James (@itlinchpin), CIO at Optima Healthcare Solutions. "To paraphrase Spider-Man, 'with great data comes great responsibility.'"
For more information on Pure Storage, visit www.purestorage.com.
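As a closing illustration, the default-labeling workflow McGladrey describes can be sketched in a few lines of code. This is a minimal sketch under stated assumptions, not any real product's API: the label names, the domain-based barrier check, and the release rules are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical workflow: every new data asset starts with a default label
# ("confidential"), and a data steward only reviews the label when the asset
# crosses a security barrier, e.g., leaving the organization.

DEFAULT_LABEL = "confidential"

@dataclass
class DataAsset:
    name: str
    label: str = DEFAULT_LABEL      # default label applied automatically at creation
    steward_reviewed: bool = False  # review happens lazily, only at the barrier

def crosses_security_barrier(destination: str) -> bool:
    # Assumption: any destination outside the organization's domain is a barrier crossing.
    internal_domains = {"corp.example.com"}
    return destination.split("@")[-1] not in internal_domains

def release(asset: DataAsset, destination: str) -> bool:
    """Permit release internally; externally, require steward review and a non-confidential label."""
    if not crosses_security_barrier(destination):
        return True  # internal use: the default label is good enough
    if not asset.steward_reviewed:
        print(f"Blocked: {asset.name!r} needs steward review before external release")
        return False
    return asset.label != "confidential"

report = DataAsset("q3_report.xlsx")
print(release(report, "analyst@corp.example.com"))  # internal: allowed
print(release(report, "vendor@partner.example"))    # external: blocked pending review
```

The design choice mirrors the quote: creators never label by hand, and the steward's attention is spent only at the moments that actually carry risk.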