If survey results are an accurate predictor of future calamity, then IT will soon find itself in big data-management trouble. In this case, the cause for alarm is a "mushrooming explosion of data" that is barreling down on enterprises and will be a "global and unstoppable force poised to overwhelm the complacent or unprepared."

Scared yet? All that fear-mongering comes from one recent study, called "Uptime @ Crunch Time: Valuing the Need for Data Speed at Critical Business Inflections." But there are others out there. How about IDC's prediction that by 2010 we'll all be producing 1,000 exabytes of digital information per year? (An exabyte equals 1 billion gigabytes, by the way.)

The "Uptime @ Crunch Time" survey was conducted by the Business Performance Management Forum and sponsored by BlueArc, a network storage provider — so take the results with a heaping dose of salt, since solving these data storage issues is precisely BlueArc's business. The findings are based on responses from more than 125 IT professionals.

So what say these IT professionals? The "Crunch Time" study concludes that most IT organizations "are not equipped to handle periods of intense data flow," even though a majority of respondents say they know the data tsunamis are coming in the next year. (To read how one CIO is battling his 400-terabyte storage problem, see "Inside One CIO's Storage Nightmare.")

"More than half [of the respondents] have already experienced productivity losses as a result of data overload at critical business junctures," according to the study, "and 25 percent willingly share specific stories of how poor data performance has hurt their business."

More findings paint a complex and grim picture:

- More than a third of respondents said chances are good they will experience a significant spike in data volumes and user demand over the next year.
- 56 percent are not prepared, or only somewhat prepared, to handle the onslaught.
- 78 percent are not fully prepared to handle a single data spike of more than 10 times their average daily processing volume.
- 80 percent said storage performance and data access by employees, partners and customers are important to their business.

According to the findings, a variety of core enterprise applications are driving the need for better data management, including rich media files and applications, ERP systems, e-mail and messaging, research and product development, and Web traffic. This gets even more complex with today's master data management initiatives. (For more, see "Companies Struggle to Find the Truth in Massive Data Flows." Also, for a rare look at an MDM success story, see "How Master Data Management Unified Financial Reporting at Nationwide Insurance.")

The implication for today's enterprises is that "peak business performance is inextricably tied to seamlessly handling sudden spikes in user demand, accommodating lightning-fast data throughput, rapidly assimilating high-bandwidth applications and quickly analyzing vast amounts of information," notes the study.

For those who were surveyed, "uptime at crunch time" is already a business imperative, the study states. "They have no time for long waits for expensive fixes or delayed opportunities. In their eyes, 'time to data' is equivalent to 'time to money.'"

So how are you and your organization faring? Are you worried yet? Or is this just more vendor FUD?