Government agencies are collecting vast amounts of data, but they're struggling just to store it, let alone analyze it to improve efficiency, accuracy and forecasts.

Big Data has the potential to transform the work of government agencies, unlocking advances in efficiency, in the speed and accuracy of decisions and in forecasting. Despite those potential benefits, however, most federal agencies are struggling to leverage Big Data. These agencies lack the data storage/access, computational power and personnel they need to make use of Big Data, according to a recent report, "The Big Data Gap," by MeriTalk, a community network for government IT developed as a partnership by the Federal Business Council, Federal Employee Defense Services, Federal Managers Association, GovLoop, National Treasury Employees Union, USO and WTOP/WFED radio.

"Government has a gold mine of data at its fingertips," says Mark Weber, president of U.S. Public Sector for NetApp, underwriter of MeriTalk's report. "The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at Big Data solutions that can help them efficiently process, analyze, manage and access data, enabling them to more effectively execute their missions."

The government is certainly collecting data: 87 percent of government IT professionals say their stored data grew in the last two years, and 96 percent expect their data to grow in the next two years, by an average of 64 percent. Unstructured data makes up 31 percent of the data the government holds, and that share is rising. On average, government agencies store 1.61 petabytes of data but expect to be storing 2.63 petabytes within the next two years.
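Those two storage figures line up with the survey's expected growth rate. As a quick sanity check (an illustrative calculation, not taken from the report itself):

```python
# Projecting average agency storage forward using the survey's
# expected two-year growth rate of 64 percent.
current_pb = 1.61            # average storage today, in petabytes
growth_rate = 0.64           # expected growth over the next two years

projected_pb = current_pb * (1 + growth_rate)
print(round(projected_pb, 2))  # 2.64 -- in line with the reported 2.63 PB
```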
These data include reports from other government agencies at various levels, reports generated by field staff, transactional business data, scientific research, imagery/video, Web interaction data and reports filed by non-government organizations. The majority of IT professionals (64 percent) say their agency's data management system could be expanded or upgraded to cope with this growth, but they estimate it would take an average of 10 months to double their short- to medium-term capacity.

While government agencies collect massive amounts of data, MeriTalk's report found that only 60 percent of IT professionals say their agency analyzes the data it collects, and less than 40 percent say their agency uses the data to make strategic decisions. That includes the U.S. Department of Defense and intelligence agencies, which on average are even further behind civilian agencies when it comes to Big Data: While 60 percent of civilian agencies are exploring how Big Data could be brought to bear on their work, only 42 percent of DoD/intel agencies are doing the same.

Big Data Roadblocks

The roadblocks are myriad and varied, Weber says. First and foremost is the question of who owns the data. In the private sector, the pattern is clear, he notes: Enterprises are taking their data analysts out of the IT department and embedding them in lines of business. IT may be responsible for making sure the enterprise can store vast amounts of data, but it is the lines of business that own the data and decide how it is used.

That's not the case in government. MeriTalk found that 42 percent of respondents believe IT owns the data collected by their agencies, 28 percent believe it belongs to the department that generated the data and 12 percent believe data ownership belongs to the C-level suite.

"There's a lack of ownership of the data," Weber says. "It might not seem like that big of a deal, but it is. Who is responsible for mining that data?
I think it's a partnership, but someone's got to be directing. Work needs to be done there."

Technology and a lack of personnel also present roadblocks. When it comes to driving mission results with Big Data, MeriTalk found, agencies estimate they have just 49 percent of the data storage/access, 46 percent of the computational power and 44 percent of the personnel they need. And 57 percent of respondents say they have at least one dataset that has grown too big to work with using their current management tools and infrastructure.

In an effort to help government agencies harness the power of Big Data, the Obama Administration announced a new "Big Data Research and Development Initiative" at the end of March that promises more than $200 million in new research and development investments in Big Data.

"In the same way that past federal investments in information technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education and national security," Dr. John P. Holdren, assistant to the president and director of the White House Office of Science and Technology Policy (OSTP), said when announcing the initiative.

Under the initiative, the OSTP and six federal departments and agencies will work in concert to achieve the following objectives:

Advance the state-of-the-art core technologies needed to collect, store, preserve, manage, analyze and share huge quantities of data

Harness these technologies to accelerate the pace of discovery in science and engineering, strengthen national security and transform teaching and learning

Expand the workforce needed to develop and use Big Data technologies

Thor Olavsrud covers IT Security, Big Data, Open Source, Microsoft Tools and Servers for CIO.com. Follow Thor on Twitter @ThorOlavsrud.
Follow everything from CIO.com on Twitter @CIOonline and on Facebook. Email Thor at tolavsrud@cio.com.