Big Data has the potential to transform the work of government agencies, unlocking gains in efficiency, faster and more accurate decision-making and better forecasting. But despite the potential benefits, most federal government agencies are struggling to leverage Big Data.
These agencies lack the data storage/access, computational power and personnel they need to make use of Big Data, according to a recent report, “The Big Data Gap,” by MeriTalk. MeriTalk is a community network for government IT developed as a partnership by the Federal Business Council, Federal Employee Defense Services, Federal Managers Association, GovLoop, National Treasury Employees Union, USO and WTOP/WFED radio.
“Government has a gold mine of data at its fingertips,” says Mark Weber, president of U.S. Public Sector for NetApp, underwriter of MeriTalk’s report. “The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at Big Data solutions that can help them efficiently process, analyze, manage and access data, enabling them to more effectively execute their missions.”
The government is collecting data: 87 percent of government IT professionals say their stored data grew in the last two years and 96 percent expect their data to grow in the next two years (by an average of 64 percent). Unstructured data makes up 31 percent of data held by the government, and the percentage is on the rise.
On average, government agencies store 1.61 petabytes of data, but expect to be storing 2.63 petabytes within the next two years. These data include: reports from other government agencies at various levels, reports generated by field staff, transactional business data, scientific research, imagery/video, Web interaction data and reports filed by non-government agencies.
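Those two figures are consistent with the 64 percent average growth expectation cited above. A quick back-of-the-envelope check (illustrative only, not part of the MeriTalk report):

```python
# Apply the 64 percent average two-year growth expectation to today's
# average store of 1.61 petabytes and compare with the projected 2.63 PB.
current_pb = 1.61    # average petabytes stored today, per the report
growth_rate = 0.64   # average expected growth over the next two years

projected_pb = current_pb * (1 + growth_rate)
print(f"Projected store: {projected_pb:.2f} PB")  # ~2.64 PB, in line with the 2.63 PB figure
```

The small gap between 2.64 and 2.63 simply reflects rounding in the reported averages.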
The majority of IT professionals (64 percent) say their agency’s data management system could be expanded or upgraded to cope with this growth, but they estimate it will take an average of 10 months to double their short- to medium-term capacity.
While government agencies collect massive amounts of data, MeriTalk’s report found that only 60 percent of IT professionals say their agency analyzes the data collected, and less than 40 percent say their agencies use the data to make strategic decisions. That includes U.S. Department of Defense and intelligence agencies, which on average lag even further behind civilian agencies when it comes to Big Data. While 60 percent of civilian agencies are exploring how Big Data could be brought to bear on their work, only 42 percent of DoD/intel agencies are doing the same.
Big Data Roadblocks
The roadblocks are myriad and varied, Weber says. First and foremost is the question of who owns the data. In the private sector, the pattern is clear, he notes: Enterprises are taking their data analysts out of the IT department and embedding them in lines of business. IT may have responsibility for making sure the enterprise is capable of storing vast amounts of data, but it is the lines of business that have ownership of the data and how it is used. But that’s not the case in government.
MeriTalk found that 42 percent of respondents believe that IT owns the data collected by the agencies, 28 percent believe it belongs to the department that generated the data and 12 percent believe data ownership belongs to the C-level suite.
“There’s a lack of ownership of the data,” Weber says. “It might not seem like that big of a deal, but it is. Who is responsible for mining that data? I think it’s a partnership, but someone’s got to be directing. Work needs to be done there.”
Technology and a lack of personnel also present roadblocks. MeriTalk found that when it comes to driving mission results around Big Data, agencies estimate they have just 49 percent of the data storage/access capacity, 46 percent of the computational power and 44 percent of the personnel they need. And 57 percent of the respondents say they have at least one dataset that has grown too big to work with using current management tools and infrastructure.
In an effort to help government agencies harness the power of Big Data, the Obama Administration announced a new “Big Data Research and Development Initiative” at the end of March that promises more than $200 million in new research and development investments in Big Data.
“In the same way that past federal investments in information technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education and national security,” Dr. John P. Holdren, assistant to the president and director of the White House Office of Science and Technology Policy (OSTP), said when announcing the initiative.
Under the initiative, the OSTP, together with six federal departments and agencies, will work in concert to achieve the following objectives:
- Advance state-of-the-art core technologies needed to collect, store, preserve, manage, analyze and share huge quantities of data
- Harness these technologies to accelerate the pace of discovery in science and engineering, strengthen national security and transform teaching and learning
- Expand the workforce needed to develop and use Big Data technologies
Thor Olavsrud covers IT Security, Big Data, Open Source, Microsoft Tools and Servers for CIO.com. Follow Thor on Twitter @ThorOlavsrud. Follow everything from CIO.com on Twitter @CIOonline and on Facebook. Email Thor at email@example.com