by Kenneth Corbin

To Tap Big Data, Federal IT Must Partner With Tech Industry

May 15, 2013

Experts within and outside government IT stress the role the private sector must play in helping cash-strapped federal agencies find order in their growing stockpiles of data.

WASHINGTON — For IT managers in the federal government to wring more value out of the enormous stores of data they oversee, they must develop deeper partnerships with service providers in the private sector, according to a panel of experts speaking here at the annual FOSE government IT conference.


Federal IT workers from the CIO on down are dealing with the challenges of big data, but they’re doing so amid the various pressures of contracting budgets, exponential growth in data volumes and a mounting expectation for higher-level, technology-enabled citizen services.

Federal Big Data

That unique set of circumstances creates a very real opportunity for IT providers in the private sector to step in with repeatable technology solutions that can help mitigate the data challenges that agencies across the government are facing.

[Related: Federal Government’s Big Data Efforts Lagging]

“Why is everything a one-off, custom system for the government? The government doesn’t have the budgets anymore, and it won’t. The mantra is clear. Do more with less,” says Thomas Cellucci, the chairman and CEO of the consultancy Cellucci Associates, who previously served in the George W. Bush and Obama administrations. “Technology’s the answer and it’s going to be with a speed of execution that government isn’t used to.”

Obama Big on Data

As it happens, Obama recently addressed the government’s data situation with an executive order directing that new government data be produced in machine-readable formats and commissioning the development of an Open Data Policy “to advance the management of government information as an asset,” while taking steps to ensure that sensitive data still receives sufficient protection.

[Related: Government IT Leaders Get Big Data Roadmap]

“When implementing the Open Data Policy, agencies shall incorporate a full analysis of privacy, confidentiality and security risks into each stage of the information lifecycle to identify information that should not be released,” Obama writes in the executive order.

Machine readability could go a long way toward managing the information that is accumulated within an agency’s system, but that’s only part of the big data challenge.
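To illustrate what machine readability buys an agency, here is a minimal sketch contrasting a record published as structured JSON with one buried in free text. The record fields and values are hypothetical, invented for illustration; they do not come from any actual agency dataset.

```python
import json
import re

# A hypothetical record published in a machine-readable format (JSON).
# Field names here are illustrative assumptions, not a real schema.
machine_readable = '{"agency": "DHS", "dataset": "port_inspections", "year": 2013, "count": 12450}'

# The same information trapped in prose, as it might appear in a PDF report.
free_text = "In 2013, DHS recorded 12,450 port inspections."

# Structured data parses in one line, with no custom extraction logic.
record = json.loads(machine_readable)
count = record["count"]

# Free text needs a brittle, format-specific scraper for every report.
match = re.search(r"recorded ([\d,]+) port inspections", free_text)
scraped = int(match.group(1).replace(",", ""))
```

Both paths recover the same number, but only the first generalizes: a new field in the JSON record is immediately usable, while a new phrasing in the prose breaks the scraper.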

A holistic solution for exploding volumes of data is all the more important at a time when agencies are dealing with what David Mitchem, the data fusion lead for the U.S. Army, calls the “wild west of unstructured data.”

[Related: Feds Look to Big Data to Position ‘Government as a Platform’]

“As more and more of this data is nonnative … we have to have a framework to make sense of it,” says Mitchem. “What we found is it’s all about context, and it’s about a disciplined approach at the enterprise level that can bring meaning.”

Data Lingers at Homeland Security

The view is similar at the Department of Homeland Security, where too much data lingers in disparate formats for too long before the information is marshaled toward the department’s objectives of border security, counter-terrorism and other mission areas.

“That’s wasted time in my opinion,” says Donna Roy, executive director of the Information Sharing Environment at the DHS Office of the CIO.

“We’re focusing on getting better formats, better structure, and working on how to particularly ingest data into big data systems,” Roy says. “We’re really leaning toward advanced concepts for tagging data.”
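The tagging approach Roy alludes to can be sketched as attaching context once, at the point of ingest, so downstream systems never see an untagged record. The tag fields below (source, sensitivity, ingest timestamp) are illustrative assumptions, not DHS’s actual metadata scheme.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class TaggedRecord:
    """A raw payload wrapped with ingest-time context tags.

    The specific fields are hypothetical; real schemes would carry
    agency-defined provenance and handling markings.
    """
    payload: dict
    source: str       # originating system, for provenance
    sensitivity: str  # handling level, e.g. "public" or "sensitive"
    ingested_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def ingest(payload: dict, source: str, sensitivity: str) -> TaggedRecord:
    # Tag at the boundary: every record entering the big data system
    # already carries the context needed to interpret and protect it.
    return TaggedRecord(payload=payload, source=source, sensitivity=sensitivity)


rec = ingest({"event": "inspection"}, source="field_feed", sensitivity="sensitive")
```

The design choice is that context travels with the data rather than living in a separate lookup, which is what lets disparate sources be fused later without guessing at their origin or handling rules.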

Roy also harbors concerns that while big data solutions, in spirit, are meant to generate a unifying effect and break down barriers between disparate data sources, both the solutions developed within the federal government and those provided by the private sector are often not shared, creating what she calls “siloes of big data excellence.”

“We’re working on approaches to share results,” says Roy, who warns private-sector providers against adopting a “black box” approach to the big data technology and services they provide the government.

But however important the private sector’s role in helping the federal government address big data, industry representatives would be well advised to spend more time listening to government CIOs describe their needs, according to Cellucci, who recalls being inundated with sales emails during his time in government.

“The reason why government is hesitant towards a lot of the private sector is the private sector would push solutions looking for problems. Doesn’t it make a lot more sense for the government to articulate the problem and then in an open and transparent way to ask the private sector to help?” Cellucci says.

“That’s my tough love for the private sector,” says Cellucci. “Educate government. Don’t sell a solution looking for a problem that may not exist.”

Kenneth Corbin is a Washington, D.C.-based writer who covers government and regulatory issues. Follow everything on Twitter @CIOonline, Facebook, Google+ and LinkedIn.