by Kenneth Corbin

Government IT Leaders Get Big Data Roadmap

Oct 04, 2012
Business Intelligence | Data Management | Government

The TechAmerica Foundation has released a report that stresses the need for government to align big data initiatives with business objectives and take steps to professionalize the field. The report also outlines a set of policy recommendations and practical steps agencies can take to get started on big data initiatives.

Government and Big Data

WASHINGTON — Just as government IT managers are moving toward cloud computing and mobile technologies, big data initiatives are underway at federal departments and agencies.

But while big data, as a government-wide priority, won the endorsement of the White House in March, there remains considerable confusion and uncertainty among federal CIOs about how to tap into the enormous stockpiles of data the government maintains to improve citizen services and further the business objectives of departments and agencies.


In that spirit, the TechAmerica Foundation empaneled a commission of industry, academic and government leaders, whose work culminated in a new roadmap for federal big data initiatives, which the organization released at an event on Capitol Hill on Wednesday.

The report, “Demystifying Big Data,” aims to “move from the buzz word of big data down to what’s practical,” said Steve Mills, senior vice president and group executive at IBM and a co-chair of TechAmerica’s Big Data Commission.

“It really doesn’t matter what domain you look at. All aspects of our lives today have been touched in various ways by the extraordinary capabilities that computing delivers to improve our lives from healthcare to obviously all kinds of scientific pursuits,” he added.

Mills and others noted that government agencies had been dealing with big data challenges long before the term was coined, but that declining costs and the increasing processing power of high-speed computing have unlocked new potential for putting the data to use on key challenges such as medical research and national security, while making existing programs more effective by eliminating waste, fraud and abuse.


“This is not something where we’re waiting for hover cars to show up,” says Steve Lucas, a co-chair of the commission and global executive vice president and general manager of SAP’s database and technology organization. “The reality is that if there’s one thing this report solidifies, it’s this is not a world that is coming. This is the world we live in. We are surrounded by big data.”

The roadmap positions the government’s large stores of data as a major (if underused) strategic asset. At a definitional level, the report describes big data as marked by the volume, velocity and variety of datasets.

“The phenomenon represents both a challenge in making sense of the data available to governments, and an opportunity for government agencies that seek to exploit it to enhance the business of government,” the authors of the report wrote.


The report offers policymakers a series of recommendations for enabling federal agencies to improve their big data efforts, including a call for heightened collaboration among the various government entities.

The authors also stressed that federal CIOs should focus on the “art of the possible,” emphasizing that the shift toward big data initiatives “will be iterative and cyclical, versus revolutionary.”

Part of that pragmatic approach calls on federal IT leaders to align their big-data plans with the mission of their agencies, and to build on their existing technology assets, rather than embarking on a wholesale overhaul of their enterprise architecture — a costly undertaking that is likely out of reach for most organizations at a time of contracting budgets.

As a practical matter, that means recasting big data as a business initiative rather than a distinct focus of the IT department, one that involves a different cast of characters than a traditional technology undertaking.

“Big data should no longer belong to IT. It should no longer be under the purview of the CIO,” says Bill Perlowitz, CTO of Wyle’s science, technology and engineering group and a vice chair of the TechAmerica commission. “We need to shift the idea to where the business process people become stewards and custodians of the data.”

The report also suggests that the individuals who will be best equipped to oversee big data projects will have a varied set of skills that draws on disciplines such as computer science, statistics and engineering. The authors call on the government to establish a formal career track for data scientists, partnering with higher-education institutions to create a leadership academy offering big data training and certification.

To further professionalize the field, the federal government could partner with academic and industry members to form a coalition that would develop and maintain professional standards and core competencies for the big data field.

Kenneth Corbin is a Washington, D.C.-based writer who covers government and regulatory issues for CIO.com.
