Just as the federal government has worked to hasten its shift to cloud computing over the past few years, senior technology officials are now seeking to advance a framework for the government's approach to the challenges and opportunities presented by big data.

Today, at a conference on the cloud and big data hosted by the National Institute of Standards and Technology (NIST), U.S. CIO Steven VanRoekel described the government-wide effort to make more data available to the public in an ordered format that could provide the basis for new services and applications, startup companies or even entire industries.

"We are, I think, at the verge, just at the tipping point of the data economy, both on the big data side ... as well as thinking about government as a platform for value. We're just now starting to see companies founded on government data," VanRoekel says. "We can greatly impact the lives of every American just by unlocking pieces of data."

[Related: How the Government's 2013 Tech Policy Agenda Will Impact IT]

Real Estate, Healthcare and Energy Businesses Tie Into Fed Data

VanRoekel cites the real estate sites Trulia and Zillow, which have incorporated government data into their listings, and looks ahead to the untold number of startups that could be launched around government data on health, energy and other areas.

"The government is sitting on a treasure trove of that sort [of information]," he says.
"If you think about the role government plays and could play in cloud and big data, I just think we're at the tip of the iceberg on this stuff."

[Related: Cloud Computing Gains in Federal Government]

Proponents of unlocking government data point to the success stories of the National Weather Service and the availability of location information from the government's GPS program, both of which have formed the basis for numerous commercial ventures.

Following in the footsteps of the federal government's cloud-first policy, the White House last May released its digital government strategy, a multi-pronged initiative that addresses the use of technology within departments and agencies and directs federal CIOs to make more of their data available to the public as part of the government-as-a-platform strategy. That includes directives on presenting data in machine-readable formats and rolling out APIs that enable developers to build applications on top of the raw data sets.

Federal CIO to Coordinate Big Data Like the Cloud

NIST's role in the government's treatment of big data involves working out common definitions and standards, a process similar to the one the agency has followed in cloud computing: coordinating with federal CIOs to develop reference architectures and taxonomies, use cases and a technology roadmap. In that capacity, NIST also operates as a facilitator, convening CIOs, their technical teams and end users in what Patrick Gallagher, the agency's director and undersecretary of commerce for standards and technology, calls a "structured dialogue ... between those that are shaping the technology and those that are trying to use it in the federal government."

"Big data," Gallagher says, "unlike cloud, doesn't have a common definition yet. We haven't yet agreed as a community what exactly we mean by big data.
But whatever it is, it's here, and that's clear."

On the cloud computing side, where the government is farther along on its strategic roadmap than with big data, NIST has been working in concert with the Department of Homeland Security, which oversees the security aspects of cloud systems, and the General Services Administration, which has been developing guidance for procurement.

[Related: Government Moves Toward Cloud Computing 'Perfect Storm']

The government has devised the Federal Risk and Authorization Management Program, or FedRAMP, to present cloud providers with a uniform set of criteria across departments and agencies covering security assessment, authorization and continuous monitoring. FedRAMP, which took effect last June, is intended to smooth over inconsistencies in the procurement process that had created confusion among private-sector cloud providers looking to contract with the government.

Late last month, the FedRAMP Joint Authorization Board issued its first certification, to Autonomic Resources, a small cloud services provider based in North Carolina, for its infrastructure-as-a-service offering. But that's only the beginning, VanRoekel says, noting that 78 other companies are currently awaiting FedRAMP certification and that new certifications are expected to be announced shortly.

[Related: Government Seeks Guidance on Cloud-Brokerage Services]

"We've got a pipeline that's going to start really flowing and I think really will be an inflection point where we really catalyze cloud adoption inside government," VanRoekel says.
"The challenge we were facing in government was one where all the agencies of government wanted to go to cloud, were, you know, following the cloud-first guidelines, [but] were doing that in a very unpredictable way, and they were going to the marketplace and saying, 'You know, I need these requirements and I need these requirements and different things,' and agency A and agency B were completely different from each other. So FedRAMP, first and foremost, creates a predictable environment for cloud providers to relay cloud services to the government," VanRoekel says.

NIST is following a similar path with its work on big data, moving toward a point where, say, agencies looking for a commercial provider to implement a Hadoop deployment would start with a common set of requirements. But along with those standardization efforts, the move to big data will necessitate another cultural shift within the government, just as the administration has been pressing CIOs to embrace the cloud and to develop new mobility policies addressing application development, bring-your-own-device programs and other considerations.

"Like cloud, big data is going to change everything," Gallagher says. "We are really looking at a new paradigm, a place of data primacy, where everything starts with consideration of the data, rather than consideration of the technology. This is a real shift from the way we've historically thought about this."

Kenneth Corbin is a Washington, D.C.-based writer who covers government and regulatory issues for CIO.com.