This week I’m in Washington, D.C., speaking at a government cloud computing event and meeting with congressional staff and agency personnel to discuss their cloud computing plans and concerns. The discussions have been fascinating, both for what they indicate about the Federal government’s cloud plans and for what they illustrate about its challenges. But make no mistake: the government is serious about its interest in cloud computing.
I was really looking forward to the conference, not least because of a panel giving a “state of play” view of where the government is with its cloud plans. It has set up a cross-agency cloud steering committee headed by Casey Coleman, GSA CIO, who was on this panel. Prior to this panel, however, I attended another one that included Peter Tseronis of the Department of Energy, who also sits on the steering committee. He described the activities of the committee and noted that a number of subcommittees have been set up to examine issues like security, architecture, and so on.
During the question period, I asked when the just-announced apps.gov will include on-demand IaaS services (the GSA is currently evaluating RFP responses for these services). Tseronis said that, assuming all the pieces come together, this will occur by the end of this year.
I made the point during my question that on-demand IaaS will likely drive significant interest and experimentation among government agencies, thereby accelerating government adoption of cloud computing. One has only to look at how the easy availability of Amazon's cloud services has spurred experimentation and pushed companies toward more formal cloud computing initiatives to understand what apps.gov IaaS might precipitate.
Coleman’s panel discussed the current status of various government cloud initiatives (including, of course, the GSA-led one). A very intriguing initiative is one being implemented by a shared-services organization located in the Department of the Interior. As I interpret it, the organization started as a typical consolidation effort but has since morphed into a cloud-enabled shared-infrastructure offering. It’s very much a work in progress, but it is moving forward rapidly. One thing the speaker, Doug Bourgeois, emphasized is that his group works hard with potential cloud customers to ensure they understand that the owners of applications running in the cloud environment retain responsibility for a portion of the overall security of the app.
This is an important point, because not everyone recognizes that using a cloud infrastructure does not (and cannot) transfer all risk to the provider; after all, the provider may not even be aware of the totality of the application, so expecting it to be responsible for overall application security is shortsighted.
The panel I was on discussed virtualization. What was interesting was how much (and how quickly) the discussion—and the questions—turned to cloud computing. It’s inevitable: once an organization begins harvesting the benefits of virtualization, it soon wants to streamline administration, and inexorably that leads to questions about whether the operations team can be taken out of the deployment of virtual resources altogether, allowing application groups to self-provision.
One thing that struck me is the sense of urgency communicated by Coleman and other agency staff during all the panels. This is somewhat surprising, given that (let’s be fair) government is not often thought of as a pioneer in technology trends. I asked a GSA staff member attending the conference about the reason for this urgency. His response was that, even though the government was already working on cloud computing before the arrival of the new administration, the pace has been galvanized by the new Federal CIO, Vivek Kundra. The result is a very ambitious plan to roll cloud computing out across the government.
The day following the conference found me meeting with congressional staff members from the House Committee on Energy and Commerce, the Senate Committee on Homeland Security and Governmental Affairs, and the Senate Select Committee on Intelligence. All are involved in cloud computing, in one way or another: Energy and Commerce from the perspective of trying to understand the potential impact of cloud computing on the economy and ensuring government regulations support it, while ensuring that security and privacy are maintained; Homeland Security from the perspective of avoiding security vulnerabilities in governmental computing efforts; and the Intelligence Committee from the perspective of implementing cyber-security and protecting US governmental and commercial activities from espionage and attacks. If you noticed a common theme regarding security, go to the head of the class.
What struck me about each group’s discussion of security was how their security concerns are balanced with a desire to support cloud computing in general. Obviously, the buzz about the potential benefits of cloud computing has reached even these very non-technical individuals, who spend their time focusing on policy and legislation. It would be quite easy for any of these committees to cite the paramount need for security and hinder progress on cloud initiatives, both inside and outside the government. Instead, they are focused on making sure cloud initiatives are accompanied by appropriate security measures.
Along with the general concerns about security and privacy, one topic that came up several times is FISMA, the Federal Information Security Management Act, a law that mandates that security assessment and implementation be part of every Federal information system. FISMA imposes a structured, consistent approach to system security, ensuring that every application is treated the same. One challenge for FISMA in a cloud computing world (and, indeed, in a virtualized infrastructure world) is that it is application-driven and envisions a non-shared hardware basis for applications. This is obviously too constrained for a cloud (or, indeed, a virtualized) infrastructure. My recommendation is that FISMA be examined with an eye toward modification: it should support the notion of applications residing on a shared infrastructure, with a method to evaluate that infrastructure and grade it according to its security capability. This would allow a cloud infrastructure to be examined once, with the resulting accreditation (that’s the official FISMA term) applied to any subsequent application that resides on it. Sharing accreditation would speed up FISMA certifications, not to mention reduce their cost (which is not inconsiderable).
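To make the shared-accreditation idea concrete, here is a minimal sketch in Python. All names and the grading scheme are hypothetical illustrations of the concept, not any real FISMA tooling or official terminology: the platform is assessed once and given a security grade, and an application inherits that accreditation whenever the platform's grade meets or exceeds the application's requirement.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical security grades, lowest to highest (illustrative only).
GRADE_ORDER = {"low": 0, "moderate": 1, "high": 2}

@dataclass
class Infrastructure:
    """A shared cloud platform that can be accredited once."""
    name: str
    accredited: bool = False
    security_grade: Optional[str] = None

    def accredit(self, grade: str) -> None:
        # One-time evaluation of the shared infrastructure.
        self.accredited = True
        self.security_grade = grade

@dataclass
class Application:
    """An app deployed onto a shared platform."""
    name: str
    required_grade: str
    host: Infrastructure

def inherits_accreditation(app: Application) -> bool:
    """The app reuses the platform accreditation if the platform's
    grade meets or exceeds the app's requirement; otherwise it would
    still need its own per-application assessment."""
    host = app.host
    return (host.accredited
            and GRADE_ORDER[host.security_grade] >= GRADE_ORDER[app.required_grade])

# Accredit the platform once; every subsequent app checks against it.
cloud = Infrastructure("shared-iaas")
cloud.accredit("moderate")
payroll = Application("payroll", required_grade="moderate", host=cloud)
print(inherits_accreditation(payroll))  # True: no separate assessment needed
```

The point of the sketch is the cost model: the expensive step (`accredit`) runs once per platform rather than once per application, which is exactly where the time and money savings would come from.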
The FISMA predicament illustrates the general picture: many security and privacy laws and regulations are out of date with respect to their computing infrastructure assumptions. They were written before virtualization became common, and their assumption that applications reside on fixed, unshared hardware resources is now obsolete, undone by the march of Moore’s Law and the drive for infrastructure efficiency.
I am heartened by my trip. Seeing the Federal government take a leading role in cloud computing is impressive. Seeing how many of the agencies and committees involved intuitively understand the potential of the cloud is striking. I am looking forward to watching how the government’s initiatives march forward over the next year.
Bernard Golden is CEO of consulting firm HyperStratus, which specializes in virtualization, cloud computing and related issues. He is also the author of “Virtualization for Dummies,” the best-selling book on virtualization to date.
Follow Bernard Golden on Twitter @bernardgolden. Follow everything from CIO.com on Twitter @CIOonline