“Modernization” was one of the watchwords in President Obama’s proposal for the federal IT budget in fiscal 2017, and while that broad effort might sound anything but controversial, updating legacy systems entails a host of challenges that agency CIOs are only beginning to work through.
[ Related: U.S. CIO aims to cut legacy spending, proposes IT modernization ]
Obama called for a $3 billion fund that would directly support agencies’ projects to update or replace aging applications and infrastructure, efforts that would include transitioning systems to the cloud, building out network capacity and strengthening cybersecurity.
But the $3 billion, even if it should materialize, would only be a modest down payment on a vast and open-ended effort to update the federal government’s sprawling IT apparatus.
Budget-minded strategies for moving to the cloud
And modernization efforts like moving to the cloud don’t come cheaply, Randall Conway, principal director to the Defense Department’s deputy CIO for information enterprise, observed during a recent panel discussion hosted by Federal News Radio.
[ Related: 5 years into the ‘cloud-first policy’ CIOs still struggling ]
“There’s a big transition cost to moving legacy applications to cloud,” Conway says. “And some of the applications that we’ve got in DoD are very old, very bulky, and almost impossible to move. And so therefore, we’re not going to probably spend any time on those. But we think moving forward, we think there’s some value in having cloud capacity within the department.”
Conway’s agency, far and away the largest IT purchaser in the federal government, is looking to widen its collaboration with vendors in the private sector as it takes tentative steps into the cloud.
“We’re looking at trying to outsource cloud services and do it on-premise, do it on our military installations,” Conway says. “We think there’s some value there — engage with the industry, build some partnerships with industry and sort of modernize our data center infrastructure, but take it one step at a time. We’re not into let’s just jump in the deep end automatically. And so we want to have several little operations going to try to figure out what’s the best way ahead for us.”
That approach can be found in a variety of government IT initiatives. Increasingly, CIOs have been embracing a modular, agile model for developing software and applications, shifting away from what’s sometimes called the “big-bang” mode, where agencies embark on complex, multi-phase projects that are prone to cost overruns and blown deadlines and often result in a subpar final product.
At the General Services Administration, officials are working to develop repeatable contracting models that CIOs across the government could use for much of their computing infrastructure that is not mission-specific, addressing common areas of concern like cybersecurity and data reliability.
[ Related: Obama wants more cybersecurity funding and a federal CISO ]
“There are going to be parts of each agency’s mission that you just can’t simply buy from a box,” says Bill Zielinski, GSA’s director of strategic programs. “However,” he quickly adds, at each agency “there are some centralized types of solutions, centralized kind of schedules, purchasing vehicles that can be leveraged that would allow for us to really do this with much greater speed and a much more cost-effective way.”
“These are 80-20 solutions, realizing and understanding that we’re not going to be able to do all things for every agency,” Zielinski explains. “But if we can really work together to figure out what are those things that can be placed into those contractual vehicles, we can help them move much more quickly, bring in new technologies much more readily.”