Within the federal government, the shift toward virtualization and cloud computing is already well underway, but agency and industry officials warn that those migrations invite new security considerations, particularly in the form of insider threats.
Eric Chiu, president of the cloud and virtualization security firm HyTrust, notes the familiar list of arguments in favor of virtualizing servers and systems – lower costs and increased agility and efficiency chief among them – but points out that there are dangers associated with that transition.
"Virtualization also concentrates risk … You're taking what used to be lots of separate physical systems that had their own configurations, their own separate management consoles, their own separate group of experts that managed them, and you're collapsing all that functionality onto a single software layer," Chiu said during a panel discussion hosted by Federal News Radio.
"Any virtualization admin can access any VM, could copy any VM, could delete or destroy any VM," he adds. "If you look at today's threat, it's really coming from the inside."
Federal IT Leaders Must Be Vigilant as Network Perimeter Widens
Officials stress that vendors have an important role to play in facilitating the shift to the cloud and virtualized servers and systems within the government, particularly as agencies mull the implementation of service models at the software and platform levels.
For agencies, that puts a premium on working with service providers in the vendor community to hammer out security protocols through the contracting process. At the same time, government organizations have been deploying the Continuous Diagnostics and Mitigation (CDM) program, which the Department of Homeland Security has been developing to provide real-time network monitoring and threat detection.
State Department CISO Bill Lay points out that when the network-level operations are turned over to a third-party provider, government IT workers often won't be able to conduct the same security checks as they once could when all the systems were managed in-house. That means federal IT managers must work with their vendors to ensure that the personnel with access to the network are properly vetted and satisfactory security controls are in place.
[ Analysis: How Cloud Vendors Can Win Over Skeptical Federal Buyers ]
"It all comes down to how you negotiate and write your contract," Lay says.
John Skudlarek, deputy CIO at the Federal Communications Commission (FCC), agrees, noting that his agency is looking to partner with forward-looking, innovative vendors but, at the same time, remains cautious about teaming up with young startups.
"One of the key aspects is ensuring that you pick the right provider, that they've got appropriate security controls in place, that you're able to negotiate those good SLAs," Skudlarek says. "I would probably want to be somewhere near leading-edge, but not necessarily bleeding-edge, because you'd like somebody that's got a track record."
The government's increasing reliance on private-sector IT contractors and the growing adoption of mobile devices and BYOD policies have in many ways blurred the lines of the traditional network perimeter. Security officials increasingly are concerned about identifying who's accessing which parts of their organization's network.
"For insider threat, it's very difficult. Once you're on the network, everybody looks the same," Lay says. "Insider threat really boils down to monitoring behavior, heuristics, finding out if ... something look[s] askew or out of the ordinary [and] warrants closer attention."
[ Discussion: New Methods for Addressing Insider Threats ]
The evolving continuous monitoring systems under development are meant to act as an early warning system, identifying emerging threats and potentially vulnerable systems and applications. But those processes, facilitated by sensors positioned throughout the network, pose an additional challenge: making meaningful inferences from a constantly growing dataset.
"Our struggle is to tie all the information, all the feeds, the speeds from our sensors into a pool of information that can be filtered and boiled down into actionable information so our administrators [and] investigators have something to work with where they don't feel like they're boiling the ocean," Lay says. "It poses a huge problem."
Government Big Data Presents 'New Frontier for Cybersecurity'
Lay says that the State Department's current continuous monitoring system gathers "well over" 2 terabytes of data each day. It's an unwieldy trove that quickly multiplies when the department's security team wants to conduct a threat analysis over a substantial period of time.
"If we want to do trending over a year, we're looking at basically a petabyte database," he says. "That's huge big data, and that's a new frontier for cybersecurity experts, so it really becomes a big data issue that we have to overcome."
While some of the security challenges associated with virtualization and the cloud are common to all federal agencies, those that regularly engage with the public through their online systems have additional considerations.
Regulatory agencies such as the FCC are bound by law to collect comments from the public as part of the normal rulemaking process. The commission recently closed its comment window for its contentious open Internet, or net neutrality, rules. The response was overwhelming, says Skudlarek, who describes that process, and the more than 1 million public comments that poured in, as "the Pandora's box ... that's opened up for us."
"We find ourselves having lots of interactions with stakeholders and the general public, and we find that that causes certain activities within our networks and our public-facing systems that bear close monitoring," Skudlarek says, hinting without elaborating that the FCC's systems have been targeted by malicious actors through the comment process.
Another agency in a similar position is the Securities and Exchange Commission. With its cache of corporate filings that heavily influence trading activities, the SEC receives more than 2 billion hits on its website each month, according to CIO Tom Bayer.
Bayer's team at the SEC is "very much in favor of continuous monitoring" and undertakes a security evaluation to consider potential threats before deploying a new system or application on the network.
However, the heightened security procedures Bayer and other government CIOs put in place can be highly labor-intensive, creating another challenge for federal agencies already struggling to do more with less amid a tight budget environment.
"It requires us to spend more time monitoring and understanding the logs," Bayer says. "My concern is that federal agencies don't always get the opportunity to invest in checking what's happening from an erratic perspective, and that investment takes time and it takes a lot of effort."
As important as new sensor-enabled monitoring and big data technologies may be to protecting the evolving network architecture, officials also warn against overlooking the fundamental security procedures involving access and authentication, measures that should have been in place all along but have been implemented inconsistently. Those policies, what some refer to as basic "blocking and tackling," can't be taken for granted, Skudlarek argues. He cautions that CIOs need to re-examine security basics even as they pursue new technologies to fortify their shape-shifting networks.
"While it's true insiders are now a significant threat," he says, "the other significant threat is where you weren't following basic standard security protocols that you should have been following in the first place."