How to Conduct a Vulnerability Assessment

Los Alamos National Laboratory's Roger Johnston talks about how aliens, Elvis impersonators and your worst security nightmares can help you find and fix security problems.

Roger Johnston knows about security vulnerabilities, and not only because he works for the Los Alamos National Laboratory, which has experienced more than its share of security problems of late (including the loss of classified materials last autumn). As leader of the laboratory's Vulnerability Assessment Team, a research group devoted to improving physical security, Johnston is the guy who gets brought in to find security problems, not only at his own agency, but also at other agencies and at private companies. His team has been hired to conduct vulnerability assessments at government agencies with such high security stakes as the International Atomic Energy Agency, the Department of State and the Department of Defense, as well as at private companies that are developing or considering the use of high-tech security devices.

Senior Editor Sarah D. Scalet recently spoke with Johnston about strategies for running an effective vulnerability assessment and then communicating the results without also putting your job on the line. To help security leaders identify specific areas that need improvement, Johnston also developed a quiz that identifies the 28 attributes of a flawed security system. "We see the same things over and over," he says. "These are the common unifying themes." Find out how you rate. (Note: Johnston emphasized that his statements here are his own opinions and do not necessarily reflect the official position of the Los Alamos National Laboratory or the U.S. Department of Energy, its parent organization.)

CSO: You basically spend your days finding problems with things. Are people afraid to cook for you?

ROGER JOHNSTON: Yeah, well, we always try to have an upbeat message. There are often very simple fixes to problems. Say you're using a tamper-indicating seal for cargo security. When you inspect the seal, maybe you simply spend an extra second or two looking for a little scratch in the upper right-hand corner to discover an attack.

CSO: So training is a key to that upbeat message?

JOHNSTON: Right. We're very strong believers in showing security personnel a lot of vulnerability information. Often, low-level security people aren't given the information they need to do a good job. If they know what they're supposed to be looking for, instead of just being turned loose and told to report "anomalous incidents," they generally will do a lot better. You really haven't spent a lot of extra money, and it doesn't necessarily take a great deal of time.

CSO: When you're doing a vulnerability assessment, what's the best way to get into the mind-set of the adversary?

JOHNSTON: That's the real trick. The problem with a lot of vulnerability assessments is that they're done by very sincere security people who have devoted their lives and careers to being good guys. They really don't want security to have any problems. It's not a matter of dishonesty; it's just human nature. Also, in many cases security personnel come from military or police backgrounds. That kind of training and discipline can be very useful, but those backgrounds don't typically tend to attract people who are wildly creative.

You want to look around your organization and find people who are outside-the-box thinkers. They don't have to be in the field of security. You're looking for people who would normally be your worst security nightmare--people who are loophole finders, smart alecks, kind of skeptical. They're people who have to prove things for themselves and aren't sure they buy everything they hear from authority.

CSO: So you're looking for people who've been in trouble for violating some security policy?

JOHNSTON: I don't want to push it too far. If they're wanted in 35 states for felonies, maybe that's not exactly who you want looking at your critical security vulnerabilities. It's more about finding the people who won't automatically toe the party line. These are people in your organization who are already thinking about how they could beat your security. They're probably not going to do it, but that's just the way they think. They may be graphic artist types; they may be the smart aleck on the loading dock who's always questioning the boss.

CSO: There's more of that ethos in the information security culture than in the physical security culture.

JOHNSTON: Absolutely. There's a huge cultural gap, of course, between IT security and physical security, and that's much of the problem of convergence, trying to bring the two together. I think IT is better off in this regard. A lot of the people who work on computers automatically think that way.

CSO: What's the risk of conducting a vulnerability assessment from the point of view of a good guy?

JOHNSTON: When vulnerability assessments are done by good guys thinking like good guys, number one, they let the existing security infrastructure and hardware and strategies define the vulnerability issues. For example, if there's a fence, they'll think about ways the bad guys might get over the fence. But of course that's all backwards. We need to think about what the bad guys want to accomplish and then decide if we even need a fence. Number two, there's that tendency not to want to try to find problems.

CSO: Not only are they possibly making themselves look bad if they find a problem, they're also creating more work for themselves, right?

JOHNSTON: Absolutely. In many cases when the fix is very simple, organizations are very reluctant to do it, because that is sometimes thought of as saying, "We've been screwing up all these years." So you don't want to go with people who have a history of doing a vulnerability assessment and then telling you everything is swell. There are always vulnerabilities, and they are always present in very large numbers. Any vulnerability assessment that finds zero vulnerabilities is completely useless.

CSO: When you actually do the assessment, are there warm-ups you can do to get yourself in the mind-set of a bad guy, or are there ways you should set up the room?

JOHNSTON: A lot of vulnerability assessment needs to be very similar to classic brainstorming. A lot of the tools that are applied to creative thinking in other fields can be applied directly to vulnerability assessments. This is kind of a radical position. A lot of people in the security business are not comfortable with this 1960s hippy, touchy-feely, "let's all get together" approach.

CSO: I'm imagining a bunch of beanbag chairs.

JOHNSTON: Yeah. A lot of people would much rather have a rigorous, quantitative approach, and I would claim that's largely a sham. I don't think it's a mistake to use analytical tools like a security survey, but we would like to combine those more closed-ended, straightforward tools with creative thinking. The fact is that creativity has been studied extensively over the last 50 years, and there's a lot of understanding of how you create an environment where people come up with good ideas. It's not quite the seat-of-the-pants, wacky kind of thing that it might look like from the outside.

CSO: Should the CSO even be there?

JOHNSTON: You don't want the boss in the room, because it constrains people. What you need are really nutty ideas, so we strongly encourage thinking about attacks that involve Elvis impersonators and flying monkeys and the use of space aliens. Early on, it's very important not to editorialize. Later on, we're going to prioritize them and think about the practicality of them. In many cases, we have people say, "Well, if I had the space aliens come down with a ray beam, they could do the following." Later on, it turns into a very viable attack, once we get rid of the space aliens and the laser beams.

CSO: Does this take hours? Days? Weeks?

JOHNSTON: It depends. If you're looking at a very complex security program, you may want to spend two or three weeks just kind of freewheeling. But you don't just sit around and do ideas. You generate nutty ideas, and then you go back to the program or the hardware and play around a little bit to see if those nutty ideas might have some merit. Then you get back together again, and you think of more nutty ideas based on what you learned. We're very much in favor of hands-on work, and not just thinking in abstractions. Toss the device around. Chat up the security guards. Kick the fence. Play with the system and try to understand how it behaves.

CSO: When the CSO tells his or her company about a vulnerability, we've seen that there can be a kind of "shoot the messenger" effect. (See "Don't Shoot the Messenger" from the August 2006 CSO.) What are ways they can avoid that or at least mitigate the effect?

JOHNSTON: We try to encourage people to think of a vulnerability not as bad news but as great news. When you find a vulnerability, you can do something about it.

CSO: But you still have to take people down the path of, something terrible could happen.

JOHNSTON: All our vulnerability assessment reports start out by pointing to the good things. There are always good things. Sometimes they're an accident, but by pointing them out, you get them recognized. Also, at the very beginning we always point out that we're going to find more vulnerabilities than you can possibly mitigate. We're going to make more suggestions for changes than you can possibly implement. That's OK. The bottom line is, vulnerability assessors are not here to tell you what changes to make. We're here to point out what we think are problems and what we think may be solutions. It's up to you to decide what you do with the findings.

This binary thinking about security--that something is either secure or not secure, or that we have to have all the vulnerabilities covered or we're not doing our job--is really nonsense. Security is a continuum, and there are always going to be vulnerabilities you can't do anything about. That doesn't mean anybody is screwing up. That's just the way security works.

CSO: In coming up with this laundry list of problems and possible solutions, is there oftentimes an 80/20 thing at play, where you can solve 80 percent of the problems with 20 percent of the solutions?

JOHNSTON: It does work that way. People say, "Gee, you're telling me I need to make this one little change, and this attack and this attack and this attack and this other attack basically go away?" It's really quite surprising. Sometimes the vulnerabilities are extraordinarily complex, and the solutions, while they may not be 100 percent perfect, are often really painless. We don't always have the most realistic view--we work for the government--about what's economically viable to implement. Sometimes what we think is simple isn't really simple in the real world. But that's OK too. Sometimes our suggestions get the end users thinking, and then maybe they come up with their own solution.
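Johnston's 80/20 observation, where one small change makes several attacks "basically go away," can be modeled as a set-cover problem: pick the few fixes that block the most attacks. Here is a minimal sketch of a greedy selection; all fix and attack names are invented for illustration, and a real assessment would also weigh cost and consequence, not just coverage counts.

```python
def pick_fixes(fixes, attacks):
    """Greedy set cover: repeatedly choose the fix that blocks
    the most still-unblocked attacks."""
    uncovered = set(attacks)
    chosen = []
    while uncovered:
        # Fix whose blocked-attack set overlaps uncovered attacks the most.
        best = max(fixes, key=lambda f: len(fixes[f] & uncovered))
        gain = fixes[best] & uncovered
        if not gain:          # remaining attacks have no known fix
            break
        chosen.append(best)
        uncovered -= gain
    return chosen, uncovered

# Hypothetical assessment output: each candidate fix and the attacks it defeats.
fixes = {
    "mark seals with a hidden scratch": {"seal swap", "seal counterfeit"},
    "brief guards on known attacks":    {"seal swap", "tailgating", "badge cloning"},
    "relocate the fence sensor":        {"fence climb"},
}
attacks = {"seal swap", "seal counterfeit", "tailgating",
           "badge cloning", "fence climb", "insider theft"}

chosen, unmitigated = pick_fixes(fixes, attacks)
print(chosen)       # three cheap fixes cover five of the six attacks
print(unmitigated)  # "insider theft" has no fix on the list -- and that's OK
```

As the interview notes, some vulnerabilities ("insider theft" here) will remain uncovered no matter what you pick; the point is to surface the short list of changes with outsized payoff, then let the organization decide what to implement.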

CSO: You've brought a couple of industrial-organizational psychologists onto your team. Why?

JOHNSTON: Industrial-organizational psychology has been applied across a wide range of fields, but for some weird reason, not security. When we first got these psychologists to work with us, they just couldn't believe that no one had applied all these powerful tools in industrial psychology towards security problems. Increasingly, we're using them to understand the human factors associated with security. In the end, security is really about how people interact with technology, how people use and think about technology, and how the technology was designed to enhance what people are already doing.

CSO: What kinds of things have the industrial-organizational psychologists found?

JOHNSTON: The main one early on was the recognition that security guard turnover is a huge problem. The numbers typically run between 40 percent and 400 percent per year. McDonald's has a turnover rate of about 35 to 40 percent, so McDonald's does a better job than security of finding the right people and hanging on to them. There are plenty of organizations that do just fine with turnover even though they don't pay people very well and don't necessarily offer fabulous careers. IO psychologists have developed methods over the last couple of decades that help these companies, but the tools have never been applied to security. The first thing our guys did was publish some papers basically saying, "Hey, wake up, we don't need to do any new R&D, there are all these tools already proven out there." They involve things like understanding who you hire and creating a realistic picture in their mind of what the job is like. If you simply do that, the turnover rate plummets.
