In November 2005, Jason Spaltro, executive director of information security at Sony Pictures Entertainment, sat down in a conference room with an auditor who had just completed a review of his security practices.
The auditor told Spaltro that Sony had several security weaknesses, among them access controls that were not strong enough to satisfy a key Sarbanes-Oxley requirement.
Furthermore, the auditor told Spaltro, the passwords Sony employees were using did not meet best practice standards that called for combinations of random letters, numbers and symbols. Sony employees were using proper nouns. (Sox does not dictate how secure passwords need to be, but it does insist that public companies protect and monitor access to networks, which many auditors and consultants interpret as requiring complex password-naming conventions.)
Summing up, the auditor told Spaltro, “If you were a bank, you’d be out of business.”
Frustrated, Spaltro responded, “If a bank was a Hollywood studio, it would be out of business.”
Spaltro argued that if his people had to remember those nonintuitive passwords, they’d most likely write them down on sticky notes and post them on their monitors. And how secure would that be?
After some debate, the auditor agreed not to note “weak passwords” as a Sox failure.
Doing the Right Thing
Spaltro’s experience illuminates a transaction that’s rarely discussed outside corporate walls. Compliance with federal, state, and international privacy and security laws and regulations often is more an interpretive art than an empirical science—and it is frequently a matter for negotiation. How to (or, for some CIOs, even whether to) follow regulations is neither a simple question with a simple answer nor a straightforward issue of following instructions. This makes it more an exercise in risk management than governance. Often, doing the right thing means doing what’s right for the bottom line, not necessarily what’s right in terms of the regulation or even what’s right for the customer.
“There are decisions that have to be made,” Spaltro explains. “We’re trying to remain profitable for our shareholders, and we literally could go broke trying to cover for everything. So, you make risk-based decisions: What’re the most important things that are absolutely required by law?” Spaltro does those, noting that “Sony is over-compliant in many areas,” and he says that Sony takes “the protection of personal information very seriously and invests heavily in controls to protect it.”
He adds that “Legislative requirements are mandatory, but going the extra step is a business decision” based on what makes business sense.
So you adjust, you decide, you weigh the issues. It’s not black and white, yes or no.
When it comes to compliance, you can, in fact, be a little bit pregnant.
When business metrics are applied to compliance, many companies decide to deploy as little technology or process as possible—or to ignore the governing laws and regulations completely.
According to “The Global State of Information Security 2006” survey conducted by CIO and PricewaterhouseCoopers, about a quarter of U.S. executives who say their companies must comply with Sox regulations admit to being noncompliant with the 2002 law. (See The Global State of Information Security 2006.) Two-thirds of U.S. companies are not compliant with the two-year-old Payment Card Industry (PCI) Data Security Standard, with guidelines (and penalties) developed by the major credit card companies to protect their customers’ credit card numbers. And 42 percent of U.S. healthcare companies admit to not complying with the almost 10-year-old Health Insurance Portability and Accountability Act (HIPAA), which requires health institutions to secure private health information.
“The dirty little secret here is that everybody tries to figure out how much risk they can assume without being embarrassed or caught,” says David Taylor, a former Gartner security analyst and now vice president for data security strategies for Protegrity, a security and privacy consultancy. “The people I regularly talk to are trying to figure out if [their security] fails, what’s the smallest amount they need to do to stay out of trouble and how they can blame someone else.”
At first, the percentage of CIOs who admit to being noncompliant may be a bit unnerving, leaving the impression that a significant portion of IT executives are scofflaws. But the problem is more complicated than bad or irresponsible behavior.
What most security experts believe is that CIOs and CSOs are so overwhelmed by the demands of their jobs—running projects, innovating, keeping the lights on and putting out those ever-smoldering IT fires—that they simply don’t have the time to decipher the laws that affect them, much less the time to invest in reconfiguring systems and processes to meet regulatory requirements. (This problem is exacerbated in smaller companies. See The ROI of Noncompliance in the Mid-Market.) And make no mistake: It takes a lot of time. According to a 2006 Gartner report, IT organizations spend between 5,000 and 20,000 man hours a year trying to stay compliant with Sarbanes-Oxley’s requirements.
Complicating the CIO’s task, even auditors frequently are unclear as to what the laws mean. Alex Bakman, founder and CTO of Ecora Software, which sells audit compliance applications, says the Sox compliance checklists that auditors use can differ even within the same company. “For Sox, how IT needs to be managed for compliance is really all over the place,” Bakman says.
But there is, thankfully, an emerging consensus on how to comply in ways that make business sense.
Sarbanes-Oxley, which became law in 2002, remains something of a mystery. “Sox is tough; it’s hard,” says Rich Mogull, a Sox analyst at Gartner. “When we look at what companies are doing, there are shifting standards, and auditors enforce the standards differently.”
What it means to be Sox compliant can be a moving target. Many confused CIOs have turned to standard to-do checklists supplied by Sox auditors or consultants. But that strategy, Spaltro argues, frequently leads to no compliance at all. “When they begin to implement [the checklist], they quickly find out how much more it costs than they thought,” he says. “Soon, they can’t keep up with the demands of completing all the items and they give up.”
Spaltro recommends that CIOs and security executives sit down with outside auditors, their corporate legal staff and executives from human resources to figure out what Sox compliance means—not what it means in the abstract but what it means to their company specifically. For example, a bank’s risk of not following a strict interpretation of Sox compliance may be higher than that of, say, an entertainment company like Sony. A bank must build a higher level of trust with its customers because it manages their money. Risks at other companies may be lower, which means compliance may require lower (and less expensive) levels of controls. “I sincerely believe that if we left it all up to the auditors to tell us what works, we wouldn’t have a business at the end of the day,” Spaltro says.
Most compliance experts agree that CIOs should at least be able to show that they have made a good faith effort to comply with the law. For example, the Sox requirement to control access to sensitive information includes deploying ID management and monitoring log files. But how do you know whether you have met the standard for monitoring, and what is that standard?
For Sergio Pedro, a managing director in PricewaterhouseCoopers’ advisory services practice, monitoring doesn’t mean spending thousands on technology. Rather, he says the process is very similar to the one your CPA may ask you to follow when tracking the days you spend in an office in one state and the days you are in an office in another state: Just write it down. The person designated to monitor the log files should note any concerns on a printed copy of the files, initial the printout and file it. “Sox deficiencies basically involve not being able to produce evidence that you’ve done your job,” Pedro says. “You just have to prove that you checked the process.”
Two federal laws require organizations to protect the private information of customers: the Gramm-Leach-Bliley Act (GLBA) of 1999 and HIPAA. Generally, GLBA requires any organization that stores personal financial information on customers to have a comprehensive security program that identifies threats and risks, and the steps it has taken to address them. HIPAA is similar to GLBA, but instead of protecting financial information, the act requires healthcare organizations to secure health records and guarantee patient privacy.
Both laws, by design, are not highly prescriptive. The Federal Trade Commission, which enforces the so-called Safeguard Rule as part of the GLBA, states that “the plan must be appropriate to the company’s size and complexity, the nature and scope of its activities, and the sensitivity of the customer information it handles.” Which leaves a lot of room for interpretation.
And interpretation usually requires negotiation. Glen Damiani, director of IT at the money management firm United Capital Markets (UCM), says much of his job involves mediating between employees who insist they need access to certain information and the compliance officer who insists on strict access controls. Mostly, a middle ground can be worked out, he says.
For example, when Damiani was head of technology at a healthcare organization (prior to joining UCM), a clinician was transferred from the oncology department to the obstetrics unit. The clinician asked Damiani for permission to access the records of her former oncology patients. She wanted to continue to track their progress and provide them moral support. But the company’s privacy officer ruled that the clinician’s new responsibilities in obstetrics did not give her the right to access the oncology database.
Damiani negotiated a compromise in which the clinician could have access to those patients she had had direct contact with while working in the oncology unit. Strictly speaking, HIPAA does not allow such access, but Damiani argued that it would facilitate continuity of care, an important medical principle. “Compliance has got to be a give and take,” Damiani says. “The compliance officer may be dead set on not giving access, but another senior person is trying to do their job.”
Auditors are increasingly looking for guidance in these matters, Pedro says. But be advised: CIOs should document the discussions and highlight the main points of why the decision was made to allow access to certain data. That way, you can explain your reasoning to state and federal regulators when they come knocking on your door if data is lost or leaked. A clearly worded argument backing up a decision may help convince regulators that you thought about the risks, you thought about how to mitigate them and you took compliance with the law into consideration.
This is particularly pertinent when it comes to deciding whether to encrypt data. No law currently requires companies to encrypt data, but it is an effective way to protect data, and the FTC has fined companies for privacy breaches. So what to do? Pedro says a best practice has developed in the past year that recommends encryption for data that may leave the organization (data on laptops, PDAs and in e-mail) while leaving data that remains in-house (on, say, a mainframe) unencrypted.
Again, you will want to document why your organization believes that this is an appropriate practice. “If you can show that you have read the pertinent regulations, and show that this is your interpretation of what the regulation says, and you can show intent to protect the data, you are more protected than those who haven’t done that,” Pedro says.
Notification: The Risk Analysis
At a 2006 security conference, Protegrity VP Taylor ate lunch next to the CISO of a large metropolitan university. The CISO told Taylor that she had received an e-mail from one of her programmers informing her that the school may have experienced a breach exposing students’ personal information. The programmer was unsure whether the law required the school to report the incident and asked the CISO for guidance.
Taylor asked her what she did. She said she wrote back to the programmer telling him not to do anything.