In November 2005, Jason Spaltro, executive director of information security at Sony Pictures Entertainment, sat down in a conference room with an auditor who had just completed a review of his security practices.

The auditor told Spaltro that Sony had several security weaknesses, including insufficiently strong access controls, a key Sarbanes-Oxley requirement.

Furthermore, the auditor told Spaltro, the passwords Sony employees were using did not meet best-practice standards that called for combinations of random letters, numbers and symbols. Sony employees were using proper nouns. (Sox does not dictate how secure passwords need to be, but it does insist that public companies protect and monitor access to networks, which many auditors and consultants interpret as requiring complex password-naming conventions.)

Summing up, the auditor told Spaltro, “If you were a bank, you’d be out of business.”

Frustrated, Spaltro responded, “If a bank was a Hollywood studio, it would be out of business.”

Spaltro argued that if his people had to remember those nonintuitive passwords, they’d most likely write them down on sticky notes and post them on their monitors. And how secure would that be?

After some debate, the auditor agreed not to note “weak passwords” as a Sox failure.

Doing the Right Thing

Spaltro’s experience illuminates a transaction that’s rarely discussed outside corporate walls. Compliance with federal, state and international privacy and security laws and regulations is often more an interpretive art than an empirical science, and it is frequently a matter for negotiation. How to (or, for some CIOs, even whether to) follow regulations is neither a simple question with a simple answer nor a straightforward issue of following instructions. This makes it more an exercise in risk management than governance.
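The best-practice standard the auditor invoked reduces to a mechanical test. A minimal sketch of such a complexity check in Python; the eight-character threshold and the specific rules are illustrative assumptions, not anything Sox (or Sony) actually specifies:

```python
import re

def meets_complexity_policy(password: str, min_length: int = 8) -> bool:
    """Return True if the password mixes letters, digits and symbols.

    The thresholds are illustrative; Sox itself does not dictate
    password strength, as noted above.
    """
    return (
        len(password) >= min_length
        and re.search(r"[A-Za-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

# A proper noun, of the kind Sony employees favored, fails the check:
print(meets_complexity_policy("Godzilla"))      # False
print(meets_complexity_policy("g0dz!lla2005"))  # True
```

As Spaltro’s sticky-note objection suggests, passing such a check says nothing about whether the password stays off a Post-it stuck to the monitor.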
Often, doing the right thing means doing what’s right for the bottom line, not necessarily what’s right in terms of the regulation or even what’s right for the customer.

“There are decisions that have to be made,” Spaltro explains. “We’re trying to remain profitable for our shareholders, and we literally could go broke trying to cover for everything. So, you make risk-based decisions: What’re the most important things that are absolutely required by law?” Spaltro does those, noting that “Sony is over-compliant in many areas,” and he says that Sony takes “the protection of personal information very seriously and invests heavily in controls to protect it.”

He adds that “legislative requirements are mandatory, but going the extra step is a business decision” based on what makes business sense.

So you adjust, you decide, you weigh the issues. It’s not black and white, yes or no. When it comes to compliance, you can, in fact, be a little bit pregnant.

Living Dangerously

When business metrics are applied to compliance, many companies decide to deploy as little technology or process as possible, or to ignore the governing laws and regulations completely.

According to “The Global State of Information Security 2006” survey conducted by CIO and PricewaterhouseCoopers, about a quarter of U.S. executives who say their companies must comply with Sox regulations admit to being noncompliant with the 2002 law. (See The Global State of Information Security 2006.) Two-thirds of U.S. companies are not compliant with the two-year-old Payment Card Industry (PCI) Data Security Standard, whose guidelines (and penalties) were developed by the major credit card companies to protect their customers’ credit card numbers.
And 42 percent of U.S. healthcare companies admit to not complying with the almost 10-year-old Health Insurance Portability and Accountability Act (HIPAA), which requires health institutions to secure private health information.

“The dirty little secret here is that everybody tries to figure out how much risk they can assume without being embarrassed or caught,” says David Taylor, a former Gartner security analyst who is now vice president for data security strategies at Protegrity, a security and privacy consultancy. “The people I regularly talk to are trying to figure out, if [their security] fails, what’s the smallest amount they need to do to stay out of trouble and how they can blame someone else.”

The percentage of CIOs who admit to being noncompliant may at first be a bit unnerving, leaving the impression that a significant portion of IT executives are scofflaws. But the problem is more complicated than bad or irresponsible behavior.

Most security experts believe that CIOs and CSOs are so overwhelmed by the demands of their jobs (running projects, innovating, keeping the lights on and putting out those ever-smoldering IT fires) that they simply don’t have the time to decipher the laws that affect them, much less the time to invest in reconfiguring systems and processes to meet regulatory requirements. (This problem is exacerbated in smaller companies. See The ROI of Noncompliance in the Mid-Market.) And make no mistake: It takes a lot of time. According to a 2006 Gartner report, IT organizations spend between 5,000 and 20,000 man-hours a year trying to stay compliant with Sarbanes-Oxley’s requirements.

Complicating the CIO’s task, even auditors frequently are unclear about what the laws mean.
Alex Bakman, founder and CTO of Ecora Software, which sells audit compliance applications, asserts that the Sox compliance checklists auditors use can differ even within the same company. “For Sox, how IT needs to be managed for compliance is really all over the place,” Bakman says.

But there is, thankfully, an emerging consensus on how to comply in ways that make business sense.

Sox Simplified

Sarbanes-Oxley, which became law in 2002, remains something of a mystery. “Sox is tough; it’s hard,” says Rich Mogull, a Sox analyst at Gartner. “When we look at what companies are doing, there are shifting standards, and auditors enforce the standards differently.”

What it means to be Sox compliant can be a moving target. Many confused CIOs have turned to standard to-do checklists supplied by Sox auditors or consultants. But that strategy, Spaltro argues, frequently leads to no compliance at all. “When they begin to implement [the checklist], they quickly find out how much more it costs than they thought,” he says. “Soon, they can’t keep up with the demands of completing all the items and they give up.”

Spaltro recommends that CIOs and security executives sit down with outside auditors, their corporate legal staff and executives from human resources to figure out what Sox compliance means, not in the abstract but for their company specifically. For example, a bank’s risk in not following a strict interpretation of Sox compliance may be higher than that of, say, an entertainment company like Sony. A bank must build a higher level of trust with its customers because it manages their money. Risks at other companies may be lower, which means compliance may require lower (and less expensive) levels of controls.
“I sincerely believe that if we left it all up to the auditors to tell us what works, we wouldn’t have a business at the end of the day,” Spaltro says.

Most compliance experts agree that CIOs should at least be able to show that they have made a good-faith effort to comply with the law. For example, the Sox requirement to control access to sensitive information includes deploying ID management, which requires companies to monitor log files. But how do you know if you have met the standard for monitoring, and what is the standard?

For Sergio Pedro, a managing director in PricewaterhouseCoopers’ advisory services practice, monitoring doesn’t mean spending thousands on technology. Rather, he says, the process is much like the one your CPA may ask you to follow when tracking the days you spend in an office in one state and the days you are in an office in another: Just write it down. The person designated to monitor the log files should mark any concerns on a printed copy of the files, initial the copy and file it. “Sox deficiencies basically involve not being able to produce evidence that you’ve done your job,” Pedro says. “You just have to prove that you checked the process.”

Negotiating Privacy

Two federal laws require organizations to protect the private information of customers: the Gramm-Leach-Bliley Act (GLBA) of 1999 and HIPAA. Generally, GLBA requires any organization that stores personal financial information on customers to have a comprehensive security program that identifies threats and risks, and the steps the organization has taken to address them. HIPAA is similar to GLBA, but instead of protecting financial information, the act requires healthcare organizations to secure health records and guarantee patient privacy.

Both laws, by design, are not highly prescriptive.
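Pedro’s initial-and-file routine has a straightforward electronic analogue: record who reviewed which log, when, and tie the sign-off to the exact file contents reviewed. A minimal sketch; the evidence-file layout and field names here are illustrative assumptions, not a PwC or Sox prescription:

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sign_off_log_review(log_path: str, reviewer_initials: str, notes: str,
                        evidence_path: str = "review_evidence.jsonl") -> dict:
    """Append a review record for a log file to an append-only evidence file.

    Hypothetical sketch: the SHA-256 digest ties the sign-off to the exact
    contents that were reviewed, the way initialing a printout would.
    """
    digest = hashlib.sha256(Path(log_path).read_bytes()).hexdigest()
    entry = {
        "log_file": log_path,
        "sha256": digest,
        "reviewed_by": reviewer_initials,
        "reviewed_at": datetime.now(timezone.utc).isoformat(),
        "notes": notes,
    }
    with open(evidence_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Hashing the log binds each sign-off to the version of the file that was actually reviewed, which is the sort of evidence trail Pedro describes.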
The Federal Trade Commission, which enforces the so-called Safeguards Rule under GLBA, states that “the plan must be appropriate to the company’s size and complexity, the nature and scope of its activities, and the sensitivity of the customer information it handles.” Which leaves a lot of room for interpretation.

And interpretation usually requires negotiation. Glen Damiani, director of IT at the money management firm United Capital Markets (UCM), says much of his job involves mediating between employees who insist they need access to certain information and the compliance officer who insists on strict access controls. Mostly, he says, a middle ground can be worked out.

For example, when Damiani was head of technology at a healthcare organization (prior to joining UCM), a clinician was transferred from the oncology department to the obstetrics unit. The clinician asked Damiani for permission to access the records of her former oncology patients. She wanted to continue to track their progress and provide them moral support. But the company’s privacy officer ruled that the clinician’s new responsibilities in obstetrics did not give her the right to access the oncology database.

Damiani negotiated a compromise in which the clinician could have access to those patients she had had direct contact with while working in the oncology unit. Strictly speaking, HIPAA does not allow such access, but Damiani argued that it would facilitate continuity of care, an important medical principle. “Compliance has got to be a give and take,” Damiani says. “The compliance officer may be dead set on not giving access, but another senior person is trying to do their job.”

Auditors are increasingly looking for guidance in these matters, Pedro says.
But be advised: CIOs should document these discussions and highlight the main reasons a decision was made to allow access to certain data. That way, you can explain your reasoning to state and federal regulators when they come knocking on your door after data is lost or leaked. A clearly worded argument backing up a decision may help convince regulators that you thought about the risks, you thought about how to mitigate them and you took compliance with the law into consideration.

This is particularly pertinent when it comes to deciding whether to encrypt data. No law currently requires companies to encrypt data, but encryption is an effective way to protect it, and the FTC has fined companies for privacy breaches. So what to do? Pedro says a best practice has developed in the past year that recommends encrypting data that may leave the organization (data on laptops, PDAs and in e-mail) while leaving data that remains in-house (on, say, a mainframe) unencrypted.

Again, you will want to document why your organization believes this is an appropriate practice. “If you can show that you have read the pertinent regulations, and show that this is your interpretation of what the regulation says, and you can show intent to protect the data, you are more protected than those who haven’t done that,” Pedro says.

Notification: The Risk Analysis

At a 2006 security conference, Protegrity VP Taylor ate lunch next to the CISO of a large metropolitan university. The CISO told Taylor that she had received an e-mail from one of her programmers informing her that the school may have experienced a breach that may have exposed students’ personal information. The programmer was unsure whether the law required the school to report the incident and asked the CISO for guidance.

Taylor asked her what she did.
She said she wrote back to the programmer telling him not to do anything.

Taylor told the CISO that the university should have reported the breach. The CISO disagreed, saying, essentially, that because very few people review system log files and because only one or two people at the university understood the systems and the data in them, it was probable that the breach would go unremarked and undiscovered. “I was thinking, Wow,” Taylor recalls. “That’s a risky chance to take.”

According to Behnam Dayanim, a privacy attorney with Paul, Hastings, Janofsky & Walker, state security breach notification laws are among the most frequently ignored types of security regulation. About 35 states have passed security breach notification laws, which lay out, to varying degrees, when an enterprise needs to notify customers and clients that their private information may have been exposed to an unauthorized user. According to CIO and PricewaterhouseCoopers’ “The Global State of Information Security 2006” survey, 32 percent of U.S. organizations admit to not being compliant with state privacy regulations.

There are two possible explanations for why the noncompliance rate is so high. First, the risk of being caught is low because it has been extremely difficult to tie a specific instance of fraud to a specific breach at a specific company, says Jim Lewis, a security expert at the Center for Strategic and International Studies in Washington, D.C. (Recent lawsuits filed against TJX, owner of discount retailers TJ Maxx, Marshalls and other stores, which claim that credit card numbers stolen during a security breach in 2005 and 2006 led to specific instances of fraud, may indicate that this fact of security life is changing.
For more on the TJX breach, see Financial Penalties for Security Breaches Will Promote Change.)

Second, these laws tend not to be terribly specific about situations and requirements. For example, California’s security breach notification law, the first in the nation, does not require notification of a security breach if the private data was encrypted. However, it also does not require encryption.

For that reason, how companies protect private data has become a risk-based business decision, says Sony’s Spaltro. Sony processes about 5 million credit card transactions a month, mostly associated with its PlayStation consoles and the massively multiplayer online games it sells. Although Spaltro declines to talk about Sony’s security practices, he says that while Sony Online Entertainment is fully compliant, every company weighs the cost of protecting personal data against the cost of notifying customers if a breach occurred. Spaltro offers a hypothetical example of a company that relies on legacy systems to store and manage credit card transactions for its customers. The cost to harden the legacy database against a possible intrusion could come to $10 million, he says. The cost to notify customers in case of a breach might be $1 million. With those figures, says Spaltro, “it’s a valid business decision to accept the risk” of a security breach. “I will not invest $10 million to avoid a possible $1 million loss,” he suggests.

That reasoning is “shortsighted,” argues Ari Schwartz, a privacy expert at the Center for Democracy and Technology. The cost of notification is only a small part of the potential cost to a company. Damage to the corporate brand can be significant.
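The disagreement between Spaltro and Schwartz can be made concrete with simple expected-loss arithmetic. The sketch below uses Spaltro’s hypothetical $10 million and $1 million figures; the breach probability and the indirect-cost figures are purely illustrative assumptions:

```python
def expected_breach_cost(p_breach: float, notification: float,
                         brand_damage: float = 0.0, fines: float = 0.0) -> float:
    """Expected loss from a breach under a simple one-event model."""
    return p_breach * (notification + brand_damage + fines)

HARDENING_COST = 10_000_000  # Spaltro's figure to harden the legacy system
P_BREACH = 0.25              # assumed breach likelihood (illustrative)

# Counting only notification, Spaltro's conclusion holds:
notification_only = expected_breach_cost(P_BREACH, notification=1_000_000)
print(f"notification only: ${notification_only:,.0f}")  # well under $10M

# Schwartz's point: add brand damage and regulatory fines (the FTC has
# levied multimillion-dollar penalties) and the trade-off can flip.
full_cost = expected_breach_cost(P_BREACH, notification=1_000_000,
                                 brand_damage=30_000_000, fines=15_000_000)
print(f"with indirect costs: ${full_cost:,.0f}")        # now exceeds $10M
```

Whether the trade-off flips depends entirely on the assumed probability and indirect costs, which is exactly why both men can defend their positions.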
And if the FTC rules that the company was in any way negligent, it could face multimillion-dollar fines. In 2006, the FTC fined information aggregator ChoicePoint $15 million after the company admitted to inadvertently selling more than 163,000 personal financial records to thieves. The FTC ruled that ChoicePoint had not taken proper precautions to check the backgrounds of customers asking for the information.

Crime and Punishment

How can a CIO know that the security measures he’s taken will be adjudged customary and reasonable by federal or state regulators? Looking at the 15 security breach cases the FTC has ruled on since 2002, a picture emerges as to what regulators deem reasonable (the FTC plans this spring to release a document that will set out more specific guidelines), and by looking at the 14 cases the FTC has settled against companies that experienced security breaches, one can get a sense of what’s deemed customary. (For a look at some of the more recent cases, see When Companies Violate the Rules.)

In 2005 the FTC handed down a judgment against BJ’s Wholesale Club. BJ’s was found to have “engaged in a number of practices which, taken together, did not provide reasonable security for sensitive customer information.” The FTC ruled that BJ’s had failed to encrypt personal data transmitted over the Internet; had stored personal data after it no longer needed the information; had used commonly known default passwords for access to files containing personal information; and had not used commercially available technology to secure wireless connections, detect intrusions or conduct security audits.

“We know security can’t be perfect, so we don’t expect perfection,” says Jessica Rich, assistant director of the Division of Privacy and Identity Protection at the FTC.
“But companies need to try, and if you do that, you will be much better off.”

But for some, “trying” requires a lot more than receiving an “A” for effort. You had better know what you are doing, says Joe Fantuzzi, CEO of the information security company Workshare. He says the FTC’s ruling against BJ’s was intended to send the message that if you claim to protect your customers’ personal data in your privacy statement (as BJ’s did), your security had better be up to the task.

“The point is that companies make risk management versus compliance management trade-offs all the time,” Fantuzzi says. “Just make sure you do your homework so you know you made the right trade-off.”

Allan Holmes is a Washington, D.C.-based freelancer and security expert.