by Scott Berinato

The CIO Code of Ethical Data Management

Jul 01, 2002

All it took was one e-mail from an irate customer for Saab Cars USA CIO Jerry Rode to realize he had a developing public relations fiasco on his hands.

In 1999, Norcross, Ga.-based Saab hired four Internet marketing companies to send customers information about Saab’s new models. And although the auto company had specified that the program be opt-in (meaning it would e-mail only the people who had agreed to receive such mail), one of the marketing companies apparently had a different definition of opt-in. And that meant one ticked-off customer, with more potentially on the way.

Rode fired the errant marketing company and immediately developed a formal policy surrounding the use of customer data. “The customer doesn’t see ad agencies and contracted marketing firms. They see Saab USA spamming them,” he says. “Finger-pointing after the fact won’t make your customers feel better.”

As Rode learned to his chagrin, data itself doesn’t care how it’s used. It will not stop itself from spamming customers or sharing personal, identifying details with third parties. It cannot decide to delete or preserve itself. Therefore, it falls on the shoulders of those who lord over ever larger terabytes of data, the CIOs, to develop ethical guidelines on how to manage it.

But most CIOs have not picked up the gauntlet. There are any number of reasons why. For one, the stuff is hard. The issue traps CIOs between two rather loathsome roles in the company: the bad guy or the fall guy. Either the CIO stops marketers from using data for purposes some customers are not comfortable with, thereby damming potential revenue streams. Or the CIO imparts no ethical code and is left to explain why the company has allowed a customer’s data to be exploited. Similarly, the CIO can tell the CEO what he can and cannot do with the company’s data, thereby putting his own job in jeopardy. Or he can go along and run the risk of finding himself culpable for some misuse of data.

No matter what the situation, ad hoc decision making on data ethics isn’t viable any more. CIOs must start a meaningful conversation about ethics at their company. Hard money is at stake here. Companies such as DoubleClick and Qwest Communications, which were both forced to pull back from controversial plans to share customer data with other companies, have seen their stock prices plummet. Soft money is at stake too. Brands such as Saab Cars USA stand to suffer incalculable losses when customers feel violated. Data ethics is not just a moral issue; it’s the intersection of business and morality. And if the CIO doesn’t take the lead, then someone else, like the customer or the government, will.

The goal here is to explain why CIOs need a code of ethics when it comes to data management, and then to propose such a code. (See “The Six Commandments of Ethical Data Management,” this page.)

Rode, for one, is all for it. “I don’t see ducking the problem and blaming it on marketing or saying something slipped by as the answer,” he says. “I see this as the CIO’s responsibility. We need to take ownership of the problem and develop some rules.”

Function Creep

Everyone wants Scott Thompson’s data. Thompson is executive vice president of Foster City, Calif.-based Inovant, the company Visa set up to handle its technology. Thompson’s role includes the responsibilities of Visa CIO, the post from which he was promoted last year. At Inovant, he stands atop 35 billion credit and debit card transactions per year.

Sales and marketing would love to drill into Thompson’s databases. They’d love to refine the data into loyalty programs, target marketing or partnerships between Visa and retailers. “There are lots of creative people coming up with these ideas,” Thompson says. “This whole area of data sharing is enormous and growing. For the marketers, the sky’s the limit.”

But Thompson isn’t playing ball. He is determined to prevent an environmental disaster: the pollution of the Visa brand through the violation of its customers’ privacy. To share the data on what people spend their money on, in what stores, at what time of day (all of which and more resides in Visa’s databases) would violate Visa’s own rules for acceptable use of customer data.

Visa, Thompson says, errs on the side of caution with its credit card data policy, devised by Thompson along with staffers who specialize in privacy. He bars any use of the data outside its intended purpose, which is mainly billing. Keeping the data off-limits, he figures, keeps customers from choosing MasterCard instead.

This is not to say Thompson’s not tempted to share the data. He is, every day. “Every single day of the week, someone outside our organization comes to us and says, ’If only we could use your data to do this or profile that, it would be a license to print a zillion dollars for Visa,’” he says. “I’ve heard a half-dozen ideas for uses of this data where I thought, That’s pretty clever, I know that would work. But as I’ve said, we made this decision. We’re not going to do it.”

However, a larger question remains: Can Thompson guarantee, as he did at one point during an interview, that some unethical use of data won’t crop up at Inovant or Visa? Many experts, and lawyers, think he cannot. After all, in a large majority of cases, the unethical use of data happens not through the malicious scheming of a rogue marketeer but rather by inattention.

Here’s how: Customer data is collected and stored for some purpose, such as record keeping or billing. A sales or marketing professional figures out another way to use it. Or a partner company may ask to share it. Or another company could be willing to pay for it: pure profit that would be tempting for any company.

This is known as function creep, and it’s a serious concern for ethicists. The classic example of function creep is the Social Security number, which started simply as a way to identify government retirement benefits and is now used as a sort of universal personal ID, found on everything from driver’s licenses to savings accounts. But the problem is exacerbated by technology that makes it incredibly easy to repurpose data.
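Just how little engineering effort repurposing takes is worth making concrete. The Python sketch below is purely hypothetical; the database, table, and column names are invented for illustration. The same billing table that answers the intended question, what does each customer owe this month, answers a marketing question a few lines later.

# Hypothetical illustration of function creep: records collected strictly
# for billing are trivially re-queried to build a marketing segment.
# The database, table, and column names are invented for this sketch.
import sqlite3

conn = sqlite3.connect("billing.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS transactions (
           customer_id TEXT, merchant_category TEXT,
           amount REAL, purchased_at TEXT)"""
)

# The intended purpose: total each customer's monthly bill.
monthly_bill = conn.execute(
    "SELECT customer_id, SUM(amount) FROM transactions "
    "WHERE purchased_at >= '2002-06-01' GROUP BY customer_id"
).fetchall()

# The creep: the same table, a few lines later, becomes a target list
# of frequent travel spenders for a partner's promotion.
travel_prospects = conn.execute(
    "SELECT customer_id, COUNT(*) FROM transactions "
    "WHERE merchant_category = 'TRAVEL' "
    "GROUP BY customer_id HAVING COUNT(*) >= 3"
).fetchall()

Nothing in the system distinguishes the sanctioned query from the creeping one; only policy does.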

N2H2, a company that makes Web-filtering software, fell victim to function creep. N2H2 software is popular in secondary schools for filtering inappropriate content. To do this, the software collects information about every website a student visits.

With all that data, a marketer could learn much about the students’ surfing habits, and in 2000 N2H2 offered to sell its data (in the aggregate) for $10,000 a month, under the brand Class Clicks, through marketing company Roper Starch. N2H2 had two customers: a Web portal that focused on education and the Department of Defense, which wanted to use the data for recruiting programs. The plan was exposed to the public after only a month, in part by the DoD, which got cold feet about the program and notified privacy advocates. After an ensuing wildfire of publicity, N2H2 decided to stop offering the Class Clicks service.

It sounds almost quaint to say that data should be used only for the purpose for which it was collected, and nothing else, but that is conventional ethical wisdom, according to Chris Hoofnagle, legislative counsel at the Electronic Privacy Information Center based in Washington, D.C. “Somehow, technology has led to this rogue theory that acquiring data about people gives you the right to own that data, that mere collection translates to ownership,” Hoofnagle says. “But there’s no theory of property that says that’s OK.” Hoofnagle blames the new mind-set on the ease with which data is now collected and stored, via database queries and massive CRM data warehousing systems.

The only current limits to function creep seem to be the limits of marketing and sales’ collective imagination. “Marketing organizations can rationalize their way to the greater good to a point where it could blind anyone, especially a CIO sitting on the fence on ethics,” says Jack Cranmer, CIO of Arizona-based Mayo Clinic Scottsdale, one of three group practices that make up the Mayo Foundation. In Cranmer’s case, he’s handling sensitive patient information that researchers want to use for medical studies. “But if you’re targeting customers based on data you’ve collected for some other use, that’s where you should start thinking about ethics,” he says.

Often, by the time the CIO finds out about the function creep, or by the time it gets to corporate counsel, the marketing group is disseminating the data already, says Rich Honen, an attorney at Albany, N.Y.-based Honen and Wood, which advises corporate clients on data ethics. But Honen maintains that it’s the CIO’s responsibility to understand and track the flow of information and maintain an ongoing discussion with marketing and legal executives about the ethical use of the data.

Joel Reidenberg, professor of law at Fordham University in New York City and expert on data privacy, adds, “Look, there isn’t a single company today that doesn’t know where its money is. Why aren’t companies paying attention to the flow of information the way they do money?”

Whose Data Is It Anyway?

Last year, Reidenberg received an offer for cell phone service from AT&T Wireless. The offer revealed that AT&T Wireless had used Equifax, a credit reporting agency, to identify him as a likely customer. “We used information we obtained from a consumer reporting agency,” the disclaimer read in part. This disclosure is required by the Fair Credit Reporting Act (FCRA). The FCRA also requires Equifax to disclose that it has sold a credit report and to whom, if a consumer asks. Reidenberg asked. Equifax told him it had sold his credit report to AT&T Wireless.

It was good business. Equifax made money selling data it already owned; AT&T Wireless could hone its target market to get a better response to its marketing campaign.

The bad part was that, by law, credit information can’t be used to sell anything. Reidenberg cites the FCRA, which forbids such repurposing in every case except when the data is used for “a firm offer of credit or insurance.” In other words, the only product you can sell based on credit data is credit.

A spokesman for Equifax says that as long as AT&T Wireless (or any company for that matter) is offering the cell phone service on a credit basis, such as allowing the use of the service before the consumer has to pay, it is in compliance with the FCRA. Since cases around unfair marketing based on credit data often languish in court, many companies may take a calculated risk that they won’t run into too much legal trouble for using credit data in their targeted offers.

Reidenberg makes a distinction between what is legal and what is ethical. “The average consumer would have no idea AT&T Wireless was violating the law,” he says. “But even if (and this is highly unlikely), even if the courts decided after five years that this was a legitimate use of customer data, the customers will be outraged.”

In the end, customers decide whether a company has acted ethically, and very often a consumer’s notion of the rules shares little in common with what’s allowed under law.

Qwest Communications, a telecom company based in Denver, is learning this the hard way. Earlier this year, Qwest planned on sharing internal customer proprietary network information (CPNI) data about its 12 million customers with other companies. That means your phone bill. Whom you call. How long you talk. How much you pay. What services you select. How often you use directory assistance. Like Visa’s credit card transaction data, CPNI data is a target marketer’s dream. If marketers knew you had relatives in Wisconsin and that you called them on Sundays, they could tailor a long-distance service expressly for you. If they sold the data to a travel agency, you’d receive solicitations for flights to Milwaukee.

Unlike the AT&T Wireless case, using this data seems to be legal. But customers didn’t care about that. When they discovered Qwest’s plan embedded in fine print in a Qwest brochure, they protested via the media, saying it was an unethical violation of their privacy. Qwest offered an opt-out service whereby consumers could call or go online and request that Qwest exclude their personal CPNI data. The process proved so cumbersome that some privacy advocates accused Qwest of making it difficult on purpose, to keep the number of customers who opted out to a minimum.

Finally, under unyielding customer pressure, Qwest dropped the idea, for now. Pending the Federal Communications Commission’s review of the rules around CPNI data later this year, Qwest has said it may yet use CPNI data as a marketing tool.

After several requests by CIO to speak to the CIO and chief privacy officer, a Qwest spokeswoman declined to comment, and questions sent via e-mail went unanswered.

Qwest could have avoided the flood of bad publicity by making the CPNI marketing program opt-in from the beginning. According to privacy advocates, opt-in is the appropriate way to handle customer data. Personal data is untouchable until you, the consumer, give us permission to use it after we, the company, tell you precisely what we want to use it for.
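What that rule looks like once it reaches the code is, essentially, a default of “no.” The sketch below is a hypothetical illustration, not any company’s actual system: permission is recorded per purpose, and any use of the data without a recorded opt-in is simply refused.

# A minimal sketch of purpose-bound, opt-in consent checking.
# The ConsentRegistry and purpose names are hypothetical, not any vendor's API.
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    # Maps customer_id -> set of purposes the customer explicitly opted into.
    grants: dict = field(default_factory=dict)

    def grant(self, customer_id: str, purpose: str) -> None:
        """Record explicit, per-purpose permission (the opt-in)."""
        self.grants.setdefault(customer_id, set()).add(purpose)

    def allows(self, customer_id: str, purpose: str) -> bool:
        """The default is deny: no recorded opt-in means no use."""
        return purpose in self.grants.get(customer_id, set())

def send_promotion(customer_id: str, registry: ConsentRegistry) -> None:
    if not registry.allows(customer_id, "marketing_email"):
        return  # untouchable until the customer says otherwise
    print(f"emailing promotion to {customer_id}")  # stand-in for a real send

registry = ConsentRegistry()
registry.grant("cust-001", "billing")          # data collected for billing only
send_promotion("cust-001", registry)           # silently skipped: no opt-in
registry.grant("cust-001", "marketing_email")  # the explicit opt-in
send_promotion("cust-001", registry)           # now, and only now, permitted

The design choice that matters is the default: a customer who has never been asked is treated as having said no.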

The problem is, marketers will do anything to avoid opt-in marketing. Statistics suggest that when personal data marketing becomes opt-in, more than nine out of 10 consumers will decline to join. If only 10 percent of the target market chooses to participate, the marketers are left with data that doesn’t tell them very much.

The Next Branding Trend

When CIOs do take a lead on data ethics, the results are positive. Gene Elias, CIO of Quiksilver, a surfwear clothing company based in Huntington Beach, Calif., that targets 9- to 15-year-old children, has taken this thinking to the extreme. Elias has prohibited sales and marketing from using any of the customer data he possesses, which amounts to personal information collected when people join mailing lists or become members at the Quiksilver website. He says the company doesn’t retain credit card numbers after transactions in Quiksilver stores, and all of his data ethics guidelines pass muster worldwide.

Clearly such limits set up an adversarial relationship between marketing, which stands to benefit greatly from collecting and repurposing data, and the CIO, who stands to lose his reputation over a privacy flap. “So far, knock on wood, marketing understands when I tell them there are pitfalls in doing certain things with data,” says Saab’s Rode. “But it’s also helped to offer alternatives, instead of just being the ’no’ man.”

One alternative suggested by Rode that has paid off is an online service called Saab-i. Rode makes sure it’s the ultimate opt-in, with absolutely no use of data without the express consent of the consumers. In exchange for agreeing to be marketed to and have their data used in aggregate form, consumers obtain access to early notification of Saab promotions and can get their car questions answered. Rode says membership has doubled every year for three years.

Both Rode and Elias believe the next branding trend in the United States will be trust. Marketing opportunities lost will be made up through customer loyalty. Consumers will choose between vendors based on their policies around how ethically the company treats personal information, among other privacy litmus tests. And CIOs, they say, can play a crucial role in promoting this trend.

While Rode has developed formal rules on how to manage customer data, Elias admits that his are mostly informal. “If someone new came into marketing and I felt like they didn’t understand how I guard this data, I might change my tune, make the rules a little more formal because the relationship is not there,” he says.

A Hippocratic Oath

It might work informally at Quiksilver, but as Mayo Clinic’s Cranmer notes, “not everyone has a strong sense of ethics.” As a result, some CIOs say there needs to be a formal code of ethics or principles that lay out the CIO’s moral obligations when it comes to data. Such a doctrine is meant to guide CIOs much as the Hippocratic oath guides doctors. The six commandments we have suggested on Page 58 address not only the issue of customer data but data retention and deletion as well, and the role that the CIO should play in communicating to the rest of the organization why such guidelines are crucial.

“If we have guidelines, it will help the CIO know where the line is,” says Cranmer. “It would also allow CIOs to hold up a document that says, ’This is what I ascribe to.’”