I'm a numbers guy—you have to be if you're an analyst—and I'm a firm believer that analytics, used properly, could not only change your company but also change the world. However, I've spent the last few weeks studying some of the bigger market failures of the last decade and, in many cases, executive management had the information it needed to avoid the mistake but simply chose not to use it.
Years ago, I did a study at IBM to understand how the company lost some of the markets it dominated. IBM was extremely numbers-focused, and here I found an intentional corruption of study results to give executives a sense of false comfort. It was so bad at one point that it cost the IBM CEO his job and nearly put one of America's oldest and most successful companies out of business.
Analytics can give executives unprecedented insight to help assure that their decisions are the correct ones, but many who pay for the deployment of these tools will likely instead find they are used to showcase what idiots they were after the fact. This would be like a car safety system that, rather than prevent an accident, only reported on the failure of the driver after an accident.
I'm not arguing that analytics are bad. If corporate culture favors positive information over accurate information, and if the winning choice comes from the most powerful executive, rather than the best informed one, then analytics is only going to make a bad decision look worse, as it will show that the bad decision could have been avoided. If the bad behavior isn't fixed first, it will force whoever uses the tool to either falsify the results or constantly be a source of career-ending information on bad decisions—which in and of itself is a great path to unemployment for the user.
Microsoft's Experience: Make a Decision, Then Run the Numbers
Watching Microsoft about five years ago, I couldn't reconcile the fact that CEO Steve Ballmer—even more of a numbers guy than I am—seemed to be making horrid mistakes that should have been avoidable. I figured either the numbers from Microsoft's internal market research organization weren't getting to Ballmer or the folks in market research were incompetent. After an interview with the head of market research, I concluded that both assumptions were wrong.
Rather, this organization clearly had a mission to give executives results that made decisions they had already made look smarter—apparently so they were better protected when the decision didn't work out. I figured Ballmer would eventually figure this out, since having numbers that say one thing and results that clearly say another should have clued him into the problem, even if he ignored my annoying emails.
This was also about the time I first ran into argumentation theory, which suggests that we are hard-wired to hold in high esteem people who win arguments. What's really screwy about this, when put in the context of human nature, is that we don't seem to care that much whether someone is right, only that he prevails. To that end, we'll follow executives who win arguments no matter what.
Now think of Ballmer's position. (I'm picking on Microsoft because I really tried to fix this problem and was catastrophically unsuccessful, and argumentation theory may explain why.) Ballmer's a business guy in a company formed around a brilliant software developer, Bill Gates. This would be like putting a hockey coach in charge of a tennis team. I don't care how long you're there, chances are you won't survive unless you figure out a way to fix the game.
My working theory: Ballmer inadvertently tasked research with a role of making his positions look right after the fact to offset the problem of him running a company of experts in an area where he wasn't an expert himself. The sad thing is that, had he approached analytics a bit differently, he could have made better decisions, held his position of power and made Microsoft a better company. I still hope he'll eventually see this path before his time there is up.
Effective Data Analysis Comes Before a Decision
Perhaps the saddest decision I was ever part of was IBM selling ROLM. I was part of the analysis team—and my report on how to turn the unit around instead convinced Ellen Hancock to sell the unit in the first place, since it showed a number of areas that needed substantial work.
What was sad about the sale was that our internal study clearly said selling this business unit to Siemens would be a catastrophic failure for the unit. This should have prompted IBM to either sell to someone else or call off the sale entirely. Instead, IBM sold half of ROLM to Siemens and carried 50 percent of the resulting losses for five years. The unit lost more money over that time than IBM initially paid to acquire it. Subsequent analysis showed that, had IBM just shuttered the business, it would have been billions of dollars ahead.
The ROLM mistake happened because the decision was made before the research was done. Apparently the IBM executive team forgot it had even commissioned the research in the first place. If you're going to do research, it needs to come before the decision is made; afterwards, as this decision shows, it has a high probability of making executives look like idiots.
However, executives have to be able to accept that the decision they want to make—quickly divesting a troubled unit in this case—may be a bad one. If they aren't, then you still have another problem: confirmation bias, or the tendency to only see information that agrees with your world view.
We saw this play out in spades in the recent U.S. presidential election, which the Republicans were convinced they were going to win, and win big. Instead, they lost big.
The GOP had used analytics, but not only was it using companies inexperienced in political campaigns, it was also cherry-picking and reporting only the results that supported the belief that Mitt Romney was going to win. As a result, Republicans focused on the wrong geographies, under-resourced their efforts and lost an election against a relatively unpopular incumbent.
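The cherry-picking failure mode is easy to demonstrate with a toy simulation. (This sketch is purely illustrative; the poll counts, sample sizes and 48 percent true share are invented for the example, not drawn from the actual 2012 campaign data.)

```python
import random

def simulate_polls(true_share, n_polls, sample_size, seed=0):
    """Simulate poll results as observed vote shares for one candidate."""
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    polls = []
    for _ in range(n_polls):
        votes = sum(1 for _ in range(sample_size) if rng.random() < true_share)
        polls.append(votes / sample_size)
    return polls

true_share = 0.48  # the candidate actually trails
polls = simulate_polls(true_share, n_polls=200, sample_size=400)

# Honest analytics: average every poll, favorable or not.
honest_estimate = sum(polls) / len(polls)

# Cherry-picked analytics: report only the polls showing the candidate ahead.
favorable = [p for p in polls if p > 0.50]
cherry_picked_estimate = sum(favorable) / len(favorable)

print(f"honest estimate:        {honest_estimate:.3f}")
print(f"cherry-picked estimate: {cherry_picked_estimate:.3f}")
```

The full data set correctly shows the candidate trailing, while the filtered subset shows a comfortable lead from the exact same underlying polls. No individual poll was falsified; the bias comes entirely from which results get reported up the chain.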
You Can't Handle the Truth, So You Make Bad Decisions
The famous courtroom scene from the movie A Few Good Men highlights the core problem: Often "the truth" is at best inconvenient and at worst highly embarrassing. Analytics, done right, provides an incontrovertible view of the truth.
There are executives and entire companies that largely exist by avoiding the truth. Look at Apple. Here's a company that was designed around the vision and skills of one man, Steve Jobs, but has done so little to adjust for his passing.
Apple is running around saying the guy who was executive of the decade really didn't play much of a role while acknowledging Apple couldn't have been successful without him. Both statements can't be true, but they coexist because Apple can't handle the truth that, without dramatic changes, the firm is crippled without Jobs.
Here's another example: a surprisingly small number of the companies that sell analytics tools actually rely on those tools for major decisions. Such companies are unable to handle the truth, even though they could become the strongest standing examples of why a company should deploy their products.
This leads me to two recommendations if your executives, like most, tend to make decisions from the gut rather than seek analysis up front. First, avoid analytics as a decision tool. It'll make executives look bad, and they probably won't appreciate that. (It goes without saying that you might want to move to a company with less foolish tendencies.) Second, pick a solution from a company that uses analytics heavily internally, and have that company help you guide your executives to become smarter decision makers. In that case, not only will the tool be more successful, but so will the company.
It is far more satisfying to be partially responsible for making your company successful than it is to show that your executives are idiots. Trust me.