by Rob Enderle

What the Seahawks Bad Call Teaches Us About Decision-Making

Opinion
Feb 06, 2015 | 6 mins
Analytics | Technology Industry

The Seattle Seahawks' bad play call in the Super Bowl made CIO.com columnist Rob Enderle realize that the biggest problem analytics and AI will have to overcome is that we put more effort into looking smart than into winning the game.

I was watching the interview between Matt Lauer and Seattle Seahawks coach Pete Carroll on the Today Show yesterday. They were discussing the team's bad (and it really was bad) call at the end of the Super Bowl. The play also showcased one of the big problems yet to be addressed with analytics and the artificial intelligence (AI) enhancements that will build on limited initial offerings like Apple's Siri.

I paid particular attention to the Super Bowl this year because Paul Allen, one of Microsoft's founders, owns the Seahawks, and he has historically been one of the most powerful guys in technology. Yet his team doesn't seem to benefit from his unique knowledge of the industry. Thinking of Allen, the Seahawks and the Super Bowl made me realize the biggest problem that analytics and AI will have to overcome: We put more effort into looking smart than into winning the game.

The goal isn't to look right while losing; the goal is to win. However, the massive effort put into the first often makes the second increasingly impossible.

Let me explain.

Football: An Information Game

I'm by no means a football expert, but I do cover analytics, and at its heart is analysis. Professional sports, and particularly football, are massively analyzed both offensively and defensively. Competing teams are analyzed in depth, as are the capabilities of each player, the team's playbook and even the quirks of the coaches. By the time the Super Bowl is played, each team has a massive amount of information both on the team it is playing and on its own capabilities.

In addition, to get to the Super Bowl each team has demonstrated nearly unmatched competence in its coaching staff, which includes not only a head coach but also experts on offense, defense and special teams.

While players make mistakes in the heat of play, it should be nearly impossible for a coach to make a bad play call. So let's look at the bad call that was made. In this case, the game hinged on this one play, so the focus on making the right decision should have been even sharper, making the probability of a bad call vanishingly small. Yet the call that was made was clearly a bad one.

The Seahawks had been running well for short yardage, and this was a short-yardage play. That held true even when the Patriots set up to defend the run, as they did in the final moments.

In addition, throwing the ball always carries a higher probability of a turnover. Even if the chance of success on a run was 25 percent and the chance of success on a pass was 40 percent, you'd want to factor in the chance of a turnover. Here, the chance of a turnover on a run was less than 10 percent, while, looking at the film of the play (with three defensive players around one receiver), the chance of a turnover on the pass was around 75 percent. That much higher chance of ending the game prematurely with a loss should have kept the play from being called.
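
To put those rough percentages side by side, here is a minimal sketch of that comparison in Python. The numbers and the simple scoring function are illustrative assumptions taken from the paragraph above, not a real play-calling model:

```python
# A minimal sketch of the comparison described above, using the column's
# rough, illustrative percentages -- these are ballpark figures for the sake
# of argument, not measured NFL statistics or a real play-calling model.

def expected_value(p_success, p_turnover, success_value=1.0, turnover_cost=1.0):
    """Weigh a play's chance of success against its chance of a
    game-ending turnover; anything else counts as a neutral outcome."""
    return p_success * success_value - p_turnover * turnover_cost

# Rough numbers from the text: a run succeeds ~25% of the time with a <10%
# turnover risk; the pass succeeds ~40% of the time, but with three defenders
# around one receiver the turnover risk is closer to 75%.
run_ev = expected_value(p_success=0.25, p_turnover=0.10)
pass_ev = expected_value(p_success=0.40, p_turnover=0.75)

print(f"run  expected value: {run_ev:+.2f}")   # +0.15
print(f"pass expected value: {pass_ev:+.2f}")  # -0.35, the turnover risk dominates
```

Even granting the pass a better chance of success, weighting the game-ending turnover pushes the call decisively toward the run.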

Now these numbers are all rough, but after the fact you'd revise the chance of an interception to 100 percent, which should have led Pete Carroll to admit it was a bad call. Yet if you listen to the interview, Carroll says it wasn't a bad call, just a bad outcome. That's like someone who shoots himself in the leg while cleaning a gun concluding he was simply unlucky.

The Desire to Not Look Stupid

There is an almost universal tendency among executives to avoid learning from mistakes because they refuse to see them as mistakes. This goes to the heart of something we call Argumentative Theory, and it is executive Kryptonite. Instead of coming up with creative ways to argue that the call was correct, Carroll should be trying to figure out why his process failed him. The goal isn't to look right while losing; the goal is to win. However, the massive effort put into the first often makes the second increasingly impossible. If an executive refuses to see a mistake, he'll never fix it.

You can see this behavior very clearly in CEOs, particularly underperforming ones. They repeatedly do things like ordering layoffs or selecting unqualified executives who don't work out, then act as if those moves had been successful when they weren't. They are so focused on appearing right that they fail to grow, and they end up making the same avoidable mistakes over and over again simply because they aggressively refuse to see them as mistakes.

Analytics and AI

So what does this have to do with analytics and AI? Any process of analysis and conclusion is less than perfect. If it were perfect, you'd have the equivalent of a crystal ball, and no system is close to precognition yet (it may be impossible to create one).

To improve the accuracy of a system, you have to constantly learn from its mistakes and refine it. But if you instead focus on defending the system as if it were perfect, even though it never will be, it won't improve, and you'll effectively promote a system with designed-in flaws.
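
As a toy illustration of that feedback loop, here is a minimal sketch. The class and method names are invented for the example and don't refer to any particular analytics product:

```python
# A minimal sketch of the feedback loop described above: record what the
# system recommended, record what actually happened, and let the observed
# error rate drive the next round of tuning rather than being explained away.
# Names here are invented for illustration, not a real analytics API.

from dataclasses import dataclass, field

@dataclass
class RecommendationLog:
    outcomes: list = field(default_factory=list)  # (predicted, actual) pairs

    def record(self, predicted: str, actual: str) -> None:
        self.outcomes.append((predicted, actual))

    def error_rate(self) -> float:
        if not self.outcomes:
            return 0.0
        misses = sum(1 for predicted, actual in self.outcomes if predicted != actual)
        return misses / len(self.outcomes)

log = RecommendationLog()
log.record(predicted="pass succeeds", actual="interception")
log.record(predicted="run succeeds", actual="touchdown")

# The point: the miss is counted and fed back into the next iteration of the
# model, not relabeled as "a good call with a bad outcome."
print(f"observed error rate: {log.error_rate():.0%}")  # 50%
```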

In addition, if the system says one thing and the executive disagrees and makes an alternative decision that turns out to be wrong, the executive is likely to see the system as a threat. Rather than relying on it next time, he or she will eliminate it from the process or alter it so that it produces a recommendation in line with the wrong decision. That way, the executive can blame the system.

What If We’re the Problem?

We focus too much on blame, and we attach too much importance to maintaining the impression that a decision maker's choices are flawless. Winning in sports and in business is a process: You learn from your mistakes and work to find ways to avoid repeating them. Focusing on blame shifts the emphasis from doing the right thing to looking right even when you're wrong. Unless this is addressed more effectively, as we move from analytics to AI we'll corrupt these ever more powerful systems into doing the wrong things.

In addition, advanced AI could legitimately come to see us as the problem to be solved, something a number of high-profile folks, most recently Bill Gates, are increasingly concerned about. For analytics and AI to work, we have to make every effort to ensure they are used correctly and are as accurate as possible. We need to shift the emphasis from blame to improvement. If we can't make this pivot, what we are building might actually make things worse.

We could be intentionally turning the future version of Siri into anything but our friend.

Something to think about this weekend.