The recent wave of press coverage of IT supplier lawsuits reminds us that the ability to understand the human element in these complex efforts is often the key competency for success.
Several years ago, I read “Into Thin Air” by Jon Krakauer. It chronicles the events of May 10–11, 1996, when two expedition teams were caught in a storm while attempting to climb Mt. Everest. Tragically, both expedition leaders and three of the team members died during the storm.
These expeditions were commercial in nature, meaning that individuals paid to be guided to the top by a professional mountaineer.
The expedition leaders talked extensively about the need for a turnaround-time rule, mandating that if a climber failed to reach the summit by one or two o’clock in the afternoon, he or she had to turn around. This reduces the risk of a climber having to descend in the dark of night. On May 10–11, 1996, however, several members of the expedition teams did not reach the summit until late in the afternoon. Some even arrived after four o’clock. Then a massive storm hit. These unfortunate climbers found themselves trying to descend the mountain in darkness, during a raging blizzard. Five people did not make it back.
Mt. Everest vs. an ERP deployment
While reading the book, I started drawing parallels between climbing Mount Everest and deploying a major ERP system. The software you buy is the path you choose up the mountain, and the System Integrator (SI) is your commercial guide. You put your trust in the SI to prepare you for the journey and make a significant investment of both time and resources to complete the trip, just like those who trusted their lives to the mountain guides in 1996.
One of the questions I asked myself while reading the book was, “How could these experienced guides make the bad decision to keep climbing?” In doing some additional research, I came across an article by Harvard professor Michael Roberto that explores the events of those days. Roberto hypothesizes that our cognitive limitations, not a lack of intelligence, lead to errors in judgment. He describes cognitive biases that everyone experiences while making decisions. These decision-making traps affect novices, but they also ensnare experts. Roberto’s article outlines three biases that specifically contributed to the poor decision-making on Everest:
Overconfidence – Having made the trips over 40 times, the experienced guides became overly optimistic in their abilities. Their own confidence spilled over onto the climber clients as well. Krakauer described his climbing group and himself as “clinically delusional” during their trip.
Sunk-Cost Effect – The sunk-cost effect refers to people’s tendency to escalate commitment to a course of action in which they have made a substantial investment. These climbers had each spent $65,000 and many months of training before setting foot on Everest. They violated the turnaround-time rule because they did not want their efforts to have been in vain.
Recency Effect – The weather on the mountain in the years prior to 1996 had been relatively peaceful. Therefore, climbers underestimated the probability of a bad storm. A decade earlier, however (before many of the guides had been on Everest), there were three consecutive seasons during which no one climbed the mountain because of the wind.
I have personally experienced these biases while approaching major ERP go-lives. I have repeatedly witnessed the Overconfidence of an SI (“Don’t worry, we’ve done this hundreds of times before”). The SI often works on a contract with budget and schedule incentives, driving them to push for premature go-lives, even if a go-live is not in your company’s best interest.
I have seen the Sunk-Cost Effect take place within executive management teams. Go-live dates are established, and the executive management team refuses to move them because of their internal desire to protect their corporate reputations.
I have seen the Recency Effect take place when, after a few successful go-lives, companies start to cut corners — only to be burned by a problem that could have been avoided if proper precautions had been taken.
Could those deaths on the mountain have been avoided? Yes.
If the guides and climbers had recognized that these cognitive biases exist and followed their own turnaround-time rule, Krakauer’s group could still be reminiscing about their climb today.
ERP implementation disasters can also be avoided. The risk can be greatly reduced by:
- Recognizing that all participants in the implementation will have cognitive biases that guide their decision-making process.
- Putting in place rules to guide go-live assessments.
- Executing regular independent reviews of program progress by people who are not operating under the same cognitive biases as the project team or the SI.
Learn from Jon Krakauer’s book and do not let your ERP implementation go into thin air.