Using predictive analytics to forecast IT failures is itself fraught with difficulty. But combined with artful human intervention, the tools can work.
By Kim S. Nash, CIO
Is it possible to reliably predict, and therefore avoid, failure? Some companies are using Big Data analytics to find out. CIOs, CFOs, auditors and anyone else who wants to monitor risk can use business intelligence tools to try to quantify the degree of danger they face from common project failure points, such as bad governance, poor buy-in and vague goals.
We all know the sordid history of IT failures. We may have even participated in a few. ERP projects gone south. Business process changes that users rejected. Transformations that made things worse. Smart CIOs learn from failure, and the brave ones talk about it to reporters so others can learn from their mistakes.
Maybe they’re hungry for business as the economy picks (slightly) up, but vendors are trying more than ever to add predictive analytics to the discussion of project outcomes. Deloitte consultants use a proprietary prediction tool to help clients ferret out project weaknesses, assessing 28 project characteristics against a database of 2,000 projects compiled by the Helmsman Institute, a project consulting company in Australia. Consulting company Capers Jones, which specializes in software quality, offers a tool to assess project defects, called RiskMaster. Oracle, SAP and other enterprise software vendors are now putting more concerted effort into helping customers run more successful projects, according to Michael Krigsman, a consultant who studies IT failures.
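At their core, tools like these score a project's characteristics and compare the result against a baseline drawn from past projects. As a purely hypothetical sketch (the characteristic names, weights and thresholds below are invented for illustration, not Deloitte's, Helmsman's or Capers Jones's actual models), the basic mechanism looks something like this:

```python
# Hypothetical project-risk scoring sketch. The characteristics, weights,
# and bands are invented; real tools assess dozens of characteristics
# against databases of thousands of completed projects.

# Each characteristic is rated 1 (healthy) to 5 (troubled) by assessors.
WEIGHTS = {
    "governance": 0.35,           # weak steering sinks projects
    "stakeholder_buy_in": 0.30,   # users who reject the change
    "goal_clarity": 0.20,         # vague goals, vague success criteria
    "vendor_dependence": 0.15,    # outside parties you can't control
}

def risk_score(ratings: dict) -> float:
    """Weighted average of 1-5 ratings; higher means riskier."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

def risk_band(score: float) -> str:
    """Bucket a score the way a dashboard might color-code it."""
    if score < 2.0:
        return "low"
    if score < 3.5:
        return "elevated"
    return "high"

project = {
    "governance": 4,
    "stakeholder_buy_in": 3,
    "goal_clarity": 4,
    "vendor_dependence": 2,
}
score = risk_score(project)
print(round(score, 2), risk_band(score))  # 3.4 elevated
```

The output is a chart-ready number, which is exactly the point the consultants make below: the tool produces the score, but people still have to decide what it means and what to do about it.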
CIOs might want to use such tools before jumping into cloud computing, where the risks can be just as high as the rewards. Perhaps those running mission-critical systems on Amazon’s cloud, gun-shy about last year’s outage, might make good use of predictive analytics. But predicting project failures will never be as easy as pouring data points into an analysis tool to create a foolproof plan. You still need sharp-thinking humans to figure out what’s really going on.
Analytics may produce some nice charts highlighting trouble spots, but people must interpret them to devise a fix-up plan, as Scott Wallace, a director at Deloitte, told me.
Sure, Deloitte would say that, right? Billable hours and such. But the point is solid. Regardless of whether the interpretation is done by outside consultants or internal staff, CIOs have to combine art (human smarts) and science (analytics) in the right proportions to rescue a project or plan one correctly from the start. One rather large reason: The CIO understands the company’s other priorities, as well as the personalities involved.
When I ran the idea of tools for predicting failure by Tom Nealon, a former IT and corporate strategy leader at JC Penney, he was skeptical. Nealon has a long history as a game-changing CIO at Southwest Airlines and Frito-Lay, among other companies. He said that analytics can help show you the risk profile for a given project but not truly predict what will happen because success has more to do with leadership and what else is going on at the company. True enough.
Failure or success often depends on projects coming together, not on any one outcome, he said. “Few projects stand alone.”
At Deloitte, Wallace is using the tool to help a U.S. bank quantify the risks in its entire portfolio of active IT projects. He tells me that one useful byproduct of applying software to the touchy subject of failure is that data defuses emotions. After all, it’s the analytics tool that spits out the bad news, not a person. Acceptance of the message “has gone up dramatically,” he said. “It’s taken a lot of suspicion off the table.”