For many CIOs, the budget story has not been a happy one these last several years. The economic downturn that followed the dotcom meltdown, 9/11 and the high-profile accounting scandals that led to the Sarbanes-Oxley Act negatively affected IT budgets—a shock to IT leaders after the go-go, profligate nineties. Now IT budgets are beginning to grow again, but under intense scrutiny from executive management, which wants proof that all those IT dollars actually redound to the bottom line. The risk is that while CIOs struggle to provide the business with evidence of IT’s value—as well as its fiscal responsibility—they may cut through any remaining fat in their budgets right into the bones that support their enterprise’s enabling technologies.
This risk, and the fear that comes with it, brings back bad memories of the days when IT was regarded as a mere cost to contain and a part of operations, notes Howard Rubin, president of the consultancy Rubin Systems and a research associate at MIT’s Center for Information Systems Research. That cost focus changed in the 1980s when IT became part of business strategy and the fiscal discipline imposed on IT investments was somewhat reduced. “Then, in the 1990s, companies became technology day traders—profits were rising and it was very easy [to] buy stuff,” Rubin says. “But when the bubble burst in 2000, companies said that those investments had done nothing for them, so they cleaned up their portfolios. Technology,” Rubin suggests, “is once again viewed as a cost.”
If true, that puts CIOs in a difficult position. “If IT is just a cost, you want to cut it,” notes Rubin. But that thinking forces CIOs to slash costs while at the same time responding to another demand coming from the executive suites: to innovate and thereby grow the business.
Closer to the Line