Find out why measuring the TCO might not matter as much as you thought
Discover what influences the total cost of ownership
See how seat management and thin-client technology can reduce TCO
Total cost of ownership (TCO), the concept that has long had CIOs tearing their hair out, started with Gartner Group. Since the mid-1990s, embattled CIOs have been trying to get a handle on their organizations’ proliferating desktop computer TCOs. But to what effect? In 1996, when Gartner announced that the average Windows 95 desktop cost a whopping $10,000 a year to own, businesses began to wake up to the fact that their machines were costing roughly five times the purchase price to maintain.
And today? That figure remains unchanged. Although the desktops in question are easier to manage and have more efficient operating systems with a lower TCO, the corporate world in which those desktops are used has grown more complex. The result is one step forward, one step back.
But does it matter? Heresy though it might appear, the most important aspect of TCO may not be the actual dollar value that it appears to cost an organization, but an awareness that there is a cost. CIOs need to keep looking for methods to reduce it, such as seat management, where the provision of end user computing facilities is outsourced to a third party that takes on the task of procurement, installation, configuration, maintenance and help-desk support, all at a fixed price-per-seat. In some ways, TCO is a bit like the federal budget deficit: It’s a big number, but the exact size doesn’t matter as much as whether it’s going up or down.
There’s another fundamental problem with trying to attain a black-and-white amount. “To actually get a true TCO figure is extremely difficult,” says David Masding, director of operations at the Manchester, England, headquarters of the National Computing Centre, a British independent nonprofit research institute. “There’s an almost inexhaustible list of things you have to include.”
Just look at all that gets rolled into TCO: the direct costs of user support, hardware maintenance, software updates, training, lost productivity while users (and coworkers) try to figure out what’s gone wrong, security, downtime, administrative costs and a host of other headings, including depreciation and finance charges. With a laundry list that long, coupled with the rising cost of hiring, it’s no surprise that a desktop’s annual TCO quickly climbs to about $10,000. Technical advances drive it down, but the people-related costs in the calculation push it obstinately back up.
Determining some of those costs isn’t as straightforward as you might imagine. Not only can it be hard to pinpoint how to apportion even the easy-to-measure items (the time your purchasing department spends acquiring desktop PCs, the time your help desk spends sorting out end user fumbles), but some costs are almost impossible to measure. These gray areas include the cost of downtime, peer-to-peer support and time spent on user-solvable problems. Time and again, TCO models show that a large share of the overall TCO lies in such imponderables: typically around 50 percent, if not higher.
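To see how such a roll-up behaves, here is a minimal sketch in Python. Every cost category and dollar figure below is an invented placeholder, chosen only so the totals echo the numbers in this article; this is not Gartner’s actual model.

```python
# Illustrative per-seat annual TCO roll-up.
# All categories and figures are hypothetical placeholders.
direct_costs = {
    "hardware_depreciation": 800,
    "software_licenses_and_updates": 600,
    "user_support_and_help_desk": 1500,
    "administration": 900,
    "training": 700,
}
indirect_costs = {  # the hard-to-measure "people" costs
    "downtime_and_lost_productivity": 3200,
    "peer_to_peer_support": 1300,
    "user_solvable_problems": 1000,
}

def tco_per_seat(direct, indirect):
    """Total annual cost of ownership for one desktop seat."""
    return sum(direct.values()) + sum(indirect.values())

total = tco_per_seat(direct_costs, indirect_costs)
indirect_share = sum(indirect_costs.values()) / total
print(f"TCO per seat: ${total:,}")              # TCO per seat: $10,000
print(f"Indirect share: {indirect_share:.0%}")  # Indirect share: 55%
```

With these made-up inputs, the indirect, people-related bucket accounts for 55 percent of the total, in line with the 50-percent-plus share that TCO models typically attribute to the imponderables.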
Because many of these factors boil down to people issues, and people’s time, rather than IT practice, putting a dollar value on TCO may be misleading. While cost can be a valid way of measuring technological complexity, it may be the wrong metric for manageability, Masding believes. For example, time may be a better metric for measuring the effort involved in maintaining the usability of desktop systems. “The price mechanism provides a useful, if far from perfect, way of identifying the value associated with time or other resources, as long as we don’t confuse the resulting figures with actual payments of any kind,” says Masding.
Bullish on Dollar Metrics
Not everyone agrees that accurately measuring TCO in dollar terms is beyond reach. Officials at retail giant Sears, Roebuck and Co., for one, have wholeheartedly taken up the challenge. Ken DeWitt, vice president of IT planning and integration at Sears’s Hoffman Estates, Ill., headquarters, believes it is possible to put a tangible dollar cost on an organization’s TCO. Consequently, DeWitt has embarked on a major program to first measure and then minimize the TCO of Sears’s servers and their associated infrastructure. Compared with the company’s 30,000 desktops, the servers represent lower-hanging fruit, he believes, and are also easier to measure in terms of TCO. “The servers are more stable,” DeWitt says, because the software on them changes less frequently than on desktops and because end users can tweak desktops, but thankfully not servers.
Nevertheless, the project, underway for just under a year, is no sinecure. There are 3,500 Sears locations, many with different infrastructures. Not only is the information widely scattered, it’s also vague: Knowing that an activity goes on is not the same as knowing how frequently it happens, for how long and who is responsible. “Starting off there are a lot of gray areas,” DeWitt acknowledges. Key difficulties include support, administration and maintenance activities that consume fractions of many individuals’ time, and operational costs buried in big IT budgets. On the other hand, he adds, once the facts have been pinned down, it’s easier to keep track of them. “Measuring what you’ve got today is a lot harder than measuring the TCO going forward—[when] elements such as roles, responsibilities, tasks, processes, leases and licenses become more tightly specified.” DeWitt adds that the project is going to be a multiyear endeavor, “and there’s an awful lot of probing and digging that needs to be done. But can we save costs and improve the way that we use our resources? Based on the studies we’ve done, the answer is ’Yes, very definitely.’”
Maybe so, but most folks liken measuring the TCO of desktop computers to nailing Jell-O to a wall: easier said than done. But with desktops and laptops now ubiquitous, CIOs continue to search for new approaches.
Hence the temptation to use TCO shortcuts. For those with neither the time nor the inclination to buckle down to a long data-gathering exercise, other possibilities are available. One option is to rely on generic TCO benchmarking profiles supplied by IT consulting companies. These profiles attempt to match an organization’s IT system and infrastructure with those of comparable companies and estimate a likely TCO accordingly. Using a software tool like Gartner’s TCO Manager, a user selects the number of desktops, operating systems, user applications and network architecture from a menu of options and, presto, out comes the TCO. Or at least, it might be.
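Conceptually, a menu-driven benchmarking tool of this kind reduces to a lookup-and-adjust calculation. The sketch below is purely hypothetical: the baseline figure, the profile factors and the per-application adjustment are all invented for illustration and imply nothing about Gartner’s real formula.

```python
# Hypothetical sketch of a profile-based TCO estimator, in the
# spirit of menu-driven tools like Gartner's TCO Manager.
# All baseline figures and multipliers are invented for illustration.
BASE_TCO = 10_000  # assumed industry-average annual TCO per desktop, dollars

OS_FACTOR = {"win95": 1.00, "win_nt": 0.90, "thin_client": 0.55}
ARCH_FACTOR = {"standalone": 1.05, "lan": 1.00, "managed_lan": 0.85}

def estimate_tco(desktops, os, architecture, apps_per_desktop):
    """Estimate total annual fleet TCO from a coarse hardware/software profile."""
    per_seat = BASE_TCO * OS_FACTOR[os] * ARCH_FACTOR[architecture]
    per_seat *= 1 + 0.02 * apps_per_desktop  # each extra app adds complexity
    return desktops * per_seat

fleet = estimate_tco(500, "win_nt", "managed_lan", apps_per_desktop=10)
print(f"Estimated fleet TCO: ${fleet:,.0f}")
```

The limitation the article goes on to describe is visible in the structure itself: the answer is only as good as how well these canned profiles happen to match your organization.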
Many IT consultancies offer this service, but such software packages carry a base price of anywhere from $20,000 to more than $50,000, depending on how much consulting assistance is required to interpret the results.
While Gartner and similar companies undertake such comparisons on behalf of clients, Gartner Research Director Peter Lowber counsels a more rigorous approach. “Do your own data collection,” he urges. “Don’t rely on our generic numbers, but invest your own time in gathering data for your organization.”
Other consultancies recommend still another TCO shortcut: Skip the figuring-it-out stage, and jump straight to the doing-something-about-it stage. A favorite option: management tools, such as Hewlett-Packard OpenView, IBM’s Tivoli TME10 or Microsoft’s Zero Administration Kit, that help users standardize and simplify such routine chores as software upgrades and onsite user support. “This was one of the advantages of mainframes that got lost in the push to client/server, and anything that can be done here helps enormously,” says Randy Covill, a senior analyst at Boston-based AMR Research. “If IS has to come by and upgrade my machine, it doesn’t just cost the technician’s time—but I’ve had a 15-, 30- or 60-minute disruption to my schedule too.”
Even so, such tools will never be a panacea. Think back: Desktop management tools have been around almost as long as desktops, and annual TCO is still around the $10,000 level. Remember Norton UnDelete, the handy management tool that helped users recover files they had accidentally erased? Year by year, the tools get more sophisticated, but so do the desktops and the problems that arise.
In any case, the scope of such tools is limited. “Using the best tools won’t do much if the best practices aren’t there,” says Lowber. What’s more, adds the National Computing Centre’s Masding, three-quarters of a typical business’s desktop TCO results from management issues rather than technology problems—rendering it largely impervious to the effects of such tools. No one’s saying tools are bad things; they simply don’t tackle the whole problem or even a decent-size chunk of it.
Given that many of these issues ultimately boil down to people issues rather than IT practice, says Masding, it may be better to develop stated policies on such things as e-mail usage, Internet access, private use, recreation and personal customization, all of which have a bearing on TCO. “It’s about the management of people as much as it’s about the management of IT,” he says.
The Outsourcing Route
The challenge of managing the people component has led some companies to opt for outsourcing alternatives like seat management. Unless a business believes it is worse than average at providing cost-effective end user support, training and maintenance, the savings needed to cover a third party’s profit margin can logically come only from economies of scale, generated by pooling many companies’ desktops under one provider. “I’m not saying that seat management reduces the cost,” says Masding, “but at least it makes [the cost] more visible.” And at Gartner, says Lowber, “we’re very careful about [recommending seat management to clients]—it’s a case-by-case decision, and it’s not something that we have a black-and-white view on.”
Inevitably, seat management’s proponents see the concept as making a very positive contribution to lowering TCO—without necessarily having to quantify it in the first place.
At NASA, for example, the decision was made to outsource the maintenance of desktops in order to focus on the space agency’s core competencies. “We were trying to refocus civil servants on NASA’s mission, rather than having them support computers, and also trying to buy in commercial best practices [in] desktop support,” says Program Manager Mark Hagerty at the organization’s Greenbelt, Md.-based Goddard Space Flight Center. The Outsourcing Desktop Initiative for NASA (ODIN), which started nearly two years ago, has come to be recognized as a significant real-world test of seat management. Under the initiative, NASA has so far outsourced the management of some 27,000 desktop seats to third parties, along with the maintenance of 42,000 of its phones.
Although reduced TCO wasn’t the only objective of the switch from internal support to seat management, it was one of the goals, explains Hagerty. ODIN covers seven types of NASA’s scientific and engineering workstations, including high-end Wintel, Macintosh and various Unix systems. Scientific and engineering systems make up 52 percent of NASA’s installed base, but only 3 percent of those systems run Unix.
At the time the ODIN contracts were awarded in June 1998, sample data collected from 11 NASA installations pointed to a TCO-per-desktop of just under $3,000 annually, says Hagerty. Within the agency, the view was that this amount could be reduced, although with hindsight he believes the small sample made the $3,000 figure probably too low. When contract delivery formally began in November 1998, NASA centers could select an outsourcing company from a list of seven ODIN-approved vendors, according to each center’s geographic location and the level of service it required. The decision to approve seven vendors rather than one was deliberate, adds Hagerty. “We were very clear that we wanted competition” in order to get the best available rates, he says.
The vendors included Computer Sciences Corp., Federal Data Corp., OAO Corp., Science Applications International and Wang. The ODIN contract allows NASA to purchase seat management services from the vendors for 12 years, in four three-year periods. A given vendor could manage a desktop for three years and then have the contract passed to another vendor if NASA so chose, a useful way of keeping the service providers on their toes.
The success of this strategy can be judged from the impact the arrangement has had on NASA’s TCO. Hagerty points to a reduction of between a quarter and 40 percent: Compared with the $3,000 baseline, the fees paid to the seven vendors range from $1,800 to $2,200 per computer, per year, according to the service level. More important, he points out, NASA now has key information it didn’t have before, such as an excellent inventory of its assets, a single number to call for service and certainty about the cash cost of operating each desktop.
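The per-seat arithmetic is easy to check against the baseline and fee figures quoted above:

```python
# Savings on NASA's ODIN seat-management fees versus the
# agency's measured $3,000-per-desktop annual baseline.
baseline = 3000                  # pre-ODIN TCO per desktop, per year
fee_low, fee_high = 1800, 2200   # per-seat vendor fees, by service level

best_saving = (baseline - fee_low) / baseline    # cheapest service level
worst_saving = (baseline - fee_high) / baseline  # richest service level
print(f"Saving: {worst_saving:.0%} to {best_saving:.0%}")  # Saving: 27% to 40%
```

At the $2,200 service level the saving is about 27 percent of the baseline; at $1,800 it reaches 40 percent.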
But as undoubtedly valuable as this information is, the ODIN work still represents an attack on only the technology aspect of TCO, not the human or management cost. In other words, costs such as downtime, training and reduced user productivity are excluded. To shrink the costs in these areas, it’s necessary to find some way of dramatically simplifying the desktop’s complexity: The greater the number of applications that reside on a machine, the more complex the installation. And although better tools, better training, better personnel policies and better technology can all reduce the human intervention required, it’s hard to deny the appeal of eliminating that complexity altogether.
How In Is Thin?
Slashing TCO was one of the chief objectives of the much-hyped network computer (NC), a slimmed-down, dumbed-down $500 desktop promoted by Oracle Corp. CEO and Chairman Larry Ellison in the mid-1990s. The logic was simple: Applications and complexity would reside on cheap-to-maintain servers, while users would operate devices that acted like normal desktops but lacked their complexity. No floppy disk drives, a minimal hard drive and zero customizability. More than 100 million devices would be sold, forecast Ellison, who saw them as a way not only of besting archrival Microsoft Chairman Bill Gates, but also of boosting demand for core Oracle server-based products such as databases.
The idea fizzled, with just 20,000 of the devices ever produced, although Ellison has since funded a similar NC venture by taking a stake in San Francisco-based startup New Internet Computer Co. at the end of 1999. But history, of course, could repeat itself. “Larry underestimated the drag effect of a whole legacy infrastructure of applications,” says General Donald Walker, CEO of Reston, Va.-based network security company Veritect, a subsidiary of Veridian Corp. During his tenure as CIO of United Services Automobile Association in San Antonio, Texas, Walker conducted a trial network computer installation in 1998 (see “NC or not NC?” CIO, March 15, 1998) and saw firsthand some of the difficulties.
“Larry would always say, ’It’s no big deal,’ but you really do have to look at the business case for rewriting all those legacy applications just to run them on NCs. And if it’s just a reduced TCO, then it won’t wash,” says Walker. Despite this flaw in the logic, he concedes, the idea behind the NC still has merit—as evidenced by the number of me-too Internet devices. “The overall concept of a thin client didn’t bomb, but the clean sheet of paper represented by the NC did bomb,” he says.
And indeed, there’s a much greater acceptance of thin-client devices today. With more and more companies finding themselves with mixed environments characterized by usage-specific form factors, wireless devices and multiple operating systems, CIOs are becoming more comfortable with the notion of thin clients. Gartner’s Lowber points out that the more varied the end user computing environment becomes, the more thin-client technologies, rather than seat management, appear to be the answer.
But even in a straight desktop environment, thin-client technology looks to be able to make significant inroads into a business’s TCO because of its much greater inherent simplicity, irrespective of whether the definition of TCO is the narrow, technology-based one, or the broader people-based one.
Since the ongoing maintenance and upkeep of an organization’s desktop PCs is such a large chunk of TCO, some companies are finding that reducing the number of PCs in use is an increasingly attractive notion. While approaches like seat management have their place in certain instances, thin-client technology can improve an organization’s ability to monitor and reduce TCO.
Which may, in the end, be the biggest lesson of all to emerge from the past decade’s focus on TCO. It’s big, it’s staying big, and most approaches to reducing it amount to treating the symptoms rather than the disease. Thin-client technology, for all its promise, is still widely regarded as unconventional. But for those bold enough to adopt it, the payoff is clear.