by Edward Yardeni

IT: The Productivity Miracle

May 15, 2001

I sympathize with Maximus. The great and victorious Roman general, played by Russell Crowe in Gladiator, is tired of fighting and longs to return to the peace of his farm and family. Instead, the fictitious farmer soldier is wrongly accused of crimes against the state and enslaved as a gladiator who must fight to the death to right the wrong.

As one of the first proponents and a long-time defender of the view that IT has brought us enormous productivity gains in recent years, I feel like Maximus: I can no longer sit back and let the latest round of naysayers go unchallenged. These barbarians at the gate seek to prove that the new economy is a house of cards, not a solid fortress. It collapsed along with the Nasdaq, they say. The productivity gains from technology have been overrated, they argue, and in any case, are over. Our collective crime was irrational speculation and overinvestment. The punishment is a prolonged period of economic stagnation.

Allow me to differ. A decade ago, I predicted that competitive forces unleashed by the end of the Cold War would mean that the pricing environment would get tougher for companies. To boost profits, they would have to cut costs and raise productivity. At the time, the idea that there could be a rebound in productivity was widely viewed as far-fetched. After all, the nonfarm productivity of American workers rose a meager 1.4 percent per year on average from 1973 to 1995. Technology wasn’t helping to improve the situation, or so the argument went. Economists were baffled by this lackluster productivity performance and called it the Productivity Paradox.

The Miracle Begins

But in 1996, the Productivity Paradox became the Productivity Miracle as the rise in output per hour in the nonfarm business sector of the economy jumped to 2.9 percent per year on average through 2000. Now, 2.9 percent may look like a small number, but it compounds into big gains over time. If productivity grows at an annual rate of 2.9 percent, living standards double in 24 years, or one generation; at 1.4 percent productivity growth, living standards take almost 50 years, or two generations, to double.
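The doubling times quoted above follow directly from compound growth: the number of years needed to double is ln 2 divided by ln(1 + g), where g is the annual growth rate. A quick check of the article's figures:

```python
import math

def years_to_double(growth_rate):
    """Years for living standards to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + growth_rate)

print(round(years_to_double(0.029)))  # 24 -- about one generation at 2.9%
print(round(years_to_double(0.014)))  # 50 -- about two generations at 1.4%
```

Both results match the figures in the text.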

Here’s how many economists explain this jump. While companies have been investing in computers and software for decades, it took a while for them to figure out how to use these tools not just as a convenience, but to change the nature of people’s jobs. At first, their effect was too negligible to show up in productivity statistics. But by the second half of the 1990s, as each innovation built on the one before, the aggregate effect of IT on productivity became pronounced enough to register in national statistics, hence the 2.9 percent jump in output per hour.

Given such evidence, I thought my days of feeling like Maximus would surely be over. After all, the correlation between technological innovation and productivity gains has been trumpeted by no less an authority on the economy than Federal Reserve Chairman Alan Greenspan and embraced by the Congressional Budget Office (CBO). On more than one occasion, Greenspan has declared that American workers are much more productive than they used to be, and the reason is the proliferation of information technology. And he firmly believes that this productivity will only increase. In his congressional testimony on Jan. 25, 2001, the Fed chairman endorsed the CBO’s projections that the federal budget would reach $6 trillion in surpluses during the next 10 years, assuming that productivity will continue to grow at a 2.5 percent pace. He added, “As I have argued for some time, there is a distinct possibility that much of the development and diffusion of new technologies in the current wave of innovation still lie ahead, and we cannot rule out productivity growth rates greater than [those] assumed in the official budget projections.”

Federal Approval

The folks at the Commerce Department would agree. In a recent report, they argue that the official data may still understate the effect of high-tech innovations on economic growth. And they propose a different method of valuing the output of the computer industry: adjusting the raw dollar figure of computer sales, because computers are constantly improving in power and quality. In 1999, for instance, final sales of computers were $92.5 billion. By taking into account the enhanced value of each new computer, the Commerce Department came up with the figure of $245.9 billion, which is what it uses in reporting the contribution computers have made to the GDP. Using this measure, industries that are heavy users of the new information technologies, such as education and certain financial services, have generated significant productivity growth.
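In effect, the adjustment scales nominal computer sales by a quality index to capture the growing power of each machine. Using the 1999 figures above, the implied quality multiplier works out to roughly 2.7. (This back-of-the-envelope calculation is only an illustration of the arithmetic, not the Commerce Department's actual hedonic methodology.)

```python
nominal_sales = 92.5       # final sales of computers, 1999 ($ billions)
adjusted_output = 245.9    # quality-adjusted figure used in GDP ($ billions)

# Implied quality multiplier: how much extra value the adjustment attributes
# to each nominal dollar of computer sales.
quality_multiplier = adjusted_output / nominal_sales
print(f"{quality_multiplier:.2f}x")  # prints "2.66x"
```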

The skeptics, of course, see it differently. They see a Productivity Mirage. Indeed, this is the title of an article by the always thoughtful John Cassidy, a reporter who frequently writes about economics. In the Nov. 27, 2000, issue of The New Yorker, Cassidy observes that virtually the entire productivity miracle has been concentrated in the technology sector. In other words, computers have created productivity gains for the computer industry but have no significant influence elsewhere.

Robert Gordon, a professor of social sciences at Northwestern University, makes a similar argument in these very pages. Even within the high-tech sector, he argues, permanent productivity growth will turn out to be illusory.

Such arguments are sensible and espoused by very civil counterrevolutionaries. We all agree that productivity is hard to define and harder to measure. No wonder, therefore, that reasonable men and women can look at the same phenomenon and see it differently: as a paradox, a miracle or a myth.

What’s in Your Paycheck?

At the end of the day, however, what matters is how much money we bring home. That is the reason productivity is so important. It is the source of our standard of living. Productivity determines our purchasing power and our real incomes, which, in turn, determine how much we can buy. This concept was most simply and clearly explained by Adam Smith in The Wealth of Nations (1776). According to Smith, prosperity is based on the division of labor, or “outsourcing” as we say today. Thanks to the butcher, the baker and the candlestick maker, the rest of us can do what we do best, thus maximizing our productivity and our pay.

The fact is that real wages and salaries per worker rose during the 1990s to a record high, the fastest pace since the 1960s, after stagnating along with productivity from the mid-1970s through the mid-1990s. This is proof positive that there really has been a rebound in productivity and that it has been more widespread than the doubters claim. The prosperity of the previous decade was not limited to a few high-tech nerds with stock options while the rest of us flipped hamburgers. On the contrary, more than half of the 24 million jobs created since 1987 were for professional and management positions, where pay is relatively high.

No one disputes the fact that there has been productivity growth within the IT sector itself. Consider, for instance, that between 1995 and 1997, workers at computer manufacturers such as Dell and Compaq raised their productivity at an annual rate of 41.3 percent. There is persuasive anecdotal evidence that IT has also boosted productivity in other industries and will continue to do so. Federal Express, Wal-Mart and Citigroup could not even exist today without the information technologies that empower their managements to run such huge global organizations so successfully.
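To give that reported 41.3 percent annual rate a sense of scale: compounded over the two years from 1995 to 1997, it implies output per hour at those manufacturers nearly doubled.

```python
growth = 0.413            # reported annual productivity growth, 1995-1997
years = 2                 # two years of compounding

cumulative = (1 + growth) ** years
print(f"{cumulative:.2f}x")  # prints "2.00x" -- output per hour roughly doubles
```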

British Telecom claims that procuring goods and services online will reduce the average cost of processing a transaction by 90 percent and reduce the direct costs of goods and services it purchases by 11 percent. International Paper Co. and Motorola plan to put microchips in cardboard boxes, a big step toward eliminating bar codes and ultimately bringing the entire manufacturing supply chain online.

I agree with Greenspan: The best may be yet to come. In his January testimony, he observed, “Economists have long noted that the diffusion of technology starts slowly, accelerates and then slows with maturity.” The extraordinary plunge in the cost of computing and telecommunicating suggests that information technology will proliferate at a faster pace in the years ahead. As technology becomes cheaper it can be used by more of us to increase our productivity and our prosperity.

Edward Yardeni is chief investment strategist at Deutsche Banc Alex.Brown.