DAN REED CAN’T TELL YOU exactly what Globus means to you. But Reed, head of a Champaign, Ill.-based research lab called the National Center for Supercomputing Applications, and his colleagues are spending about $10 million in tax money on this project, a computer network for studying the stars.

The project is one facet of a $70 million effort by the National Science Foundation to develop new networking technologies, a jewel in NSF’s $640 million budget for basic IT research. Its immediate aim is to help researchers port huge amounts of cosmological data to colleagues who aren’t lucky enough to live near high-end telescopes like the Keck in Hawaii. While Reed can’t say when the concepts behind Globus (advances in real-time, multilocation collaboration) will bear commercial fruit, he has little doubt that someday, maybe in five years, maybe 10, its findings could spark product innovations that return taxpayers’ investment in ways that are incalculable today.

Academic researchers such as Reed say the federal government needs to fund more long-term, infrastructure-oriented IT research projects like Globus that simply ask whether a thing is possible, without worrying whether it’s profitable. “We don’t know what the next killer technology will be, but if we don’t look for it in every way possible, we won’t ever find it,” says Neal Lane, science adviser to former President Clinton and now a university professor of physics and astronomy at Rice University in Houston.

Following nearly three decades of neglect, Uncle Sam started betting significant money in the past two years on long-term IT research, making projects like Globus national, bipartisan priorities. But proponents of basic IT research fear this support will be short-lived.
In the wake of a 2002 budget that does not earmark much money for basic research, computer scientists who keep track of the government’s spending worry that the Bush administration and Congress won’t put the same priority on long-term IT research (projects taking five years or more) as policymakers in both parties were advocating only a year ago. Commercially oriented research is popular with corporations and, by extension, with politicians. Meanwhile, the administration and lawmakers are putting greater emphasis on other popular research areas, such as medicine and energy.

The end of the Cold War freed funds from the defense budget, where most IT research spending was allocated through the 1980s. But funding choices are also linked to changes in the business cycle, say research budget experts. As the economy slows, there’s less support for research that doesn’t offer a marketable return. This, says Ruzena Bajcsy, assistant director of the National Science Foundation’s Computer and Information Science and Engineering (CISE) directorate, means that without a deliberate emphasis on funding long-term research, it might not get done. “Without a funding focus on these basic projects, we lose a sense of context and long-range planning to our technology endeavors as a nation,” says Bajcsy (pronounced by-chee). “More than ever, this kind of research has to be pushed so we can start thinking about the future.”

The Private Sector’s Short-Term Vision

Everyone knows the story of the Internet. The tale of its invention as part of a government research project is a perfect example of how research discoveries take many years and consistent funding to cultivate. The now famous effort began in 1969 with a $1 million grant from the Department of Defense’s Advanced Research Projects Agency, an amount equivalent to $4.7 million today. It took almost 20 years before researchers realized that the project had vast commercial potential.
It was the largest federally funded basic research project to go commercial. The commercialization of the Internet, starting in the mid-1980s, signaled changing times. With the rise of Japan as a technological superpower and the end of the Cold War, U.S. government spending for applied IT research (three- to five-year projects with clear commercial potential) exploded. According to statistics from the nonpartisan Congressional Budget Office, funds for applied research in computer science skyrocketed from $171 million in 1986 to $757 million in 1996, outpacing money for basic research by a ratio of nearly 3-to-1. What’s more, as private sector companies such as GE, IBM, Lucent and Microsoft saw the value of capitalizing research, they soon petitioned the government for grants to help with their product research. The result: Private sector research agendas dominated the nation’s R&D.

A good example is Murray Hill, N.J.-based Bell Labs, the research arm of telecommunications giant Lucent Technologies. Director of Research Effectiveness Terry McPherson says that when AT&T created Lucent in 1996, the new company committed 11 percent of its annual revenues to R&D through Bell Labs, with another 1 percent of revenue devoted to long-term projects, a ratio that McPherson says is similar to when Lucent was part of AT&T. Today, McPherson says, Bell Labs devotes $4 billion to R&D and nearly 700 researchers to three- to five-year projects with the ultimate goal of commercialization. What little Lucent spends on long-term research also has clear commercial applications: The company is putting that money into wireless technology.

Why do companies shy away from long-term research efforts? Economists say the benefits of basic research are disproportionately weighted against those who invest in it.
That theory, proposed by the late economist Edwin Mansfield in the 1960s, has received renewed attention in a recent book by Lester Thurow, professor of economics at MIT’s Sloan School of Management. In Building Wealth: The New Rules for Individuals, Companies, and Nations in a Knowledge-Based Economy (HarperCollins, 1999), Thurow says that companies that invest in long-term research receive only a 24 percent return on every dollar; society gets the rest, he asserts, from spinoffs enabled by new discoveries such as the Internet. As a result, even private companies that invest in basic research don’t enjoy as many of the benefits as society at large, Thurow says. “There’s no way private companies would support research projects at a loss for 10 or 12 years, especially when there’s no guarantee that they can make back a return on that investment,” he says. “This is where the federal government comes in.”

Keeping the Edge in U.S. IT

No one disputes Thurow. But by the mid-’90s, federal investments in long-term IT research had stagnated, says Cita Furlani, director of the National Coordination Office for Information Technology Research and Development, which coordinates investments in IT R&D across the entire government. Four years ago, the President’s Information Technology Advisory Committee (PITAC), a congressionally mandated panel of academic researchers and IT industry executives, urged policymakers to close the gap. The panel recommended doubling the budget for basic research in IT to $2.7 billion by 2004. In a letter to former President Clinton in 1997, committee members said funding for this type of research was substantially inadequate and that, if left to private industry, little would get done. Without more federal involvement, they feared, basic research would stagnate indefinitely, and the computing infrastructure the United States pioneered would atrophy.
Among their concerns, committee members felt that more research was needed to improve software reliability, infrastructure scalability and processing power. In response, Congress bumped federal funding for research in those areas by $400 million in fiscal year 2000 and another $600 million in fiscal year 2001 (the current budget year), for a total of $1.9 billion in funding.

New Administration, New Agenda

In late January, PITAC member Ken Kennedy, a computer science professor at Rice University in Houston, said he hoped President Bush would propose another $300 million for IT research in his first budget. But when the budget was released this April, Bush signaled that his priorities were elsewhere. He proposed increasing the overall budget for science research by 2 percent but called for no increases in spending on basic IT research. Instead, he asked for major increases in spending on science education and biotechnology research projects. While many such projects, including nanotechnology (the art of manipulating materials on an atomic scale), have IT components, computing isn’t their main focus. Not surprisingly, because there’s a limited amount of money to spend, the president proposed only a modest $56 million increase to the overall NSF budget.

Meanwhile in Congress, the new chairman of the House Science Committee, Rep. Sherwood Boehlert (R-N.Y.), named as his priorities the same research areas as Bush. He didn’t mention IT in his maiden speech this past winter. Boehlert’s committee sets the agenda for federal nondefense research. At press time, it was also unclear whether Bush planned to disband PITAC, although he had extended its charter until June 1, 2003.
While it’s easy to dismiss the importance of a group of unofficial advisers that has no decision-making power, members, who include Irving Wladawsky-Berger, vice president for technology and strategy with IBM’s Enterprise Systems Group, and TCP/IP co-inventor Vinton Cerf, now a senior vice president with WorldCom, point out that their access to the White House and Congress gives researchers’ concerns a platform. Raj Reddy, one of the organization’s cochairs, says that without an academically oriented body to keep the government on task, emphasis on funding for basic research could fade like yesterday’s news. “I don’t care who’s in office,” says Reddy. “Someone has to make sure this type of funding remains a priority.”

The funding crunch isn’t dire. Though President Bush did not propose significant increases to basic research funding, Reddy and his counterparts say existing projects should receive enough money to continue at their current pace. Still, the computer science and robotics professor at Carnegie Mellon University in Pittsburgh says that new projects, as well as some that have barely begun, probably won’t move far ahead without more money. According to George Strawn, executive officer of NSF’s CISE directorate, projects slated for the back burner include research on data modeling using huge data sets and some new forays into advanced technology for the Internet. “These are the projects of tomorrow,” says Strawn. “Without further attention to basic research, though, they might never realize their full potential.” Bajcsy, CISE’s assistant director, agrees: “We understand that the government operates like a family and works with the budget it’s got, but that’s not to say this isn’t hard to swallow.”

Not everyone is pessimistic.
Larry Smarr, director of the California Institute for Telecommunications and Information Technology, a state- and industry-sponsored research center in San Diego, notes that since Bush released his budget, hundreds of government-funded scientists have come out in favor of increasing the research budget for 2002. Smarr also notes that since the late 1970s, increased funding for all kinds of IT research, whether basic or applied, has received bipartisan support. “In the long run, who knows? Bush could come back in 2002 and 2003 and adhere to the PITAC recommendations.” Adds Boehlert: “If you don’t think technology should be a priority, you’re done [in politics]. Everybody thinks this stuff is important.”

In a speech to the American Association for the Advancement of Science in May, Boehlert said he thought some Bush science-budget proposals, including the one for NSF, were “especially disappointing,” but added that he thinks the NSF budget will improve in fiscal year 2003. A week later, Congress proposed spending slightly more overall for science research than the Bush administration did. The exact budget won’t be set until the fall.

Carnegie Mellon’s Reddy maintains, however, that because many politicians still associate IT research with defense projects, they won’t ever see the value of putting more money behind pure scientific inquiry. Meanwhile, Anne Armstrong, president of Virginia’s Center for Innovative Technology, a partially state-funded high-tech incubator, observes that support for basic versus applied research swings with the business cycle. The funding gap between basic and applied research started to shrink in the late 1990s, when times were good. With the economy suffering, Armstrong thinks the federal funding pendulum is bound to “swing the other way,” diverting funds from basic research and providing grants to state governments to bankroll new product development.

A New Funding Partnership

That’s not necessarily bad.
Brett Berlin, chief scientist with WareOnEarth Communications in Annandale, Va., says the best approach to funding high-tech research in the future might be a mix of grants from federal and state agencies. Berlin, a consultant to the Defense Department’s High Performance Computing Modernization program, which does applied research, envisions a scenario in which academics, government officials and private companies combine federal and state funding to pay for both basic and applied research projects. Such a setup, Berlin says, would be the best of all worlds: federal funding for basic research, state funding for applied research, and collaboration with private industry to fund the efforts in between. In the end, he says, the line between projects that deal with pure science and those with a commercial purpose is a fine one, drawn according to how much time a project takes.

MIT’s Thurow says a federal-state-industry partnership could close the profit gap between those who make research investments and those who enjoy the benefits. Even government technologists Bajcsy and Strawn see benefits to this approach, noting that a funding strategy that supports every kind of research is better than one that predominantly supports a single kind. Armstrong says her organization takes this approach now, distributing state and some federal funds for research of all types.

“The role of government is to make sure it is solving and addressing problems at hand,” says Berlin. “Basic research in IT is important, but it’s not the whole ball of wax. Looking forward, that’s something we all should keep in mind.”