As we continue to shovel away the debris from the Y2K and Internet buying binges of a few years ago, CIOs in a variety of industries are discovering something remarkable. Beneath the rubble, a dramatically different model for business computing is emerging, a shift as profound as the move from mainframes to client/server in the 1980s. Whereas PCs made it possible to distribute both applications and data closer to their users, the next-generation architecture will distribute even smaller units of software over the Internet, not just to distant users but to destinations such as equipment on the factory floor and packages on store shelves. That capability will create a new class of information products and services that will interact with each other across organizational boundaries using sophisticated messaging and security protocols. Data processing will become even more tightly connected to business processes, designed to scale up or down quickly as conditions require and supported by new kinds of outsourcing relationships with hardware, software and communications vendors. Sports fans might think of it as “extreme computing.”
The individual movements that fuel this next-generation architecture scenario have been percolating for some time. But it’s their coming together within the next two to five years that will create the profound new landscape in which companies will do business. Organizations adopting next-generation architecture will realize substantial reductions in development and maintenance costs, and participate in new forms of information exchange up and down the supply chain that will create new sources of value.
The good news is that much of the investment you’ve made in the past five years in networks, hardware and applications will turn out to yield unexpected dividends as part of this new architecture—not just productivity savings but, more important, competitive advantage and perhaps even new products and services.
The bad news is that, like earlier large-scale shifts in computing architecture, the transition will be accompanied by plenty of trauma. To take maximum advantage, your company will need to make more investments in infrastructure, rethink some applications and gear up for a major transformation of your IT skills.
The next-generation architecture emerges from the intersection of three important trends in business computing:
Cheap computing. Following Moore’s Law, computing power continues to get dramatically faster, cheaper and smaller all the time. Not only can computers do more things, more things can be turned into computers, including consumer electronics, appliances and, increasingly, individual products and packaging.
Distributed processing. Applications are being distributed closer and closer to the business functions they support, another long-standing trend. In the coming wave of distribution, processing will move away from human users to the objects in a commercial transaction. It won’t be the driver of the truck who scans the package and uploads the data, in other words, but the package itself, which will send data continuously throughout its lifetime. A new set of business relationships will run parallel to the movement of goods and services, collecting and communicating information about every stage of every transaction—an important byproduct of the next-generation architecture I have called the “information supply chain.”
Openness. Turbo-charging these dramatic changes is one unexpected development of the past decade: the unprecedented spread of TCP/IP, XML, MP3, and other nonproprietary networking and data communications standards known collectively as the Internet. Despite the dotcom meltdown, open data communications standards continue to evolve and take hold, creating the foundation for some of the most exciting new applications of next-generation computing.
Seven Pillars of Architecture
The next-generation architecture is made up of the following seven key elements.
1 Reusable software components.
Application software will continue to grow away from the monolithic, hard-to-maintain masses of code we’ve known in the past toward smaller components that communicate with each other to complete particular tasks. Generic and company-specific elements can be mixed and matched without undermining the overall design. Instead of exchanging data using brittle application program interfaces, applications will exchange standard documents such as orders and invoices and extract the data they need behind the firewall.
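To make the document-exchange idea concrete, here is a minimal sketch of an application receiving a standard XML order and extracting only the fields it needs, rather than binding to a partner's internal API. The element names and values are illustrative, not drawn from any particular industry schema.

```python
import xml.etree.ElementTree as ET

# An illustrative standard order document; the element names are
# hypothetical, not taken from any specific standard.
ORDER_XML = """
<Order>
  <OrderID>PO-1001</OrderID>
  <Buyer>Acme Retail</Buyer>
  <Line>
    <SKU>WIDGET-42</SKU>
    <Quantity>12</Quantity>
    <UnitPrice>3.50</UnitPrice>
  </Line>
  <Line>
    <SKU>GADGET-7</SKU>
    <Quantity>5</Quantity>
    <UnitPrice>10.00</UnitPrice>
  </Line>
</Order>
"""

def extract_order(xml_text):
    """Pull out only the fields this application cares about."""
    root = ET.fromstring(xml_text)
    lines = []
    for line in root.findall("Line"):
        lines.append({
            "sku": line.findtext("SKU"),
            "qty": int(line.findtext("Quantity")),
            "price": float(line.findtext("UnitPrice")),
        })
    return {"order_id": root.findtext("OrderID"), "lines": lines}

order = extract_order(ORDER_XML)
total = sum(l["qty"] * l["price"] for l in order["lines"])
```

The receiving application never calls the sender's code; if the sender rewrites its internals, the exchange keeps working as long as the document format holds.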
Enabling technology: Object-oriented software. Object-oriented design tools and execution environments, such as Microsoft’s Visual Studio, will replace traditional development and run-time environments with comprehensive programming, testing and operating environments, and manage large libraries of code. Nonprofit organizations including the insurance industry’s Association for Cooperative Operations Research and Development, as well as software vendors such as customer management provider Chordiant Software, are building document and process objects that are understandable within industries and functions. Another consortium, XML Common Business Library, is working on a standard for XML documents that cross industries. We will finally overcome the long-standing limitations of object-oriented tools’ performance, functionality and critical mass of users. The solution is coming in part from the rapid increase in computing power of both servers and client devices to fuel the added processor requirements of the object-based approach, and in part from major commitments by leading software vendors to build robust development tools and run-time environments. The principal vendor battle pits Microsoft, with its largely proprietary .Net Web services, against IBM’s on-demand computing platform based on Linux and WebSphere open systems. Who will win? IBM is betting heavily on the open architecture of Linux, which has fans in the developer community. Microsoft is targeting the installed base of Windows users, hoping to bring them onto the next-generation platform gradually. For now, both companies need to do a better job of making the business case from the customer’s point of view, especially for customers already under pressure to reduce IT spending growth and focus on short-term productivity gains.
Key impact: Software development becomes a cottage industry. Users will buy software in pieces—some from traditional application and systems software vendors and others from companies specializing in particular business functions, for example, credit scoring or industry-specific legal compliance. Companies will also write their own modules for activities in which they already enjoy a competitive advantage, eliminating the painful and unsatisfying make-or-buy dichotomy of today’s environment. Open-source enthusiasts, both at the system and application software level, will accelerate the spread of reusable and extendible code.
2 Open standards.
To maximize companies’ abilities to collaborate both inside and outside the organization, we will build software and networks on open standards for process, data, user interface and, most important, information exchange.
Enabling technology: Standards. In addition to the next generation of existing Internet standards, including XML, open architectures will play a dominant role for lower-level operating systems, such as Linux, as well as higher-level application-to-application interaction. Over time, application vendors will build on emerging standards to create higher-level rules for interchangeable business documents and processes for managing them. Industry and vendor consortia such as the Web Services Interoperability Organization (WS-I), supported by BEA Systems, IBM, Microsoft and others, as well as the Universal Business Language promoted by Commerce One, SAP, Sun and others are vying to create the winning combination of standards and applications. Each group has a slightly different definition of openness, but all the participants understand that they can sell new products and services to users only when collaboration across the supply chain becomes cheap and standardized. Industry-led groups such as MIT’s Auto-ID Center, which is developing open standards for electronic product codes, and the Uniform Code Council, which is developing business data standards using XML, will likewise play power broker roles in establishing common ground across platforms, industries and companies.
Key impact: The data warehouse will become the data retailer. Open standards will make it possible to unlock the potential of corporate information trapped in your unwieldy or inaccessible databases. More data and tools for sharing it will lead to new sources of productivity within your company, such as business intelligence, and new sources of profitability outside your organization. For example, consolidated data from similar products or data from across the supply chain can provide invaluable market research, which you can use to improve pricing, promotion, new product development and maintenance. Even a few new connections (perhaps to your closest trading partners—key suppliers and major customers) will add tremendous value, but the biggest gains won’t come until major participants across the supply chain make the switch. The struggle to arrive at a common infrastructure will be long and complicated, but ultimately—think railroad gauges, electrical outlets or even rules for driving—a combination of public, private and consumer interests will force a resolution. For now, most CIOs should avoid committing irreversibly to any one of the competing efforts, but should watch the Darwinian struggle carefully, experiment when possible and adopt particular solutions when the business case justifies doing so.
3 Adaptable interfaces.
One-size-fits-none user interfaces will evolve into device-independent interactions that can be customized by users or by the application itself as it learns how different users want to send and receive information. The same user will take on different profiles depending on the device she is using at any given time, be it a PC, telephone, PDA or cell phone. In many cases, devices will communicate directly with each other without the intervention of a user. Next-generation interfaces will be as different from today’s icon-and-window model as that paradigm was from the C:\> prompt of DOS.
Enabling technology: Consumer electronics. During the past 10 years, computing has increasingly migrated to noncomputer devices made and marketed by consumer electronics companies, the best of which are experts at creating natural man-machine interactions. Manufacturers of video game consoles, digital video recorders and cellular technologies understand that consumers need limited but relevant information at home or on the road, even if the same consumer becomes a power user of more complicated applications back at the office. These devices offer far fewer options than a desktop computer, but appropriately so, and they will point the way to effective interfaces that are specific to the device and transparently obvious to the user.
Key impact: Collaboration becomes device-independent. Recognizing that different users will interact with applications using a wide range of devices (many yet to be invented), interfaces will evolve from hard-coded dialogues to real-time requests for information formatted and presented in ways meaningful to the device and the role of the user at any given time. Because of those more natural, adaptive interfaces, the interaction you have with customers, suppliers and employees will become truly collaborative in nature.
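As a sketch of what device-independent presentation means in practice, the function below renders one piece of data differently for each device profile. The device categories, data fields and wording are all assumptions for illustration.

```python
# Illustrative sketch: one piece of data, rendered per device profile.
# The profiles ("pc", "phone", "voice") and the flight-status fields
# are hypothetical, chosen only to show the idea.

STATUS = {"flight": "JB204", "gate": "B7", "departs": "14:35"}

def render(data, device):
    """Format the same information appropriately for each device."""
    if device == "pc":       # full detail for a desktop browser
        return (f"Flight {data['flight']} departs {data['departs']} "
                f"from gate {data['gate']}")
    if device == "phone":    # terse text for a small screen
        return f"{data['flight']} {data['departs']} G{data['gate']}"
    if device == "voice":    # spoken prompt for a telephone interface
        return (f"Flight {data['flight']} is scheduled to depart at "
                f"{data['departs']}. Say 'gate' for gate information.")
    raise ValueError(f"unknown device profile: {device}")
```

The application holds one copy of the data; only the presentation layer varies, which is what lets devices not yet invented join later with a new branch rather than a rewrite.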
4 On-demand scalability.
To better handle the rapid pace with which global businesses buy and sell major assets, the next-generation architecture will be capable of rapidly adding or removing capacity and connecting or disconnecting operating units without breaking the network, applications or databases.
Enabling technology: Computing on demand. Building on the component software architecture and open standards foundation, the next-generation platform will lead to new forms of outsourcing. Not only will some applications be handed over to service providers, but so too network, processing and data storage. (Securing that interaction is another feature of the next-generation architecture; see key element six.) IT services companies, including CSC, EDS and IBM, already control computer operations for global businesses and are investing today in computing centers that can quickly switch on or off major resources as the business needs of their customers change. The component-based software architecture, at the same time, will allow companies to keep operating assets loosely coupled, making it easier to add or remove them. (For more on utility computing, see “Plug and Pay” at www.cio.com/printlinks.)
Key impact: IT will adapt to your company rather than the other way around. Today, one of the major costs in both time and dollars of mergers and divestitures is the fragility of mission-critical IT systems. With an infrastructure built on open standards and supported by on-demand resources, the scalable company will be able to plug and unplug divisions and even acquire competitors without having to wait for applications to catch up with the changes a year or two later. Having converted to a standardized application and desktop environment in the late 1990s, oil giant BP, for one, has leveraged that investment to simplify its acquisitions of Amoco and Arco, and it now runs the business of its former competitors on its cheaper and more reliable platform. The Amoco takeover was completed nearly a year ahead of schedule, in part because BP simply replaced Amoco applications and hardware with its own standardized environment.
5 Compartmentalized application components.
Following a long-standing trend toward separation of data structures, application logic and user interfaces, next-generation applications will manage each component independently, making modifications far simpler. The job of enhancing the next generation of legacy software will not be met with the same dread it is in today’s environment, where even a small change can ripple through to hundreds of other programs, files and screen layouts.
Enabling technology: Web services. Though Web services are still poorly understood, their true promise is to simplify not only application development but, more important, application maintenance. In the model for Web services developed by vendors participating in WS-I, small software modules located anywhere on the Web will be able to interact with each other using standard protocols, making it possible to cobble together computer systems that reflect the needs of your organization, even—perhaps especially—as those needs change. Your IT organization will only have to worry about the pieces that describe functions specific to your business. (For more on Web services, see “Web Services: Still Not Ready for Prime Time” at www.cio.com/printlinks.)
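A minimal sketch of the loose-coupling idea, using Python's built-in XML-RPC support as a stand-in for the richer web services protocols: a hypothetical credit-scoring module is exposed as a network service, and a caller uses it by URL without ever linking to its code. The scoring logic is a placeholder, not a real algorithm.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# A hypothetical credit-scoring module, exposed as a web service.
def credit_score(customer_id):
    # Placeholder logic; a real module would keep its data and
    # rules private behind this same network boundary.
    return 600 + (sum(ord(c) for c in customer_id) % 200)

# Publish the module on a free local port over a standard protocol.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(credit_score)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any application that speaks the protocol can now call the
# function by URL, with no shared code or compile-time linkage.
client = ServerProxy(f"http://127.0.0.1:{port}")
score = client.credit_score("ACME-001")
```

If the scoring vendor rewrites its module tomorrow, the caller is unaffected so long as the published interface is preserved; that is the maintenance payoff the model promises.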
Key impact: Ownership of key IT components will migrate to the organization best suited to develop them. In the rush to present users with Web-based interfaces—and with few development tools available—most companies built first-generation Internet applications that hopelessly entangled data, logic and interface. Connecting the new software to existing systems proved difficult, and adding new functionality or taking advantage of new Internet technologies proved even harder. Most companies will start over in the next two years, adopting a ruthless insistence on true separation. Once they do, the Web services model will flower. Some companies aren’t waiting. Startup airline JetBlue has built all its systems from scratch using Microsoft’s .Net environment. The company claims dramatic reductions in development time, cost and operations overhead, and it’s succeeding despite competing in a notoriously unprofitable industry. (For more about the airline, see “JetBlue Skies Ahead” at www.cio.com/printlinks.)
6 Built-in security.
External threats to systems security, coupled with growing consumer privacy concerns, will figure prominently in the design and operation of next-generation systems, with likely involvement by various governmental agencies from state to international levels (spurred on by privacy lawsuits). Since next-generation applications will reach much deeper into day-to-day activities of consumers, businesses and governments, they will require built-in safeguards far beyond passwords and physical security.
Enabling technology: Agent-based computing. Monitoring software will run side-by-side with applications to test the validity of every request over the global open network of integrated applications. Each transaction will carry its own encrypted security profile and generate a secured audit trail. Autonomous software using agent technology will negotiate in the background to ensure that interactions in the open marketplace of Web services follow the rules. The same technology will be used to manage the millions of interactions taking place across the network, perhaps supported by regulated service providers that offer secured audit trails and public records.
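One way to picture the per-transaction security profile is a keyed signature attached to every message, so a monitoring agent can verify each request and keep a tamper-evident audit trail. Here is a minimal sketch using an HMAC signature; the shared key and transaction fields are assumptions for illustration.

```python
import hashlib
import hmac
import json

# Assumption for this sketch: the two agents share a secret key.
SHARED_KEY = b"illustrative-shared-secret"

def sign_transaction(txn):
    """Attach a keyed signature so any later change is detectable."""
    payload = json.dumps(txn, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "signature": sig}

def verify_transaction(record):
    """A monitoring agent re-derives the signature before accepting."""
    expected = hmac.new(SHARED_KEY, record["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

record = sign_transaction({"order": "PO-1001", "amount": 92.0})
# An attacker who alters the amount cannot forge a matching signature.
tampered = {"payload": record["payload"].replace("92.0", "920.0"),
            "signature": record["signature"]}
```

The signed records double as the audit trail: any party holding the key can later prove which transactions were accepted unmodified.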
Key impact: The integrity of data will become a matter not of engineering but of public policy. Capability-based technology for securing systems, along with encryption techniques, must be introduced into next-generation applications and designed into the highest levels of your network. The combination of anxiety over global terrorism, the increasingly open exchange of data between participants in the supply chain, and the growing unease among consumers about the collection and use of personal information, will move security to center stage, where regulatory agencies, legislators, lobbyists and courts will play a prominent role in design.
7 Disposable computing.
As the cost of computing continues to decrease, it will become cost-effective to introduce some level of intelligence to each individual item in commerce. The Auto-ID Center at MIT estimates that about a trillion new Internet-friendly “devices” will be added to the network in the next 10 years, with chips and radio transmitters simply included as part of a product’s basic packaging.
Enabling technology: Moore’s Law again. Today, companies including Wal-Mart, Procter & Gamble and Gillette are experimenting with cheap electronic product codes that transmit data using radio frequencies. The emerging standard will assign every item—not just every SKU—a unique identifier. And an Internet address.
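The item-level idea can be sketched in a few lines. Note that real electronic product codes are compact binary structures; the dotted layout and field names below are assumptions chosen for readability, not the actual format.

```python
# Illustrative item-level product code: manager.class.serial.
# The layout is hypothetical; real codes pack these fields into
# a compact binary encoding.

def parse_epc(code):
    """Split an illustrative product code into its three fields."""
    manager, product_class, serial = code.split(".")
    return {
        "manager": manager,      # identifies the manufacturer
        "class": product_class,  # identifies the SKU
        "serial": serial,        # identifies this individual item
    }

# Two razors of the same SKU share a class but carry distinct
# serials, so each can be tracked separately through the chain.
a = parse_epc("0037000.112345.400000001")
b = parse_epc("0037000.112345.400000002")
```

That last field is the shift the text describes: today's bar codes stop at the SKU, while item-level codes let the network distinguish this package from the one beside it.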
Key impact: The information supply chain is completed. The emergence of disposable computing will test the scalability, flexibility and security of next-generation architecture by flooding it with trillions of new transactions, sending and receiving small amounts of data that track the flow of goods and services throughout the supply chain. As more and more data becomes available in usable forms—particularly data generated after a product leaves the store—the “information supply chain” will begin to function as an independent source of revenue, generating invaluable data about product performance, consumer behavior and logistics. Look for forecasting applications—which today rest on shaky assumptions—to become scientific and reliable. Bundling information services with physical products such as smart appliances will be another key source of new value. The understandable concerns of consumers over data privacy will be resolved along the way, largely by consumers sharing in the value derived from the use of their data. Today, grocery stores trade significant price discounts for consumer data using preferred shopper identification cards.
Companies that have the most to gain from the information supply chain are retooling right now for the next generation, even when doing so requires significant investments during challenging economic times. Amazon.com, for example, embraced open source in 2002, converting from Sun’s proprietary operating system to Linux. The switch is simplifying the process by which freelance retailers known as Amazon associates can build links to Amazon applications into their websites, using Amazon’s payment, fulfillment and customer service without actually installing the software.
So the real question now is not whether but when the next generation will become your computing architecture. More to the point, given some unavoidable uncertainty about how quickly the next generation will arrive, what, if anything, should you do now to prepare for the change and help your company profit from its potential?
From a technology standpoint, the most important technique for managing uncertainty and hedging against the risk of disruptive change is to adopt a portfolio approach to IT investments. The most common mistake companies make during slow periods is to terminate any project that will take more than a few months to pay off—effectively shorting the future. (For more on IT portfolio management, see “Portfolio Management: How to Do It Right” at www.cio.com/printlinks.)
You must, of course, spend most of your IT dollars on running the business, but you also have to spend to change the business. Divide your budget into short-, medium- and long-term projects, and agree with the rest of your executive team on an appropriate ratio among them (perhaps a percentage allocation of 75-15-10). Then stick to the ratio whether the budget goes up or down.
At the same time, bear in mind that the real obstacles to adopting next-generation architecture, as in previous transitions, will be more organizational than technical. Mastering the seven elements of the next-generation architecture will challenge IT professionals not only to cultivate new technical skills but also to foster a better understanding of their companies’ business. Capturing complex processes in code and identifying opportunities for profit from new data sources will work only if the boundaries between IT and users are as permeable as the data flowing through the information supply chain.
As the next-generation model makes more data available from business partners and consumers, the psychology of privacy and security will take center stage. Data privacy and security concerns—poorly understood today—can be solved but only with industry leadership and public education. Unfortunately, governments are shooting first and asking questions later, if at all. Laws and regulations that are being adopted in haste today will later prove to have created more problems than they solved. As CIOs, you must educate the industry groups and trade associations you belong to on the long-term benefits of robust data exchange across the supply chain so they in turn can work with consumer groups and regulatory bodies to head off as many regulatory disasters as possible.
Most of all, keep your eyes open. Be ready to jump on early opportunities to profit from any of the seven elements of next-generation architecture as soon as you spot them. As the comedy team The Firesign Theatre likes to say, “Live in the future. It’s happening now.”