New technologies can change the relationship between supplier and customer. New applications can create huge infrastructure implications. And sometimes the best way to learn how to cope with these changes is to face them squarely. To give you early warning, we queried various experts and found five technologies that have significant potential to transform the way your IT department does business in the coming years.
Disruption 1: Multicore Processors
Leading vendors: AMD, Azul, Intel, Sun
Old thinking: One processor in every box
IT impact: New numbers in the “performance, power and cooling” equation
New computing platforms from Intel and AMD have arrived with reduced power requirements that take price and performance to new levels. On the plus side, the newer processors are more energy-efficient and deliver more raw horsepower per watt, so IT departments concerned with their energy bills will be motivated to replace their older servers.
But even the most energy-efficient designs are still an issue for data centers that were constructed long ago around mainframe layouts. The trouble is that these spaces weren’t designed to handle racks of densely stacked distributed processors, which generate far more heat per square foot than the mainframes ever did. This may mean wholesale design changes on the data center floor. “For every watt of server power consumption, we pay twice—first for powering the server and second for the air-conditioning that cools it down,” says Rene Wienholtz, CTO of German Web hosting provider Strato.
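Wienholtz’s “pay twice” rule reduces to simple arithmetic. The sketch below makes the idea concrete; the wattage, electricity rate and one-to-one cooling overhead are illustrative assumptions, not figures from Strato.

```python
# Illustrative sketch of the "pay twice" rule of thumb.
# All figures (wattage, rate, cooling factor) are assumptions for illustration.

def annual_power_cost(server_watts, rate_per_kwh, cooling_factor=1.0):
    """Yearly electricity cost for one server, counting cooling overhead.

    cooling_factor=1.0 means every watt of server draw needs a matching
    watt of air conditioning -- the "pay twice" rule of thumb.
    """
    hours_per_year = 24 * 365
    total_watts = server_watts * (1 + cooling_factor)
    kwh = total_watts * hours_per_year / 1000
    return kwh * rate_per_kwh

# A hypothetical 400 W server at $0.10/kWh:
cost = annual_power_cost(400, 0.10)
print(f"${cost:.2f} per year")  # 400 W x 2 x 8,760 h / 1000 x $0.10
```

Doubling the rack density doubles both terms of that equation, which is why the heat problem lands on the facilities budget as well as the IT budget.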
Disruption 2: Virtualization
Leading vendors: EMC/VMware, Liquid Computing, Microsoft, Sun
Old thinking: PCs running a single OS
IT impact: Faster software development and application deployment, lower costs, and better use of server hardware
The concept behind virtual machines (VMs) is simple to state but hard to implement: Take a single server and divvy it into separate “virtual” machines with their own software-built memory, virtual hardware, drive images and other resources. Virtualization isn’t new: IBM has been doing this on its mainframes for close to 30 years. What is new is that the power of virtual machines can be effectively delivered to the PC platform. And fierce competition in this space is forcing the major players literally to give away pieces of their VM server software.
Why the hot market? For IT, virtualization lets multiple operating systems and applications run on the same box, making it easier to provision new servers as necessary and make more efficient use of hardware. This continues the consolidation trend that began several years ago with blade servers: Think of virtualization as the ultimate result, where many individual servers can now run on the same piece of hardware rather than on individual blades. You save space, you save time, you simplify your IT support structure, and you save plenty of money reusing the same gear.
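The consolidation math above can be sketched in a few lines. The utilization figures below are hypothetical, and real capacity planning has to account for peak loads and failover headroom, but the shape of the savings is the same.

```python
# Illustrative sketch of virtualization's consolidation math.
import math

def hosts_needed(physical_servers, avg_utilization, target_utilization=0.7):
    """How many virtualized hosts could absorb the same total workload.

    Assumes workloads pack cleanly onto hosts up to the target
    utilization; real capacity planning is messier (peaks, failover).
    """
    total_load = physical_servers * avg_utilization
    return math.ceil(total_load / target_utilization)

# 100 lightly loaded servers, each running at roughly 10% utilization:
print(hosts_needed(100, 0.10))  # -> 15 hosts at a 70% utilization target
```

The space, power and support savings the article describes all follow from that ratio: fewer boxes doing the same work.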
But there is more to VM than server consolidation. EMC, for instance, already offers prebuilt virtual machine appliances that come with ready-made applications such as Web, e-mail and database servers that IT can install and have running in minutes—further reducing the time to build out new servers. “We plan to use virtual server management to reduce our server support efforts, minimize downtime and reduce the ongoing costs of server replacement, enabling us to support more hardware with existing staff,” says Karen Green, CIO of Brooks Health System.
“Two years ago, it wouldn’t have been possible to handle [such a heavy] workload in a data center. Now we can, thanks to this new virtualization software,” says Wienholtz.
Disruption 3: RFID
Leading vendors: Reva Systems, ScanSource, Symbol
Old thinking: Automating the production line was all about making cheaper finished goods
IT impact: Dramatic changes in the supply chain and ERP systems will let companies keep closer tabs on inventory and production
Trading information between suppliers and distributors will never be the same thanks to the maturation of radio frequency identification (RFID). While the technology is a decade old, new developments in the integration of supply chain infrastructure, more solid standards, and products such as Reva Systems’ tag acquisition processor have made it easier to integrate RFID data directly into inventory, supply chain and manufacturing systems. Tagged merchandise can be tracked as it leaves finished-goods inventory, travels across state lines, arrives at the loading dock door and gets purchased by a retail consumer—with each step along the way providing real-time information to various systems. And with Wal-Mart and the U.S. Department of Defense making RFID information exchange mandatory for many suppliers, tens of thousands of vendors are implementing RFID to track everything from pill bottles to pallets to people. RFID enables all sorts of applications such as alarms that sound when items are shoplifted, payment systems that don’t require a credit card swipe and automatic employee-access controls surrounding specific sensitive locations.
Older RFID applications were built to locate a particular pallet or track a shipment. Tomorrow’s applications will enable product line managers to track where all of their goods are in the production and delivery process in near real time, letting them closely observe any bottlenecks or supply problems.
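At the software level, that near-real-time view comes from folding a stream of tag reads into per-item state. The sketch below shows the idea in miniature; the reader names and tag IDs are invented for illustration and stand in for the feeds a tag acquisition processor would supply.

```python
# Sketch: folding a stream of RFID reads into per-item location state.
# Reader locations and tag IDs below are invented for illustration.
from collections import defaultdict

def track(reads):
    """reads: iterable of (timestamp, tag_id, reader_location) tuples.

    Returns the last known location per tag and each tag's full trail,
    so a manager can spot where goods are piling up.
    """
    location = {}
    trail = defaultdict(list)
    for ts, tag, reader in sorted(reads):   # order by timestamp
        location[tag] = reader
        trail[tag].append((ts, reader))
    return location, trail

reads = [
    (1, "TAG-001", "finished-goods"),
    (2, "TAG-001", "loading-dock"),
    (3, "TAG-001", "store-42"),
]
loc, trail = track(reads)
print(loc["TAG-001"])  # -> store-42
```

A tag whose trail stalls at one reader for too long is exactly the bottleneck signal the product line managers are after.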
The consequences of RFID are huge, especially the infrastructure implications. “RFID computerizes the edges of the enterprise,” says Marlo Brooke, senior partner at systems integrator Avatar Partners. This means that companies will need to upgrade networking infrastructure—both wired and wireless—as RFID readers are deployed across the enterprise.
Disruption 4: Software as a Service
Leading vendors: Amazon, Google and thousands more you’ve never heard of
Old thinking: Build your apps one at a time from the ground up
IT impact: Mix and match your browser-based applications to create cheap, flexible solutions
The Web has become a solid application delivery platform, transforming the way we can deploy enterprise software. Call it a mashup, Web service, software as a service or service-oriented architecture—it all amounts to the same thing: becoming more flexible and nimble while saving a boatload of money by not having to write code from the ground up for each application.
By using these techniques, says management consultant Rod Boothby, “We are on the verge of experiencing a jump in the capabilities of office tools that is just as significant as the jump that occurred when the first PCs landed on people’s desks.” Instead of picking application partners that have the best prices or series of features, savvy CIOs can order a combination of small-scoped applications that are more appropriate to particular situations yet work well together. Take one part hosted e-mail server, mix in another part Java servlet to process a series of forms, then add an online storage repository from Amazon called by another Web application to configure automated backups and to run batch jobs. All of these can connect to each other via the Internet and may not even reside on your company’s servers.
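The recipe above—hosted forms, a servlet-style processor, online storage—is really just function composition across the network. The sketch below shows the shape of it; the three service functions are stubs standing in for hosted offerings, not any vendor’s actual API.

```python
# Sketch of the mix-and-match idea: small, single-purpose "services"
# composed into one workflow. The functions below are stubs standing in
# for hosted services (forms, processing, storage); no real vendor APIs
# are shown, and in practice each call would go over HTTP.

def fetch_form_submissions():
    """Stands in for a hosted forms service."""
    return [{"name": "Ada", "email": "ada@example.com"},
            {"name": "Bob", "email": "not-an-address"}]

def validate(record):
    """Stands in for a servlet-style form processor."""
    return "@" in record.get("email", "")

def archive(records, bucket):
    """Stands in for an online storage service's batch upload."""
    return f"stored {len(records)} records in {bucket}"

# The "application" is just the composition of the pieces:
valid = [r for r in fetch_form_submissions() if validate(r)]
print(archive(valid, "backup-bucket"))  # -> stored 1 records in backup-bucket
```

Swapping one piece for a competitor’s equivalent means rewriting one stub, not the whole application—which is the flexibility the approach promises.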
The hard part with all this “Web 2.0” talk is understanding how to decompose your particular application into discrete pieces that someone else has already written. To adapt, IT managers must think in layers, just as the Internet is designed with different protocols that distinguish between lower-level transportation and higher-level applications. According to Doug Neal, a research fellow with the Computer Sciences Corp. Leading-Edge Forum Executive Program, “We finally have a layered series of services that can meet changing [business] requirements. You can pick the right layers to match your needs.”
Disruption 5: Endpoint security
Leading vendors: Cisco, ConSentry, Lockdown Networks
Old thinking: Security point solutions from multiple vendors
IT impact: Protect your laptops and defend your entire network by thinking in broader strokes
Our last trend is a real challenge: delivering consolidated endpoint security across the enterprise. You buy single-purpose products that do one or two things well, such as antivirus, firewall, intrusion prevention, policy enforcement, authentication and the like. Trouble is, no single product delivers a complete solution. Meanwhile, roaming laptops are coming into your network and spreading infections daily.
Picking among the three current major architectural efforts for endpoint security will consume a good part of your budget and time. Cisco and Microsoft have their own takes on the issue, and an open-standards group called Trusted Computing Group is behind door number three. Cisco focuses more on securing network infrastructure, Microsoft more on desktop remediation, and Trusted Computing Group starts with low-level hardware protection. Before you get behind any approach, take the time to research the differences and decide which vendors implement the pieces of the endpoint puzzle most critical to your business. Right now, no single approach covers all the security bases.
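Whichever architecture you pick, the core step is the same: check an endpoint’s posture against policy before admitting it to the network. The sketch below shows that admission check in the abstract; the policy fields are illustrative and do not reflect any vendor’s actual schema.

```python
# Sketch of the admission-control step common to all three architectures:
# compare a reported endpoint posture against policy before granting
# network access. Policy fields are illustrative, not a vendor schema.

REQUIRED = {"antivirus_current": True, "firewall_on": True, "patched": True}

def admit(endpoint_posture):
    """Return (allowed, failures) for a reported endpoint posture."""
    failures = [key for key, required in REQUIRED.items()
                if endpoint_posture.get(key) != required]
    return (not failures, failures)

# A roaming laptop reporting its state at connect time:
laptop = {"antivirus_current": True, "firewall_on": False, "patched": True}
allowed, why = admit(laptop)
print(allowed, why)  # -> False ['firewall_on']
```

The vendors differ mainly in where this check runs and what happens to a failing machine—quarantine VLAN, remediation server, or outright denial—not in the basic shape of the decision.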