As cloud adoption continues to accelerate, it appears that every company is moving at least some of its business-critical workloads from on-premises servers up into the ether. Use of public (or hybrid public-private) cloud infrastructure offers myriad benefits, but not every business is ready to take what for many seems like a leap of faith. For some, regulatory or governance considerations are keeping servers and storage on site; for others there’s a lack of clarity about the advantages that cloud migration really offers.
Many organizations that have not begun adopting cloud infrastructure face a more immediate challenge: the imminent end of support for Windows Server 2003, the decade-old OS that still powers millions of servers worldwide. After July 14, 2015, newly discovered vulnerabilities in Windows Server 2003 will no longer be patched, compatibility with new applications and utilities will no longer be addressed, and applications and data on those servers will run at ever-growing risk of compromise or data loss. In a nutshell, businesses that don't plan a migration are gambling with their data.
So somewhere between migrating to the cloud and remaining on Windows Server 2003 there is a middle ground: migration to newer OS platforms such as Windows Server 2012. Here’s why upgrading to the latest OS makes good business sense.
Let’s start with the obvious. Moving to supported OS platforms reduces the risk of malware or data loss, since vulnerabilities (when discovered) are rapidly addressed.
Second, there are bottom-line benefits. Most older servers run a single application, which is quite inefficient: most of those Windows Server 2003 CPUs are severely underutilized, sitting idle while waiting for something to do. This was the driving factor behind server virtualization, which became the underpinning of cloud computing. Since virtual servers completely isolate each application in its own environment, many applications can be hosted on a single physical server without fear of the dreaded blue screen of death in one taking the others down. The result? Fewer servers are needed, saving money on hardware, OS licenses, power, cooling, and real estate.
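The consolidation math behind those savings is simple. As a rough sketch, the utilization and server counts below are hypothetical illustrations, not measured data from any particular environment:

```python
import math

# Hypothetical server-consolidation estimate. All figures here are
# illustrative assumptions, not vendor benchmarks.

def hosts_needed(app_count, avg_util, target_util=0.70):
    """Estimate how many virtualization hosts can run `app_count`
    single-application servers, each averaging `avg_util` CPU
    utilization, while keeping each host below `target_util`."""
    return math.ceil(app_count * avg_util / target_util)

# Example: 20 legacy single-app servers, each ~10% utilized,
# consolidated onto hosts kept under 70% utilization.
print(hosts_needed(20, 0.10))  # -> 3 hosts instead of 20 physical servers
```

Even this back-of-the-envelope estimate ignores the added headroom of newer, faster CPUs, so real-world consolidation ratios are often higher still.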
Upgrading to a newer OS like Windows Server 2012 is a logical choice for any organization looking to improve efficiency, especially considering that Microsoft includes its server virtualization platform, Hyper-V, as part of the server OS license. Why not just virtualize the existing Windows Server 2003 servers? First, there are performance issues. The past decade has seen an order-of-magnitude improvement in CPU power, thanks to Moore's Law and multi-core processors; in many cases, the first new server you deploy may have more raw power than the several servers it is designed to replace. Second, there are software and integration issues. Windows Server 2003 can't host Hyper-V, so a third-party hypervisor such as VMware or Xen would be required—at additional cost. Virtualizing old servers without upgrading the OS is a classic case of throwing good money after bad.
Are there still Windows Server 2003 servers in your shop? Maybe it's finally time to get that migration strategy under way, whether in preparation for a cloud migration or simply to drive efficiency up and long-term support costs down.
With the end of support date for Windows Server 2003 fast approaching, there's never been a better time to plan your data center transformation. Our experts have designed this helpful tool to get you started on the right upgrade path for your unique environment, applications, and workloads.