by Paul Schnell

How will you deploy Windows 7?

Feature
Aug 23, 2010

Windows 7 has arrived in the in-tray of CIOs with a resounding thud. The market is ready for it, and perceptions of how it will perform are very positive. Many organisations that remained on Windows XP, struggling to see a compelling reason to move to Vista, are primed to make the change – if for no other reason than that XP is now nearly a decade old and nearing the end of its support lifecycle.

While all this is driving a massive migration push towards Windows 7 – according to IDC, nearly 65 per cent of businesses are already moving or are set to do so in the next six months (see Windows 7: Why now?) – organisations are not simply switching wholesale to the new system. Instead, they are weighing alternative deployment options, including application virtualisation, 64-bit computing, server-based computing and virtual desktops.

This is raising some challenging questions: how do you make the most of the compelling cost savings that virtualisation and server-based computing can bring? How can you profit from the performance improvements of 64-bit without risking breaking some applications? And how can you put in place a plan for a phased migration without knowing which areas of the organisation are ready for which technologies? The trouble is that without a thorough understanding of how compatible your applications are with these diverse environments, plotting a robust, credible migration journey is rather like sticking a pin in a map.

Complex migration options

Faced with the complexity of these migration options, some organisations are deciding that application virtualisation and server-based computing are just too risky to pursue. They are splitting out their Windows 7 migration as a separate project – and missing out on the wider benefits of virtualisation and 64-bit.

Others figure that since they will have to devote a major chunk of budget and energy to their Windows 7 migration anyway, they might as well invest a little more to understand how their applications would behave in these alternative environments – and have that information to hand for future scenarios. Developing cost/benefit scenarios for the different environments shows them which options remain open going forward.

To achieve this, organisations need to test applications in different environments. But when you are still exploring which deployment options are open to you, manually testing every application in your estate is laborious and prohibitively expensive – in organisations with thousands of applications it can take months or even years. An alternative is to test only a core set of applications for the different environments and take a calculated risk on the rest – but how many organisations really want to gamble on applications failing?

Typically, when organisations start planning for a platform shift, they find they have many more applications in their estate than they imagined – perhaps thousands more. In an organisation with 2,000-3,000 managed applications there can be as many again that corporate IT don't even know about. Quite apart from the task of rationalising the estate and removing duplicated instances of applications, the sheer scale of the application compatibility task creates a huge stumbling block for the project before it has even begun.
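To make the discovery gap concrete, here is a minimal sketch in Python (Windows-only, standard library, assuming read access to the registry; the script and its output are illustrative, not part of any particular product). It enumerates the per-machine uninstall entries that most inventory reports are built from – applications installed per-user, or copied onto machines outside the managed process, typically never appear in these keys, which is exactly where the unknown half of the estate hides.

```python
# Minimal sketch: enumerate per-machine installed applications from the
# Windows registry uninstall keys. Windows-only; requires read access.
import winreg

UNINSTALL_KEYS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    # 32-bit applications registered on 64-bit Windows:
    r"SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

def installed_apps() -> dict[str, str]:
    """Map each registered application's display name to its registry path."""
    apps = {}
    for path in UNINSTALL_KEYS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
        except OSError:
            continue  # key absent, e.g. WOW6432Node on 32-bit Windows
        with root:
            for i in range(winreg.QueryInfoKey(root)[0]):
                with winreg.OpenKey(root, winreg.EnumKey(root, i)) as sub:
                    try:
                        name, _ = winreg.QueryValueEx(sub, "DisplayName")
                        apps[name] = path
                    except OSError:
                        pass  # entry has no display name; skip it
    return apps

if __name__ == "__main__":
    found = installed_apps()
    print(f"{len(found)} registered applications found")
```

Comparing a list like this against what users actually run is one rough way to gauge how much of the estate is unmanaged.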

Research from Gartner Group has shown that the application management part of a migration project can end up costing more than the hardware updates themselves. And it’s a cost that you keep incurring each time you shift to a new platform.

By contrast, automating the testing of applications for compatibility with different environments can shrink the decision-making process and de-risk the preparation phase for adopting new technologies. That frees up time in the planning process to decide which applications are relevant to which parts of the business, and where technologies like virtualisation can best be deployed.

The whys and wherefores of virtualisation

There are significant benefits to be had from application virtualisation, whether using Microsoft's App-V or other technologies such as Citrix XenApp, VMware's ThinApp, Symantec's SVS or InstallFree. These benefits centre on the ability to create a bubble around the application, isolating it from the operating system and reducing conflicts with other software. This makes applications more portable, potentially reducing hardware costs as utilisation improves. It should also ease application delivery and maintenance, reduce disruption and, again, save costs as application management is centralised.

However, not all applications are suitable for virtualisation. Those with heavy I/O requirements, for example, may not perform well in a virtual environment, and older applications with deep hooks into the operating system may simply fail to carry out basic functions. Knowing in advance which applications are suitable candidates is therefore a prerequisite for any application virtualisation strategy.
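As a minimal sketch of what that advance knowledge might look like in practice – assuming an earlier discovery pass has produced a simple manifest per application; the field names and rules here are hypothetical, not any vendor's actual checks – a rule-based screen can flag the classic blockers. Device drivers, for instance, cannot be virtualised by application virtualisation products such as App-V.

```python
# Sketch: rule-based screening of application manifests for traits that
# commonly make an application a poor virtualisation candidate.
# The manifest fields and rules are hypothetical, for illustration only.

RED_FLAGS = {
    "kernel_driver":   "installs a device driver (cannot be virtualised)",
    "windows_service": "registers a Windows service (often unsupported)",
    "shell_extension": "hooks the OS shell (may misbehave in a bubble)",
    "heavy_io":        "heavy disk I/O (may perform poorly virtualised)",
}

def screen(manifest: dict) -> list[str]:
    """Return the red-flag descriptions that apply to one application."""
    return [desc for flag, desc in RED_FLAGS.items() if manifest.get(flag)]

# Hypothetical manifest produced by an earlier discovery pass:
app = {"name": "LegacyPrintTool", "kernel_driver": True, "heavy_io": True}
for issue in screen(app):
    print(f"{app['name']}: {issue}")
```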

Incredible as it sounds, many organisations simply attempt to virtualise applications without knowing their suitability, and then, when they fail, keep trying to resolve the issues – sometimes for weeks – before giving up and reverting to Windows Installer packaging. Imagine what that process does to the IT department's agility and responsiveness rating!

What's needed is a process that can tell you immediately whether an application is suitable for virtualisation, has issues that can be remediated or is unsuitable – and, in the last case, support a decision about whether to deploy it directly to the endpoint or to publish it via presentation virtualisation, using Citrix XenApp for example. Similar processes need to be established around applications' suitability for 64-bit environments.
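Expressed as code, that triage is a simple routing decision. The sketch below is a hypothetical rendering of the logic in the paragraph above – the three assessment categories mirror it directly, while the channel names and the server_hosted_ok parameter are illustrative assumptions:

```python
# Sketch: route each assessed application to a deployment channel.
# The categories mirror the triage described in the text; channel names
# and parameters are illustrative, not any vendor's actual process.
from enum import Enum

class Assessment(Enum):
    SUITABLE = "suitable"        # virtualise as-is
    REMEDIABLE = "remediable"    # fix the known issues, then virtualise
    UNSUITABLE = "unsuitable"    # fall back to another channel

def route(app: str, verdict: Assessment, server_hosted_ok: bool) -> str:
    if verdict is Assessment.SUITABLE:
        return f"{app}: package for application virtualisation"
    if verdict is Assessment.REMEDIABLE:
        return f"{app}: remediate, then virtualise"
    # Unsuitable: deploy natively, or publish it centrally (presentation
    # virtualisation, e.g. via Citrix XenApp) if the app tolerates that.
    if server_hosted_ok:
        return f"{app}: publish via presentation virtualisation"
    return f"{app}: deploy directly to the endpoint (Windows Installer)"

print(route("LegacyPrintTool", Assessment.UNSUITABLE, server_hosted_ok=False))
```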

Apart from the resourcing issue, another problem with manually testing application compatibility for alternative deployment options is that you don't necessarily exercise all of the functionality or test every single file. So even though you might believe an application is suitable for a new platform or a 64-bit environment, it can still fail when an untested function runs.
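This is where exhaustive static analysis earns its keep: a tool can inspect every file in a package, not just the code paths a tester happens to exercise. As one concrete example – a sketch, with a hypothetical directory path – the following reads each executable's header and classifies its architecture; any 16-bit binary it turns up will not run on 64-bit Windows at all, however well the application's main screens appeared to work in testing.

```python
# Sketch: classify every executable in a package directory by reading its
# PE header, instead of spot-checking a few functions at runtime.
import struct
from pathlib import Path

MACHINE = {0x014C: "32-bit x86", 0x8664: "64-bit x64", 0x0200: "Itanium"}

def binary_arch(path: Path) -> str:
    data = path.read_bytes()
    if len(data) < 0x40 or data[:2] != b"MZ":
        return "not a Windows executable"
    # Offset 0x3C holds the location of the PE (or legacy NE/LE) header.
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if e_lfanew + 6 > len(data):
        return "truncated or DOS-only executable"
    sig = data[e_lfanew:e_lfanew + 4]
    if sig[:2] in (b"NE", b"LE"):
        return "16-bit (will not run on 64-bit Windows)"
    if sig != b"PE\x00\x00":
        return "unknown executable format"
    # The COFF machine field follows the 4-byte PE signature.
    (machine,) = struct.unpack_from("<H", data, e_lfanew + 4)
    return MACHINE.get(machine, f"unknown machine 0x{machine:04X}")

for exe in Path("C:/apps/legacy_suite").rglob("*.exe"):  # hypothetical path
    print(exe.name, "->", binary_arch(exe))
```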

A looming problem: web apps

An additional complication comes in the form of web apps, which are becoming more prevalent and business-critical in many organisations. Internet Explorer 8, which ships with Windows 7, comes with enhanced support for standards and security – a point of departure from previous versions, which were very forgiving of code that was not standards-compliant.

This is becoming an application compatibility issue for some organisations in two key areas. Firstly, ActiveX controls embedded in web pages will sometimes be blocked from running on client machines. This blocking was introduced for security reasons, to stop malicious code taking over host computers, but it now has the unwanted side effect of blocking some legitimate controls – preventing functionality that may be critical to a web app from appearing.

Secondly, there are visual rendering changes that mean pages may not appear as intended in the new browser. Some implementations of JavaScript and style sheets could cause a page's layout to shift to one side, for example, taking a function out of the user's view.
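Of these two issues, the ActiveX dependency is the easier one to inventory statically. As a minimal sketch – the directory is hypothetical, and a real audit would also need to catch controls instantiated from script via new ActiveXObject(...) – the following walks a set of saved pages and lists every embedded control's class ID, giving an early view of which web apps are at risk:

```python
# Sketch: inventory ActiveX usage across saved web-app pages by finding
# <object> tags that carry a classid attribute. Illustrative only; controls
# created from JavaScript (new ActiveXObject) need a separate scan.
from html.parser import HTMLParser
from pathlib import Path

class ActiveXFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.classids = []

    def handle_starttag(self, tag, attrs):
        if tag == "object":
            for name, value in attrs:
                if name == "classid" and value:
                    self.classids.append(value)

for page in Path("webapp_pages").rglob("*.htm*"):  # hypothetical directory
    finder = ActiveXFinder()
    finder.feed(page.read_text(errors="replace"))
    for clsid in finder.classids:
        print(f"{page.name}: embeds ActiveX control {clsid}")
```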

It's not only in the migration to a new platform that application compatibility issues rear their ugly head. In business-as-usual processes, as applications are introduced into the environment, the standard QA, testing and user-acceptance processes will take far too long if manual testing is still used. Whatever platforms or technology choices are introduced, investing in automated tools to understand and document the status of your application estate today will pay massive dividends in the future.

Paul Schnell is CTO of application migration software specialists App-DNA