Do IT groups really need to move to a software-defined environment?


Pay close attention to the competitive advantage levers in software-defined IT.

CIOs and their IT teams are bombarded with major changes on nearly every front these days. Shifting to a software-defined environment is one of the top agenda items in many enterprises. But what does this shift in operations really mean? Are the benefits really worth the headaches of major operational change?

We are in the early days of this shift to a software-defined world, and it takes me back 20–25 years to the talk about taking our offices paperless. The early motivations of saving paper and preserving the rain forests have given way to a focus on office productivity, collaboration and ease of use. 

In a similar manner, the rationale for moving to a software-defined environment may start with a focus on cost savings, but it is quickly augmented by the desire to cut cycle time and make the IT environment far more agile and thus better aligned with business needs.

Fundamentally, a software-defined environment is one in which the IT infrastructure and network are managed and operated through software rather than by people. That’s the simple definition. All the testing and enablement tools for the infrastructure need to be software defined as well.

Increasingly, the main motivations for moving to a software-defined world are the benefits of speed, agility, quality and cost. It enables organizations to bring new applications online quickly. With agility comes scalability: the ability to grow services and infrastructure quickly to meet business needs, or to shrink them just as quickly. This increased speed and agility paradoxically do not come at the expense of quality. In fact, where we have been able to study software-defined environments, we find them operating at much higher quality levels.

This increase in quality seems to be driven by two main sources. First, there are fewer people involved and thus fewer human errors in the process. Second, software-defined environments enforce a much higher degree of standardization, making the environment simpler and less error-prone.

Implementing these standardized environments is, however, one of the largest impediments to moving to a software-defined world. The more willing companies are to tackle the standards issues, the faster they are able to make progress. The more standardized the environment is, the simpler it is to operate and the easier it is to run without people, using automated tools instead.

Finally, software-defined environments are far cheaper to operate and maintain. It is easy to understand that fewer people means lower cost, and that less rework due to higher quality saves money. However, this is just the start. A software-defined environment also lowers costs because it operates at much higher utilization and ends the need to pay for overcapacity. It also complements the public cloud well, further decreasing overcapacity and enabling virtually unlimited, scalable capacity.

Where to focus first in moving to a software-defined world

Like the journey to a paperless office, this journey does not have to be done in one step. The best approach is to focus first on particularly high-return areas. Automated testing is an example, and it hits a number of the buttons at once. As you think about where to make investments, automated testing gives a very big and quick return because the testing process is people-intensive and rarely done well. Automated testing improves quality and speed and dramatically reduces cost. So it’s a great place to start.
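To make the idea concrete, here is a minimal sketch of what automated testing looks like in practice, using Python and a hypothetical discount-calculation function as the code under test (both the function and the scenarios are illustrative, not from any particular enterprise system). Each check that a tester would otherwise perform by hand becomes a small function a test runner such as pytest executes on every build.

```python
# Hypothetical code under test: apply a percentage discount to a price.
def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Automated checks: a runner such as pytest discovers and executes
# every test_* function, replacing repetitive manual verification.
def test_typical_discount():
    assert apply_discount(100.0, 20) == 80.0

def test_no_discount():
    assert apply_discount(59.99, 0) == 59.99

def test_invalid_percent_rejected():
    try:
        apply_discount(100.0, 150)
    except ValueError:
        pass  # the invalid input was correctly rejected
    else:
        raise AssertionError("expected ValueError for percent > 100")
```

Once checks like these exist, they run unattended on every change, which is where the speed, quality and cost benefits described above come from.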

Another good starting place is automated provisioning. This is the ability to stand up a computing environment or shut it down. Regardless of what environment you have (mainframe, Azure or different virtual environments), automated provisioning will provide a high return.
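The following toy sketch illustrates the idea of automated provisioning; it is not any real product's API, just a small Python model of the contract: a declarative request goes in, and software, not a person, stands the environment up or tears it down.

```python
from dataclasses import dataclass, field

@dataclass
class Provisioner:
    """Toy model of an automated provisioner: tracks which named
    environments are currently running."""
    running: dict = field(default_factory=dict)

    def stand_up(self, name: str, spec: dict) -> str:
        # Idempotent: re-requesting a running environment is a no-op,
        # which is what lets automation replace manual change tickets.
        if name in self.running:
            return f"{name}: already running"
        self.running[name] = spec
        return f"{name}: provisioned {spec}"

    def shut_down(self, name: str) -> str:
        if name not in self.running:
            return f"{name}: not running"
        del self.running[name]
        return f"{name}: deprovisioned"
```

A real provisioner would translate the spec into mainframe LPARs, Azure resources or virtual machines, but the interface, declare what you want and let software converge on it, is the same.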

Automation tools 

As you eliminate people and move into the more automated software-defined world, there are a variety of tools that help IT groups quickly reconfigure, redesign, scale up or down, and monitor and drive consistently high-quality environments. In addition to automated testing tools, there are very effective automation tools that enterprises can purchase and implement to drive toward their performance goals. 

Chef, for example, lets teams write simple scripts (recipes) to automate tasks. Another tool, Puppet, automates every step of the software delivery process, from provisioning through reporting and releases, ensuring consistency and reliability. Tool functionality goes all the way up to IPsoft’s autonomics suite, which automates many data center functions and processes. The various tools’ applicability varies across technical environments.

Ripple effect 

My experience in working with clients is that the move to a software-defined environment is not a destination – it’s a journey. And the further you go on the journey, the more powerful the benefits become. The more your environment is software defined and built on standard components, the more resilient it becomes. The more resilient it is, the easier it is to operate. The easier it is to operate, the faster you can stand things up and roll out new apps. The more automated the environment is, the higher the quality of the work will be. Severe mistakes in your data center or infrastructure environment will be largely eliminated, as errors decline in proportion to the number of people operating it. And the more mistakes and rework are eliminated, and the more automation and standard components take effect, the greater the cost reduction.

Together, the benefits of a software-defined IT environment ripple outward like pebbles tossed into a pond, and the further they spread, the more value and competitive advantage the enterprise will enjoy.

This article is published as part of the IDG Contributor Network.
