by Huw Price

Virtualization can improve outsourced testing

Jun 20, 2012 | 3 mins
Cloud Computing, IT Strategy

Outsourced suppliers may be responsible for completing the work, but it is the CIO who is accountable for ensuring that development projects keep to a defined standard of quality.

To maintain control over your development, it is important to define strict quality gates. Some of these are easily defined, yet are currently being missed.

Testing earlier and more often in the development lifecycle increases the detection of defects and minimises the costly reworking required to fix them, but other roadblocks are still being missed.

Often outsourced suppliers are tasked with developing individual components which are highly dependent on other teams for data.

Virtualised services can make it easier for development teams to access data, but this doesn’t guarantee the quality of that data.

Therefore, CIOs must provide outsourced suppliers with data that is meaningful, compliant with relevant data privacy regulations, and rich enough to drive sufficient code coverage to be fit for purpose.

One method is to create hard-coded, stubbed message responses to the suppliers’ data requests.
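A hard-coded stub might look like the minimal sketch below. The service path and payloads are hypothetical; the point is that every response is a fixed, pre-written record.

```python
# Illustrative sketch of hard-coded stubbed responses for a hypothetical
# customer-lookup service that an outsourced team depends on.
CANNED_RESPONSES = {
    "/customers/1001": {"id": 1001, "name": "Alice Example", "status": "active"},
    "/customers/1002": {"id": 1002, "name": "Bob Example", "status": "closed"},
}

def stubbed_get(path):
    """Return the canned response for a known path, or a 404-style error."""
    if path in CANNED_RESPONSES:
        return {"status": 200, "body": CANNED_RESPONSES[path]}
    return {"status": 404, "body": {"error": "not found"}}
```

Each request path maps to exactly one fixed reply, which is precisely why the coverage is narrow: the stub can only answer the questions its author anticipated.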

However, although the data will be realistic, it provides narrow code coverage, limiting its value to developers and testers.

Also, creating these responses is time-consuming and expensive, and it can’t guarantee the good data governance expected by management and regulators when outsourcing.

CIOs looking to add significant value to their data should consider the merits of creating synthetic data responses.

Synthetic data is modelled on live traffic but contains none of the sensitive records, ensuring good data governance.

Back-end data is then used to build up these models to contain richer, more intelligent responses than live or manually-created traffic can ever provide.
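The core idea can be sketched in two steps: profile the live data to capture its shape and distributions, then generate fresh records from that profile so no real values are carried over. Field names and distributions below are hypothetical, not a real product API.

```python
import random

def profile(live_records):
    """Derive simple per-field statistics from live data (no raw values kept)."""
    statuses = [r["status"] for r in live_records]
    balances = [r["balance"] for r in live_records]
    return {
        "status_values": sorted(set(statuses)),
        "balance_range": (min(balances), max(balances)),
    }

def generate(model, n, seed=0):
    """Emit n synthetic records drawn from the profiled distributions."""
    rng = random.Random(seed)
    lo, hi = model["balance_range"]
    return [
        {
            "id": 9_000_000 + i,                    # clearly synthetic IDs
            "name": f"synthetic-user-{i}",          # no real names retained
            "status": rng.choice(model["status_values"]),
            "balance": round(rng.uniform(lo, hi), 2),
        }
        for i in range(n)
    ]
```

The synthetic records behave like live traffic statistically, but no sensitive name, ID, or balance from the source ever appears in the output.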

Using coverage techniques, the data can be designed so that scenarios intended to break your systems can be developed and tested with minimal data.
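One simple coverage technique is "each-choice" coverage: every interesting value of every field appears in at least one record, using as few records as possible. The fields and edge values below are hypothetical.

```python
from itertools import zip_longest

# Hypothetical edge cases for a payment record.
FIELDS = {
    "amount": [0, -1, 10**9],        # zero, negative, and huge amounts
    "currency": ["GBP", "XXX"],      # valid and unknown codes
    "status": ["active", "frozen"],
}

def each_choice(fields):
    """Build the fewest records in which every listed value appears once."""
    columns = list(fields.values())
    rows = []
    for combo in zip_longest(*columns):
        # Pad shorter columns by reusing their first value.
        row = {
            name: (value if value is not None else fields[name][0])
            for name, value in zip(fields.keys(), combo)
        }
        rows.append(row)
    return rows
```

Here the record count equals the longest value list (three rows), yet every edge value is exercised at least once, which is the sense in which hostile scenarios can be tested "with minimal data".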

Much of the data provisioned is read-only, so the code that comes back is often poor, as it doesn’t meet the development requirements.

The result is considerable reworking, costly delays and potential political challenges arising in the partnership.

If a record doesn’t exist, or requires modification, developers have traditionally been unable to adapt the data to fix the problem. Essentially, companies are tying their own hands by placing these restrictions on developers.

By providing access to the back-end data along with your service layers, outsourced developers are freed to transact against it, enabling them to quickly and creatively solve challenges.

Actively empowering suppliers to create solutions enables them to buy in to your project, and increases their accountability for delivery.

This process does carry a health warning. Control over an outsourced project is necessary to its success, but defining a bounded framework mitigates the risks and ensures that the code you receive back provides real value, and can be tested early, often and rigorously.

Today’s competitive marketplace defines success by the ability to continually and cost-effectively deliver valuable software to customers.

By using virtual service layers, and reconsidering how they can help you provision fit-for-purpose data, you can ensure that your outsourcing agreement offers the value you need to stay ahead of the field.

Huw Price is the founder of Grid-Tools, the leading test data management vendor internationally, offering holistic, end-to-end test data solutions for traditional and Agile development and testing environments, including cloud and virtualized services.

Pic: bb_matt (CC 2.0)