Virtualization can improve outsourced testing

Outsourced suppliers may be responsible for completing the work, but it is the CIO who is accountable for ensuring development projects meet a defined standard of quality.

To maintain control over development, it is important to define strict quality gates. Some of these are easy to define, yet are currently being missed.

Testing earlier and more often in the development lifecycle increases the detection of defects and minimises the costly rework required to fix them, but other roadblocks are still being overlooked.

Often outsourced suppliers are tasked with developing individual components which are highly dependent on other teams for data.

Virtualised services can make it easier for development teams to access data, but this doesn't guarantee the quality of that data.

Therefore, CIOs must provide outsourced suppliers with data that is meaningful, compliant with relevant data privacy regulations and broad enough in coverage to be fit for purpose.

One method is to create hard-coded, stubbed message responses to the suppliers’ data requests.

However, although such data looks realistic, it exercises only a narrow set of code paths, limiting its value to developers and testers.
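To make the limitation concrete, here is a minimal sketch of a hard-coded stub of the kind described above. All names (`get_customer`, `CANNED_RESPONSE`) are illustrative, not taken from any particular virtualization product: whatever the caller asks for, the stub returns the same fixed record, so only one path through the consuming code is ever exercised.

```python
# A hard-coded, stubbed message response: realistic in shape,
# but identical for every request.
CANNED_RESPONSE = {
    "customer_id": "C-0001",
    "name": "Jane Example",
    "status": "ACTIVE",
}

def get_customer(customer_id):
    """Stubbed endpoint: ignores the requested id and always
    returns the same canned record."""
    return dict(CANNED_RESPONSE)

# Every call, for any id, yields the same data -- narrow coverage.
print(get_customer("C-9999"))
```

Because the response never varies, error branches, edge-case values and alternative statuses in the consuming component are never tested.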

Also, creating these responses is time-consuming and expensive, and it cannot guarantee the good data governance that management and regulators expect when work is outsourced.

CIOs looking to add significant value to their data should consider the merits of creating synthetic data responses.

Synthetic data is modelled on live traffic but contains none of the sensitive records, ensuring good data governance.

Back-end data is then used to enrich these models, producing more intelligent responses than live or manually created traffic can ever provide.
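The idea above can be sketched in a few lines. This is a simplified assumption of how such a tool might work, not any vendor's actual implementation: the model captures only aggregate statistics of live traffic (value ranges, observed categories), then generates fresh records from those statistics, so no real customer record ever reaches the supplier.

```python
import random

# Hypothetical live traffic (already observed in production).
live_traffic = [
    {"status": "ACTIVE", "balance": 120.0},
    {"status": "ACTIVE", "balance": 75.5},
    {"status": "SUSPENDED", "balance": 0.0},
]

def build_model(records):
    """Keep distributions, not the records themselves."""
    balances = [r["balance"] for r in records]
    return {
        "statuses": sorted({r["status"] for r in records}),
        "balance_range": (min(balances), max(balances)),
    }

def synthetic_response(model, rng=random):
    """Generate a synthetic record shaped like live traffic,
    containing none of the original sensitive values."""
    lo, hi = model["balance_range"]
    return {
        "customer_id": f"SYN-{rng.randint(1000, 9999)}",
        "status": rng.choice(model["statuses"]),
        "balance": round(rng.uniform(lo, hi), 2),
    }

model = build_model(live_traffic)
print(synthetic_response(model))
```

Each call produces a different, plausible record within the observed ranges, giving testers broader coverage than a single canned response while keeping sensitive data out of the test environment.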
