Organisations that have implemented substantial data integration architectures can save more than US$500,000 annually by rationalising tools in the short term and adopting a shared-services model in the longer term, according to Gartner. Deploying multiple, functionally overlapping data integration tools creates excessive software licensing, maintenance, and skills costs of up to US$250,000 per tool annually.
“Organisations often purchase and implement new data integration tools in a fragmented way without considering extending investments already made in other parts of the business, resulting in multiple tools from various vendors,” said Ted Friedman, vice president and analyst at Gartner. “The first step is for IT teams focused on data integration to save money by rationalising tools. Further, there is a greater longer-term opportunity to substantially reduce costs and increase efficiency and quality by moving to a shared-services model for the associated skills and computing infrastructure.”
As organisations in all industries continue to focus heavily on cost optimisation, many aspects of IT offer potential savings. The imperative to increase efficiency, combined with the historically fragmented and tactical approach to data integration commonplace in most businesses, is now driving organisations to rethink how they approach this discipline.
Gartner recommends that organisations consider executing three elements of rationalisation in the short term:
1. Rationalise Data Integration Tools
Planners should rationalise across the three main categories of data integration tools: extraction, transformation and loading (ETL); data replication; and data federation, ideally arriving at a standard tool for each of these styles of data delivery. They should decide which tools to keep and which to discontinue based on the business context and requirements, rather than blindly rationalising wherever possible based purely on cost. In addition, organisations should consider saving costs by using the data integration tools that are provided at no additional cost with database management system (DBMS) products and open-source data integration tools. However, they must also consider the investment required in re-design, re-development and testing to migrate existing data integration processes from one toolset to another, as well as the relative immaturity of many of these lower-cost solutions in comparison to incumbent products.
2. Centralise Data Integration Computing Infrastructure
Organisations with multiple data integration tools typically deploy each tool on dedicated hardware, resulting in redundant servers and storage. With hardware costs for data integration tool deployments often starting at US$50,000, many organisations can make substantial savings on computing capacity by implementing a shared computing infrastructure for data integration workloads. These savings are possible whether or not the organisation rationalises its tools. In addition, substantial gains in productivity and time to delivery result because each new project team requiring infrastructure for data integration workloads can leverage (or pay to expand) the shared hardware, removing the need to select, procure, implement, and support project-specific hardware.
3. Consolidate Data Integration Roles and Skills
The skills involved in data integration are typically fragmented across the organisation and are not always utilised to capacity, depending on project size, timing of phases, level of complexity, and other factors. Gartner recommends that organisations centralise these roles and skills into a shared-services team to directly reduce staffing costs by 50 per cent or more each year.
“Rationalisation limited to one business unit may not optimise cost savings,” concluded Mr Friedman. “For organisations to achieve savings of more than $500,000 per year, CIOs and data integration teams should work together to lead the rationalisation and shared-services programme.”