by Andy Hayler

Gauging data governance

Mar 21, 2011
IT Strategy | Telecommunications Industry

For several years now there have been efforts to improve the consistency and quality of master data — the customer, product, asset and location information that is shared around an organisation.

Without consistent data, large firms struggle to measure their business performance properly, as no one can agree on what the correct numbers should be.

An enterprise software market has sprung up to support these initiatives, though they are inherently large-scale, lengthy projects. It has become clear, however, that such projects are exceedingly difficult to drive from the IT department.

By its nature, master data stretches across business domains, and so at some point hard questions have to be asked about which is really the definitive product code classification, or materials master, or list of strategic suppliers.

IT simply does not have the authority to make business departments change their way of doing things, so getting the business to take back ownership of their data is crucial. The name given to the activities associated with business ownership of data is data governance.

Data governance is not a technology, but, according to the Data Governance Institute, is “the exercise of decision-making and authority for data-related matters”.

It involves business people sitting around a table and agreeing who will be the ultimate authority for disputes about the definitions of key business data, and who will be responsible for ensuring its quality and its uniformity throughout an organisation.

Unfortunately, there is precious little hard data available about what people are actually doing with such initiatives: how big are they, do they work and what causes success and failure?

Late last year my organisation, The Information Difference, with the Data Governance Institute and a panel of global companies, carried out a project to help improve this state of affairs.

We defined a structure of data governance activities and asked firms to share their experiences and data within this framework.

We then approached the wider market looking for companies who already had live data governance programmes and asked them to submit their data to this benchmark exercise.

We were delighted to get 134 respondents, a sample large enough for proper statistical analysis. Here are some key findings:

– Only a little over half (55 per cent) of the organisations had a written statement setting out the objectives of their data governance programme
– Those who considered their data governance programmes to be successful generally had a clear and documented process for resolving disputes about data
– 57 per cent of organisations rated their initiatives as quite or tolerably successful
– Data governance programmes required a mean of four (median two) dedicated staff, supported by an average of nine (median three) part-time staff
– Organisations required on average nine full-time data stewards (median four)
– A scary 58 per cent of organisations confessed to not having any form of risk register, with only six per cent having an effective register in place
– Only nine per cent enforced business rules in the source applications for all operational systems
– Only 21 per cent of organisations regularly undertook data quality audits
– A huge 46 per cent did not measure the quality of their data or assign a monetary value to errors caused by poor data
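To make a couple of those findings concrete, a data quality audit is essentially a set of business rules run regularly against the data. The sketch below is purely illustrative — the rules, field names and sample records are hypothetical, and a real audit would run against live systems and track results over time — but it shows the kind of checks involved:

```python
# Illustrative sketch of an automated data quality audit over a small,
# hypothetical customer master table. The rules and records are invented
# for illustration only.
import re

customers = [
    {"id": "C001", "name": "Acme Ltd",   "country": "GB"},
    {"id": "C002", "name": "",           "country": "GB"},           # missing name
    {"id": "C003", "name": "Gamma BV",   "country": "Netherlands"},  # not an ISO code
    {"id": "C003", "name": "Gamma B.V.", "country": "NL"},           # duplicate id
]

def audit(records):
    """Return a count of violations per data quality rule."""
    issues = {"missing_name": 0, "bad_country_code": 0, "duplicate_id": 0}
    seen_ids = set()
    for r in records:
        if not r["name"].strip():
            issues["missing_name"] += 1
        if not re.fullmatch(r"[A-Z]{2}", r["country"]):  # expect ISO 3166 alpha-2
            issues["bad_country_code"] += 1
        if r["id"] in seen_ids:
            issues["duplicate_id"] += 1
        seen_ids.add(r["id"])
    return issues

print(audit(customers))
# {'missing_name': 1, 'bad_country_code': 1, 'duplicate_id': 1}
```

Enforcing such rules in the source applications themselves — which only nine per cent of respondents did — means invalid records are rejected at entry rather than discovered later by an audit.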

We were also able to conduct extensive statistical analysis into the behaviours of companies that had successful programmes. In summary, these had:

– A data governance mission statement
– A clear and documented process for resolving disputes about data
– Policies for controlling access to data
– A proper register of business risks
– Effective logical data models for key business data domains
– Well documented business processes
– Regular data quality assessments
– A documented business case for data governance
– An established link between programme objectives and team or personal objectives
– A comprehensive programme of data governance training

So if you are about to start a data governance programme, taking the above steps will help you succeed.

We hope that this analysis will be of benefit to those considering setting up a data governance programme. Indeed, organisations can now measure the success of their programme against this peer group by participating in the ongoing benchmarking exercise.

By putting in place a successful data governance programme alongside a project to improve the quality of master data, organisations can maximise their chances of success. This is still a difficult area, but the roadmap of how to sort out master data effectively is now a little clearer.

Andy Hayler is founder of research company The Information Difference. Previously, he founded data management firm Kalido after commercialising an in-house project at Shell