The Cost Of Dirty Data


Speculation about a ‘data doomsday’ – the point at which companies are so overwhelmed by data that they are frozen into inaction – is becoming rife. Although 99% of the companies surveyed now have a strategy in place to manage data quality, 91% still struggle with basic data quality issues.

Recent research has shown that, of the 1,206 companies surveyed across the UK, US, France, Germany, Spain and the Netherlands, a worrying 94% believe their organisation suffers from poor data quality. The research also found that data inaccuracy was up by five percentage points (from 17% in 2013 to 22% this year), with 59% of respondents saying that human error is responsible for the majority of their inaccurate data.

This poor-quality data is causing 75% of the UK organisations surveyed to lose potential revenue, with the average organisation losing 14% of its income to data anomalies. Collectively, this is costing the economy at least £200 million.

It also affects a business’s ability to gain any kind of competitive edge – hampering its efforts to deliver a personalised cross-channel customer experience. With businesses now using an average of 3.2 channels to engage with their customers, it’s important to ensure that the ever-increasing amount of data flowing into an organisation is properly managed, processed, and stored – as this will affect accuracy and integration across the board.

However, 42% of companies currently believe inaccurate data is to blame for the difficulties they are having in this area.

40% of businesses also say that poor quality data is preventing them from generating meaningful Business Intelligence and Analytics – another area that will affect future potential. After all, how can you plan for, and rise to, the challenges of the future without meaningful insight into your current performance?

All of this means that defining a data quality strategy is not a tick-box exercise – it’s an essential one. And an integral part of that strategy should, in my view, be the presence of a Chief Data Officer (CDO).

However, at present only 30% of respondents put a single director in charge of data quality for the whole organisation. To stay in the game, it’s essential that more businesses act now to give one person ownership of this critical issue.

Some may be concerned that a CDO’s remit will overlap with that of the Chief Information Officer (CIO). However, the reality is that business-related roles such as the CDO, and even the Chief Marketing Officer (CMO), need to play a greater part in aligning technology spend with business requirements.

For example, businesses need to move away from the current preference for manual processes. Call centres are still one of the most common ways to collect customer information (54%), but they are also where most inaccuracies seem to be generated (52%).

It’s also essential to make use of the data quality tools available. At present, only 38% of respondents are using specialised software to check data at the point of capture, while 34% use software to clean it after it has been collected. A CDO will ensure that such data improvement initiatives are properly scoped and have appropriate metrics in place to measure success and failure.
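The research does not name specific tools, but to illustrate the two approaches mentioned above – checking data at the point of capture versus cleaning it after collection – here is a minimal Python sketch. The record fields, validation rules and function names are hypothetical examples, not any particular vendor’s product or API.

```python
import re

# Hypothetical validation rules for illustration only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
UK_POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$", re.IGNORECASE)


def validate_at_capture(record: dict) -> list:
    """Point-of-capture check: reject bad values before they enter the database."""
    errors = []
    if not EMAIL_RE.match(record.get("email", "")):
        errors.append("invalid email")
    if not UK_POSTCODE_RE.match(record.get("postcode", "")):
        errors.append("invalid postcode")
    return errors


def cleanse_after_collection(records: list) -> list:
    """Post-collection cleansing: normalise and de-duplicate records already stored."""
    seen = set()
    cleaned = []
    for r in records:
        email = r.get("email", "").strip().lower()
        if email in seen or not EMAIL_RE.match(email):
            continue  # drop duplicates and rows that cannot be repaired
        seen.add(email)
        cleaned.append({**r, "email": email})
    return cleaned


if __name__ == "__main__":
    # A clean record passes the capture-time check (no errors returned).
    print(validate_at_capture({"email": "jo@example.com", "postcode": "SW1A 1AA"}))
    # Cleansing collapses duplicates and discards an unfixable row.
    print(cleanse_after_collection([
        {"email": "Jo@Example.com "},
        {"email": "jo@example.com"},
        {"email": "not-an-email"},
    ]))
```

The point of the sketch is simply that capture-time validation stops errors entering the system, whereas post-collection cleansing can only repair or discard what has already got in – which is why the earlier check tends to be the more valuable of the two.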

Only through decisive action can businesses stop the data quality problem from getting worse. Those that invest in getting ahead of the game will make their data work for them in a way that has a real impact on the bottom line.

Joel Curry

Joel Curry is managing director of Experian Data Quality, previously known as Experian QAS. Joel joined Experian QAS in 1997 as a Sales Account Manager in the London-based operations. In 2000, he relocated to Boston to set up the US sales operation, where he built the first sales team. Prior to Experian QAS, Joel worked in sales at ATI Technologies. He earned a degree in Human Physiology from the University of Birmingham and an MBA from City University Business School in London.