
The four attributes of data quality

Data must be considered in terms of its validity, completeness, consistency and correctness.

Mervyn Mooi
By Mervyn Mooi, Director of Knowledge Integration Dynamics (KID), the ICT services arm of the Thesele Group.
Johannesburg, 27 Jul 2009

Nothing focuses the mind on data quality quite as much as CRM. The moment the injunction comes from on high that management wants to run a CRM campaign, or reinvigorate an existing one, the onus falls on IT to get the organisation's data right.

Let's look at two examples of the type of CRM campaigns management might be considering:

* Boost company profit by targeting share of wallet, rather than share of market. This requires that management be able to segment customers by current and future (or lifetime) value, an exercise which typically operates along two axes: frequency and recency (a minimal segmentation sketch follows this list). Once management understands which customers have the highest current and future value, it can target and communicate with them differently, and through the appropriate channel. But the foundation of this exercise is accurate, reliable data, and its absence in many organisations means the CRM campaign is stillborn.
* Reduce churn through the ability to predict it with a relative degree of accuracy. Certain behaviour patterns manifest when a customer intends to defect to a competitor, and by being able to predict and identify those behaviour patterns, a company can significantly reduce its churn. Again, it all begins with correct customer data, which allows customers to be segmented by value so the correct communication channels and effort can be applied.
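To make the frequency/recency idea concrete, here is a minimal sketch of value segmentation in Python. The field names (customer_id, last_purchase, purchases) and the 90-day and four-purchase thresholds are illustrative assumptions for the example, not figures from the article.

```python
from datetime import date
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    last_purchase: date   # date of the most recent transaction
    purchases: int        # number of transactions in the analysis window

def segment(customers, today=None):
    """Assign each customer a simple recency/frequency value segment."""
    today = today or date.today()
    segments = {}
    for c in customers:
        recency_days = (today - c.last_purchase).days
        recent = recency_days <= 90      # illustrative threshold
        frequent = c.purchases >= 4      # illustrative threshold
        if recent and frequent:
            segments[c.customer_id] = "high current value"
        elif frequent:
            segments[c.customer_id] = "lapsing high value"
        elif recent:
            segments[c.customer_id] = "growth potential"
        else:
            segments[c.customer_id] = "low current value"
    return segments

# Example: one recent, frequent customer and one dormant customer.
print(segment([
    Customer("C001", date(2009, 7, 1), 6),
    Customer("C002", date(2008, 11, 3), 1),
], today=date(2009, 7, 27)))
```

In practice the thresholds would be derived from the distribution of the customer base, and the resulting segments would determine which communication channel and how much retention effort each customer receives.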

The problem in both cases is that the customer data is likely to be incomplete and inaccurate. That is the case in almost every company.

The data riddle

Data quality represents one of the most persistent and challenging conundrums for business, but it must be addressed, irrespective of the surrounding business issues (fragmented data silos, for example). In cascading order of difficulty, a company must consider data in terms of its validity, completeness, consistency and correctness.

* Validity: This is the easiest to address - an incomplete or wrongly structured telephone number, or a postal code that is plainly wrong, for instance. Many applications can automate this process, including those which validate data at the point of entry and trap errors (a minimal sketch of such a check follows this list). But, and this is especially important in South Africa, operator error leads to the creation and perpetuation of errors that are not so easy to trap, such as the misspelling of people's names; it is a general rule that if there are two ways of spelling a person's name, data capturers will opt for the wrong one (Procter or Proctor, for instance). For proof, consider how many different ways your own name is misspelt in the postal items you receive.
* Completeness: Customers' names show up in many applications, from CRM to inventory, from service desk to resource management, and unless a foundation of data relationships is established, with bidirectional updating capability, the data will remain incomplete.
* Consistency: This is inordinately difficult to achieve. Its resolution requires the application of a metadata layer, without which companies are just not going to enjoy data consistency across divisions and applications, customers are going to receive a fragmented experience, and the company will miss revenue generation opportunities.
* Correctness: Data may be valid, complete and consistent, but unless it is also correct, the other three tests can be rendered irrelevant. No technology will pick up incorrect data, such as an entry that wrongly reflects the equipment a customer has bought. This has the dual effect of irritating the customer when service is required and of costing the company revenue, as it hunts for the error, loses customers, services the wrong equipment and holds the wrong spares in inventory.
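As referenced under the Validity point, the sketch below shows what point-of-entry validation might look like in Python. The patterns for South African telephone numbers and postal codes are simplified assumptions for illustration; a production system would rely on a proper validation library or reference tables.

```python
import re

PHONE_RE = re.compile(r"^(\+27|0)\d{9}$")   # e.g. 0115551234 or +27115551234
POSTAL_RE = re.compile(r"^\d{4}$")          # South African postal codes are four digits

def validate_contact(phone: str, postal_code: str) -> list[str]:
    """Return a list of validity errors; an empty list means the record passes."""
    errors = []
    if not PHONE_RE.match(phone.replace(" ", "")):
        errors.append(f"invalid telephone number: {phone!r}")
    if not POSTAL_RE.match(postal_code.strip()):
        errors.append(f"invalid postal code: {postal_code!r}")
    return errors

# Trapping the error at capture time, before it reaches the database:
print(validate_contact("011 555 1234", "2196"))   # [] - record passes
print(validate_contact("555-1234", "ABC"))        # two validity errors
```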

There are many approaches to solving the ongoing data quality conundrum, but one of the most promising combines master data management, or MDM, with service-oriented architecture, or SOA. This binds master reference data tightly to the applications, or processes (services), which generate the data. It holds a great deal of promise and deserves to be explored.
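The sketch below is one illustrative, hypothetical way to picture that binding: every service routes customer updates through a shared master reference store, so no application can write a record the master does not know about. The class and method names (MasterDataStore, CustomerService, resolve) are assumptions for the example, not part of any specific MDM or SOA product.

```python
from dataclasses import dataclass

@dataclass
class MasterCustomer:
    master_id: str
    name: str
    phone: str

class MasterDataStore:
    """Single source of master reference data shared by all services."""
    def __init__(self):
        self._by_phone = {}

    def register(self, customer: MasterCustomer):
        self._by_phone[customer.phone] = customer

    def resolve(self, phone: str):
        # Match on a stable identifier; real MDM matching would be fuzzier.
        return self._by_phone.get(phone)

class CustomerService:
    """A service (in the SOA sense) bound to the master data layer."""
    def __init__(self, mdm: MasterDataStore):
        self.mdm = mdm

    def update_contact(self, phone: str, new_name: str) -> str:
        master = self.mdm.resolve(phone)
        if master is None:
            raise ValueError("no master record - route to data stewardship")
        master.name = new_name   # one update, visible to every consumer
        return master.master_id
```

The design point is that because every service resolves against the same master store, an update made through one channel is immediately visible to every other consumer of that record.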

* Mervyn Mooi is director of Knowledge Integration Dynamics.
