Quality is queen

Don't let bad data be your downfall.

By Nicholas Bell, CEO of Decision Inc.
Johannesburg, 26 Jun 2014

In my previous Industry Insight, I outlined why effective decision-making needs a well-planned and implemented business intelligence (BI) roadmap and strategy. Poor decisions can be costly for a company, and executives have therefore developed a deeper appreciation for the importance of having access to correct, timely information to support the decision-making process.

For a well thought-out and relevant strategy to work, IT teams need to give users timely access to both internal and external data. Businesses rely heavily on data to make informed, accurate decisions. Inferior or poorly populated information can end up undermining all the IT team's efforts to support the business.

Companies will usually invest a considerable amount in their systems, such as ERP, CRM, supply chain management and the like. Yet the value they extract from these systems depends entirely on the quality of the data the systems contain.

However, few companies give sufficient upfront thought to the quality and integrity of their data until it is called into question - usually long after there has been significant investment in time, money and resources.

Unreliable

Incorrect or poorly populated information on systems means companies essentially operate with blinkers on. Doubtful, deficient data means management information is unreliable, and will negatively impact the decision-making process. Any business decisions made on poor information may prove harmful and costly, and will undermine trust in business intelligence projects. Trust is a serious issue; companies may spend a substantial sum of money on implementing their BI strategies, only to find the key supporters of the system have gone back to using Excel.

The reality is that inefficient management of data costs money and can affect reputation.

Data quality and proper data management should therefore be the top priority for IT teams before implementing BI initiatives. In fact, data quality should be a priority for any company that has information stored in multiple systems or divisions across the business, or for businesses that deal with large volumes of information that need to be consolidated and contextualised - whether it be for the purposes of analyses or simply to make contact with customers.

Companies need to get their data quality management right before they can even begin to extract what they need out of the information. This means ensuring their data is rich, complete, properly populated and in the right place. With data management, what companies put into a system is precisely what they get out of it.

Poor-quality data is the result of poor data capturing, poorly integrated and disparate technological systems, as well as badly designed, planned and implemented business and operational processes and standards. Unfortunately, it is not always easy, or even possible, to control data entering a system.

Securing return on quality data

Employees and customers are data producers and are probably the main causes of poor quality data. People can be lazy and tend to do the bare minimum when it comes to capturing data. There is also the issue of human error and lack of user training.

So, what can be done to make sure data returned to users is reliable and accurate?

One of the first steps is to make sure the processes and standards are properly designed and implemented. Users also need to be well trained on an ongoing basis.

IT teams that do not have the in-house capability could commission data quality consultants to conduct an analysis of why and where the quality of their data is being compromised, and advise them on selecting the most appropriate technology to address the problem.

Companies need to consider data quality tools to manage information quality. Data quality tools allow companies to build rules that help define data quality. Validation sequences can be created to prompt input as data is captured in order to control the quality of the information entering a system. These tools also enable companies to consolidate and contextualise information stored in different systems in different parts of the business. Once the data has gone through a data quality process, ad hoc checks of the data can be run regularly to highlight where information is missing, in duplicate or incomplete, using pattern matching techniques.
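As an illustration, such rules can be sketched in a few lines of code. This is a minimal, hypothetical example, not any specific vendor's tool; the field names, formats and rules are assumptions chosen purely for illustration:

```python
import re

# Hypothetical quality rules: each maps a field to a validation check.
# Commercial data quality tools let analysts define rules like these declaratively.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v or "")),        # pattern match
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "country": lambda v: bool(v),                                            # completeness
}

def validate(record):
    """Return the fields in a record that fail their quality rule."""
    return [field for field, check in RULES.items() if not check(record.get(field))]

def find_duplicates(records, key="email"):
    """Pair up records sharing the same (normalised) key value - a simple duplicate check."""
    seen, dupes = {}, []
    for i, rec in enumerate(records):
        k = (rec.get(key) or "").strip().lower()
        if k in seen:
            dupes.append((seen[k], i))
        else:
            seen[k] = i
    return dupes
```

A record captured without a country would be flagged by validate, and two records differing only in the casing of an e-mail address would be paired up by find_duplicates for review - the same kinds of completeness, pattern and duplicate checks the tools described above automate at scale.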

It is a matter of basic cleanliness. Companies need to clean and refresh their data on a regular basis.

Another aspect that IT teams need to consider about their data is timeliness. Once the BI strategy has been communicated to relevant stakeholders, users will demand their information now. It is critical that any data management initiative takes into account data volume and processing. It is important to plan for scalability, because successful BI initiatives drive demand for more data, faster, within the company.

A clear and carefully constructed plan on how to enable effective decision-making within the company must be underpinned by a solid data delivery plan. Data is the foundation of any BI solution, and the importance of quality and timely data cannot be overstated when forecasting the effectiveness of implementing the strategy.
