Why is improved data capturing not improving manufacturing performance?


Johannesburg, 27 Aug 2019
Lance Zikalala, MD, nCoded Solutions

Data. Insight. Planning. In the cutthroat world of manufacturing, where every minute counts and every decision creates a ripple effect, the prospects offered by those three words are more than enticing. Each manufacturing line generates information that can be used to accelerate operations, improve agility and create new efficiencies. Thus it came to be that many manufacturers invested in better data capturing.

A common example is using bar codes to scan items and track them through automated processes. Another is to make it easy for staff to record starts, stops and other events on the floor using interactive displays or mobile devices. These data streams are funnelled into a dashboard environment where planners can make more confident and dynamic changes to schedules.
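As a rough illustration, the sketch below shows what one such captured event might look like once it is given a consistent structure for downstream planning tools. The field names and the in-memory queue standing in for a message bus are assumptions made for the example, not a description of any particular vendor's system.

```python
# A minimal sketch of a structured shop-floor event. All field names and the
# in-memory queue are illustrative assumptions, not a reference to a specific system.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FloorEvent:
    line_id: str                  # which manufacturing line produced the event
    station: str                  # workstation or machine on that line
    event_type: str               # e.g. "start", "stop", "scan"
    barcode: Optional[str] = None # item barcode, if the event came from a scan
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Stand-in for whatever queue or message bus feeds the dashboard environment.
capture_queue: list = []

def record_event(event: FloorEvent) -> None:
    """Append a captured event so planners can see it in near real time."""
    capture_queue.append(event)

record_event(FloorEvent("LINE-01", "PRESS-3", "scan", "6001234567890"))
print(capture_queue[-1])
```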

Some operations rely on an ERP (enterprise resource planning) system, while more advanced and effective solutions also incorporate an APS (advanced planning and scheduling) service. But the promised results don’t always arrive.

“There is no argument left against running a modern, data-centric manufacturing environment,” says Lance Zikalala, MD of nCoded Solutions. “Manufacturing lines already generate a lot of useful information from the machines, staff, logistics and tracking of materials. But this information is often not put to good use, and thus can’t accelerate planning or make it more agile.

“So modernising a manufacturing environment to tap into all that intelligence is a very potent investment, and it is how all future manufacturing will be operated. But just capturing and viewing the data is not enough, which is why many modernisation projects don’t end up delivering value.”

Three steps of data

Capturing data is the first of three steps to reach this planning ideal, the other two being the administration and reporting of the data. What often happens is that stakeholders focus on the first and last steps, as they are the most visibly effective components. But many then ignore the unseen role of data administration and vetting.

“It’s not a new story; we all know the familiar saying, ‘rubbish in, rubbish out’,” Zikalala explains. “But manufacturers overlook this for two reasons. First, the scale and scope of a modernisation project tend to focus attention on visible deliverables, a purpose that capturing and reporting serve very well, so the administrative requirement gets forgotten. Second, they underestimate how complex raw data can be. There are far more data points out there that can be captured than any operation actually needs. When companies fail to distinguish between what they need, what they would like to have and what they don’t need, they just end up with a heap of rubbish.”

Poorly structured data causes more problems than just limiting the value of a modern scheduling and monitoring system. Comparative analysis with more mature data sets is not effective, and more forward-looking applications, such as analytics using machine learning (a type of artificial intelligence), stay out of reach.

At a more fundamental level, if data isn’t being vetted properly, problems with input methods won’t be spotted. Without clean data, continuous improvement of a modern manufacturing environment is not possible.

Getting data vetted and clean

Capturing is inherently messy; the administration step is where data gets cleaned, vetted and organised. This is a job for data administrators, roles that are chronically undervalued in many organisations, supported by smarter use of automation, such as programmable logic controllers (PLCs) that can monitor for abnormal behaviour or outputs.
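The kind of automated check alluded to here can be illustrated with a short sketch. In practice the logic would normally live in a PLC or line-monitoring layer rather than in Python; the metrics, limits and record format below are assumptions chosen purely for illustration.

```python
# A minimal sketch of an abnormality check: readings outside an expected band
# are flagged for a data administrator to review. The limits and reading
# format are assumptions, not values from any particular PLC or line.
from typing import Iterable

EXPECTED_RANGE = {
    "cycle_time_s": (8.0, 14.0),    # assumed normal cycle-time band
    "temperature_c": (40.0, 90.0),  # assumed normal operating temperature
}

def flag_abnormal(readings: Iterable[dict]) -> list:
    """Return readings whose monitored values fall outside the expected band."""
    flagged = []
    for reading in readings:
        metric, value = reading["metric"], reading["value"]
        low, high = EXPECTED_RANGE.get(metric, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            flagged.append(reading)
    return flagged

sample = [
    {"machine": "PRESS-3", "metric": "cycle_time_s", "value": 9.2},
    {"machine": "PRESS-3", "metric": "cycle_time_s", "value": 31.5},  # abnormal
]
print(flag_abnormal(sample))
```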

Policies need to be created that define what data is and isn’t needed. Much of this takes place in the data staging area, the space where data is held after capture and before it is sent on to relevant stakeholders. Here the data can be structured and then moved to what is called an operational data store, where different parties can access it for their respective systems.
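A minimal sketch of that staging step might look something like the following, assuming staged events arrive as simple records. Records that satisfy the agreed policy are promoted to the operational data store, while the rest are held back for a data administrator to investigate. The field names and policy rules are illustrative assumptions.

```python
# A minimal sketch of a staging-area policy check. Field names and rules are
# illustrative assumptions, not a prescribed schema.

REQUIRED_FIELDS = {"line_id", "station", "event_type", "recorded_at"}
ALLOWED_EVENT_TYPES = {"start", "stop", "scan", "changeover"}

def passes_policy(record: dict) -> bool:
    """Apply the agreed capture policy to a single staged record."""
    return (
        REQUIRED_FIELDS <= record.keys()
        and record["event_type"] in ALLOWED_EVENT_TYPES
    )

def promote(staging: list) -> tuple:
    """Split staged records into the operational data store and a reject pile."""
    operational_store = [r for r in staging if passes_policy(r)]
    rejected = [r for r in staging if not passes_policy(r)]
    return operational_store, rejected

staged = [
    {"line_id": "LINE-01", "station": "PRESS-3", "event_type": "scan",
     "recorded_at": "2019-08-27T08:00:00Z"},
    {"line_id": "LINE-01", "event_type": "scan"},  # incomplete: stays in staging
]
store, rejects = promote(staged)
print(len(store), "promoted,", len(rejects), "held back for review")
```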

Thus data administration is the most important functional part of the value chain. Capturing determines what data can be made available, and reporting determines what data is needed. But capturing is often broad in scope and, because it must meet the needs of different parties, reporting is subjective. It falls to data administration to reconcile the two.

“Every manufacturer will have slightly unique data requirements,” says Zikalala. “But it’s the administration portion that holds it all together. This is where policies and data structure should be implemented, based on the feedback from the reporting side and matched to what the capturing side is providing. Administration is the stable point in a data value chain, which then makes it easier to spot, resolve and improve features in the capture and reporting areas.”

Data, like any fluid commodity or currency, requires discipline and a stable area from which other actions can be launched. This is the role of administrators, a combination of human expertise and automation. If your data is not providing the value you anticipated, take a closer look at the middle of the data value chain. You’ll find this is actually where it all begins…
