The data revolution
The business case for master data management and data quality.
The Industrial Revolution, which began in Britain in the 1700s, truly revolutionised the world, changing the way people travel, work, eat and live.
However, the Industrial Revolution itself resulted from the pebble thrown in the pond by the Agricultural Revolution. Britain's colonial dominion gave it access to vast agricultural diversity, and its influence and investment in these colonies spurred significant technological innovations and developments that increased the productivity of farms.
This Agricultural Revolution generated excess wealth, raw produce such as food and especially cotton, and spare workforce capacity as farm workers migrated to urban areas in search of work. Surplus produce and population shifts created a dire need to process and distribute that produce. The excess wealth was sensibly applied to spur technological developments in automation (most significantly in the textile industry), metallurgy and transportation, all empowered by one key breakthrough: deriving coke from coal, which fuelled iron smelting, while coal itself powered the steam engine and numerous manufacturing machines.
The outcome of the Industrial Revolution is life as we know it in the global village, where technological innovation is the norm rather than the exception, together with all its social ramifications of unemployment, urbanisation, increase in crime, etc.
This brief history lesson sets the context for the data revolution, which is a natural progression from its own agricultural pebble in the pond: the digital revolution. Developments in digital storage and processing, together with the Internet and social media since the 2000s, now leave the industry at a precipice. There is a data explosion at hand, with data as the "excess produce" and digital and data technological innovation as the "excess wealth". People must learn very quickly how to make sense of all the data at hand, before it pushes everything and everyone over the edge.
Big data technology would be the one innovation to highlight, as I believe it is the analogous "coke derived from coal" that will fuel the data future. However, big data is not the silver bullet that will ensure a bright future. It is merely an innovative resource that needs to be honed and applied mindfully to ensure return on investment. Quoting Gartner from its Top 10 Strategic Trends for 2015 when referring to trend number four, analytics: "Big data remains an important enabler for this trend, but the focus needs to shift to thinking about big questions and answers first, and big data second - the value is in the answers, not the data."
So, what key practices are needed to transform the data explosion into a data revolution? Big data innovation needs to be accompanied by the technology and disciplines of master data management (MDM) and data quality management, similar to manufacturing developments being accompanied by rigid health and safety regulations and quality standards.
The whole truth
Data quality management disciplines ensure the big data generated or leveraged effectively reflects and represents real-life truths. Just as a consumer wouldn't like to find out a take-away burger was made from rat meat, the consumer would also not like to discover that decisions were being made based on what was believed to be data about customers in South Africa, when it was in fact data collected about people of a different nationality living in the US.
Granted, that's an extreme example, but it illustrates the importance of data quality management. Data quality technology enables users to measure and monitor data quality across all their diverse data stores. Best practice is to implement controls at source to prevent data quality degradation, but data quality tools also enable reactive data cleansing and improvement.
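To make the idea of measuring data quality concrete, here is a minimal sketch of a reactive profiling check in Python. The customer records, the valid-country reference list and the quality dimensions (validity and completeness) are all hypothetical; a real implementation would use a dedicated data quality tool and preventative controls at source.

```python
# Hypothetical reference data: the agreed set of valid country codes.
VALID_COUNTRIES = {"ZA", "US", "GB"}

# Hypothetical customer records as they might arrive from a source system.
customers = [
    {"id": 1, "country": "ZA", "email": "a@example.com"},
    {"id": 2, "country": "USA", "email": ""},  # invalid code, missing email
    {"id": 3, "country": "ZA", "email": "c@example.com"},
]

def profile(records):
    """Measure two simple quality dimensions: validity and completeness."""
    issues = []
    for r in records:
        if r["country"] not in VALID_COUNTRIES:
            issues.append((r["id"], "invalid country code"))
        if not r["email"]:
            issues.append((r["id"], "missing email"))
    return issues

print(profile(customers))  # flags record 2 for both issues
```

Monitoring then becomes a matter of running such checks on a schedule and tracking the issue counts over time, so degradation is caught before it reaches a boardroom report.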
Master data management enables users to manage the contextual data relating to their key data entities, to set standards, ensure consistency, and increase confidence in the interpretation of data trends and analytics. I would imagine a vehicle manufacturing plant driving off into the abyss of bankruptcy if what it thought was a stainless steel exhaust was in fact made of PVC. In the same way, effective master data management is crucial to ensure all the various stakeholders in the value chain understand the meaning of all the descriptive or contextual data elements of their key data entities, such as customer, product, campaign, or even organisational structure. Beyond just understanding the meaning, it is crucial that all stakeholders have access to the same consistent view of such data.
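The "single consistent view" idea can be sketched as a shared master record that every system resolves attributes from, rather than keeping its own copy. The product codes and attributes below are hypothetical, and a real MDM platform adds matching, survivorship and governance on top of this basic lookup.

```python
# Hypothetical product master: one agreed record per product code.
PRODUCT_MASTER = {
    "EXH-001": {"name": "Exhaust, stainless steel", "material": "stainless steel"},
    "EXH-002": {"name": "Exhaust trim, PVC", "material": "PVC"},
}

def describe(product_code):
    """Return the single agreed view of a product, or fail loudly.

    Failing on an unknown code is deliberate: silently guessing is how
    a plant ends up believing a PVC part is stainless steel.
    """
    record = PRODUCT_MASTER.get(product_code)
    if record is None:
        raise KeyError(f"Unknown product code: {product_code}")
    return record

assert describe("EXH-001")["material"] == "stainless steel"
```

The design point is that manufacturing, procurement and analytics all call the same lookup, so a question like "what is this exhaust made of?" has exactly one answer across the value chain.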
In May 2015, Germany announced its aggressive investment to initiate the fourth industrial revolution, referred to as Industry 4.0. The essence of Industry 4.0 is "smart factories" based on artificial intelligence in all aspects of the manufacturing value chain.
This artificial intelligence will be dependent on impeccable big data to learn from, but with the current state of data, where debates about inconsistent figures on reports are still pervasive in most boardrooms, I predict "smart factories" driven by "confused intelligence". Industry 4.0 must be preceded by a data revolution, which cannot be achieved without effective MDM and data quality management.
In August 2014, UN Secretary-General Ban Ki-moon issued a mandate for UN members to bring about a data revolution to improve reporting on sustainable development. May the private sector be the leaders and catalysts for this, and not the followers!
Yolanda Smit is a strategic BI manager at PBT Group, specifically focused on developing and driving strategic blueprint roadmaps for business intelligence, master data management and other information management initiatives for clients in support of strategic achievement. Her understanding of the challenges of strategic execution converges with her passion for information insights as the driver for success to offer clients a unique approach to turning information into a concrete asset. Smit started off as a junior BI ETL and front-end developer at Harvey Jones Systems, and in a period of five years, established herself as an all-rounder with business and technical insight. Smit joined PBT Group in 2009, where she continues to consult in various industries and covers all verticals within the organisation. She completed her MBA with a specialisation in management consulting in 2008 at the University of Stellenbosch Business School. Smit's MBA research, which she completed cum laude, evaluated the effectiveness of existing BI offerings in supporting corporate strategy execution. She developed a tested model for business intelligence teams to guide them in how to shape their BI offering specifically geared towards the improvement of overall strategy execution.