
Using analytics to gain a competitive edge

The correct approach to business intelligence can help a company remodel and enhance its customer experience and boost its competitive advantage.
By Marko Salic, CEO of the Argility Technology Group
Johannesburg, 20 Aug 2020

Gartner advises that IT can use advanced analytics to gain competitive business advantage, and that with the right approach, business intelligence (BI) can be a leading source of that advantage.

Certainly, it can be used to remodel and enhance the customer experience.

Forrester notes that everyone wants to monetise their data, but the growing stream of data and analytics requests is overwhelming the IT teams charged with gleaning insights from the material. While there is no crystal ball to see into the future, predictive analytics provides the means to make an educated guess.

Forrester goes on to highlight the "37 Major Machine-Learning Tools For 2020", noting that enterprises need more artificial intelligence (AI) and machine-learning (ML) solutions to drive value, transform their businesses and outperform the competition.

The starting point

Firstly, define the clear and measurable goals and outcomes you expect from the data strategy. Without the ability to measure uplift and estimate ROI, it becomes difficult to build stakeholder support.

Secondly, appoint a data analytics "champion" − an executive, or a person with authority, to spearhead the implementation of the organisation's data strategy. Change management is crucial for success as the workforce transitions from using gut feel and experience to using data and analytics to make better decisions.

Thirdly, select the correct technology partner to help implement the data analytics strategy. Ideally, the selected technology partner should have the necessary data engineering, data science and development capabilities, as well as subject matter expertise in the company's industry.


Such expertise means the data is well understood, with consistency issues detected early. Moreover, it ensures the right questions are asked and raw insight can be converted into action.

Lastly, select the correct platform for data lakes, extract, transform and load (ETL), analytics and machine learning. If the organisation's data volume, velocity and variety are broad, and/or it expects significant growth, cloud tools are the only way to go.

The alternative is to build and use your own infrastructure and data analytics tools, which requires a team of data engineers and DevOps engineers to manage and support, inflating the costs and risks significantly.

The next step

Data analytics is about extracting trends and patterns from historical data and using that insight to make better decisions and influence future outcomes.

Analytics and machine learning have become highly effective in making predictions, but AI does not do judgement. As AI prediction capabilities improve, the value of human prediction decreases, and the value of human judgement increases. Ensure you are utilising both to maximise the benefits; don't allow analytics alone to make automated decisions until the models have been battle-tested and polished over time.
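
To make this concrete, below is a minimal sketch in Python of one way to combine model prediction with human judgement: the model scores every case, but only high-confidence predictions are automated, while ambiguous ones are queued for a person. The scikit-learn classifier, the synthetic "churn" data and the 0.9/0.1 confidence thresholds are all illustrative assumptions, not a prescribed method.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for customer features (X) and churn labels (y).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    X_train, X_new, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=0)

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    # The model supplies the prediction; a confidence band decides where
    # human judgement is still required (thresholds are assumptions).
    proba = model.predict_proba(X_new)[:, 1]
    automated = (proba > 0.9) | (proba < 0.1)  # confident either way: automate
    needs_review = ~automated                  # ambiguous: route to a person
    print(f"{automated.sum()} decisions automated, {needs_review.sum()} sent for review")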

Data tiering is a way to optimise how the data is used operationally and to drive costs down. For example, data that is accessed infrequently, or that is not mission-critical, can be stored on slower, more affordable storage.
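
As a sketch of what tiering can look like in practice, the snippet below (Python with boto3; the bucket name, the "cold/" prefix and the day counts are all hypothetical) defines an AWS S3 lifecycle rule that automatically migrates ageing objects to cheaper storage classes:

    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-analytics-lake",  # placeholder bucket name
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "tier-cold-data",
                    "Filter": {"Prefix": "cold/"},  # only objects under this prefix
                    "Status": "Enabled",
                    "Transitions": [
                        # After 30 days, move to infrequent-access storage.
                        {"Days": 30, "StorageClass": "STANDARD_IA"},
                        # After a year, archive to Glacier.
                        {"Days": 365, "StorageClass": "GLACIER"},
                    ],
                }
            ]
        },
    )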

Analytics databases are different to high-transaction-volume operational databases, and each comes with pros and cons. Understanding the business goals, and how the insights will be used, will help to tier the data and tools correctly.

The company will need to devise protection, governance and redundancy policies and there are two ways of going about this: the easy or the hard way.

The easy way is to use cloud platforms such as GCP, AWS and Azure, where data security, governance and redundancy are first-class citizens, simple to implement and manage.
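
As a small illustration of that simplicity, the sketch below uses Python and boto3 to switch on default server-side encryption for an entire (hypothetical) S3 bucket with a single API call; the other major clouds offer comparably simple controls:

    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_encryption(
        Bucket="example-analytics-lake",  # placeholder bucket name
        ServerSideEncryptionConfiguration={
            "Rules": [
                # Every new object is encrypted at rest by default.
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )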

I often hear concerns and questions raised regarding the security of the cloud. The answer is simple: read the security policies of your favourite cloud provider and ask yourself how long it would take to achieve that level of security compliance and standards if you decided to build it internally.

The hard way is to build and manage it all in-house. My advice is to only explore this option if there are regulatory or compliance reasons preventing you from using the cloud. In that case, the regulation will already specify how to manage security and governance, and it becomes a question of "how" to implement as opposed to "what" to implement.

How will the data be used?

If there were a one-size-fits-all approach, everyone would be following the same process. It's crucial to have a clear understanding of how the data will be used, what the sources are and how many there are, and how "clean" the data is. Only then can you define the correct data ingestion strategy.

As a rule, start by building a data lake and dumping all the data, no matter how "dirty", into it. Then data teams can incrementally start building "analytics" storage and data pipelines from this central lake, as sketched below.
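
A minimal sketch of that lake-first pattern, assuming pandas and purely illustrative file paths and column names: land the raw extract untouched in the raw zone, then build a curated analytics table from the lake in a separate step.

    import pandas as pd

    # Step 1: land the source extract in the lake as-is, no cleaning.
    raw = pd.read_csv("exports/orders_2020-08.csv")      # hypothetical source file
    raw.to_parquet("lake/raw/orders/2020-08.parquet")    # raw zone, untouched

    # Step 2 (separate pipeline): read from the lake, clean, publish curated zone.
    orders = pd.read_parquet("lake/raw/orders/2020-08.parquet")
    orders["order_date"] = pd.to_datetime(orders["order_date"], errors="coerce")
    curated = orders.dropna(subset=["order_date", "customer_id"])
    curated.to_parquet("lake/curated/orders/2020-08.parquet")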

Once the data is imported and analysed, the process of developing individual "use case"-driven data pipelines begins.

One of the hardest and most time-consuming steps in data analytics is data preparation and the extract, transform and load process; it is common for 80% of a project's time to be spent getting this step right.

If the data is not clean, the results will be poor and inconsistent. Augmented analytics is the process of using technology and tools such as machine learning and AI to augment how data is ingested, transformed and analysed.
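
One example of augmentation in the preparation step is ML-based imputation. The sketch below (Python with scikit-learn; the tiny array is purely illustrative) uses a k-nearest-neighbours imputer to estimate missing values from similar records, rather than dropping rows or filling in a constant:

    import numpy as np
    from sklearn.impute import KNNImputer

    # Illustrative data with gaps (np.nan marks missing values).
    X = np.array([
        [1.0, 2.0, np.nan],
        [3.0, np.nan, 6.0],
        [7.0, 8.0, 9.0],
        [1.5, 2.5, 3.5],
    ])

    # Each missing value is estimated from the two most similar rows.
    imputer = KNNImputer(n_neighbors=2)
    X_clean = imputer.fit_transform(X)
    print(X_clean)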

Natural language processing (NLP) is another tool that can be used in analytics pipelines. It's especially useful for building chatbots and conversational utilities, but the principles behind NLP can be applied in various other analytics applications, from recommendation engines to data extraction and preparation steps.
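
As a hint of NLP beyond chatbots, the sketch below (scikit-learn; the three-product catalogue is made up) turns product descriptions into TF-IDF vectors and uses cosine similarity as the core of a crude recommendation engine:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical product catalogue.
    descriptions = [
        "wireless noise-cancelling headphones",
        "bluetooth over-ear headphones with mic",
        "stainless steel kitchen knife set",
    ]

    # Vectorise the text and compare every product with every other.
    tfidf = TfidfVectorizer().fit_transform(descriptions)
    similarity = cosine_similarity(tfidf)

    # Most similar product to item 0, excluding itself.
    best = similarity[0, 1:].argmax() + 1
    print(descriptions[best])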
