BI of the future
Business intelligence must address an expanded set of data needs, with a convergence of different technologies.
There's an interesting buzz in the business intelligence (BI) industry, where more and more clients are starting to ask what BI of the future looks like. I think there are a few factors that drive this question.
Vendors' messaging to their target market is definitely a strong driver that's causing users of their technology to start wondering. The typical message from vendors is that BI is dead and analytics replaces it. Depending on the vendor's own internal strategy, it might be pushing cloud platforms for analytics, like the big four (SAP, Oracle, IBM and Microsoft), or, in the case of some of the more niche players, punting visualisation or advanced analytics as if these were the be-all and end-all of the 'new BI'. And if the market is not confused yet, they'll throw words like search and text analytics, self-service, in-memory, or agile into the mix.
In the process of answering this question with my clients in the last year, I realised that companies have to get past the hype, and get back to understanding that all the available forms of BI exist for one reason only, and that is to meet actual business needs. Therefore, instead of asking what the future of BI will look like, rather ask what the future of the company's BI should look like to meet the needs of its information users and support its organisational strategy.
Granted, no company is 100% unique, and there are a few common business change trends that I've picked up in my experience with a variety of clients, where new and alternative BI technology approaches should be considered in order to adapt to the changing needs of business.
The first common trend speaks to how the general profile of data sources has changed over time. Traditionally, users' needs were satisfied by providing them with operational data brought into a central data warehouse, combined with a stock-standard stack of BI tools (reports, scorecards, and dashboards).
However, more users are now struggling with additional, non-operational data from various sources, ranging from internal spreadsheets to external cloud data providers like Google Analytics. As a result, users create manual workarounds to cope with the diverse data, leaving them feeling overwhelmed and constrained. The BI of the future must therefore address far more diversified data needs with a convergence of varied technologies specialised for different scenarios.
Just as the data source diversifies, the profile of BI users also changes significantly, leading to the second trend I've picked up on. The traditional user base was predominantly management decision-makers on all levels, supported by power users. The changing trend sees this profile being extended by adding a large operational consumer layer. Information is more pervasively consumed directly by operational systems to drive rule engines, enabling operational decisions as part of the workflow.
On the other extreme, increased sophistication in power users gives rise to a highly specialised community of data scientists needing advanced technologies such as big data, predictive and prescriptive analytics, and even machine learning for building operational intelligence into the operational systems. These data scientists are also the typical users that require far more self-service power in their BI tools.
This democratisation of data requires a paradigm shift that makes data central to IT and BI capability development in order to ensure the business intelligence competency centre's (BICC's) ability to effectively service the needs of the end-users. The traditional engagement model between business and IT has left BI orphaned and treated as an after-thought, but as BI becomes operationalised and more strategically relevant, data considerations must become central to the systems development life cycle.
A third trend is that BI needs to deliver at the speed of decisions. Traditionally, daily, weekly or monthly data refreshes were sufficient, and business accepted a six- to 18-month timeframe for delivering new capabilities. Today's pace of business has increased significantly: companies must be agile and adapt their tactics and strategies in-flight in order to remain competitive. The operational dependencies on BI therefore imply a dire need for faster, near real-time refreshes and more agile, flexible delivery cycles.
The last, and potentially most disruptive, factor is increased regulation. This trend is especially common in the financial services industry, but increasingly impacts other industries as well (especially companies that have crossed into providing financial services as value-added services). Corporate governance codes like King III have matured over the last decade, to the point where detailed regulations explicitly address the management and use of data and information to ensure reliable decision-making at a corporate governance level.
Besides corporate governance codes, Acts like POPI and the Amendment Bill to the ECT Act have significant implications for what companies may or may not do with their data, forcing BICCs to revisit their own methodologies, practices and governance.
The implications of these common business and environmental trends point to the answer: the main issues the BI ecosystem of the future must cater for:
* The rapid delivery of information supported by conventional BI capabilities, integrated with next-generation architectures, including data discovery, data cataloguing, data integration, data virtualisation, advanced analytics and more;
* A more agile BI delivery approach that enables tangible business value through data science at the speed companies make decisions; and
* Careful governance of all components of the ecosystem in order to protect the quintessence of BI: the single version of the truth.
Inevitably, a highly complex ecosystem such as this requires conscious stewardship, starting with a well-rounded, robust and sustainable strategy; a strategy that becomes the driving factor in determining what a company's BI should look like in the future.
A clear strategy empowers BI decision-makers to wade through the hype and identify the various innovative BI technologies that best suit their companies' needs, and thereby become the creators of their own BI fate.
Yolanda Smit is a strategic BI manager at PBT Group, specifically focused on developing and driving strategic blueprint roadmaps for business intelligence, master data management and other information management initiatives for clients in support of strategic achievement. Her understanding of the challenges of strategic execution converges with her passion for information insights as the driver for success to offer clients a unique approach to turning information into a concrete asset. Smit started off as a junior BI ETL and front-end developer at Harvey Jones Systems, and in a period of five years, established herself as an all-rounder with business and technical insight. Smit joined PBT Group in 2009, where she continues to consult in various industries and covers all verticals within the organisation. She completed her MBA with a specialisation in management consulting in 2008 at the University of Stellenbosch Business School. Smit's MBA research, which she completed cum laude, evaluated the effectiveness of existing BI offerings in supporting corporate strategy execution. She developed a tested model for business intelligence teams to guide them in how to shape their BI offering specifically geared towards the improvement of overall strategy execution.