New BI architectures needed

By Alex Kayle, Senior portals journalist
Johannesburg, 28 Feb 2012

The business intelligence (BI) world is moving into a second era of data warehousing to meet new requirements from business users. By 2012, business units will control at least 40% of the total BI budget.

This is according to Rick van der Lans, international BI expert from R20/Consultancy, who gave an overview of global trends in the BI arena during this morning's keynote address at the ITWeb BI Summit, at The Forum, in Bryanston.

“New technologies create new opportunities. With BI, the technology and the requirements are there. We need to change our architectures. The whole concept of a chain of databases has reached the end of its era. We are entering a new era of data virtualisation,” said Van der Lans.

Data virtualisation is a software layer between reports and data stores. According to Van der Lans, data virtualisation enables IT to decouple applications from storage structures. He says IT can redirect data marts and build a much simpler architecture.
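To make the decoupling idea concrete, the following is a minimal Python sketch, not any vendor's product or Van der Lans's own design: reports query logical views through a virtualisation layer, and IT can redirect a view to a different data store without touching the reports. All names here (VirtualisationLayer, monthly_sales, the in-memory warehouse) are hypothetical.

# Illustrative sketch only: a toy virtualisation layer between reports
# and data stores. Reports ask for logical views; the layer resolves
# them to physical queries, so the storage layout stays hidden.
import sqlite3

class VirtualisationLayer:
    """Maps logical view names to physical sources and queries."""

    def __init__(self):
        self._connections = {}   # source name -> open connection
        self._views = {}         # logical view -> (source name, SQL text)

    def register_source(self, name, connection):
        self._connections[name] = connection

    def define_view(self, view_name, source_name, sql):
        # One central definition; every reporting tool queries this view.
        self._views[view_name] = (source_name, sql)

    def redirect_view(self, view_name, source_name, sql):
        # Point an existing logical view at a new data store without
        # changing any report that uses it.
        self._views[view_name] = (source_name, sql)

    def query(self, view_name):
        source_name, sql = self._views[view_name]
        return self._connections[source_name].execute(sql).fetchall()

# Usage: a report asks for "monthly_sales" and never sees the physical tables.
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE sales (month TEXT, amount REAL)")
dw.execute("INSERT INTO sales VALUES ('2012-01', 100.0), ('2012-02', 150.0)")

layer = VirtualisationLayer()
layer.register_source("warehouse", dw)
layer.define_view("monthly_sales", "warehouse",
                  "SELECT month, SUM(amount) AS total FROM sales GROUP BY month")
print(layer.query("monthly_sales"))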

“All organisations use various tools from various vendors and have metadata specifications that are replicated all over. As soon as an organisation uses data virtualisation servers, all the tools use one specification. This means that data virtualisation can enable BI architecture to bring metadata in a centralised way.

“Data virtualisation can bridge the gap between replicated data stores and what business users need in their BI environments.”
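The centralised-metadata point can be sketched the same way, again purely as an assumption-laden illustration rather than a real product's behaviour: two hypothetical reporting tools resolve the same shared view specification instead of each keeping its own replicated copy.

# Hypothetical sketch of the "one specification" idea: both tools below
# read the same central view definition, so a change made in one place
# is picked up by every tool.
SHARED_METADATA = {
    "monthly_sales": {
        "source": "warehouse",
        "sql": "SELECT month, SUM(amount) AS total FROM sales GROUP BY month",
        "columns": ["month", "total"],
    }
}

def build_report(tool_name, view_name, metadata=SHARED_METADATA):
    spec = metadata[view_name]
    # Each tool resolves the same definition; no per-tool copies to drift apart.
    return f"{tool_name} runs on {spec['source']}: {spec['sql']}"

print(build_report("dashboard_tool", "monthly_sales"))
print(build_report("adhoc_query_tool", "monthly_sales"))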

He added: “The data warehousing system that IT has been building for the last 15 years does not have the architecture that can cope with new requirements of managing huge volumes of data in order to make fast business decisions.”

According to Aberdeen Research, current data warehouse platforms are falling short: 45% of respondents were not happy with the query performance they were getting from their data servers. Other challenges include inadequate data load speeds and the high cost of scaling.

Says Van der Lans: “Business pressures demand a more flexible approach to BI delivery, yet 42% of enterprises report that making timely decisions is becoming more difficult.

“The average time needed to add a new data source was 8.4 weeks in 2009, 7.4 weeks in 2010, and 7.8 weeks in 2011. Around 33% of respondents say they need more than three months to deploy a new data source. Developing a complex report or dashboard took an average of seven weeks in 2011.”

According to Van der Lans, business users demand more information and they need it faster than ever before. Added to this challenge is the growing volume of data, which is creating more complex data warehouses, and IT staff face growing backlogs of information requests. “We don't have the flexibility anymore in current architectures to give users what they need.”
