Data analytics going boldly into the future
Controlling the costs and optimising the performance of IT assets is not a new mandate for corporate IT execs - particularly in the data analytics arena.
But there's a slight change in the way managers are approaching it these days, because enterprises want operational analytics: analytics that feed them corporate performance management information and operational business intelligence. These two capabilities drive a number of benefits, not least better customer retention and greater customer value.
And the change? Gartner reckons enterprises are beginning to accept a higher initial cost for lower ongoing costs and better performance.
As a result, data architecture is evolving - which makes this one of the most exciting times to be a data-head. Sure, the future of analytics is predictive, and it will fold social media data into the mix, but right now, on the ground in South African enterprises, the dark days when you bunged a load of data into a warehouse and accessed it as needed are slowly receding into memory.
Replacing them are lighter times - ones that demand you be on top of your game - in which you must deliver data to those who need it and help them make sense of it, putting it in context and relating it back to their needs. It's proactive, not reactive.
Enterprises today have tons of data, but it is making sense of the data - turning it into information and putting it into the hands of the right people - that is really the key to maximising its value.
As an example: you have customer data, information about what customers bought, where and when, how much they paid, and then you have product data, distribution and supply chain data, and you have HR data. These undoubtedly sit in different systems, possibly even in different geographic locations. You need to combine the data so that you can begin to mine it and make some business decisions.
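Once those sources are pulled together in one place, "combining the data" is essentially a join. A minimal sketch, using an in-memory SQLite database as a stand-in for the extracted systems - every table and column name here is illustrative, not from any real schema:

```python
import sqlite3

# In-memory stand-ins for two separate source systems.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (customer_id INT, product_id INT, amount REAL, sold_at TEXT)")
db.execute("CREATE TABLE products (product_id INT, name TEXT, category TEXT)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", [
    (1, 10, 199.99, "2024-01-05"),
    (2, 11, 49.50, "2024-01-06"),
])
db.executemany("INSERT INTO products VALUES (?, ?, ?)", [
    (10, "Router", "Networking"),
    (11, "Headset", "Peripherals"),
])

# Once both sources land in one place, a join turns raw records into
# something a decision-maker can use: revenue per product category.
rows = db.execute("""
    SELECT p.category, SUM(s.amount) AS revenue
    FROM sales s JOIN products p ON p.product_id = s.product_id
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # → [('Networking', 199.99), ('Peripherals', 49.5)]
```

The hard part in practice is not the join itself but getting the two systems' data into one queryable place - which is exactly where the architecture choices below come in.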
There's much integration and data quality work that goes on in the background, but so far that's nothing new. And if you know your data from your information, your analytics from your intelligence, and your dashboard from your tool (not always easy to tell apart), you put a data warehouse together and copy all the data into it from the original sources. But that's a slow process. It requires data replication - always a problem. And it means you're basing your decisions on what happened yesterday, last week, or even last month.
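The staleness problem with that batch-copy pattern can be sketched in a few lines (all names are illustrative, and the "warehouse" is just a dictionary standing in for the real thing):

```python
from datetime import date

# A toy batch ETL: the "warehouse" only ever holds a snapshot taken at
# the last load, which is why decisions based on it lag the business.
source_system = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 2, "amount": 250.0},
]
warehouse = {"loaded_on": None, "rows": []}

def nightly_load(source, target):
    # Extract: replicate everything out of the operational system.
    snapshot = [dict(row) for row in source]
    # Load: overwrite the warehouse with the snapshot and stamp it.
    target["rows"] = snapshot
    target["loaded_on"] = date.today().isoformat()

nightly_load(source_system, warehouse)

# Anything written to the source after this point is invisible to
# reports until the next load runs.
source_system.append({"order_id": 3, "amount": 75.0})
print(len(warehouse["rows"]))  # 2, not 3: yesterday's view of the business
```

The replication cost and the reporting lag both fall out of the same design decision: the warehouse is a copy, refreshed on a schedule.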
Surely it would be far better to do without the data warehouse. That way you could save costs - warehouses are terrifically expensive, and then there's the ongoing cost of running and maintaining them. Using operational stores means no duplication of data. It also means using current data - literally up to the minute.
The ramifications are that this architecture grows the online transaction processing systems into data warehouses and business intelligence back-ends. Executives probably don't care about that, though. You can tell them it saves money and it gives them up-to-the-minute data with which to work.
But surely one of the biggest problems is the performance hit that the systems will take. Not so. The architecture uses data appliance technology to create a virtual warehouse, and a business intelligence model is created using standard query and reporting tools - no need to acquire additional warehousing skills - plugged into the appliance. And it is supported by a high-performance virtual hardware environment.
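One way to picture the virtual-warehouse idea is a federated query layer: it holds no data of its own, but fans each query out to the live operational stores and unions the results. This is a simplified sketch of that pattern, not any vendor's actual appliance, and every name in it is made up for illustration:

```python
# A minimal federated-query sketch of a "virtual warehouse": no copy of
# the data exists; each query is pushed to the live operational stores
# and the results are merged.

def eu_store(region_filter):
    # Stand-in for a live operational system in one region.
    live_orders = [("EU", 120.0), ("EU", 80.0)]
    return [o for o in live_orders if region_filter in ("*", o[0])]

def za_store(region_filter):
    live_orders = [("ZA", 300.0)]
    return [o for o in live_orders if region_filter in ("*", o[0])]

def virtual_warehouse(stores, region_filter):
    # Fan out, then union: no replication, and the answer reflects the
    # operational systems at the moment the query runs.
    results = []
    for store in stores:
        results.extend(store(region_filter))
    return results

print(virtual_warehouse([eu_store, za_store], "*"))
# [('EU', 120.0), ('EU', 80.0), ('ZA', 300.0)]
```

The appliance's job in the real architecture is to make this fan-out fast enough that the transaction systems don't feel it - which is where the high-performance hardware environment comes in.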
And then it goes one step further: it sweats existing assets by using the disaster recovery system to shadow the appliance and load-balance processing on the appliance.
It's going to fundamentally change vendor market dynamics because modern analytics is the final frontier - enterprises boldly going where none has yet ventured.