The final hurdle
You're on the home stretch to implementing a cloud BI solution - but first consider your solution implementation options.
In my previous Industry Insights columns, I outlined the key points to consider when defining a cloud BI strategy, and emphasised the importance of understanding the potential risks and the critical success factors on the road to the cloud for BI.
At this stage, a company should have a clear idea of its cloud BI strategy and what its use cases and requirements are. The company knows exactly what risks it faces, what to do to mitigate these risks, and is aware of what the cloud BI solution is capable of achieving. Having an understanding of these aspects will give the company everything it needs to choose the most appropriate product and vendor.
However, before beginning, the company must carefully consider its solution implementation options. Remember - data, reporting, visualisation, analytics and many combinations of these can be implemented in the cloud. This is not a simple matter - there are a few points that need to be carefully considered.
The solution a company chooses to implement here needs to be determined based on the business case it has defined. Don't fall into the trap of letting this become an IT exercise to investigate and prove the cloud adoption for the entire enterprise. Maintain the focus on the existing business case.
All for one
If the data and visualisation/analytics are all required in the cloud, this needs to be done in one go. Why do I say this? Isn't it a good thing to break down the work into smaller cycles? In most cases, absolutely. However, the following must be considered:
* Hosting data in the cloud, while visualisation remains on-premises:
The impact of this is that all client-side reporting and analytics tools need to download the data in order to process it. This could be a once-off, but could also happen every time a process is executed. The company must ask whether the client-side tools are compatible with the cloud data: do they need to be upgraded, or completely swapped out? Can the network support such data traffic?
* Hosting visualisation in the cloud, while data remains on-premises:
This scenario is slightly better than the former, but is still not ideal, as the tools remain burdened by having to continuously upload data into the cloud.
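To make the network question above concrete, the following sketch estimates how long a client-side tool would spend moving data in either split scenario. The data volume, link speed and efficiency factor are illustrative assumptions, not figures from any vendor:

```python
# Rough transfer-time estimate for a split on-premises/cloud BI deployment.
# All figures are illustrative assumptions, not vendor quotes.

def transfer_hours(data_gb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Hours needed to move data_gb over a link_mbps connection.

    efficiency accounts for protocol overhead and contention
    (assumed here to be 70% of the nominal line speed).
    """
    megabits = data_gb * 8 * 1000          # GB -> megabits (decimal units)
    effective_mbps = link_mbps * efficiency
    return megabits / effective_mbps / 3600

# e.g. a 500 GB refresh pulled down (or pushed up) over a 100 Mbps line:
print(f"{transfer_hours(500, 100):.1f} hours per refresh")  # 15.9 hours per refresh
```

If such a refresh runs every time a report is executed rather than once off, the numbers quickly make the case for moving data and visualisation together.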
Once the company has a clear view of its solution implementation scenarios, it can begin the vendor and product selection process. This should be a formal process, consisting of an initial round, in which vendors and products are identified and the vendors complete a questionnaire on the related products and services. Following this, a shortlist of vendors should be included in a POC round, where a small-scale solution is implemented for assessment.
When selecting the products and vendors to be evaluated, it is important to remember the significant volume of intellectual property (IP) in the existing toolsets, and the potential costs of migrating this to a new platform. Therefore, while looking towards new, forward-looking products, a company should also include any products it already uses that have cloud versions available.
The overall criteria that should be evaluated throughout this process include the following key aspects specifically relating to cloud solutions:
* Cost: Estimating costs in cloud platforms can be tricky, as costs are often determined on a combination of storage, processing time and data exported from the cloud. A simple questionnaire will only provide a rudimentary sense of the costs. The POC, however, will provide the company with a useful environment that can be used to benchmark costs. These costs need to be evaluated within the context of the chosen solution implementation scenarios.
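As a starting point for the cost benchmarking described above, a back-of-envelope model can combine the three cost drivers mentioned: storage, processing time and data exported from the cloud. The unit prices below are placeholders; substitute the vendor's actual rates and the volumes observed during the POC:

```python
# Back-of-envelope monthly cloud BI cost model.
# Unit prices are placeholder assumptions, not real vendor rates.

STORAGE_PER_GB = 0.025    # $/GB-month of stored data (assumed)
COMPUTE_PER_HOUR = 2.00   # $/hour of query/processing time (assumed)
EGRESS_PER_GB = 0.09      # $/GB exported from the cloud (assumed)

def monthly_cost(storage_gb: float, compute_hours: float, egress_gb: float) -> float:
    """Estimated monthly bill from the three main cloud cost drivers."""
    return (storage_gb * STORAGE_PER_GB
            + compute_hours * COMPUTE_PER_HOUR
            + egress_gb * EGRESS_PER_GB)

# e.g. 2 TB stored, 150 processing hours, 300 GB downloaded by client tools:
print(f"${monthly_cost(2000, 150, 300):.2f} per month")  # roughly $377 per month
```

Note how the egress term ties back to the solution implementation scenarios: keeping visualisation on-premises inflates the exported-data volume, and the model makes that cost visible.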
* Migration to cloud: Assess whether the product has any migration capability that can help reduce the migration cost. Additional focus should be placed on existing products in the company.
* Regional support: Make sure to understand the cloud provider's regional availability, where the company's data is located, and ensure the relevant failover data centre is not in a country where the company's data cannot be hosted.
* Customer support level: Understand the level of customer support and associated SLAs.
* Accessibility: Understand how access to cloud services would work, i.e., Web portal, client-side applications, list of compatible devices, etc.
* Downtime history: Assess and compare the downtime history (if any) of the various cloud vendors.
* Skill requirements: Unless the company is sticking with an existing vendor and product, it is a given that staff will need to be re-skilled in the new technology set. It is important to do a realistic evaluation of the required skill level and type - in terms of languages, modelling, ETL, report creation, etc.
From start to finish, this is undoubtedly a long process, and could easily take six months or more. The rewards, however, should be evident. By following this process, a company will have a clear view of its cloud BI strategy, a cloud platform chosen to support its business objectives, and a clear view of how to implement its solution and which risks to avoid and/or manage.
This will give the company an extremely stable foundation, and it can now begin to implement its cloud BI solution with eyes wide open, fully informed and fully prepared.
Julian Thomas is principal consultant at PBT Group, specialising in delivering solutions in data warehousing, business intelligence, master data management and data quality control. In addition, he assists clients in defining strategies for the implementation of business intelligence competency centres, and implementation roadmaps for a wide range of information management solutions. Thomas has spent most of his career as a consultant in South Africa, and has implemented information management solutions across the continent, using a wide range of technologies. His experience in the industry has convinced him of the importance of hybrid disciplines, in both solution delivery and development. In addition, he has learned the value of robust and flexible ETL frameworks, and has successfully built and implemented complementary frameworks across multiple technologies.