
Building a data-centric foundation

By Charmaine Shangase
Johannesburg, 06 Oct 2016

Join the free-to-attend Webinar, brought to you by Hewlett Packard Enterprise (HPE) in partnership with ITWeb, and hear how you can use the HPE workload-optimised infrastructure to solve big data problems. Click here to book your space.

Big data is not just a tool. By leveraging, processing and understanding the data, it is possible to gain insights that were never visible before.

This is according to Greig Lilienfeld, senior data centre architect at Hewlett Packard Enterprise (HPE).

"One obvious benefit of having these insights is gaining a competitive advantage over peers in the industry," he continued.

While he believes South African companies are not yet using big data insights to their full potential, he said there is growing awareness of their capabilities. "I believe that South African companies are definitely becoming aware of the potential value that big data insights could bring to the table, but the majority of companies are far from implementing a valuable solution. I was chatting to a number of delegates attending the 2016 Gartner Symposium, in Cape Town, and there is a lot of energy around what they believe they may be able to achieve with big data insights, but all voiced the challenge of not knowing where to begin," Lilienfeld said.

To improve this situation, companies must tap into the knowledge available from experts in the industry, starting with a "basics" workshop, he advised. "Most vendors offer a short discovery workshop that can help with understanding the basics of big data and the challenges associated with it. The workshops will help customers understand 'what' they can get out of big data, 'who' needs to be involved, 'where' the data should come from and 'how' to kick-start the journey. They will also help to align business and IT strategies, which often differ."

Lilienfeld said there is no one-size-fits-all solution: "Factors such as budget, data source, data quality, data security and privacy, as well as company standards, all play a major role in deciding what solution approach should be followed.

"It's important to both understand and differentiate the various components that make up a big data solution. From a bird's-eye view, the three high-level components are the data source (where the data is mined, i.e. both structured and unstructured data sources), the data lake (where the sourced data is stored to be analysed), and the analytics component (contextualising, sorting and extrapolating valuable information from the stored data).

"The quality of data is equally important - you have to ensure the data you will be working with will provide you with the right answers. The saying, 'Garbage in, garbage out', is far more important when you are working with hundreds of terabytes or even petabytes of data," he elaborated.

Greig Lilienfeld, senior data centre architect at Hewlett Packard Enterprise, will conduct a Webinar to help you build a data-centric foundation.

"At each point, there are choices that can tailor the end-to-end solution. The frequently overlooked component in this story is the data collection component, yet this can make or break a valuable solution. Where and how should one store this 'mined' data that can be easily accessed to turn it into valuable insights?" asked Lilienfeld.

"In the long term, businesses will become more agile and responsive to current and future customers by predicting their needs and future trends. Furthermore, the knowledge gained will help them proactively avoid critical business traps," he concluded.

Lilienfeld will present the free-to-attend Webinar, hosted by HPE in association with ITWeb Events. The Webinar will introduce attendees to methodologies for building big data solutions, as well as HPE's partnerships in the space.
