As organisations move from disparate proofs of concept (POCs) to enterprise-wide AI deployments, they must build high-quality, properly governed data foundations to ensure their AI investments deliver the expected operational benefits, long-term efficiency gains and ROI. This is according to speakers at an executive breakfast hosted by iOCO, in partnership with HPE, in Sandton this week.
The forum highlighted that a solid data foundation is the single most critical factor separating successful AI initiatives from failed experiments.
Candice Solomons, Business Executive: Data & Analytics at iOCO, said AI was generating both excitement and anxiety in enterprises – excitement about its potential and anxiety about execution.
“The real-world conversations CIOs are having now highlight issues such as strategy paralysis, cost explosions and flawed data causing failure points,” she said. “They face questions such as how to manage an AI strategy when AI evolution makes it a moving target, why an AI project that ran successfully in a sandbox environment became prohibitively expensive in the real world, and why promising AI models delivered misinformation.
“Successful AI depends on a solid data foundation with properly curated, staged and stored data, which includes quality assurance and traceability. If you are spending millions on an AI model built on data that is not governed or properly curated, you are wasting your investment,” Solomons said. “Getting the basics right is essentially pre-paying for scalable, trustworthy AI in future.”
In a panel discussion, Asgarali Mia, Principal Consultant at iOCO Digital South Africa, and Louis de Gouveia, Data Competency Manager at iOCO, outlined approaches to integrating data for use in AI models, and scaling POCs.
De Gouveia said: “One challenge organisations face is data silos that typically exist in large enterprises. These can be overcome by creating data lakes, or by implementing a data mesh – a thin layer connecting the silos, but still allowing each silo to be managed by the division that created it.”
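To make the “thin layer” idea concrete, the sketch below shows one way such a data mesh-style access layer might look in Python. The domain names, stores and functions are illustrative assumptions, not tooling described by the speakers: each division keeps ownership of its own store, while a shared registry brokers governed, read-only access for AI pipelines.

```python
# A minimal sketch of a "thin layer" over departmental data silos (data mesh style).
# Domain names, schemas and the SQLite files are illustrative assumptions only.
import sqlite3
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class DataProduct:
    """A domain-owned dataset exposed through the shared layer."""
    domain: str          # division that owns and manages the data
    description: str
    read: Callable[[str], list[tuple[Any, ...]]]  # governed, read-only access

def sqlite_reader(path: str) -> Callable[[str], list[tuple[Any, ...]]]:
    """Wrap a division's SQLite store behind a read-only query function."""
    def run(query: str) -> list[tuple[Any, ...]]:
        with sqlite3.connect(f"file:{path}?mode=ro", uri=True) as conn:
            return conn.execute(query).fetchall()
    return run

# The mesh itself is a registry, not a copy of the data: each silo stays
# with the division that created it, and the layer only brokers access.
mesh: dict[str, DataProduct] = {
    "sales.orders": DataProduct("sales", "Orders by region", sqlite_reader("sales.db")),
    "hr.headcount": DataProduct("hr", "Headcount by unit", sqlite_reader("hr.db")),
}

def query(product: str, sql: str) -> list[tuple[Any, ...]]:
    """Single entry point an AI pipeline calls instead of reaching into silos."""
    return mesh[product].read(sql)

# Example (assumes the illustrative databases and tables exist):
# rows = query("sales.orders", "SELECT region, SUM(amount) FROM orders GROUP BY region")
```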
Mia said: “Organisations must have the right planning, strategy and AI platforms in place. It is vital to have the right governance in place – including considering the risks to enterprise data when using public LLMs, ensuring you are validating external data and continually monitoring the AI model. Organisations can start small, focusing on AI use cases that will add value to the business.”
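Mia's point about validating external data and continually monitoring what reaches a model can be illustrated with a small sketch. The field names, rules and logging choices below are assumptions for illustration, not a prescription from the panel: incoming records are checked against basic quality rules before ingestion, and rejects are logged so the pipeline can be monitored.

```python
# Minimal sketch of validating external data before it reaches an AI model.
# Field names, rules and sample values are illustrative assumptions.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

REQUIRED_FIELDS = {"customer_id", "amount", "currency"}

def validate(record: dict) -> list[str]:
    """Return a list of rule violations for one external record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        issues.append("amount is not numeric")
    if "currency" in record and record["currency"] not in {"ZAR", "USD", "EUR"}:
        issues.append(f"unknown currency: {record['currency']}")
    return issues

def ingest(records: list[dict]) -> list[dict]:
    """Keep only records that pass validation; log the rest for monitoring."""
    accepted = []
    for rec in records:
        issues = validate(rec)
        if issues:
            log.warning("rejected record %s: %s", rec.get("customer_id"), issues)
        else:
            accepted.append(rec)
    log.info("accepted %d of %d external records", len(accepted), len(records))
    return accepted

if __name__ == "__main__":
    sample = [
        {"customer_id": "C1", "amount": 120.0, "currency": "ZAR"},
        {"customer_id": "C2", "amount": "n/a", "currency": "GBP"},
    ]
    ingest(sample)
```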
Saleh Al-Nemer, Chief Technology Officer & Chief Architect at HPE, noted: “No organisation can afford to ignore AI.”
However, he highlighted challenges such as harnessing data in hybrid IT environments with millions of edge devices, and vast amounts of structured and unstructured data. To overcome these challenges, organisations need the right partners, with technology designed to address them, he said.
Al-Nemer elaborated on the need for AI factories – purpose-built environments that enable enterprises to industrialise AI. These systems are engineered to handle massive data sets and complex workloads while maintaining seamless performance and operational efficiency and ensuring data control, compliance and security.
He noted that HPE’s edge-to-cloud solutions enable enterprises to build AI factories to connect, protect, analyse and act on data wherever it resides. HPE drives AI innovation with modular AI factory solutions, providing all the infrastructure, software and services needed for a complete, industrial-scale AI environment.
“AI is very demanding and drives up cooling requirements. HPE has been doing liquid cooling for years and can help organisations cut cooling requirements by up to 90%,” Al-Nemer said.
HPE's cooling solutions for AI include direct liquid cooling, 100% fanless system architectures and immersion cooling, enabling customers to significantly lower their utility costs and carbon footprint.