IT downtime costs Global 2000 companies $400 billion per annum and undermines data resilience, according to Veeam Software. This statistic should serve as a warning to local businesses to assess the strength of their data resilience and manage it better.
Data resilience is an organisation’s ability to protect and recover data in the event of disruption. It is well documented that AI-driven cyber threats continue to rise in number and sophistication, which adds more pressure on organisations to maintain a high level of data resilience.
This is why Veeam Software has launched the Data Resilience Maturity Model, a framework to help organisations assess their data resilience postures, identify necessary actions and align people, processes and technology with data strategies.
The company cites a report released by Veeam and McKinsey, which it claims reveals a staggering disconnect: while 30% of CIOs believe their organisations are above average in data resilience, fewer than 10% actually are.
According to the report, IT downtime costs each Global 2000 company $200 million a year (in line with the $400 billion aggregate across the roughly 2,000 companies on the list), with losses spanning outages, reputational damage and operational disruption.
The research also found that 74% of organisations fall short of best practices, operating at the two lowest levels of maturity. More than 30% of CIOs at the least resilient companies mistakenly believe their data resilience capabilities are better than they actually are, exposing their businesses to potential failure.
“Data resilience is critical to survival – and most companies are operating in the dark,” said Anand Eswaran, CEO of Veeam.
“Data resilience isn’t just about protecting data, it’s about protecting the entire business,” Eswaran continued. “This is the difference between shutting down operations during an outage or keeping the business running. It’s the difference between paying a ransom or not. It provides the foundation for AI innovation, compliance, trust and long-term performance – including competitive advantage.”
The company says organisations must take a proactive approach to developing data resilience.
“First and foremost, it’s essential to mitigate risks. While we know that it’s impossible to eliminate every risk, identifying and reducing as many as possible is crucial for building a strong foundation,” Eswaran continued.
Preparedness is another key component, he added. “Organisations must be ready to respond when incidents occur, which means having robust incident response plans in place. These plans should be clear and actionable, providing a roadmap for teams to follow during challenging situations.”
The company advocates regularly testing processes and technologies to ensure an organisation can handle disruptions effectively. This means not just routine checks, but consistent and comprehensive testing, especially under worst-case scenarios, Eswaran added.
According to Veeam Software, while local statistics about data resilience may not always be available, the local market should take cognisance of what is happening internationally.
“The insights derived from global data are still incredibly relevant to local contexts. The challenges of data resilience and the impact of downtime are universal issues that organisations face, regardless of their geographical location. By understanding these global trends, local businesses can better assess their own vulnerabilities and the potential costs of inaction,” said Eswaran.
Ultimately, the lessons learned from global statistics can serve as a vital guide for organisations looking to enhance their data resilience, Eswaran added.
“It’s about taking those insights and applying them to local strategies – ensuring that every organisation, no matter where they are, can make informed decisions that bolster their data resilience efforts and minimise downtime. In today’s interconnected world, we must think globally while acting locally to achieve true data resilience.”