
The business risk from poor data management has never been so great.
So says Nick Wonfor, country manager at CommVault, who points out that the massive increase in data volumes is continuing to overwhelm IT resources.
Findings by IDC show that the quantity of digital archive data being created is growing at an alarming rate: within three years, organisations will have four times the current volume to retain and search. According to Wonfor, this influx of data means that many companies are struggling to meet backup deadlines, and scan times alone are contributing significantly to backup windows.
"Organisations are now fighting against rising data management capital costs whilst trying to focus operationally on managing more data. The heightened value of information today also puts more pressure on staff to meet the stringent SLAs relating to the retention and recovery of mission-critical data," he says.
Wonfor notes that for many organisations, the solution to this big data problem is to deploy various products to solve backup and restore issues while also seeking to improve archive and storage resource management capabilities. He admits that snapshot technology can tackle shrinking backup windows and that deduplication is proven to reduce the amount of data being stored for short-term backups.
"Source-side de-dupe can minimise network and data activity, and archiving is still one of the best ways to reduce the amount of old data on costly primary stores," says Wonfor.
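The idea behind source-side deduplication is that the client fingerprints each chunk of data before transmission and sends only chunks the backup server has not already stored. The sketch below is a minimal illustration of that principle, not CommVault's implementation; the fixed-size chunking and the in-memory `server_index` set are simplifying assumptions (production systems typically use content-defined chunking and a persistent fingerprint database).

```python
import hashlib

def chunk(data: bytes, size: int = 4096):
    """Split a byte stream into fixed-size chunks (real systems often
    use variable, content-defined chunking instead)."""
    for i in range(0, len(data), size):
        yield data[i:i + size]

def dedupe_backup(data: bytes, server_index: set) -> list:
    """Source-side dedupe: hash each chunk locally and transmit only
    chunks whose fingerprint the server does not already hold."""
    to_send = []
    for c in chunk(data):
        fingerprint = hashlib.sha256(c).hexdigest()
        if fingerprint not in server_index:
            server_index.add(fingerprint)
            to_send.append(c)
    return to_send

# Backing up the same data twice sends nothing over the wire
# the second time, since every fingerprint is already indexed.
index = set()
first = dedupe_backup(b"A" * 8192 + b"B" * 4096, index)   # 2 unique chunks sent
second = dedupe_backup(b"A" * 8192 + b"B" * 4096, index)  # 0 chunks sent
```

Note that the two identical 4 KB "A" chunks in the first backup also collapse to one transmitted chunk, which is why deduplication shrinks even an initial full backup, not just repeats.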
Although these options alleviate the problem, Wonfor asserts that implementing solutions from multiple vendors can create inefficient data silos that are difficult to manage, and that these solutions fail to address the bigger problem of too much data being processed too many times, or being stored for the wrong period of time on the wrong media.
"The answer to the big data challenge is that instead of looking for short-term band-aids for existing solutions, companies should consider a converged process for backup, archive and reporting," he says, adding that a unified approach to data management, only reading and moving data once, can eliminate redundant processes and speed up operations. This will reduce storage costs and simplify management policies, he says.
"A single data policy enables organisations to scan, copy, index, analyse and store data once. It also enables data analytics to be performed, classifying the data and automatically applying archive policies for data tiering that will ultimately reduce the total cost of ownership."
According to Wonfor, using in-built intelligent data collection reduces scan times, which allows companies to keep within incremental backup windows. He adds that the single pass scan routine for backup, archive and reporting reduces server loads while integrating source-side deduplication and synthetic full backups minimises server and network load.
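A single-pass scan of this kind can be pictured as one walk over the file tree that feeds the backup, archive and reporting decisions together, rather than three separate scans touching the same files. The sketch below is an illustrative outline only, assuming the article's description rather than any vendor's API; the `ARCHIVE_AGE_DAYS` threshold and the mtime-based incremental test are hypothetical stand-ins for real policy rules.

```python
import os
import time
from dataclasses import dataclass, field

ARCHIVE_AGE_DAYS = 365  # illustrative tiering threshold, not a product default

@dataclass
class ScanResult:
    to_backup: list = field(default_factory=list)   # incremental backup set
    to_archive: list = field(default_factory=list)  # candidates for tiering
    total_bytes: int = 0                            # reporting figure

def single_pass_scan(root, last_backup, now=None):
    """One walk over the tree serves backup, archive and reporting at
    once, so each file is stat'ed a single time per cycle."""
    now = now or time.time()
    result = ScanResult()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            result.total_bytes += st.st_size                      # report
            if st.st_mtime > last_backup:                         # incremental
                result.to_backup.append(path)
            if now - st.st_mtime > ARCHIVE_AGE_DAYS * 86400:      # archive
                result.to_archive.append(path)
    return result
```

Because only files modified since `last_backup` land in the backup set, the routine stays within an incremental window even as the total volume under `root` grows, which matches the scan-time point made above.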
"Instead of just moving the pain point, a converged solution can reduce the combined time typically required to manage backup, archive and report by more than 50% - helping companies to affordably protect, manage and access data on systems that have otherwise become 'too big'," Wonfor concludes. "If businesses want to be in a position to efficiently overcome the backup challenges of tomorrow, they need to take a different approach today."