
Traditional backup tech under stress

By Admire Moyo, ITWeb's news editor.
Johannesburg, 29 Apr 2015
Think of backup and related activities as part of the data management continuum, says Harold van Graan, sales director at Actifio SA.

With the Internet of things (IOT) and big data, critical data volumes are growing exponentially, while the time allocated for backups is shrinking and many companies are unable to back up critical applications at all.

That's the view of Harold van Graan, sales director at ICT solutions provider Actifio SA, who believes the high adoption of server virtualisation has exacerbated the problem by increasing architectural complexity as well as creating a physical and virtual world that often has to be managed by the same teams using different technologies.

IDC notes that as of the end of 2013, there were 9.1 billion IOT units installed, and estimates the installed base of IOT units will grow at a 17.5% compound annual growth rate to 28.1 billion in 2020. In its 2014-2018 forecast, the analyst firm anticipates the big data technology and services market will grow at a 26.24% compound annual growth rate through 2018, reaching $41.52 billion.

Meanwhile, according to Gartner, at least 70% of x86 server workloads are virtualised, the market is mature and competitive, and enterprises have viable choices. It adds that an increasing number of enterprises are evaluating both the cost benefits of competitive migrations and the benefits of deploying multiple virtualisation technologies.

"Traditional backup technologies and teams are challenged to keep pace with the exponential data growth that requires a totally new way of thinking," says Van Graan. "We can't fix it by doing what we've always done but faster. We have to do it differently."

The key point, says Van Graan, is to think of backup and related activities such as archiving, recovery, replication and disaster recovery as part of the data management continuum. Traditional siloed approaches mean some companies can have anywhere from 10 to 120 sets of duplicate data copies, often stored in proprietary formats, he explains.

As research house 451 Research notes, says Van Graan, the copying function itself can reduce the performance of primary systems.

He points out that copy data virtualisation - a modern approach - entails capturing data in its native application format according to defined service-level agreements. This physical master copy can be used for all use cases via virtual live copies, or "live clones", he notes, adding it means virtual copies of a single physical master copy can be used for backup, test and development, disaster recovery, analytics and more.
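To make the concept concrete, the sketch below is an illustrative model only (not Actifio's implementation or API): it assumes a single physical "golden master" copy and shows how lightweight virtual clones can share its blocks copy-on-write, so backup, test and development, or analytics each get their own writable view without duplicating the data.

```python
# Illustrative sketch of copy data virtualisation (hypothetical classes, not a vendor API):
# one physical golden master, many virtual "live clones" that share its blocks.

class GoldenMaster:
    """The single physical copy of the application data, kept in native format."""
    def __init__(self, blocks):
        self.blocks = list(blocks)          # the only full physical copy

class VirtualClone:
    """A virtual live clone: reads fall through to the master, writes stay local."""
    def __init__(self, master):
        self.master = master
        self.overrides = {}                 # only changed blocks consume extra space

    def read(self, index):
        return self.overrides.get(index, self.master.blocks[index])

    def write(self, index, data):
        self.overrides[index] = data        # copy-on-write: the master is untouched

# One master can serve many use cases at once.
master = GoldenMaster(["blk0", "blk1", "blk2"])
test_dev = VirtualClone(master)
analytics = VirtualClone(master)

test_dev.write(1, "patched")                 # test/dev changes stay local to that clone
print(test_dev.read(1), analytics.read(1))   # -> "patched blk1"
```

Because each clone stores only the blocks it changes, adding further use cases costs little extra capacity, which is the basis of the storage reductions described below.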

"Data virtualisation is elegant because it dramatically reduces the amount of physical storage capacity required - for example we have seen reductions of 90%. The reduction in costs is significant," says Van Graan. "And because the data is stored in its native format, it is available almost instantly."

Best of all, he notes, copy data management frees the data from physical infrastructure, making it available to the entire enterprise, and thus increasing its value exponentially.

"It's ideally suited for cloud environments and organisations exploring the adoption of software defined data centres," Van Graan concludes.
