White paper: Backup and disaster recovery: What you need to know about data reduction techniques!


Johannesburg, 05 Apr 2018

Data deduplication is not an optimal storage technique for every backup-related workload. Applications that use backup data as the source to power business continuance, development, testing, quality assurance and data warehouse population all benefit from faster back-end storage I/O than deduplication can deliver. Furthermore, the cost assumptions that once made deduplication essential for disk-based storage of backup data have changed as the industry has advanced.

As a technique for reducing the amount of storage required for backup data, data deduplication uses sophisticated algorithms to minimise redundancy. Deduplication has proven particularly valuable for backup applications, which historically contain large amounts of duplicate data from one backup to the next. The technique reduces the storage required for backup applications to a fraction of what was needed in the tape era. Lowering storage capacity requirements reduced the cost of storing backup data, making it economically feasible to back up data to disk.
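To make the idea concrete, here is a minimal sketch of one common approach, fixed-size chunk deduplication with content hashing; this is an illustration of the general technique, not any particular vendor's implementation, and the function names and 4 KB chunk size are assumptions chosen for the example:

```python
import hashlib

def dedupe_store(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep one copy per unique chunk.

    Returns (store, recipe): store maps a SHA-256 digest to the chunk's
    bytes, stored only once; recipe is the ordered list of digests needed
    to reassemble the original stream.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicate chunks add no storage
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Reassemble the original data from the chunk store and a recipe."""
    return b"".join(store[d] for d in recipe)
```

Because successive backups of the same system share most of their chunks, a second backup mostly adds recipe entries (small digests) rather than new chunks, which is why deduplication ratios for backup workloads are so high. The trade-off the article describes also shows up here: restoring data means chasing digests through the store, which costs more back-end I/O than reading a contiguous, non-deduplicated copy.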

Since the introduction of deduplication more than ten years ago, the industry has advanced, changing the cost/benefit math of deduplication. What has changed?