Data De-Duplication Background: A Technical White Paper
sponsored by Quantum Corporation

The term "data de-duplication," as used and implemented by Quantum Corporation in this white paper, refers to a specific approach to data reduction: a methodology that systematically substitutes reference pointers for redundant variable-length blocks (or data segments) within a dataset. The purpose of data de-duplication is to increase the amount of information that can be stored on disk arrays and to increase the effective amount of data that can be transmitted over networks. Because it operates on variable-length data segments, data de-duplication can provide finer granularity than single-instance store technologies, which identify and eliminate only repeated instances of identical whole files.
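To make the pointer-substitution idea concrete, the sketch below shows one possible form of variable-length de-duplication in Python: segment boundaries are chosen by a simple content-dependent test (a rolling byte sum with illustrative minimum, maximum, and average segment sizes), each unique segment is stored once under its SHA-256 fingerprint, and repeated segments are replaced by references to the stored copy. The chunking rule, constants, and class names here are assumptions chosen for illustration, not Quantum's implementation.

import hashlib

# Illustrative constants (assumptions, not values from the white paper).
MIN_SEG = 1024            # smallest allowed segment
MAX_SEG = 65536           # hard cap so a segment cannot grow without bound
BOUNDARY_MASK = 0x0FFF    # yields segments of roughly 4 KiB on average

def chunk(data):
    """Yield variable-length segments whose boundaries depend on the content
    itself, so inserting bytes early in a stream does not shift every later
    segment boundary (the property that makes variable-length de-dup effective)."""
    start = 0
    rolling = 0
    for i, b in enumerate(data):
        rolling = (rolling + b) & 0xFFFFFFFF
        length = i - start + 1
        if (length >= MIN_SEG and (rolling & BOUNDARY_MASK) == 0) or length >= MAX_SEG:
            yield data[start:i + 1]
            start = i + 1
            rolling = 0
    if start < len(data):
        yield data[start:]

class DedupStore:
    """Keeps one copy of each unique segment; duplicates become reference pointers."""
    def __init__(self):
        self.segments = {}            # fingerprint -> segment bytes (stored once)

    def write(self, data):
        recipe = []                   # ordered fingerprints: the "reference pointers"
        for seg in chunk(data):
            fp = hashlib.sha256(seg).hexdigest()
            if fp not in self.segments:
                self.segments[fp] = seg   # unique data is stored
            recipe.append(fp)             # redundant data is only referenced
        return recipe

    def read(self, recipe):
        """Reassemble the original byte stream from its segment references."""
        return b"".join(self.segments[fp] for fp in recipe)

Writing two datasets that share a long common prefix illustrates the effect: the shared segments are physically stored only once, while each write returns its own list of references.

store = DedupStore()
first = b"A" * 10000 + b"report version one"
second = b"A" * 10000 + b"report version two"   # mostly redundant with `first`
r1, r2 = store.write(first), store.write(second)
assert store.read(r1) == first and store.read(r2) == second
stored = sum(len(s) for s in store.segments.values())
print(f"logical bytes: {len(first) + len(second)}, physically stored: {stored}")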