Data De-Duplication Background: A Technical White Paper
The term "data de-duplication", as it is used and implemented by Quantum Corporation in this white paper, refers to a specific approach to data reduction built on a methodology that systematically substitutes reference pointers for redundant variable-length blocks (or data segments) in a specific dataset. The purpose of data de-duplication is to increase the amount of information that can be stored on disk arrays and to increase the effective amount of data that can be transmitted over networks. Because it operates on variable-length data segments, this form of de-duplication can provide greater granularity than single-instance store technologies, which identify and eliminate only repeated instances of identical whole files.
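The substitution of reference pointers for redundant variable-length segments can be illustrated with a minimal sketch. This is not Quantum's implementation; the rolling-sum boundary test, the segment sizes, and all function names here are illustrative assumptions chosen to show the general technique of content-defined chunking plus fingerprint-based pointer substitution.

```python
import hashlib

def chunk(data: bytes, min_size: int = 16, mask: int = 0x0F) -> list:
    """Split data into variable-length segments at content-defined
    boundaries (a toy rolling-sum scheme, for illustration only)."""
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) + b) & 0xFFFFFFFF
        # Cut when the hash's low bits match the mask and the
        # segment has reached a minimum length.
        if i - start + 1 >= min_size and (h & mask) == mask:
            chunks.append(data[start:i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])  # trailing partial segment
    return chunks

def deduplicate(data: bytes):
    """Store each unique segment once; represent the dataset as an
    ordered list of reference pointers (fingerprints)."""
    store = {}   # fingerprint -> unique segment bytes
    refs = []    # ordered reference pointers into the store
    for seg in chunk(data):
        fp = hashlib.sha256(seg).hexdigest()
        store.setdefault(fp, seg)  # only first occurrence is stored
        refs.append(fp)
    return store, refs

def rebuild(store: dict, refs: list) -> bytes:
    """Reconstruct the original dataset by following the pointers."""
    return b"".join(store[fp] for fp in refs)
```

On highly repetitive input, the pointer list stays the same length as the segment stream, but the store holds far fewer unique segments, which is the source of the capacity gain described above.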
- Quantum Corporation
- 09 Jun 2008
- 01 May 2008
- 13 pages
- White Paper