Excerpts from Virus Bulletin Comparative Reviews August-December 2010
The basic requirements for a product to achieve VB100 certification status are that it detects, both on demand and on access, in its default settings, all malware known to be ‘In the Wild’ at the time of the review, and that it generates no false positives when scanning a set of clean files.
Various other tests are also carried out as part of the comparative review process, including speed and overhead measurements and ‘RAP’ (Reactive and Proactive) tests.
The RAP tests measure products’ detection rates across four distinct sets of malware samples. The first three of these comprise malware first seen in each of the three weeks prior to product submission and measure how quickly product developers and labs react to the steady flood of new malware. The fourth test set consists of malware samples first seen in the week after product submission. This test set is used to gauge products’ ability to detect new and unknown samples proactively, using heuristic and generic techniques.
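The reactive/proactive split described above can be sketched as a simple calculation: the three pre-submission weekly detection rates are averaged into a reactive score, while the post-submission week stands alone as the proactive score. The function and sample figures below are illustrative only, not Virus Bulletin's own tooling or data.

```python
# Hypothetical sketch of a RAP score calculation. The week arguments are
# detection rates (0-100) against the four RAP sample sets; names and
# values are invented for illustration.

def rap_scores(week_minus3, week_minus2, week_minus1, week_plus1):
    """Return (reactive, proactive) detection averages in percent.

    The first three arguments cover malware first seen in the three
    weeks before product submission; the last covers samples first
    seen in the week after submission.
    """
    reactive = (week_minus3 + week_minus2 + week_minus1) / 3
    proactive = week_plus1
    return reactive, proactive

reactive, proactive = rap_scores(95.0, 92.5, 88.4, 70.3)
print(f"reactive: {reactive:.1f}%, proactive: {proactive:.1f}%")
# → reactive: 92.0%, proactive: 70.3%
```

A typical pattern in such results is a proactive score noticeably lower than the reactive one, since the week-after set can only be caught by heuristic and generic detection rather than by signatures added in response to known samples.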
While the results of these secondary tests do not affect a product’s qualification for VB100 certification, they are included to provide the reader with a better overall picture of product performance.