Some time ago (February 25–26), the Anti-Malware Testing Standards Organization (AMTSO) held its first meeting of the year. This time, it was hosted by McAfee and took place in Santa Clara, California.
One of the hot topics at the meeting was the initiative to review reports published by testing and certification organizations and companies.
How was this process designed? The AMTSO's Review Analysis Board (RAB) receives initial requests, decides whether to conduct a review, and coordinates the work of the Review Analysis Committee (RAC). The RAC comprises volunteer members who analyze reports against the organization's existing nine principles. The AMTSO's principles were agreed upon by its members, testers and antivirus vendors alike, and supported by the AMTSO's academic advisors. The testing principles mainly concern how published reports should be presented to their audiences.
The review process is not, however, intended to prove that the right things were done, but rather to verify that the things that were done were done right.
As such, as long as a test report includes an accurate description of how threat samples were gathered and validated, how the tests were conducted, and how the conclusions were drawn (including correct and fair communication among all parties involved in the testing), the report may be deemed compliant with the AMTSO's testing principles. The actual testing methodology used by a testing lab is not, itself, the subject of the review.
Take, for instance, a highly innovative test like the one conducted by NSS Labs last year. It was reviewed based on how well the testing methods and conditions were described and whether the conclusions followed from the test results, regardless of how the test was designed and what methodology it used.
The AMTSO's reviews intend neither to promote nor to constrain innovation in anti-malware product testing methodology, but rather to improve the quality of the published output.