By Tony Larks, Vice President, Global Consumer Marketing, Trend Micro
There’s no doubt that buying the right threat protection system can be a mind-boggling task for the modern consumer. That’s why a consumer guide can be a very helpful aid, especially in an area such as information security, where the choice is bewildering and it can be difficult to differentiate between products.
Problems arise, however, when consumers don’t have the whole story, making it difficult for them to accurately assess competing products. When the firms that compile the ratings don’t open their testing methodologies up to external scrutiny, they undermine their own work and risk misleading the very customers they purport to serve.
Consumers deserve to be given the full picture
A recent review of 2012 security software by the well-respected US title Consumer Reports placed Trend Micro™ Titanium™ 11th out of 14 vendors, a ranking we believe is misleading. The 8th-placed vendor (Norton), for example, was given 70 points versus Titanium’s 60, yet had exactly the same category marks with one exception: Titanium actually scored higher on performance. How could this be?
Similarly, Titanium ranked higher in three categories against 9th-placed F-Secure, while F-Secure outscored Titanium in only one category. Some categories are evidently being given more weight than others, but which ones, and why? Consumers deserve to be given the full picture.
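The arithmetic puzzle here can be made concrete with a toy sketch. The category names, marks, and weights below are invented purely for illustration (Consumer Reports has not published its actual data or weighting): if two products have identical marks in every category except one, where product A scores higher, then any weighted average with positive weights must rank A at least as high as B. A lower overall score for A therefore implies factors outside the published category marks.

```python
# Toy illustration of weighted category scoring. All numbers below are
# invented for the sake of argument; they are NOT Consumer Reports' actual
# marks or weights, which have not been published.

def weighted_score(marks, weights):
    """Overall score as a weighted average of per-category marks."""
    total = sum(weights.values())
    return sum(marks[cat] * weights[cat] for cat in marks) / total

# Product A matches product B in every category except one, where A is higher.
product_a = {"protection": 8, "updating": 7, "performance": 9, "ease_of_use": 7}
product_b = {"protection": 8, "updating": 7, "performance": 7, "ease_of_use": 7}

# However the (positive) weights are chosen, A can never score lower than B:
for weights in (
    {"protection": 1, "updating": 1, "performance": 1, "ease_of_use": 1},
    {"protection": 5, "updating": 2, "performance": 0.1, "ease_of_use": 1},
):
    assert weighted_score(product_a, weights) > weighted_score(product_b, weights)
```

Whatever hidden weights are in play, identical marks plus one higher category can only raise the total, which is exactly why the published point gap demands an explanation.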
Titanium is one of the fastest products
In fact, the only difference between Trend Micro Titanium and the 4th-placed vendor (ESET) is that Titanium was marked down on “updating” and “performance.” When it comes to updating, we are confident that Titanium is one of the fastest products around because it uses the Trend Micro Smart Protection Network™ cloud-security architecture to block new threats immediately, before they have a chance to get onto a customer’s network.
You can’t get much faster than “immediate,” and NSS Labs agrees, rating Trend Micro fastest in its 2009 and 2010 tests, a trend that continues.
When it comes to performance, Titanium was actually redesigned to improve performance by recommending the use of the built-in Microsoft Windows firewall. It appears that Consumer Reports did not include the Microsoft firewall in its testing for this category, which would handicap Titanium, but without greater transparency into the testing methods we cannot know for certain.
The threat landscape is moving at such a pace that vendors like Trend Micro are constantly being forced to up their game. We have recently moved file reputation to the cloud to improve performance and threat detection rates. This is what Trend Micro means when we say Titanium won’t slow you or your computer down. We have incorporated proactive network-layer botnet detection—so customers’ computers won’t be used to spew spam. And we’ve boosted browser exploit prevention capabilities to protect customers from websites that host malware.
In the end, the most important thing is to protect our customers’ digital lives by providing them with the fastest, most effective way of blocking threats while keeping their valuable digital assets out of harm’s way. We think we’re doing that pretty well. However, it’s not just the vendors who have to keep pace with rapid technological change; testing methodologies must evolve too.
Consumer Reports needs more openness and transparency about how they reach conclusions
Clearly, the consumer guides with a strong brand and captive audience like Consumer Reports need to take their responsibilities seriously in order to provide the most accurate information possible. This may involve working more closely with the vendor community. It will definitely require more openness and transparency about how they come to their conclusions. Only then will these tests have the legitimacy they deserve.
Tony Larks works for Trend Micro and is guest blogging for the Fearless Web. The opinions expressed here are his own.