Monday, December 15, 2008

nuggets of misinformation

over the weekend martin mckeay published a post asking people what free av they used at home... the story is ordinary enough - i'm sure a lot of people out there have faced the problem of what anti-malware software to choose, whether a free one, one of the big-name for-fee ones, or none at all (and for the record, i'm not in any of those camps)... martin is a well known security blogger and podcaster, and he knows a lot about security and privacy related subjects, but from this fairly informal posting i now know that martin does not know av...

what caught my eye about his post came near the end, where martin pointed towards this proactive detection test report as showing how ineffective av really is... for everyone's benefit, tests of proactive protection capabilities are tests specifically designed to bypass the signature-based portion of an anti-malware product so as to test only the heuristic components... that one word - "proactive" - all on its own would tell someone familiar with this field that the test does not measure the overall effectiveness of products but rather just the effectiveness of a subset of the technologies in those products - and that word was right in the main heading for the report...

reading further (i.e. reading the introduction) reveals that the subset of technologies tested is further constrained... the test measures the effectiveness of static heuristic techniques only - no dynamic heuristics, nothing involving run-time behavioural detection or anything like that... it should be clear that when you're only testing a small part of a product your results won't indicate its overall effectiveness... [EDIT dec. 16, 2008: turns out i read the intro wrong, however it's still only a test of the heuristic components of anti-virus products rather than of the entire products, and thus not a reflection of their overall effectiveness]

of course, if you don't understand the terminology being used and only look at the numbers and the graphs, then you might well think this represents the overall effectiveness - that's probably why martin thinks the effectiveness of av is somewhere between 60% and 80% (not too different from the numbers on the report he points to) when the latest on-demand tests (which still don't include run-time behavioural detection, but do include a broader range of the detective capabilities of the products) performed by both av-comparatives.org and av-test.org place the effectiveness of most products well above 90%...
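to make that gap concrete, here's a rough sketch in python with completely made-up numbers (they're placeholders for illustration only, not figures from any real test) showing how a heuristics-only score can sit far below a product's overall detection rate:

```python
# rough sketch with made-up numbers - these are NOT real test results,
# just placeholders to illustrate the interpretation error...

signature_rate = 0.98   # assumed fraction of known samples caught by signatures
heuristic_rate = 0.65   # assumed fraction of unknown samples caught by heuristics
known_fraction = 0.90   # assumed fraction of the test corpus that is known malware

# a proactive test uses only samples chosen to bypass the signatures,
# so it reports (roughly) just the heuristic rate...
proactive_score = heuristic_rate

# an on-demand test scans the whole corpus, so signatures and
# heuristics both contribute to the result...
overall_score = (known_fraction * signature_rate
                 + (1 - known_fraction) * heuristic_rate)

print(f"proactive (heuristics-only) score: {proactive_score:.0%}")  # 65%
print(f"overall on-demand score: {overall_score:.0%}")              # ~95%
```

the exact numbers don't matter - the point is simply that a score for one component in isolation tells you very little about the product as a whole...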

sadly, of all the people who responded to his post, none seem to have noticed this interpretation error so far... i'm sure everyone has heard the saying that there are lies, damned lies, and statistics... since numbers can be so misleading, it behooves one to familiarize oneself enough with a topic to at least properly interpret those numbers and avoid being so easily fooled by them...

3 comments:

Unknown said...

So give me a post explaining the difference between static and dynamic heuristics and run-time behavioral analysis! I'm a PCI assessor and before that I was an IDS monkey. I understand the basics of AV but not the depths. Educate me rather than just pointing out what I don't know. I'd appreciate the added knowledge.

kurt wismer said...

in fact, this post did explain where and what the error was, and it wasn't about the differences between static and dynamic heuristics...

the error had to do with not understanding the significance of the word "proactive" in anti-malware testing...

however, point taken - since i only have a definition up about normal (static) heuristics, i should put one up about dynamic heuristics as well so that when i use the term i can link to a definition of that too... i'll work on that today...

kurt wismer said...

i struck off the part about dynamic and static heuristics - mentioning it was a mistake on a number of levels, not the least of which is that it seems to have clouded the issue...