Thursday, January 03, 2013

imperva's anti-virus study is garbage

Enough is enough! I have had it with these motherf#$%ing flakes on this motherf#$%ing train of thought - (what i imagine samuel l. jackson might say if he were following this nonsense about imperva)

in case you are unfamiliar, imperva (a security vendor of some sort) commissioned a bunch of students from the technion-israel institute of technology to evaluate the efficacy of anti-virus (all anti-virus as a whole, apparently, rather than comparing products to each other) by uploading 82 found samples to virustotal. yes, you read that right, it's yet another virustotal-based test.
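(to see just how little machinery a "test" like this involves, here's a minimal sketch in python against virustotal's public API v2 - the api key and sample hashes are placeholders, though the "positives"/"total" fields are what the v2 report format actually returns. note what's being counted: hits from the commandline engines virustotal runs, not the protection a desktop product delivers.)

import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder - virustotal issues these per account
REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

def detection_ratio(file_hash):
    """fetch the scan report for one sample; return (positives, total) or None."""
    data = urllib.parse.urlencode({"apikey": API_KEY, "resource": file_hash}).encode()
    with urllib.request.urlopen(REPORT_URL, data) as response:
        report = json.load(response)
    if report.get("response_code") != 1:
        return None  # virustotal has no report for this hash
    return report["positives"], report["total"]

# a "study" like imperva's is, at bottom, just this loop over found samples
for sample in ["<sha256 of sample 1>", "<sha256 of sample 2>"]:
    print(sample, detection_ratio(sample))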

these days i have a number of alternative avenues to express myself that i didn't have when this blog was still young, and that can often sate my need to express my feelings on some topic. i can make snide comments on twitter, or even parody tweets from a satirical twitter account. in fact i can even make memes about it. unfortunately none of that has proven sufficient in this case because the hits just keep coming.

you see, imperva keeps shopping this quackery out to more and more media outlets where it gets gobbled up and regurgitated uncritically by writers/editors (who really ought to know better if reporting on this sort of topic is part of their actual job) and thus gets put in front of more and more eyeballs of those who realistically can't know better. along the way it can even collect somewhat supporting voices from venerated members of the security community like robert david graham or wim remes.

let me be clear, however - this is all wrong. as has been repeated over and over again, virustotal is for testing samples, not anti-malware products. they say so themselves on their about page:
The reason is that using VirusTotal for antivirus testing is a bad idea.
and
BAD IDEA: VirusTotal for antivirus/URL scanner testing
those statements alone should be enough but, because virustotal later talks specifically about comparative tests, imperva (and others) have tried to argue that imperva's test is OK because it doesn't compare products to each other. however...
VirusTotal's antivirus engines are commandline versions, so depending on the product, they will not behave exactly the same as the desktop versions: for instance, desktop solutions may use techniques based on behavioural analysis and count with personal firewalls that may decrease entry points and mitigate propagation, etc.
this makes it pretty clear that the product a customer installs is very much a different thing from the program that virustotal uses - they will in most cases behave very differently and so the results that virustotal spits out cannot be considered representative of what actual users of anti-malware products will experience.

(ironically, a product that appears to fare best in a virustotal-based test may actually be the worst in practice, because a heavy focus on the kind of static (often signature-based) detection that virustotal best measures can serve to cover for a weakness in (or absence of) more generic/dynamic detection capabilities.)
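(a toy illustration of that gap, with an entirely made-up "signature" and "payload" - nothing here resembles a real engine: the kind of static pattern match that virustotal's commandline scanners excel at is defeated by a trivial one-byte xor re-encoding, even though the decoded payload, and therefore the runtime behaviour a desktop product could watch for, is unchanged.)

SIGNATURE = b"EVIL_PAYLOAD"  # hypothetical static signature, not a real one

def static_scan(sample: bytes) -> bool:
    """flag a sample if the known byte pattern appears anywhere in it."""
    return SIGNATURE in sample

original = b"header EVIL_PAYLOAD footer"
packed = bytes(b ^ 0x5A for b in original)  # trivial one-byte xor "packer"
unpacked = bytes(b ^ 0x5A for b in packed)  # what actually runs at runtime

print(static_scan(original))  # True  - the static scanner catches it
print(static_scan(packed))    # False - same behaviour, no static match
print(static_scan(unpacked))  # True  - the pattern reappears once decoded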

but don't just take my word for it, let's hear from a couple of people who actually work at virustotal:
yes, that's right, imperva's study is a joke. this shouldn't be surprising to long-time readers of this blog, since when i first wrote about this problem four years ago the very first reason i gave for avoiding virustotal-based tests was that those of us who know better will laugh at you. i'm sure a number of people are laughing at imperva's gross incompetence (hanlon's razor makes me choose this explanation over the more sinister alternatives) but i'm afraid i can't consider the mess they're making to be a laughing matter.

promulgating ignorance in a security context has the potential to do real harm, and that is where i draw the line. that's why i'm writing this, that's why the title gets straight to the point, and that's why i'm going to start naming some names of people/organizations who have helped make this mess and who really ought to have known better. imperva has behaved like a dung beetle, persistently rolling this turd around, but somehow it keeps getting bigger like some katamari damacy of bullshit, and i think it's important to see the scale and scope of it and hold the people responsible accountable. it's worth noting, however, that somewhere deep down someone at imperva must also have seen the potential for their message to do harm - that's why the caveat that they weren't advising eliminating AV was added (as an apparent afterthought).

a non-exhaustive list of people/orgs who really should have known better, tried harder, and ought to be held to account for this growing mess is as follows:
(i'm aware there's a lot more than this that you can find simply by googling sentences from the press release - i wish i had the time to make this list exhaustive. that said: reuters, the new york times, and the wall street journal... that definitely caught a lot of eyeballs)

now, perhaps you're thinking i'm being too hard on the journalists involved here. after all, they aren't experts. frankly, however, they don't have to be experts to see what's wrong with this test. if you're the type of reporter who reports on this type of technology then you should already know about virustotal and about how it can and can't be used. this isn't rocket science, or even some obscure nuance that only matters every 5th wednesday - not in the context of reporting on this subject. this is something reporters covering security technology ought to know. it's table stakes. you need to be this tall to get on the ride.

perhaps you think i'm being too hard on the students and their supervisor(s)? but this is academia we're talking about. they're expected to do their research, and i don't just mean the experimental research - i mean looking up and reading about the issues involved in designing and performing tests on anti-malware products. their supervisor(s) should have made sure they were doing their due diligence in this regard. frankly, in my time i've seen lone rank amateurs perform better tests than this with fewer resources. this is not acceptable academic performance.

and as for imperva themselves, well... if you intend to occupy a part of the security industry that hopes to steal some of the AV industry's market, then you'd better know this stuff like the back of your hand. the institutional incompetence going all the way up the chain of command to the chief technology officer is astonishing, and i'm surprised they managed to find someone with too many dollars and too little sense to give them funding - but i guess p.t. barnum was right about there being one born every minute.

imperva - do yourselves a favour and put a stop to this mess before it gets any bigger. you can't defend this junk computer science - the truth will eventually come out (it seems to have already started) - and you can't sweep it under the rug either, because you've let things get too out of hand. the kind of smear campaign you're currently running was already attempted by the whitelisting industry years before you, and while that industry is still around and may even still be pumping out this same kind of junk, the tactic didn't stop them from drifting back into obscurity. the way i see it, the only way you can move forward sustainably is to:
  1. admit your error
  2. publicly retract your study
  3. reach out to the journalists whose reputations have been tarnished by listening to you and apologize
  4. assist the students you dragged into this in learning the error of the experimental methodology they followed (you can probably find a lot of good info either on or linked to from the anti-malware testing blog)
  5. start over with a more intelligent methodology and try to make your case again with valid data
and if you can't manage to follow these steps then i'll be glad to watch you fade away or get swallowed up in a few years' time, because the kind of incompetence you've been proudly displaying so far is not the path to success.