Focus on Virus
Consumer Reports AV and their 5,500 new variants Aug 22 2006 03:48AM
Bill Stout (bill stout greenborder com) (1 replies)
RE: Consumer Reports AV and their 5,500 new variants Aug 22 2006 07:00PM
Bill Stout (bill stout greenborder com) (2 replies)
Re: Consumer Reports AV and their 5,500 new variants Sep 05 2006 05:24AM
Kurt Seifried (bt seifried org) (1 replies)
RE: Consumer Reports AV and their 5,500 new variants Sep 06 2006 08:01AM
Bill Stout (bill stout greenborder com) (1 replies)
RE: Consumer Reports AV and their 5,500 new variants Sep 06 2006 02:20PM
Paul Schmehl (pauls utdallas edu) (1 replies)
RE: Consumer Reports AV and their 5,500 new variants Sep 06 2006 03:31PM
Roger A. Grimes (roger banneretcs com) (3 replies)
RE: Consumer Reports AV and their 5,500 new variants Sep 18 2006 04:50PM
Bill Stout (bill stout greenborder com)
RE: Consumer Reports AV and their 5,500 new variants Sep 06 2006 11:48PM
Nick FitzGerald (nick virus-l demon co uk)
Roger A. Grimes wrote:

> I've been doing AV for 20 years now, and supported this basic safety tenet,
> but the Consumer Reports' lab testing incident doesn't bother me.

So you're not bothered that what a CR test claims to measure and what
it actually measures are not just different things, but immeasurably
different things?

I thought (though not living in the US, this is mediated by many
potentially biased and diverging influences) that CR prided itself on
doing and publishing meaningful, realistic, repeatable and "provable"
tests. I thought that if CR says that Car A has unacceptable stopping
distances compared to Car B and Car C, then that actually meant
something real about Car A. Sadly (for CR and its readers) that cannot
be said of any _meaningful_ measure of detectability of future malware
from the results of the test under discussion here.

The "new malware detectability" component of this test fails several of
the most fundamental criteria of CR testing, as I understand those
criteria. The "new malware detectability" component of this test is
badly designed and probably was badly performed. The lack of
meaningful details as to how this part of the test was performed alone,
and especially in light of the subsequent, sustained expert criticism
of the test, raises significant concerns about the design of these
tests and the suitability of the tester(s) running them to conceive,
design and perform such tests.

Using the car braking analogy again, as far as we can tell, this test
was analogous to different cars being tested on different road
surfaces, under different wet/dry conditions, with varying tyre
compounds, and varying inflation pressures AND with none of those
variables measured, recorded, reported or even hinted at as possibly
affecting the results.

> It had a good AV expert behind the work, ...

Sorry Roger, but there I have to disagree. I have been affiliated with
or "in" the AV busines for a similar time to you and the only folk I
can ever recall claiming any "AV expertise" for the testers are, in
fact, the testers themselves. Teaching or passing a few security
classes that cover viruses and malware as a small part of the total
curriculum does not an expert AV product tester make. Further, as a
one-time expert AV product tester by employment and still closely
connected with the very small group who make up that "profession", I
can honestly say that these testers had no accepted professional
standing as AV product testers before the CR test was published, and as
a result of this test they are, by my reading, now considered
amateurish, at best, within the very small circle of professional AV
product testers.

> ...tested logical goals that can only
> be tested by creating new malware programs, ...

Obviously the concept of retrospective testing, where the tester
freezes the product to be tested, collects newly released/discovered
malware for several months, and then tests the "old" products against
increasingly newer malware (say, in weekly or monthly cohorts), escapes
you, as it escaped the CR testers?

As that is clearly another logically correct way of testing the
detection of unknown malware, your and the CR testers' views of such
things are more limited than those of "more expert" testers and
commentators.

Such testing has the rather undesirable (from some testers'
perspectives) property of not producing results quickly.

However, it has the rather desirable property that its results are
reproducible and reflect the ability of the tested products to detect
the actual, real new malware that was produced and released after the
products under test were frozen. Repeated often enough, or run on an
ongoing basis, such testing also overcomes another disadvantage it
shares with the CR testers' approach: that the result is only a
one-time snapshot of such capabilities.
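
Lest there be any doubt about what that involves in practice, here is a
minimal sketch, in Python, of the cohort bookkeeping a retrospective
test implies. To be clear, this is purely illustrative and is nobody's
actual test harness; the sample names, dates and the "detects" stand-in
(which a real harness would replace with an invocation of the frozen
on-demand scanner) are all invented:

from collections import defaultdict
from dataclasses import dataclass
from datetime import date
from typing import Callable

@dataclass
class Sample:
    name: str         # identifier for the malware sample
    discovered: date  # date the sample was first seen in the wild

def retrospective_rates(samples: list[Sample], cutoff: date,
                        detects: Callable[[Sample], bool]) -> dict[str, float]:
    """Detection rate per monthly cohort, counting only samples that
    appeared after the product under test was frozen at `cutoff`."""
    cohorts = defaultdict(list)
    for s in samples:
        if s.discovered <= cutoff:
            continue  # pre-cutoff samples tell us nothing about *unknown* malware
        cohorts[s.discovered.strftime("%Y-%m")].append(detects(s))
    return {month: sum(hits) / len(hits)
            for month, hits in sorted(cohorts.items())}

# Toy run: a "frozen" product that only knows two pre-cutoff signatures.
known_at_cutoff = {"worm-a", "trojan-b"}
print(retrospective_rates(
    samples=[Sample("worm-a",  date(2006, 9, 3)),   # old family, caught
             Sample("worm-a2", date(2006, 9, 20)),  # new variant, missed
             Sample("bot-c",   date(2006, 10, 7))], # wholly new, missed
    cutoff=date(2006, 8, 31),
    detects=lambda s: s.name in known_at_cutoff,
))  # -> {'2006-09': 0.5, '2006-10': 0.0}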

> ... and was kept controlled. ...

As far as we know this has not been a problem so far.

> ... If it
> wasn't done by a professional and if great care wasn't taken to make sure
> they didn't leak, I'd be bothered. But let's be honest, at this point, the
> malware problem is so bad, the AV vendors are so bad at detecting them, and
> so many variants are being created each day, that the original problem of
> something new leaking out, just isn't the priority it used to be.

BUT that doesn't excuse sloppy testers for accidentally releasing
something they have unethically created. And, aside from showing that
they are not professional (because of their inability to contain their
test samples, regardless of those samples' real-world status), it would
also put them in breach of the "data protection" laws of most
jurisdictions that have such laws _if_ the escaped malware was
something of their own creation. So, despite the extent of the problem,
I don't see that there is any justification for such a nonchalant
attitude to such releases.

Anyway, there is no evidence, nor even any actual suggestion, that the
CR testers did make such a release, and in general I think this aspect
of the criticism of the CR tests has been somewhat over-emphasized.

> If I worked for an AV vendor, I'd stop my complaining and get to work on a
> better product. The state of AV protection is as bad as it has ever been.
> I've been reading about the "death of antivirus scanners" for 20 years now,
> but for the first time I think their time is nearing the end, and I say so
> in my Friday column in InfoWorld.

Sadly, the practices of computer users, combined with a bizarre notion
that every person and their dog "needs" what is effectively "admin
level" access to a general purpose computer, are dead-set against
anything much better ever "working" in the sense of "achieving
acceptable market penetration". I think that may eventually change in
the corporate sphere, when stupid/lazy admins (a fair whack of them)
come to realize what they should really be doing to earn their pay
cheques (to their surprise, it has little to do with knowing what MS
shoves into its "certification" tests), and/or when the current climate
of corporate mismanagement of IT changes, with much of the common
stupid corporate politics removed and the staff who know what is better
actually allowed to get on and do it, rather than being dictated to by
those who can't tell their arses from their elbows.

But that has nothing to do with the flaky part of the CR tests under
discussion here, which does a great disservice to CR's reputation as a
quality testing organization.

Disclosure: Yes, I am currently under contract to an antivirus product
developer (CA). No, my remuneration is not tied in any way to their AV
products' successes or failures in the market. No, no-one from that
(or any other AV) company has suggested I write or say anything about
the CR tests and the time spent on this will not be billed to them.
Yes, I am a previous editor and product tester, and (titularly) a
Contributing Editor for Virus Bulletin magazine which no longer pays me
nor suggested blah, blah, blah. Yes, I have a very long-running
interest in improving the excellence of AV product testing wherever it
is found -- an endeavour that has had many more failures than
successes, it seems.

Regards,

Nick FitzGerald


RE: Consumer Reports AV and their 5,500 new variants Sep 06 2006 08:23PM
Bill Stout (bill stout greenborder com)
Symantec AV Strategy Aug 24 2006 06:14AM
Serge Vondandamo (serge vondandamo wanadoo fr) (2 replies)
Re: Symantec AV Strategy Aug 24 2006 08:22PM
Edgar B. Tijerino (ebt2001 med cornell edu)
RE: Symantec AV Strategy Aug 24 2006 02:22PM
Robert D. Holtz - Lists (robert d holtz gmail com)


 
