Assessing the Samples against Antivirus Products:
The purpose of this work was to evaluate AV software's
ability to detect previously uncataloged malware samples. Hence, we could not
rely on any of the existing malware databases, and we therefore resorted to other means
of virus hunting over the Web. We employed various methods for collecting
malware samples, as described below, and executed each sample in a controlled
environment to make sure that it displayed behavior indicative of malware. Using
these methods, we were able to collect 82 samples.
Now that we had 82 malware samples, we needed an
infrastructure that would allow us to evaluate them with as many AV products as
possible, repeatedly over time.
VirusTotal is a website that provides a free online
service for analyzing files and URLs, enabling the identification of viruses,
worms, Trojans, and other kinds of malicious content detected by antivirus
engines and website scanners. At the time of our work, each sample was tested
by 40 different products. A detailed report is produced for each analysis,
indicating, for each AV product, whether the sample was identified as malware
and, if so, which malware was detected. The following figures show sample
screenshots of the manual evaluation method (in which a user uploads the malware
sample through a browser and reviews the results in HTML form).
On top of the manual submission interface, VirusTotal
also provides an API that can be used to automate the submission and result
analysis process. The API is HTTP based, using simple POST requests and JSON
replies. We used a set of homegrown Python scripts to schedule an automated
scan of all the samples in our data set on a weekly basis. Results were stored
in a relational database for further analysis. We ran the test for six weeks
and collected a total of 13,000 entries in our database, where each entry
represents the result of one scan of one sample file by one
product.
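To illustrate the kind of automation involved, here is a minimal Python sketch written against VirusTotal's public API v2. The endpoint names and response fields shown are today's v2 interface and may differ from the API available at the time of the study; the API key placeholder and the scan_results table layout are illustrative choices, not our exact setup.

import sqlite3
import requests

API_KEY = "YOUR_VT_API_KEY"  # placeholder; a real key comes with a VirusTotal account
SCAN_URL = "https://www.virustotal.com/vtapi/v2/file/scan"
REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

def open_db(path):
    # Illustrative schema: one row per (sample, product, week) verdict.
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS scan_results ("
               "sample_id TEXT, product TEXT, week INTEGER, "
               "detected INTEGER, label TEXT)")
    return db

def submit_sample(path):
    # Upload one sample; VirusTotal returns a 'resource' id that is
    # used to fetch the analysis report later.
    with open(path, "rb") as f:
        resp = requests.post(SCAN_URL, params={"apikey": API_KEY},
                             files={"file": (path, f)})
    resp.raise_for_status()
    return resp.json()["resource"]

def store_report(db, sample_id, week, resource):
    # Fetch the per-engine verdicts and flatten them into table rows.
    resp = requests.get(REPORT_URL,
                        params={"apikey": API_KEY, "resource": resource})
    resp.raise_for_status()
    for product, verdict in resp.json().get("scans", {}).items():
        db.execute("INSERT INTO scan_results VALUES (?, ?, ?, ?, ?)",
                   (sample_id, product, week,
                    int(verdict["detected"]), verdict["result"]))
    db.commit()

A weekly scheduled job would call submit_sample for each file and store_report once the analysis completes; the public API throttles requests, so submissions have to be paced accordingly.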
Analyzing the Results
Basic Statistics
In our analysis, we looked at two kinds of
measurements: static and dynamic. The static measurements look at AV coverage
regardless of time. The dynamic measurements look at the change in AV
coverage over time.
The first measurement we took is coverage by the most
popular AV products (see above). For this static measurement, we picked both
commercial and free AV products and looked only at those samples that, by
the end of the testing period, were identified by at least 50% of the tested products
(we used this criterion to reduce noise and potential dispute claims). The
results are displayed in Table 1, where the blue area corresponds to the portion of the
samples that were detected.
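A sketch of how this static measurement can be computed from the results database (using the illustrative scan_results schema from the earlier sketch):

import sqlite3
from collections import defaultdict

def static_coverage(db_path, threshold=0.5):
    db = sqlite3.connect(db_path)
    last_week, = db.execute("SELECT MAX(week) FROM scan_results").fetchone()
    # detected[sample] -> set of products that flagged it in the final week
    detected = defaultdict(set)
    products = set()
    for sample, product, hit in db.execute(
            "SELECT sample_id, product, detected FROM scan_results "
            "WHERE week = ?", (last_week,)):
        products.add(product)
        if hit:
            detected[sample].add(product)
    # Samples flagged by at least `threshold` of all products by the end of the test.
    consensus = [s for s, ps in detected.items()
                 if len(ps) / len(products) >= threshold]
    # Each product's coverage over those consensus samples.
    return {p: sum(p in detected[s] for s in consensus) / len(consensus)
            for p in products}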
Viruses Identified vs. Not Detected, by Antivirus Vendor:
1. ESET-NOD32
2. Avast
3. McAfee
4. Kaspersky
5. TrendMicro
6. Symantec
7. Antiy-AVL
8. Clam-AV, etc.
Our first dynamic measurement compares each AV
product's detection rate at the beginning of the test (first run,
colored in blue) with its detection rate at the end of the test (last run,
colored in red). It indicates how well AV products incorporate detection of new
samples over time. The diagram below includes only those products for which an
improvement was shown.
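A sketch of the first-run vs. last-run comparison, again over the illustrative scan_results table:

import sqlite3

def first_vs_last(db_path):
    db = sqlite3.connect(db_path)
    first, last = db.execute(
        "SELECT MIN(week), MAX(week) FROM scan_results").fetchone()
    rates = {}
    for week in (first, last):
        for product, rate in db.execute(
                "SELECT product, AVG(detected) FROM scan_results "
                "WHERE week = ? GROUP BY product", (week,)):
            rates.setdefault(product, []).append(rate)
    # Keep only products scanned in both runs that improved between them.
    improved = {}
    for product, pair in rates.items():
        if len(pair) == 2 and pair[1] > pair[0]:
            improved[product] = tuple(pair)
    return improved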
Now we get to the very interesting question of how
long it takes for an AV product to incorporate detection of a previously
undetected sample. The following chart shows the average time, by vendor,
to detect those samples that were not recognized as malware in the
first run. For each vendor, we took the average over the files missed by that vendor
alone. We chose to show the improvement rate only for the most prominent
products out there: the AV product with the biggest market share (Avast) and four
commercial products from the largest security/AV vendors. The data in
this chart gives us an idea of the size of the "window of opportunity" for
an attacker to take advantage of a new malware sample. Note that none of the
malware samples we used was identified by ANY of the products as an entirely
new type of malware; rather, they were all recompilations of existing malware
families. As one can see, the typical window of opportunity for the listed AV
products is as long as four weeks!
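A sketch of the "window of opportunity" computation: for each vendor, the average number of weeks it took to start detecting the samples it missed in the first run (same illustrative scan_results table):

import sqlite3

def average_detection_delay(db_path):
    db = sqlite3.connect(db_path)
    first, = db.execute("SELECT MIN(week) FROM scan_results").fetchone()
    products = [p for p, in db.execute(
        "SELECT DISTINCT product FROM scan_results")]
    delays = {}
    for product in products:
        # Samples this product missed in the first run...
        missed = [s for s, in db.execute(
            "SELECT sample_id FROM scan_results "
            "WHERE product = ? AND week = ? AND detected = 0",
            (product, first)).fetchall()]
        weeks = []
        for sample in missed:
            # ...and the first later week in which it caught them.
            row = db.execute(
                "SELECT MIN(week) FROM scan_results "
                "WHERE product = ? AND sample_id = ? AND detected = 1",
                (product, sample)).fetchone()
            if row[0] is not None:
                weeks.append(row[0] - first)
        if weeks:
            delays[product] = sum(weeks) / len(weeks)
    return delays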
When we checked the dynamics of the least detected samples,
we came up with even worse results. We checked how many weeks were required for
samples that were detected by fewer than 25% of the products in their initial
scan to reach a detection rate greater than 50%. By analyzing our results
database, we discovered that 12 files had a detection rate of less than 25%
when they were first scanned, yet not a single one of them was detected by
50% of the products in the following scans.
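A sketch of the query behind this observation: find samples whose first-scan detection rate was under 25% and check whether any later scan pushed them past 50% (same illustrative scan_results table):

import sqlite3

def stubborn_samples(db_path, low=0.25, high=0.5):
    db = sqlite3.connect(db_path)
    first, = db.execute("SELECT MIN(week) FROM scan_results").fetchone()
    # Samples under `low` detection rate in the first scan.
    weakly_detected = [s for s, rate in db.execute(
        "SELECT sample_id, AVG(detected) FROM scan_results "
        "WHERE week = ? GROUP BY sample_id", (first,)) if rate < low]
    never_crossed = []
    for sample in weakly_detected:
        # Best weekly detection rate the sample ever reached afterwards.
        best, = db.execute(
            "SELECT MAX(rate) FROM (SELECT AVG(detected) AS rate "
            "FROM scan_results WHERE sample_id = ? AND week > ? "
            "GROUP BY week)", (sample, first)).fetchone()
        if best is None or best < high:
            never_crossed.append(sample)
    return weakly_detected, never_crossed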
Another phenomenon that we discovered after analyzing the
results, which were obtained over a period of a few weeks after
scanning was finished, was that not only did detection change, but the
classification made by antivirus products changed as well. For example, we encountered
cases in which, over a period of three weeks, an antivirus product
classified a file as "Unclassified Malware," and only in the fourth week did it
finally classify it as a specific type of malware (a Trojan horse). We
additionally encountered cases in which the antivirus completely changed its classification
of a certain file. For example, one week the antivirus product
ByteHero identified a file as Trojan malware, and another week as Virus.Win32.
Consequently, we can conclude that antivirus products are occasionally
inconsistent in the results they provide.
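This kind of classification churn is easy to surface from the results database. A sketch that lists every (sample, product) pair whose malware label changed from one scan to the next (same illustrative scan_results table):

import sqlite3

def label_changes(db_path):
    db = sqlite3.connect(db_path)
    changes = []
    history = {}  # (sample, product) -> most recent label
    for sample, product, week, label in db.execute(
            "SELECT sample_id, product, week, label FROM scan_results "
            "WHERE detected = 1 ORDER BY sample_id, product, week"):
        key = (sample, product)
        if key in history and history[key] != label:
            changes.append((sample, product, week, history[key], label))
        history[key] = label
    return changes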
In our analysis, we tried to come up with an effective
combination of AV products that would yield the best protection against our
data set. For the sake of this test, we considered only those files that
were detected by more than 25% of the AV products. None of the individual AV
products was able to provide full coverage of this set of samples. To our
surprise, the combination of antivirus products with the best detection rate
included two freeware antivirus products, Avast and Emsisoft. Another interesting
point is that, while the most well-known AV products provided the best
standalone coverage, their coverage could not be effectively improved by adding any other
single product.
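One way to search for such a combination is greedy set cover over the samples detected by more than 25% of products. The greedy strategy is an illustrative choice here, not necessarily the exact search we ran; the sketch again assumes the illustrative scan_results table:

import sqlite3
from collections import defaultdict

def best_combination(db_path, min_rate=0.25, max_products=3):
    db = sqlite3.connect(db_path)
    last, = db.execute("SELECT MAX(week) FROM scan_results").fetchone()
    detected = defaultdict(set)   # product -> samples it detects
    verdicts = defaultdict(list)  # sample -> all products' 0/1 verdicts
    for sample, product, hit in db.execute(
            "SELECT sample_id, product, detected FROM scan_results "
            "WHERE week = ?", (last,)):
        verdicts[sample].append(hit)
        if hit:
            detected[product].add(sample)
    # Consider only samples flagged by more than `min_rate` of products.
    target = {s for s, hits in verdicts.items()
              if sum(hits) / len(hits) > min_rate}
    chosen, covered = [], set()
    while len(chosen) < max_products and covered != target:
        # Pick the product that adds the most still-uncovered samples.
        best = max(detected, key=lambda p: len((detected[p] & target) - covered))
        gain = (detected[best] & target) - covered
        if not gain:
            break
        chosen.append(best)
        covered |= gain
    return chosen, len(covered) / len(target)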