AV Test Report (analyst report)

TEST created binaries for five possible attack scenarios. These were then modified in two different ways, resulting in 15 test cases (the five originals plus two modified variants of each).

1. Installation: The product is installed and updated to the latest program version and signatures.
2. The test was performed in three steps, the first two without internet access and the final one with internet access:
   a. Static Detection (Offline): First perform an on-access test by copying the files to the hard disk, then perform an on-demand scan of the remaining files.
   b. Dynamic Detection (Offline): Then execute all remaining samples.
   c. Dynamic Detection (Online): Then connect the system to the internet and execute all remaining samples.
3. Record efficacy results.

Sample Selection

Five different tools were created with the behaviors listed below. All of the tools were written in C/C++ and compiled with Microsoft Visual C++.

Table 1: Simulated attacks
- Attack 1: Persistence via registry (HKCU); payload: credential harvesting and upload to an external server.
- Attack 2: Persistence via Startup folder; payload: credential harvesting and upload to an external server.
- Attack 3: No persistence; payload: download and execute a further executable.
- Attack 4: Payload: modify HDD sectors.
- Attack 5: Payload: encrypt files.

The five resulting test cases were then modified with two different approaches. The first approach was to add a section to the PE file. The second approach was to append data to the end of the PE file. This resulted in 15 different files with different hashes.

Test Case 3: Malware distributed by Websites

Many infections these days are spread via websites. Most protection products use a layered approach to block these, trying to cover both the URL and the malicious content. Different products have different priorities here, and some may focus more on URL blocking than others. However, it is rather easy for attackers to change the URL to defeat this kind of detection.
Therefore, this test is designed to determine the detection rate for the malicious PE file delivered by a website when no URL detection is available.

1. Installation: The product is installed and updated to the latest program version and signatures.
2. Turn off the URL filtering of the product, if applicable.
3. From the victim machine, connect to the website, then download and execute the file.
4. Record efficacy results.

Sample Selection

For this test, 69 websites that are spreading malware were randomly selected.

Test Case 4: False Positives

It would be easy to create a protection product that scores 100% in all protection tests but at the same time creates false positives on all benign files. Therefore, we tested how the products react to common and less common software when downloading it from its original source, installing it, and using it on the computer. Whenever the security product detected something and warned about or blocked an action, this was noted in the results.

Sample Selection

In total, 38 different common and less common applications were used for the testing. The full list is given in the appendix at the end of this document.

General notes regarding the testing methodology

There is one issue we would like to briefly discuss regarding attack simulation tools. When testing protection against new or targeted attacks, it is always necessary to create one's own test cases. This can be done by modifying existing malware or by creating one's own attack simulation tools. Both approaches are discussed controversially, and not only in the anti-malware industry. We opted to create our own attack simulation tools in order to have full control over them and to be able to make sure that no harm could be done by them. The Anti-Malware Testing Standards Organization (AMTSO) has created several documents on best practices for anti-malware testing.
One of those documents is named "Issues Involved in the 'Creation' of Samples for Testing"[1]. This document discusses the arguments for and against modifying malware or creating files such as the attack simulators that we used. The document neither forbids nor explicitly endorses the creation of new samples for testing; it depends on what exactly is to be achieved with the test. After carefully consulting the document, we are convinced that our approach is in line with the presented arguments.

[1] http://www.amtso.org/download/amtso-issues-involved-in-the-creation-of-samples-for-testing/

Copyright © 2017 by AV-TEST GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web https://www.av-test.org
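The second modification approach described above (appending data to the end of the PE file) can be sketched as follows. This is an illustrative Python sketch, not the tooling used in the test, and the byte strings are placeholder stand-ins for a compiled binary; it only demonstrates why such a trivially modified file carries a different hash while its execution behavior stays intact.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for one of the compiled test binaries (illustrative bytes only,
# not a real PE file): the "MZ" header magic followed by dummy content.
original = b"MZ" + b"\x00" * 62 + b"dummy program body"

# Second modification approach from the report: append data to the end of
# the file. Appended overlay bytes are not mapped by the PE loader, so the
# program's behavior is unchanged, but every file hash differs.
modified = original + b"arbitrary appended bytes"

assert sha256_hex(original) != sha256_hex(modified)
```

This is why signature schemes keyed on whole-file hashes miss such variants: one appended byte yields a new hash, which is exactly what the 10 modified test cases probe.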