
Software Testing Methodology:

Anti-spyware and Anti-Virus

Anti-spyware Testing Methodology

A clear and concise method for comparative testing of anti-spyware software

Introduction

When comparing the effectiveness of anti-spyware products, the analysis must include the following:

• The ability to accurately detect and remove existing spyware (i.e. True Positives)

• The failure to detect and remove existing spyware (i.e. False Negatives)

• The mis-identification of non-spyware elements as spyware (i.e. False Positives)

Any analysis of anti-spyware products must include all three of the above items. The measurement of the third item, and its comparison to the other two items, can reveal the true effectiveness and safety of any anti-spyware product. For example, a very dangerous behavior for any anti-spyware product would be to identify and remove a component of Microsoft Word as a piece of spyware. Even more dangerous would be for an anti-spyware product to flag a key component of the operating system as a piece of spyware.

Anti-spyware product analysis employs a concise scientific methodology. This methodology starts with a test system in a known consistent state, installs sample spyware, and then runs the subject anti-spyware product. The system state is captured at various points within the testing process. Analysis consists of comparing the captured system states at the end of the test.

Comparing the system states will reveal the accuracy of the subject anti-spyware product in identifying and removing spyware, as well as in avoiding the identification of non-spyware elements as spyware. Furthermore, when comparing two anti-spyware products side by side, the test system must be restored to the known starting state before testing each product.


Testing Methodology

This testing methodology starts with a clean install of Microsoft Windows XP SP2. No other software products are installed on the system. The methodology employs a disk imaging system such as Acronis [1] to enable restoration of the test system to its known state. It is important that Operating System Virtualization software (e.g. VMware) not be used, as it may corrupt the normal operation of spyware and anti-spyware products. Other tools employed include an Installation Analysis tool such as InstallWatch [2] to capture the state of the test system as the analysis progresses.
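InstallWatch automates the capture of file-system and registry changes. To make the idea concrete, the following is a minimal Python sketch of how the file-system portion of a state capture could be taken by hashing every file under a directory tree. It is a hypothetical stand-in for illustration only (registry changes, which InstallWatch also records, are omitted) and is not part of the methodology's required tooling.

```python
import hashlib
import json
import os

def take_snapshot(root, out_path):
    """Walk `root`, hash every readable file, and save the result as JSON.

    The snapshot maps each file path to its SHA-256 digest so that two
    captures can later be diffed to see what was added, removed, or
    modified between them.
    """
    snapshot = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    snapshot[path] = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                # Files that are locked or unreadable (e.g. in use by the
                # operating system) are recorded but not hashed.
                snapshot[path] = "<unreadable>"
    with open(out_path, "w") as f:
        json.dump(snapshot, f, indent=2)
    return snapshot
```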

Figure 1. Overall Flow of the Testing Process

Prepare the Clean State Testing System

1. Install Microsoft Windows XP and SP2 (Service Pack 2).

2. Install a System Imaging Product (such as Acronis).

3. Install InstallWatch, but do not perform a scan.

4. Create a complete image of the test system. This image is the Starting Testing System Image.

[1] Acronis True Image. See http://www.acronis.com

[2] InstallWatch is a freeware tool that captures the state of a system. See http://www.epsilonsquared.com/installwatch.htm


Capture the Starting State Image

1. Install the anti-spyware product under test and run a complete initial scan of the system. Ensure that the scan does not detect any spyware (since the system is in a known clean state, detection of spyware at this point would be considered False Positive spyware detection).

2. Run InstallWatch to capture the state of the system. This capture is the Starting State Capture and will be used as a baseline to compare against the Infected State Capture and the Ending State Capture.
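Expressed in terms of the hypothetical take_snapshot() helper sketched earlier (standing in for an InstallWatch capture), the Starting State Capture is simply the baseline snapshot saved to disk. The module name, drive root and output file name below are illustrative assumptions.

```python
# Hypothetical module containing take_snapshot() from the earlier sketch.
from snapshot_tool import take_snapshot

# Baseline taken immediately after installing the anti-spyware product and
# confirming a clean initial scan. Scanning an entire system drive is slow;
# the drive root and output file name are illustrative.
starting_state = take_snapshot("C:\\", "starting_state.json")
```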

Infect the Test System

1. Install one or more spyware examples. When performing side by side anti-spyware comparisons, this set of example spyware must remain consistent for all products within the comparison.

2. Run InstallWatch to capture the current state of the system and compare it to the Starting State Image. This capture is the Infected State Capture. System changes shown in this capture are the direct result of installing the example spyware.
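As a rough illustration of what the Infected State comparison yields, the sketch below diffs two saved captures in the hypothetical JSON snapshot format used earlier (not InstallWatch's own report format). Every path it reports should be attributable to the sample spyware.

```python
import json

def load_snapshot(path):
    """Load a capture previously saved by take_snapshot()."""
    with open(path) as f:
        return json.load(f)

def compare_snapshots(before, after):
    """Return the paths added, removed, and modified between two captures."""
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    modified = sorted(p for p in set(before) & set(after) if before[p] != after[p])
    return {"added": added, "removed": removed, "modified": modified}

# Infected State delta: changes made by installing the sample spyware.
infected_delta = compare_snapshots(
    load_snapshot("starting_state.json"),
    load_snapshot("infected_state.json"),
)
```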

Capture the Ending State Image

1. Perform a complete system scan using the subject anti-spyware product. Follow through and remove all detected and flagged spyware elements (this includes known False Positives).

2. Some spyware programs can be completely removed by rebooting the machine and running a scan in safe mode. Some anti-spyware products ship with a safe mode client that is optimized for a 640x480 resolution setting.

3. Run InstallWatch to capture the state of the system after running the subject anti-spyware product and compare the state to the Starting State Capture. This is the Ending State Capture and will list all changes to the system as the result of running the subject anti-spyware product.

Test Results Analysis

Analysis of the results is a simple matter of comparing the captured states of the system. Differences between the Starting State Capture and the Infected State Capture indicate changes to the system as the direct result of installing the example spyware. Differences between the Starting State Capture and the Ending State Capture indicate some type of failure in the subject anti-spyware product. These failures may be failures to detect spyware, or False Positives.


Figure 2. Overall Test Result Analysis

Comparing Starting and Infected State Captures

As stated above, the differences (or delta) between the Starting State Capture and the Infected State Capture are the direct result of installing the sample spyware. No difference between these two captures indicates a testing error (the sample spyware did not install correctly). The delta between these two captures is important because it calls out the actual changes the spyware made to the system.


Figure 3. Comparison of Starting State to Infected State Captures

Comparing Starting and Ending State Images

The differences between the Starting State Capture and the Ending State Capture show the true effectiveness of any anti-spyware product.

Figure 4. Comparing Starting State to Ending State Captures

When comparing the captures, there are three main possible outcomes:

1. No differences – This is the result of a very effective anti-spyware product. The product accurately detected and removed all spyware elements.


Figure 5. Starting State and Ending State Captures are the Same

2. Ending State contains more elements than the Starting State – This is the result of an anti-spyware product that was not able to detect and remove all elements of the sample spyware.

Figure 6. Ending State Capture Containing More Elements than Starting State Capture


3. Starting State contains more elements than the Ending State – This is the result of an anti-spyware product that has detected and removed too many elements. Some or all of the missing elements are non-spyware components. These are False Positive failures. This is a very dangerous situation, as the anti-spyware product may remove user data or key components of the operating system, rendering it unusable.

Figure 7. Starting State Capture Containing More Elements than Ending State Capture
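The three outcomes above can be expressed directly in terms of the capture delta. The sketch below reuses the hypothetical compare_snapshots() helper to classify an Ending State Capture against the Starting State Capture; it illustrates the decision logic only, and the labels are paraphrased from the outcomes described above.

```python
def classify_outcome(starting_state, ending_state):
    """Map the Starting-vs-Ending delta onto the three outcomes above."""
    delta = compare_snapshots(starting_state, ending_state)
    if not (delta["added"] or delta["removed"] or delta["modified"]):
        # Outcome 1: the product removed every spyware element and nothing else.
        return "clean removal"
    if delta["added"]:
        # Outcome 2: elements introduced by the spyware are still present.
        return "residual spyware elements"
    # Outcome 3: elements present at the start are now missing, i.e. the
    # product removed non-spyware components (False Positives).
    return "False Positive removals"
```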


Anti-Virus Software Testing Methodology

A clear and concise method for comparative testing of anti-virus software

Introduction

As with testing anti-spyware products, anti-virus product evaluation also encompasses the steps outlined in the sections above. However, depending on the nature of the malicious code, these steps may vary to some degree. To truly evaluate the effectiveness of anti-virus applications, the analysis must assess the following:

• The ability to detect and remove viruses on demand (i.e. True Positives)

• The ability to detect and prevent replication of viruses on access

• The mis-identification of non-virus elements as viruses (i.e. False Positives)

• The ability to clean infected files, when possible, while preserving original data and functional integrity

• The ability to handle file-access conflicts

• The ability to detect items within multi-level compressed archives

• The restoration of user-selected quarantined items to their pristine state

In essence, a good anti-virus software analysis should evaluate both the detection ability and the intelligent post-detection behavior of the product under study.

Due to the stubborn nature of most virus infections, an anti-virus product should not only be able to detect threats, but also be capable of taking intelligent decisions to counter the malicious activity and completely remove all traces of the virus.

For example, a virus locked by another process, or one with threads running in memory, would be difficult to remove completely despite detection. A good anti-virus product should be able to eliminate all traces of the virus by marking it for quarantine or deletion upon reboot.

Testing Methodology

This testing methodology starts with a clean install of Microsoft Windows XP SP2. No other software products are installed on the system. The methodology employs a disk imaging system such as Acronis [3] to enable restoration of the test system to its known state. It is important that Operating System Virtualization software (e.g. VMware) not be used, as it may corrupt the normal operation of virus and anti-virus products. Other tools employed include an Installation Analysis tool such as InstallWatch [4] to capture the state of the test system as the analysis progresses.

[3] Acronis True Image. See http://www.acronis.com

[4] InstallWatch is a freeware tool that captures the state of a system. See http://www.epsilonsquared.com/installwatch.htm

Figure 8. Overall Flow of the Testing Process

Prepare the Clean State Testing System

1. Install Microsoft Windows XP and SP2 (Service Pack 2).

2. Install a System Imaging Product (such as Acronis).

3. Install InstallWatch, but do not perform a scan.

4. Create a complete image of the test system. This image is the Starting Testing System Image.

Note: Ensure the test system is isolated from all other network resources to avoid spreading the contamination.

Capture the Starting State Image

1. Install the anti-virus product under test and run a complete initial scan of the system. Ensure that the scan does not detect any threats (since the system is in a known clean state, detection of a virus at this point would be considered a False Positive).

2. Run InstallWatch to capture the state of the system. This capture is the Starting State Capture and will be used as a baseline to compare against the Infected State Capture and the Ending State Capture.


Infect the Test System

1. Install a large variety of virus samples. To fully test the effectiveness of an installed anti-virus product, it would be desirable to have the following present on the test system:

a) A virus process running in memory space

b) An unauthorized virus registry trace

c) A virus record within an XP System Restore folder

d) Virus samples within multi-level compressed archives

e) An external boot sector virus

f) A cleanable virus-infected file

g) A virus-infected file locked by an existing process (for example, open the file using a text editing utility such as TextPad [5])

The above list is not mandatory; to test basic virus detection, a simple file such as the EICAR test file should suffice (a minimal sketch is given after this list). When performing side by side anti-virus comparisons, this set of example viruses must remain consistent for all products within the comparison.

2. Run InstallWatch to capture the current state of the system and compare it to the Starting State Image. This capture is the Infected State Capture. System changes shown in this capture are the direct result of installing the example virus.
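For the EICAR option mentioned in step 1, a minimal sketch is shown below: it writes the standard 68-byte EICAR test string to a file and then wraps it in several levels of zip archives to exercise multi-level archive scanning (item d above). The string is reproduced from memory and should be verified against the published EICAR standard before use; file names and the nesting depth are illustrative, and a resident scanner with on-access protection may block these writes, which is itself a useful observation.

```python
import io
import zipfile

# The 68-byte EICAR anti-virus test string: harmless by design, but any
# compliant scanner should flag it (verify against the published standard).
EICAR = (
    "X5O!P%@AP[4\\PZX54(P^)7CC)7}$"
    "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
)

def drop_eicar(path="eicar_test.com"):
    """Write the EICAR test file for on-demand detection testing."""
    with open(path, "wb") as f:
        f.write(EICAR.encode("ascii"))
    return path

def nest_in_archives(payload_path, depth=3, out_path="nested_eicar.zip"):
    """Wrap the payload in `depth` levels of zip archives so the scanner's
    handling of multi-level compressed archives can be observed."""
    with open(payload_path, "rb") as f:
        data = f.read()
    entry_name = payload_path
    for level in range(depth):
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            zf.writestr(entry_name, data)
        data = buf.getvalue()
        entry_name = f"level{level + 1}.zip"
    with open(out_path, "wb") as f:
        f.write(data)
    return out_path
```

Running drop_eicar() followed by nest_in_archives("eicar_test.com") yields both a bare test file for on-demand scanning and a nested archive for the archive-depth test.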

Capture the Ending State Image

1. Perform a complete system scan using the subject anti-virus product. Follow through and remove all detected and flagged virus elements (this includes known False Positives).

2. Some virus threats can be completely removed by rebooting the machine and running a scan in safe mode. Some anti-virus products ship with a safe mode client that is optimized for a 640x480 resolution setting.

3. Run InstallWatch to capture the state of the system after running the subject anti-virus product and compare the state to the Starting State Capture. This is the Ending State Capture and will list all changes to the system as the result of running the subject anti-virus product.

[5] TextPad is a powerful, general purpose editor for plain text files. See http://www.textpad.com


Miscellaneous Tests

1. Not only should the anti-virus application be able to detect threats on scan; it should also prohibit the introduction and replication of the same with its on-access protection turned on. Verify that the anti-virus tool does not allow copying and execution of malicious code from external sources such as floppy and CD/DVD-ROM drives, USB devices and other network resources.

2. Certain viruses are capable of piggy-backing onto other files. Test the ability of the anti-virus product to detect and clean such infected files. In general, the file cleaning operation should adhere to the following rules (a hash-comparison sketch is given after this numbered list):

• No traces of the virus remain within the host file post-cleanup

• The file content is exactly the same as before infection

• The file performs all functions as before and its associations are maintained

• The cleaning activity does not negatively impact other files on the system in any way

• If the cleaning fails, the system is not rendered unusable

3. Test the anti-virus product’s ability to take intelligent decisions when handling access conflicts by locking an infected file during scan. A good anti-virus product should be capable of detecting in-use infected files and marking them for cleaning, quarantine or deletion upon system reboot. Ensure appropriate action is taken upon system reboot.

4. Once flagged, restore a detected threat from the quarantine list. The anti-virus product under test should place the marked file in its original location, without changing its content, functionality or properties.

5. Lastly, test the application for its ability to accurately log and report all threats encountered and subsequent actions taken upon the same.
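One way to check the content-integrity rule in item 2 mechanically is to compare a hash of the host file taken before infection with a hash taken after cleaning. The sketch below assumes a pristine copy of the file was set aside before the sample virus was introduced; the helper names are illustrative. Byte-for-byte identity covers the content rule only, so functional behavior and file associations still need to be verified by hand.

```python
import hashlib

def file_digest(path):
    """Return the SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def content_restored(pristine_copy, cleaned_file):
    """True if the cleaned file is byte-for-byte identical to the copy
    preserved before infection, i.e. the original content was restored."""
    return file_digest(pristine_copy) == file_digest(cleaned_file)
```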

Test Results Analysis

Analysis of the results is a simple matter of comparing the captured states of the system. Differences between the Starting State Capture and the Infected State Capture indicate changes to the system as the direct result of installing the example virus. Differences between the Starting State Capture and the Ending State Capture indicate some type of failure in the subject anti-virus product. These failures may be in the detection of viruses, or False Positives.


Figure 9. Overall Test Result Analysis

Comparing Starting and Infected State Captures

As stated above, the differences (or delta) between the Starting State Capture and the Infected State Capture are the direct result of installing the sample virus. No difference between these two captures indicates a testing error (the sample virus did not install correctly). The delta between these two captures is important because it calls out the actual changes the virus made to the system.


Figure 10. Comparison of Starting State to Infected State Captures

Comparing Starting and Ending State Images

The differences between the Starting State Capture and the Ending State Capture show the true effectiveness of any anti-virus product.

Figure 11. Comparing Starting State to Ending State Captures

When comparing the captures, there are three main possible outcomes:

1. No differences – This is the result of a very effective anti-virus product. The product accurately detected and removed all virus elements. An effective anti-virus product should be able to identify legitimate virus samples and restore the system post-scan to its exact state prior to infection.

Figure 12. Starting State and Ending State Captures are the Same

2. Ending State contains more elements than the Starting State – This is the result of an anti-virus product that was not able to detect and remove all elements of the sample virus.

Figure 13. Ending State Capture Containing More Elements than Starting State Capture


3. Starting State contains more elements than the Ending State – This is the result of an anti-virus product that has detected and removed too many elements. Some or all of the missing elements are non-virus components. These are False Positive failures. This is a very dangerous situation, as the anti-virus product may remove user data or key components of the operating system, rendering it unusable.

Figure 14. Starting State Capture Containing More Elements than Ending State Capture

Summary

Testing the effectiveness of anti-spyware and anti-virus products requires clean, concise methods. The starting state and configuration of a test system should be well known and always the same. When running a test of a specific anti-spyware or anti-virus product, only that product and the example spyware or viruses should be installed on the test system. State captures of the test system should be taken at each phase of the test. When comparing multiple products, the test system should be restored to its starting state configuration (using the Starting Testing System Image) before each product is tested. This method of testing ensures unambiguous results and fair comparisons.
