D6.1 – GEEWHEZ Test plan

Coordinator: Zoomarine Italia S.p.A. – Claudio Di Capua

FP7-SME-2011

Research for the benefit of specific groups

Project Start Date: 1st October 2011    Duration: 24 months

FP7-SME-2011-1 Grant Agreement 286533 - 25 August 2011 - Version: 1.0

Project co-funded by the European Commission within the Seventh Framework Programme (2007 - 2013)


Document information

Title: GEEWHEZ Test plan

Workpackage: 6

Responsible: UC3M

Due date: Project month 04 (January 2012)

Type: Report

Status: Version 1.0

Dissemination: Public

Authors: Mario Muñoz Organero, Marco Hernaiz Cao, Claudia Brito Pacheco, Marco Vettorello, Juan Rosell Ortega


Table of Contents:

1 Executive Summary
2 Introduction
  2.1 Objectives
  2.2 Scope
    2.2.1 Testing Techniques
  2.3 Outside the Scope
3 Outline of Planned Tests
  3.1 Unit Testing
  3.2 Integration Testing
  3.3 System Testing
    3.3.1 User Interface Testing
    3.3.2 Security Testing
    3.3.3 Performance Testing
    3.3.4 Recovery Testing
  3.4 Acceptance Testing
  3.5 Regression Testing
4 Test Plan Criteria
  4.1 Pass/Fail Criteria
  4.2 Suspension Criteria and Resumption Requirements
    4.2.1 Suspension Criteria
    4.2.2 Resumption Requirements
5 Test Deliverables
6 Environmental Needs
  6.1 Hardware
  6.2 Software
  6.3 Tools
7 Responsibilities
8 Schedule
9 Risks and Contingencies
10 References
11 Acronyms


List of Figures:

Figure 1: Black-Box Testing
Figure 2: Testing document production phases
Figure 3: Test Plan Schedule

List of Tables:

Table 1: Comparison between White- and Black-Box Testing
Table 2: Levels of Software Testing
Table 3: Hardware required
Table 4: Software required
Table 5: Tools required
Table 6: Responsibilities


1 Executive Summary

This deliverable captures the GEEWHEZ project Test Plan intended to describe the scope, approach, resources, and schedule of the testing activities. The document also identifies the test plan deliverables, the participants responsible for implementing each task, and the risks associated with the plan.

This document (D6.1, delivered at month 4 of the project) is intended to serve as a framework document for the consideration of these issues. During the development of the testing activities it shall be used in conjunction with the following documents:

• Considered scenarios

• Analysis of scenarios and extraction of functional and non-functional requirements

• System Architecture Specification


2 Introduction

2.1 Objectives

This Test Plan aims to collect all the information necessary to plan and control the testing activities to be performed for the GEEWHEZ system. To achieve this objective, this document takes into account the following issues:

1. Outline the testing approach that will be used; in other words, provide a methodology specifying what the team involved in the testing activities should verify and the types of tests they will perform.

2. List the resulting deliverables of the testing activities.

3. Identify both human and non-human resources required to perform the Test Plan.

4. Provide a timeline with milestones for the testing phase.

The testing activities described in this Test Plan are intended to:

1. Ensure that the GEEWHEZ system meets the specifications and design criteria specified in the following documents: “Analysis of scenarios and extraction of functional and non-functional requirements” and “System Architecture Specification”.

2. Ensure that the GEEWHEZ system is stable and bug-free, and that the risk of software/hardware failure is reduced to a minimum.

In a nutshell, these tasks aim to verify the proper operation of the GEEWHEZ platform and its modules.

2.2 Scope

This document is intended to provide a test plan which describes the testing activities to be performed to verify the accuracy, reliability and completeness of the GEEWHEZ system. This test plan will consist of unit, integration, system, acceptance and regression testing. Testing techniques that will be performed include white- and black-box testing.

2.2.1 Testing Techniques

Software testing is one of the “verification and validation” (V&V) software practices. The distinction between the two can be summarized as follows: verification (the first V) asks the question “Are we building the system right?”, whereas validation (the second V) asks the question “Are we building the right system?”

Answering these questions requires two complementary testing techniques: white-box and black-box testing. The main difference between the two techniques is the tester’s view of the system.

Generally speaking, white-box testing is a verification technique that takes into account the internal mechanism of a system or component [1]. Its main objective is to verify the internal workings of the system, specifically the logic and the structure of the code. Software engineers can use it to examine whether their code works as expected. White-box testing is also known as structural testing, clear box testing, and glass box testing.

Black-box testing (also called functional testing or behavioral testing) is a validation technique that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions [1]. The goal of this type of testing is to test how well the system conforms to the specifications. The software tester does not (or should not) have access to the source code itself.

Black-box testing attempts to find errors in the external behavior of the code in the following categories [2]: (1) incorrect or missing functionality; (2) interface errors; (3) errors in data structures used by interfaces; (4) behavior or performance errors; and (5) initialization and termination errors.

Figure 1: Black-Box Testing

As mentioned above, it is best if the person who plans and executes black box tests is not the programmer of the code and does not know anything about the structure of the code. The programmers of the code are innately biased and are likely to test that the program does what they programmed it to do.

The following table summarizes the main differences between the two testing techniques:

                   | Tester Visibility       | A failed test case reveals       | Controlled?
White-Box Testing  | Code structure          | A problem (fault)                | Yes; it helps to identify the specific lines of code involved.
Black-Box Testing  | System’s inputs/outputs | A symptom of a problem (failure) | No; it can be hard to find the cause of the failure.

Table 1: Comparison between White- and Black-Box Testing

To make the distinction between fault and failure clearer, consider their definitions: a fault is an incorrect step, process, or data definition in a program [1], whereas a failure is the inability of a system or component to perform its required function within the specified performance requirement [1].


2.3 Outside the Scope

Some tests omitted from this Test Plan include installation and job stream testing. These tests should be performed during the deployment phases at each ATP (Animal Theme Park).


3 Outline of Planned Tests

The participants involved in the testing activities will use the system documentation to prepare all test case specifications. This approach will verify the accuracy and comprehensiveness of the information in the documentation in those areas covered by the tests.

The partners in charge of the actual research and development of the GEEWHEZ modules are responsible for performing the tests mentioned below: unit, integration, system, acceptance and regression testing. For each test level, four aspects will be specified: the testing technique (white-box or black-box testing), the specification tested against (the actual code structure, the low/high-level design, or the system requirements), the scale (whether the tester examines a small piece of code or the whole system and its environment), and the tester (the software developer, an independent tester, or the customer).

3.1 Unit Testing

Unit testing will test individual hardware or software components and their functions in isolation. Unit testing is important for ensuring that a component is solid before it is integrated with other components.

Testing Technique: White-box testing

Specification: Code structure and/or low-level design

This low-level form of testing will consist of white-box testing. If adequate white-box testing is not done properly, simple unit faults may only surface later during black-box testing.

Using white-box testing techniques, testers (usually the developers who wrote the code) verify that the code does what it is intended to do at a very low structural level. The tester performs this task by writing test code (contained in a test case, which in turn belongs to a test suite) that calls a method with certain parameters and checks that the return value of this method is as expected, as in the sketch below.
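The sketch below shows what such a TestNG test might look like. TestNG is the unit-testing framework listed in section 6.3; the TicketPriceCalculator class and its pricing rule are hypothetical examples introduced only for illustration, not part of the GEEWHEZ code base:

    import org.testng.annotations.Test;
    import static org.testng.Assert.assertEquals;

    // Hypothetical unit under test: the class and its pricing rule are
    // illustrative only, not GEEWHEZ code.
    class TicketPriceCalculator {
        private final double basePrice;
        TicketPriceCalculator(double basePrice) { this.basePrice = basePrice; }
        double priceForAge(int age) {
            return age < 12 ? basePrice * 0.5 : basePrice; // children pay half
        }
    }

    public class TicketPriceCalculatorTest {
        @Test
        public void childTicketGetsHalfPrice() {
            TicketPriceCalculator calc = new TicketPriceCalculator(20.0);
            assertEquals(calc.priceForAge(8), 10.0, 0.001); // expected return value
        }

        @Test
        public void adultTicketPaysFullPrice() {
            TicketPriceCalculator calc = new TicketPriceCalculator(20.0);
            assertEquals(calc.priceForAge(30), 20.0, 0.001);
        }
    }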

3.2 Integration Testing

Integration testing is a type of testing in which software components, hardware components, or both are combined and tested to confirm that they interact according to their requirements [1]. Integration testing can continue progressively until the entire system has been integrated.

Testing Technique: White- and black-box testing

Specification: Low- and high-level design

Integration testing will allow testing of all the individually tested units together as a whole.

Using both white- and black-box testing techniques, the tester (still usually the software developer) verifies that units work together when they are integrated into a larger code base. To plan these integration test cases, testers look at low- and high-level design documents. A minimal sketch follows.
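As an illustration only, the sketch below tests two hypothetical units through a real (non-stubbed) collaborator; all class names are assumptions made for the example, not GEEWHEZ interfaces:

    import java.util.ArrayList;
    import java.util.List;

    import org.testng.annotations.BeforeMethod;
    import org.testng.annotations.Test;
    import static org.testng.Assert.assertEquals;

    // Two hypothetical units wired together: the subject of the test is the
    // interaction between them, not either unit in isolation.
    class BookingStore {
        private final List<String> bookings = new ArrayList<String>();
        void save(String booking) { bookings.add(booking); }
        int count() { return bookings.size(); }
    }

    class BookingService {
        private final BookingStore store;
        BookingService(BookingStore store) { this.store = store; }
        void book(String visitorName) { store.save(visitorName); }
    }

    public class BookingIntegrationTest {
        private BookingStore store;
        private BookingService service;

        @BeforeMethod
        public void setUp() {
            store = new BookingStore();
            service = new BookingService(store); // real collaborator, not a stub
        }

        @Test
        public void bookingIsPersistedThroughTheStore() {
            service.book("visitor-42");
            assertEquals(store.count(), 1); // the units interact as specified
        }
    }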

(10)

Page 10 of 23

3.3 System Testing

System testing is testing conducted on a complete, integrated system to evaluate the system’s compliance with its specified requirements in representative environments [1]. Because system testing is done with a full system implementation and environment, several classes of testing can be performed that examine non-functional properties of the system. This Test Plan includes the following:

• User Interface Testing

• Security Testing

• Performance Testing

• Recovery Testing

Testing Technique: Black-box testing

Specification: High-level design

3.3.1 User Interface Testing

The purpose of user interface testing is to verify that the system’s GUI meets its written specifications.

The GUI will be tested by comparing the user interface requirements specified in the document “Analysis of scenarios and extraction of functional and non-functional requirements” with the actual implementation of the GEEWHEZ system. These requirements may include user interface issues such as aesthetics, validation, navigation, usability and data integrity conditions. A sketch of an automated UI check follows.
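Selenium (listed in section 6.3) can drive such checks from TestNG. The sketch below is a minimal example in which the URL, the form field names and the expected page title are placeholder assumptions; the real values would come from the user interface requirements:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;
    import org.testng.annotations.AfterClass;
    import org.testng.annotations.BeforeClass;
    import org.testng.annotations.Test;
    import static org.testng.Assert.assertTrue;

    // Minimal Selenium WebDriver sketch driven from TestNG.
    public class LoginPageUiTest {
        private WebDriver driver;

        @BeforeClass
        public void openBrowser() { driver = new FirefoxDriver(); }

        @Test
        public void loginFormNavigatesToDashboard() {
            driver.get("http://localhost:8080/geewhez/login"); // placeholder URL
            driver.findElement(By.name("username")).sendKeys("tester");
            driver.findElement(By.name("password")).sendKeys("secret");
            driver.findElement(By.name("password")).submit();
            assertTrue(driver.getTitle().contains("Dashboard")); // expected page
        }

        @AfterClass
        public void closeBrowser() { driver.quit(); }
    }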

3.3.2 Security Testing

Security testing is performed to determine that the system protects data, prevents information leakage, and maintains its functionality as intended. It includes the following:

• Authentication: Allowing a receiver to have confidence that the information it receives originated from a specific known source.

• Authorization: Determining that a requester is allowed to receive a service or perform an operation.

• Confidentiality: Protecting data or information from disclosure to parties other than the intended ones.

• Integrity: Checking that the intended receiver receives information or data that has not been altered in transmission.

• Non-repudiation: Interchanging authentication information with some form of provable time stamp.

The system’s or component’s security will be evaluated against the security requirements specified in the document “Analysis of scenarios and extraction of functional and non-functional requirements”. These requirements may include security issues such as passwords and permissions, and whatever login or authentication method is in use. For instance, in order to verify permission requirements, the tester will verify that a user who is not logged in as an administrator cannot carry out administrative functions. A sketch of such a check is shown below.
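REST Assured (listed in section 6.3) can express this kind of authorization check against the system’s REST services. In the sketch below the /admin/users endpoint, the credentials and the expected status codes are placeholder assumptions; the package names follow the REST Assured 1.x releases:

    import org.testng.annotations.Test;
    import static com.jayway.restassured.RestAssured.expect;
    import static com.jayway.restassured.RestAssured.given;

    // Hypothetical authorization checks on an administrative endpoint.
    public class AdminAuthorizationTest {

        @Test
        public void anonymousUserCannotReachAdminFunctions() {
            // No credentials supplied: the service must refuse the request.
            expect().statusCode(401).when().get("/admin/users");
        }

        @Test
        public void ordinaryUserCannotReachAdminFunctions() {
            // Valid but non-administrative credentials: access must be forbidden.
            given().auth().basic("plainUser", "secret")
                   .expect().statusCode(403)
                   .when().get("/admin/users");
        }
    }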

Although some basic security tests are usually included in system testing, system testing does not focus on items such as how to obtain administrative privileges outside of the user login. Although "security" is often presented as merely an aspect of system testing, it really needs to be considered and planned separately. It may take place alongside system testing, but it has almost an opposite focus.

3.3.3 Performance Testing

This testing verifies that a system or component performs according to customer expectations (response time, availability, portability, and scalability).

The system’s or component’s performance will be evaluated against the performance requirements specified in the document “Analysis of scenarios and extraction of functional and non-functional requirements”. A sketch of the kind of check such a requirement implies follows.
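Realistic load testing in this Test Plan relies on JMeter (section 6.3). Purely to illustrate what a response-time requirement means operationally, the JDK-only sketch below times a single request; the URL and the 2-second threshold are placeholder assumptions:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Minimal single-request response-time probe (illustrative only).
    public class ResponseTimeProbe {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://localhost:8080/geewhez/status"); // placeholder
            long start = System.nanoTime();
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            InputStream in = conn.getInputStream();
            while (in.read() != -1) { /* drain the response */ }
            in.close();
            long elapsedMs = (System.nanoTime() - start) / 1000000L;
            System.out.println("Response time: " + elapsedMs + " ms");
            if (elapsedMs > 2000) { // placeholder requirement: answer within 2 s
                throw new AssertionError("Performance requirement violated");
            }
        }
    }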

3.3.4 Recovery Testing

The purpose of recovery testing is to check how fast the system can restart after any type of crash or hardware failure has occurred. Recovery testing is also done to ensure that system backup and recovery facilities operate as designed.

3.4 Acceptance Testing

This formal testing is conducted to validate the system’s compliance with all its requirements (functional and non-functional) in the customer’s environment. These requirements are specified in the document “Analysis of scenarios and extraction of functional and non-functional requirements”. It also ensures appropriate system acceptance by the users (the SMEs owning the GEEWHEZ modules).

Testing Technique: Black-box testing

Specification: Requirements specification

After the entire system has been fully tested, it is ready to be delivered to the SMEs. They will be responsible for writing black-box acceptance tests based on their expectations of the functionality, with the assistance of the test team. The test team will run these tests before attempting to deliver the system.

3.5 Regression Testing

Regression testing is selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements [1]. It is usually done to ensure that applied changes to the system have not adversely affected previously tested functionality.

Testing Technique: White- and black-box testing

Specification: Changed documentation (requirements and/or design specification) and high-level design.

(12)

Page 12 of 23

Since regression tests are run throughout the development cycle, there can be white-box regression tests at the unit and integration levels and black-box tests at the integration, system and acceptance test levels.

It is assumed that several iterations of the regression test will be done in order to test system modifications made during the system test period. A regression test will be performed for each new version of the system to detect unexpected impact resulting from system modifications.

The following guidelines should be used when choosing a set of regression tests (also referred to as the regression test suite); a sketch of how such a suite can be selected follows the list:

• Choose a representative sample of tests that exercise all the existing software functions.

• Choose tests that focus on the software components/functions that have been changed.

• Choose additional test cases that focus on the software functions that are most likely to be affected by the change.
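One way to keep such a suite selectable with TestNG (section 6.3) is to tag test methods with groups and include only the relevant group in the suite definition. The group names and the test class below are illustrative assumptions:

    import org.testng.annotations.Test;
    import static org.testng.Assert.assertNotNull;

    // Hypothetical tests tagged so a regression run can select just its subset.
    public class GateControlTest {

        @Test(groups = { "regression", "smoke" })
        public void gateOpensForValidTicket() {
            assertNotNull("placeholder for a check on the changed function");
        }

        @Test(groups = { "regression" })
        public void gateStaysClosedForExpiredTicket() {
            assertNotNull("placeholder");
        }
    }

A testng.xml suite file can then include the “regression” group so that only these tests are re-run for each new version of the system.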

The following table summarizes the five levels of testing included in this Test Plan:

Testing Type | Opacity                 | Specification                            | General Scope                                | Tester
Unit         | White-Box               | Actual code structure; low-level design  | Small unit of code, no larger than a class   | Programmer who wrote the code
Integration  | White-Box and Black-Box | Low-level design; high-level design      | Multiple classes                             | Programmer(s) who wrote the code
System       | Black-Box               | Requirements analysis                    | Whole system in representative environments  | Independent tester
Acceptance   | Black-Box               | Requirements analysis                    | Whole system in customer’s environment       | Customer
Regression   | White-Box and Black-Box | Changed documentation; high-level design | Any of the above                             | Programmer(s) or independent testers

Table 2: Levels of Software Testing


4 Test Plan Criteria

4.1 Pass/Fail Criteria

• All test suites completed for software modules.

• For hardware modules, a specified number of tests completed without errors and a specified percentage completed with only minor defects.

4.2 Suspension Criteria and Resumption Requirements

4.2.1 Suspension Criteria

Test suite execution will be suspended if a critical failure that impedes the ability or value in performing the associated test(s) is discovered.

4.2.2 Resumption Requirements

When a new version of the system is developed after a suspension of testing has occurred, a regression test as described in section 3.5 will be run.


5 Test Deliverables

The following documents will be generated during the GEEWHEZ testing process:

a) Test Plan (this document)

b) Test Case Specifications

For each GEEWHEZ module’s and the Middleware’s testing requirements, these documents specify the exact input values to be supplied, the values of any standing data required, the exact output values and internal system state changes expected, and any special steps for setting up the tests. They also specify how the tester will physically run the test, the physical set-up required, and the procedure steps that need to be followed.

c) Test Reports

These documents record, for each GEEWHEZ module and the Middleware, the details of which Test Cases have been run, the order of their running, and the results of the tests. Each result is either a pass, meaning that the actual and expected results were identical, or a fail, meaning that there was a discrepancy. If there is a discrepancy, the report also records all details of the incident, such as actual and expected results, when it failed, and any supporting evidence that will help in its resolution. The report will also include, where possible, an assessment of the impact of an incident upon testing. This report can be generated automatically during unit testing by the testing framework used (e.g. TestNG), but it should include the impact on the overall testing procedure. A sketch of producing such a report follows.
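As a minimal sketch, TestNG can be invoked programmatically and will write its default HTML/XML report files to an output directory; the runner class and the test class it lists (the hypothetical example from section 3.1) are assumptions:

    import org.testng.TestNG;

    // Runs a module's test classes and produces TestNG's built-in reports.
    public class ModuleTestRunner {
        public static void main(String[] args) {
            TestNG testng = new TestNG();
            testng.setTestClasses(new Class[] { TicketPriceCalculatorTest.class });
            testng.setOutputDirectory("test-output"); // where reports are written
            testng.run();
        }
    }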

d) Test Summary Report

This report brings together all pertinent information about the testing, including an assessment about how well the testing has been done, the number of incidents raised and outstanding, and crucially an assessment about the quality of the system.

The following figure illustrates the testing document production phases and how the documents are related to each other:

The Test Plan is an essential part of the GEEWHEZ project documentation. As mentioned in section 1, it shall be used in conjunction with other project documents to prepare the Test Case Specifications documents, one for each GEEWHEZ module (Water Monitoring System, Surveillance System, Leisure Services and Administrative Tools) plus one more for the Middleware. Each of these will consist of two or more Test Case Specifications.

After all Test Cases defined for a module have been executed, a new document, the Test Report, containing the results of this module’s test executions is generated.

Once the entire system has been fully tested, all Test Reports are integrated. As a result the Test Summary Report is produced.

Figure 2: Testing document production phases

6 Environmental Needs

This section presents the non-human resources required for the GEEWHEZ Test Plan.

6.1 Hardware

The following list summarizes the system resources required in the test environment for the development of this Test Plan:

Name                            | Quantity | Type and Other Notes
Wireless Network Infrastructure | 1        | A tested and deployed wireless network infrastructure connected to the ATP intranet.
Android smartphones             | 10       | Android smartphones with different OS versions (2.1 to 4.0).
Mainframe Cabinet               | 1        | Cabinet containing the CPU in order to protect it.
Thermographic camera            | 2        | Thermographic cameras for night/day vision.
Visible range camera            | 1        | Visible range camera.
CPU                             | 1        | CPU for industrial environments.
Water Treatment Integrated Unit | 18       | Central unit connected to probes and pumps for monitoring and adjusting water parameters.

Table 3: Hardware required

6.2 Software

The following list shows all the software elements required in the test environment for the implementation of this Test Plan:

Name                                  | Version | Licenses    | Type and Other Notes
Windows OS                            | 7       | 3           | Operating system.
Eclipse Indigo SDK                    | 3.7.1   | Open Source | IDE, mostly provided in Java.
JDK                                   | 1.7     |             | Development environment for building applications, applets, and components using the Java programming language.
IE, Firefox, Chrome, Safari and Opera |         |             | Web browsers.
—                                     |         |             | Object-Relational DBMS.

Table 4: Software required

6.3 Tools

The following tools will be employed to support the test process for this Test Plan:

Brand Name              | Version | Vendor            | Type and Other Notes
Trac                    | 0.12.2  | Edgewall Software | Defect/issue tracking. http://minerva.netgroup.uniroma2.it/geewhez
Maven                   | 3.0.3   | Apache            | Software project management and comprehension tool. http://maven.apache.org/
TestNG                  | 6.3.1   | TestNG            | Testing framework inspired by JUnit and NUnit. http://testng.org/doc/index.html
Monkey                  | r16     | Google            | Test suite for Android UIs that generates pseudo-random streams of user events such as clicks, touches, or gestures, as well as a number of system-level events.
Android JUnit Extension | r16     | Google            | Component-specific test case classes for the Android environment.
REST Assured            | 1.5     | Jayway            | Framework for testing and validating REST services. Based on JUnit. http://code.google.com/p/rest-assured/
Selenium                | 2.17    | SeleniumHQ        | Web browser automation testing tool for automating web applications. Testing can be automated in TestNG. http://seleniumhq.org/
JMeter                  | 2.5.1   | Apache            | Graphical server performance testing tool used to simulate a heavy load on a server, network or object to test its strength or to analyse overall performance under different load types. http://jmeter.apache.org/
t.a.w.                  | 1.0     | CTIC              | Accessibility tool for the analysis of Web sites, based on the W3C guidelines. http://www.tawdis.net/
Fiddler2                | 2.0     |                   | Web debugging proxy which logs all HTTP(S) traffic between the computer and the Internet. http://fiddler2.com/fiddler2/
Watcher                 | 1.5.4   |                   | Web security testing tool and passive vulnerability scanner. http://websecuritytool.codeplex.com/
Wallflower              | -       | Microsoft         | Benchmark used for white-box tests of surveillance algorithms. http://research.microsoft.com/en-us/um/people/jckrumm/wallflower/testimages.htm

Table 5: Tools required

About the water monitoring system testing environment: the GEEWHEZ consortium submitted to the REA an amendment asking for the inclusion in the consortium of a new certified partner in charge of the future development of this system. If the amendment is approved, this new partner will provide a complete description of the water monitoring test procedure and tools.


7 Responsibilities

The following table shows the participants responsible for implementing each task:

Task                                                                    | Involved Participant(s)
Develop and execute Middleware test suites                              | MATEMATICI, UC3M, T-CON
Develop and execute Surveillance System test suites                     | FAICO
Develop and execute Water Monitoring System test suites                 | Technovation (*)
Develop and execute Leisure Services test suites                        | T-CON
Develop and execute Administrative Tools test suites                    | UC3M
Develop and execute GEEWHEZ First Integration test suites               | FAICO
Develop and execute GEEWHEZ Second Integration test suites              | T-CON, UC3M
Develop and execute GEEWHEZ Final Integration and Prototype test suites | T-CON, UC3M

Table 6: Responsibilities

(*) About the participant involved in the development and execution of the Water Monitoring System test suites: the GEEWHEZ consortium submitted to the REA an amendment asking for the inclusion in the consortium of a new certified partner in charge of the future development of this system. If the amendment is approved, this new partner will be responsible for performing this task.


8 Schedule

Figure 3: Test Plan Schedule

9 Risks and Contingencies

These are the overall risks to the project, with a special emphasis on the testing process:

• Lack of personnel resources when testing is to begin

• Lack of availability of required hardware

• Late delivery of the hardware

• Delays in training on the system and/or tools

• Changes to the original requirements or designs

If the requirements change after their formal definition, the following actions will be taken:

• The test schedule and development schedule will move out an appropriate number of days.

• The number of tests performed will be reduced.

• The number of acceptable defects will be increased.

• Resources will be added to the test team.

• The test team will work overtime (this could affect team morale).

• The scope of the plan may be changed.

• There may be some optimization of resources. This should be avoided, if possible, for obvious reasons.


10 References

[1] IEEE, "IEEE Standard 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology," 1990.

[2] R. Pressman, Software Engineering: A Practitioner's Approach. Boston: McGraw Hill, 2001.

This Test Plan is based on the IEEE 829-2008 Standard for Software Test Documentation.


11 Acronyms

Acronym Description

ATP Animal Theme Park

CPU Central Processing Unit

DBMS DataBase Management System

FAICO Fundación Andaluza de Imagen, Color y Óptica

GUI Graphical User Interface

HTTP HyperText Transfer Protocol

HTTPS HyperText Transfer Protocol Secure

IDE Integrated Development Environment

IE Internet Explorer

JDK Java Development Kit

OS Operating System

REST Representational State Transfer

SDK Software Development Kit

SMEs Small and Medium-sized Enterprises

SQL Structured Query Language

UC3M Universidad Carlos III de Madrid

UI User Interface
