
Quality Engineering for Test Artifacts

Jens Grabowski, Benjamin Zeiss
University of Göttingen, Germany

TAROT 09

Motivation

 Test specification complexity is growing
 Current TTCN-3 test suites:
   SIP (ETSI TS 102 027-3, v4.2.5): 61,990 LOC
   IPv6 Core Protocol (ETSI TS 102 516, v3.1.1): 67,580 LOC
   3GPP / LTE (STF 160): 71,687 LOC

 Quality Assurance (QA) of test specifications is necessary!
 Quality assurance classification:
   Organizational Quality Assurance (infrastructure and management)
   Constructive Product Quality Assurance (avoiding issues, preventative)
   Analytical Product Quality Assurance (eliminating issues, reactive)
 How can we apply QA to test specification development?


TTCN-3 (1/2)

 Testing and Test Control Notation version 3
 Language for specifying and implementing distributed tests
 Standardized by
   European Telecommunications Standards Institute (ETSI),
   International Telecommunication Union (ITU).
 History:
   Standardization bodies publish (e.g. for ISDN, GSM, UMTS):
     Specification of a communication protocol,
     Test suite to check conformance of a protocol implementation to its specification.
   Industry:
     Implements specified protocols in their equipment,
     Executes standardized test suites against their implementations.
 Today:
   TTCN-3 is used not only in the telecommunication domain, but also for Internet, Service-Oriented Architectures, Automotive, …


TTCN-3 (2/2)

 Example:

module exampleModule {
  ...
  type record IpAddressType {
    charstring ipAddress
  };

  template IpAddressType localhostTemplate := {
    ipAddress := "127.0.0.1"
  }

  testcase exampleTestCase() runs on ExampleComponent {
    portA.send(localhostTemplate);
    alt {
      [] portB.receive(localhostTemplate) {
        setverdict(pass);
      }
      [] portB.receive(IpAddressType:?) {
        setverdict(fail);
      }
    }
  }
}

 Look and feel of common programming languages


Agenda

 Organizational Quality Assurance
 Constructive Product Quality Assurance
 Analytical Product Quality Assurance
 Tools
 Summary / Conclusion


Organizational Quality Assurance

 Assurance of infrastructure and management quality

 Infrastructure quality:

 Configuration management

 Defect management

 …

 Management quality:

 Test process model

 Process improvement models


Organizational Quality Assurance: Test process model (1/3)

 Describes activities that need to happen in development:
   On what subject something needs to happen
   When something happens
   Why it needs to happen
 Example: ISTQB fundamental test process:


Organizational Quality Assurance: Test process model (2/3)

 Example: ISTQB fundamental test process (continued)
 Test planning and test control
   Testing approach, resource planning, test strategy, schedule, metrics definition for monitoring and control
   Test plan document: IEEE 829 Test Plan Specification
 Test analysis and test design
   Decision what to test, how much effort to spend, what types of testing, tools, risk analysis
   Test specification: IEEE 829 Test Design Specification
   Document reviews
 Test implementation and test execution
   Definition of high-level and concrete test cases
   TTCN-3


Organizational Quality Assurance: Test process model (3/3)

 Example: ISTQB fundamental test process (continued)
 Evaluation of exit criteria and reporting
   Test summary report (test progress / test conclusion)
   Exit criteria from the planning phase satisfied?
 Closure
   Wrap-up reports
   Archiving of test documents and test data



Organizational Quality Assurance: Process improvement models

 Generic process improvement models
   Capability Maturity Model Integration (CMMI)
   SPICE (ISO/IEC 15504)
 Test process improvement models
   Test Maturity Model integration (TMMi)
   Test Process Improvement (TPI)
   Critical Testing Processes (CTP)


Organizational Quality Assurance: Process improvement models: TMMi (1/3)

 TMMi = Test Maturity Model integration
 Originally developed by Ilene Burnstein at the Illinois Institute of Technology since 1996
 Complementary to CMMI
 The TMMi reference model describes levels 1-3 in detail
 Can be mapped to other models like TPI


Organizational Quality Assurance: Process improvement models: TMMi (2/3)

 Staged model
 Level 1 (Initial):
   Chaos, ad-hoc debugging
 Level 2 (Managed):
   Test policy and strategy, test planning, test monitoring and control, test design and execution, test environment
   Main objective: does the product satisfy the specified requirements? (Functional Testing)
 Level 3 (Defined):
   Test organization, test training program, test life-cycle and integration, non-functional testing, peer reviews
   Testing at an early project stage
   More rigorous process descriptions


Organizational Quality Assurance: Process improvement models: TMMi (3/3)

 Staged model (continued)
 Level 4 (Management and Measurement):
   Test measurement, software quality evaluation, advanced peer reviews
   Measurements to support decision making
 Level 5 (Optimization):
   Defect prevention, test process optimization, quality control
   Continuous test process improvement based on quantitative understanding



Constructive Product Quality Assurance

 Training and technology certifications

 High-level testing languages

 Best practices:
   Test development guidelines
   Test patterns
 Tooling



Constructive Product Quality Assurance: Training and technology certifications (1/2)

 Quality Assurance Institute (QAI)
   http://www.qaiworldwide.org
   Certified Software Test Engineer (CSTE)
   Certified Software Quality Analyst (CSQA)
 American Society for Quality (ASQ)
   http://www.asq.org
   Certified Software Quality Engineer (CSQE)
   Certified Quality Inspector (CQI)
   …
 International Institute for Software Testing (IIST)
   http://www.iist.org
   Certified Software Test Professional (CSTP)
   Certified Test Manager (CTM)


Constructive Product Quality Assurance: Training and technology certifications (2/2)

 International Software Testing Qualifications Board (ISTQB)
   http://www.istqb.org
   Certified Tester Foundation Level (CTFL)
   Certified Tester Advanced Level (CTAL)
   Certified Tester Expert Level (in preparation)
 German Testing Board (GTB)
   http://www.german-testing-board.de
   TTCN-3 Certificate
     http://german-testing-board.info/de/ttcn3_certificate.shtm



Constructive Product Quality Assurance: High-level testing languages

 Test purpose specification languages:
   Formal enough to parse
   Informal enough for non-technicians to understand
   Possibly: transformation of test purposes into test cases
   Possible choices: ETSI TPLan, UML Testing Profile
 Test specification and implementation languages:
   Strongly typed (to make analysis and refactoring easy)
   High-level, abstracted from technical details
   Allows full test automation
   Possible choices: TTCN-3, JUnit, UML Testing Profile



Constructive Product Quality Assurance: Best practices: test development guidelines

 Naming conventions

 Example: module names start with an upper-case letter

 Code formatting guidelines

 Example: spaces instead of tabs

 Code documentation guidelines

 Example: all test cases must have a @desc and @param description

 Code style guidelines

 Example: no labels or goto statements



Constructive Product Quality Assurance: Best practices: test patterns

 Identify and document general and proven solutions for commonly occurring problems
 Provide a common vocabulary for these solutions
 Catalog with a fixed notation
 Exemplified for TTCN-3: classification proposal for TTCN-3 specification test patterns (Vouffo-Feudijo and Schieferdecker)
   Architectural patterns
   Behavioral patterns
   Data patterns
   Test reuse patterns


Constructive Product Quality Assurance: Best practices: test patterns – example (1/3)

Name: Timer on transmission
Class: Behavior
Testing phases: Specification
Testing goals: All
Application domain: Any
Intent: Avoid deadlock situations when transmitting data from a test component
Context: For the testing of reactive systems, typically tested via interfaces
Parameter: Timer duration


Constructive Product Quality Assurance: Best practices: test patterns – example (2/3)

Roles: test component, source port, destination port
Detailed description: After calling a method from a test component or sending a message on a given port, a timer should be started to avoid deadlock in case the SUT does not reply to the function call or to the transmitted message.
Example:

function timedSend(template <outMessageType> <outMsg>, <OutPortType> <outPort>,
                   timer <t>, template <inMessageType> <inMsg>,
                   <InPortType> <inPort>) return verdicttype
{
  <outPort>.send(<outMsg>);
  <t>.start;
  alt {
    [] <inPort>.receive(<inMsg>) {
      <t>.stop;
      return pass;
    }
    [] <t>.timeout { return fail; }
  }
}


Constructive Product Quality Assurance: Best practices: test patterns – example (3/3)

Consequences: None
Related patterns: Default pattern
Known uses: Protocol testing



Constructive Product Quality Assurance: Best practices: tooling (1/2)

 Are the tools standards conformant?

 Can I switch tool vendors easily?

 Do the tools interoperate with other tools?

 Tool efficiency?

 Integrated Development Environment?

 Documentation generator?


Constructive Product Quality Assurance: Best practices: tooling (2/2)

 Reporting?

 Analytical support?

 Support for quality assurance?

 Version control?

 Defect management system?

 Build automation / continuous integration?



Analytical Product Quality Assurance

Analytical test quality assurance:
 Static testing:
   Without computer: code review, …
   With computer: guideline checking, test metrics, smell detection, …
 Dynamic testing: simulation, model-based analysis, …

 Improvement: test refactoring (and debugging/correction)



Analytical Product Quality Assurance: Static testing: code reviews (1/2)


Analytical Product Quality Assurance: Static testing: code reviews (2/2)

 Review types:
 Informal review
   Reviewers are asked to deliver remarks.
   No meeting, no formal documentation.
 Walkthrough
   The author presents the artifact to the reviewers in a meeting. Remarks are based on the presentation.
 Technical review
   Does the test conform to its specification?
   Fitness for its purpose, standards compliance.
   Meeting, unanimous review result.
 Inspection
   Formal, with focus on possible defects, document quality, product and process quality.
   Strict agenda.



Analytical Product Quality Assurance: Static testing: guideline checking

 Semantic analysis / type checking
   Float assignment to an integer variable,
   Reading of an undefined variable,
   Call to an undefined function,
   Number of arguments mismatch,
   Duplicate function/test case names,
   etc.
   Should be done by the compiler
 Naming conventions / style guidelines
   Test case names should start with the prefix “TC_”,
   Every test case must have a T3Doc comment,
   Log messages must have a certain format,
   Types should be defined in alphabetical order,
   Variables should be declared first in a test case,
   etc.
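Such naming conventions can be checked mechanically. The following Python sketch flags test cases that violate the "TC_" prefix rule; the regex and sample snippet are illustrative only, since a real checker such as ETSICheck works on the parsed syntax tree rather than on raw text:

```python
import re

# Hypothetical guideline: every test case name must start with "TC_".
TESTCASE_DEF = re.compile(r'\btestcase\s+(\w+)\s*\(')

def check_testcase_names(ttcn3_source):
    """Return the names of test cases that violate the TC_ prefix rule."""
    return [name for name in TESTCASE_DEF.findall(ttcn3_source)
            if not name.startswith("TC_")]

example = """
testcase TC_SipInvite() runs on SipComponent { }
testcase exampleTestCase() runs on ExampleComponent { }
"""

print(check_testcase_names(example))  # ['exampleTestCase']
```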



Analytical Product Quality Assurance: Static testing: test metrics (1/11)

“When you measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.”

Lord Kelvin, 1891

“You cannot control what you cannot measure.”

Tom DeMarco, 1982


Analytical Product Quality Assurance: Static testing: test metrics (2/11)

 Metric: measurement scale + measurement
   Mapping of a software property into a numerical value or symbol of the measurement scale
 A vast number of metrics has been suggested:
   Counting (Number of XXX, LOC)
   Cyclomatic complexity
   Coverage
   Coupling
   …
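The counting metrics above are straightforward to compute. A minimal Python sketch for two of them, lines of code and a keyword-counting approximation of cyclomatic complexity; the keyword list and the "//" comment convention are simplifying assumptions, not a full TTCN-3 lexer:

```python
# Decision keywords used as a rough proxy for branching constructs.
DECISION_KEYWORDS = ("if", "while", "for", "alt", "altstep", "[]")

def lines_of_code(source):
    """Count non-empty lines that are not '//' comment lines."""
    return sum(1 for line in source.splitlines()
               if line.strip() and not line.strip().startswith("//"))

def cyclomatic_estimate(source):
    """1 + number of decision points, counted over whitespace-split tokens."""
    tokens = source.split()
    return 1 + sum(tokens.count(keyword) for keyword in DECISION_KEYWORDS)

sample = """// sample behavior
if (x > 0) { }
while (running) { }
"""
print(lines_of_code(sample), cyclomatic_estimate(sample))  # 2 3
```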


Analytical Product Quality Assurance: Static testing: test metrics (3/11)

 Metrics classification (Fenton, Pfleeger):
   Entities to measure: processes, products, resources
   Attributes: internal, external
   Examples: size metrics, structural metrics

Analytical Product Quality Assurance: Static testing: test metrics (4/11)

 The inconvenient truth:
   Metrics are not universal!
   Metric thresholds are not universal!


Analytical Product Quality Assurance: Static testing: test metrics (5/11)

 You are often interested in exactly the opposite of what you measure:
   90% of tests with verdict “pass”
     Why are 10% of the tests failing or inconclusive? Do the 10% reflect the severity of the negative test results?
   90% specification coverage
     Why is 10% of the specification untested? Hard to test?
   Number of test cases
     How can I select test cases (maximize coverage, minimize test executions)?
 Metrics tell you what happened, but not why!


Analytical Product Quality Assurance: Static testing: test metrics (6/11)

 Recurring problems with metric definitions:
   The metric is not credible; people don’t understand it.
   The metric is not clearly defined, or it is ambiguous or non-objective.
   The metric is not repeatable.
   The metric is not comparable to other measurements.
   People can fool the metric to make it look good.
   It is unclear what good and bad values for the metric are (thresholds).
   The metric is “just” interesting and not measured for a reason.


Analytical Product Quality Assurance: Static testing: test metrics (7/11)

 What should be measured then?
   A quality model is required!
 A quality model for a product provides:
   The main quality characteristics of a product,
   How these quality characteristics are subdivided.
 ISO 9126-1:
   Software engineering – Product quality – Quality Model
   Quality models for
     Internal quality,
     External quality,
     Quality in use.

Quality is composed of discrete characteristics, which may be structured into further sub-characteristics.


Analytical Product Quality Assurance: Static testing: test metrics (8/11)

External and Internal Quality (ISO 9126-1):

 Functionality: suitability, accuracy, interoperability, security, functionality compliance
 Reliability: maturity, fault-tolerance, recoverability, reliability compliance
 Usability: understandability, learnability, operability, attractiveness, usability compliance
 Efficiency: time behaviour, resource utilisation, efficiency compliance
 Maintainability: analysability, changeability, stability, testability, maintainability compliance
 Portability: adaptability, installability, co-existence, replaceability, portability compliance


Analytical Product Quality Assurance: Static testing: test metrics (9/11)

[Figure: the ISO 9126 model adapted to “Test Specification Quality”. Characteristics: Test Effectivity, Reliability, Usability, Efficiency, Maintainability, Portability; sub-characteristics include test coverage, test correctness, security, fault-revealing capability, test effectivity compliance, test repeatability, test evaluability, coupling, flexibility, comprehensibility, and reusability compliance.]

Analytical Product Quality Assurance: Static testing: test metrics (10/11)

 How do we define metrics for the quality model?
 Goal, Question, Metric (GQM) approach from Basili and Weiss (1984):
1. State the goal to be achieved by the measurement.
2. Develop questions that break the goal into its major components. What are the attributes that need to be captured?
3. Develop metrics that answer the questions, preferably free from side-effects.


Analytical Product Quality Assurance: Static testing: test metrics (11/11)

 Example for GQM:
 Goal:
   Evaluate the changeability of templates in a TTCN-3 specification
 Questions:
   Are inline templates used?
   Are there many duplicate inline templates?
   What is the degree of test data duplication?
 Metrics:
   Number of inline templates
   Number of inline templates that are duplicates
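The two metrics from this GQM example can be computed with a simple static scan. A Python sketch, assuming a simplified textual pattern for inline templates of the form Type:{...}; a real tool such as TRex would use the parser's syntax tree instead:

```python
import re
from collections import Counter

# Simplified view: an inline template looks like  TypeName:{ ... }.
INLINE_TEMPLATE = re.compile(r'\b\w+\s*:\s*\{[^{}]*\}')

def inline_template_metrics(ttcn3_source):
    """Return (number of inline templates, number that are duplicates)."""
    found = [re.sub(r'\s+', '', m) for m in INLINE_TEMPLATE.findall(ttcn3_source)]
    counts = Counter(found)
    duplicates = sum(n for n in counts.values() if n > 1)
    return len(found), duplicates

source = '''
portA.send(IpAddressType:{ipAddress := "127.0.0.1"});
portB.receive(IpAddressType:{ipAddress := "127.0.0.1"});
portB.receive(IpAddressType:{ipAddress := "10.0.0.1"});
'''
print(inline_template_metrics(source))  # (3, 2)
```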

 



Analytical Product Quality Assurance: Static testing: smell detection (1/2)

 Patterns of possibly inappropriate code usage:

“certain structures in the code that suggest (sometimes they scream for) the possibility of refactoring”
Fowler: Refactoring – Improving the Design of Existing Code. Addison-Wesley, 1999

 The notions of metrics and code smells are not disjoint:
   Code smell → metric: count occurrences of the code smell.
   Metric → code smell: the metric violates a threshold.
 More than 38 TTCN-3 code smells identified (master’s thesis of Martin Bisanz).
 Automatic detection of test smells using static analysis:
   Detection of patterns in the Abstract Syntax Tree (AST)
   Violation of metric thresholds.
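As a toy illustration of the “metric → code smell” direction, the Python sketch below reports a Magic Values smell once the number of distinct numeric literals exceeds a threshold; both the literal pattern and the threshold are arbitrary illustrative assumptions:

```python
import re

# Numeric literals not embedded in identifiers or larger numbers.
MAGIC_VALUE = re.compile(r'(?<![\w.])\d+(?:\.\d+)?(?![\w.])')
THRESHOLD = 2  # arbitrary threshold for this illustration

def distinct_magic_values(source):
    """Collect the distinct numeric literals occurring in the source."""
    return set(MAGIC_VALUE.findall(source))

def smells(source):
    """Report a Magic Values smell if the metric violates the threshold."""
    values = distinct_magic_values(source)
    if len(values) > THRESHOLD:
        return ["Magic Values: %d distinct literals" % len(values)]
    return []

snippet = 't.start(5.0); x := 42; portA.send(17); y := 42;'
print(smells(snippet))  # ['Magic Values: 3 distinct literals']
```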


Analytical Product Quality Assurance: Static testing: smell detection (2/2)

Metric / TTCN-3 Code Smell    SIP      IPv6
Lines of code                 42,397   46,163
Number of test cases          528      295
Number of components          2        10
Duplicate Alt Branches        938      224
Activation Asymmetries        73       317
Distinct Magic Values         135      222



Analytical Product Quality Assurance: Dynamic testing: model-based analysis (1/6)

 Testing test specifications?
   Unreasonable! We would need tests for the tests that test the tests, etc.
 Our solution:
   Simulation of test behavior (branch coverage)
   Construction of a model from the simulated test runs
   Checking the model for anomalies using temporal logic
   The result can essentially be considered a testing of specific properties


Analytical Product Quality Assurance: Dynamic testing: model-based analysis (2/6)

[Figure: tool chain for model checking with Spin. A structural property, expressed in Linear Temporal Logic, and the test component models, translated into Spin/Promela processes, are fed into the Spin model checker.]


Analytical Product Quality Assurance: Dynamic testing: model-based analysis (3/6)

[Figure: the same tool chain; the structural property under analysis is “No verdict set!”.]


Analytical Product Quality Assurance: Dynamic testing: model-based analysis (4/6)

[Figure: the same tool chain as before.]

• A simple possible LTL formula:
  • For each path, no verdict is set until the verdict becomes either pass, inconclusive, or fail
• More sophisticated and comprehensive formulas are possible!
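Written out, the property just paraphrased could take the following shape; this is a plausible reconstruction rather than the slide's exact formula, with $v$ denoting the current verdict and $\mathcal{W}$ the weak-until operator (which also holds if no final verdict is ever reached):

```latex
(v = \mathit{none}) \;\mathcal{W}\;
\bigl(v = \mathit{pass} \,\lor\, v = \mathit{inconc} \,\lor\, v = \mathit{fail}\bigr)
```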


Analytical Product Quality Assurance: Dynamic testing: model-based analysis (5/6)

[Figure: the same tool chain as before.]

• Result:
  • The structural property does not hold in all possible paths.
  • The failing traces.


Analytical Product Quality Assurance: Dynamic testing: model-based analysis (6/6)

 Further possibilities:
   Deadlocks
   Idle PTCs
   Default asymmetries
   …
 10+ properties documented (work in progress)



Analytical Product Quality Assurance: Improvement: test refactoring (1/6)

“a change made to the internal structure of software to make it easier to understand and cheaper to modify without changing its observable behavior”
Fowler, M.: Refactoring – Improving the Design of Existing Code. Addison-Wesley (1999)

 Refactoring is a systematic way to restructure test code – no bugfixing, no new functionality
 Improves quality characteristics like:
   Usability
   Maintainability
   Reusability
 TTCN-3 refactoring catalogue with more than 50 refactorings:
   Test behavior, test data, overall test suite structure
   http://www.trex.informatik.uni-goettingen.de/trac/wiki/TTCN3RefactoringCatalogue


Analytical Product Quality Assurance: Improvement: test refactoring (2/6)

 Fixed format:
   Name
   Summary
   Motivation
   Mechanics
   Example (unrefactored and refactored)

Analytical Product Quality Assurance: Improvement: test refactoring (3/6)

Refactoring example: Inline Template Parameter

 Summary:
   Inline a template parameter which is always given the same actual value.
 Motivation:
   Unneeded parameters create
     code clutter,
     more coupling than needed.
 Mechanics:
   Copy the template.
   Remove the parameter from the formal parameter list.
   Replace each reference to the formal parameter inside the template with the common actual parameter value.
   Remove the actual parameter from each template reference.
   Remove the original template.


Analytical Product Quality Assurance: Improvement: test refactoring (4/6)

 TTCN-3 example (unrefactored):

module ExampleModule {
  template ExampleType exampleTemplate(charstring addressParameter) := {
    ipv6 := false,
    ipAddress := addressParameter
  }

  testcase exampleTestCase() runs on ExampleComponent {
    portA.send(exampleTemplate("127.0.0.1"));
    portB.receive(exampleTemplate("127.0.0.1"));
  }
}


Analytical Product Quality Assurance: Improvement: test refactoring (5/6)

 TTCN-3 example (refactored):

module ExampleModule {
  template ExampleType exampleTemplate := {
    ipv6 := false,
    ipAddress := "127.0.0.1"
  }

  testcase exampleTestCase() runs on ExampleComponent {
    portA.send(exampleTemplate);
    portB.receive(exampleTemplate);
  }
}


Analytical Product Quality Assurance: Improvement: test refactoring (6/6)

• Effects on the number of templates (experiments):
[Chart: segments of 11.5%, 53.4%, and 36.8%; the segment labels are not preserved.]



Tools for general-purpose languages

 Guideline checking and smell detection:
   Java: FindBugs, PMD, Checkstyle, Sonar (free)
   C/C++: Coverity Prevent, PC-lint (commercial)
   C#: FxCop, StyleCop, devAdvantage
 Software metrics:
   Java: Eclipse Metrics Plugin (free), CodePro AnalytiX (commercial)
   C#: DevMetrics (free)


Tools for general-purpose languages

 Refactoring:
   Java: Eclipse, NetBeans
   C/C++: Eclipse, SlickEdit
   C#: Visual Studio + ReSharper
 Dynamic analysis:
   Java: Java PathFinder (free)
   C/C++: Valgrind (free)


Tools developed in Göttingen

 TRex
   Metrics
   Smell Detection
   Refactoring
 ETSICheck
   TTCN-3 Guideline Checker
 ETSIDOC
   TTCN-3 Documentation Generator (HTML/PDF)
 Tool for Model-Based Analysis
   Reverse-Engineering Models from TTCN-3 Specifications
   Analyzing Properties using Model Checking



Summary

 Today's test specifications are complex

 Quality Assurance (QA) is necessary

 QA aspects to be considered:

 Organizational QA

 Test process model, test process improvement model

 Constructive Product QA

   Training, code / style guidelines, test patterns, …
 Analytical Product QA

 Static testing, dynamic testing, refactoring, debugging/correction


Conclusion

 QA in testing involves more than coding guidelines!
 QA in testing needs to be considered early on
   Example: consideration of tool support in decision making
 This talk covered only selected aspects of test QA
   Organizational aspects were only indicated
   Test process metrics were not covered
   Test debugging methodologies were not covered
 Raise awareness of QA in testing!


Thank you for your attention!


Literature (1/6)

 Organizational Quality Assurance:
   Carnegie Mellon Software Engineering Institute: CMMI for Development, Version 1.2. 2006.
   TMMi Foundation: TMMi Reference Model Version 2.0. 2009.
   Koomen, T., Pol, M.: Test Process Improvement: A Practical Step-by-Step Guide to Structured Testing. Addison-Wesley, 1999.
 TTCN-3 Tools:
   http://www.trex.informatik.uni-goettingen.de
   http://www.ttcn-3.org/CommercialTools.htm

Literature (2/6)

 General-Purpose Tools:
   http://www.eclipse.org
   http://www.netbeans.org
   http://metrics.sourceforge.net
   http://www.instantiations.com/codepro
   http://www.slickedit.com
   http://javapathfinder.sourceforge.net
   http://valgrind.org
   http://findbugs.sourceforge.net
   http://sonar.codehaus.org
   http://www.coverity.com
   http://www.gimpel.com
   http://code.msdn.microsoft.com/sourceanalysis
   http://www.anticipatingminds.com


Literature (3/6)

 Constructive Quality Assurance:
   Vouffo-Feudijo, A., Schieferdecker, I.: Test Patterns with TTCN-3. FATES 2004, Volume 3395 of Lecture Notes in Computer Science, 2004.
   Din, G.: A Performance Test Design Method and its Implementation Patterns for Multi-Services Systems. Fraunhofer IRB Verlag, 2009.
   Spillner, A., Linz, T., Schaefer, H.: Software Testing Foundations: A Study Guide for the Certified Tester Exam. Rocky Nook, 2007.
   Meszaros, G.: xUnit Test Patterns. Addison-Wesley, 2007.


Literature (4/6)

 Analytical Quality Assurance:
   Static Testing of Test Specifications:
     Neukirchen, H., Zeiss, B., Grabowski, J.: An Approach to Quality Engineering of TTCN-3 Test Specifications. International Journal on Software Tools for Technology Transfer (STTT) 10(4), 2008.
     Neukirchen, H., Zeiss, B., Grabowski, J., Baker, P., Evans, D.: Quality Assurance for TTCN-3 Test Specifications. Software Testing, Verification and Reliability (STVR) 18(2), 2008.
     Zeiß, B., Vega, D., Schieferdecker, I., Neukirchen, H., Grabowski, J.: Applying the ISO 9126 Quality Model to Test Specifications – Exemplified for TTCN-3 Test Specifications. Software Engineering 2007 (SE 2007), 2007.
     Din, G., Vega, D., Schieferdecker, I.: Automated Maintainability of TTCN-3 Test Suites Based on Guideline Checking. Software Technologies for Embedded and Ubiquitous Systems, Volume 5287 of Lecture Notes in Computer Science, 2008.


Literature (5/6)

 Analytical Quality Assurance:
   Static Testing of Test Specifications:
     Vega, D., Din, G., Schieferdecker, I.: TTCN-3 Test Data Analyser Using Constraint Programming. Proceedings of the 19th International Conference on Systems Engineering, 2008.
     Bisanz, M.: Pattern-based Smell Detection in TTCN-3 Test Suites. Master’s Thesis, ZFI-BM-2006-44, ISSN 1612-6793, Zentrum für Informatik, Georg-August-Universität Göttingen, 2006.
     Zeiss, B.: A Refactoring Tool for TTCN-3. Master’s Thesis, ZFI-BM-2006-05, ISSN 1612-6793, Zentrum für Informatik, Georg-August-Universität Göttingen, 2006.
     Nödler, J.: An XML-based Approach for Software Analysis – Applied to Detect Bad Smells in TTCN-3 Test Suites. Master’s Thesis, ZFI-BM-2007-36, ISSN 1612-6793, Zentrum für Informatik, Georg-August-Universität Göttingen, 2007.


Literature (6/6)

 Analytical Quality Assurance:
   Dynamic Testing of Test Specifications:
     Zeiss, B., Grabowski, J.: Reverse-Engineering Test Behavior Models for the Analysis of Structural Anomalies. TESTCOM/FATES 2008 Short Papers, 2008.
     Makedonski, P.: Equivalence Checking of TTCN-3 Test Case Behavior. Master’s Thesis, ZFI-MSC-2008-16, ISSN 1612-6793, Zentrum für Informatik, Georg-August-Universität Göttingen, 2008.

Contact

 

 Web:
   http://www.swe.informatik.uni-goettingen.de
 E-Mail:
   grabowski@cs.uni-goettingen.de
   zeiss@cs.uni-goettingen.de

Special Thanks

 

Andreas Ulrich, Siemens AG
