Automated Software Testing
Elfriede Dustin – Email: edustin@idtus.com – 4401 Wilson Blvd, Suite 810, Arlington, VA 22203 – Office Phone: (703) 725-3051 – www.idtus.com
Elfriede Dustin
• Latest book: "Implementing Automated Software Testing" (Mar 2009)
• Book: "The Art of Software Security Testing" (Dec 2006)
• Book: "SAP Testing" (Spring 2007)
• Book: "Effective Software Testing" (Dec 2002)
• Books: "Automated Software Testing" and "Quality Web Systems"
Who is IDT?
• IDT specializes in the design, development, and implementation of Automated Software Testing solutions
  – Implements turnkey automated test suites
  – Provides training and pilot project implementation with companies
• Research & development related to Automated Testing Technologies
Software Testing
• What is Software Testing?
Deliver More Capability, Faster and Cheaper
Challenge – The Software Landscape
• Innovative Software Technologies
• Increased System Complexity
• More Software to Test
• Less Time
• Fewer People
Our Ability to Test these Technologies Has Not Kept Pace with Our Ability to Create Them
More Time Than Ever Is Spent on Software Testing
1990, Beizer: "Half the labor expended to develop a working program is typically spent on testing activities."
2002, Hailpern and Santhanam: "... debugging, testing, and verification activities can easily range from 50 to 75 percent of the total development cost."
2008, Redmond Developer News: "... industry analysts estimate that developers spend only about 20 percent of their time designing and coding ... the bulk of their time is spent on resolving application problems ..."
Software Testing
• What is the opportunity?
The Opportunity
Utilize Automated Testing Strategies and Technologies to Improve Productivity and Quality
• Reduction in test days
• Increased test coverage (manual vs. automated test coverage)
• Automate the manual tests that consume the most resources:
  – Regression Testing
  – Performance Testing
  – Endurance Testing
  – Interface Testing
  – Compliance Testing
Who Can Benefit?
Organizations and projects responsible for software testing of complex systems, with needs such as:
• Regression testing
  – Manpower intensive
  – Time consuming
• Expanded test coverage
• Analysis of large quantities of test data
• Performance and endurance testing
Definition of Automated Software Testing
The application and implementation of software technology throughout the entire Software Testing Life Cycle (STL), with the goal of improving the STL
Automated Software Testing spans the entire lifecycle
Tool categories across the lifecycle:
• Automated Testing Tools
• Requirements Management (RM) and Process Automation
• Software Configuration Management
• Defect Tracking
• Middleware Infrastructure
Additionally: Modeling, Memory Leak Detectors, Performance Testing tools, Documentation tools, Development Tools, Components, Visual Modeling
Why Automate?
• Reduces the time and cost of software testing
• Improves software quality
• Enhances manual testing efforts via increased test coverage, and replaces mundane, labor-intensive manual tasks
• Does what manual testing can hardly accomplish, such as memory leak detection under specific conditions, concurrency testing, performance testing, and more
Software Testing
• Challenges of Automated Software Testing: what are they?
Need for automated testing of mission critical systems – Requirements
• Can't be intrusive to the system under test
• Needs to be OS independent
• Needs to be able to handle display- and non-display-centric automation
• Needs to be able to handle multi-computer environments
IDT's answers to each requirement:
• Can't be intrusive to the system under test
  – IDT's answer: use of VNC technology and image-based automation
• Needs to be OS independent
• Needs to be able to handle display- and non-display-centric automation
  – IDT's answer: capture/playback display-based automation and message-based automated testing, handling various protocols
• Needs to be able to handle multi-computer environments
  – IDT's answer: implementation handles heterogeneous environments, connecting multiple computers
• Non-developers should be able to use the tool
ATRT – Addresses Unique Challenges of Mission Critical Systems
Current industry automated testing tools focus on → IDT's unique ATRT solutions address:
• Web-based applications → Thick-client, diverse applications and protocols
• Windows-based; focus on object properties; third-party control dependent → Cross-platform (Linux, all flavors, Windows, etc.); independent of third-party controls and/or object properties
• GUI applications with single consoles → Displays and message-based testing, handling various protocols across many computers and consoles
• Intrusive to System Under Test (SUT) → Non-intrusive to SUT; uses VNC technology
ATRT Display Automation – Capture / Record & Playback
• Enables automation of the test engineer's actions:
  – During testing, the engineer uses the keyboard, mouse, touch panels, etc. to perform actions
  – The testing tool captures actions and information from the screen, which are baselined in an automated test script
  – Input actions and expected results become the baseline
  – During test playback, the latest outputs are compared with the baselined results
• Utilizes VNC technology to remotely connect to the system under test
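The capture-then-replay idea above can be sketched in a few lines. This is an illustrative sketch only, not ATRT's implementation: the `Step` record and `run_baseline` function are hypothetical names, and the "actions" here are stubs that return observed output instead of driving a real GUI.

```python
# Minimal sketch of capture/replay: each recorded step pairs an input action
# with its baselined expected result; playback re-runs the action and compares.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    action: Callable[[], str]   # performs the input action, returns observed output
    expected: str               # baselined expected result from the recording

def run_baseline(steps: List[Step]) -> List[bool]:
    """Replay each recorded step and compare observed output to the baseline."""
    return [step.action() == step.expected for step in steps]

# Usage: two steps that match the baseline and one deliberate mismatch.
steps = [
    Step(action=lambda: "File menu opened", expected="File menu opened"),
    Step(action=lambda: "Dialog shown", expected="Dialog shown"),
    Step(action=lambda: "Error", expected="Saved"),
]
results = run_baseline(steps)  # [True, True, False]
```

In a real tool the action would replay mouse/keyboard input over VNC and the comparison would be against captured screen images rather than strings.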
ATRT – Capture/Playback
• ATRT Display Automation uses a keyword-driven approach
  – The user selects/clicks on a selection/image and a related keyword
  – ATRT generates the relevant code behind the scenes
  – The user does not need to be a software developer/coder
• Example keywords used: Left-click; Right-click; Wait for; Enter text ...
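A keyword-driven layer like the one described can be sketched as a table mapping keywords to handlers, so a non-programmer composes tests from (keyword, argument) pairs. The keywords mirror the slide's examples; everything else (`make_engine`, the logged action strings) is a hypothetical illustration, not ATRT's generated code.

```python
# Sketch of a keyword-driven engine: keywords dispatch to handlers.
# Here the handlers just record the action they would perform.
def make_engine():
    log = []
    handlers = {
        "Left-click":  lambda target: log.append(f"click left on {target}"),
        "Right-click": lambda target: log.append(f"click right on {target}"),
        "Wait for":    lambda target: log.append(f"wait until {target} appears"),
        "Enter text":  lambda text:   log.append(f"type {text!r}"),
    }
    def run(script):
        for keyword, arg in script:
            handlers[keyword](arg)   # an unknown keyword raises KeyError
        return log
    return run

run = make_engine()
actions = run([
    ("Left-click", "Image_1.tiff"),
    ("Wait for", "File dialog"),
    ("Enter text", "report.txt"),
])
```

The point of the design is that the script is pure data: adding a new capability means adding one handler, not teaching testers to code.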
Capture/Playback Illustration
Test step: ATRT Capture/Playback illustration – using keywords
Capture action:
1. Select File
2. Left click
Keyword: Left-click generates this code behind the scenes:

try
  put IDTImageRetryFunction("1", "1", "Image_1.tiff", "Click", "Image Image_1.tiff was found", "Image Image_1.tiff was not found")
  if ImageFound("Image_1.tiff") then
    put IdtLogLocal("1", "1", SUB_STEP_PASS, "Image Image_1.tiff was found")
    Click FoundImageLocation()
  else
    wait 3
    if ImageFound("Image_1.tiff") then
      put IdtLogLocal("1", "1", SUB_STEP_PASS, "Image Image_1.tiff was found on 2nd try")
      Click FoundImageLocation()
    else
      wait 3
      if ImageFound("Image_1.tiff") then
        put IdtLogLocal("1", "1", SUB_STEP_PASS, "Image Image_1.tiff was found on 3rd try")
        Click FoundImageLocation()
      else
        put IdtLogLocal("1", "1", SUB_STEP_FAIL, "Image Image_1.tiff was not found")
        CaptureScreen "1.1.error"
        put sendError("1.1.error.tiff")
      end if
    end if
  end if
catch
  put IDTException("1", "1")
end try
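The generated script above is a find-with-retry pattern: look for the image, wait and look again up to three attempts, then log a failure with a screen capture. The same pattern, abstracted, might look like the following sketch; `find_image`, `on_found`, and `on_failed` are hypothetical stand-ins, not ATRT or Eggplant APIs.

```python
# Sketch of the retry pattern in the generated script: attempt a lookup up to
# `attempts` times, pausing between tries, then report success or failure.
def click_when_found(find_image, on_found, on_failed, attempts=3, wait_fn=None):
    for attempt in range(1, attempts + 1):
        if find_image():
            on_found(attempt)       # e.g. log PASS and click the found location
            return True
        if wait_fn and attempt < attempts:
            wait_fn(3)              # the generated script waits 3 seconds between tries
    on_failed()                     # e.g. log FAIL and capture the screen
    return False

# Usage: an image that only "appears" on the second look.
seen = iter([False, True])
events = []
ok = click_when_found(
    find_image=lambda: next(seen),
    on_found=lambda n: events.append(f"found on try {n}"),
    on_failed=lambda: events.append("not found"),
)
# ok is True; events == ["found on try 2"]
```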
Capture / Record & Playback: ATRT-generated code based on keywords (File → Open)
Image Comparison
• Image results are compared to the baseline image previously captured
  – Eggplant is the COTS product we currently use for the image comparison
• Features include:
  – Image library
  – Wait for Image
  – Debug
  – Optical Character Recognition (OCR)
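The core idea of baseline image comparison can be illustrated with a toy tolerance check. This sketch assumes images are 2D lists of grayscale pixel values; real products such as Eggplant use far more robust matching (search regions, OCR, fuzzy image search), and `images_match` with its `tolerance` and `max_diff_ratio` parameters is purely an assumed illustration.

```python
# Toy baseline comparison: the candidate matches if few enough pixels differ
# from the baseline by more than a per-pixel tolerance.
def images_match(baseline, candidate, tolerance=10, max_diff_ratio=0.01):
    """True if at most max_diff_ratio of pixels differ by more than tolerance."""
    flat_b = [p for row in baseline for p in row]
    flat_c = [p for row in candidate for p in row]
    if len(flat_b) != len(flat_c):
        return False                      # a size mismatch is an automatic fail
    differing = sum(abs(b - c) > tolerance for b, c in zip(flat_b, flat_c))
    return differing / len(flat_b) <= max_diff_ratio

base = [[100, 100], [100, 100]]
# Small per-pixel noise stays within tolerance; a wholesale change does not.
close = images_match(base, [[105, 98], [100, 101]])     # True
far   = images_match(base, [[200, 200], [200, 200]])    # False
```

The two thresholds mirror a real trade-off: a per-pixel tolerance absorbs rendering noise (anti-aliasing, compression), while the ratio cap still catches genuine visual regressions.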
Other Features
• Allow for manual steps
• Real-time status
• Reporting
  – Summary
  – Detailed
Automated Test Strategy Components
Strategy – Process – Design – Metrics – ROI, summarized in the Business Case

Business Case
• The Business Case is the summation of the work performed in the Automated Test Strategy
• It lays out each of the steps outlined in the following slides
• It begins with a current-process assessment
• It identifies the keys to successfully implementing a program that will reduce risk
Current Process Assessment
• The Current Process Assessment reviews your current testing strategy to identify:
  – Testing coverage
  – Testing coverage requirements
  – Testing limitations
  – Potential risks
(Chart: testing coverage today vs. testing coverage needed)
Strategy
• The Strategy section defines the plan for how to most cost-effectively apply resources to attacking the problem
• Not all tests can or should be done using automation
• This step will identify the best candidates and the logical order in which they should be performed
(Chart: test importance vs. required executions – start with the high-importance, frequently executed tests)
Process
• The process definition will outline what it takes to set up and execute automated testing
• This definition will include:
  – Hardware requirements
  – Recommended software
  – Integration strategy and requirements
IDT automation techniques allow for testing of the system without installing any software on the actual system, thus allowing for a more accurate testing environment
• The process will also identify the expected costs and effort to set up the proper environment
Design
• The Design step identifies the people and procedures to perform the specified tasks
• A key part of this is to identify the people necessary, by skill type, in order to ensure a successful program
It is a mistake to think that software developers alone can implement a successful automated testing program
• Once the people have been identified, you design how they will work to accomplish the mission
(Example program status dashboard: planned vs. actual team velocity per iteration; story completion – completed vs. remaining points across iterations 1–18, targeted to the end-of-March release; hot issue – waiting on identification of a testing tool while reviewing performance logs on production, starting with JMeter scripts until tool selection is complete; major risk – losing a team member on January 10, with a velocity decrease until he is backfilled)
Metrics
• Specific metrics will be designed to measure the overall success of the program
• Initial values for these metrics will be gathered from the current process to define the baseline
• Example metrics include:
  – Person-hours per test
  – Tests per iteration
ROI
• The last step is to develop a potential Return on Investment (ROI) from using automated testing
• Combines the costs identified in the Strategy, Process, and Design steps with the savings from both the automation and the risk and vulnerability reduction
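The ROI recipe above (costs from Strategy, Process, and Design; savings from automation and risk reduction) amounts to simple arithmetic. The following back-of-the-envelope sketch uses entirely made-up placeholder figures; the function name and cost/savings categories are illustrative assumptions, not IDT's model.

```python
# ROI sketch: (total savings - total cost) / total cost, with costs and
# savings broken out by the categories the strategy identifies.
def automation_roi(costs, savings):
    """ROI as a ratio of net savings to cost; 0.5 means a 50% return."""
    total_cost = sum(costs.values())
    total_savings = sum(savings.values())
    return (total_savings - total_cost) / total_cost

roi = automation_roi(
    costs={"tooling": 40_000, "setup_labor": 30_000, "training": 10_000},
    savings={"labor_saved_per_year": 90_000, "risk_reduction": 30_000},
)
# roi == 0.5 under these assumed numbers, i.e. a 50% first-year return
```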
State of Testing
• Business drives software development and testing
• Software development and testing drive business
The best business ideas cannot be implemented on time or at high quality if the software development and testing cannot keep pace
Software Testing: Perceived vs. Actual Quality
• Ten defects that occur very frequently and impact critical functionality will usually be perceived by an end user as poor quality, even if the defect density is very low
• One hundred defects that occur very infrequently and have almost no impact on operations will usually be perceived by an end user as good quality, even if the defect density is high
• "Usage-Based Testing" exploits the concept of perceived quality and thus yields higher perceived quality
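The perceived-quality contrast above can be made concrete by weighting each defect by how often users hit it and how badly it hurts, rather than just counting defects. The scoring function, field names, and weights below are illustrative assumptions, not an established metric.

```python
# Sketch: perceived impact = sum over defects of (frequency * severity).
# A higher score means worse perceived quality, regardless of defect count.
def perceived_quality_impact(defects):
    """Frequency-and-severity-weighted defect impact."""
    return sum(d["hits_per_day"] * d["severity"] for d in defects)

# 10 frequent, critical defects vs. 100 rare, trivial ones:
few_bad = [{"hits_per_day": 50, "severity": 5}] * 10       # impact 2500
many_mild = [{"hits_per_day": 0.1, "severity": 1}] * 100   # impact ~10
# Despite 10x the defect count, many_mild scores far lower perceived impact,
# which is the intuition usage-based testing exploits.
```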
Perceived vs. Actual Quality
• Concept of "Perceived/Expected vs. Actual Quality"
  – The difference between growth and stagnation/failure/lesser success:
    • aol.com vs. CompuServe.com
    • Google.com vs. Yahoo.com
    • Amazon.com vs. Otherbookseller.com
Competition is only a few clicks away
ABOK Certification
What is the ABOK? The Automation Body Of Knowledge:
• A tool-neutral test automation skill set
• Reflects widely accepted industry practices
• A tool for assessing test automation proficiency
• A tool for identifying a track for test automator growth and development
Automated Software Testing
….. Changing How Testing Is Done