Performance Testing Mobile and Multi-Tier Applications


(1)

mVerify

A Million Users in a Box ®


Performance Testing Mobile and

Multi-Tier Applications

Chicago Quality Assurance Association

June 26, 2007

(2)

Goals of Performance Testing

• Validate time requirements/expectations
• Validate utilization requirements/expectations
• Validate capacity requirements/expectations
• Reveal load-related bugs
• Prove compliance: SLAs, contracts, competitive rankings
• Fire drill for recovery

(3)

Business Impact/ROI

• In 2002, slow e-commerce downloads led to an estimated $25 billion in abandoned transactions

• In 2005, a 15-minute Google outage was estimated to have cost at least $150,000 in lost ad revenue

• A recent study found that nearly two-thirds of mobile employees rank poor response time as a "significant" inhibitor to working remotely over a VPN

(4)

Business Impact/ROI

Avoidable Costs
• Lost revenue
• Lost user productivity
• Lost IT productivity
• Overtime payments
• Wasted goods
• Fines

Risk Mitigation
• Enterprise demise
• Lawsuits
• Negative publicity
• Personnel morale

Solution Cost
• Software tools
• Hardware
• Staffing
• Services
• Training

(5)

Basic Objectives: RASP

Reliability: probability of a failure occurring within a certain period of time

Availability: percent achieved up-time, not including scheduled downtime

Scalability: the range of load over which an incremental input consumes the same resources

(6)

Reliability

Reliability: probability of non-failure

• Measured over total operational hours or transactions
• Across the entire user population
• Can be estimated during test, if tests are sufficiently realistic
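Under a constant-failure-rate (exponential) assumption, reliability over a mission time can be estimated from the failure count observed during test. A minimal sketch; the function name and the example figures are illustrative, not from the slides:

```python
import math

def estimate_reliability(failures: int, test_hours: float, mission_hours: float) -> float:
    """Reliability under a constant-failure-rate (exponential) model.

    The failure rate is estimated as observed failures over total test
    hours; R(t) = exp(-rate * t) is the probability of surviving
    `mission_hours` without a failure.
    """
    failure_rate = failures / test_hours        # failures per hour
    return math.exp(-failure_rate * mission_hours)

# Example: 2 failures in 10,000 emulated-user hours;
# probability of a 100-hour failure-free run:
r = estimate_reliability(failures=2, test_hours=10_000, mission_hours=100)
```

This is why the test load must be representative: the estimated rate only transfers to production if the tested usage profile matches real usage.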

(7)

Availability – the “nines”

Annual Unscheduled Downtime

Six nines    99.9999%   32 seconds
Five nines   99.999%    5 minutes
Four nines   99.99%     53 minutes
Three nines  99.9%      8.8 hours
Two nines    99%        87 hours (3.6 days)
One nine     90%        876 hours (36 days)

• Availability = percent up-time
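Each added "nine" cuts annual unscheduled downtime by a factor of ten; the table can be reproduced in a few lines (a sketch, using 8,766 hours per 365.25-day year):

```python
def annual_downtime(availability: float) -> float:
    """Annual unscheduled downtime in hours for a given availability."""
    hours_per_year = 8766.0  # 365.25 days; excludes scheduled maintenance
    return (1.0 - availability) * hours_per_year

# Three nines -> ~8.8 hours/year; six nines -> ~32 seconds/year
for label, a in [("three nines", 0.999), ("six nines", 0.999999)]:
    print(label, round(annual_downtime(a), 4), "hours")
```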

(8)

Reliability

Some Data Points (failures/million hours; availability with 6-minute MTTR)

NT 4.0 Desktop                   82,000   0.999000000
Windows 2K Server                36,013   0.999640000
Common Light Bulb                 1,000   0.999990000
Stepstone OO Framework                5   0.999999500
Telelabs Digital Cross Connect        3   0.999999842
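The availability column pairs a failure intensity with a 6-minute MTTR via the standard steady-state formula A = MTBF / (MTBF + MTTR). A sketch of that relationship; it reproduces the Stepstone row exactly, while other rows may rest on slightly different assumptions:

```python
def availability(failures_per_million_hours: float, mttr_hours: float = 0.1) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR).

    MTBF is derived from the failure intensity; MTTR defaults to
    6 minutes (0.1 h) as in the slide.
    """
    mtbf = 1_000_000.0 / failures_per_million_hours
    return mtbf / (mtbf + mttr_hours)

# Example: 5 failures per million hours (Stepstone OO Framework row)
a = availability(5)   # ~0.9999995
```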

(9)

Performance Metrics

• Response Time: round-trip time
• Throughput: aggregate transaction processing rate
• Utilization: average % busy
• Failure Intensity
• Recovery Time

[Chart: utilization, transactions/sec, and average response time in seconds]

(10)

Strategies

Performance Testing

• Assess compliance with performance goals
• Assess compliance with resource utilization goals
• Provides data to estimate reliability, availability

Stress Testing, Load Testing

• Assess response to overload scenarios

(11)

Strategies

Benchmarks

• Assess throughput for an open standard test suite

Scalability

• Assess performance linearity

Profiling

(12)

Typical Server Side Setup

[Diagram: multiple emulated clients on an internal LAN driving the server(s) under test]

(13)

Issues

• Client Emulation Machines
  • Synchronization, overall test execution
  • Capacity
  • Multi-homed, test control subnet
• Test vs. Production Systems
  • Separate server farm?
  • Network contention
  • Isolation versus scale/scope
• Version/Configuration control
  • System under test

(14)

Issues

Actual end-user/customer experience?

• Network latency, QoS, …
• Thin clients?
• Browser, client software versions
• Client OS?

(15)

Edge Monitoring

[Diagram: emulated clients on the internal LAN plus monitored clients across the Internet, all driving the server(s) under test]

(16)

Issues

Client Monitoring Machines

• Synchronization
  • Achieving the desired test input at the desired time
• Capacity
  • Data collection
• Availability
• Security (beta test agreement?)

Network configuration

• DMZ

(17)

Connectivity – a Wild Card

[Diagram: emulated and monitored clients reaching the server(s) under test over the Internet]

Random latency, jitter, lost packets, re-ordered packets, re-routed packets, duplicate packets, …

(18)

Controlled Connectivity

[Diagram: a network emulator inserted between the clients and the server(s) under test]

Controlled latency, jitter, lost packets, re-ordered packets, re-routed packets, duplicate packets, …
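The emulator's job can be sketched in a few lines: each packet receives randomized latency (delay plus jitter), some are dropped or duplicated, and sorting by jittered arrival time is what re-orders delivery. This is an illustrative model only (names and defaults are assumptions); in practice a kernel-level tool such as Linux netem applies these impairments to real traffic:

```python
import random

def impair(packets, delay_ms=50.0, jitter_ms=20.0, loss=0.01, dup=0.005, seed=None):
    """Apply netem-style impairments to a packet stream (illustrative sketch).

    Packets are sent 1 ms apart; each surviving packet is delivered after
    delay_ms +/- jitter_ms of latency, some are dropped (loss) or sent
    twice (dup), and delivery follows the jittered arrival times, which
    naturally re-orders packets.
    """
    rng = random.Random(seed)
    arrivals = []
    for i, pkt in enumerate(packets):
        if rng.random() < loss:
            continue                                  # lost packet
        copies = 2 if rng.random() < dup else 1       # duplicate packet
        for _ in range(copies):
            latency = delay_ms + rng.uniform(-jitter_ms, jitter_ms)
            arrivals.append((i + latency, pkt))       # send time i ms + latency
    arrivals.sort(key=lambda a: a[0])                 # jitter causes re-ordering
    return [pkt for _, pkt in arrivals]
```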

(19)

Issues

Complexity

• Impairment modeling
• Impairment emulator programming
• Coordination with emulated clients
• Coordination with monitored clients

Specialized Skills

• Wireshark (formerly Ethereal)

(20)

How to Maximize Reliability

Combine realistic functional and load testing

• Representative variation in load and usage
• Supports reliability/availability estimation
• Saves time: more test goals supported with fewer tests
• Typically effective in finding "weird" bugs

Security?

• Add abuse cases to the usage profile
• Interleave with normal traffic

(21)

Use Dynamic Loading

The real world isn't flat

Vary behavior rate per actor/actor group:

• Arc
• Flat
• Internet fractal
• Negative ramp
• Positive ramp
• Random
• Spikes
• Square wave

[Chart: events per second over time]
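Several of these shapes can be expressed as simple rate functions of normalized time. A sketch; the names, base/peak rates, and parameters are illustrative, not from the slides:

```python
import math

def rate(shape: str, t: float, base: float = 1000.0,
         peak: float = 2500.0, period: float = 1.0) -> float:
    """Target events/second at normalized time t in [0, 1]."""
    if shape == "flat":
        return base
    if shape == "positive ramp":
        return base + (peak - base) * t            # climbs from base to peak
    if shape == "negative ramp":
        return peak - (peak - base) * t            # falls from peak to base
    if shape == "arc":
        return base + (peak - base) * math.sin(math.pi * t)  # rise then fall
    if shape == "square wave":
        return peak if (t / period) % 1.0 < 0.5 else base    # alternate high/low
    raise ValueError(f"unknown shape: {shape}")
```

A load generator would sample this function once per scheduling tick to set the current transaction submission rate.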

(22)

Case Study

[Diagram: event simulator and test oracle driving the Java servers under test through several adapters: a DB script writer, a SilkTest driver for the Java GUI, a Java serializer driver for the Java API/TX, and a formatter driver for the 3270 mainframe interface]

(23)

Case Study

• Every test run unique and realistic
• Simulated user behavior to generate transactions
• Automatically submitted in real time
• ~100,000 test cases per hour
• ~200 complete daily cycles
• Evaluated functionality and performance
• Controlled distributed heterogeneous test agents (Java, 4Test, Perl, SQL, Prolog) driving Java/CORBA GUI/API
• Five-person team, huge productivity increase
• Achieved proven high reliability

(24)

Notes

Capture/replay scripts

• Static think-time
• Can distort load and response time
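One common fix is to randomize each recorded think time rather than replay it verbatim, so emulated users don't fire in lockstep. A sketch; the distribution and parameters are illustrative:

```python
import random

def think_time(recorded_s: float, rng: random.Random,
               cv: float = 0.3, floor_s: float = 0.1) -> float:
    """Randomized think time centred on the recorded value.

    Draws from a normal distribution with coefficient of variation `cv`,
    clamped to a small floor so an emulated user never acts instantly.
    """
    t = rng.gauss(recorded_s, cv * recorded_s)
    return max(floor_s, t)

# Each virtual user pauses a different, realistic amount around 5 s:
rng = random.Random()
pauses = [think_time(5.0, rng) for _ in range(3)]
```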

Performance Analysis

(25)

Tools

Open Source

• OpenSTA
• PushToTest
• The Grinder
• http://opensourcetesting.org/performance.php

Scripting systems: Tcl, Perl, Ruby, Python

Built-in

(26)

mVerify Testing System

End-to-End

Edge to Core

Integrated functional and performance testing

• Test objects
• XML performance measurements

Adapters for Windows Mobile, Web Services, ODBC, *nix command line

Forthcoming

(27)

MTS/RPM

[Diagram: MTS Console on a console host coordinating MTS Test Agents on agent hosts and MTS Remote Agents on the hosts under test; RPM plug-ins instrument the server host; test run reports are collected at the console]

Host Under Test may be:

• Cell phone
• PDA
• Desktop
• Server
• Embedded processor

(28)

MTS/RPM
