(1)

Warfighter’s Mission-Critical System:

Automated Testing and Test Management

H. Ferhan Kilical, Ph.D.

Technical Fellow, Electronic Systems (Test, Test Automation, SOA, Performance Test)

(2)

Report Documentation Page (Standard Form 298, Rev. 8-98) – Form Approved, OMB No. 0704-0188

Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to a penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number.

1. REPORT DATE: APR 2010
2. REPORT TYPE:
3. DATES COVERED: 00-00-2010 to 00-00-2010
4. TITLE AND SUBTITLE: Warfighter’s Mission-Critical System: Automated Testing and Test Management
5a. CONTRACT NUMBER / 5b. GRANT NUMBER / 5c. PROGRAM ELEMENT NUMBER:
6. AUTHOR(S) / 5d. PROJECT NUMBER / 5e. TASK NUMBER / 5f. WORK UNIT NUMBER:
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Northrop Grumman Electronic Systems, 1580-A West Nursery Road, Linthicum, MD, 21090
8. PERFORMING ORGANIZATION REPORT NUMBER:
9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES) / 10. SPONSOR/MONITOR’S ACRONYM(S) / 11. SPONSOR/MONITOR’S REPORT NUMBER(S):
12. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for public release; distribution unlimited
13. SUPPLEMENTARY NOTES: Presented at the 22nd Systems and Software Technology Conference (SSTC), 26-29 April 2010, Salt Lake City, UT. Sponsored in part by the USAF. U.S. Government or Federal Rights License.
14. ABSTRACT: Warfighter’s mission-critical system: Agile automated testing and test management. Do any of these problems sound familiar? Too many failed scripts; slipping schedules; automation tools that never get off the shelf; incomplete test coverage; do more with less; and be creative in performing the business. Then come and listen to the story of the Warfighter’s mission-critical application testing. This case study reveals how testers used automated functional, performance, and Service Test scripts for a major mission-critical application that had to meet the most rigorous quality standards. Also, working in an agile environment, the team managed end-to-end requirements and defects and performed functional, SOA, and performance testing. With risk-based and automated test strategies, the team was able to do automated regression and smoke tests in a fast-paced development (agile) environment and managed test results, test sets, and test-related artifacts. It was a very successful project, completed on time with very limited resources.
15. SUBJECT TERMS:
16. SECURITY CLASSIFICATION OF: a. REPORT unclassified; b. ABSTRACT unclassified; c. THIS PAGE unclassified
17. LIMITATION OF ABSTRACT: Same as Report (SAR)
18. NUMBER OF PAGES: 33
19a. NAME OF RESPONSIBLE PERSON:

(3)

Abstract

Warfighter’s mission-critical system: Agile automated testing and test management

Do any of these problems sound familiar?
– Too many failed scripts
– Slipping schedules
– Automation tools that never get off the shelf
– Incomplete test coverage
– Do more with less, and
– Be creative in performing the business

Then, come and listen to the story of the Warfighter’s mission-critical application testing. This case study reveals how testers used automated functional, performance, and Service Test scripts for a major mission-critical application that had to meet the most rigorous quality standards.

Also, working in an agile environment, the team managed end-to-end requirements and defects and performed functional, SOA and performance testing.

With risk-based and automated test strategies, the team was able to do automated regression and smoke tests in a fast-paced development (agile) environment and managed test results, test sets, and test-related artifacts.

(4)

Acknowledgements

• Special thanks to the project team at Mission Systems, NG

(5)

Complex, Mission Critical Applications & Testing

[Diagram: the WarFighter’s needs for joint warfighting – One NET, JTRS, and the Battle Command Common Services applications – to plan and prioritize data, see data, use data, and capture data, anywhere, for any user.]

(6)

General Info for Testing & Integration in Government Contracting and Test Automation

Keeping Track of:
– Requirements
– Defects
– Test cases, test processes and test plan

Managing Testing Cycles
– Problems in infrastructure and scalability
– Ongoing implementations and significant development initiatives
– Significant degree of customization and integration
– Limited resource availability
– Integration with complex portal and identity management solutions

Weak Testing Methodology
– Manually intensive
– People-driven, not process-driven
– No automated testing capabilities

(7)

Testing Story

• When a major government contractor delivers software, that software must comply with the most rigorous quality standards.

• By enabling both automation and proper test management, we benefit from critical advantages not offered by manual testing.

• In an “agile” development model, development cycles are short and tests are conducted at the same time as the coding.

• Things can get particularly complicated when the team is testing SOA.

(8)

Why Testing is Very Crucial in Agile & Waterfall Development?

Waterfall Model
• Testing needs to be completed!

Agile Model
• Short cycles
• Testing and communication must be fast
• Tester’s role in the agile environment
• Number of iterations and refactoring
• Automation is the only way to go!
• What to automate, how to automate
• Gains (time and money)

[Diagram: six scrum teams (IDM, WB/KMS, InfraStruc, Mapping, Analysis, Reporting), each with a Scrum Master, SA, Developer, Tester, DB Rep, Security Rep and Government Rep. The agile model runs Sprint 1 through Sprint 5, each sprint with its own Analysis and Coding.]

(9)

We Did Not Have Processes and Tools in Place

• The client had Rational tools (ReqPro, ClearQuest and Test Manager), but they were not properly used.

• And a lot of Excel sheets and Word docs. A never-ending story of not being able to control requirements and defects ...

• We bought Quality Center, and we made the Rational tools and Quality Center talk to one another. We faced challenges with firewalls and security.

• Meanwhile, we started working on the processes with the client and on our NG internal processes for test.

(10)

Agile Workflow and Tools

Agile Sprint Testing Workflow:
– Requirements provided by DISA
– Release Manager assesses impact
– Test case completed
– Test results delivered to DISA

• In Agile development, test cases are based on the input from the use case, the user story and the agile team. As the team completes the Nth build, it is tested and the final test cases are formed.

(11)

Automating Test Scripts

• Functional Testing
• Performance, Load, Stress Testing
• Service Test
• Security and SA Types of Testing
• Important Features in Relation to Automation:
– Test case and requirement correlation
– Defects, defect management and, if possible, its correlation to requirements and test cases

(12)

Purpose of Automated Testing

• Checks virtually any functionality in the application.

• Provides consistently reusable tests to be run each time a site or application changes.

• Shortens testing time, especially regression testing.

• Tracks all test runs, logs and test results in one shared repository.

Major benefits are:

• Reusability

• Consistency

• Productivity

(13)

Good Automation Candidates

• Tests that need to be run for every build, sometimes referred to as sanity tests (smoke and regression tests).

• Tests that use multiple data values for the same actions, known as data-driven tests (equality and >=, <= boundary comparisons over the data set); see the sketch below.

• Identical tests that need to be executed using different browsers (we are using IE6, IE7 and FF).
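The project automated its data-driven and cross-browser cases in QTP; as a neutral illustration of the same idea only, here is a minimal pytest sketch in Python. The check_discount rule, its threshold values and the BROWSERS list are hypothetical stand-ins, not part of the original system.

# data_driven_sketch.py - hypothetical example of data-driven and cross-browser
# style parametrization; not the project's QTP scripts.
import pytest

BROWSERS = ["IE6", "IE7", "FF"]  # labels only; a real run would launch the matching driver

def check_discount(order_total):
    """Toy business rule used to illustrate boundary-style data-driven checks."""
    if order_total >= 1000:
        return 0.10
    if order_total >= 500:
        return 0.05
    return 0.0

# One test body, many data rows: the >= / <= boundary values live in the table.
@pytest.mark.parametrize("order_total,expected", [
    (499, 0.0),    # just below the first boundary
    (500, 0.05),   # exactly on the boundary
    (999, 0.05),
    (1000, 0.10),  # exactly on the second boundary
])
def test_discount_boundaries(order_total, expected):
    assert check_discount(order_total) == expected

# The same test can be repeated per browser by adding a second parametrize axis.
@pytest.mark.parametrize("browser", BROWSERS)
def test_smoke_per_browser(browser):
    # A real suite would drive `browser` through the application; here we only record it.
    assert browser in BROWSERS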

(14)

Agile Testing Cycles

• Sprint Testing: For every sprint, the test team will have a baseline. The baseline consists of tests created as a result of the sprint requirements that will be fulfilled.

• Smoke Testing: For every sprint, we review the new test cases and adjust standard smoke tests to reflect any needed changes.

• Regression Testing: For every sprint, we review the new test cases and, based on the requirements and development, we complete a regression test. That full regression test includes all the sprint baseline regression and smoke tests. The regression test is fully automated with QTP.

• Load Testing: During sprints at Herndon, based on the needs, we develop LoadRunner scripts for performance and tuning. At the end of each iteration, our goal is to have a set of LoadRunner scripts that will allow us to see the performance, load and scalability for major business rules and transactions, or identify bottlenecks.

• Service Testing: During sprints at Herndon, based on the needs, we develop service test scripts. Our goal is to run these scripts under load as well as security. (A sketch of how smoke and regression sets can be tagged per sprint follows below.)
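The smoke and regression suites themselves were QTP test sets managed in Quality Center; the following is only a hedged Python/pytest sketch of the selection idea, with made-up test names and markers.

# suite_selection_sketch.py - hypothetical illustration of selecting smoke vs.
# regression subsets per sprint; the real project used QTP test sets in Quality Center.
# Register the markers in pytest.ini ([pytest] markers = smoke, regression) to avoid warnings.
import pytest

@pytest.mark.smoke
def test_login_page_loads():
    assert True  # placeholder for a real UI check

@pytest.mark.smoke
@pytest.mark.regression
def test_create_report():
    assert True  # runs in both the quick smoke pass and the full regression pass

@pytest.mark.regression
def test_report_export_formats():
    assert True  # regression-only; too slow for the per-build smoke run

# Run only the smoke subset for every build:   pytest -m smoke
# Run the full regression set per sprint:      pytest -m regression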

(15)

In One Iteration: Smoke and Regression Tests

• Total Number of Test Cases
  • Smoke – 87 per browser
  • Regression – 259 per browser
  • Patch – approx. 15 per browser

• Total Number of Releases
  • 18 so far, with additional releases to some rounds

• Days and Resources to Test
  • Smoke – 3.5 hours per browser (uninterrupted), usually 2 people
  • Regression – 16 hours per browser (uninterrupted), usually 2 people
  • Patch – 4 hours per browser (uninterrupted), usually 1 person

• For each release, we have ~1000 test cases to be executed, with 5 to 6 resources over 200 hours each. This does not include downtime or any technical problems.

• With the following assumptions:
  – The testers are very familiar with the system.
  – The test cases are ready, step by step.
  – Requirements and test cases are correlated.
  – For fixes, defect descriptions and defects are already in the system.

• Number of cycles before ATRR
  • Herndon
  • Client Suite A
  • Client Suite B

(16)

Testing Documents

• Test Plan
  – Living document updated throughout the iteration
  – User stories augment the test plan
  – Delivered at the end of each iteration

• Test Cases
  – Written throughout the Agile process
  – Input to Rational prior to the end of each sprint
  – Automated with QTP, LR and ST

• Test Results
  – At Herndon with the Agile teams
  – Delivered at the end of each sprint

• System Test Report
  – Living document updated throughout the iteration
  – Updated at the end of each sprint

(17)

Script Development

• Developed global scripts that can be called from one script to the other (a sketch of such a shared helper follows below).

• Scripts were grouped into test sets for different purposes, such as quick regression test sets or quick checks for critical areas or known issues.

• With one script we were able to test the system with different browsers at the same time, e.g., IE6, IE7, FF3, etc.

• The same scripts were used for executing tests at different suites, so with one script we were able to run several tests depending on the situation we were in for that particular day.
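In the project these were shared QTP function libraries; as a rough, tool-neutral sketch of the "global script" idea, a reusable helper module might look like the Python fragment below. The module name, the login_and_prepare helper, the SUITES table and its URLs are hypothetical.

# shared_steps_sketch.py - hypothetical stand-in for the reusable "global scripts"
# idea (the project used shared QTP function libraries, not Python).
SUITES = {
    "herndon": {"base_url": "https://herndon.example.mil"},   # made-up URLs
    "suite_a": {"base_url": "https://suite-a.example.mil"},
    "suite_b": {"base_url": "https://suite-b.example.mil"},
}

def login_and_prepare(suite, browser, user):
    """Common setup step callable from any test script: pick the suite,
    pick the browser, and return the context every test needs."""
    cfg = SUITES[suite]
    # A real helper would launch `browser` and authenticate `user` here.
    return {"base_url": cfg["base_url"], "browser": browser, "user": user}

def run_quick_regression(suite, browser):
    """Example of one 'test set' composed entirely from shared steps."""
    ctx = login_and_prepare(suite, browser, user="tester1")
    checks = ["home_page", "create_report", "map_view"]  # critical areas / known issues
    return {name: bool(ctx) for name in checks}          # placeholder pass/fail results

if __name__ == "__main__":
    # One entry point, several suites, the same underlying steps.
    for suite in SUITES:
        print(suite, run_quick_regression(suite, browser="IE7"))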

(18)

Calling Scripts from Other Test Sets

Calling scripts from other test sets, excluding log in and log out, preparing test sets and making sure users have the right privileges to ...

(19)

Quality Center, Schedule QTP Scripts, Defects

• QuickTest Professional integrates with Quality Center. Quality Center opens the selected testing tool automatically, runs the test, and exports the results to Quality Center. When you run a manual or automated test, an e-mail is sent to a designated developer (or whoever needs to be notified) to report the status of the test or when a defect is written.

• For running tests at night, we have a schedule feature where we connect QTP via Quality Center. The test sets are set up ahead of time based on execution flow, and test runs are executed on the scheduled date and time. (A rough sketch of the schedule-and-notify idea follows below.)

• Requirements, Test Case, Test Lab and Defects in Quality Center.
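The scheduling and notification above were done inside Quality Center itself; purely as a hedged illustration of the pattern (run a test set at a scheduled time, then notify a designated person), here is a small Python sketch. The SMTP host, the addresses and the run_test_set function are invented placeholders.

# nightly_run_sketch.py - hypothetical illustration of "run a scheduled test set,
# then e-mail the outcome"; the project did this through Quality Center, not this script.
import smtplib
from email.message import EmailMessage
from datetime import datetime

def run_test_set(name):
    """Placeholder for launching an automated test set; returns a fake summary."""
    return {"name": name, "passed": 83, "failed": 4, "when": datetime.now().isoformat()}

def notify(summary, to_addr="developer@example.com", smtp_host="localhost"):
    """Build and send a status e-mail for the designated developer."""
    msg = EmailMessage()
    msg["Subject"] = f"Nightly run {summary['name']}: {summary['failed']} failures"
    msg["From"] = "test-automation@example.com"
    msg["To"] = to_addr
    msg.set_content(f"Run finished at {summary['when']}: "
                    f"{summary['passed']} passed, {summary['failed']} failed.")
    try:
        with smtplib.SMTP(smtp_host) as server:   # requires a reachable SMTP server
            server.send_message(msg)
    except OSError:
        print("SMTP unavailable; message was:\n", msg)

if __name__ == "__main__":
    notify(run_test_set("smoke_ie7"))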

(20)

Performance - Load – Stress Testing

Questions a Performance Test Should Answer:

– Does the application respond quickly enough for the intended users?
– Will the application handle the expected user load and beyond?
– Will the application handle the number of transactions required by the business?
– Is the application stable under expected and unexpected user loads?
– Are you sure that users will have a positive experience on go-live day?

(A minimal concurrent-load sketch follows below.)
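The project answered these questions with LoadRunner scenarios; the snippet below is only a standard-library Python sketch of the underlying idea, many concurrent virtual users timing one transaction. The TARGET URL, user count and think time are arbitrary placeholders.

# load_sketch.py - hypothetical mini load driver; the project used LoadRunner,
# this only illustrates concurrent users timing a transaction.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://localhost:8080/health"   # placeholder endpoint
VUSERS = 25                                # simulated concurrent users
REQUESTS_PER_USER = 10

def one_user(_):
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            urllib.request.urlopen(TARGET, timeout=10).read()
        except OSError:
            continue                        # a real test would count failures separately
        timings.append(time.perf_counter() - start)
        time.sleep(0.5)                     # think time between requests
    return timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=VUSERS) as pool:
        all_timings = [t for user in pool.map(one_user, range(VUSERS)) for t in user]
    if all_timings:
        print(f"transactions: {len(all_timings)}  "
              f"avg: {statistics.mean(all_timings):.3f}s  "
              f"p95: {sorted(all_timings)[int(0.95 * len(all_timings)) - 1]:.3f}s")
    else:
        print("no successful transactions; check the TARGET endpoint")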

(21)

Performance Test Processes at NG Herndon

Performance & Load Test Process in the Application and at Northrop- Herndon

Plan/ Design Define performance test scenarios -Business process -Existing performance issue -SLA's -KPI's -Baseline Application & System Performance Requirements -app usage profile

-user profile

-system component

resource usage profile

- historic and projected

traffic data -Max concurrent users

- Peak hour throughput

-Typical user session duration

System Architecture

& Configuration - Network topology diag.

-ER diag.

- Data flow diag

- Server HW/SW specs Configuration - TCP/IP connection -#thread allocated -Memory allocation (JVM heap size, GC setting ... ) -DB (connection pool

size, SGA size, redo log,

etc)

- FW setting & capability

Build r---Develop LR Scripts Develop LR Scenarios - Define LR architecture

-Set up performance test env.

-Transaction definition

-Test data (user acct.

test parameters, test data ... )

-Stand alone scenario for individual report &

capabilities -Combination scenario

with different reports and capabilities

Execute

Scenario Run-time Settings -Number of VUs

-Test duration I Test iteration

- Define rendezvous points

- Ramp-up I Ramp-down rate

-Think time

- Type of Browser emulation

- Browser cache setting

-Test log setting

-Test orouos set uo

Test Scenario Execute (Herndon)

II

Test Monitoring - SiteScope

•CPU usage & processor queue

•Memory usage (paging/

swapping ... ) • Server average load

-Hit per second

-Throughput

-Transaction response time -Connections (total & new/

closed per second)

-Web server (req/sec,

connection/sec)

- App server (queue length,

queue time)

-DB server (lock-wait, SGA size ... )

- Running SOL scripts ( SOL trace, buffer cache hit ratio, ,.)

-Network delay monitor

Analyze/ Diagnose/ Tune r---1 I I I

: Test Result Application :

1 & System I

: Analysis Tuning :

L___ ---~---]

-Obtain baseline - Merge analysis graph & capture correlation patterns

-Identify bottleneck

transaction

• Response time

• Hit per second

• Transaction per second

-Identify bottleneck system

component

-CPU usage - Memory usage -Throughput trend

-Exam SOL trace explain

plan - Identify slow queries - Review LR and server lloas

-Application turning (optimize queries, reduce DBI/0 ... )

- DB tuning (index, optimize

statistic analysis, SGA size,

redo log buffer cache size .. ) -System component configuration optimization (connection pool setting,

thread setting ... ) -System architecture optimization (Load balancing scheme, FW rule and capabilities ... ) - Network optimization To Suite A

(22)

Monitor

While running performance test scripts, one can see the actual response times and monitor the servers involved in the architecture.

(23)

LoadRunner Controller Monitors and SiteScope

• SiteScope Monitors:
  • CPU Utilization on Portal, Memory on Portal
  • CPU Utilization on SIDE, Memory on SIDE
  • CPU Utilization on SAFE
  • CPU Utilization on WMS/WFS, Memory on WMS/WFS

• LoadRunner Monitors:
  • Hits per Second
  • Throughput
  • Transaction Response Time, etc.
  • Oracle, web servers, etc.
  • Back-End Data Verification

(A small host-monitoring sketch follows below.)
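The counters above were collected with SiteScope and the LoadRunner Controller; the following is only a hedged Python sketch of sampling CPU and memory on one host during a test window, using the third-party psutil package, which is not part of the original toolchain.

# monitor_sketch.py - hypothetical local resource sampler; SiteScope/LoadRunner
# provided the real monitoring in the project.
import time
import psutil   # third-party: pip install psutil

def sample(duration_s=30, interval_s=5):
    """Print CPU and memory utilization every interval_s seconds."""
    readings = []
    end = time.time() + duration_s
    while time.time() < end:
        cpu = psutil.cpu_percent(interval=interval_s)   # averaged over the interval
        mem = psutil.virtual_memory().percent
        readings.append((cpu, mem))
        print(f"cpu={cpu:5.1f}%  mem={mem:5.1f}%")
    return readings

if __name__ == "__main__":
    data = sample()
    if data:
        worst_cpu = max(c for c, _ in data)
        print(f"peak cpu during window: {worst_cpu:.1f}%")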

(24)

Challenges of Testing for Agile and SOA

• A more versatile test-bed environment is needed
  – It may be difficult to model the whole set of end-to-end software, which may span many different servers
  – Ability to simulate unavailable components

• Transition for testers to a process-centric testing team
  – Broad knowledge of business processes
  – Understanding the intricacies of domino effects on business transactions
  – Cross-functional team environment
  – Understand and diagnose underlying technology and connectivity

• Location and identification of web services (geographic locations ...)

• Availability of web service components: applications, middleware, supporting hardware, and teams (development, system admin, network, etc.)

• Locating and isolating defects is difficult:
  • Defects in service components cause domino effects in the applications that use those services
  • Capturing and analyzing all the SOAP messages passed from one component to another is overwhelming

(25)

Testing Aspects and Service Test

Positive Testing – Generates a full positive test for the selected services. It tests each operation of the selected service.

Standard Compliance – Tests the service’s compliance with industry standards such as WS-I and SOAP.

Service Interoperability – Tests the interoperability of the service’s operations with all supported Web Services toolkits.
– .NET Framework with WSE 2 SP3 – Tests the interoperability of the service’s operations using the .NET Framework with WSE 2 SP3.
– Axis 1.3 Web Services Framework – Tests the interoperability of the service’s operations using the Axis 1.3 Web Services Framework.
– Generic Mercury Solution – Tests the interoperability of the service’s operations using the Generic Mercury Solution.

Security Testing – Tests service security.
– SQL Injection – Attempts to hack the service by injecting SQL statements that would result in an unauthorized extraction of data.
– Cross-site Scripting – Attempts to hack the service by injecting code into a Web site that will disturb the functionality of the site.

Boundary Testing – Tests the service to its limits using the negative testing technique. (A hand-rolled sketch of injection and boundary probes follows below.)
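HP Service Test generated these checks automatically for the project; as a rough illustration only, the snippet below hand-rolls a couple of SQL-injection and boundary probes against a SOAP endpoint. The endpoint URL, the SOAPAction header value and the GetReport operation are invented for the example.

# soap_probe_sketch.py - hypothetical negative-testing probes; Service Test
# generated the project's real security and boundary tests.
import urllib.error
import urllib.request

ENDPOINT = "http://localhost:8080/MapService"        # placeholder service URL
ENVELOPE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body><GetReport><reportId>{payload}</reportId></GetReport></soap:Body>
</soap:Envelope>"""

PROBES = [
    ("sql_injection", "1' OR '1'='1"),       # classic injection string
    ("boundary_huge", "9" * 4096),           # oversized value
    ("boundary_empty", ""),                  # missing value
]

def send(payload):
    req = urllib.request.Request(
        ENDPOINT,
        data=ENVELOPE.format(payload=payload).encode(),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": "GetReport"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code        # a clean SOAP fault (e.g. 500) is an acceptable answer
    except OSError:
        return None            # endpoint unreachable in this sketch

if __name__ == "__main__":
    for name, payload in PROBES:
        print(name, "->", send(payload))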

(26)

Map Service Interface Description

The application has three mapping interfaces:

• Map Server
• Reporting Detail Request from Palanterra
• Reporting Mapping Call

[Diagram: a third-party mapping application exchanges Map Server requests and responses with the application over HTTP.]

(27)

Reporting Mapping Call

Report Mapping Call is the single internal mapping interface in the application. When a user clicks the “Map” button within the results page of a report, a call is made to the Mapping application.

[Diagram: User to ESB to Application; the report’s mapping-related servlets issue GetCapabilities (returning CapabilitiesInfo) and GetMap (returning MapInfo) calls. A sketch of such a call sequence follows below.]
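The exact servlet contract is not reproduced in the slides; purely as an illustration of a GetCapabilities-then-GetMap exchange, here is a small Python sketch with an invented base URL and parameter names.

# mapping_call_sketch.py - hypothetical GetCapabilities/GetMap sequence; the real
# interface is the application's internal mapping servlets, whose contract is not shown here.
import urllib.parse
import urllib.request

BASE = "http://localhost:8080/mapping"       # placeholder servlet base URL

def call(operation, **params):
    query = urllib.parse.urlencode({"request": operation, **params})
    url = f"{BASE}?{query}"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.read()[:200]   # first bytes of the reply only
    except OSError as err:
        return None, str(err)

if __name__ == "__main__":
    # 1. Ask what the map service can do, 2. then request an actual map.
    print(call("GetCapabilities"))
    print(call("GetMap", bbox="-77.2,38.8,-76.8,39.1", width=800, height=600))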

(28)

Generated Script

[Screenshot: LoadRunner Analysis output for the generated script. Runtime graphs (Running Vusers, User Defined Data Points, Error Statistics, Vusers with Errors), transaction graphs (Transaction Response Time, Transactions/sec Passed/Failed/Stopped), web resource graphs (Hits per Second, Throughput, HTTP Responses per Second, Pages Downloaded per Second, Retries per Second, Connections, Connections per Second, SSL per Second), system resource graphs (Windows/UNIX/Server Resources, SNMP, SiteScope), network delay time and firewall monitors, plus a per-transaction summary (vuser_init, Login, Begin Registration, Submit Bio Info, Search School by State) with minimum, average, maximum and standard-deviation response times.]

(29)

With Service Test We are Able to:

• Develop scripts:
  – without a GUI
  – using multiple protocols; in the enterprise world we have to deal with many protocols, so this feature is very helpful
  – from WSDL, UDDI, file and URL, which is also a very helpful feature

• These scripts can be executed in LoadRunner for performance testing

• We can analyze traffic over the network

(A WSDL-driven sketch using an open-source client follows below.)
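Service Test builds these scripts itself; only to illustrate the "script from a WSDL, no GUI" idea with open-source tooling, here is a minimal sketch using the third-party zeep SOAP client. The WSDL URL and the commented operation and parameter names are placeholders.

# wsdl_client_sketch.py - hypothetical WSDL-driven call using zeep
# (pip install zeep); Service Test played this role in the project.
from zeep import Client

WSDL = "http://localhost:8080/MapService?wsdl"   # placeholder WSDL location

if __name__ == "__main__":
    client = Client(WSDL)    # reads the WSDL and builds the client; no GUI involved
    print("WSDL loaded")
    # Every operation declared in the WSDL becomes a method on client.service.
    # Operation and parameter names below are placeholders for illustration:
    # result = client.service.GetReport(reportId="123")
    # print(result)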

(30)

Practice for Successful SOA Testing Strategy

Start early in the life cycle:

• Testing client applications – start the end-to-end testing and tuning 6 months before the deployment.

Create an assembly-oriented test plan:

• Test the application before it is totally completed, and conduct testing in stages with an incremental increase in the number of components.

• Choose an appropriate set of test cases to support end-to-end testing of the business process and the end-user experience.

(31)

With the use of the tools we were able to:

• Prioritize testing based on business risk

• Access testing assets anytime, anywhere via a browser interface

• Create an end-to-end quality management infrastructure

• Manage manual and automated tests

• Accelerate testing cycles by scheduling and running tests automatically, unattended, 24x7

• Manage multiple versions of requirements, tests, test scripts and business components

• Enforce standardized processes and best practices

• Analyze application readiness at any point in the testing process with ...

(32)

With the use of the tools we were able to:

• 50 to 70% decrease in actual testing time (more efficient and faster)

• Able to cope with a huge amount of testing and captured defects at early stages of development

• Able to produce contractual documents such as the RTM (Requirements Traceability Matrix), defect reports, test reports, test plans and the like in a timely manner

• Able to produce metrics for defects such as defect containment, defect density, defect aging and other defect-related metrics

• Able to capture, in Service Test, changes made by third-party development teams and subcontractors

• Able to capture security-related vulnerabilities

• Able to capture the bottlenecks

• Had a chance to work on tuning and optimization of performance bottlenecks in the architecture, the database and the overall performance of the system

• Most importantly, the customer decided to have the same set-up we have at Herndon at their site. Instead of sending testers to the classified lab, we are planning to send scripts only. That is the plan.

(33)

What We Need For Contractor Integration Results

Purpose: integrate and test all system components prior to official delivery to the government.

✓ All testing-related documentation has been completed and is up to date

✓ Successful completion of smoke, patch, and regression tests

✓ Performance-Load-Stress test baselines obtained

✓ SLAs are met

✓ All the test results were delivered into government CM

✓ All defects are documented in the CM tool

✓ Final system test report submitted to the PMO

(34)

Requirements Validation/Regression Testing Results

Requirements Validation

Results we would like to see:

• Release requirements testing
  – Completed 99.13% testing of all testable requirements
    ✓ Executed 99.63% testing against IE 6.0
    ✓ Executed 98.50% testing against IE 7.0
    ✓ The remaining 0.93% of testing could not be functionally tested

• Regression testing
  – Completed 100% of planned regression testing
    ✓ Executed 95.05% testing against IE 6.0
