Performance Testing Training - Ivs_2

Introduction to Performance Testing IVS training

Improved Business experience = High Throughput * Speedy Response

Agenda

  • Performance Testing Basics
      – What & Why
      – Differences
      – Types of Tests
      – Performance Parameters
  • Performance Testing Methodology
      – P-Test Methodology
      – Test Environment – factors to be considered
  • Factors taken into consideration for Estimation
  • Execution Models
  • Performance Testing Tools

Confidential property of Infosys. For internal circulation only


Impact / Importance of Performance Focus

Source: Butler Group
  • Poorly performing IT applications cost the world's industrialized nations GBP 45 billion annually.
  • For 42% of organizations, the cost of downtime is above $100,000 per hour.
  • A study by Contingency Planning Association Research found that the average hourly cost of downtime ranges from $28,000 in the shipping and transportation industry to a staggering $6.4 million for brokerage houses.

Source: Forrester Group
  • Slow in-store apps earn a retail CIO the ire of other executives. Performance problems in live applications disrupt the business, damage productivity, and result in lost revenues, ultimately damaging the credibility of the entire IT organization. When complaints about the performance of a $50 billion retailer's in-store POS apps flooded headquarters, management laid them in the CIO's lap. The pressure to resolve the problem and speed up store operations was ferocious, and the CIO ultimately hired a systems integrator to implement a cradle-to-grave performance management program.
  • A health insurance company wastes millions of dollars every year. A $20 billion US health insurance company calculated that its inattention to performance prior to deployment, notably in design, where more than half of the problems originated, was resulting in several million dollars in avoidable costs every year. Resolving performance problems before deployment is more cost-efficient by at least one order of magnitude, and ignoring performance in the short term only racks up long-term costs that IT organizations can ill afford.
  • A telecom provider spends millions on app support instead of new app development. At one telecom company, each 15-second timeout in the enterprise application integration (EAI) infrastructure resulted in a $4 call to an outsourced contact center; over the course of six months, this added up to almost $3 million in unanticipated support costs, funds that would otherwise have gone to new development efforts. Enterprise IT organizations struggle to drive down maintenance costs and fund new projects; Forrester's data indicates that the average IT organization spends 75% of its software budget on ongoing operations and maintenance. IT shops stuck in firefighting mode dedicate a larger portion of their budgets to maintenance than necessary, diverting resources from efforts to deliver new value to the business.

Source: Infosys
  • In one corporate banking application, the bank lost more than $10 million over the last 4 years on performance-related issues.


Key Customer Issues

Stressed production support and application development teams face questions such as:

  • Hardware upgrades do not help achieve better performance!
  • Client SLAs are violated due to system issues.
  • What are my competitors' benchmarks?
  • How can you pinpoint and fix problems in production before they impact the end user?
  • Do we know when and how production breaks, and what the impact is?
  • Are batch jobs taking longer to finish?
  • How do the different components of the system affect each other?
  • Is server utilization optimized?
  • Why can't the system or applications support additional load despite low hardware utilization?
  • Performance degradation is happening over a period of time, but how?
  • Why am I spending more on maintenance than on enhancements?
  • Above all: are users complaining about response times?

Single user request

[Figure: one user asks "Give me red" and receives red, the desired outcome.]


Concurrent user requests

[Figure: many concurrent users all ask "Give me red", but under contention for memory, disk, network, and CPU they receive blue, green, yellow, orange, or pink instead; an undesirable outcome from a functionally tested application.]


What Performance Testing can eliminate

  • Would you want to use a cricket website that takes 5 minutes to update the score and tells you that Sachin Tendulkar was out 10 minutes ago?
  • Would you want to use an airline website to book a ticket, only to wait 15 minutes to learn that no seats are available?
  • Would you want the payment gateway to become unavailable after you have provided all your details to buy tickets on the PVR website?


What Performance Testing can eliminate (contd.)

  • Would you want to use a university site that crashes when your engineering results are out?
  • How many of us like to click a button and see "Error: Page not found"?

None of us wants to face these situations, and neither do the owners of these sites want their users to face them. This is where performance testing comes into action.


Typical scenario

  • Popular Online Music Store (POMS): a good music download service for users, and good business for the proprietor.
  • A new song (say, a rap song) is uploaded to the POMS website. Not famous initially, it becomes a huge hit after its video airs on TV channels; never-before-seen popularity.
  • Just a week after the upload, the song has been downloaded by several thousand users, innumerable times. Essentially, the user base has shot up about 25 times.
  • New to such a scenario, the POMS website struggles to handle hundreds of concurrent downloads.
  • The USP of POMS, an average download time of 5 s per MB, takes a beating. Users face high response times when downloading the now-famous song: a dissatisfied customer.
  • POMS, which boasted a download rate of 800 MB/hr, takes a beating as throughput drops below 60 MB/hr: a dissatisfied proprietor.


What is Performance Testing?

  • Definition: "Testing conducted to evaluate the compliance of a system or component with specified performance requirements." [IEEE]
  • Performance testing is a process by which software is tested and tuned with the intent of realizing the required Quality of Service.
  • The process aims to optimize application performance for key performance indicators such as:
      – Response time
      – Throughput
      – System threshold limits
  • Performance tests determine the runtime "behavior" of the application and its supporting infrastructure under operational conditions.
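The first two indicators can be computed directly from raw test measurements. A minimal sketch, with illustrative sample values that are not from any real test:

```python
# Hypothetical per-request response times (seconds) collected over a
# 60-second test window; the values are illustrative only.
samples = [0.8, 1.2, 0.9, 2.5, 1.1, 0.7, 1.4, 3.0, 1.0, 0.9]
window_seconds = 60

avg_response = sum(samples) / len(samples)   # average response time (s)
throughput = len(samples) / window_seconds   # transactions per second
```

Real tools report these per transaction type and per load level; the arithmetic is the same.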


Performance Testing – what?

  • Performance testing is the discipline concerned with determining and reporting the current performance of a software application when subjected to virtual user load.
  • Performance testing is a critical component of an application's overall success: it prevents failures by predicting application behavior and capacity prior to going live.

Measuring application performance in terms of:
  • Speed: the time taken by the application to respond to a user request
  • Scalability: the ability to handle increasing user load without performance degradation
  • Stability: the ability of the application to sustain prolonged usage without performance degradation

Performance Testing – why?

Objectives of performance testing:
  • Determine application performance at various user loads
  • Identify performance bottlenecks
  • Establish a performance benchmark


Key Facts – One: Detect Performance Defects

  • Software applications, when subjected to user load, expose performance defects.
  • Defects can originate from any part of the application: application program, database systems, application server, web server, network, operating system.
  • Causes of performance defects include:
      – Poor programming practices: programming logic, database queries
      – Inappropriate system designs: table/index design, application design
      – Incorrect system configurations: app/web server configuration, OS parameters

Myth: performance bottlenecks occur only at high user load.


Key Facts – Two

Performance testing is all about:
  • Application user perspective: response time of business transactions
  • Application owner perspective: transaction throughput

Myth: performance issues can be solved by adding hardware resources.


Key Facts – Three: Production-Like Test Environment

  • To simulate real-world issues, it is recommended to conduct performance tests in close-to-production test environments.
  • If not identical, test environments should be proportionately scaled down from production environments.

Myth: the number of application users is synonymous with the number of concurrent users.


Different from Performance Engineering

Performance testing is a subset of performance engineering.

  Performance Testing                  | Performance Engineering
  Evaluates the application            | Eliminates issues in the application
  Assesses the system                  | Addresses the issues in the system
  Finds problematic areas in the SUT   | Fixes problematic areas in the SUT


Different from Functional Testing

Performance testing is starkly different from functional testing.

  Performance Testing                                  | Functional Testing
  Evaluates the system performance                     | Evaluates the application functionality
  Assesses only performance-critical scenarios         | Assesses all possible functional scenarios
  Conducted at multi-user load                         | Conducted at single-user load
  Requirements are provided in numbers                 | Requirements are hard-line specifications
  Analytical and mathematical in nature,               | A test case's execution result is either
  based on various theories                            | pass or fail
  Classified as partly art and science                 | Quite evolved, with best-practice baselines
                                                       | for various methodologies


Types of Performance Tests

  • Load Test: exert a constant user load for a relatively short duration; behaviour at various loads.
  • Stress Test: start with a low user load and increment it by a fixed number of users at regular intervals; find the break point.
  • Endurance Test: exert a constant user load for a prolonged duration; uncover memory leaks.
  • Volume Test: exert a constant user load for multiple iterations with a different database volume each time; behaviour at various DB volumes.
  • Scalability Test: start with a low user load and increment it by a fixed number of users at regular intervals; find the maximum TPS.
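The stress and scalability tests above share the same stepped ramp-up pattern. A minimal sketch of such a schedule (function name and step values are illustrative, not from any tool):

```python
def ramp_profile(start_users, step, interval_s, total_steps):
    """Stepped ramp-up schedule used by stress/scalability tests:
    returns (time_offset_seconds, user_count) pairs."""
    return [(i * interval_s, start_users + i * step)
            for i in range(total_steps)]

# e.g. start at 10 users, add 10 more every 5 minutes, for 6 steps
profile = ramp_profile(10, 10, 300, 6)
```

A load test would instead hold one user count flat; an endurance test holds it flat for hours.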


Performance Test Methodology


P-Test Methodology

[Process diagram: Project Kickoff -> Requirement Analysis (NFR) -> Strategy Definition & Design -> Iterative Design and Development -> Execution -> Recommendations, with checkpoints after Requirement Analysis, Design, and Recommendation & Tuning. Milestones, iterative processes, joint activities, and change management are marked on the original diagram. Key activities per phase:]

  • Business / Architecture / Environment Assessment
      – Mining the historical data; interviewing the key stakeholders
      – Building a knowledge repository; breaking the scope down to critical transactions
      – Scope identification and sign-off
  • Planning & Design
      – Workload modeling & characterization
      – Identification of the key performance indicators
      – Performance test strategizing / tool identification
      – Test architecture and environment design; validation of the test environment setup
      – Test scenario identification / design
      – Development of test stubs/scripts; validation / dry run of scripts
  • Execution
      – Execution of test scripts; collection of test results
      – Analysis of results; iteration tuning
  • Recommendations
      – Recommendation and go / no-go decision making
      – Post-deployment review

Methodology – Requirements Analysis (NFR)

  • System appreciation: understand and appreciate the significance of the various software and hardware components that are part of the SUT (System Under Test).
  • Influx model: collect the requirements using the Influx methodology, covering the Infrastructure, Transaction, Workload Pattern, Growth, QoS, and Data Retention models.
  • SLA: verify whether the SLAs are reasonable and have a concrete basis; determine in-scope and out-of-scope activities.
  • Priority: given a set of applications, prioritize them to pick the critical ones.



Methodology – Planning & Design

  • Workload modeling: determine the usage pattern of the various transactions and analyze them by frequency and volume. The result is a realistic distribution mix of the transactions.
  • Critical scenarios: identify critical scenarios to be performance tested (business-critical, resource-critical).
  • Type of tests: identify the types of tests to run, as driven by the requirements (speed, scalability, stability).
  • Tools: choose the testing tool, monitors, and profilers as driven by the requirements; application protocol and budget are key factors.
  • Stubs/drivers: determine the need for stubs/drivers if tools/monitors don't satisfy the need.
  • Performance parameters: identify the monitoring parameters needed to critically analyze the SUT.
  • POC: may be required to ascertain whether the desired output is obtained when the testing involves a new tool/technology.
  • Offshore-ability: not all applications can be performance tested offshore; derive offshore-ability criteria and identify the applications that can be.
  • Test environment: validate that installed software versions and patches reflect the production environment. The production hardware should also be linearly scaled down to the test environment hardware to facilitate scalability assessment.
  • Design & develop: develop performance test scripts and stubs/drivers as decided; validate the scripts.
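The workload modeling step above yields a distribution mix that the test run must reproduce. A minimal sketch, assuming a hypothetical three-transaction mix (names and percentages are illustrative):

```python
import random

# Hypothetical distribution mix derived from production usage analysis:
# 60% searches, 30% item views, 10% purchases.
mix = {"search": 0.60, "view_item": 0.30, "purchase": 0.10}

def next_transaction(rng=random):
    """Pick the next transaction for a virtual user, weighted by the mix."""
    names, weights = zip(*mix.items())
    return rng.choices(names, weights=weights, k=1)[0]
```

Each virtual user calls `next_transaction()` in its main loop, so the executed load converges on the modeled mix.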


Methodology – Test Execution

  • Execute the different types of tests as per the test plan, including baseline/benchmark tests.
  • Monitor the counters during test execution. The requirements dictate which measurements are collected and later analyzed; they may include, but are not limited to, transaction response time, TPS, memory usage, and CPU usage.
  • Correlate the captured output from all the monitors (testing tool, server/DB/network monitors) to identify the problematic component.
  • Iterate: isolate the problematic components (modules/scenarios/transactions) and re-run the tests to confirm the findings.
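The execute-and-measure loop above can be sketched in miniature. This is a toy illustration, not any tool's implementation: real tools add ramp-up, pacing, think time, and server-side monitors, but the core of driving concurrent virtual users and collecting response times and TPS looks roughly like this (all names are illustrative):

```python
import threading
import time

def run_load(action, vusers, iterations):
    """Run `action` from `vusers` threads, `iterations` times each,
    and report average response time and overall TPS."""
    results = []
    lock = threading.Lock()

    def worker():
        for _ in range(iterations):
            t0 = time.perf_counter()
            action()                       # one scripted transaction
            elapsed = time.perf_counter() - t0
            with lock:
                results.append(elapsed)

    start = time.perf_counter()
    threads = [threading.Thread(target=worker) for _ in range(vusers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    wall = time.perf_counter() - start
    return {"tps": len(results) / wall,
            "avg_rt": sum(results) / len(results)}
```

Correlating these client-side numbers with server/DB/network counters is what isolates the problematic component.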


Methodology – Recommendations

  • Analyze the results and identify performance bottlenecks.
  • Scope: providing tuning recommendations, not the tuning itself, is the in-scope function of performance testing.
  • Tuning recommendations cover instances like:
      – Altering system configurations: DB init/storage parameters, app server, web server
      – Tweaking programs: database queries, program design
      – Modifying designs: database design, table/index design
  • Experience: a common observation is that 60% of performance issues are related to the database, including DB configuration, indexing, and SQL query structuring.


P-Test Environment

  • Real-world: a test environment as close as possible to the production environment is desirable.
  • Scale: for a scaled-down test environment, it is recommended to scale down linearly from production to facilitate scalability assessment.
  • Network: an isolated test environment ensures the performance results are free from all kinds of bias.


How to go about deciding the Approach?

  • What kind of application is it?
  • What is the application architecture?
      – In the production environment
      – In the test environment
  • What is the deployment strategy?
  • What is the peak load expected?
      – Current peak load
      – Future peak load
  • What are the protocols? (Web/HTTP, SAP, Oracle NCA, Citrix, thick client, etc.)


Estimation Considerations

  • NFR analysis, planning, strategizing, etc.
  • Identify the scenarios
  • Evaluate the tool to be used
  • Define or collect the SLAs
  • Identify the test environment
  • Test data generation
  • Scripting of the identified scenarios
  • Dry run of the scripts
  • Scenarios for the various types of performance testing
  • Iterations of execution


Scripting

  • Test tools replace human users with virtual users (Vusers); the actions a Vuser performs are captured in the script.
  • Develop the basic script with the help of the tool's recording facility.
  • Enhance the basic script by:
      – Parameterization: supplying input data to simulate real usage, e.g. user ID, password, and entry fields for a transaction.
      – Inserting transactions: transactions are defined to measure the performance of the server; each measures the time the server takes to respond to a specified request, marked by the beginning and end of a task. E.g. to measure the time taken for login, define a transaction around the click of the Login button.
      – Correlation: correlating data lets us capture dynamic values and reuse them throughout the scenario run. E.g. in a web application, a session ID is generated whenever a user logs in, so the record-time session ID must be correlated for the script to replay. Similarly, when creating a new insurance case, the policy number must be correlated because it is a unique number generated for each case.
      – Inserting checkpoints: checkpoints are added to a transaction to verify that the scripted action received the correct response from the server, e.g. checking the response for a particular string that should appear on the page.
      – Adding control-flow structures: additional logic can be incorporated into the scripts with if-else, looping structures, etc.
      – Error handling: abnormal termination of the script can be prevented by handling errors in a transaction's response and displaying an appropriate message or taking an appropriate action.
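The enhancements above can be illustrated in a single toy virtual-user iteration. This is not any particular tool's API: the HTTP calls are injected stubs and all endpoint and field names are hypothetical.

```python
import time

def run_vuser(user_id, password, http_get, http_post):
    """One virtual-user iteration. `http_get`/`http_post` stand in for
    the test tool's HTTP layer (hypothetical signatures)."""
    timings = {}

    # Transaction: mark the start and end of the action being measured
    t0 = time.perf_counter()
    # Parameterization: user_id/password come from test data, not the recording
    resp = http_post("/login", {"user": user_id, "pwd": password})
    timings["login"] = time.perf_counter() - t0

    # Correlation: capture the dynamic session id and reuse it downstream
    session_id = resp.get("session_id")
    if session_id is None:
        # Error handling: fail cleanly instead of terminating abnormally
        raise RuntimeError(f"login failed for {user_id}")

    t0 = time.perf_counter()
    page = http_get("/home", session=session_id)
    timings["home"] = time.perf_counter() - t0

    # Checkpoint: verify the expected content came back from the server
    assert "Welcome" in page, "checkpoint failed: welcome text missing"
    return timings
```

Commercial tools express the same ideas as recorded script annotations rather than hand-written code, but the concepts map one to one.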


Execution phase

  • Iterative execution
  • Joint analysis of test results and logs
  • Identify the bottlenecks
  • Identify network impact on the end-user perceived response time
  • Identify the most expensive function calls
  • Identify the database constraints
  • Tuning recommendations
  • Release of the P-test summary recommendation report


Execution Modes

OSPTC provides the flexibility to performance test applications in Remote Execution mode, Script Development mode, Offshore Testing mode, or a combination thereof. The mode used for a particular application is determined by a set of criteria (detailed in subsequent slides).

Remote Execution Mode
  • Performance test environment located onsite at the client
  • Test tools and load generator machines set up in the client's onsite test environment
  • The Infosys offshore performance test team accesses the test tools and test environment over a secured remote desktop connection
  • The offshore team scripts and triggers performance tests from offshore over the dedicated network

Script Development Mode
  • A scaled-down test environment set up offshore for development of test scripts
  • Load generator machines set up in the client's onsite test environment
  • Test execution triggered on the client's onsite test environment


Execution Modes (contd.)

Offshore Testing Mode
  • A scaled-down or production-like test environment set up offshore
  • Performance tests executed using tools and load generation machines set up at offshore Infosys locations
  • All activities done offshore
  • Prior to application deployment, sanity tests run on the production/test environment to validate results


Reports analyzed for performance issues

  • Transaction Summary: summarizes the number of transactions that passed, failed, aborted, or ended in error.
  • Transaction Response Time under Load: displays the transaction response time relative to the number of running Vusers at any given point in the scenario. The rise in response time should stay moderate as Vusers increase; if it is too steep, the system has trouble scaling up because requests are queued or waiting for server resources to be allocated.
  • Transaction Performance Summary: displays the minimum, maximum, and average response time for all transactions in the scenario.
  • Percentile Graph: used to find the percentage of transactions completed within the acceptable response time.
  • Network Monitor Graph vs. Running Vusers: shows the impact of network delay as running Vusers increase. More running Vusers may cause network congestion, so the delay can rise sharply after a point; this reveals the limitation of the network (if any).
  • Transactions per Second vs. Average Transaction Response Time: shows the impact on response time (if any) against the number of transactions the server handles each second during the scenario run. If the average response time climbs too high as transactions increase, there is a bottleneck or limitation in the web architecture.
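The percentile analysis mentioned above is easy to sketch. This uses a simplified nearest-rank rule (tools differ slightly in their interpolation) and illustrative timings:

```python
def percentile(sorted_samples, pct):
    """Nearest-rank percentile (simplified) over pre-sorted samples."""
    if not sorted_samples:
        raise ValueError("no samples")
    k = round(pct / 100 * len(sorted_samples)) - 1
    return sorted_samples[max(0, min(len(sorted_samples) - 1, k))]

# Illustrative response times in seconds, sorted ascending
times = sorted([0.8, 0.9, 1.0, 1.1, 1.2, 1.4, 2.5, 3.0, 0.7, 0.9])

# Fraction of transactions completed within a hypothetical 2-second SLA
within_sla = sum(1 for t in times if t <= 2.0) / len(times)
```

A 90th-percentile figure is usually a better SLA measure than the average, because a few slow outliers can hide behind a healthy mean.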


Contd.

  • Running Vusers vs. Hits per Second: shows how running Vusers increase the load on the server and thus the number of hits to the server.
      – If the number of hits increases along with running Vusers, the system is behaving fine.
      – If the number of hits stays constant as Vusers increase, the system cannot handle that many users; find the Vuser count beyond which it flattens.
  • Running Vusers vs. Throughput: shows how server throughput rises or falls as running Vusers increase or decrease.
      – If throughput increases along with running Vusers, the system is behaving fine.
      – If throughput decreases with more running Vusers, failures may be mounting over time, so the server is sending only minimal responses for the error messages.
  • Hits/sec vs. Average Transaction Response Time: shows how the number of hits affects transaction performance.
  • CPU utilization of all tiers: as a benchmark, CPU utilization going above 80% and staying at that level for a long interval is unfavorable; it indicates the application is doing a lot of processing and putting a heavy load on the CPU.
  • Memory utilization of all tiers: a high memory requirement may force heavy use of virtual memory, which can cause thrashing (high disk I/O) and hurt performance.
  • Disk I/O on all tiers: high I/O degrades performance.
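The 80% CPU benchmark above concerns sustained utilization, not a single spike. A small sketch of that check, where the threshold and window size are the caller's choice:

```python
def sustained_above(samples, threshold, min_run):
    """True if utilization samples (per-interval percentages) stay above
    `threshold` for at least `min_run` consecutive intervals."""
    run = 0
    for s in samples:
        run = run + 1 if s > threshold else 0
        if run >= min_run:
            return True
    return False
```

Applying the same check to memory or disk-I/O counters separates momentary blips from the sustained saturation that signals a real bottleneck.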


Various performance test tools available in the market

  • Mercury Interactive LoadRunner (Licensed): HTTP, WinSocket, SOAP, WAP, IMAP, POP3, EJB, RMI-Java, COM/DCOM, Tuxedo, Siebel, SAP, terminal emulation
  • Radview WebLoad (Licensed): HTTP
  • Compuware QALoad (Licensed): HTTP, SSL, SOAP, XML, CORBA, Tuxedo, Winsock, Java, ODBC, SAP R/3, PeopleSoft, VT100-520
  • Segue SilkPerformer (Licensed): HTTP, FTP, streaming media, LDAP, TCP/IP, UDP, WAP, MMS, XML/SOAP, Microsoft .NET SOAP stack, CORBA (IIOP), Oracle OCI, PeopleSoft
  • Microsoft Web Application Stress Tool (WAS) (Freeware): HTTP
  • OpenSTA (Freeware, open source): HTTP
  • Http-Load (Freeware): HTTP


Common Tools and Protocols

  • Some commonly used tools:
      – Testing tools: HP LoadRunner, Radview WebLoad, Borland SilkPerformer, open-source JMeter
      – Monitoring tools: Wily Introscope, Quest PerformaSure
  • Some protocols used: Web/HTTP, Oracle NCA, Citrix, SAP, MQ, etc.


Tool selection

  • Ability to change recording to a different protocol in the middle of a recording session
      – HP LoadRunner: Yes, for some protocols
      – Radview WebLoad: No, supports only one protocol (HTTP)
      – Borland SilkPerformer: Yes
  • Actions in a script can be iterated any specified number of times
      – HP LoadRunner: Yes, a simple runtime setting change without programming
      – Radview WebLoad: No, programming effort is required
      – Borland SilkPerformer: Yes, a simple runtime setting change
  • Different modem connection speed simulation
      – HP LoadRunner: Yes; Radview WebLoad: Yes; Borland SilkPerformer: Yes
  • Cookies and session IDs automatically correlated during recording
      – HP LoadRunner: Yes; Radview WebLoad: No, programming effort is required; Borland SilkPerformer: Yes
  • Simulated IP addresses for virtual users
      – HP LoadRunner: Yes; Radview WebLoad: No; Borland SilkPerformer: Yes
  • Ability to provide graphical results and export them to common formats
      – HP LoadRunner: Yes; Radview WebLoad: Yes; Borland SilkPerformer: Yes


Any Questions? Thank you
