Testing Documents

Company Level:
• Test Policy (by Quality Controller)
• Test Strategy (by Quality Analysts)

Project Level:
• Test Methodology (by Test Lead / Project Lead)
• Test Plan, Test Cases, Test Procedures, Test Script, Test Log, Defect Report (by Test Engineers)
• Test Summary Report (by Test Lead)
I. Test Policy: It is a company-level document, developed by the Quality Control people (almost always management). This document defines the "Testing Objectives". Format of the Test Policy document:
• Testing Definition: Verification + Validation.
• Test Process: Proper planning before starting the testing.
• Testing Standards: 1 defect per 250 lines of code (or) 1 defect per 10 functional points (screens).
• Testing Measurements: QAM, TMM, PCM.
• Signature (C.E.O.)
QAM : Quality Assessment Measurements. TMM : Test Management Measurements. PCM : Process Capability Measurements.
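As a rough illustration of how the density standard above can be applied, here is a minimal sketch. The function name, the threshold parameter and the interpretation (at most one defect per 250 lines of code) are assumptions for illustration, not part of the policy document.

```python
# Illustrative sketch only: checks a module against the "1 defect per 250 LOC"
# standard mentioned in the Test Policy. Names and interpretation are assumptions.

def meets_density_standard(defects_found: int, lines_of_code: int,
                           loc_per_defect: int = 250) -> bool:
    """Return True if the observed defect count is within the policy limit."""
    allowed_defects = lines_of_code / loc_per_defect   # e.g. 10,000 LOC -> 40 defects
    return defects_found <= allowed_defects

if __name__ == "__main__":
    # Example: a 10,000-line module with 35 reported defects stays within the standard.
    print(meets_density_standard(defects_found=35, lines_of_code=10_000))  # True
```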
II. Test Strategy: It is also a company-level document, developed by Quality Analyst category people (sometimes the Project Manager as well). The test strategy defines the "Testing Approach" followed by the testing team.

Components:
1. Scope & Objectives: Definition of testing and the purpose of testing in our organization.
2. Business Issues: Budget control for testing.
3. Test Approach: Mapping between development stages and testing issues.
Test Responsibility Matrix (TRM) or Test Matrix (TM):
• Development Stages (columns): Information Gathering & Analysis, Design, Coding, System Testing, Maintenance (Changes in Requirements).
• Test Factors (rows): Ease of Use, Authorization, Ease of Operate, Access Control, and the other common test factors.
Each cell of the TRM marks whether a test factor is applied at that development stage.
4. Test Deliverables: Required test documents to be prepared by the testing team during testing.
5. Roles & Responsibilities: Names of the jobs in the testing team and their responsibilities.
6. Communication & Status Reporting: Required negotiations between every two consecutive jobs.
7. Test Automation & Tools: The need for automation in the testing process.
8. Defect Reporting & Tracking: Required negotiations between the testing team and the development team to fix defects.
9. Change & Configuration Management: Required approaches to handle change requests in terms of testing.
10. Risks & Mitigations: Upcoming problems and possible solutions to overcome them during testing.
11. Testing Measurements & Metrics: QAM, TMM, PCM.
12. Training Plan: Required training sessions for the testing team to understand the project requirements.
Test Factor: A test factor indicates a testing issue. There are 15 common test factors:
• Authorization: Validating whether a user is a valid user of the application.
• Access Control: Permissions provided to use a specific service.
• Audit Trail: Maintaining metadata about operations.
• Continuity of Processing: Inter-process communication.
• Coupling: Inter-system communication to share resources.
• Ease of Use: User friendliness.
• Ease of Operate: Installation, uninstallation and dumping.
• File Integrity: Creation of internal files (e.g. backup files).
• Reliability: Recovery from abnormal situations.
• Portability: Running on different platforms (e.g. operating systems).
• Performance: Speed of processing to complete a task.
• Correctness: Meeting customer requirements.
• Methodology: Following the standards.
• Maintainable: Long-term maintainability at the customer site.
• Service Levels: Order of services (e.g. Login -> Compose -> Logout).
Test Factors vs Black Box Testing Techniques:
1. Authorization: Security Testing (or) Functionality/Requirements Testing.
2. Access Control: Security Testing (or) Functionality/Requirements Testing.
3. Audit Trail: Functionality/Requirements Testing (or) Error Handling Testing.
4. Continuity of Processing: Operations Testing (white box testing).
5. Coupling: Inter-System Testing.
6. Correctness: All black box testing techniques.
7. Ease of Use: User Interface Testing & Manuals Support Testing.
8. Ease of Operate: Installation Testing.
9. File Integrity: Functionality/Requirements Testing, Recovery Testing.
10. Reliability: Recovery Testing, Stress Testing.
11. Portability: Compatibility Testing, Configuration Testing.
12. Service Levels: Functionality/Requirements Testing, Stress Testing.
13. Performance: Load Testing, Stress Testing, Volume Testing.
14. Maintainable: Compliance Testing (whether processing follows the standards or not).
15. Methodology: Compliance Testing.
III. Test Methodology: A Test Strategy defines the overall testing approach followed by the testing team. The Test Methodology is a refined form of the Test Strategy at project level. QA prepares this methodology for the current project's testing following the procedure below.

Step 1: Acquire the Test Strategy.
Step 2: Determine the project type.

Project types vs development stages:
• Development stages: Analysis, Design, Coding, System Testing, Maintenance.
• Project types: Traditional, Off-the-Shelf (Outsourcing), Maintenance.
Note: Depending on the project type, QA decreases the number of columns (stages) in the TRM.

Step 3: Determine the application type and customer requirements (QA decreases the number of factors).
Step 4: Identify tactical risks (QA drops some factors that are too complex to apply, e.g. Performance, Load, Security Testing).
Step 5: Determine the scope of the project domain (QA adds the required factors back to the list).
Step 6: Finalize the TRM for the current project (a small sketch of this trimming appears below).
Step 7: Prepare the System Test Plan.
Step 8: Prepare the Unit Test Plan.
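Steps 2 to 6 can be pictured as pruning and re-adding rows and columns of the TRM. The sketch below is only illustrative: the factor names come from the common test factors, but the data structure, the chosen project type and the helper logic are assumptions, not part of the methodology itself.

```python
# Minimal sketch of trimming a Test Responsibility Matrix (TRM).
# The matrix maps test factors (rows) to development stages (columns);
# the structure and the example decisions are assumptions for illustration.

stages = ["Analysis", "Design", "Coding", "System Testing", "Maintenance"]
factors = ["Authorization", "Access Control", "Ease of Use", "Performance",
           "Portability", "Correctness"]

# Start with the full company-level TRM: every factor applies at every stage.
trm = {factor: set(stages) for factor in factors}

# Step 2: a maintenance-only project drops the early development stages (columns).
for factor in trm:
    trm[factor] &= {"System Testing", "Maintenance"}

# Steps 3-4: drop factors that do not suit the application type or are too
# complex or risky to apply for this project.
for dropped in ("Performance", "Portability"):
    trm.pop(dropped, None)

# Step 5: add a required factor back after studying the project domain.
trm["Performance"] = {"System Testing"}

# Step 6: the finalized TRM for the current project.
for factor, applicable in sorted(trm.items()):
    print(f"{factor:15s} -> {sorted(applicable)}")
```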
Testing process flow: Test Initiation -> Test Planning -> Test Design -> Test Execution -> Defect Reporting -> Regression -> Test Closure.
IV. Test Planning: After completing the Test Methodology for the current project, test management concentrates on implementing that methodology. In this stage, the test plan author creates a "System Test Plan" document covering "What to Test", "When to Test", "How to Test" and "Who to Test", based on the workbench below.

Inputs: BRS/CRS, S/WRS, Development Plan; TRM, Strategy or Methodology.
Tasks:
1. Team Formation
2. Identify Tactical Risks
3. Prepare System Test Plan
4. Review Test Plan
Output: System Test Plan.
1. Team Formation: In general, test planning initially starts with the testing team formation, which depends on the factors below: available testers, test duration, and test environment resources.

Case Study: Client/Server, Web and ERP projects take 3 to 5 months for system testing (with 2 to 3 members). System software takes 7 to 9 months for system testing. Machine-critical applications take 12 to 15 months for system testing. The ratio is 3:1 (developers : test engineers).
2. Identify Tactical Risks: After completion of team formation, the test plan author concentrates on risk analysis for the current project's testing. Examples:
• Lack of knowledge of the domain in the testing team.
• Lack of budget (time).
• Lack of resources.
• Delay in delivery (accidental issues such as a power-grid failure).
• Lack of test data.
• Lack of seriousness in the development process.
• Lack of communication.

3. Prepare System Test Plan: After completion of team formation and risk identification, the test plan author prepares a System Test Plan document in IEEE format (Institute of Electrical and Electronics Engineers).
IEEE Format:
1. Test Plan ID: A unique number or name (e.g. STP_BNK_P1).
2. Introduction: About the testing policy, testing standards, testing approach, testing techniques and about this project.
3. Test Items: Names of the modules, features, functions or services.
4. Features to be Tested: (What to Test)
5. Features not to be Tested: Features not required for test design; which ones and why not.
6. Approach: Required testing techniques to be applied to those features w.r.t. the TRM (quality). (How to Test)
7. Feature Pass or Fail Criteria: When an item passes and when it fails.
8. Suspension Criteria: Abnormal situations to recover from during the testing of these items (e.g. server down).
9. Test Environment: Required hardware and software to conduct testing (including tools).
10. Test Deliverables: Required documents to prepare during testing.
11. Testing Tasks: Necessary tasks to do before starting an item's testing (e.g. to test a Delete option, we must first have one record).
12. Staff & Training Needs: Names of the test engineers. (Who to Test)
13. Responsibilities: Work allocation to the corresponding test engineers.
14. Schedule: Dates and times. (When to Test)
15. Risks and Mitigation Plan.
16. Approvals: Signatures of the test plan author and QA.
4. Review Test Plan: After completing the plan document, the test plan author and the responsible test engineers review it for completeness and correctness. In this review they conduct a "coverage analysis":
• BRS / S/WRS based: What to test.
• TRM based: How to test.
• Risk based: When to test.
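One simple way to picture the "what to test" part of the coverage analysis is a requirement-to-test-case check: every requirement in the BRS/S/WRS should map to at least one planned test case. The sketch below is hypothetical; the requirement IDs, test case IDs and the mapping structure are invented for illustration.

```python
# Minimal sketch of coverage analysis during test plan review.
# Requirement and test case identifiers are invented for illustration.

requirements = {"REQ_LOGIN", "REQ_COMPOSE", "REQ_LOGOUT", "REQ_ADDRESS_BOOK"}

planned_test_cases = {
    "TC_001": {"REQ_LOGIN"},
    "TC_002": {"REQ_LOGIN", "REQ_LOGOUT"},
    "TC_003": {"REQ_COMPOSE"},
}

covered = set().union(*planned_test_cases.values())
uncovered = requirements - covered

if uncovered:
    print("Coverage gap, plan is incomplete:", sorted(uncovered))
else:
    print("All requirements are covered by at least one test case.")
```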
V. Test Case: After completion of test planning, the testing team concentrates on test design, preparing test cases for the modules they are responsible for. There are three test case design methods used to conduct core-level testing:
1. Business Logic Based Test Case Design Method (80%)
2. Input Domain Based Test Case Design Method (15%)
3. User Interface Based Test Case Design Method (5%)
1. Business Logic Based Test Case Design Method: In general, a test engineer prepares a list of test cases based on the use cases (or functional specifications) in the S/WRS.

Design flow: BRS -> UC/FS (S/WRS) -> HLD -> LLD -> Programs/.EXE, with test cases (TC) derived from the UC/FS.
From the above model, the use cases and the S/WRS describe how to use a specific functionality in our application. To test that functionality in the application build, test engineers prepare test cases based on the corresponding use case. A test case describes a test to apply to the system to validate it, and test cases are repeatable. To prepare this type of test case from use cases, a test engineer follows the approach below (a worked sketch follows the steps):

Step 1: Collect the required use cases from the S/WRS.
Step 2: Select a use case from the collected list.
  Step 2.1: Identify the entry condition (base state).
  Step 2.2: Identify the inputs required (test data).
  Step 2.3: Identify the output and outcome (expected).
  Step 2.4: Identify the exit condition (end state).
  Step 2.5: Study the normal flow (call state).
  Step 2.6: Study the alternative flows and exceptions.
Step 3: Prepare the test cases.
Step 4: Review the test cases for completeness and correctness.
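As a concrete illustration of Steps 2.1 to 2.6, the sketch below turns a hypothetical "login" use case into executable test cases. The MailApp class, its methods and the credentials are invented stand-ins for the application under test, not anything taken from the document.

```python
# Hypothetical sketch: test cases derived from a "login" use case.
# MailApp and its behaviour are invented stand-ins for the application under test.

class MailApp:
    def __init__(self):
        self.logged_in = False           # base state: user is logged out

    def login(self, user: str, password: str) -> str:
        if user == "tester" and password == "secret":
            self.logged_in = True
            return "INBOX"               # expected outcome: inbox page is shown
        return "LOGIN_FAILED"

def test_login_normal_flow():
    app = MailApp()                           # Step 2.1: entry condition (base state)
    result = app.login("tester", "secret")    # Step 2.2: required input (test data)
    assert result == "INBOX"                  # Step 2.3: expected output/outcome
    assert app.logged_in                      # Step 2.4: exit condition (end state)

def test_login_alternative_flow_bad_password():
    app = MailApp()
    result = app.login("tester", "wrong")     # Step 2.6: alternative/exception flow
    assert result == "LOGIN_FAILED"
    assert not app.logged_in

if __name__ == "__main__":
    test_login_normal_flow()
    test_login_alternative_flow_bad_password()
    print("Both test cases passed.")
```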
Test Case Format:
• Test Case ID: A unique number or name.
• Test Case Name: Name of the test case (related to the module name).
• Feature to be Tested: Source of the test case / use case ID.
• Test Suite ID: The batch IDs in which this case participates.
• Priority: p0 (basic functionality), p1 (general functionality, e.g. input domain testing, error handling), p2 (cosmetic testing, e.g. user interface testing).
• Environment: Hardware and software required for this test case.
• Test Effort: Person-hours (e.g. 20 minutes).
• Test Duration: Date of execution.
• Test Setup: Necessary tasks to do before starting execution of this case.
• Test Procedure: A step-by-step procedure to execute this case, recorded as:
  Step No. | Action | Input Required | Output / Outcome | Result | Defect ID | Comments
• Test Case Pass or Fail Criteria: When the test case passes and when it fails.
2. Input Domain Based Test Case Design Method: From the SRS use cases a test engineer prepares the business-logic-based test cases, but use cases generally describe only the flow of functionality. To prepare input domain test cases, test engineers also need the type and size of the input objects, which they collect from the "Data Model of the Project" (E-R diagrams). To study the data model, test engineers follow the approach below:
1. Collect the required data model design with respect to the modules they are responsible for.
2. Study the data model in terms of type, size and constraints (primary key, etc.).
3. Identify the critical attributes that participate in manipulations, retrievals and other calculations.
4. Identify the non-critical attributes that are only input and output (e.g. name).
5. Prepare a data matrix for the input objects, both critical and non-critical.
Data Matrix format (one row per input attribute; a sketch of filling one row follows below):

Attribute / Input Object | ECP (Valid) | ECP (Invalid) | BVA (Minimum) | BVA (Maximum)
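Once the type and size of an attribute are known from the data model, the data matrix row can be filled almost mechanically. The sketch below derives equivalence-class and boundary values for a hypothetical numeric field; the field name and its 18..60 range are assumptions, not taken from the document.

```python
# Illustrative sketch: deriving data matrix entries (ECP and BVA) for a
# hypothetical numeric attribute, e.g. "age" constrained to 18..60 in the data model.

def data_matrix_row(name: str, minimum: int, maximum: int) -> dict:
    return {
        "attribute": name,
        "ecp_valid": f"integers {minimum}..{maximum}",
        "ecp_invalid": "non-integers, blanks, out-of-range integers",
        "bva_values": [minimum - 1, minimum, minimum + 1,   # around the lower bound
                       maximum - 1, maximum, maximum + 1],  # around the upper bound
    }

if __name__ == "__main__":
    row = data_matrix_row("age", 18, 60)
    for key, value in row.items():
        print(f"{key:12s}: {value}")
```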
3. User Interface Based Test Cases: After completing the business logic based and input domain based test case design, test engineers concentrate on user interface based test cases to conduct user interface testing. For this type of test case, test engineers do not maintain any fixed format. Examples:
• Spelling and grammar checking.
• Graphics checks (size, font, style, color and other MS-6 rules).
• Meaningful error messages with respect to the end-user level.
• Accuracy of data displayed on the screen.
• Accuracy of data in the database as a result of user input.
• Accuracy of data in the database as a result of external factors (importing and exporting of files).
• Meaningful help messages (manuals support testing).
These test cases are prepared based on our user interface conventions, global user interface rules and the interests of the customer-side people.
Defect Reporting: During test execution, test engineers report mismatches as defects to the development people. To report defects they follow the IEEE format:
1. Defect ID: A unique number.
2. Defect Description: Summary of the defect.
3. Build Version ID: The build in which the defect was found.
4. Feature: Source module (or) source functionality of the defect.
5. Test Case Name: Name of the test case that failed during testing.
6. Reproducible (Yes/No): If yes, the defect appears regularly.
7. If Yes, attach the test procedure: added to the report so the developer can reproduce the defect.
8. If No, attach a strong reason and a snapshot (print-screen copy).
9. Severity: Show Stopper / Fatal / High; Critical / Major / Medium; Non-Critical / Minor / Low.
10. Priority: High / Medium / Low.
11. Status: New / Reopen.
12. Reported By: Name of the test engineer.
13. Reported On: Date of submission.

Filled in by the development team:
14. Assigned To: PM / Team Lead.
15. Fixed By: (accepted or rejected by PM / TM).
16. Resolution Type:
17. Resolved By: Developer name.
18. Resolved On: Date.
19. Approved By: PM name.
20. Approved On: Date.

Defect Age: The time gap between "Resolved On" and "Reported On".
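The fields above map naturally onto a small record type. The sketch below is only an illustration of the format and of how defect age can be computed; the class name, the chosen subset of fields and the example dates are assumptions.

```python
# Minimal sketch of a defect report record and the "defect age" calculation
# (Resolved On minus Reported On). Class and field names are illustrative only.

from dataclasses import dataclass
from datetime import date

@dataclass
class DefectReport:
    defect_id: str
    description: str
    build_version_id: str
    severity: str                  # e.g. "High", "Medium", "Low"
    priority: str                  # e.g. "High", "Medium", "Low"
    status: str                    # e.g. "New", "Reopen", "Closed"
    reported_on: date
    resolved_on: date | None = None

    def defect_age_days(self) -> int | None:
        """Time gap between Resolved On and Reported On, in days."""
        if self.resolved_on is None:
            return None
        return (self.resolved_on - self.reported_on).days

if __name__ == "__main__":
    bug = DefectReport("DEF_BNK_014", "Balance not updated after transfer",
                       "Build_3.2", "High", "High", "Closed",
                       reported_on=date(2024, 1, 10), resolved_on=date(2024, 1, 14))
    print(bug.defect_age_days())   # 4 (days)
```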
Bug Life Cycle: Defect Detection -> Defect Reproduction -> Defect Reporting -> Bug Fixing -> Bug Resolving -> Bug Closing.
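The life cycle above can be modelled as a simple forward-only sequence of stages. The sketch below is illustrative only: the stage names follow the list above, while the enum and the transition check are assumptions rather than part of the process description.

```python
# Illustrative sketch: modelling the bug life cycle stages above and checking
# that a defect only moves forward through them. The enum and helper are assumptions.

from enum import IntEnum

class BugStage(IntEnum):
    DETECTION = 1
    REPRODUCTION = 2
    REPORTING = 3
    FIXING = 4
    RESOLVING = 5
    CLOSING = 6

def is_valid_transition(current: BugStage, nxt: BugStage) -> bool:
    """Allow only a move to the immediately following stage."""
    return nxt == current + 1

if __name__ == "__main__":
    print(is_valid_transition(BugStage.REPORTING, BugStage.FIXING))   # True
    print(is_valid_transition(BugStage.DETECTION, BugStage.CLOSING))  # False
```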
Defect Submission Process: Test Engineer -> Test Lead -> Project Manager -> Team Lead -> Developer, with transmittal reports passed between the testing and development teams.
Types of Defects:
1. User Interface Defects.
2. Boundary Related Defects.
3. Error Handling Defects.
4. Calculation Defects.
5. Service Level Defects.
6. Race Conditions (deadlock, improper session closing, compatibility problems).
7. Load Condition Defects (buffer overflow, memory leakage, queue overflow).
8. Hardware Defects.
9. Source-Level Defects (e.g. user manuals).
Types of Defects (continued):
10. Version Control Defects (mismatches between two consecutive versions).
11. ID Control Defects (version number mistakes, logo mistakes, missing logo, etc.).

Test Execution Closure: After completing all possible test execution cycles and reporting, the testing team concentrates on a test execution closure review.

Level 3: Before user acceptance testing, the testing team concentrates on pre-acceptance (final regression) testing.

User Acceptance Testing: After completion of regression testing, the testing team gives the green signal for UAT. The two types of UAT are:
1. Alpha Testing: Acceptance by customer-side people at the development site.
2. Beta Testing: Acceptance by customer-side people at the customer site.

Sign Off: Depending on the feedback from customer-side people, test management concentrates on the testing sign-off, with a list of external documents to release:
1. Testing Methodology / Strategy.
2. System Test Plan.
3. Traceability Matrix.
4. Automated Test Scripts.
5. Final Bug Summary Report.

Final Bug Summary Report format:
Bug ID / Description | Feature | Test Case | Reported By | Status (Closed/Deferred) | Comments