Test Planning

• Risk Concepts and Vocabulary
• Risks Associated with Software Development
• Risks Associated with Software Testing
• Risk Analysis
• Risk Management
• Prerequisites to Test Planning
• Create the Test Plan

Risk Concepts and Vocabulary



• Test Case – Test cases are how testers validate that a software function or a structural attribute of the software meets the software specifications.
• Test Data – Test data is the information used to build a test case.
• Test Script – Test scripts are an online entry of test cases in which both the sequence of entering the test cases and the structure of the online entry system must be validated.
• Risk – The potential loss to an organization. Risk can be measured by performing a risk analysis.
• Risk Analysis – The process of evaluating risks, threats, controls, and vulnerabilities.
• Threat – Something capable of exploiting a vulnerability in the security of a computer system or application. Threats include both hazards and events that can trigger flaws.
• Vulnerability – A flaw in the system of controls that enables a threat to be exploited.
• Control – Anything that tends to reduce risk.
• Damaging Event – The materialization of a risk to an organization's assets.



To identify the risks in a computerized environment, a tester must:
  – Identify the risks
  – Estimate the severity of each risk
  – Develop tests to substantiate the impact of each risk on the application

Risks Associated with Software Development

The risks associated with software development are:
  • Improper use of technology
  • Repetition of errors
  • Cascading of errors
  • Illogical processing
  • Inability to translate user needs into technical requirements
  • Inability to control technology
  • Incorrect entry of data (conditions that can cause incorrect entry of data)
  • Concentration of data (conditions that can create problems attributed to the concentration of data)
  • Inability to react quickly (conditions that can cause computerized applications to react slowly)
  • Inability to substantiate processing
  • Concentration of responsibilities
  • Erroneous or falsified input data
  • Misuse by authorized end users
  • Uncontrolled system access
  • Ineffective security and privacy practices for the application
  • Procedural errors during operations
  • Program errors
  • Operating system flaws
  • Communications system failure

Risks Associated with Software Testing

• When conducting risk analysis, two components are taken into consideration:
  – The probability that the negative event will occur
  – The potential loss or impact associated with the event

• It is the test manager's responsibility to determine how to apply the test methodology to achieve the greatest level of confidence in the application under development. The test manager must determine the appropriate amount of testing to perform based upon the risks associated with the application. The test manager is also responsible for identifying potential risks that might impact testing:
  – Not enough training / lack of test competency
  – "Us versus them" mentality
  – Lack of test tools
  – Lack of management understanding and support of testing
  – Lack of customer and user involvement
  – Not enough schedule or budget for testing
  – Over-reliance on independent testers
  – Rapid change
  – Testers are in a lose-lose situation
  – Having to say "no"
  – Test environment
  – New technology
  – New development processes

• Premature Release Risk – defined as releasing software under any of the following conditions:
  – The requirements were implemented incorrectly
  – The test plan has not been completed
  – Defects uncovered in testing have not been corrected
  – The software released into production contains defects; although testing is not complete, the defects within the software system may not be known

Risk Analysis

• The objective of performing risk analysis as part of test planning is to help allocate limited test resources to those software components that pose the greatest risk to the organization.
• Performing risk analysis during test planning is a four-step process:
  – Form the risk analysis team
  – Identify risks
  – Estimate the magnitude of each risk
  – Select testing priorities



• Skills needed for members of a risk analysis team:
  – Knowledge of the user application
  – Understanding of risk concepts
  – Ability to identify controls
  – Familiarity with both application and information services risks
  – Understanding of information services concepts and systems design
  – Understanding of computer operations procedures



• Candidates for the risk analysis team should come from the user area or any of the following areas:
  – Software testing
  – Risk consultant
  – Project leader



• The risk team can use either of the following two methods to identify risks (a sketch of the checklist approach follows this list):
  – Risk analysis scenario – the risk team "brainstorms" the potential application risks using their experience, judgment, and knowledge of the application area.
  – Risk checklist – the risk team is provided with a list of the more common risks that occur in automated applications and, from this list, selects those risks that are applicable to the application.
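As an illustration only (not part of the original material), a risk checklist can be represented as a simple list from which the team selects the applicable entries. The risk names below are taken from the software development risk list earlier in this document; the data structure and selection logic are assumptions made for this sketch.

```python
# Hypothetical sketch: a risk checklist and a helper for selecting the
# risks the team judges applicable. The structure is illustrative only.

COMMON_RISKS = [
    "Improper use of technology",
    "Cascading of errors",
    "Incorrect entry of data",
    "Uncontrolled system access",
    "Communications system failure",
]

def select_applicable_risks(checklist, is_applicable):
    """Return the subset of checklist risks judged applicable to the application."""
    return [risk for risk in checklist if is_applicable(risk)]

# Example: the team decides only data- and access-related risks apply.
applicable = select_applicable_risks(
    COMMON_RISKS,
    lambda risk: "data" in risk.lower() or "access" in risk.lower(),
)
print(applicable)  # ['Incorrect entry of data', 'Uncontrolled system access']
```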

Risk Analysis (Continued)

• The magnitude of a risk can be determined by any of the following means (a worked sketch follows this list):
  – Intuition and judgment – one or more individuals state that they believe the risk is of a certain magnitude.
  – Consensus – a team or group of people agrees on the severity or magnitude of a risk.
  – Risk formula – the probability or frequency of the event occurring is determined, along with the loss per occurrence.
  – Annual Loss Expectation (ALE) estimation:
    • Make a preliminary assessment of the loss.
    • Make a preliminary assessment of frequency using a frequency table of multiples of five as in Figure 32.
    • Calculate an ALE using the loss and frequency tables in Figure 31 and Figure 32.
    • The ALE estimate can be refined by:
      – Using multiple opinions to establish the most likely ALE.
      – Ranking intangible elements (see Figure 33).
      – Assigning monetary values to intangibles.
      – Accumulating additional evidence, if needed.
      – Making a post-analysis challenge of the decision.
      – Documenting the decision reached.
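As a minimal, hedged illustration (not part of the original slides; the loss and frequency tables referenced as Figures 31–33 are not reproduced here), the risk formula and ALE estimation reduce to multiplying an estimated annual frequency by an estimated loss per occurrence:

```python
# Illustrative sketch of the risk formula / ALE calculation described above.
# The frequency and loss values are invented examples, not values from the
# referenced figures.

def annual_loss_expectation(frequency_per_year: float, loss_per_occurrence: float) -> float:
    """ALE = expected occurrences per year * loss per occurrence."""
    return frequency_per_year * loss_per_occurrence

# Example: an event expected about 5 times a year, costing roughly $2,000 each time.
ale = annual_loss_expectation(frequency_per_year=5, loss_per_occurrence=2_000)
print(f"Estimated ALE: ${ale:,.0f}")  # Estimated ALE: $10,000
```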



• Considerations that may impact the prioritization of risks:
  – Compliance required by laws and regulations
  – Impact on competitiveness
  – Impact on ethics, values, and image

Risk Management

• Risk management is the totality of activities used to minimize both the frequency and the impact associated with risks.
• There are two activities associated with risk management:
  – Risk reduction methods
  – Contingency planning



• Risk Reduction Methods (a sketch of this comparison follows)
  – The loss due to a risk equals the frequency of occurrence multiplied by the loss per occurrence.
  – This can then be compared with the estimated cost of implementing a control, to see whether it is economical to implement the control to reduce the loss.
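As a simple illustration of that comparison (all values are invented; this is not a prescribed calculation from the source):

```python
# Illustrative sketch of the risk reduction comparison described above:
# a control is economical when the loss it is expected to prevent exceeds
# its cost. All values are invented examples.

def loss_due_to_risk(frequency_per_year: float, loss_per_occurrence: float) -> float:
    return frequency_per_year * loss_per_occurrence

def control_is_economical(expected_loss_prevented: float, annual_control_cost: float) -> bool:
    return expected_loss_prevented > annual_control_cost

loss_without_control = loss_due_to_risk(frequency_per_year=10, loss_per_occurrence=1_500)  # $15,000
loss_with_control = loss_due_to_risk(frequency_per_year=2, loss_per_occurrence=1_500)      # $3,000
print(control_is_economical(loss_without_control - loss_with_control,
                            annual_control_cost=5_000))  # True: $12,000 prevented > $5,000 cost
```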



• Contingency Plans
  – Plans established for activation when a loss is known to have occurred for a given risk.
  – The role of testers is to evaluate the adequacy of the contingency plans associated with each risk.

Prerequisites to Test Planning

• There are five prerequisites to test planning:
  – Test objectives
  – Acceptance criteria
  – Assumptions
  – People issues
  – Constraints



• Test Objectives
  – Test to assure that the software development project objectives were met.
  – Test to achieve the mission of the software testing group.
  – Test the functional and structural objectives of the software – "meeting requirements".
  – Test to meet the needs of the users – "fit for use".



• Acceptance Criteria
  – A method of communicating the characteristics of the desired software system.
  – Acceptance criteria can be defined by the development staff or by the end user.
  – Their purpose is to facilitate good communication between the IT organization and its users.



• Assumptions
  – Assumptions should be documented for two reasons:
    • To assure that they are effectively incorporated into the test plan.
    • To allow each assumption to be monitored in case the event it depends on does not occur.

Prerequisites to Test Planning (Continued)

• People Issues
  – Tend to be both political and personal.
  – Should be resolved before starting development.
  – Typical issues:
    • Who should run the project
    • Who can make decisions
    • Which organizational group has authority to decide requirements
    • "It's been tried before and didn't work"



  – When identifying issues, people can be divided into four categories:
    • People who will make the software system happen.
    • People who hope the software system happens.
    • People who will let the software system happen.
    • People who will attempt to keep the software system from happening.

• Constraints
  – Three common constraints: test staff size, test schedule, and budget.
  – Constraints must be integrated into the test plan.

Create the Test Plan

• The test plan describes how the testing will be accomplished.
• The test plan should be an evolving document.
• The test plan should provide background information on the software being tested, test objectives and risks, and the specific tests to be performed.
• Tests in the test plan should be (a sketch follows this list):
  – Repeatable
  – Controllable
  – Able to ensure adequate test coverage
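Purely as an illustrative aside (not from the original slides), "repeatable" and "controllable" usually mean that a test fixes its own inputs and expected results rather than depending on external state. A minimal sketch in Python's unittest style, using an invented calculate_discount function:

```python
# Hypothetical sketch of a repeatable, controllable test case.
# calculate_discount is an invented example function, not from the source.
import unittest

def calculate_discount(order_total: float) -> float:
    """Example function under test: 10% discount on orders of $100 or more."""
    return round(order_total * 0.10, 2) if order_total >= 100 else 0.0

class TestCalculateDiscount(unittest.TestCase):
    def test_discount_applied_at_threshold(self):
        # Repeatable: fixed test data, no dependence on clock, network, or shared state.
        self.assertEqual(calculate_discount(100.00), 10.00)

    def test_no_discount_below_threshold(self):
        # Controllable: the input fully determines the expected output.
        self.assertEqual(calculate_discount(99.99), 0.0)

if __name__ == "__main__":
    unittest.main()
```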

• Understand the characteristics of the software being developed:
  – Define what it means to meet the project objectives.
  – Understand the core business areas and processes.
  – Assess the severity of potential failures.
  – Identify the components of the system:
    • Links to core business areas or processes
    • Platform, languages, and database management systems
    • Operating system software and utilities
    • Telecommunications
    • Internal and external interfaces
    • Owners
    • Availability and adequacy of source code and associated documentation
  – Assure requirements are testable.
  – Address implementation schedule issues:
    • Implementation checkpoints
    • Meetings
    • Identification and selection of conversion facilities
    • Time needed to put converted systems into production
    • The conversion of backup and archival data
  – Address interface and data exchange issues:
    • Development of a model showing the internal and external dependency links among core business areas, processes, and information systems
    • Notification of all outside data exchange entities
    • Data bridges and filters
    • Contingency plans if no data is received from an external source
    • Validation process for incoming external data
    • Contingency plans for invalid data

Create the Test Plan (Continued)

  – Evaluate contingency plans for this system and its activities.
  – Identify vulnerable parts of the system and processes operating outside the information resource management area.

• Build the Test Plan – the development of an effective test plan involves the following steps.
  – Set test objectives
    • Test objectives should restate the project objectives from the project plan.
    • When defining test objectives, it is best to keep them to ten or fewer.
    • To define test objectives, testers need to (a sketch follows this list):
      – Define each objective so that it can be referenced by a number.
      – Write each test objective as a measurable statement.
      – Assign a priority to each objective (high, medium, low).
      – Define the acceptance criteria for each objective.
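Purely for illustration (the field names and example values are assumptions, not from the source), a test objective with those four attributes might be recorded like this:

```python
# Hypothetical sketch of recording the test objective attributes listed above:
# a reference number, a measurable statement, a priority, and acceptance criteria.
from dataclasses import dataclass

@dataclass
class TestObjective:
    number: int               # referenced by number
    statement: str            # written as a measurable statement
    priority: str             # "High", "Medium", or "Low"
    acceptance_criteria: str  # how we know the objective has been met

objectives = [
    TestObjective(
        number=1,
        statement="All order-entry functions respond within 2 seconds",
        priority="High",
        acceptance_criteria="95% of timed test transactions complete in 2 seconds or less",
    ),
]
print(objectives[0].statement)
```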

  – Develop the test matrix
    • The test matrix lists which software functions must be tested and the tests available to test them.
    • It shows "how" the software will be tested.
    • Define tests as required.
    • Define conceptual test cases to be entered as a test script.
      – A conceptual test script is a high-level description of the test objectives, not the specific test cases that will be entered during online testing.
    • Define verification tests.
      – Verification is a static test performed on a document developed by the team responsible for creating the software.
    • Prepare the software test matrix (a sketch follows).
      – The objective of this matrix is to illustrate how to document which test cases test which software function and which structural attribute.
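As a hedged illustration only (the function and test case names are invented), a software test matrix can be documented as a simple mapping of software functions to the test cases that exercise them:

```python
# Hypothetical sketch of a software test matrix: rows are software functions,
# columns are test cases, and an "X" marks which test cases cover which
# function. Function and test case names are invented examples.

TEST_CASES = ["TC-01", "TC-02", "TC-03"]
TEST_MATRIX = {
    "Enter order":      [True,  True,  False],
    "Apply discount":   [False, True,  False],
    "Generate invoice": [False, False, True],
}

def print_matrix(matrix, test_cases):
    print(f"{'Software function':<20}" + "".join(f"{tc:>8}" for tc in test_cases))
    for function, coverage in matrix.items():
        row = "".join(f"{'X' if covered else '-':>8}" for covered in coverage)
        print(f"{function:<20}{row}")

print_matrix(TEST_MATRIX, TEST_CASES)
```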



  – Define the test administration
    • The administrative component of the test plan identifies the schedule, milestones, and resources needed to execute the test plan.
    • State the test plan general information, which normally includes:
      – Software project
      – Summary
      – Pretest background
      – Test environment
      – Test constraints
      – References
      – When to stop

    • Define test milestones
      – Test milestones indicate the planned start and completion dates of each test (a sketch follows).
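For illustration only (the test names and dates are invented), test milestones can be tracked as simple start/completion pairs:

```python
# Hypothetical sketch: recording the planned start and completion date of
# each test, as described above. Test names and dates are invented examples.
from datetime import date

test_milestones = {
    "Unit test - order entry":      (date(2019, 11, 4), date(2019, 11, 8)),
    "Integration test - invoicing": (date(2019, 11, 11), date(2019, 11, 15)),
    "User acceptance test":         (date(2019, 11, 18), date(2019, 11, 22)),
}

for test_name, (start, finish) in test_milestones.items():
    print(f"{test_name}: {start.isoformat()} -> {finish.isoformat()}")
```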

Create the Test Plan (Continued)

• Write the Test Plan
  – Guidelines to writing the test plan:
    • Start early.
    • Keep the test plan flexible.
    • Review the test plan frequently.
    • Keep the test plan concise and readable.
    • Calculate the planning effort.
    • Spend the time to do a complete test plan.



  – The test plan usually contains the following:
    • Test Scope
    • Test Objectives
    • Assumptions
    • Risk Analysis
    • Test Design
    • Roles & Responsibilities
    • Test Schedule & Resources
    • Test Data Management
    • Test Environment
    • Communication Approach
    • Test Tools
