Test Design
Owners and List of Contacts

Name               Email               Phone                                                     Role
John Doe           [email protected]    303-894-7315 (w), 303-471-8344 (h), 303-203-5567 (pgr)    Project Manager / Development Lead
Joe Tester                                                                                       System Test Lead
Jane ProdSupport                                                                                 Production Support Mgr
Joe UserMgr                                                                                      User Test Lead
Joe Developer                                                                                    Developer – Presentation Tier
Jane Developer                                                                                   Developer – Business Tier
Joe DBA                                                                                          Database Administrator
Joe Tester                                                                                       Tester
Jane Tester                                                                                      Tester
Joe Customer                                                                                     Department VP
Jane Customer                                                                                    Department Mgr
Josey Customer                                                                                   Product Support
Signoffs

Phase         Name                                         Date        Signature
Test Design   John Doe, PM/DM                              xx/xx/xx
Test Design   Joe Tester, System Test Lead
Test Design   Jane ProdSupport, Production Support Mgr
Test Design   Joe User Mgr, UM
Test Design   Joe Customer, Customer
Revision History
Date         Reason for Change(s)   Author(s)
09/15/1988   First draft            John Doe
Table of Contents

Test Design
Owners and List of Contacts
Signoffs
Revision History
1. Introduction
   1.1 Background
   1.2 References
   1.3 Code Freeze Date
   1.4 Change Control
2. Items to Be Tested
   2.1 Level of Testing
   2.2 Features to Be Tested
   2.3 Test Case Matrix
   2.4 Features Excluded from Testing
3. Testing Approach
   3.1 Test Deliverables
   3.2 Defect Tracker Setup
4. Release Criteria
   4.1 Test Case Pass/Fail Criteria
   4.2 Suspension Criteria for Failed Smoke Test
   4.3 Resumption Requirements
   4.4 Release to User Acceptance Test Criteria
   4.5 Release to Production Criteria
5. Hardware
   5.1 Servers
   5.2 Server Configuration
   5.3 Other Servers
   5.4 Clients
6. Project Plan
   6.1 Critical Dates
   6.2 Budget Information
1. Introduction

The primary goal of this document is to establish a plan for the activities that will verify [product name] as a high-quality product that meets the needs of the [product name] business community. These activities focus on identifying the following:

• Items to be tested
• Testing approach
• Roles and responsibilities
• Release criteria
• Hardware
• Project plan and budget
• Risks and contingencies
1.1 Background

Write one or two sentences that describe the system to be tested. Example: the Defect Tracker system is a sophisticated bug-tracking tool that allows clients to significantly increase the quality of their software deliverables. This is a new product, so no backwards compatibility is necessary.
1.2 References

List all reference material you used in creating this plan. Example:

1. Functional Specification, Program Management, xxx 1999
2. Testing Computer Software, Second Edition, Kaner / Falk / Nguyen, 1993
3. Detailed Design, Program Management, xxx 1999
1.3 Code Freeze Date

Production code for [product name] will be frozen on mm/dd/yy. Our assumption is that any production code changes made after that date are outside the responsibility of this development project.
1.4 Change Control

After baseline, all changes must be approved and documented by the change control board. If it is agreed that a change is necessary, the impact on development and testing must be agreed upon by the test lead, development lead, and project manager. This may (or may not) affect the planned completion date of the project.
2. Items to Be Tested

2.1 Level of Testing

Below is a list of services that testing may provide. Next to each service is the degree of testing that we will perform. The valid levels are:

High – high-risk area; test this area very hard
Medium – standard testing
Low – low-risk area; test if time allows
None – no testing desired
For each service below, enter the level desired.

Performance Testing:
Performance testing ensures that the application responds within the time limits set by the user. If this is needed, the client must supply the benchmarks to measure against, and we must have a hardware environment that mirrors production.

Windows / Internet GUI Standards:
This testing ensures that the application has a standardized look and feel. It may be as simple as ensuring that accelerator keys work properly and that font type and size are consistent, or as exhaustive as ensuring that the application could be assigned a Windows logo if submitted for one (there are strict guidelines for this). Note: if this level of testing is needed, the client must provide their standards so that we can compare against them. A good book that explains Microsoft standards is The Interface Guidelines for Software Design, from Microsoft Press.

Platform Testing:
Platform testing warrants that the application will run on multiple platforms (Win 95/98, Win NT, IE 4.0, IE 5.0, Netscape, etc.). (Specify which ones.)

Localization:
Localization testing guarantees that the application will work properly in different languages (e.g., Win 95/98 English, German, Spanish). This also involves ensuring that dates work in DD/MM/YY format for the UK. (Specify which ones.)

Stress Testing:
Stress testing ensures that the application responds appropriately with many users and activities happening simultaneously. If this is needed, the number of users must be agreed upon beforehand, and the hardware environment for system test must mirror production.

Conversion:
Conversion testing exercises any data that must be converted to ensure the application will work properly. This could be conversion from a legacy system or changes needed for the new schema.

Parallel Testing:
Parallel testing compares the functionality of the updated system with the functionality of the existing system. It is sometimes used to ensure that the changes did not corrupt existing functionality.

Regression of Unchanged Functionality:
If regression must occur for functional areas that are not being changed, specify the functional areas to regress and the level of regression needed (positive only, or positive and negative testing).

Automated Testing:
Automated testing can be used to automate regression and functional testing. This can be very helpful if the system is stable and not changed often. If the application is a new development project, automated testing generally does not pay big dividends.

Installation Testing:
Installation testing exercises the setup routine to ensure that the product can be installed fresh, over an existing copy, and alongside other products. This will test different versions of OCXs and DLLs.

End-to-End / Interface Testing:
End-to-end testing covers all inputs (super-systems) and outputs (sub-systems) along with the application. A controlled set of transactions is used, and the test data is published prior to the test along with the expected results. This testing ensures that the application will interact properly with the other systems.

Usability:
Usability testing ensures that the application is easy to work with, limits keystrokes, and is easy to understand. The best way to perform this testing is to bring in experienced, medium, and novice users and solicit their input on the usability of the application.

User's Guide / Training Guides:
This testing ensures that the user, help, and training guides are accurate and easy to use.

Guerrilla Testing:
Guerrilla testing exercises the system with unstructured scenarios to ensure that it responds appropriately. To accomplish this, you may ask someone to perform a function without telling them the steps for doing it.

Security Testing:
Security testing guarantees that only users with the appropriate authority are able to use the applicable features of the system.

Network Testing:
Network testing determines what happens when different network latency is applied while using the application. It can uncover possible problems with slow network links, etc.

Hardware Testing:
Hardware testing involves testing with a bad disk drive, faulty network cards, etc. If this type of testing is desired, be very specific about what is in scope.

Duplicate Instances of Application:
This testing determines whether bringing up multiple copies of the same application will cause blocking or other problems.

Year 2000:
This testing ensures that the application will work in the year 2000 and beyond.

Temporal Testing:
Temporal testing guarantees that date-centric problems do not occur with the application. For example, if many bills are created quarterly, you may want to set the server date to a quarter-end to test this date-centric event.

Disaster Recovery (Backup / Restore):
This testing aids production support in ensuring that adequate procedures are in place for restoring the system after a disaster.

Input and Boundary Tests:
Testing designed to guarantee that the system allows only valid input. This includes testing to ensure that the maximum number of characters for a field cannot be exceeded, boundary conditions such as valid ranges and "off-by-one", "null", "max", "min", tab order from field to field on the screen, etc. (A brief illustrative sketch of this style of test appears after this list.)

Out of Memory Tests:
Testing designed to ensure that the application runs in the amount of memory specified in the technical documentation. This testing will also detect memory leaks associated with starting and stopping the application many times.
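Purely as a hedged illustration of the input and boundary tests described above, here is a minimal table-driven sketch in Python. The validate_subject function and the 255-character limit are hypothetical stand-ins, chosen only to show the null, minimum, maximum, and off-by-one cases; they are not taken from the functional specification.

    # Minimal sketch of an input/boundary test. The validator below is a
    # stand-in; the real system would be exercised through its own entry points.
    MAX_LEN = 255  # assumed maximum field length, for illustration only

    def validate_subject(text):
        """Accept a subject line only if it is 1..MAX_LEN characters."""
        return 1 <= len(text) <= MAX_LEN

    # Boundary cases: empty, minimum, just under, at, and just over the limit.
    cases = [
        ("", False),                   # null/empty input
        ("a", True),                   # minimum valid length
        ("a" * (MAX_LEN - 1), True),   # just under the boundary
        ("a" * MAX_LEN, True),         # exactly at the boundary
        ("a" * (MAX_LEN + 1), False),  # off-by-one past the boundary
    ]

    for value, expected in cases:
        actual = validate_subject(value)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: len={len(value)} expected={expected} actual={actual}")

The same table-driven shape extends naturally to range checks and tab-order checks: each row is one boundary condition with its expected outcome.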
2.2 Features to Be Tested

Below is a list of features that will be tested. Note: if features are being dropped into production at different times, add a couple of columns to this grid to identify which features will be in each drop. The items below map back to the functional specification.
Business Requirements

Ref. No.   Feature                       Functional Specification

Page Initialization and Defaults
5.2.1      Link to Inquiry page          Access to this page will come from a link off the Resource menu on the Select page.
5.2.1      Page                          User impersonation will not be recognized.
5.2.2.1    User name                     Use the logon ID to retrieve the user's first and last name.
5.2.2.2    User email alias              Use the logon ID to retrieve the user's email address.
5.2.2.3    User company                  Use the logon ID to retrieve the user's company name.
5.2.2.4    Country dropdown              This combo box will be populated from the Country table.
5.2.2.5    Question type dropdown        This combo box will be populated from a hard-coded list.
5.2.2.6    Question sub-type dropdown    This combo box will be populated from a hard-coded list.
5.2.3.1    Reference fields              The reference fields will default to blank.
5.2.3.2    Subject field                 The subject field will default to blank.
5.2.3.3    All combo boxes               All combo box or drop-down controls will be set to unselected.
2.3 Test Case Matrix

Group: Navigation / Security

Test Case 101 (Func Spec: 400 – Group 5.2.1)
  Description: Display Inquiry page from Resource menu on the Select page.
  Expected Result: Inquiry page displayed.

Test Case 102 (Func Spec: 401 – Group 5.2.1)
  Description: Attempt to display the Inquiry page without permission.
  Expected Result: Inquiry page not displayed because the Resource menu cannot be displayed.

Group: User Interface

Test Case 103 (Func Spec: none)
  Description: Tab through the Inquiry page.
  Expected Result: Tab moves from top to bottom and left to right.

Test Case 104 (Func Spec: none)
  Description: Try all browser functions.
  Expected Result: All functions operate in a standard way.

Test Case 105 (Func Spec: none)
  Description: Check all labels for spelling and alignment.
  Expected Result: All labels displayed according to the screen example.

Group: Default Values

Test Case 106 (Func Spec: 402 – Group 5.2.3.3)
  Description: Check selectability of combo boxes (CB) and drop-down controls.
  Expected Result: All CB and drop-down controls are set to unselected.

Test Case 107 (Func Spec: 409 – Group 5.2.2.1)
  Description: Valid user name associated with logon ID.
  Expected Result: User name displayed (see User table).

Test Case 108 (Func Spec: 409 – Group 5.2.2.2)
  Description: Valid email address associated with logon ID.
  Expected Result: User email address displayed (see User table).

Test Case 109 (Func Spec: 409 – Group 5.7.2.1)
  Description: No user email address associated with logon ID.
  Expected Result: Display error message "You must have an email address before the request can be processed."

Test Case 110 (Func Spec: 409 – Group 5.2.2.3)
  Description: Valid company name associated with logon ID.
  Expected Result: Company name displayed (Company table, using a join from the User table).

Test Case 111 (Func Spec: 409 – Group 5.2.4)
  Description: No company name associated with logon ID.
  Expected Result: Display error message "Your company is not registered on [product name] and use of the Inquiry form is not allowed."

Group: Boundary Tests

Test Case 112 (Func Spec: none)
  Description: Display a valid user ID (minimum characters and maximum characters) associated with a logon ID.
  Expected Result: User name displayed.

(A sketch of a matrix-driven automated check appears after this table.)
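Purely as a hedged illustration of how a matrix like this can drive an automated check, the sketch below is a table-driven runner in the spirit of test case 107. The fetch_user_name function and the test data are hypothetical stand-ins, not part of the functional specification; a real runner would call into the application itself.

    # Minimal sketch of a table-driven test runner shaped like the matrix above.
    # The lookup function is a stand-in for the real system under test.
    USERS = {"jdoe": "John Doe"}  # illustrative test data, not from the spec

    def fetch_user_name(logon_id):
        """Stand-in for the application's logon-ID-to-user-name lookup."""
        return USERS.get(logon_id, "")

    # (test case ID, logon ID, expected result) rows, echoing the matrix format.
    matrix = [
        ("107", "jdoe", "John Doe"),   # valid user name associated with logon ID
        ("999", "nobody", ""),         # hypothetical negative case: unknown logon ID
    ]

    for case_id, logon_id, expected in matrix:
        actual = fetch_user_name(logon_id)
        status = "PASS" if actual == expected else "FAIL"
        print(f"Test case {case_id}: {status} (expected={expected!r}, actual={actual!r})")

Keeping the matrix as plain data means new rows can be added without touching the runner, which mirrors how the matrix itself grows during test design.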
2.4 Features Excluded from Testing

Below is a list of features that will not be tested.

Description of Excluded Item          Reason Excluded
Unit testing or "white box" testing   Development is responsible for this. We test using a "black box" approach (i.e., we cannot see the code, but we expect the code to work per the specifications).
3. Testing Approach

The system test team will begin designing their detailed test plans and test cases while the development team is designing and coding. Defect Tracker will be used to enter the test cases and to track the defects; it can be accessed from http://www.defecttracker.com.

The builds will be delivered to system test via Visual SourceSafe (see cover page for VSS location) drops coordinated by the development team. The development team will be responsible for installing the partial new builds into the existing structure of the system test environment and for updating the client machines if necessary. Build notes listing all changes since the last drop and all files to be delivered will accompany each build drop.

Once the build is dropped by the development team, a series of scripts, called the smoke test, will be run to ensure that the shipment from development is in a state that is ready for testing. The smoke test scripts will test the basic functionality of the system. These scripts may be automated once they are successfully performed manually; a brief sketch of such a script appears below. If an excessive number of smoke test items fail, the product will be shipped back to development, and no testing will begin until the smoke test passes.

Once the first drop begins, triage meetings will be held to discuss the bug list with the project/development manager. Triage meetings are used to set priority and severity and to assign bugs. Each week following the first drop, additional drops will be delivered to system test so that the bugs fixed in the prior drops can be retested. Defect Tracker will be used to track, report, and analyze bugs. Prior to triage, a Defect Tracker protocol document will be distributed to the project and development managers to ensure that everyone understands how to use Defect Tracker and how to enter bugs effectively.
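Purely as a hedged illustration of an automated smoke test, here is a minimal sketch in Python, assuming the build exposes a few pages over HTTP. The server name and page list are placeholders, not real [product name] endpoints; the actual smoke scripts would cover whatever basic functionality the team designates.

    # Minimal smoke-test sketch: verify that the core pages of the build respond
    # before deeper testing begins. The URLs below are placeholders.
    from urllib.request import urlopen
    from urllib.error import URLError

    BASE_URL = "http://systest.example.com"  # hypothetical system test server
    PAGES = ["/", "/inquiry", "/select"]     # illustrative basic-functionality pages

    def smoke_test():
        """Return a list of (page, reason) pairs for pages that failed to load."""
        failures = []
        for page in PAGES:
            try:
                # HTTP errors (4xx/5xx) and connection problems both raise
                # URLError (or its subclass HTTPError), so a clean return
                # means the page loaded.
                urlopen(BASE_URL + page, timeout=10)
            except URLError as err:
                failures.append((page, str(err)))
        return failures

    if __name__ == "__main__":
        failed = smoke_test()
        if failed:
            print("Smoke test FAILED; return the build to development:")
            for page, reason in failed:
                print(f"  {page}: {reason}")
        else:
            print("Smoke test passed; the build is ready for testing.")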
3.1 Test Deliverables

Below are the deliverables for each phase of the project.

Pre-Baseline: Project Initiation (Responsible: Test Lead)
Upon receipt of a functional specification, project initiation is performed. This includes finding a test lead for the project and setting up a project in Defect Tracker.

Pre-Baseline: Kick-Off Meeting (Responsible: Test Lead)
This is done to familiarize the project manager with the test methodology and test deliverables, set expectations, and identify next steps.

Pre-Baseline: Functional Requirement Scrubbing (Responsible: Test Lead)
Attend the meetings held to create the functional specifications. Offer suggestions if anything is not testable or is poorly designed.

Pre-Baseline: Create Pre-Baseline Documents (Responsible: Test Lead)
Test Plan. The test plan will break functionality into logical areas (most often specified in the functional specification). Once completed, the project manager, development manager, user project manager, and production support manager will review it. Once reviewed and amended, it must be approved and signed by the test lead, project manager, development manager, production support manager, and user project manager.
Testing Project Plan and Budget. Before completing the project plan, the development project plan must be completed so that we know which dates we are being asked to adhere to. This also guides us in determining whether the test estimates are reasonable. The project plan will be detailed, relating back to the functional specification and test plan.

Post-Baseline: Create Test Cases (Responsible: Tester, guided by the Test Lead)
Once the detailed test plan has been created and reviewed by the test and development teams, test cases are created. Test cases are stored in Defect Tracker. Each test case includes the steps necessary to perform the test and the expected results, and contains (or refers to) any data needed to perform the test and verify that it works.

Post-Baseline: Project and Test Plan Traceability (Responsible: Test Lead)
Review the test plan to ensure all points of the functional specification are accounted for. Likewise, ensure that the test cases have traceability to the test plan and functional spec. Finally, ensure that the project plan has traceability to the test plan.

Once Testing Begins: Triage (Responsible: Test Lead)
Once testing begins, triage meetings are held to prioritize and assign bugs. Triage is conducted by the test lead and includes the project manager and development lead. Once user testing begins, the user project manager also attends. Triage meetings are usually held 2 to 5 times per week, depending on need.

Bi-Weekly: Update Project Plan and Budget (Responsible: Test Lead)
Update the project plan with the % complete for each task and enter notes regarding critical issues that arise. Also determine whether the test effort is on budget.

Weekly: Weekly Status Report (Responsible: Test Lead)
The project manager will specify who is to receive the weekly status report. This report identifies the % complete of all tasks due that week, the tasks to be worked on in the next week, metrics indicating the testing statistics (once testing begins), budgeting information, and any issues or risks that need to be addressed. This information can be generated from Defect Tracker.

Before Sending to UAT: Release to UAT Report (Responsible: Test Lead)
Once system testing is done and the code is ready for user acceptance testing (UAT), the test lead creates a report that summarizes the activities and outlines any areas of risk. It is created from a template (accessible from our web site), and all assumptions are listed.

Project Completion: Post-Mortem Analysis (Responsible: Test Lead, Development Team, User Team)
This is done to analyze how well our testing process worked.
3.2 Defect Tracker Setup

The test lead will create a project in Defect Tracker so that bugs can be tracked. The project name in Defect Tracker will be [projectname].
4. Release Criteria

4.1 Test Case Pass/Fail Criteria

The feature will pass or fail depending upon the results of testing actions. If the actual output from an action is equal to the expected output specified by a test case, then the action passes. Should any action within a test case fail, the entire feature or sub-feature fails. The specific criteria for test case failure will be documented in Defect Tracker.

If a test case fails, it is not assumed that the code is defective. A failure can only be interpreted as a difference between expected results, which are derived from project documentation, and actual results. There is always the possibility that expected results are in error because of misinterpreted, incomplete, or inaccurate project documentation.

Pass criteria:
• All processes will execute with no unexpected errors.
• All processes will finish update/execution in an acceptable amount of time, based on benchmarks provided by the business analysts and documented by the development team.

(A sketch of both checks appears below.)
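Purely as a hedged illustration of these two pass criteria, the sketch below checks both conditions: output correctness and elapsed time against a benchmark. The run_process function and the 2.0-second benchmark are placeholders, not values from the project documentation.

    # Minimal sketch of the pass/fail evaluation described above: an action
    # passes only if its actual output matches the expected output AND it
    # finishes within the agreed benchmark time. Both values are placeholders.
    import time

    BENCHMARK_SECONDS = 2.0  # assumed acceptable time, supplied by business analysts

    def run_process():
        """Stand-in for a process under test; returns its output."""
        return "expected output"

    def evaluate(expected):
        start = time.monotonic()
        actual = run_process()
        elapsed = time.monotonic() - start
        correct = (actual == expected)
        fast_enough = (elapsed <= BENCHMARK_SECONDS)
        return correct and fast_enough, actual, elapsed

    passed, actual, elapsed = evaluate("expected output")
    print(f"{'PASS' if passed else 'FAIL'}: actual={actual!r}, elapsed={elapsed:.3f}s")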
4.2 Suspension Criteria for Failed Smoke Test

The system test team may suspend partial or full testing activities on a given build if any of the following occurs:

• Files are missing from the new build.
• The development team cannot install the new build or a component.
• The development team cannot configure the build or a component.
• There is a fault with a feature that prevents its testing.
• The item does not contain the specified change(s).
• An excessive number of bugs that should have been caught during the component/unit test phase are found during more advanced phases of testing.
• A severe problem has occurred that does not allow testing to continue.
• Development has not corrected the problem(s) that previously suspended testing.
• A new version of the software is available to test.
4.3 Resumption Requirements

The steps necessary to resume testing:

• Clean the previous code from the machines.
• Re-install the item.
• Correct the problem that resulted in the suspension.

Resumption of testing will begin when the following is delivered to the system test team:

• A new build via Visual SourceSafe.
• A list of all bugs fixed in the new version.
• A list of all changes to the modules in the new version and what functionality they affect.
4.4 Release to User Acceptance Test Criteria

The release criteria necessary to allow the code to migrate to user acceptance testing are as follows:

• There are no open bugs with a severity of 1 or 2.
• Test cases scheduled for both the integration and system test phases have passed.
• The build successfully passes the final regression testing.
• There are no discrepancies between the master setup and the version used during the final regression testing.
4.5 Release to Production Criteria

The release criteria necessary to allow the code to migrate to production are as follows:

• There are no open bugs with a severity of 1 or 2.
• Test cases scheduled for both the integration and system test phases have passed.
• The build successfully passes the final regression testing.
• There are no discrepancies between the master setup and the version used during the final regression testing.
• The user acceptance test was successfully completed.
• The user acceptance criteria were met.
5. Hardware

5.1 Servers

Machine Name   Purpose                        Owner       Hardware
SYSTEST        System test machine            John Doe    2 x P133, 256MB RAM
UAT            User acceptance test machine   Jane Doe    2 x P200, 256MB RAM
PROD           Production machine             Kevin Doe   4 x P600, 1GB RAM
5.2 Server Configuration

SYSTEST: SQL Server Service Pack 2, NT Service Pack 5
UAT:     SQL Server Service Pack 2, NT Service Pack 5
PROD:    SQL Server Service Pack 2, NT Service Pack 5

(A brief environment-parity sketch follows.)
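Because the release criteria in sections 4.4 and 4.5 require that the master setup and the tested version not drift apart, a configuration-parity check can be scripted. The sketch below is only illustrative: it restates the table above as data rather than querying the servers, and the dictionary keys are invented names.

    # Minimal sketch: confirm the three environments carry the same service-pack
    # configuration, per the "no discrepancies" release criteria.
    CONFIGS = {
        "SYSTEST": {"sql_server_sp": 2, "nt_sp": 5},
        "UAT":     {"sql_server_sp": 2, "nt_sp": 5},
        "PROD":    {"sql_server_sp": 2, "nt_sp": 5},
    }

    baseline = CONFIGS["PROD"]  # treat production as the master setup
    for env, config in CONFIGS.items():
        drift = {k: (v, baseline[k]) for k, v in config.items() if v != baseline[k]}
        if drift:
            print(f"{env}: DISCREPANCY vs PROD -> {drift}")
        else:
            print(f"{env}: matches production configuration")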
5.3 Other Servers

None.
5.4 Clients

Machine Name                  Machine Location   Operating System   Hardware
ClientTester1 (file server)   Server Room 14     NT 3.51 Server     PIII-400, 65MB RAM
ClientTester2 (client)        Server Room 12     95 US English      PII-500, 16MB RAM
ClientTester3 (client)        Server Room 10     95 UK English      P-300, 40MB RAM
ClientTester4 (client)        Server Room 14     95 German          P-100, 32MB RAM
ClientTester5 (client)        Server Room 14     NT 3.51 Wkst       P-200, 32MB RAM
ClientTester6 (client)        Test Lab           NT 4.0 Wkst        P-300, 48MB RAM
6. Project Plan

The project plan is created using Microsoft Project 98 and is linked into the project manager's project plan, which eliminates the need to keep a separate copy updated for the project manager. Note: make sure that the project plan appears on your web site and that it contains cost information as well as dates. Include a summarized (collapsed) view of the project below, followed by a view of the cost information. Below is an example. You can copy/paste this information from Project 98 by placing these fields next to each other in the Gantt view, selecting the rows, and using the copy/paste functionality.
6.1 Critical Dates

Task                                      Work     Begin          End
[product name] Project Timeline           2,980h   Mon 11/24/97   Fri 5/8/98
Test Planning                             1,254h   Mon 11/24/97   Fri 5/8/98
Resource Availability                     40h      Mon 12/29/97   Wed 1/14/98
Test Lead Activities                      452h     Mon 11/24/97   Fri 5/8/98
Box 1 - Detailed Test Plans/Test Cases    762h     Mon 11/24/97   Fri 2/6/98
**ProdSupp** PO Copy                      48h      Mon 1/5/98     Mon 1/12/98
Regression Testing I                      232h     Mon 11/24/97   Mon 1/12/98
Functionality 1                           58h      Tue 1/13/98    Thu 1/22/98
Functionality 2                           224h     Wed 12/31/97   Fri 2/6/98
Interfaces / End-to-End Testing           200h     Thu 1/15/98    Thu 2/5/98
System Testing                            1,102h   Mon 2/9/98     Thu 4/30/98
Original Test Items                       344h     Mon 2/9/98     Wed 3/4/98
Receive Drop from Development             0h       Mon 2/9/98     Mon 2/9/98
Install Build                             16h      Mon 2/9/98     Tue 2/10/98
Build Verification Test                   32h      Wed 2/11/98    Thu 2/12/98
Regression Testing                        48h      Fri 2/13/98    Fri 2/20/98
Functionality 1                           24h      Mon 3/2/98     Wed 3/4/98
Functionality 2                           136h     Wed 2/11/98    Fri 2/27/98
Interface Testing                         88h      Wed 2/11/98    Wed 2/25/98
Extended Test Cases                       758h     Mon 2/23/98    Thu 4/30/98
End-to-End Testing                        472h     Thu 3/19/98    Fri 4/17/98
Reiteration of Bug Fixes                  152h     Fri 4/10/98    Thu 4/30/98
6.2 Budget Information

Note: include a 20% contingency in your plan, either by listing specific risks that account for that 20% or by increasing the hours on each task to assume the 20% contingency. For example, a task estimated at 40 hours would be planned as 48 hours. (A short sketch of this arithmetic appears after the cost table below.)
Task                                      Total Cost
[productname] Project Timeline            $164,820.00
Test Planning                             $78,520.00
Resource Availability                     $2,000.00
Test Lead Activities                      $38,420.00
Box 1 - Detailed Test Plans/Test Cases    $38,100.00
**ProdSupp** PO Copy                      $2,400.00
Regression Testing I                      $11,600.00
Functionality 1                           $2,900.00
Functionality 2                           $11,200.00
Interfaces / End-to-End Testing           $10,000.00
System Testing                            $55,100.00
Original Test Items                       $17,200.00
Receive Drop from Development             $0.00
Install Build                             $800.00
Build Verification Test                   $1,600.00
Regression Testing                        $2,400.00
Functionality 1                           $1,200.00
Functionality 2                           $6,800.00
Interface Testing (Schema Changes)        $4,400.00
Extended Test Cases                       $37,900.00
End-to-End Testing                        $23,600.00
Reiteration of Bug Fixes                  $7,600.00
_____________________________________________________________________________
Developed by Pragmatic Software Co., Inc. http://www.pragmaticsw.com