Example of Tool Evaluation Criteria Once the test engineer has narrowed the search for a particular type of test tool to two or three lead candidates, the Evaluation Scorecard depicted in Table 1 can be used to determine which of the tools best fits the particular requirements.

Table 1. Evaluation Scorecard—Automated GUI Testing (Record/Playback) Tool

| Test Tool Characteristic | Weight (1–10) | Score (1–5) | Value (1–50) |
|---|---|---|---|
| Ease of Use | | | |
| Learning curve | 7 | 5 | 35 |
| Easy to maintain the tool | 5 | 5 | 25 |
| Easy to install—tool may not be used if difficult to install | 5 | 3 | 15 |
| Tool Customization | | | |
| Can the tool be customized (can fields in tool be added or deleted)? | 7 | 4 | 28 |
| Does the tool support the required test procedure naming convention? | 8 | 4 | 32 |
| Platform Support | | | |
| Can it be moved and run on several platforms at once, across a network (that is, cross-Windows support, Win95, and WinNT)? | 8 | 4 | 32 |
| Multiuser Access | | | |
| What database does the tool use? Does it allow for scalability? | 8 | 5 | 40 |
| Network-based test repository—necessary when multiple access to repository is required | 8 | 5 | 40 |
| Defect Tracking (for more detail on evaluating defect tracking tools, see Chapter 8) | | | |
| Does the tool come with an integrated defect-tracking feature? | 10 | 3 | 30 |
| Tool Functionality | | | |
| Test scripting language—does the tool use a flexible, yet robust scripting language? What is the complexity of the scripting language: Is it 4GL? Does it allow for modular script development? | 9 | 5 | 45 |
| Complexity of scripting language | 9 | 5 | 45 |
| Scripting language allows for variable declaration and use; allows passing of parameters between functions | 9 | 5 | 45 |
| Does the tool use a test script compiler or an interpreter? | 9 | 5 | 45 |
| Interactive test debugging—does the scripting language allow the user to view variable values, step through the code, integrate test procedures, or jump to other external procedures? | 8 | 4 | 32 |
| Does the tool allow recording at the widget level (object recognition level)? | 10 | 5 | 50 |
| Does the tool allow for interfacing with external .dll and .exe files? | 9 | 5 | 45 |
| Published APIs—language interface capabilities | 10 | 4 | 40 |
| ODBC support—does the tool support any ODBC-compliant database? | 10 | 4 | 40 |
| Is the tool intrusive (that is, does source code need to be expanded by inserting additional statements)? | 9 | 4 | 36 |
| Communication protocols—can the tool be adapted to various communication protocols (such as TCP/IP, IPX)? | 9 | 3 | 27 |
| Custom control support—does the tool allow you to map to additional custom controls, so the tool is still compatible and usable? | 10 | 3 | 30 |
| Ability to kick off scripts at a specified time; scripts can run unattended | 9 | 5 | 45 |
| Allows for adding timers | 10 | 5 | 50 |
| Allows for adding comments during recording | 7 | 5 | 35 |
| Compatible with the GUI programming language and entire hardware and software development environment used for the application under test (e.g., VB, PowerBuilder) | 10 | 5 | 50 |
| Can query or update test data during playback (that is, allows the use of SQL statements) | 10 | 4 | 40 |
| Supports the creation of a library of reusable functions | 10 | 5 | 50 |
| Allows for wrappers (shells) where multiple procedures can be linked together and are called from one procedure | 10 | 5 | 50 |
| Test results analysis—does the tool allow you to easily see whether the tests have passed or failed (that is, automatic creation of a test results log)? | 10 | 3 | 30 |
| Test execution on script playback—can the tool handle error recovery and unexpected active windows, log the discrepancy, and continue playback (automatic recovery from errors)? | 5 | 3 | 15 |
| Allows for synchronization between client and server | 10 | 5 | 50 |
| Allows for automatic test procedure generation | 8 | 5 | 40 |
| Allows for automatic data generation | 8 | 5 | 40 |
| Y2K compliance | 10 | 5 | 50 |
| Reporting Capability | | | |
| Ability to provide graphical results (charts and graphs) | 8 | 5 | 40 |
| Ability to provide reports | 8 | 5 | 40 |
| What report writer does the tool use? | 8 | 5 | 40 |
| Can predefined reports be modified and/or can new reports be created? | 8 | 5 | 40 |
| Performance and Stress Testing | | | |
| Performance and stress testing tool is integrated with GUI testing tool | 9 | 5 | 45 |
| Supports stress, load, and performance testing | 10 | 3 | 30 |
| Allows for simulation of users without requiring use of physical workstations | 10 | 3 | 30 |
| Ability to support configuration testing (that is, tests can be run on different hardware and software configurations) | 10 | 3 | 30 |
| Ability to submit a variable script from a data pool or library of scripts/data entries and logon IDs/passwords | 10 | 3 | 30 |
| Supports resource monitoring (memory, disk space, system resources) | 10 | 3 | 30 |
| Synchronization ability so that a script can access a record in the database at the same time, to determine locking, deadlock conditions, and concurrency control problems | 10 | 5 | 50 |
| Ability to detect when events have completed in a reliable fashion | 9 | 5 | 45 |
| Ability to provide client-to-server response times | 10 | 3 | 30 |
| Ability to provide graphical results | 8 | 5 | 40 |
| Ability to provide performance measurements of data loading | 10 | 5 | 50 |
| Version Control | | | |
| Does the tool come with integrated version control capability? | 10 | 4 | 40 |
| Can the tool be integrated with other version control tools? | 8 | 3 | 24 |
| Test Planning and Management | | | |
| Test planning and management tool is integrated with GUI testing tool | 8 | 5 | 40 |
| Test planning and management tool is integrated with requirements management tool | 8 | 5 | 40 |
| Test planning and management tool follows specific industry standard on testing process (such as SEI/CMM, ISO) | 7 | 4 | 28 |
| Supports test execution management | 10 | 5 | 50 |
| Allows for test planning—does the tool support planning, managing, and analyzing testing efforts? | 10 | 5 | 50 |
| Can the tool reference test plans, matrices, and product specifications to create traceability? | 10 | 5 | 50 |
| Allows for measuring test progress | 9 | 4 | 36 |
| Allows for various reporting activities | 10 | 4 | 40 |
| Pricing | | | |
| Is the price within the estimated price range? | 7 | 3 | 21 |
| What type of licensing is being used (floating, fixed)? | 9 | 4 | 36 |
| Is the price competitive? | 8 | 4 | 32 |
| Vendor Qualifications | | | |
| Maturity of product | 8 | 4 | 32 |
| Market share of product | 8 | 4 | 32 |
| Vendor qualifications, such as financial stability and length of existence. What is the vendor's track record? | 8 | 4 | 32 |
| Are software patches provided, if deemed necessary? Are upgrades provided on a regular basis? | 8 | 5 | 40 |
| Customer support | 10 | 3 | 30 |
| Training is available | 9 | 4 | 36 |
| Is a tool Help feature available? Is the tool well documented? | 9 | 5 | 45 |
| Availability and access to tool user groups | 8 | 4 | 32 |
| Total Value | | | 2,638 |

As the weighted values for the test tool characteristics will vary with each type of test tool, the test team may wish to develop an evaluation scorecard form for each type of test tool required. In Table 1, an automated GUI test tool (capture/playback) candidate is evaluated against the desired test tool characteristics. The total value of 2,638 for this candidate must then be compared with the total values derived for the other two candidates. As noted in the sample scorecard summary below, Candidate 3 achieved a rating of 75.3% in being able to provide coverage for all the desired test tool characteristics:

| Candidate | Score | Rating |
|---|---|---|
| Candidate 1 | 2,360 | 67.4% |
| Candidate 2 | 2,530 | 72.3% |
| Candidate 3 | 2,638 | 75.3% |
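
To make the scorecard arithmetic explicit, the sketch below (not part of the original scorecard) computes a candidate's value column and total value as weight × score summed over the characteristics. The rating formula used here, total value divided by the maximum possible value, is an assumption for illustration; the section does not state exactly how the summary percentages were derived, and the sample data covers only a few rows from Table 1.

```python
# Minimal sketch of the weighted-scorecard arithmetic behind Table 1.
# Each characteristic carries a weight (1-10) and a score (1-5); the
# value column is weight * score, and the total value is their sum.

def score_candidate(rows: list[tuple[str, int, int]]) -> tuple[int, float]:
    """Return (total value, rating) for (characteristic, weight, score) rows.

    The rating is assumed to be total value / maximum possible value,
    where the maximum possible value takes the best score of 5 everywhere.
    """
    total = sum(weight * score for _, weight, score in rows)
    max_possible = sum(weight * 5 for _, weight, _ in rows)
    return total, total / max_possible

# A few illustrative rows drawn from Table 1 (not the full scorecard).
sample = [
    ("Learning curve", 7, 5),                       # value 35
    ("Integrated defect-tracking feature", 10, 3),  # value 30
    ("Recording at the widget level", 10, 5),       # value 50
]

total_value, rating = score_candidate(sample)
print(f"Total value: {total_value}, rating: {rating:.1%}")  # 115, 85.2%
```
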
An optional evaluation scoring method involves sizing up the three candidates using only the most important test tool characteristics. Note that 26 of the characteristics were assigned a weight of 10. Table 2 reflects the scores for the three test tool candidates using a preferred scorecard form, based upon product information obtained from each vendor.

Table 2. Preferred Scorecard—GUI Record/Playback Tool

| Test Tool Characteristic | Candidate 1 (1–5) | Candidate 2 (1–5) | Candidate 3 (1–5) |
|---|---|---|---|
| Integrated defect-tracking feature | 3 | 5 | 3 |
| Recording at the widget level | 5 | 5 | 5 |
| Published APIs—language interface capabilities | 4 | 5 | 4 |
| ODBC support—tool supports ODBC-compliant databases | 4 | 4 | 4 |
| Custom control support | 3 | 5 | 3 |
| Allows for adding timers | 4 | 5 | 5 |
| Compatible with GUI language/development environment | 5 | 5 | 5 |
| Can query or update test data during playback | 4 | 4 | 4 |
| Supports the creation of a library of reusable functions | 4 | 4 | 5 |
| Allows for wrappers (shells) | 4 | 5 | 5 |
| Test results analysis | 3 | 5 | 3 |
| Y2K compliance | 5 | 5 | 5 |
| Supports stress, load, and performance testing | 3 | 5 | 3 |
| Allows for simulation of users | 3 | 5 | 3 |
| Supports configuration testing | 3 | 3 | 3 |
| Ability to use variable scripts | 3 | 4 | 3 |
| Supports resource monitoring | 3 | 4 | 3 |
| Synchronization ability | 4 | 5 | 5 |
| Client-to-server response times | 3 | 4 | 3 |
| Performance measurements of data loading | 3 | 4 | 3 |
| Version control | 4 | 4 | 4 |
| Supports test execution management | 5 | 4 | 5 |
| Allows for test planning | 4 | 3 | 5 |
| Measuring test progress | 5 | 5 | 5 |
| Price is within estimated range | 3 | 5 | 4 |
| Customer support | 3 | 5 | 3 |
| Total Value | 97 | 117 | 103 |

A summary of the Preferred Scorecard results is provided below. Note that under this scoring model, test tool Candidate 2 achieves a higher rating than Candidate 3, which had posted the highest rating using the Evaluation Scorecard method. Candidate 2 achieved a rating of 90.0% for its coverage of the highest-priority test tool characteristics.

| Candidate | Score | Rating |
|---|---|---|
| Candidate 1 | 97 | 74.6% |
| Candidate 2 | 117 | 90.0% |
| Candidate 3 | 103 | 79.2% |
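
Since the Preferred Scorecard contains 26 characteristics, each scored on a 1–5 scale, the maximum possible score is 130; the ratings above are consistent with dividing each candidate's total score by that maximum (for example, 117 / 130 ≈ 90.0% for Candidate 2).
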
Remember that the evaluation for each kind of test tool being considered for an organization or project is different. Each type of test tool has its own desired characteristics and its own weighting scheme for those characteristics. The guidelines for what to look for, and how to weight it, when evaluating a GUI test tool will differ from the guidelines for evaluating a network monitoring tool.