Example of a Metrics Program

The following is a fictional example illustrating a simple application of the 8-Step Metrics Program that will yield short-term feedback. Before committing to a major investment of time, it is often valuable to become familiar with a new program by applying it to a small project with a short time frame. The experience gained should be applicable to the implementation of a larger program with long-term goals and a broader scope. Most software organizations can replicate this example to yield valuable information with a minimum of investment.

Background
Integrated Software is a small company specializing in integrated CASE tools. It has few formal procedures, and its software development process is best described as traditional. The company performs code inspections on an ad hoc basis and has an established testing program. Management is considering a major upgrade to the main product that will involve significant re-engineering. Time is short, budgets are thin, and experience with metrics is non-existent. Integrated Software's past experience shows that some projects have had serious cost overruns, and the instability of a recent release resulted in a loss of customers. Projects that are underestimated, over budget, or that produce unstable products have the potential to devastate the company. Accurate estimates, competitive productivity, and renewed confidence in product quality are critical to the success of the company. Hoping to solve these problems as quickly as possible, company management embarks on the 8-Step Metrics Program by Software Productivity Centre Inc.

Step 1: Document the Software Development Process
Integrated Software does not have a defined development process. However, the new metrics coordinator does a quick review of project status reports and finds that the activities of requirements analysis, design, code, review, recode, test, and debugging describe how the teams spend their time. The inputs, work performed, outputs, and verification criteria for each activity have not been recorded. He decides to skip these details for this "test" exercise. The recode activity includes only effort spent addressing software action items (defects) identified in reviews.

Step 2: State the Goals
The metrics coordinator sets out to define the goals of the metrics program. The list of goals in Step 2 of the 8-Step Metrics Program is broader than (yet still related to) the immediate concerns of Integrated Software. Discussion with development staff leads to some good ideas on how to tailor these goals into specific goals for the company. The coordinator lists questions about each goal.

Estimates

The development staff at Integrated Software considers past estimates to have been unrealistic, as they were established using "finger in the wind" techniques. They suggest that current plans could benefit from past experience, as the present project is very similar to past projects. The metrics coordinator narrows and restates Step 2, Goal 2 (Improve software estimation):

Company Goal 1: Use previous project experience to improve estimates of productivity.

Questions asked about the goal:
• What is the actual labor rate of past projects?
• How complicated is the software being developed?
• Does the labor rate vary for different types of software?

Productivity
Discussions about the significant effort spent in debugging center on a comment by one of the developers that defects found early on in reviews have been faster to repair than defects discovered by the test group. It seems that both reviews and testing are needed, but the amount of effort to put into each is not clear. The metrics coordinator decides to focus Step 2, Goal 8 (Improve staff productivity) around questions of rework.

Company Goal 2: Optimize defect detection and removal.

Questions asked about the goal:
• How much effort is spent in testing versus reviews?
• How many defects are discovered in testing versus reviews?
• How much effort is spent repairing defects discovered in reviews?
• How much effort is spent repairing defects discovered in testing?
• How efficient are reviews in removing defects?
• How efficient is testing in removing defects?
• What is the optimal defect detection efficiency to achieve in reviews prior to testing?

Quality
The test group at the company argues for exhaustive testing. This, however, is prohibitively expensive. Alternatively, they suggest looking at the trends of defects discovered and repaired over time to better understand the probable number of defects remaining. The coordinator limits Step 2, Goal 6 (Improve software quality) to establishing a quantitative indication of stability at product release.

Company Goal 3: Ensure that the defect detection rate during testing is converging towards a level indicating that fewer than five defects per KSLOC will be discovered in the next year.

Questions asked about the goal:
• How many defects have been discovered so far?
• How many defects have been repaired so far?
• What is the trend in the number of defects discovered and repaired over time?

Step 3: Define Metrics Required to Reach Goals
Working from the Step 3 tables, the metrics coordinator chooses the following metrics for the metrics program.

Company Goal 1: Improve Estimates
• actual effort for each type of software in PH
• size of each type of software in SLOC
• software product complexity (type)
• labor rate (PH/SLOC) for each type

Company Goal 2: Improve Productivity
• total number of person hours per activity
• number of defects discovered in reviews
• number of defects discovered in testing
• effort spent repairing defects discovered in reviews
• effort spent repairing defects discovered in testing
• number of defects removed per effort spent in reviews and recode
• number of defects removed per effort spent in testing and debug

Company Goal 3: Improve Quality
• total number of defects discovered
• total number of defects repaired
• number of defects discovered / schedule date
• number of defects repaired / schedule date
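
Most of these metrics are simple ratios over the raw totals. The following is a minimal sketch of how they could be computed once the totals are in hand; the function names are illustrative only, and the example inputs are hypothetical except for the Project 1 person hours and SLOC, which match Figure A.4 below.

# Sketch: derived Step 3 metrics as simple ratios over raw totals.
# Names and example inputs are illustrative, not part of the 8-Step program.

def labor_rate(person_hours: float, sloc: int) -> float:
    """Labor rate in person hours per source line of code (PH/SLOC)."""
    return person_hours / sloc

def removal_efficiency(defects_removed: int, effort_ph: float) -> float:
    """Defects removed per person hour of detection plus repair effort."""
    return defects_removed / effort_ph

# Project 1 (GUI) totals from Figure A.4; the review figures are hypothetical.
print(round(labor_rate(8000, 24000), 2))       # 0.33 PH/SLOC
print(round(removal_efficiency(150, 800), 3))  # defects per PH in reviews + recode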

Step 4: Identify Data to Collect
The metrics coordinator uses the tables in Step 4 to identify the necessary raw data for the metrics program.

Figure A.1: Identifying data to collect.

Goal 1 Metrics | Data to collect
Actual effort for each project in PH | For each project: actual number of person hours to complete
Size of each type of software | For each project: total SLOC produced
Software product complexity (type) | Category of each project
Labor rate (PH/SLOC) for each type of project | For each type of project: total number of person hours to complete; total SLOC produced

Goal 2 Metrics | Data to collect
Total number of person hours per activity | For each activity (reviews and testing): total number of person hours
Number of defects discovered in reviews; number of defects discovered in testing | For each activity (reviews and testing): total number of defects discovered
Effort spent repairing defects discovered in reviews | For the recode activity: total number of person hours spent
Effort spent repairing defects discovered in testing | For the debugging activity: total number of person hours spent
Number of defects discovered per effort spent in reviews and in testing | For each activity (reviews and testing): total number of defects discovered; for each activity (recode and debug): total number of person hours spent

Goal 3 Metrics | Data to collect
Total number of defects discovered | Number of defects discovered
Total number of defects repaired | Number of defects repaired
Number of defects discovered / schedule dates | Number of defects discovered per schedule date
Number of defects repaired / schedule dates | Number of defects repaired per schedule date

Step 5: Define Data Collection Procedures
Integrated Software needs to collect data regarding defects, effort, implementation, and project management.
• Defect data includes the dates of defect detection and repair, and the number of defects discovered and repaired per activity. Defect data should be available from the minutes of meetings, test reports, and code headers. However, as Integrated Software has not previously kept such data, the metrics coordinator must assume that all defects detected in reviews were repaired in the recode activity.
• Effort data includes the total person hours to complete each activity and is available only in project status reports.
• Implementation data includes the type and size of software for each project. This data is available from the development staff.

To record the data efficiently, the coordinator creates the following forms:

Figure A.2: Data collection form - Defect.

Week # | # defects detected in reviews | # defects detected in tests | # defects repaired in debug
1      |                               |                             |
2      |                               |                             |
3      |                               |                             |
4      |                               |                             |
etc.   |                               |                             |

Figure A.3: Data collection form - Effort.

Person | Activity | Start date | Complete date | Person hours
A      | Code     |            |               |
B      | Code     |            |               |
C      | Code     |            |               |
A      | Review   |            |               |
B      | Review   |            |               |
C      | Review   |            |               |
A      | Recode   |            |               |
B      | Recode   |            |               |
C      | Recode   |            |               |
A      | Debug    |            |               |
B      | Debug    |            |               |
C      | Debug    |            |               |
D      | Test     |            |               |
E      | Test     |            |               |

Procedures for Data Collection
The metrics coordinator decides that he will be responsible for collecting the necessary data. He documents the following procedures for his role:

Effort Data

Collect copies of the project status reports from the project managers. Determine the start and completion dates for each person for each activity and compute the person hours accordingly. Record the data on an effort form. Total over all persons per activity and record the total in a metrics database.
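
As an illustration of the person-hour computation, the sketch below assumes an eight-hour working day and an effort form saved as a CSV file with ISO dates and snake_case versions of the Figure A.3 column headings; the working-day rule and the file name are assumptions, since the real status reports may record hours directly.

# Sketch: derive person hours per activity from the effort form.
# Assumes an 8-hour working day, Monday to Friday; real status reports may differ.
import csv
from datetime import date

HOURS_PER_DAY = 8

def working_days(start: date, end: date) -> int:
    """Count weekdays from start to end, inclusive."""
    return sum(
        1
        for n in range((end - start).days + 1)
        if date.fromordinal(start.toordinal() + n).weekday() < 5
    )

# Total person hours per activity across the form (file name is hypothetical).
totals: dict[str, int] = {}
with open("effort_form.csv", newline="") as f:
    for row in csv.DictReader(f):
        hours = HOURS_PER_DAY * working_days(
            date.fromisoformat(row["start_date"]),
            date.fromisoformat(row["complete_date"]),
        )
        totals[row["activity"]] = totals.get(row["activity"], 0) + hours

print(totals)  # e.g. {"Code": 1500, "Review": 500, ...}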

Implementation Data
Collect copies of the source code from the developers. Count source lines of code using the same tool on all projects to ensure consistency. Determine the type of software for each project. Enter the total size and type of software for each project in the database.

Defect Data
Gather defect data from the source code and project status reports mentioned above. Also collect minutes of review meetings from the developers and weekly test reports from the test group.
Defects Detected: From the minutes of meetings and the weekly test reports, count the number of defects detected each week and enter the totals on a form.
Defects Repaired: From the comments in the code headers (which include dated references to the defects), tally the number of defects repaired each week on a form. Enter the totals in the metrics database.
Finally, use the database to compute all metrics and generate the graphs and reports that summarize them.
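
For the defects-repaired tally, a pattern match over the code headers is one way to automate the count. The comment format, file names, and week-one start date below are all assumptions made for illustration; the real headers may label defects and dates differently.

# Sketch: tally defects repaired per week from dated references in code headers.
# Assumes header comments such as "Fixed DEF-123 on 1999-03-08"; the format is hypothetical.
import re
from collections import Counter
from datetime import date

PATTERN = re.compile(r"Fixed\s+(DEF-\d+)\s+on\s+(\d{4}-\d{2}-\d{2})")
WEEK_ONE_START = date(1999, 3, 1)  # hypothetical start of week 1

def week_number(d: date) -> int:
    return (d - WEEK_ONE_START).days // 7 + 1

repaired_per_week: Counter[int] = Counter()
for path in ["module_a.c", "module_b.c"]:  # hypothetical source files
    with open(path) as src:
        for _defect_id, day in PATTERN.findall(src.read()):
            repaired_per_week[week_number(date.fromisoformat(day))] += 1

print(sorted(repaired_per_week.items()))  # e.g. [(1, 12), (2, 18), ...]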

Step 6: Assemble a Metrics Toolset
The coordinator assembles the tools necessary for the metrics program. Integrated Software has a spreadsheet on the PC that can easily manage the data collection forms. The metrics program also needs a method for counting lines of code. The development team has an editor that supports user-defined macros, so they develop a macro in the editor to count code consistently. The algorithm implemented ignores blank and commented lines and includes data and executable statements.
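
The macro itself lives in the team's editor, but the counting rule can be sketched in a few lines of Python for reference; the C-style comment markers are my assumption about the code base, and block comments sharing a line with code are handled only approximately.

# Sketch of the SLOC rule: count data and executable statements,
# ignore blank lines and comment-only lines. Assumes C-style comments.

def count_sloc(lines: list[str]) -> int:
    sloc = 0
    in_block_comment = False
    for line in lines:
        stripped = line.strip()
        if in_block_comment:
            if "*/" in stripped:
                in_block_comment = False
            continue
        if not stripped or stripped.startswith("//"):
            continue  # blank or comment-only line
        if stripped.startswith("/*"):
            if "*/" not in stripped:
                in_block_comment = True
            continue  # comment-only line (approximation)
        sloc += 1  # data or executable statement
    return sloc

with open("module_a.c") as f:  # hypothetical file name
    print(count_sloc(f.readlines()))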

Step 7: Create a Metrics Database
Integrated Software's metrics database needs to retain the information entered directly from the forms described above. It must also permit the analysis of this data and the calculation of the metrics specified, and it needs to be able to display metrics graphically for presentations and reports. Since Integrated Software's defect tracking system keeps history data on defects, this data will be extracted directly into a spreadsheet, where it can be used to compute and present the defect trend metrics.
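
As a sketch of the extraction step, the snippet below turns a hypothetical defect-history export (one detection date and an optional repair date per record) into the weekly cumulative counts behind Figures A.8 and A.9; the field names, dates, and record layout are illustrative only.

# Sketch: weekly cumulative defect counts from a defect-tracking-system export.
# Record layout, field names, and dates are hypothetical.
from datetime import date
from itertools import accumulate

history = [
    {"detected": date(1999, 3, 2), "repaired": date(1999, 3, 9)},
    {"detected": date(1999, 3, 4), "repaired": None},
]
WEEK_ONE_START = date(1999, 3, 1)
WEEKS = 7

def week_of(d: date) -> int:
    return (d - WEEK_ONE_START).days // 7 + 1

discovered = [0] * WEEKS
repaired = [0] * WEEKS
for record in history:
    discovered[week_of(record["detected"]) - 1] += 1
    if record["repaired"] is not None:
        repaired[week_of(record["repaired"]) - 1] += 1

print(list(accumulate(discovered)))  # cumulative discovered, weeks 1-7
print(list(accumulate(repaired)))    # cumulative repaired, weeks 1-7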

Step 8: Define the Feedback Mechanism
Given the small size of the company, the coordinator decides that results from the metrics analysis should be presented in a meeting, saving the effort of writing a detailed report. The graphs and metrics calculated will be prepared on overhead transparencies for presentation, and handouts of the slides will be provided. The data will be presented as shown below in "Prepare Data to Support Goals." The data to be analyzed in the metrics program involves three recently completed projects which best reflect Integrated Software's current corporate culture. The short-term goals of Integrated Software call for only a single feedback session. However, the coordinator realizes that if the information prepared is useful, it will lead to process changes and an ongoing metrics program. The data collected will be the company's first baseline, which can be enhanced later as more projects are entered into the metrics program.

When the data has been prepared for presentation, the coordinator will hold the meeting with management and development staff. The objective of the meeting will be two-fold: to present the results and to discuss recommendations for process change.

Prepare Data to Support Goals

Company Goal 1: Improve Estimates
The process that Integrated Software will use for estimating size and person hours for future projects will follow the methods prescribed by L.H. Putnam in "A General Empirical Solution to the Macro Software Sizing and Estimating Problem," IEEE Transactions on Software Engineering, vol. SE-4, no. 4, pp. 345-361. This approach uses the established baseline as a basis for estimating the size of software for similar types of projects. It then statistically combines independent estimates from a number of estimators. Improved size estimates can then be input to a cost model to determine target productivity, budget, and schedule, and thus achieve the first goal. The metrics computed to support this goal should be illustrated in a graph or table similar to the following examples.

Figure A.4: PH, SLOC, and labor rate estimation examples.

             | Project 1 (GUI) | Project 2 (DB) | Project 3 (Embedded RT)
Person Hours | 8000            | 5000           | 6000
SLOC         | 24000           | 20000          | 30000
Labor rate   | 0.33            | 0.25           | 0.20
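
The document does not spell out how the independent estimates are combined; the sketch below uses a PERT-style weighted mean over each estimator's low/likely/high size guesses, which is my assumption rather than Putnam's exact procedure, and applies the Project 1 (GUI) labor rate of 0.33 PH/SLOC from Figure A.4 as the baseline. The individual estimates are hypothetical.

# Sketch: combine independent size estimates and apply a baseline labor rate.
# The (low, likely, high) figures and the PERT-style weighting are assumptions.
estimates = [
    (18000, 24000, 32000),  # estimator 1, SLOC
    (20000, 26000, 30000),  # estimator 2, SLOC
    (22000, 25000, 34000),  # estimator 3, SLOC
]

def expected_size(low: int, likely: int, high: int) -> float:
    """Beta/PERT expected value of a single estimate."""
    return (low + 4 * likely + high) / 6

combined_sloc = sum(expected_size(*e) for e in estimates) / len(estimates)
baseline_labor_rate = 0.33  # PH/SLOC for the GUI project in Figure A.4
estimated_effort_ph = combined_sloc * baseline_labor_rate

print(round(combined_sloc))        # combined size estimate in SLOC
print(round(estimated_effort_ph))  # effort estimate in person hours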

Company Goal 2: Improve Productivity
Integrated Software will achieve its second goal, to optimize defect detection and removal, by using the metric data to determine the efficiency of defect detection in reviews and testing. Defect detection and removal efficiency is computed manually by dividing the total number of defects detected in reviews by the total effort in person hours in the review and recode activities. Similarly for testing, the number of defects detected in test is divided by the total effort in person hours in test and debug. The metrics computed to support this goal can be illustrated similarly to the following examples. Integrated Software anticipates that defect detection and removal efficiency will be higher using reviews than testing. Thus, the recommendation expected is that reviews should be continued until their efficiency falls below the baseline efficiency determined for testing. Testing would then begin.

Figure A.5: Person hours by activity example graph (bar chart of person hours for the Code, Review, Recode, Test, and Debug activities).

Figure A.6: Defects detected and corrected example graph (number of defects by activity: Review, Recode, Test, Debug).

Figure A.7: Effort per defect example graph (effort in person hours per defect for Review versus Test).
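
As a worked illustration of the Goal 2 calculation, the sketch below computes detection-and-removal efficiency for reviews and for testing; the defect and effort figures are hypothetical placeholders in the spirit of Figures A.5 and A.6, not measured project data.

# Sketch: defect detection and removal efficiency for reviews versus testing.
# All numbers are hypothetical placeholders, not measured project data.
defects_found_in_reviews = 150
review_effort_ph = 500   # review activity
recode_effort_ph = 300   # repairing review findings

defects_found_in_testing = 100
test_effort_ph = 1300    # test activity
debug_effort_ph = 1500   # repairing test findings

review_efficiency = defects_found_in_reviews / (review_effort_ph + recode_effort_ph)
testing_efficiency = defects_found_in_testing / (test_effort_ph + debug_effort_ph)

print(round(review_efficiency, 3))   # defects removed per person hour (reviews + recode)
print(round(testing_efficiency, 3))  # defects removed per person hour (test + debug)

# Per the expected recommendation, reviews continue while their efficiency
# stays above the testing baseline.
if review_efficiency > testing_efficiency:
    print("Continue reviews before moving to testing.")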

Company Goal 3: Improve Quality
Integrated Software will achieve its third goal, to establish confidence in product stability, by graphing the defect detection and removal rates regularly through the review and testing phases. The company will continue testing until the detection rate trend indicates stability in the software at a level consistent with the goal. The metrics computed to support this goal will be illustrated as below. The average number of defects discovered per week over the most recent month is extrapolated over the next twelve months to predict the maximum number of defects to be discovered in the next year. Stability is indicated when this number drops below five defects per KSLOC. The release target is thus computed as five defects per KSLOC multiplied by the product size in KSLOC.

Figure A.8: Defect trend example graph (defects discovered, defects repaired, and the release target per week over Weeks 1-7).

Figure A.9: Total defects example graph (cumulative defects discovered, cumulative defects repaired, and the target maximum over Weeks 1-7).
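
A minimal sketch of the release criterion described above: the recent weekly discovery rate is extrapolated over the next year (52 weeks is my reading of the twelve-month extrapolation) and compared against the five-defects-per-KSLOC target. The weekly counts are hypothetical, and the 24 KSLOC size simply matches Project 1 in Figure A.4.

# Sketch: check the Goal 3 stability criterion at a candidate release point.
# Weekly defect counts and product size are hypothetical.
recent_weekly_defects = [14, 11, 9, 8]  # defects discovered in the last four weeks
product_size_ksloc = 24                 # e.g. 24,000 SLOC

avg_defects_per_week = sum(recent_weekly_defects) / len(recent_weekly_defects)
predicted_next_year = avg_defects_per_week * 52  # extrapolate over 52 weeks
release_target = 5 * product_size_ksloc          # 5 defects per KSLOC

print(round(predicted_next_year))            # predicted defects over the next year
print(release_target)                        # maximum acceptable (release target)
print(predicted_next_year < release_target)  # stability criterion met?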
