PRX-QPM-03 v1.0

Baseline Parametric Model Calibration Process (Expert Mode)

01/10/06

Process: Baseline Parametric Model Calibration
Phase: Global
Process Owner: SSC San Diego Systems Engineering Process Office (SEPO)

Description: The purpose of Baseline Parametric Model Calibration (BPMC) is to develop revised Constructive Cost Model (COCOMO) equations resulting from calibration to current software projects at SSC San Diego. The calibrated equations are used by projects to make staff hour, cost, and schedule estimates for future software business. This data serves as an input to the Baseline Data Analysis (BDA) report that establishes performance benchmarks and identifies opportunities for the Center’s Process Improvement (PI) initiative. This process can be performed at both the enterprise and organization levels within SSC San Diego.

Entry Criteria/Inputs:
• Overarching plans exist to plan this process, establish schedules, assign responsibilities, establish monitoring procedures, and provide for improvement of the process (i.e., the Project Management Plan (PMP) for the SSC San Diego Process Improvement (PI) Initiative, reference (a))
• Commitment to parametric model calibration disciplines has been made per the Systems/Software Engineering Management (SEM) Policy, reference (b)
• Project data submittals to the appropriate measurement repository have been made
• Adequate resources are committed to parametric model calibration activities
• Training is completed for personnel performing parametric model calibration

Exit Criteria/Outputs:
• Data inputs for the BDA report have been submitted and are under configuration control
• Measurements on the status of this process have been collected, analyzed, and stored
• Objective evaluation of this process has been completed in accordance with a defined quality assurance process
• Review with higher-level management has been completed to resolve issues
• The effectiveness of this process has been assessed and proposed improvements submitted in accordance with overarching plans

Roles:
• PI Initiative Manager: oversees the analysis of measurement data at the enterprise level
• Organization Managers: responsible for the collection and analysis of measurement data in the organization-level measurement repository
• Data analysis team: collects, validates, analyzes, performs calibration, and reports the results as input to the BDA report
• Project Managers: responsible for data submittal to the applicable measurement repository
• Project Members: objectively submit data as required by the guiding measurement plan
• SEPO: oversees the parametric calibration process as required for the PI Initiative
• Quality Assurance (QA): provides objective verification of calibration activities and ensures process adherence
• Configuration Management (CM): maintains control of process artifacts

Assets/References: References (a) - (b) are available from http://sepo.spawar.navy.mil/. References (c) - (h) are available in the SEPO Library.
a. Project Management Plan for the SSC San Diego Process Improvement (PI) Initiative, PL-OPF-01, SSC San Diego
b. Systems/Software Engineering Management Policy, PR-OPD-09, SSC San Diego
c. COCOMO II, Dr. B. Boehm, Prentice Hall, 2000
d. Costar Version 6.0 (available from www.softstarsystems.com)
e. Software Life Cycle Processes, Institute of Electrical and Electronics Engineers (IEEE)/Electronic Industries Association (EIA) 12207.0, Mar 1998


f. Capability Maturity Model Integration for Systems Engineering/Software Engineering/Integrated Product and Process Development, and Supplier Sourcing, V1.1, Carnegie Mellon University (CMU)/Software Engineering Institute (SEI)-2002-TR-012, SEI, March 2002, at http://www.sei.cmu.edu/cmmi/
g. Processes for Engineering a System, Electronic Industries Alliance (EIA) Standard 632, Jan 1999
h. Systems Engineering – System Life Cycle Processes, International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) 15288, ISO/IEC 15288:2002(E), Nov 2002

Tasks:
1. Facilitate data collection
2. Aggregate and normalize data
3. Verify the aggregate structure
4. Develop parametric equations
5. Validate results
6. Communicate results

Measures:
• Effort, percent complete, and funds expended for this process as necessary for tracking in the PI Initiative MS Project Plan

PROCESS TASKS:

1. Facilitate data collection

The project members submit data (measurements) in the required form, content, and frequency to the applicable measurement repository. The data required for parametric calibration includes project life cycle phases with start and completion dates, project size (i.e., Source Lines of Code (SLOC), components, number of requirements, etc.), staff hours, and costs.

2. Aggregate and normalize data

The data analysis team extracts raw data relative to the focus of parametric calibration from the appropriate measurement repository and collects the data into one or more Microsoft Excel spreadsheets. The data is organized with a row for each reporting project and columns for the raw data: for example, the duration of each life cycle phase in months, staff hours by phase, total costs by phase, and project size information. An additional column could contain an algorithm to derive an equivalent SLOC value from the project inputs (i.e., function points, components, number of requirements, etc.). See Table 1, Effort Calibration Spreadsheet.

3. Verify the aggregate structure

The data analysis team must verify the data to ensure that it has merit and credibility before using it for calibration. Verifying the data involves the activities listed below:
a. Verify that the data is of the correct type, consistent with the specification for submittal to the measurement repository.
b. Verify that the data is complete. For example, if a data submittal is for ‘staff hours’ and ‘cost’, then both entries must be present for the submission to be credible. For missing data, contact the submitter or reject the data submittal.
c. Verify arithmetic correctness. If the data submitted is the result of an arithmetic operation, such as the calculations for developing equivalent SLOC, verify that those calculations have been performed correctly.
d. Verify the consistency of the data. Improbable data elements should be investigated to determine their accuracy and to document the circumstances for the value if it is found to be a submittal of merit. Data that lies outside a nominal pattern is often a candidate for special-cause analysis.

4. Develop parametric equations

The basic parametric algorithms for effort and schedule estimation in COCOMO II are shown in Figure 1.
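The verification activities of Task 3 can be sketched in code. This is a minimal illustration only: the field names, the hourly rate used for the arithmetic check, and the tolerance are assumptions for the example, not part of the documented process.

```python
# Sketch of Task 3 verification; field names, hourly_rate, and tolerance
# are illustrative assumptions, not from the process document.
REQUIRED = ("phase_months", "staff_hours", "cost", "ksloc")

def verify(row, hourly_rate=100.0, tolerance=0.25):
    """Return a list of problems found; an empty list means the row is
    credible enough to enter the calibration spreadsheet."""
    problems = []
    # (a) correct type and (b) completeness: every required field numeric
    for field in REQUIRED:
        if not isinstance(row.get(field), (int, float)):
            problems.append(f"{field}: missing or non-numeric")
    if problems:
        return problems  # contact the submitter or reject the submittal
    # (c) arithmetic correctness: cost should roughly track staff hours
    expected_cost = row["staff_hours"] * hourly_rate
    if abs(row["cost"] - expected_cost) > tolerance * expected_cost:
        problems.append("cost inconsistent with staff hours")
    # (d) consistency: improbable values are candidates for
    #     special-cause analysis rather than silent acceptance
    if row["ksloc"] <= 0 or row["phase_months"] <= 0:
        problems.append("improbable size or duration")
    return problems
```

A rejected row would be returned to the submitter with the problem list, matching the "contact the submitter or reject" guidance above.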


Figure 1. Parametric Algorithms

By modifying the default effort multiplier (e.g., 2.94) and schedule multiplier (e.g., 3.67), the algorithm can be calibrated to an individual project. However, the need is for a calibration that is representative of projects performing under the same life cycle strategy within a common business domain; in short, one must compare and calibrate apples to apples. A specific example would be calibrating an algorithm for all satellite communication projects using an incremental life cycle model. Statistically, it is desirable to have data points from at least six projects in the common business domain.

The Excel spreadsheet in Table 1 provides an example of a tool that will generate a calibrated effort multiplier, following the guidance of COCOMO II, reference (c), using data from eight projects. Alternately, one could calibrate each project individually through trial and error, and then use the arithmetic mean of the resulting effort multiplier constants.

It should be noted that the Effort Adjustment Factors (EAF) and Scale Factors (SF) are set at their default values when initiating the COCOMO applications. The assumption is that projects within a specific business domain share common variables as represented by EAF and SF; to that end, those variables are built into the calibrated effort and schedule multiplier constants. The EAF and SF variables would then be applied to quantify variances between baseline upgrades or individual projects within the business domain.

TABLE 1. EFFORT CALIBRATION SPREADSHEET
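A Table 1-style calculation can be sketched as follows: solve PM = A x KSLOC^E for A per project in log space and average the results. The project data below is illustrative, and the exponent value is an assumption (approximately 1.10 when all five COCOMO II scale factors sit at their nominal values).

```python
import math

# Effort exponent; ~1.10 with all scale factors at nominal (assumption
# consistent with keeping EAF and SF at defaults, as described above).
E = 1.10

projects = [  # (name, size in KSLOC, actual effort in person-months) - illustrative
    ("P1", 32.0, 150.0), ("P2", 55.0, 270.0), ("P3", 18.0, 80.0),
    ("P4", 90.0, 470.0), ("P5", 41.0, 200.0), ("P6", 25.0, 115.0),
]

def calibrate_effort_constant(projects, exponent=E):
    """Solve PM = A * KSLOC**E for A per project in log space, then
    average, in the spirit of the COCOMO II calibration guidance."""
    logs = [math.log(pm) - exponent * math.log(ksloc)
            for _, ksloc, pm in projects]
    return math.exp(sum(logs) / len(logs))

A = calibrate_effort_constant(projects)  # replaces the default 2.94
```

Averaging in log space rather than averaging the raw per-project constants keeps a single large project from dominating the calibrated value.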

For the schedule multiplier constant, a spreadsheet solution such as the one illustrated for the effort multiplier could be developed, or the arithmetic mean of the constants developed through trial and error for each project could be applied.
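The arithmetic-mean alternative for the schedule constant can be sketched as below: solve TDEV = C x PM^F for C per project, then average. The exponent and history data are illustrative assumptions (F is roughly 0.318 with nominal scale factors).

```python
# Schedule exponent; ~0.318 with nominal scale factors (assumption).
F = 0.318

history = [  # (name, actual effort in PM, actual schedule in months) - illustrative
    ("P1", 150.0, 16.0), ("P2", 270.0, 20.5), ("P3", 80.0, 13.0),
]

def schedule_constant(history, exponent=F):
    """Arithmetic mean of per-project constants C_i = TDEV_i / PM_i**F,
    i.e. the 'mean of individually calibrated constants' alternative."""
    constants = [tdev / pm ** exponent for _, pm, tdev in history]
    return sum(constants) / len(constants)  # replaces the default 3.67
```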


Additional calibrations can involve modeling the distribution of effort and schedule results across the phases of the governing life cycle model. This involves tracking the actual effort and schedule allocations to the respective phases of the referenced life cycle model. For simplicity, only the software subset of a system is used, with the basic life cycle phases listed below:
a. Requirements
b. Product design
c. Detailed design
d. Code and unit test
e. Integration and test

Two spreadsheets can be developed: one addressing percent effort distribution by phase and the other percent schedule distribution by phase. Each spreadsheet would have project data by row and the respective phases as columns. In this manner, an arithmetic mean can be calculated for the sample projects’ distribution of effort and schedule by phase. This data can then be used to partition the results derived from the calibrated effort and schedule equations; such data can be invaluable in future planning estimates. In a like manner, the gross effort and schedule data can be further broken down into activities such as project office support, QA, CM, documentation, etc. Naturally, these calculations are dependent on the collection and availability of archived project data; currently, data collection granularity is not sufficient to achieve this level of calibration.

Calibration can be simplified by the use of parametric cost estimating models and their associated tools. Costar is an example of a COCOMO II estimation tool that has an associated calibration tool, Calibrate COCOMO (CALICO), which allows the user to build models that drive the Costar tool. These models can contain calibrated equations, effort and schedule distribution tables (see Figure 2), life cycle phase titles such as requirements and product design, milestone titles, etc., as needed to accurately model an organization’s means of production.
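The percent-effort-distribution spreadsheet described above can be sketched as follows; the staff-hour figures are illustrative, and the same shape works for the schedule-distribution spreadsheet.

```python
# One row per project, one column per phase; values are staff hours
# (illustrative data, not archived SSC San Diego measurements).
PHASES = ("requirements", "product design", "detailed design",
          "code and unit test", "integration and test")

effort_hours = {
    "P1": (1200.0, 1800.0, 2400.0, 4800.0, 2800.0),
    "P2": (900.0, 1500.0, 2100.0, 3900.0, 2600.0),
}

def mean_distribution(table):
    """Per-project percent by phase, then the arithmetic mean across
    projects, as described for the distribution spreadsheets."""
    totals = {p: sum(row) for p, row in table.items()}
    shares = []
    for i, _ in enumerate(PHASES):
        pct = [row[i] / totals[p] for p, row in table.items()]
        shares.append(sum(pct) / len(table))
    return dict(zip(PHASES, shares))

def partition(total_effort, distribution):
    """Partition a calibrated-equation estimate across phases."""
    return {phase: total_effort * share
            for phase, share in distribution.items()}
```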

Figure 2. CALICO Default Effort Distribution Table

5. Validate results


Once the calibrated effort and schedule multiplier constants have been developed, the new algorithm should be applied to each project within the sample to determine the Mean Absolute Deviation (MAD) error. Analysis of the minimum and maximum values will determine the need for adjustment to the constants. The objective of the adjustments is to establish a normal distribution of error (+/-) for the projects in the sample. If one project is the cause of a significant divergence, it should be analyzed for special causes associated with the deviation; for example, the special cause may be an unplanned work stoppage.

6. Communicate results

The results of the analysis of the data are published in the Baseline Data Analysis (BDA) report. The results of the parametric calibration are published in the BDA both to support projects using parametric modeling to achieve more accurate estimates for future work and to provide the PI Initiative visibility into overall process improvement. For example, if over time the effort multiplier constant decreases in value for a specific business domain, it can be concluded that productivity is improving. The BDA report is considered sensitive information, as it contains information on Center productivity and defect containment. Consequently, following QA analysis, review, and approval, the report is made available only from the SPI Agents Infosite.
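The MAD check in step 5 can be sketched as below: apply the calibrated equation back to each sample project and examine the error spread. The constants and sample data are illustrative assumptions.

```python
# Hypothetical calibrated constant and exponent from an earlier step.
A, E = 2.51, 1.10

sample = [  # (name, size in KSLOC, actual effort in person-months) - illustrative
    ("P1", 32.0, 140.0), ("P2", 55.0, 250.0), ("P3", 18.0, 70.0),
]

def mad_error(sample, a=A, e=E):
    """Return the Mean Absolute Deviation plus the signed per-project
    errors; the signs show whether the errors spread evenly (+/-)
    around the calibration, and an extreme outlier is a candidate for
    special-cause analysis (e.g., an unplanned work stoppage)."""
    errors = [actual - a * ksloc ** e for _, ksloc, actual in sample]
    mad = sum(abs(err) for err in errors) / len(errors)
    return mad, errors
```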

