Don’t Bite the Whole Apple!
Frank Doherty, PMW-189 Deputy Program Manager
[email protected]
Corinne C. Segura Integrated Computer Engineering, Inc. A Wholly Owned Subsidiary of ASC
[email protected]
Purpose
To provide guidelines for selecting Best Practices and Lessons Learned that improve the chance of software project success.
OUTLINE
• Historical Perspective
• Identification of Main Problems
• Development of Solution Set
• Solution Selection
• Implementation
• End Results
• Lessons Learned
History Shows…
• Only 16% of software projects are expected to finish on time and on budget (1)
• Projects completed by the largest US organizations have only 42% of their originally proposed functions (1)
• An estimated 53% of projects will cost nearly 190% of their original estimates (1)
• In large companies, only 9% of projects will be completed on time and on budget (1)
• Canceled projects cost the US an estimated $81 billion in 1995 (1)
• The average MIS project is 1 year late and 100% over budget (2)
(1) Standish Group International report, “Chaos,” as reported in Open Computing. Copyright SPC.
(2) Capers Jones, Applied Software Measurement, McGraw-Hill.
The Defense Science Board Said (over 10 years ago): “Today’s major problems with military software development are not technical problems, but management problems.” [Defense Science Board Task Force on Military Software, executive summary]
And after ten years it still holds true!
How Do We…
• Define the main issues
• Prioritize the areas needing focus
• Establish criteria for selecting a solution set
• Measure the effectiveness of changes implemented
PMW-189 Experience
Our Mission
• PMW-189 is responsible for development, production, and deployment of fleet cryptologic/signals intelligence/direction finding systems (5 ACAT III/IV programs, $1.2 billion budget in the FYDP)
• Software is the backbone of the capability provided by PMW-189
  – Complex, operationally demanding applications
  – Historically, software has been our “Achilles’ heel”
Prioritized Main Issues
• Our programs have a high rate of rework and scrap
• Our products have numerous defects after delivery to the fleet user
• We seem to have difficulty finding the right developer who can deliver a quality product
• We are reactive rather than proactive
Cause and Effect
[Fishbone diagram: requirements churn, poorly defined requirements, undefined development processes, a poor foundation for continued development, limited staff experience, and poor V&V feed the effect: High Rework and Scrap]
Phase One
Focus on improving the development and requirements processes
Phase One Implementation Strategy
• Baseline current performance
• Focus on top risk items
• Select a maximum of four changes for simultaneous implementation within the program
• Motivate the entire team for implementation of the selected best practices
Baseline Current Product
• Utilizing the Software Program Managers Network (SPMN) 16 Point Plan to:
  – Assess program health and status
  – Realize the true state of software processes
  – Identify top risks
  – Validate current schedules and estimates
Process Improvement Focus Was Implementation of the SPMN 16 Point Plan™
The 16 Point Plan comprises 16 proven practices from industry and the SPMN Airlie Council.
Project Integrity
• Adopt Continuous Risk Management
• Estimate Cost and Schedule Empirically
• Use Metrics to Manage
• Track Earned Value (see the earned-value sketch below)
• Track Defects against Quality Targets
• Treat People as the Most Important Resource

Construction Integrity
• Adopt Life Cycle Configuration Management
• Manage and Trace Requirements
• Use System-Based Software Design
• Ensure Data and Database Interoperability
• Define and Control Interfaces
• Design Twice, Code Once
• Assess Reuse Risks and Costs

Product Stability & Integrity
• Inspect Requirements and Design
• Manage Testing as a Continuous Process
• Compile and Smoke Test Frequently

The plan:
• Incorporates proven commercial best practices
• Focuses on high-leverage, bottom-line items
• Brings big savings
• Uses flexible templates

It is:
• Not proprietary
• Specific
• Measurable
• Attainable
• Realistic
• Readily implemented
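As a hedged illustration of the “Track Earned Value” practice above, the sketch below computes the standard earned-value indices (CPI and SPI) from planned value, earned value, and actual cost. The task figures are hypothetical and are not program data; only the textbook formulas are assumed.

```python
# Minimal earned-value sketch using the standard formulas only.
# CPI = EV / AC (cost efficiency); SPI = EV / PV (schedule efficiency).
# All dollar figures below are hypothetical.

def earned_value_indices(pv: float, ev: float, ac: float) -> tuple[float, float]:
    """Return (CPI, SPI) for one reporting period."""
    cpi = ev / ac if ac else float("nan")
    spi = ev / pv if pv else float("nan")
    return cpi, spi

if __name__ == "__main__":
    planned_value = 120_000.0   # budgeted cost of work scheduled to date
    earned_value  = 100_000.0   # budgeted cost of work actually performed
    actual_cost   = 130_000.0   # actual cost of work performed
    cpi, spi = earned_value_indices(planned_value, earned_value, actual_cost)
    print(f"CPI = {cpi:.2f}  (<1.0 means over cost)")
    print(f"SPI = {spi:.2f}  (<1.0 means behind schedule)")
```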
Baseline Process
• First assessment was held at the Program Office and was supported by the developer
• Utilized the ICE/SPMN tool and questionnaire
• Ratings based on the following scale (a roll-up sketch follows this slide):
  1 = Component not evident
  2 = Respondent aware of requirement; not yet implemented
  3 = Implementation under way; not yet in place
  4 = Implementation evident, but not effective
  5 = Effective implementation
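A minimal sketch of how questionnaire responses on the 1–5 scale above could be rolled up into per-practice ratings for the assessment charts. The practice names and scores are hypothetical, and the actual ICE/SPMN tool may weight or aggregate responses differently.

```python
# Hypothetical roll-up of 1-5 questionnaire responses into a per-practice rating.
# The real ICE/SPMN tool may aggregate differently; this is only a sketch.
from statistics import mean

responses = {  # practice -> list of respondent ratings (1-5 scale from the slide)
    "Formal Risk Management": [2, 3, 2],
    "Metrics-Based Scheduling": [1, 2, 2],
    "Earned Value Tracking": [2, 2, 3],
}

for practice, ratings in responses.items():
    print(f"{practice:30s} {mean(ratings):.1f}")
```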
First Assessment Results
[Bar chart: baseline 16-Point Plan assessment ratings (0–5 scale) for each of the 16 practice areas]
Observations
• Unwarranted optimism plagues the program
• Overall processes and requirements not defined
• Schedules and estimates did not have a firm basis
• Difficulties in understanding the impacts of process shortfalls
• Product functionality missing
Subsequent Assessments
• After the baseline assessment, the contractor implemented changes in 12 of the 16 areas
• Teams were formed to work on assigned focus areas
• Risk Management was embedded into daily business practices
• Two follow-on assessments were conducted at 4-month intervals to measure progress
Project Integrity Summary
[Bar chart: first, second, and third assessment ratings (0–5) for each Project Integrity practice]
Construction Integrity
[Bar chart: first, second, and third assessment ratings (0–5) for each Construction Integrity practice]
Product Quality
[Bar chart: first, second, and third assessment ratings (0–5) for the Product Stability & Integrity practices: formal inspections, managing testing as a continuous process, and frequent compiles and smoke tests]
Did It Work?
• Significant improvements in
  – Software durability
  – Software reliability
  – Training
  – Documentation
• Cultural resistance diminishing
Phase One Lessons Learned
• Keys to successful implementation: top management leadership, training, and joint PMW, SSC, and industry teaming
• Real pay-offs occur when risk mitigation is solved by prevention-based technical strategies
• Risk can be a guide to needed process improvement and application of Best Practices
• Only one or two Best Practices can be implemented at a time
• There is benefit in fostering an environment where everything is challenged
• Use of outside experts to assess risks is a good idea
• We want to repeat our successes, not our failures
Priority Two Issue
Our products have numerous defects after delivery to the fleet user
Cause and Effect
[Fishbone diagram: inexperienced testers, poor V&V, an inadequate test schedule, inadequate test practices, lack of inspections, and lack of regression testing feed the effect: High Number of Defects]
Phase Two
Focus on product functionality, reliability, and test strategies
Phase Two Prioritization
• Near Term:
  – Reliability Growth
  – Good Testing Practices
• Mid-Term:
  – Unwarranted Optimism
  – Accountability of Practices
Established a Working Group
• Team members were experienced in software development practices and included an independent member
• Informally assessed test practices
• Identified issues and obstacles
• Defined issues
• Prioritized recommendations for near- and mid-term accomplishment
• Identified required resources for implementation
Reliability Growth Plan
• Near-Term Recommendations
  – Conduct an independent pre-OPEVAL assessment
  – Examine fault profiles
  – Establish a reliability improvement scale
  – Establish quality gates for each DT evolution
  – Conduct a team meeting with OPTEVFOR to review OT rules and how they will be applied to the program
Reliability Growth Plan (cont’d)
  – Establish a cut-off period for any future developments or ECPs and focus the remaining time on building reliability
  – Construct a “backwards” schedule to determine the latest dates for corrective action (a backwards-schedule sketch follows this slide)
  – Review the defect tracking system and required metrics to determine discovery and correction rates
  – Validate the test environment
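One way to read the “backwards schedule” item above: start from the fixed test-event date and subtract each remaining activity’s duration to find its latest allowable start. The activity names, durations, and the end date in this sketch are hypothetical, not program data.

```python
# Hypothetical backwards schedule: work back from a fixed end date to find
# the latest start date for each remaining activity. Dates/durations are made up.
from datetime import date, timedelta

activities = [            # executed in this order, durations in calendar days
    ("Fix remaining priority-1 defects", 30),
    ("Regression test the corrected build", 14),
    ("Dry-run / pre-OPEVAL assessment", 10),
]
end_date = date(2003, 6, 1)   # hypothetical OPEVAL start

latest_finish = end_date
for name, days in reversed(activities):
    latest_start = latest_finish - timedelta(days=days)
    print(f"{name:40s} start no later than {latest_start}")
    latest_finish = latest_start
```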
Reliability Growth Plan (cont’d)
• Mid-Term Actions:
  – Conduct end-to-end path testing
  – Expand white box testing
  – Conduct end-to-end code walkthroughs
  – Determine software reliability allocations for all subsystems
  – Conduct regression testing on each build
Words of Wisdom: “Being naïve and expecting this to be done by statistically unsophisticated people who can’t or won’t do the math in order to discover which caveats do and don’t apply in your case.” Testing: Best and Worst Practices – A Baker’s Dozen, by Boris Beizer
Test Practice Recommendations
• Change the test structure: add enhanced negative testing and system-level white box testing
• Enhance the current regression test process
• Conduct path analysis and schedule testing to focus on the top 20% of paths utilized (see the path-ranking sketch below)
• Complete a pedigree analysis
• Conduct Fagan inspections on key threads/critical paths
Words of Wisdom: “Good testing practices can’t be defined by simple rules of thumb or fixed formulas. What is best in one circumstance can be worst in another.” Testing: Best and Worst Practices – A Baker’s Dozen, by Boris Beizer
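A hedged sketch of the path-analysis recommendation above: rank execution paths by observed usage and direct test effort at the most-used 20%. The path identifiers and execution counts are invented for illustration; the real analysis would draw on instrumented or operational usage data.

```python
# Hypothetical path-usage ranking: focus testing on the top 20% most-used paths.
usage = {                 # path id -> observed execution count (invented numbers)
    "login->search->report": 5400,
    "login->collect->geo-locate": 3100,
    "login->archive->export": 900,
    "admin->config": 250,
    "maintenance->diagnostics": 120,
}

ranked = sorted(usage.items(), key=lambda kv: kv[1], reverse=True)
top_n = max(1, round(0.2 * len(ranked)))          # top 20%, at least one path
for path, count in ranked[:top_n]:
    print(f"prioritize testing: {path} ({count} executions)")
```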
Interim Results
• Reliability growth is the main focus
  – 8% improvement in MTBOMF (sw)
• Defect discovery rate
  – More robust test environment
  – 48% increase in PCRs discovered within the first month of implementation (see the metrics sketch below)
• Defect removal rate
  – Closure rate is steady
  – Fix stages are scheduled between major test events
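To make the interim metrics above concrete, the sketch below computes a software MTBOMF figure and simple monthly defect discovery/closure rates from a hypothetical PCR log. None of the numbers are program data, and the program’s actual metric definitions may differ.

```python
# Hypothetical reliability/defect metrics in the spirit of the slide above.
# MTBOMF(sw) = operating hours / number of software operational mission failures.
operating_hours = 1200.0
sw_mission_failures = 4
mtbomf_sw = operating_hours / sw_mission_failures
print(f"MTBOMF(sw) = {mtbomf_sw:.0f} hours")

# Monthly discovery vs. closure rates from a made-up PCR log.
pcrs_opened_by_month = {"Jan": 22, "Feb": 31, "Mar": 18}
pcrs_closed_by_month = {"Jan": 15, "Feb": 24, "Mar": 26}
for month in pcrs_opened_by_month:
    opened = pcrs_opened_by_month[month]
    closed = pcrs_closed_by_month[month]
    print(f"{month}: opened {opened}, closed {closed}, backlog change {opened - closed:+d}")
```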
Priority Three Issue
• We seem to have difficulty finding the right developer who can deliver a quality product
• We are reactive rather than proactive
Cause and Effect
[Fishbone diagram: a poor selection process, ineffective contracting, inadequate development expertise, past performance, poor estimation techniques, and an inadequate acquisition strategy feed the effect: Poor Acquisition Practices]
Phase Three
Evolve the Program Office into a world-class systems acquisition team by developing a strong software and hardware knowledge base.
What’s Next?
• Phase Three – Focus on the expertise required to become a world-class program office:
  – Expand the knowledge base from hardware-intensive to software-intensive acquisition management
  – Institutionalize Risk Management across all programs
  – Continually measure performance against standards
  – Remove crisis management from program environments
  – Allow managers to focus on what they have to do, not which fire has to be put out
  – Build an environment of trust where potential problems can get a thoughtful analysis
Consider How Far We’ve Come!
[Timeline, 1998 to 2002: Risk Management, Strategic Planning, Best Practices, the 16 Point Plan, and HPO]
Patterson-Conner Change Adoption Model
[Chart: degree of support for change rises through the Preparation, Acceptance, and Commitment phases; the program’s initiatives (Strategic Planning, Risk Management, Best Practices, 16 Pt Plan, HPO) are mapped along the curve]
• Contact: Presentations/PR on how it works/examples
• Awareness: Longer tutorials for different roles in the organization; quick appraisal to understand the baseline
• Understanding: Process implementation approach; detailed course for change agents in the model
• Trial Use: Measurement system; piloting plan; customized training for adopters
• Institutionalization: Policies/support structures; new employee orientation; tailored model adoption; leading and reinforcing change; periodic organizational re-appraisal
Garcia, Suzanne, “Are You Prepared for CMMI?”, CrossTalk, Mar 2002
Implementation Strategy
• Assess and baseline acquisition performance based on the SA-CMM and DoD instructions
  – Identify deficiencies
  – Provide training
  – Reassess to monitor progress
  – Conduct gap analysis
• Determine high-risk areas and provide the necessary training to construct a plan for corrective action
• Revitalize the risk management program
Implementation Strategy (cont’d)
• Establish a metrics IPT to provide performance measurements
• Employ a self-evaluation program
• Put into practice the transition mechanisms required to proceed to the next level of excellence
Lessons Learned
• Develop a potential solution set by analyzing causes and effects
• Select two process improvements that align with top risks, resource availability, and team capabilities
• Apply a structured approach in increments that are attainable and measurable
  – “Short-term wins” build enthusiasm and commitment to change
  – ROI
Lessons Learned
• Measure your progress incrementally and continue to refine your best practices
• Broaden your application after you have proven the concept
• Select the next two improvements and begin the cycle again
Lessons Learned
• Bottom line:
  – Start small, adopt a process, implement the strategy, and continually refine
  – Don’t become discouraged if improvements are not realized immediately
  – Get total commitment from every member of the team
  – Mature your environment and processes in parallel to obtain institutionalization
  – Employ leadership which creates an environment that is conducive to change and maintains a vigilant focus!

“Don’t Bite the Whole Apple!”