Project Manager’s Guide to Systems Engineering Measurement for Project Success
A Basic Introduction to Systems Engineering Measures for Use by Project Managers

Document No.:  INCOSE-TP-2015-001-01


Version: 1.0
Date:  21 March 2015

Authors
This document was prepared by the Measurement Working Group of the International Council on Systems
Engineering. The authors who made a significant contribution to the generation of this Guide are:
Ronald S. Carson, PhD, ESEP, Lead – The Boeing Company (retired)
Paul J. Frenz, CSEP, MWG Chair – General Dynamics
Elizabeth O’Donnell – The Boeing Company

REVISION HISTORY
Version | Revision Date | Comments/Description
New | 21 March 2015 | Approved by INCOSE Technical Board as an INCOSE Technical Paper

Copyright © 2015 by the International Council on Systems Engineering. All rights reserved.

Published by the International Council on Systems Engineering, San Diego, California

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical,
photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without
the prior written permission of the Publisher. Requests to the Publisher for permission should be addressed to the INCOSE Administrative
Office, International Council on Systems Engineering, 7670 Opportunity Rd #220, San Diego, CA 92111-2222 USA, +1 858.541.1725, publications@incose.org.
Limit of Liability/Disclaimer of Warranty:  While the publisher and author have used their best efforts in preparing this publication, they
make no representations or warranties with respect to the accuracy or completeness of the contents of this publication and specifically
disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales
representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should
consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial
damages, including but not limited to special, incidental, consequential, or other damages.

INCOSE publishes its products in a variety of formats. For more information about INCOSE products, visit our web site at www.incose.org.

ISBN: 978-1-937073-06-01

General Citation Guidelines:  References to this publication should be formatted as follows, with appropriate adjustments for formally
recognized styles:
INCOSE (2015). Project Manager’s Guide to Systems Engineering Measurement for Project Success: A Basic Introduction
to Systems Engineering Measures for Use by Project Managers (Version 1.0). R. S. Carson, P. J. Frenz, and E. O’Donnell.
San Diego, CA: International Council on Systems Engineering.
INCOSE Notices
Author Use:  Authors have full rights to use their contributions unfettered, with credit to the INCOSE technical source, except as noted in the
following text.
INCOSE Use:  In accordance with Section 107 of the 1976 United States Copyright Act, limited use is granted to reproduce for purposes
such as criticism, comment, news reporting, teaching, scholarship, or research. Reproduction for these purposes is limited to one 1,000-word
extraction from legally acquired copies of this publication. For instructional purposes, only one copy of the 1,000-word extraction is allowed
per student. All limited use shall provide attribution to INCOSE and the original author(s) where practical, provided this copyright notice is
included with all reproductions. No other use, such as preparing derivative works or redistribution, is granted.


Table of Contents

1 INTRODUCTION....................................................................................................................................................................... 1

Project Management and Measurement....................................................................................................................... 1


How This Guide Is Organized.............................................................................................................................................. 2
Symbols Used In This Guide................................................................................................................................................ 2

2 MEASUREMENT IN SYSTEMS ENGINEERING...................................................................................................... 3

What is Systems Engineering?........................................................................................................................................... 3


Value of Measurement........................................................................................................................................................... 4
Why Measure Systems Engineering?.............................................................................................................................. 4
Measurement as part of a Feedback Control System.............................................................................................. 4
Top Ten Reasons to Measure.............................................................................................................................................. 5

3 QUICK-START GUIDE TO SE MEASUREMENT FOR PROJECT MANAGERS .......................................... 6

Quick-Start Measurement Questions.............................................................................................................................. 6


Overview of Measurement Guide Table......................................................................................................................... 8

4 IDENTIFYING AND MEASURING TECHNICAL DEBT........................................................................................10

What is the Technical Debt Trap?....................................................................................................................................10


Why is Technical Debt Incurred?.....................................................................................................................................11
How is Technical Debt Incurred?....................................................................................................................................11
How to Avoid and Measure Technical Debt!..............................................................................................................12

5 PROJECT TECHNICAL MEASURES THROUGHOUT THE LIFE CYCLE.....................................................16

Identifying measures that help manage the risks....................................................................................................16


Schedule...............................................................................................................................................................................17
Problem Reports and Peer Reviews.........................................................................................................................18
Technical Uncertainty Reduction..............................................................................................................................19
Scope Change....................................................................................................................................................................20
Technology Readiness Level or Technical Maturity.........................................................................................21
Solution Satisfies Requirement..................................................................................................................................22
Technical Performance..................................................................................................................................................23
Counts and Stability of Elements of the System................................................................................................24
Reliability, Maintainability, and Availability...........................................................................................................25
Defect Containment........................................................................................................................................................26

6 A PROJECT MEASUREMENT CASE STUDY............................................................................................................27

Cosmo Development Project – A Case Study...........................................................................................................27

7 ACRONYMS...............................................................................................................................................................................58


Preface
Acknowledgements
The completion of this document required the review and comment of members from the Measurement
Working Group and other groups in INCOSE. Their efforts are greatly appreciated.

INCOSE technical documents are developed within the working groups of INCOSE. Members of the
working groups serve voluntarily and without compensation. The documents developed within INCOSE
represent a consensus of the broad expertise on the subject within INCOSE.

Comments
Comments for revision of INCOSE technical reports are welcome from any interested party, regardless
of membership affiliation with INCOSE. Suggestions for change in documents should be in the form of a
proposed change of text, together with appropriate supporting rationale. Please use the feedback form that
is provided at the end of this document. Comments on technical reports and requests for interpretations
should be addressed to:

INCOSE Central Office


7670 Opportunity Rd., Suite 220
San Diego, CA 92111-2222

http://www.incose.org/practice/techactivities/wg/measure/

Additional Copies / General Information


Copies of this document, as well as any other INCOSE document can be obtained from the INCOSE
Central Office. General information on INCOSE, the Measurement Working Group, any other INCOSE
working group, or membership may also be obtained from the INCOSE Central Office at:
International Council on Systems Engineering
7670 Opportunity Road, Suite 220
San Diego, California 92111-2222  |  USA
telephone: +1 858-541-1725
toll free phone (US): 800-366-1164
fax: +1 858-541-1728
e-mail: info@incose.org
web site: http://www.incose.org


1 INTRODUCTION

In This Chapter
• Project Management and Measurement

• How this Guide is Organized

• Symbols Used in this Guide

Project Management and Measurement


Project managers have many things on their plates when coordinating the activities to develop products,
services or systems. There are many processes and steps that must be taken to achieve a successful
project, to meet or exceed stakeholder expectations and to fulfill project requirements. Measurement is
one of these essential activities, and is instrumental in ensuring the success of a project. As identified by
PMI (Project Management Institute), “if you cannot measure it, you cannot control it; if you cannot control it,
you cannot manage it.” 1

Typical project measures, such as cost and schedule measures, provide the project manager (and systems
engineer) visibility into how well the project is tracking against its planned budget and schedule targets.
For the project manager, meeting these target measures – while crucial to assessing the performance
of the system that produces the product or service – is not all that should be considered to ensure the
project achieves its technical objectives and is on the path to success.

With the increasing complexity of systems and the important role of systems engineering (SE) on today’s
projects, there is other information that can be measured, evaluated, and acted upon by the managers
who control the SE processes on these projects to drive project success.

Analysis, status reporting, and assessment of the project’s systems engineering measures can
complement cost and schedule control, and can help meet programmatic targets. By tracking these SE
measures the project manager gains visibility into whether the delivered system will meet its requirements
and satisfy the customer’s needs and expectations. The project manager or systems engineer needs to
decide which measures are worth addressing or tracking, what tailoring is needed, and how to act upon
the results of the measurements.

This guide provides explanations and examples of some of the systems engineering data, and how they
can be collected, measured, tailored, and controlled to help ensure project success. This is not a
comprehensive treatise on creating and implementing a measurement program. Rather, it is an informative
guide to assist you in determining which measures can best enable the success of your project. If you’re
looking for straightforward information and measurement techniques, this is the guide for you.

1 2002 PMI briefing “You Can’t Manage What You Don’t Measure!!!,” Theresa Ramirez, PMP, 10/18/2002, https://
www.hashdoc.com/documents/2520/you-can-t-manage-what-you-don-t-measure (accessed 17 February 2015).


How This Guide Is Organized

The six chapters in this guide are geared toward helping you understand the problems that systems
engineering measurement aims to identify and control, and the steps involved in the development,
analysis, and utilization of these measures.

Here’s an overview of what’s inside:


Chapter 2:  Measurement in Systems Engineering
Chapter 3:  Quick-Start Guide to SE Measurement for Project Managers
Chapter 4:  Identifying and Measuring Technical Debt
Chapter 5:  Project Technical Measures Throughout the Life Cycle
Chapter 6:  A Project Measurement Case Study

Symbols Used In This Guide


As you peruse this guide, you will observe several symbols that are designed to alert you to key information.

This graphic is used to point out a key idea or concept.

This symbol appears next to suggestions or ideas that are important to remember.

This symbol identifies a suggestion that should be heeded, to avoid potential pitfalls.

This symbol indicates that more information and detail are provided in later sections.


2 MEASUREMENT IN SYSTEMS ENGINEERING

In This Chapter
• What is Systems Engineering?

• Value of Measurement

• Why Measure Systems Engineering?

• Measurement as Part of a Feedback Control System

• Top Ten Reasons to Measure

What is Systems Engineering?


Systems Engineering (SE) is an interdisciplinary approach to successfully creating systems that meet a
defined set of business and technical requirements and satisfy the user’s needs. It focuses on the early
part of the product development cycle, to define customer needs and required functionality, and document
system requirements. It continues with design synthesis, analyses, and system validation. The Systems
Engineering process considers the complete problem and life cycle — concept, design, development,
manufacturing, operations, training, test, and disposal — within the context of cost, schedule, performance
and risk.

[Figure 2-1 depicts the systems engineering “Vee”: the left leg descends from system development through system element development via decomposition and architecture definition; the right leg ascends from system element realization to solution/system realization via integration and verification; integration, verification, and validation planning connects the two legs at the upper and lower levels.]

Figure 2-1.  Systems Engineering “Vee model.” 2

2 INCOSE Systems Engineering Handbook, v3.2.2, Figure 3-4


Value of Measurement
Measurement is central to achieving the objectives of a project and an effective systems engineering
process. It is essential both to building awareness and to increasing our knowledge. We measure in order to
assess where we are in terms of current performance, to set goals for improvement, and to foresee what
could potentially occur based on a current situation.

With good attention to measurement comes knowledge, essential for project success – knowledge about
project progress, performance, and problems. A defined system for measurement provides the project
manager and others with an effective way to communicate this knowledge, to track project goals and
objectives, and to provide rationale for decisions.

Why Measure Systems Engineering?


Measurement for Systems Engineering (SE) helps the project manager achieve the objectives of
transforming needs and requirements into a system design solution, ensuring functional, physical, and
operational integration, assessing quality, and managing risks.

Systems engineering measures allow you to gauge project performance with respect to these
objectives. With attention to good SE measures, you can more readily see how your investment in systems
engineering results in continuous improvement and the achievement of project goals.

Measurement as part of a Feedback Control System


As presented and discussed in the INCOSE Systems Engineering Measurement Primer 3, measurement
is a key element in a management feedback control loop that allows for the monitoring of SE processes.

[Figure 2-2 depicts measurement as a feedback control loop around the systems engineering process: resources feed the process, which produces work products as output (e.g., plans, designs, requirements, specifications, analyses, hardware, software, integration control documents, V&V procedures); resource, process, and product measures flow into measurement, analysis, and action, which in turn adjusts the process.]

Figure 2-2.  Systems Engineering Measurement as a Feedback Control System 4.

3 http://www.incose.org/ProductsPubs/pdf/INCOSE_SysEngMeasurementPrimer_2010-1205.pdf (accessed
17 February 2015).

4 INCOSE Systems Engineering Measurement Primer, December 2010 (accessed 17 February 2015).


Processes are executed through the Systems Engineering functional activity to produce products, as
shown in Figure 2-2. The feedback control loop to control SE processes is indicated by the reversed arrows.

Key data collected and analyzed as the processes take place provide insight into the activities and the
progress of developing the systems engineering products. These measurement data and analyses
enable you to make necessary adjustments, assess project progress, and maintain control.
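To make the feedback loop concrete, here is a minimal Python sketch of one measurement cycle: collect a product measure, analyze it against the plan, and act when the variance exceeds a tolerance. The function names, the compliance measure, and the 90% planned value are illustrative assumptions, not part of the INCOSE material.

    # Illustrative sketch of measurement as a feedback control loop (names and values are assumed).

    def collect_measure(work_products):
        """Collect a simple product measure: fraction of work products assessed as compliant."""
        compliant = sum(1 for wp in work_products if wp.get("compliant"))
        return compliant / len(work_products) if work_products else 0.0

    def analyze(actual, planned, tolerance=0.05):
        """Compare the actual value with the plan; return the variance only if it exceeds tolerance."""
        variance = planned - actual
        return variance if variance > tolerance else None

    def act(variance):
        """Placeholder corrective action; in practice this feeds risk and issue management."""
        print(f"Variance of {variance:.0%} exceeds tolerance: investigate root cause and replan.")

    # One measurement cycle: 3 of 5 work products compliant vs. a planned 90% at this point.
    work_products = [{"compliant": True}, {"compliant": True}, {"compliant": True},
                     {"compliant": False}, {"compliant": False}]
    variance = analyze(collect_measure(work_products), planned=0.90)
    if variance is not None:
        act(variance)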

Top Ten Reasons to Measure


A measurement-based feedback system improves the effectiveness and efficiency of the project team. It
enables tracking product progress, improved communication, fine-tuned processes and more. With good
measures, you can easily see the results of your systems engineering activities. The top ten reasons are:

1. Detect and analyze issues and trends – Use your measurement data and analysis to help you
identify and understand the areas that are going well or need improvement.

2. Identify and correct problems early – Provide the information early on to enable the implementation
of appropriate corrective action and to avoid or minimize cost overruns and schedule slippages. By
getting feedback on the system processes, you’ll be able to recognize error-prone products that can
then be corrected earlier, generally at lower cost.

3. Assess the quality – Use your data to help you evaluate the quality of the engineering or program
technical products.

4. Make key tradeoffs – Measurement throughout the project helps in the performance of tradeoff
analysis, and the selection of the optimal approach or determination of feasible alternative solutions.

5. Enable a focus on risk areas – Measures help identify problems and complexities (requirements
development, design, technical performance progress, etc.), to help identify root causes of risks and
problems and allow you to manage risks before they become issues.

6. Track specific project objectives – By providing status on progress with respect to achieving technical
and management objectives, you can perform better technical planning, make adjustments to
resources based on discrepancies between planned and actual progress, and make other decisions
to revise project plans.

7. Communicate effectively – Measures can help you provide the project team with quantified infor-
mation related to status, potential problems, progress, and completion. Regular communication
on measures can increase awareness of progress, reduce uncertainty and ambiguity, and improve
organizational focus and comprehension.

8. Know which data to ignore – Learning to read your measures will help you discern which data you
can ignore. By sorting those out, you will focus on the measures that are useful to you.

9. Spend your time where it matters – Learning which measures are most important for your project
means getting more time to focus on other areas.

10. Describe rationale for decisions – A good measurement system presents the information for
decision-making and provides accountability of the decisions made.


3 QUICK-START GUIDE TO SE MEASUREMENT FOR PROJECT MANAGERS

In This Chapter
• Quick-Start Measurement Questions

• Overview of Measurement Guide Table

Using Systems Engineering measures, you can evaluate results or progress related to the specific
aspects of your project. Some measures can be more applicable than others, depending on project
considerations such as project phase, development strategy, applicable tools and databases, the product
domain, and particular category of measurement. An understanding of the factors that influence these
types of project considerations lets you adjust your approach and select appropriate measures, as well as
make improvements that can help you achieve your goals. The following sections can help to answer key
questions and get you started towards selecting what to measure.

Quick-Start Measurement Questions


Why should I measure?

What gets measured gets done. It’s that simple.

What should I measure?

You should measure what is critical to your program’s success. See Chapter 5 for
guidance on measurement selection.

What do I measure during each phase of development life cycle? 5

See the next section and Chapter 5 – Use the Measurement Guide Table to identify appropriate
measures.

If I were going to track only one or two SE measures, what would you recommend?

This is highly project-specific, but if forced to make a recommendation: from a scheduling
perspective, measure Problem Report Aging and technical inch-stone late starts and late finishes.
From a technical perspective, measure reduction in uncertainty, over time, of any technical or
management parameter necessary for making decisions.

What tools should I use to work with measures?

Microsoft Excel® provides enough capability for most measurement efforts. Examples in
Section 5 are done with Microsoft Excel.

5 Reference INCOSE SE Handbook and ISO/IEC 15288:2008.


How do I measure with minimum budget to achieve the most?

You want to select the “critical few” measures that provide the insight into areas of highest risk
to your specific project.

What is technical debt?

Technical Debt: The promise to complete a technical shortcoming in the future while declaring
it complete enough today. See Chapter 4, Identifying and Measuring Technical Debt.

Where do I find more information on measures?

INCOSE has a number of resources on measurement guidance, including the


Measurement Primer and Leading Indicators Guide. See also ISO/IEC 15939:2007.

Where do I find more on technical measures – measures of performance (MOP), measures of effectiveness
(MOE), and technical performance measures (TPM)?

See the INCOSE Technical Measurement Guide.

What do I measure if I am using an Agile/Spiral development life cycle?


Expect things to change in an agile environment. So consider measures applicable to meeting
schedule commitments and satisfying user needs.

My group is using Model-based SE (MBSE). What should I measure?


Model-based SE makes it easier to count things. Use the database tools to count things
monthly and monitor for changes. Expect stability ahead of technical reviews. High volatility is a
form of technical debt. Use the model to identify gaps and inconsistencies among requirements
and interfaces.

What do I measure given the knowledge/experience/maturity of my staff?

Knowledgeable staff can effectively compensate for not having detailed processes. New
product domains or customers may present unexpected challenges to novice or even to
experienced staff, so additional technical maturity measurements may be useful.

What do I do when data are disputed by members of the project team?

Let’s go to the expert, Dr. W. Edwards Deming: “In God we trust; all others bring data.” In other
words, trust the data first. Then ask, “Why are the data in question?”


What do I measure with regard to a specific product domain, e.g., software-intensive, cyber, regulatory
(FAA, FDA), medical, aerospace, automotive, etc.?

See the “Product Domain” rows in the next section and in Chapter 5.

Overview of Measurement Guide Table


The specific measures a project manager should apply, or from which a project will most benefit, depend
on various factors, such as phase of the life cycle, type of development project, and specific categories of
measures. These factors are shown in the following table and are explained below.

Project Considerations    Applicable Factors
Measurement Category      Technical Quality; Size; Complexity; Stability; Schedule
Phase                     Conceive and Define; Architect and Design; Implement and Integrate; Verify; Validate; Operate and Support
Development Strategy      Waterfall; Agile/Spiral; Increments; Acquirer-Funded; Supplier-Funded
Tools and Databases       Manual or Spreadsheet; Requirements Management; Static Model-Based SE; Simulation-Based SE
Product Domain            Software-Intensive; Hardware-Intensive; Complex; Regulatory Environment; Commercial; Government

The categories of Project Considerations are indicated in the first column: Measurement Category, Phase,
Development Strategy, Tools and Databases, and Product Domain. Each of these is elaborated
in turn so that a project manager can select those measures most applicable. The specific
measures listed in Section 5 are associated with applicable factors in each row.

Measurement Category.  Measures are associated with specific matters that may need to be addressed:
• Technical Quality (maturity, correctness, completeness)
• Size (number of requirements, systems, interfaces, etc.)
• Complexity (number of stakeholders with different needs, number of external system interfaces,
degree of coupling of subsystems)
• Stability (variation of any measure over time)
• Schedule (timely accomplishment of technical tasks)

Phase.  Measures are associated with a specific phase of the project in the life cycle:
• Conceive and Define (identification and development of needs, requirements, and concepts)
• Architect and Design (definition of the system requirements, subsystems, interfaces, and design)


• Implement and Integrate (realization of the system elements in hardware, software, training,
support, and the integration of these various elements)
• Verify (proving by various means that the implemented design satisfies the requirements)
• Validate (establishing that the as-built system satisfies the stakeholder needs)
• Operate and Support (using and maintaining the capabilities of the system, including training)

Development Strategy.  Measures are associated with different approaches to development:


• Waterfall (primarily serial, “finish-to-start” development activity based on well-defined needs)
• Agile / Spiral (flexible or adaptive development with an uncertain end state or user needs at the
beginning)
• Increments (serialized waterfall process with incremental enhancements)
• Acquirer-Funded (development is paid by customer – applies to many government projects)
• Supplier-Funded (development is paid by the supplier – applies to most commercial activities)

Tools and Databases.  Measures associated with different levels of infrastructure for developing and
managing development information:
• Manual or Spreadsheet (development artifacts are primarily paper-based; systems engineering is
labor-intensive)
• Requirements Management (systems engineering automation is limited to managing requirements
for traceability and configuration control)
• Static Model-Based SE (SE uses automated tools for defining and managing system requirements
and architecture)
• Simulation-Based SE (SE requirements and architecture are derived from system simulations so
that validation is inherent in the development process)

Product Domain.  Measures that depend on the type or development environment of the product:
• Software-Intensive (project is primarily software development using existing or off-the-shelf
hardware)
• Hardware-Intensive (project requires significant hardware development and integration)
• Complex (project uses numerous highly interacting elements of hardware and software)
• Regulatory Environment (the product must be certified by one or more regulatory authorities prior
to operation)
• Commercial (the product will be offered for commercial sale)
• Government (the product will be offered primarily for government sale)
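One way to put these project considerations to work is to record, for each candidate measure, the factors under which it applies and then filter against your project’s profile. The following Python sketch illustrates the idea; the measure-to-factor mappings shown are invented examples for illustration, not entries taken from the INCOSE guide.

    # Hypothetical encoding of a measurement guide table; the mappings are illustrative, not normative.
    guide_table = {
        "Problem Report Aging": {"Phase": {"Implement and Integrate", "Verify"},
                                 "Measurement Category": {"Schedule", "Technical Quality"}},
        "Requirements Volatility": {"Phase": {"Conceive and Define", "Architect and Design"},
                                    "Measurement Category": {"Stability", "Size"}},
        "Technology Readiness Level": {"Phase": {"Conceive and Define", "Architect and Design"},
                                       "Measurement Category": {"Technical Quality"}},
    }

    project_profile = {"Phase": "Architect and Design", "Measurement Category": "Stability"}

    # A measure is a candidate if every entry in the profile appears among its applicable factors.
    candidates = [name for name, factors in guide_table.items()
                  if all(value in factors.get(key, set()) for key, value in project_profile.items())]
    print(candidates)  # -> ['Requirements Volatility']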


4 IDENTIFYING AND MEASURING TECHNICAL DEBT

In This Chapter
• What is Technical Debt?
• Why is Technical Debt Incurred?
• How to Avoid and Measure Technical Debt!

Imagine you have been managing a project for eight months. You have met all of your milestones,
successfully completed all reviews, and your program measures are all positive, with both CPI and SPI
greater than 1.0, risks appropriately treated, and you have hardly touched your management reserve or
contingency schedule.

As you lean back, reflecting on your program management skills and your future promotion, in walks your
technical lead, who mentions that there are still a few issues to clean up before you can go to production.
Snapped back to reality and sitting up straight, you ask “What do you mean? The schedule says we’re done
and ready to build!”

What hit this program manager is Technical Debt. Technical Debt is the promise to complete a technical
shortcoming in the future while declaring it “complete enough” today. And he/she is not the first program
manager to receive this surprise when all program measures look good.

In this chapter, you will learn what comprises technical debt, how it is incurred, how you can identify and
measure it, and techniques for debt management.

What is the Technical Debt Trap?


Similar to personal debt, the program is explicitly, or more commonly implicitly, deferring a technical
challenge or risk to the future because you do not want to, or cannot, spend the time and/or money to
solve the technical challenge before declaring the task complete.

Common forms of technical debt are:


• Contract Data Requirement List (CDRL) item delivered but not approved
• Contract Data Requirement approved (by response or by no response) but still wrong
• Incomplete/incorrect Drawings
• Incomplete/incorrect Design
• Incomplete/incorrect Interfaces
• Incomplete/incorrect Specification
• Any work product declared “done” in the Earned Value Management System with effort remaining
• Open Problem Reports (PR)
• Incomplete Peer Reviews
• Hidden Rework, where the costs are hidden in other work packages.


The challenge for the program manager is that the program management measures don’t typically track
technical debt, because the schedule says it is “100% – Done.”

Technical Debt:  The promise to complete a technical shortcoming in the future while
declaring it complete today.

Why is Technical Debt Incurred?


No one consciously sets out to create technical debt. It is a product of undisciplined responses to
pressures to meet schedules or budgets, and sometimes is a matter of inexperience. Technical debt can
build until it is too extreme to ignore. Some examples of traps observed in actual projects follow:

Example – Delivery of an incomplete document:

The schedule has as a task “deliver a CDRL,” typically information in some form. The CDRL item is
delivered, so the Control Account Manager (CAM) takes 100% credit and moves to the next task. The
schedule was not developed to have a separate task for CDRL approval with duration and budget.
Depending on your customer, that CDRL may require rework for any of a number of reasons, such as
missing or incorrect content. That unplanned rework is technical debt.

Example – Deficient technical content:

The drawing package is to be completed and submitted four weeks prior to a program review. In order to
meet the schedule, the drawing package is declared complete and is submitted prior to the formal review.
While the customer is reviewing the drawing package, the work continues, trying to clean up the loose
ends prior to the formal review. The drawing package was declared fully complete, within schedule, and
there is no formal task or budget monitoring the remaining efforts.

Just like personal debt, these unaccounted-for tasks begin to snowball, and one day the program
manager uncovers a significant issue that can no longer be ignored.

How is Technical Debt Incurred?


Fundamentally, there are three ways in which Technical Debt is incurred.
1. Omission: Tasks unaccounted for within schedule and/or budget;
2. Wishful Thinking: Tasks declared completed but not really complete; and
3. Undetected Rework: Tasks believed to be completed but done incorrectly.

1. Omission: Tasks unaccounted for within schedule and/or budget


Sorry to say, but most of these items are learned by experience or through a veteran team member.
The key is to identify the tasks that are left out of the schedule and/or budget and to account for them.
Commonly unaccounted-for tasks include:
• Contract Data Requirement rework upon disapproval
• Incomplete (Drawings/Design/Interfaces/Specification) efforts or rework


• Incomplete Peer Review effort and resulting rework
• Open Problem Reports *
• Any unplanned Rework

2. Wishful Thinking: Tasks declared to be completed but not really complete


Common incomplete tasks include:
• Any work product inappropriately declared as done for Earned Value Management System (EVMS)
• Incomplete Peer Review effort and resulting rework

People want to believe that things are not so bad and that they can recover without impact. This
is rarely the case, for many reasons. Your job is to use the proper techniques to prevent, avoid
and detect Technical Debt in a timely manner and to take corrective action to minimize impact.

3. Undetected Rework: Tasks believed to be completed but done incorrectly


Common tasks believed to be completed, but done incorrectly, include:
• Contract Data Requirement approved but still requiring rework
• Incorrect (Drawings/Design/Interfaces/Specification) efforts or rework

These are hard to detect and to plan for in advance. Since the product status is declared as complete, it
would have been baselined and would require a Problem Report to update. You are able to measure
incomplete work once the Problem Report is written.

* A Problem Report is a tracking method to control changes to baselined or completed


products. Once a product has been declared done, any change or correction requires
a Problem Report to be written and approved, typically by a change control board.

How to Avoid and Measure Technical Debt!


To avoid Technical Debt, you will need to apply three methods:

1. Account for unscheduled tasks


This seems obvious, but a little extra care to completely schedule follow-on tasks can provide the
needed insight to guard against this type of Technical Debt. You begin by identifying classes of
scheduled tasks that typically have follow-on tasks. Depending on your contract, Contract Data
Requirements may require approvals by customer. Create separate tasks for rework and customer
approval for each, and place them within your schedule with the appropriate logical dependencies,
duration, and resources. Depending on the Contract Data Requirement, the rework task should be
sized appropriately (e.g., a 10-page non-technical document can be estimated as a much smaller
effort than a 200-page technical specification). Note that you might not want to schedule these as
sequentially linked tasks if not required.


Work your way through each of the classes of unscheduled tasks. You will notice that this
may add duration to your schedule. Sorry, but this is reality. Some of the rework will not be
required, while other rework may take longer than scheduled. However, having the work properly
scheduled allows you the opportunity to staff the right resources and to have adequate budget to
complete the required tasks.

Another advantage of explicitly scheduling these tasks is that you will be more likely to allocate an
appropriate budget in a separate work package. This allows you to take the EVMS credit for the initial
task completion and to appropriately schedule follow-on tasks.6

2. Establish measures to provide early warning


There are a number of measurement candidates for measuring the incursion of Technical
Debt. The trick is selecting the critical few that are appropriate for your project. You want to
select measures that are easy to collect and that provide an early warning of the incursion
of Technical Debt. For example, counting “TBDs” (to be determined) and “TBR” (to be resolved or
refined) is useful when the program properly uses them to manage technical debt — the realization
that more work remains in defining requirements, writing plans, developing interfaces, or completing
analyses. However, these simple measures can be abused with disastrous program results.7 Further,
there may be implicit “TBDs” that cannot be easily counted without some analysis. Appropriate
caution is required to ensure that the simple measures are not misleading.
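As a small illustration, explicit TBD/TBR markers can be counted directly from requirement text, for example from a requirements export. The Python sketch below is illustrative only; the sample requirement statements are invented, and, as noted above, implicit TBDs will not be caught by a simple text count.

    # Minimal sketch: count explicit "TBD"/"TBR" markers in requirement statements (sample data invented).
    import re

    requirements = [
        "The system shall operate at altitudes up to TBD meters.",
        "The interface data rate shall be 10 Mbps (TBR).",
        "The system shall provide 8 hours of battery life.",
    ]

    pattern = re.compile(r"\bTB[DR]\b")
    open_items = {i: pattern.findall(text) for i, text in enumerate(requirements) if pattern.search(text)}
    print(f"{len(open_items)} of {len(requirements)} requirements contain TBD/TBR markers")
    # Caution: implicit TBDs (vague or missing content) are invisible to this kind of simple count.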

Each project and situation is different, but here is an example of a measure for more formal situations,
with a formal configuration management system: Measuring age and state of problem reports (PR). You
can typically extract this data electronically from whatever tool is used to store the problem reports.

Most tools move the problem reports through a series of states. Typical states are: Open, Analysis,
CCB Review, Assigned, Verification, and Closed. It is important to close the problem reports as soon
as possible to reduce cost to the program. It is also important to make sure that all problem reports
are analyzed as soon as possible. Any problem report in an Open or Analysis state represents an
uncharacterized risk to the program. An example of a problem report aging graph is shown in Figure
4-1. All problem reports Open over one month and all PRs in Analysis over one month should be
reviewed; all non-Closed problem reports should be reviewed after three months.
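The aging analysis itself is easy to automate once the problem reports can be exported. The Python sketch below assumes a hypothetical CSV export with "state" and "opened" columns; adapt the field names to whatever your problem-report tool actually produces.

    # Minimal sketch: bucket problem reports by state and months open, then flag those needing review.
    import csv
    from datetime import date

    def months_open(opened: date, today: date) -> int:
        return (today.year - opened.year) * 12 + (today.month - opened.month)

    today = date(2015, 3, 1)
    aging = {}  # (state, months-open bucket) -> count
    with open("problem_reports.csv", newline="") as f:   # assumed columns: id, state, opened (YYYY-MM-DD)
        for row in csv.DictReader(f):
            bucket = min(months_open(date.fromisoformat(row["opened"]), today), 6)  # cap at "6+"
            aging[(row["state"], bucket)] = aging.get((row["state"], bucket), 0) + 1

    # Review rules from the text: Open/Analysis PRs over one month, any non-Closed PR over three months.
    needs_review = [key for key, count in aging.items()
                    if (key[0] in ("Open", "Analysis") and key[1] >= 1)
                    or (key[0] != "Closed" and key[1] >= 3)]
    print(needs_review)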

6 EIA-748C, “Earned Value Management Systems,” http://webstore.ansi.org/RecordDetail.aspx?sku=EIA-748-C


(accessed 17 February 2015).

7 Friedman, George and Sage, Andrew P., “Case Studies of Systems Engineering and Management in Systems
Acquisition,” Systems Engineering, Vol. 7, No. 1, 2004, p. 90.


[Example chart, “All Problem Report Status Aging”: number of problem reports plotted by months open (1 through 6+) and by state (Open, Assign, Analyze, Verify, CCB Review, Closed).]

Figure 4-1.  An example Problem Reporting graph.

Industry data associated with the relative cost of delaying the correction of an error or
making a change is shown in Table 4-1 and also in Figure 4-2. The cost of making the
change/correction escalates the longer you delay. If it would take one unit to make the
change/correction in the Design phase, then it would take 40 units to make the same change/
correction after the System Test phase.
Table 4-1.  Relative Error Cost Found, by Phase. 8

Program Phase      Relative Cost
Design             1
Unit Test          5
Integration Test   16
System Test        40
Acceptance Test    110

[Figure 4-2 plots the cumulative percentage of committed life-cycle cost against time, together with the relative cost to extract defects, which grows from roughly 3–6X during development to 20–100X in production/test and 500–1000X in operations through disposal.]
Figure 4-2.  Committed Life-cycle Cost and Cost to Extract Defects vs. Time 9

8 Mike Phillips, “V&V Principles”, Verification and Validation Summit 2010.  http://www.faa.gov/about/office_org/
headquarters_offices/ang/offices/tc/library/v&vsummit2010/Briefings/V%20and%20V%20Principles%20-%20
M%20Phillips.pptx  (accessed 5 August 2014).

9 INCOSE Systems Engineering Handbook, v3.2.2, Figure 2-4.


3. Provide CAM training for taking schedule earned value credit


A key method for heading off technical debt is to provide your CAMs upfront training on consistent
tools and methods for determining percentage complete on a task. This is a relatively inexpensive
class/meeting to review the standards to be used on the project so that all CAMs use the same criteria.
Unfortunately, this training is commonly overlooked on the assumption that all CAMs were trained in the same methods.

The program/organization needs to define what “earning rules” will be used to take credit for tasks.
There are several commonly used methods, such as the 50/50 method, which allows 50% credit when
you start a task and 50% when it is completed. Others include 20/80, 25/75, and 0/100. It is important
to select an earning method and to document it, with appropriate guidance, in a program CAM Earned
Value Guidance document that is available to all CAMs; this also facilitates knowledge transfer if a
change in personnel occurs. This CAM Earned Value Guidance document should emphasize the importance
of taking earned value credit correctly and ethically to avoid technical debt.
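A minimal Python sketch of these earning rules is shown below; the budget value is illustrative, and a real EVMS implements far more than this, but the sketch shows why a started-but-unfinished task should never be allowed to claim full credit.

    # Minimal sketch of start/finish earning rules (50/50, 20/80, 25/75, 0/100); values are illustrative.

    EARNING_RULES = {"50/50": (0.50, 0.50), "20/80": (0.20, 0.80), "25/75": (0.25, 0.75), "0/100": (0.0, 1.0)}

    def earned_value(budget, started, finished, rule="50/50"):
        """Return the earned value credit for one work package under a start/finish earning rule."""
        start_credit, finish_credit = EARNING_RULES[rule]
        return budget * (start_credit * started + finish_credit * finished)

    # A started-but-unfinished task earns only the start portion; declaring it "done" without finishing
    # the technical work is exactly how technical debt hides inside otherwise healthy EVMS data.
    print(earned_value(1000, started=True, finished=False, rule="50/50"))   # 500.0
    print(earned_value(1000, started=True, finished=False, rule="0/100"))   # 0.0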

As part of the training and program CAM Earned Value Guidance document, it is important to ensure
the quality of the work product as well. The CAM is responsible for ensuring the technical quality
and completeness of the work product; otherwise the result is Technical Debt, which will need to be paid with
interest in the future.

Now it is up to you!

The art of managing Technical Debt involves a multi-pronged approach:


• Plan all tasks up front, within your schedule, and with an appropriate control account structure that
helps avoid incomplete tasks being declared complete.
• Select appropriate measures for your project that provide the needed insights into management of
Technical Debt.
• Provide CAM training for taking schedule earned value credit.


5 PROJECT TECHNICAL MEASURES THROUGHOUT THE LIFE CYCLE

In This Chapter
• Identifying measures that help manage the risks in different life cycle phases and for different
types of programs

Identifying measures that help manage the risks


Imagine that you have just been given responsibility for a project. It may be new, or it may have been going
on for a while. You will likely want to check the financial and schedule aspects of the project and to
understand the following: Are we on time? Are we on budget? Do we have a plan? How much work is left,
and how much budget do we have?

Beyond these typical, important project management measures lurks the risk of technical debt.
In the last chapter we examined what can happen if technical debt accumulates from late
starts, incomplete work, deferred decisions, and issues that arise. In this chapter we identify
specific measures for managing technical risk to help avoid the accumulation of technical debt. These
are organized according to the “quick-start” program descriptors in Section 3 for easier reference. Each
measure in the Measurement Guide Table that follows is listed, along with a “how to use” set of steps and
references for further information. “Shading” indicates that the measure may be useful for a program with
the shaded characteristics.


Schedule Measures

[Measurement Guide Table: shading (not reproduced) indicates the applicable factors (Measurement Category, Phase, Development Strategy, Tools and Databases, Product Domain) for which Schedule measures are most useful.]

Measures:
• Late Starts (# or % of events)
• Late Completions (# or % of schedule events)

Delayed starts are leading indicators for delayed finishes. However, be wary of starting tasks when necessary data is not available, is incomplete, or is likely to change, because rework of dependent input is likely; “don’t be a slave to schedule.”

For effective feedback control, the measurement delay should be no greater than the measurement frequency. In this case of weekly schedule measurement, the data should be available before the next week begins.

[Two example charts: “Weekly Inchstones (Cumulative)” tracks planned and actual starts and stops cumulatively, by week, providing overall project schedule status; “Weekly Actual vs. Scheduled Starts and Stops” shows the weekly difference of planned vs. actual starts and stops, providing immediate visibility of schedule compliance.]

For all schedule-related measures it is important to find the root cause of what is
late so that the program critical path is not jeopardized and rework is not incurred by
immature or incomplete work.
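If the schedule tool can export planned and actual dates per task, late starts and late finishes can be counted with a few lines of code. The Python sketch below uses invented task records and field names; substitute your scheduling tool’s actual export format.

    # Minimal sketch: count late starts and late finishes from a schedule export (field names assumed).
    from datetime import date

    tasks = [
        {"name": "Draft ICD", "planned_start": date(2015, 1, 5), "actual_start": date(2015, 1, 12),
         "planned_finish": date(2015, 2, 6), "actual_finish": None},
        {"name": "Peer review ICD", "planned_start": date(2015, 2, 9), "actual_start": date(2015, 2, 9),
         "planned_finish": date(2015, 2, 13), "actual_finish": date(2015, 2, 13)},
    ]

    today = date(2015, 2, 20)   # an unstarted/unfinished task counts as late once its planned date has passed
    late_starts = [t["name"] for t in tasks if (t["actual_start"] or today) > t["planned_start"]]
    late_finishes = [t["name"] for t in tasks if (t["actual_finish"] or today) > t["planned_finish"]]
    print(f"Late starts: {late_starts}; late finishes: {late_finishes}")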


Problem Reports and Peer Reviews Measures

[Measurement Guide Table: shading (not reproduced) indicates the applicable factors (Measurement Category, Phase, Development Strategy, Tools and Databases, Product Domain) for which Problem Report and Peer Review measures are most useful.]

Measures:
• Problem Report Aging (how long after opening until closing, days)
• Peer reviews actually held vs. scheduled (%)

Delayed resolution of problems or delayed review of technical information may accumulate more technical debt and may indicate that critical decisions are being delayed, which jeopardizes the schedule.

Histograms also work well for these types of counting measures. Below left, PR aging is depicted with a histogram showing the number of Problem Reports in each category of delay; below right, the histogram indicates the percentage of peer reviews held on time in each program phase.

[Two example histograms: number of Problem Reports by aging category (>10, >30, >60, >90 days), and Number of Peer Reviews Held on Time (%) by program phase.]


Technical Uncertainty Reduction Measures

[Measurement Guide Table: shading (not reproduced) indicates the applicable factors (Measurement Category, Phase, Development Strategy, Tools and Databases, Product Domain) for which Technical Uncertainty Reduction measures are most useful.]

Measures:
• Reduction in uncertainty (% or number) of each key technical parameter over time, compared with the uncertainty value needed for making decisions
• May be combined with technical performance measures

Unresolved uncertainty carries technical debt into the decision-making process. The goal is not to eliminate the uncertainty, but to reduce it to a level at which a decision can be made with acceptable risk. This applies to individual technical parameters as well as to the results of technical reviews.

Trend lines similar to technical performance measures 10 make the uncertainty visible compared to the needed value. In the example, the uncertainty of Parameter 1 must be reduced below the decision threshold prior to making the decision.

[Example chart: Parameter 1 uncertainty and the decision threshold (%) plotted against time after project start (months); the uncertainty trend crosses the threshold at about month 10.]

For example, the trend line for reduction in Parameter 1 (e.g., weight or power) uncertainty crosses the decision threshold of 3% at month 10; decisions based on Parameter 1 prior to that point are at higher risk of being revisited once Parameter 1 is known with more accuracy. Reduction in uncertainty can arise from developmental testing or through more accurate analysis of the system.

10 INCOSE Technical Measurement Guide (December 2005, accessed 17 February 2015)


http://www.incose.org/ProductsPubs/products/techmeasurementguide.aspx
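A simple way to monitor this trend is to record the parameter’s uncertainty each month and find when it drops below the decision threshold. The monthly values and the 3% threshold in this Python sketch are illustrative, chosen to mirror the example above.

    # Minimal sketch: find the month at which parameter uncertainty supports the pending decision.
    uncertainty_pct = [5.5, 5.1, 4.8, 4.4, 4.1, 3.8, 3.6, 3.4, 3.2, 3.0]   # months 1..10 (illustrative)
    threshold_pct = 3.0                                                    # decision threshold (illustrative)

    crossing = next((month for month, u in enumerate(uncertainty_pct, start=1) if u <= threshold_pct), None)
    if crossing is None:
        print("Uncertainty still above threshold; a decision made now carries extra rework risk.")
    else:
        print(f"Decision can be supported with acceptable risk from month {crossing}.")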


Scope Change Measures

[Measurement Guide Table: shading (not reproduced) indicates the applicable factors (Measurement Category, Phase, Development Strategy, Tools and Databases, Product Domain, Staff Capability) for which Scope Change measures are most useful.]

Measures:
• Scope change (number or volatility of requirements, # or %)
• Identify requirements (#) at each architectural level or for each subsystem
• Determine the number of requirements changed each month
• Normalize by the total number of requirements
• Monitor for unexpected changes
• Evaluate risk based on program phase

It is not uncommon to have some requirements changes during a project. Project managers need to be aware of additions or modifications to requirements that (a) affect contractual agreements or (b) change the required effort or resources necessary to meet project obligations (cost, schedule, people, laboratories).

Trend analyses are useful for tracking scope changes. Action thresholds for change may decrease over time as the design matures and the impact of requirements changes becomes greater. Prior to a system requirements review (SRR) the volatility is expected to be high, but it must settle down ahead of the SRR; if it has not, holding the SRR anyway (failing to move it) will incur technical debt and likely rework. Once the critical design review (CDR) takes place, most subsequent changes will increase project costs and lengthen schedules. 11

[Example chart, “Requirements Volatility: ABC Program”: monthly volatility percentage for new, deleted, and revised requirements, with a total and regression trend, plotted against a planned metric and with the SRR milestone marked.]

11 Figure from INCOSE “Systems Engineering Leading Indicators Guide v2.0, January 29,
2010, section 3.1, http://www.incose.org/ProductsPubs/pdf/SELI-Guide-Rev2-01292010-
Industry.pdf (accessed June 2014).
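Requirements volatility is straightforward to compute from the change history of a requirements management tool. The Python sketch below uses invented monthly change counts and an assumed baseline size; the point is the normalization and the expectation of a downward trend approaching SRR.

    # Minimal sketch: monthly requirements volatility normalized by the baseline requirement count.
    baseline_count = 450                                   # assumed size of the requirements baseline
    changes_by_month = {"Jan": {"new": 30, "deleted": 5, "revised": 40},
                        "Mar": {"new": 12, "deleted": 2, "revised": 18},
                        "May": {"new": 3, "deleted": 1, "revised": 6}}

    for month, c in changes_by_month.items():
        volatility = (c["new"] + c["deleted"] + c["revised"]) / baseline_count
        print(f"{month}: {volatility:.1%} volatility")     # expect the trend to settle ahead of SRR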


Technology Readiness Level or Technical Maturity Measures

[Measurement Guide Table: shading (not reproduced) indicates the applicable factors (Measurement Category, Phase, Development Strategy, Tools and Databases, Product Domain, Staff Capability) for which Technology Readiness Level measures are most useful.]

Measures:
• Technology Readiness Level (TRL 12, or technical maturity) of solution elements
• Evaluate individual solution elements
• Identify program risks based on technical maturity or TRLs
• Monitor for change
• Note: TRL 9 is fully mature (demonstrated in actual operation)

Technical maturity (or technology readiness) level identifies the technical debt inherent in the elements of the solution based on their development status (e.g., in production, prototype, variation on a product family). Most projects require at least TRL 6 (prototype) before incorporating an item in a development project.

A quick way to evaluate the state of the program is to create a histogram showing how many items are in a given maturity category so that appropriate management oversight can be provided to manage the technical risk. In the example, management attention should be focused on the elements with TRL < 7 and on developing contingency plans in case any element does not achieve full maturity according to a development plan.

[Example chart: histogram of the number of subsystems or elements at each technical maturity level (TRL 1 through TRL 9), with the minimum required maturity marked.]

12 Defense Acquisition University, “Systems Engineering Fundamental” (2001, accessed 17


February 2015). http://www.everyspec.com/DoD/DOD-DAU-DSMC/DAU_SYSTEMS_ENG_
FUNDAMENTALS_2001_22394/
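The TRL histogram and watch list are easy to generate from a simple inventory of solution elements. In this Python sketch the element names and TRL values are invented, and TRL 6 is used as the minimum maturity, consistent with the guidance above.

    # Minimal sketch: histogram of solution-element TRLs and a watch list below the minimum maturity.
    from collections import Counter

    element_trl = {"Antenna": 9, "Power supply": 8, "Signal processor": 6,
                   "New waveform ASIC": 4, "Thermal subsystem": 7}          # invented inventory

    histogram = Counter(element_trl.values())
    print(sorted(histogram.items()))                       # number of elements at each TRL

    min_trl = 6                                            # minimum maturity before incorporation
    watch_list = [name for name, trl in element_trl.items() if trl < min_trl]
    print(f"Elements needing management attention and contingency plans: {watch_list}")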


Solution Satisfies Requirement Measures

Project Considerations and Applicable Factors:
  Measurement Category:  Size | Complexity | Stability | Technical Quality | Schedule
  Phase:                 Conceive and Define | Architect and Design | Implement and Integrate | Verify | Validate | Operate and Support
  Development Strategy:  Waterfall | Agile/Spiral | Increments | Acquirer-Funded | Supplier-Funded
  Tools and Databases:   Manual or Spreadsheet | Requirements Management | Static Model-Based SE | Simulation-Based SE
  Product Domain:        Software-Intensive | Hardware-Intensive | Complex | Regulatory Environment | Commercial | Government
  Staff Capability:      Primarily Novice | Primarily Intermediate | Experienced | Experienced, new domain

Measure: Solution satisfies requirements (% compliant) 13

The key technical progress measure for development programs is an evaluation of
the degree to which the design is satisfying the requirements. Any non-compliance
is an issue that must be corrected and indicates a need for rework. Unknown
compliance is a risk of future discovery of non-compliance and is therefore a form
of technical debt based on uncertainty.

This measure can be represented as a time-dependent bar chart showing progress
of technical compliance until all requirements are verified.

• At each technical review, evaluate the ability of the conceptual/prototype
  solution to satisfy each requirement. Once verification begins, assess the
  solution as "verified" based on the verification results.
• Calculate the percentage of requirements satisfied by the solution.
[Figure: stacked bar chart of the percentage of requirements that are verified,
assessed as compliant, of unknown compliance, or not compliant by design, with the
total requirements count shown on a secondary axis, at each review: SRR, SFR, PDR,
CDR, TRR, FCA, PCA.]

SRR: system requirements review; SFR: system functional review; PDR: preliminary
design review; CDR: critical design review; TRR: test readiness review; FCA:
functional configuration audit; PCA: physical configuration audit.

13 Carson, Ronald S. and Bojan Zlicaric, "Using Performance-Based Earned Value to Measure Systems Engineering Effectiveness," Proceedings of INCOSE 2008 (Utrecht, Netherlands).
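As a sketch of how the compliance percentages behind such a chart might be computed, assuming each requirement record carries a compliance status whose labels mirror the chart legend (the requirement IDs and statuses below are illustrative):

    from collections import Counter

    # Hypothetical export from a requirements database: requirement id -> status.
    statuses = {
        "REQ-001": "verified", "REQ-002": "compliant", "REQ-003": "unknown",
        "REQ-004": "non-compliant", "REQ-005": "compliant", "REQ-006": "verified",
    }

    counts = Counter(statuses.values())
    total = len(statuses)
    for status in ("verified", "compliant", "unknown", "non-compliant"):
        print(f"{status:>13}: {100.0 * counts.get(status, 0) / total:5.1f}%")

    # "unknown" represents unretired technical debt; "non-compliant" is an open issue.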


Technical Performance Measures

Project Considerations and Applicable Factors:
  Measurement Category:  Size | Complexity | Stability | Technical Quality | Schedule
  Phase:                 Conceive and Define | Architect and Design | Implement and Integrate | Verify | Validate | Operate and Support
  Development Strategy:  Waterfall | Agile/Spiral | Increments | Acquirer-Funded | Supplier-Funded
  Tools and Databases:   Manual or Spreadsheet | Requirements Management | Static Model-Based SE | Simulation-Based SE
  Product Domain:        Software-Intensive | Hardware-Intensive | Complex | Regulatory Environment | Commercial | Government
  Staff Capability:      Primarily Novice | Primarily Intermediate | Experienced | Experienced, new domain

Technical performance measures can be applied for selected technical parameters
to ensure adequate progress is being achieved. Time-based plots of estimated
or demonstrated performance are compared with required values (minimum or
maximum) to help manage the risk. This is a quantitative form of a risk mitigation
plan. A plan line with decision bounds should be established early in the program,
with required progress toward achieving the threshold value (e.g., "not to exceed").
Failure to achieve the required progress converts the risk to an issue and may
require a design change to ensure technical compliance.

• Track the progress of selected technical parameters compared with required
  values to ensure adequate progress is being achieved. 14

[Figure 5-18, TPM Monitoring (©CSM, used with permission), reproduced from the
INCOSE Systems Engineering Handbook v3.2.2 (INCOSE-TP-2003-002-03.2.2, October
2011): a planned value profile for a specified "not to exceed" parameter, showing
achievement to date, demonstrated values, the current estimate, demonstrated and
predicted variances, and the action needed to bring the parameter back into
specification, across the estimated, allocated, calculated, and measured values
from proposal through gates, test, and delivery.]

14 INCOSE Systems Engineering Handbook v3.2.2, http://www.incose.org/ProductsPubs/products/sehandbook.aspx (accessed June 2014), and INCOSE Technical Measurement Guide (2005), http://www.incose.org/ProductsPubs/products/techmeasurementguide.aspx (accessed June 2014).
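A minimal sketch of the comparison a TPM chart automates, assuming a "not to exceed" parameter with a planned monthly profile (all values below are illustrative):

    # Sketch: compare a TPM's current estimate with its planned profile and
    # "not to exceed" threshold at each reporting point (values are illustrative).

    NOT_TO_EXCEED = 100.0                            # required maximum value
    plan = {1: 95.0, 2: 92.0, 3: 90.0, 4: 88.0}      # planned profile by month
    estimates = {1: 96.0, 2: 94.5, 3: 93.0}          # current estimates to date

    for month, planned in sorted(plan.items()):
        actual = estimates.get(month)
        if actual is None:
            continue
        status = "on plan" if actual <= planned else "variance - action needed"
        margin = NOT_TO_EXCEED - actual
        print(f"Month {month}: estimate {actual} vs plan {planned} "
              f"({status}), margin to threshold {margin:.1f}")

Missing the planned profile in consecutive months is the signal that the risk is converting into an issue and that a design change may be needed.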


Counts and Stability of Elements of the System Measures

Project Considerations and Applicable Factors:
  Measurement Category:  Size | Complexity | Stability | Technical Quality | Schedule
  Phase:                 Conceive and Define | Architect and Design | Implement and Integrate | Verify | Validate | Operate and Support
  Development Strategy:  Waterfall | Agile/Spiral | Increments | Acquirer-Funded | Supplier-Funded
  Tools and Databases:   Manual or Spreadsheet | Requirements Management | Static Model-Based SE | Simulation-Based SE
  Product Domain:        Software-Intensive | Hardware-Intensive | Complex | Regulatory Environment | Commercial | Government
  Staff Capability:      Primarily Novice | Primarily Intermediate | Experienced | Experienced, new domain
Database tools enable managers to more easily count elements of the solution,
whether requirements, interfaces, or solution elements (subsystems, boxes, wires,
etc.). While the absolute numbers may not be critical, sudden growth can indicate
scope change or increased complexity and development risk.

Visibility of these changes is provided by simple charts of counts vs. time.
Project managers should monitor these measures for unexpected changes during
periods when the design should be stable. For example, "External Systems" should
be stable at the System Requirements Review, and "Elements" and "Interfaces"
should be stable at the Preliminary Design Review. In the graph none of the three
conditions is satisfied, so the project manager should investigate root causes and
take corrective action to avoid additional technical debt from the changing
design. Increasing complexity, based on increasing element and interface counts,
may also lead to more risk during the integration phase after the critical design
review.

Measure: Counts and stability of elements of the system, 15 for example:
• Number of external systems and stakeholder or program interfaces (#)
• Number of internal interfaces (#)
• Number of solution elements (#)
• Count the item of interest (#)
• Monitor for changes (%)

[Figure: counts of elements, interfaces, and external systems plotted against time
(months), with SRR and PDR marked; all three counts continue to grow after the
reviews at which they should have stabilized.]

15 Carson, Ronald and Paul Kohl, “New Opportunities for Architecture Measurement”,
Proceedings of INCOSE 2013 (Philadelphia, PA).
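As a sketch of the growth check a project manager might apply to these counts, assuming monthly counts can be exported from the architecture or requirements database (the counts and the 10% watch threshold below are illustrative):

    # Sketch: monthly counts of solution elements, interfaces, and external systems,
    # with a simple month-over-month growth check (values are illustrative).

    counts_by_month = {
        "elements":         [20, 21, 21, 25, 30, 33],
        "interfaces":       [28, 29, 30, 36, 40, 44],
        "external systems": [ 5,  5,  6,  7,  7,  8],
    }

    GROWTH_WATCH = 0.10   # flag growth of more than 10% in a single month

    for name, series in counts_by_month.items():
        for month in range(1, len(series)):
            prev, curr = series[month - 1], series[month]
            growth = (curr - prev) / prev if prev else 0.0
            if growth > GROWTH_WATCH:
                print(f"Month {month + 1}: {name} grew {growth:.0%} "
                      f"({prev} -> {curr}) - check for scope change")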


Reliability, Maintainability, and Availability Measures

Project Considerations and Applicable Factors:
  Measurement Category:  Size | Complexity | Stability | Technical Quality | Schedule
  Phase:                 Conceive and Define | Architect and Design | Implement and Integrate | Verify | Validate | Operate and Support
  Development Strategy:  Waterfall | Agile/Spiral | Increments | Acquirer-Funded | Supplier-Funded
  Tools and Databases:   Manual or Spreadsheet | Requirements Management | Static Model-Based SE | Simulation-Based SE
  Product Domain:        Software-Intensive | Hardware-Intensive | Complex | Regulatory Environment | Commercial | Government
  Staff Capability:      Primarily Novice | Primarily Intermediate | Experienced | Experienced, new domain

Once development is nearly complete, the project can begin to accumulate data on
operational performance for reliability and system availability. The Verification
phase provides a "first look" at these system performance measures, which have
significant consequences during the operations and support phase.

A time-dependent line chart can be used to compare current performance vs. the
operational need or requirement. The need for design or other changes can become
apparent if deficiencies are other than initial "growing pains." In the example
below the implemented design is failing to meet its reliability requirement even
as the system moves into operation, and root cause investigation may be required
to identify and correct the deficiency.

• Compare measured reliability and availability vs. predicted values (%)
• Compare measured mean repair time vs. predicted values (minutes)

[Figure: reliability (probability) plotted against time (months) through the
Verification and Operation phases; the actual reliability remains below the
reliability requirement line.]
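A minimal sketch of the measured-vs.-required comparison, assuming an exponential failure model and monthly operating-hour and failure counts (all values are illustrative; a real program would use its agreed reliability models and confidence bounds):

    import math

    # Sketch: monthly reliability point estimate from operating hours and failures,
    # compared with the requirement (exponential model assumed; data illustrative).

    REQUIRED_RELIABILITY = 0.95   # probability of success over the mission time
    MISSION_HOURS = 100.0

    monthly_data = [  # (month, cumulative operating hours, failures observed)
        (1, 400.0, 1), (2, 900.0, 2), (3, 1500.0, 2), (4, 2200.0, 2),
    ]

    for month, hours, failures in monthly_data:
        mtbf = hours / failures if failures else float("inf")
        reliability = math.exp(-MISSION_HOURS / mtbf)
        status = "meets" if reliability >= REQUIRED_RELIABILITY else "BELOW"
        print(f"Month {month}: MTBF {mtbf:7.1f} h, R(mission) {reliability:.3f} "
              f"({status} requirement {REQUIRED_RELIABILITY})")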


Defect Containment Measures

Project Considerations and Applicable Factors:
  Measurement Category:  Size | Complexity | Stability | Technical Quality | Schedule
  Phase:                 Conceive and Define | Architect and Design | Implement and Integrate | Verify | Validate | Operate and Support
  Development Strategy:  Waterfall | Agile/Spiral | Increments | Acquirer-Funded | Supplier-Funded
  Tools and Databases:   Manual or Spreadsheet | Requirements Management | Static Model-Based SE | Simulation-Based SE
  Product Domain:        Software-Intensive | Hardware-Intensive | Complex | Regulatory Environment | Commercial | Government

Technical debt in the form of rework accumulates when errors in technical data are
not identified and corrected before the data is used by other groups (e.g.,
requirements used for design and verification, designs used for build and
verification, trade-off analyses used for design). The longer the delay in
discovering the error, the larger the cost of the rework.

Histograms of defect containment are a valid way to display this information
(defects introduced by phase vs. the phase in which they are discovered and
corrected).

Measure: Defect containment 16
• Errors introduced in one phase and identified and corrected in another phase
  (# or %) 17

[Figure: histogram of the number of defects by the phase in which they were found
(Pre-Design, Pre-Build, Pre-Verify, Post-Verify), broken out by the phase that
introduced them (Requirements, Design, Build, Verify).]

This measure can be used within a project for additional spirals, increments, or agile
scrums so that more rigor is applied in finding defects prior to propagation. The
measure is also useful for organizational and system process improvement so that
error propagation can be reduced on successive projects.

16 Michael Phillips, "V&V Principles," V&V Summit 2010, http://www.faa.gov/about/office_org/headquarters_offices/ang/offices/tc/library/v&vsummit2010/Briefings/V%20and%20V%20Principles%20-%20M%20Phillips.pptx (accessed June 2014).
17 INCOSE, Systems Engineering Leading Indicators Guide, v2.0, 29 January 2010, section 3.15, http://www.incose.org/ProductsPubs/pdf/SELI-Guide-Rev2-01292010-Industry.pdf (accessed June 2014).
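As a sketch of the underlying data reduction, assuming each defect record captures the phase that introduced it and the phase in which it was found (the defect records below are illustrative):

    from collections import defaultdict

    # Sketch: defect containment matrix - phase where a defect was introduced vs.
    # phase where it was found (defect records are illustrative).

    PHASES = ["Requirements", "Design", "Build", "Verify"]

    defects = [  # (phase introduced, phase found)
        ("Requirements", "Requirements"), ("Requirements", "Design"),
        ("Requirements", "Verify"), ("Design", "Design"),
        ("Design", "Build"), ("Build", "Verify"), ("Design", "Verify"),
    ]

    matrix = defaultdict(int)
    for introduced, found in defects:
        matrix[(introduced, found)] += 1

    contained = sum(1 for intro, found in defects if intro == found)
    print(f"Containment: {100.0 * contained / len(defects):.0f}% "
          "of defects were found in the phase that introduced them")
    for introduced in PHASES:
        row = [matrix[(introduced, found)] for found in PHASES]
        print(f"{introduced:>12}: {row}")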


6 A PROJECT MEASUREMENT CASE STUDY

In This Chapter
• Project Case Study with example use of technical measures

Cosmo Development Project – A Case Study


Week 1
The phone rings – it’s George’s manager, Lee, with good news. George has been selected to be the
program manager for the new Cosmo development project, just won. This project consists of redesign and
repackaging of the standard “widget” into a 50% smaller package. George opens his email and reviews the
Statement of Work (SOW). It’s a short project – estimated to take six months. George will need hardware
(HW) engineering for the main effort and some software (SW) engineering to possibly update the drivers if
new interfaces are required for the new components. George contacts the engineering organization and
requests a systems engineer (SE) to function as the technical lead and to take care of systems engineering
duties, along with the needed hardware and software engineers.

After carefully reviewing the SOW, the team came together to lay out the Integrated Master Schedule (IMS).
As usual, the initial schedule came out at nine months instead of the needed six months. George and Sara
(the SE lead) worked with the other engineers to fit the IMS into a required timeframe that was acceptable to all.
George, who had previously been burned by technical debt, asked the team to step back from the IMS
to see if all needed tasks were accounted for. Sara noted that the deliverable documentation frequently
needed to be updated after the initial, formal delivery before final approval was received. George added
rework and resubmission tasks to the IMS, taking care not to link the added tasks to non-dependent tasks
in a way that would push out the schedule. Judy, from HW, noted that the required peer reviews of drawings
also didn't allow time for rework and verification. While acknowledging that it is a little more difficult to add
tasks to the IMS without pushing the date out, the team agreed that planning for the rework was better than
simply assuming the improved widget would be built right the first time. Now comfortable with the IMS, the
team baselined the schedule.

Due to the tight schedule, Sara began working the requirements from the SOW almost immediately. The
requirements were not extensive, but the team elected to make use of their requirements management tool
to manage and track requirements. After a check of the requirements’ goodness (Correct, Complete, Clear,
Consistent, Verifiable, Traceable, Feasible, Modular and Design Independent 18), the requirements were
added to the tool and baselined.

While the requirements were being worked, George completed his plan to manage the program. He
created the appropriate cost accounts and budgets for each of the expected tasks. In addition, George
created a Risk plan and spent time with the team identifying risks and developing mitigation plans. During
this activity George noted that they would need to make a final technical decision on size based on some

18 INCOSE Guide for Writing Requirements, INCOSE-TP-2010-006-01, 17 April 2012 (accessed 17 February 2015).


power dissipation analyses that had only just begun. He identified a date by which the power dissipation
had to be known, to within +/-5%, and added that into the IMS along with a tracking chart.

George was expected to report program status to management monthly. George had freedom to
tailor the standard program measures and elected to use Schedule Performance Index (SPI) and Cost
Performance Index (CPI) measures, along with the Risk matrix. These would provide insight into the overall
program health but George wanted visibility into the technical health as well.

Working with Sara and Andy (HW Lead), the team discussed which critical few areas needed to be
measured to be successful. Schedule came up first; it was one of the driving requirements. George pointed
out that SPI would take care of that. Sara countered that the schedule was too short to depend on a
monthly SPI measure: there wouldn't be enough time to take corrective action by the time an issue was
noticed. Sara proposed an "Inchstones" IMS measure to track late starts and stops on a weekly basis.
It could be generated automatically from the IMS and dropped into a spreadsheet with minimal effort,
allowing for weekly insight into schedule performance. Sara explained that it would be easier to recover
a day than to recover a week. The measure was new to George, but meeting the commitment date was
critical to the project's success, so he accepted the measure.
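One way the weekly counts could be generated is sketched below, assuming the IMS can export each task's planned and actual start and finish dates (the tasks, dates, and field layout are illustrative assumptions, not the project's actual export format):

    from datetime import date

    # Sketch: weekly "inchstone" late starts/stops from an IMS export.
    # Task tuples and the status date are illustrative assumptions.

    tasks = [  # (task, planned start, actual start, planned finish, actual finish)
        ("Layout drawing",   date(2015, 4, 3), date(2015, 4, 3),
                             date(2015, 4, 17), None),
        ("Driver update",    date(2015, 4, 3), None,
                             date(2015, 4, 24), None),
        ("Thermal analysis", date(2015, 4, 10), date(2015, 4, 13),
                             date(2015, 4, 24), None),
    ]

    status_date = date(2015, 4, 10)

    late_starts = sum(1 for _, ps, as_, pf, af in tasks
                      if ps <= status_date and (as_ is None or as_ > ps))
    late_stops = sum(1 for _, ps, as_, pf, af in tasks
                     if pf <= status_date and (af is None or af > pf))
    print(f"Week ending {status_date}: {late_starts} late starts, {late_stops} late stops")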

Due to the main objective being a 50% reduction in size, size was added to the technical performance
measures (TPM) for reporting on a monthly basis.

Andy pointed out that due to the tight schedule and long lead times for HW components, requirements
volatility was critical to manage as well. Without further discussion, the team agreed to include this
measure as well on a monthly basis.

Sara, aware that schedule pressure tends to shorten the time allowed for verification of the design, nominated
one more measure to pick up later in the life cycle: Requirements Verification percentage. George and the
team accepted the addition of a Requirements Verification measure on a weekly basis, and decided that
these would be their tracked technical measures – the critical few that would provide the technical insight
needed to be successful.

With the project planning complete, and with programmatic and project measures selected, the project
team began to execute their plan.

The first step for the measurement effort was creating baseline charts for Program Risks, Power Dissipation
Uncertainty, Requirements Volatility, and Size TPM.


Program Risk Assessment

[Figure: probability vs. impact risk matrix with low/medium/high risk regions;
Risk #1 is plotted at probability D, Risk #3 at probability C, and Risk #2 at
probability B.]

Risk #1:  Long lead times for components will delay integration and test
• Due to the tight schedule, long lead parts could drive schedule and cost
• Mitigate by allowing component selection if commercially available

Risk #2:  New software drivers will be required for new components
• Schedule and budget only allow for components with compatible drivers
• Mitigate through software review and approval before any new components can be selected

Risk #3:  Reduced-size widget will not meet power dissipation threshold
• Reducing the size of the widget package will reduce its power dissipation capability
• Mitigate through monitoring the parameter uncertainty measure

Figure 6-1.  Program Risk Summary. Initial program risks were identified.

Power Dissipation Uncertainty

[Figure: power dissipation parameter uncertainty (%) plotted against time after
project start (months), with the decision threshold and the decision point marked.]

Figure 6-2.  Decision Uncertainty Measurement. Uncertainty needs to be reduced below 3% in 3 months.
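A minimal sketch of the check behind this chart: project the current rate of uncertainty reduction forward and compare it with the decision threshold at the decision date (the monthly readings below are illustrative assumptions):

    # Sketch: is parameter uncertainty on track to reach the decision threshold in time?
    # Threshold and deadline follow the case study; the readings are illustrative.

    DECISION_THRESHOLD = 3.0   # percent uncertainty required to make the decision
    DECISION_MONTH = 3

    readings = {0: 5.0, 1: 4.5}   # month -> current uncertainty (%)

    months = sorted(readings)
    latest = months[-1]
    rate = ((readings[months[0]] - readings[latest]) / (latest - months[0])
            if latest > months[0] else 0.0)
    projected = readings[latest] - rate * (DECISION_MONTH - latest)

    if projected <= DECISION_THRESHOLD:
        print(f"On track: projected uncertainty {projected:.1f}% by month {DECISION_MONTH}")
    else:
        print(f"At risk: projected uncertainty {projected:.1f}% exceeds "
              f"{DECISION_THRESHOLD}% at the decision date")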


Requirements Volatility

[Figure: planned requirements-change percentage by month (Baseline through Aug-15),
with PDR and CDR marked.]

Figure 6-3.  Requirements Volatility Plan.

Widget Size TPM

[Figure: actual widget size (cm) plotted against the plan line and upper threshold
over seven months.]

Figure 6-4.  Size TPM. The plan is to achieve the threshold requirement in 3 months.


Week 2
After the first week, Sara prepared the weekly technical measures charts – Inchstones and Late Starts and
Stops.
Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week, 3/27/2015
through 9/25/2015.]
Figure 6-5.  Week 2 stop/start inchstone status indicates the baseline plan.

The first thing George noticed was that the project was already behind on schedule starts. Checking
further, George learned that Andy was only available 25% of the time and the electrical engineer was still
not available. This prompted George to take corrective action: he visited engineering and got a commitment
to provide the needed resources immediately.

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week; the start variance is
4 starts behind plan.]

Figure 6-6.  Week 2 stop/start cumulative variances indicate 4 late starts.


Week 3
The measures showed no further erosion of late starts now that the HW resources had been provided, but
the late stops increased by one due to the prior late starts.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]
Figure 6-7.  Week 3 stop/start inchstone status indicates deficient starts.

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week; 4 starts and 2 stops are
behind plan.]

Figure 6-8.  Week 3 stop/start cumulative variances indicate four late starts and two late completions.


Week 4
The measures showed no further erosion of late starts but the project was still behind schedule. This
prompted George to request the HW team to either put in additional effort or to use an alternative resource
to get back on schedule. Andy indicated that his tasks were going well and he should be able to catch up in
the next week.
Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]
Figure 6-9.  Week 4 stop/start inchstone status indicates deficient starts and completions.

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week; 4 starts and 2 stops remain
behind plan.]

Figure 6-10.  Week 4 stop/start cumulative variances indicate four late starts and two late completions.


Week 5
Andy was able to pull in all but one of the tasks so the late starts showed improvements. The late stops
continued to increase due to the previous late starts.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]
Figure 6-11.  Week 5 stop/start inchstone status indicates deficient starts and completions.

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week; the start variance is
improving.]

Figure 6-12.  Week 5 stop/start cumulative status indicates improved starts.


First Monthly Review


(Note:  Monthly reviews frequently don’t occur until after the month closes and the project managers have
had time to obtain financial data and prepare presentations. Corrective actions may be delayed up to six
weeks!)
Cumulative SPI & CPI

With our first look at SPI and CPI, the chart shows that we are behind on schedule
and we are under-spending. The SPI of 0.97 is a concern but does not turn the
program red. [Note: SPI ≥ 1 indicates on or ahead of schedule; CPI ≥ 1 indicates
spending at or below the value of the work performed.]

[Figure: cumulative SPI and CPI by month, April through October.]

Figure 6-13.  Cumulative SPI & CPI indicates more resources are needed
because the project is behind schedule and under budget.
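For readers less familiar with earned value, a minimal sketch of the standard SPI and CPI calculations follows (the dollar values are illustrative and chosen to reproduce an SPI of about 0.97):

    # Sketch: cumulative SPI and CPI from earned value data (values illustrative).
    # SPI = earned value / planned value; CPI = earned value / actual cost.

    planned_value = 120_000.0   # budgeted cost of work scheduled to date
    earned_value  = 116_000.0   # budgeted cost of work actually performed
    actual_cost   = 110_000.0   # actual cost of the work performed

    spi = earned_value / planned_value
    cpi = earned_value / actual_cost
    print(f"SPI = {spi:.2f}  (<1.0: behind schedule)")
    print(f"CPI = {cpi:.2f}  (>1.0: spending less than the value earned)")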

Program Risk Assessment

Risk #1 decreased in probability due to less uncertainty with component lead times.

[Figure: probability vs. impact risk matrix with Risk #1 lowered to probability C
(alongside Risk #3) and Risk #2 at probability B.]

Risk #1:  Long lead times for components will delay integration and test
• Reduced due to progress identifying parts and verifying lead times

Figure 6-14.  Program Risk Status indicates improvement in Risk #1.


Hardware engineering was making progress on meeting the power dissipation need, but the chart prompted a
discussion with engineering about the minimal progress to date within the short schedule.
Power Dissipation Uncertainty

[Figure: power dissipation parameter uncertainty (%) vs. time after project start
(months), still above the decision threshold at month 1.]

Figure 6-15.  Decision Uncertainty Measurement. Uncertainty needs to be reduced below 3% in three months.

Requirements Volatility

The requirements volatility was within plan as the initial requirements efforts
were completed.

[Figure: requirements-change percentage vs. the planned level by month, with PDR
and CDR marked.]

Figure 6-16.  Requirements Volatility Status indicates volatility below allowable levels (good).


The projected Widget size was making progress towards goal but seemed to be lagging due to late
hardware resource assignments.
Widget Size TPM

[Figure: actual widget size of 3.9 cm plotted against the plan line and upper
threshold.]

Figure 6-17.  Size TPM indicates progress to plan.

Requirements Compliance

[Figure: stacked bar chart of the percentage of requirements assessed as compliant,
of unknown compliance, or not compliant by design, by month (Mar through Sep).]

Figure 6-18.  Requirements compliance status indicates 20% of requirements have not yet
been assessed, which increases risk of future noncompliance (a form of technical debt).

With George’s weekly insight and management of the project from a technical perspective, the project is
green across the board for program management measures. George took a deep breath and knew he had
successfully dodged a bullet. Maybe there was something to these technical SE measures?


Week 8
Technical measures have been looking good week to week with the entire team reviewing measures and
agreeing to corrective actions. Schedule starts have even gotten ahead of plan this week.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-19.  Week 8 stop/start inchstone status indicates recovery of starts and completions.


Second Monthly Review


Both SPI and CPI showed movement towards an index of 1.0, indicating that project performance was
returning to plan.
Cumulative SPI & CPI

[Figure: cumulative SPI and CPI by month, April through October, both moving
toward 1.0.]

Figure 6-20.  Cumulative SPI & CPI indicates some recovery of schedule
while still under budget.

The power dissipation risk was reduced in probability as lower-power components were identified. This is
also reflected in the progress shown on the Power Dissipation Uncertainty chart.

Program Risk Assessment

[Figure: probability vs. impact risk matrix with Risk #1 at probability C and
Risks #2 and #3 at probability B.]

Risk #3:  Reduced-size widget will not meet power dissipation threshold
• Identified parts with lower power consumption that reduced the dissipation need

Figure 6-21.  Program Risk Status indicates improvement in Risk #3.


Power Dissipation Uncertainty

[Figure: power dissipation parameter uncertainty (%) vs. time after project start
(months), approaching the decision threshold.]

Figure 6-22.  Decision Uncertainty Measurement is on plan to achieve the decision point below 3% in three months.

Requirements Volatility is on plan so there is no concern there.

Requirements Volatility

[Figure: requirements-change percentage vs. the planned level by month, with PDR
and CDR marked.]

Figure 6-23.  Requirements Volatility Status indicates volatility below allowable levels (good).


The technical concern this month is that Size TPM is behind the projected glide path.

Widget Size TPM

[Figure: actual widget size of 3.6 cm, behind the plan line toward the upper
threshold.]

Figure 6-24.  Size TPM indicates progress is off-plan.

Requirements Compliance

After discussions with Andy, it becomes clear that they have a problem part for
which an acceptable reduced-size part hasn't been located. Sara is assigned a task
to pull together a tiger team of subject matter experts to resolve the situation.
This monthly measure will now be reported weekly until the issue is resolved.
George is aware that this is unplanned-for effort, and provides additional budget
from management reserve for the task.

[Figure: stacked bar chart of the percentage of requirements assessed as compliant,
of unknown compliance, or not compliant by design, by month (Mar through Sep).]

Figure 6-25.  Requirements compliance status indicates fewer requirements are
identified as non-assessed, and that some requirements are not compliant and need
to be corrected (these are issues).

After George again completes the monthly program review, with green measures across the board,
George's manager, Lee, follows up with finance to confirm status. Lee knew that this was normally the part
of the life cycle where problems start to show up on programs (technical debt). Finance independently
confirms that the program is on track.



Week Ten

No concerns this week – Starts and Stops stay at or ahead of plan.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-26.  Week Ten stop/start inchstone status indicates starts and completions are on or ahead of plan.

Week 12
All weekly measures continue to be collected and reviewed. The Size TPM measure has also moved to
weekly reporting so the team can review the progress of the tiger team.
Widget Size TPM

[Figure: actual widget size of 3.4 cm, plotted weekly (6/1/15 through 6/29/15)
against the plan line and upper threshold.]

Figure 6-27.  Size TPM indicates the progress is still off-plan.

Week 13
The focused effort of the tiger team is beginning to show significant progress toward meeting the size
requirement this week.
Widget Size TPM

[Figure: actual widget size of 2.7 cm, plotted weekly against the plan line and
upper threshold.]

Figure 6-28.  Size TPM indicates recovery toward the threshold value.


Third Monthly Review


SPI reflects that the project is ahead of schedule, and CPI shows that the work is being accomplished under budget.
Cumulative SPI & CPI

[Figure: cumulative SPI and CPI by month, March through September.]

Figure 6-29.  Cumulative SPI & CPI indicates recovery of schedule while still under budget.

There are no changes in risks this month.


Program Risk Assessment

[Figure: probability vs. impact risk matrix, unchanged: Risks #1 and #3 at
probability C and Risk #2 at probability B.]
Figure 6-30.  Program Risk Status indicates no status change.

Power Dissipation Uncertainty has been reduced to a level required for making decisions.

Power Dissipation Uncertainty

[Figure: power dissipation parameter uncertainty (%) reduced below the decision
threshold by month 3.]

Figure 6-31.  Decision Uncertainty Measurement indicates that uncertainty has been reduced to a level at which the technical decision can be made; the technical debt has been retired. This technical measure no longer needs to be tracked.


Requirements Volatility is on plan so there is no concern there. Requirements compliance continues to


make progress but a watch is placed on the non-compliant requirements remaining.
Requirements Volatility

[Figure: requirements-change percentage vs. the planned level by month, with PDR
and CDR marked.]

Figure 6-32.  Requirements Volatility Status indicates volatility below allowable levels (good).

Requirements Compliance

[Figure: stacked bar chart of the percentage of requirements assessed as compliant,
of unknown compliance, or not compliant by design, by month (Mar through Sep).]
Figure 6-33.  Requirements compliance status indicates fewer requirements have yet to be
assessed and fewer requirements are non-compliant.


Week 14

Technical measures continue to be positive. Starts and Stops measures continue to
show the project is ahead of plan. Through the efforts of the tiger team, the
Widget Size TPM has met the widget size requirement.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-34.  Week 14 stop/start inchstone status indicates starts and completions are ahead of plan.

Andy suggests that it is time to retire the TPM measure since component selection
has met the size requirement. The tiger team successfully identified an alternative
method to perform the needed functionality, allowing smaller parts to be used and
bringing the Size TPM within the 50% window. Given that the uncertainty in the TPM
estimate is less than the difference to the upper threshold value, the team agrees
to retire the TPM measure to avoid busy work and to keep the focus on the measures
potentially needing attention.

Widget Size TPM

[Figure: actual widget size of 2.4 cm, below the upper threshold.]

Figure 6-35.  Size TPM indicates satisfaction of the size requirement.
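The retirement decision is essentially a margin check; a minimal sketch follows, using the threshold and current estimate from the case study charts (the uncertainty value is an illustrative assumption):

    # Sketch: decision rule for retiring the Size TPM. Threshold and estimate follow
    # the case study charts; the uncertainty value is an illustrative assumption.

    upper_threshold_cm = 2.5        # 50% of the original widget size
    current_estimate_cm = 2.4
    estimate_uncertainty_cm = 0.05

    margin = upper_threshold_cm - current_estimate_cm
    if estimate_uncertainty_cm < margin:
        print(f"Retire TPM: margin {margin:.2f} cm exceeds uncertainty "
              f"{estimate_uncertainty_cm:.2f} cm")
    else:
        print("Keep tracking: uncertainty still overlaps the threshold")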


Week 18
George receives a call requesting that additional features be added to the project. George hesitates and
remembers that requirements volatility is a key technical driver for his program. George responds with:
“Are you willing to add duration to the project to get these features?” Of course the answer is no, so
George sticks to his guns and suggests that these changes be added to a future project. The customer,
understanding the impact, agrees. Weekly inchstone status continues to show good progress.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-36.  Week 18 stop/start inchstone status indicates starts and completions are ahead of plan.


Week 20

Now that the program has completed the design phase and the initial developmental
unit is being built, it is time to begin collecting the Requirements Verification
Measures. The Requirements Verification Measure will track progress against the
plan for verifying requirements, to ensure that the design fully meets the
requirements. (There was no value in starting to collect this measure prior to the
verification phase of the project.) To be successful, verification tests and
analyses must be completed successfully (the verification measure), not just
completed (the schedule measures). Unsuccessful tests indicate an issue that will
need to be resolved (which could mean redesign, or revising the tests or
requirements).

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-37.  Week 20 stop/start inchstone status indicates starts and completions are ahead of plan.

Requirements Verification

"Unverified Requirements" includes verification that has not been completed as
well as failures during verification. As needed, separate measures can be used to
track these individually.

[Figure: counts of unverified and verified requirements by week against the
verification plan, weeks 20 through 27.]

Figure 6-38.  Requirements Verification progress measures and plan.
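A minimal sketch of the weekly comparison behind this chart, assuming counts of successfully verified requirements and a planned verification profile (all counts below are illustrative):

    # Sketch: weekly requirements verification progress vs. plan (counts illustrative).
    # A requirement counts as verified only when its verification closed successfully.

    TOTAL_REQUIREMENTS = 150
    verification_plan = {20: 0, 21: 25, 22: 55, 23: 85, 24: 115, 25: 140, 26: 150}
    verified_to_date  = {20: 0, 21: 15, 22: 40, 23: 80}   # successfully verified

    for week, planned in sorted(verification_plan.items()):
        actual = verified_to_date.get(week)
        if actual is None:
            continue
        behind = planned - actual
        note = f"{behind} behind plan" if behind > 0 else "on or ahead of plan"
        print(f"Week {week}: verified {actual}/{TOTAL_REQUIREMENTS} ({note}); "
              f"unverified = {TOTAL_REQUIREMENTS - actual}")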


Week 21

Although the schedule inchstones continue to show work accomplished ahead of plan,
the Requirements Verification Measure is beginning to fall behind plan, possibly
indicating some failures during verification activities. In the technical
discussions, Sara and Andy try to assure George that there is nothing to worry
about. However, George has been doing some reading lately to learn more about how
to manage with measures and recalled Dr. W. Edwards Deming's words: "In God we
trust; all others must bring data." In other words, trust the data first! George
decides to take corrective action now to bring this project in on schedule.
George decides to hold daily standup meetings to review verification progress.
After the first week, it is clear to the team that there was a lot of wishful
thinking going on and that help was needed. An additional resource was obtained
from engineering, along with a management reserve allocation, to return
verification to plan.

Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-39.  Week 21 stop/start inchstone status indicates starts and completions are ahead of plan.

Requirements Verification

[Figure: counts of unverified and verified requirements by week against the
verification plan; verified requirements are below the plan line.]

Figure 6-40.  Requirements Verification progress measures indicate that successful verification is falling behind the plan.


Week 22
Schedule Measures continue to be reviewed weekly but only Requirements Verification is shown here for
analysis.
Requirements Verification

[Figure: counts of unverified and verified requirements by week against the
verification plan.]

Figure 6-41.  Requirements Verification progress measures indicate that verification is recovering.

Fifth Monthly Review


The fifth monthly program review continued to show green measures across the board. Final delivery was
just a few weeks away. Program Measures showed no sign of concern.

Cumulative SPI & CPI

[Figure: cumulative SPI and CPI by month, March through September.]

Figure 6-42.  Cumulative SPI & CPI indicates good schedule and cost status.


Program Risk Assessment

[Figure: probability vs. impact risk matrix; Risk #1 has been removed, and Risks
#2 and #3 are at probability B.]

Risk #1:  Long lead times for components will delay integration and test
• Retired: all components received

Risk #3:  Reduced-size widget will not meet power dissipation threshold
• Identified parts with lower power consumption that reduced the dissipation need
Figure 6-43.  Program Risk Status indicates further improvement in Risk #3 and that Risk #1 has been retired.

Requirements Volatility

[Figure: requirements-change percentage vs. the planned level by month, with PDR
and CDR marked.]

Figure 6-44.  Requirements Volatility Status indicates excessive requirements changes at CDR with recovery to an acceptable level.

Requirements Compliance

[Figure: stacked bar chart of the percentage of requirements assessed as compliant,
of unknown compliance, or not compliant by design, by month (Mar through Sep).]

Figure 6-45.  Requirements compliance status indicates fewer requirements that have not yet been assessed and no non-compliances.


Week 23
Schedule Measures continue to be reviewed weekly but only Requirements Verification is shown here
for analysis.
Requirements Verification

[Figure: counts of unverified and verified requirements by week against the
verification plan.]

Figure 6-46.  Requirements Verification progress measures indicate that verification is progressing but is slightly behind plan.

Week 24
The Requirements Verification Measure continues to make progress. Schedule continues to be ahead of plan.
Weekly Inchstones (Cum)

[Figure: cumulative planned and actual task starts and stops by week.]

Cumulative Late Starts/Stops

[Figure: cumulative start and stop variances by week.]

Figure 6-47.  Week 24 stop/start inchstone status indicates starts and completions are ahead of plan.

Requirements Verification

[Figure: counts of unverified and verified requirements by week against the
verification plan.]

Figure 6-48.  Requirements Verification progress measures indicate that verification is on plan.


Week 25
The Requirements Verification Measure progress continues to be monitored to ensure that the project is
successfully completed.

Requirements Verification

[Figure: counts of unverified and verified requirements by week against the
verification plan.]

Figure 6-49.  Requirements Verification progress measures indicate that verification is ahead of plan.

Week 27
The Requirements Verification Measure reaches 100%. All other technical measures are on plan as well.
Discussions turn to de-staffing the project now that effort is nearly complete.

Requirements Verification

[Figure: all requirements verified by week 27, matching the verification plan.]

Figure 6-50.  Requirements Verification progress measures indicate completion (all requirements have been successfully verified).


Final Monthly Review


The final monthly program review shows green program measures across the board. Final delivery is made
on time and within budget. Lee congratulated George on a job well done and asked about his availability
for a new high profile project.
Cumulative SPI & CPI

[Figure: cumulative SPI and CPI by month, March through September.]

Figure 6-51.  Cumulative SPI & CPI indicates schedule completion, under budget.

Program Risk Assessment

[Figure: probability vs. impact risk matrix with no remaining open risks.]

Risk #2:  New software drivers will be required for new components
• Retired: all software drivers implemented and verified

Risk #3:  Reduced-size widget will not meet power dissipation threshold
• Retired: power dissipation threshold met

Figure 6-52.  Program Risk Status indicates that all risks have been retired.


Final Monthly Review  (continued)

Requirements Volatility

[Figure: requirements-change percentage vs. the planned level by month, with PDR
and CDR marked; volatility is zero at project completion.]

Figure 6-53.  Requirements Volatility Status indicates volatility of zero at project completion, as expected.

Requirements Compliance

[Figure: stacked bar chart of the percentage of requirements assessed as compliant,
of unknown compliance, or not compliant by design, by month (Mar through Sep);
all requirements are compliant by September.]

Figure 6-54.  Requirements compliance status indicates 100% compliance as verification is completed.


Project Closeout
As part of project closeout, the team took some time to document lessons learned.

1. Tailoring the measures you collect to the critical few for your project avoids wasted effort while
providing the needed insights. Select measures that are easy to collect, at a high level if necessary,
and drill in only when there is a need.

2. Weekly schedule late starts and stops allowed early corrective action to maintain schedule.

3. Planning for all tasks helped prevent technical debt from getting out of control. “Hope is not a plan.”

4. Technical Performance Measures (TPM) provided focus on the technical requirements.

5. Uncertainty measures linked technical maturity to decisions and actions.

6. Trust the data first – take action based on data received.


7 ACRONYMS

ACRONYM WHAT IT MEANS

CAM Control Account Manager


CCB Change Control Board
CDR Critical Design Review
CDRL Contract Data Requirements List
CPI Cost Performance Index
EVMS Earned Value Management System
FAA Federal Aviation Administration
FDA Food and Drug Administration
HW Hardware
IMS Integrated Master Schedule
INCOSE International Council on Systems Engineering
IV&V Integration, Verification and Validation
MBSE Model Based Systems Engineering
MOE Measures of Effectiveness
MOP Measures of Performance
MWG Measurement Working Group
PDR Preliminary Design Review
PMI Project Management Institute
PMP Project Management Professional
PR Problem Report
SE Systems Engineer
SE Systems Engineering
SOW Statement of Work
SPI Schedule Performance Index
SW Software
TO Technical Operations
TPM Technical Performance Measures
TRL Technology Readiness Level
V&V Verification and Validation




INCOSE Publications Office


7670 Opportunity Road, Suite 220
San Diego, CA  92111-2222  USA
Copyright © 2015 International Council on Systems Engineering
