COST
ESTIMATING
HANDBOOK
Joint
Government/Industry
Initiative
Fall 1995
SPONSORED BY DOD
Parametric Cost Estimating Handbook
PREFACE
This Handbook is intended to be used as a general guide for implementing and evaluating
parametric-based estimating systems, and as the text for a basic course in parametric
estimating techniques. The information contained in this Handbook complements and enhances
the guidance provided by the DCAA Contract Audit Manual (CAM), the FARs/DFARs, and other
government regulations or public laws relating to cost estimating. The Handbook is structured so
that each chapter can be used as a stand-alone guide and reference; a reader interested in only
one subject can read that chapter without reading any other. Expected users of this
Handbook include cost model developers, analysts, auditors, and all levels of cost management.
Every phase of the government’s procurement process is being reviewed and changed to
reflect the realities of today’s cost conscious environment. Among the many aspects of the
procurement process under scrutiny is the use of traditional cost estimating tools and processes.
These tools and processes have at times proven expensive to maintain and operate, and did not
always provide realistic or timely results. Today’s Acquisition Reform mind-set dictates that the
cost estimating community consider a different set of tools and processes that are less labor-
intensive to operate and that produce credible cost estimates.
It is partly from the impetus of Acquisition Reform that this document has been created.
All of us in the cost estimating community need to find ways to perform our work more cost
effectively. Pushed by the common goal of reducing proposal preparation cost, contractor and
oversight management are pursuing a cooperative approach to problem solving. This Handbook
is an example of such a cooperative effort. If the Handbook helps produce better
estimates, analyses, or audits, then the process has been successful.
This is the first edition of the Handbook. It will be used and tested at pilot contractor
sites. The Initiative Steering Committee will be responsible for receiving and incorporating any
comments for improving the Handbook from the pilot sites and others, and publishing the second
edition of the Handbook. The second edition will be issued in the summer of 1996. A user reply
sheet is provided in this Preface to facilitate submission of your suggestions. An Order Form for
this Handbook is included at the end of the Preface.
The preparation of this Handbook included the review of nearly four thousand pages of
documentation. Some of that total is included here. The material has come from many sources,
both individual and organizational contributors. There is no way all of the contributors can be
acknowledged, since most of them are anonymous. There are, however, two who should be
mentioned at this time, besides those acknowledged later in the text. First, the Parametric
Estimating Initiative Steering Committee for authorizing this Handbook and, specifically,
NAVSEA for funding the Handbook’s development.
Thanks to the rest of the Committee for their inputs, reviews and critiques of the draft.
And, especially, thanks to the Space Systems Cost Analysis Group (SSCAG), which supported
the initial concept of the Handbook, and to the many individuals within that organization who
conceptualized, critiqued, and developed much of the material.
Bill Brundick,
General Editor
Executive Chairmen

Bob Scott
Executive Director of Contract Management
Defense Contract Management Command - AQC
Cameron Station
Alexandria, Virginia 22304-6190
703-274-0821 phone
703-274-0823 fax

Bob Spiker
Controller, Financial and Government Accounting
Westinghouse Electric Corporation - ESG
P.O. Box 17319, Mail Stop A585
Baltimore, Maryland 21203-7319
410-765-2913 phone
410-765-6038 fax

Michael Thibault
Deputy Director
DCAA Headquarters
Cameron Station
Alexandria, Virginia 22304-6178
703-274-7281 phone
703-617-7450 fax

Co-Chairmen

Jim Collins
Parametric Estimating Specialist
Westinghouse Electric Corporation D61-ESG
P.O. Box 1693, Mail Stop 1293
Baltimore, Maryland 21203
410-765-8033 phone
410-765-5289 fax

David Eck
Chief, Policy Formulation Division
DCAA Headquarters
Cameron Station, Room 4A250
Alexandria, Virginia 22304-6178
703-274-7314 phone
703-617-7452 fax

Members

Jim Balinskas
Contract Pricing & Finance Division
NASA Headquarters, Code HC
300 E Street SW
Washington, DC 20546
202-358-0445 phone
202-358-3082 fax

Dean Boyle
Deputy Technical Assessment Manager
DPRO - Northrop Grumman
Defense Logistics Agency, DCMC
Bethpage, New York 11714-3593
516-575-9742 phone
516-575-6527 fax
We would appreciate specific comments, such as suggested alternative language and the rationale
to support it, additional information that would add value to the Handbook, or existing
information that can be deleted. Specific examples that enhance the Handbook are welcome.
ORDER FORM
for the
Parametric Cost Estimating Handbook

YES, please send me _____ copies of the Parametric Cost Estimating Handbook.
There is no charge for the Handbook. Requestors are encouraged to limit requests
to one copy per organization.
Attention/Individual
Street Address
FOREWORD
The Committee is also sponsoring a pilot program to test the use of parametric tools. As
of September 1995, eleven companies are participating in the pilot program. The pilot program
is expected to last until the Summer of 1996. This Handbook will be tested at the pilot program
sites. The Committee will update the Handbook to incorporate comments, best practices, and
additional examples developed at the pilot sites. The Committee also invites your comments on
the Handbook and any examples you may have. A User Reply Sheet is provided in the Preface
to facilitate your input. The Committee expects to publish the second edition of the Handbook
by the Summer of 1996.
The Committee hopes, with your help, to make the Handbook the guide people turn to
when using and evaluating parametric tools.
Executive Chairmen
Joint Government/Industry
Parametric Cost Estimating Initiative
Steering Committee
TABLE OF CONTENTS

FOREWORD
PREFACE
CHAPTER 1
INTRODUCTION, BACKGROUND,
AND
DEFINITIONS
CHAPTER I
INTRODUCTION
the same, or superior results. Parametric estimating approaches fit very well into overall BPR
methods.
The importance of Business Process Reengineering was recently underscored by Lloyd K.
Mosemann, II, Deputy Assistant Secretary of the Air Force, in his closing Keynote Address
entitled “Predictability,” to the Software Technology Conference in Salt Lake City, Utah, on
Thursday, April 14, 1994. Although addressing the software process, Mr. Mosemann’s
comments are relevant to the cost estimating process in general. In summary, he said, in part:
“There seems to be an inability within the software community, in general, to predict how
much a software system will cost, when it will become operational, and whether or not it will
satisfy user requirements. We need to deliver on our promises.
“We have a poor track record regarding predictions. A 1979 GAO report concluded that only
two percent of software contracted for was useable exactly as delivered. Twenty different studies
have come to the same conclusion. Therefore, we in the DoD are focusing our attention on
process improvements. These include specific metrics usage plans, reuse plans, peer
inspections, process controls, proposed architectures in executable code, and government access
to contractor on-line development environments.
“This emphasis on process will give all of us in the software community a greater confidence
that the prospective contractor will deliver the promised product on time and on budget.”
Mr. Mosemann’s emphasis on process improvements to improve the quality of predictability
of cost and schedule fits nicely with the concept of expanding the use of parametric tools in the
cost estimating workplace. Parametrics can play a role in the BPR process as was underscored
by Anthony A. DeMarco in his article, “CAPE: Computer Aided Parametric Estimating for
Business Process Re-Engineering,” in the PRICE Newsletter, October 1994. In his article, in
summary, Mr. DeMarco states that:
“Business Process Reengineering (BPR) is the reengineering of an organization by
examining existing processes and then revamping and revising them for incremental
improvement. It is doing more with less and sometimes entails “starting over.”
“Parametric tools can assist BPR. On one level, they can improve and streamline the BPR
phases. On another level, parametric technology is the ‘best practice’ for estimating. Parametric
tools bring speed, accuracy and flexibility to estimating processes, processes that are often
bogged down in bureaucracy and unnecessary detail.”
The need to reengineer the DoD cost estimating process (Acquisition Reform initiatives)
became self-evident to certain people from both government and industry. A Steering Committee
was chartered by government and industry executives to explore the role played by parametrics in
the cost estimating process. One goal was to determine what barriers, if any, exist to expanding
the role of parametrics, and to develop action plans to overcome those barriers. The
committee consists of representatives from all of the Armed Services, the oversight community,
and selected contractors. This Handbook has been authorized by that Steering Committee.
The Handbook is intended to be used by both model developers and model reviewers,
and by their management or oversight, whether technical or financial. Government and industry
cost analysts and auditors who use CER’s and/or parametric cost models to develop or evaluate
estimates generated with these parametric tools should find the Handbook useful. It is also
intended to be used as a source document by trainees within a generic parametric cost
estimating training program.
This Handbook includes basic information concerning data collection, Cost Estimating
Relationship (CER) development, parametric cost models, and statistical techniques.
Parametric techniques are a credible cost estimating methodology that can provide
accurate and supportable contractor estimates, lower cost proposal processes, and more cost-
effective estimating systems.
An estimating workbench context model is shown in Figure I-1. The model indicates the
tools required within the estimating community of contractors, customers and government
agencies. Figure I-2 is a graphical representation of the complete parametric cost estimating
process. The figure indicates the process from inputs through modeling and into a post processor
phase. The post processor allows for the conversion of parametric output into a cost proposal.
BACKGROUND
The origins of parametric cost estimating date back to World War II. The war caused a
demand for military aircraft in numbers and models that far exceeded anything the aircraft
industry had manufactured before. While there had been some rudimentary work from time to
time to develop parametric techniques for predicting cost, there was no widespread use of any
cost estimating technique beyond a laborious buildup of labor-hours and materials. A type of
statistical estimating had been suggested in 1936 by T. P. Wright in the Journal of the
Aeronautical Sciences. Wright provided equations which could be used to predict the cost of airplanes over
long production runs, a theory which came to be called the learning curve. By the time the
demand for airplanes had exploded in the early years of World War II, industrial engineers were
using Wright’s learning curve to predict the unit cost of airplanes.
In the late 1940’s, the DoD, and especially the United States Air Force, began a study of
multiple scenarios concerning how the country should proceed into the age of jet aircraft,
missiles and rockets. The military saw a need for a stable, highly skilled cadre of analysts to
help with the evaluation of such alternatives. Around 1950, the military established the Rand
Corporation in Santa Monica, California, as a civilian “think tank” for independent analysis. Over
the years, Rand’s work has come to represent some of the earliest and most systematic studies of
cost estimating in the airplane industry.
The first assignments given to Rand concerned studies of first and second generation
ICBM’s, jet fighters and jet bombers. While the learning curve technique still proved useful for
predicting the behavior of recurring cost, there were still no techniques other than detailed labor-
hour and material estimating for projecting what the first unit cost might be (a key input to the
[Figure I-1: Estimating Workbench Context Model. The figure shows the parametric
models/tools at the center of the estimating community, linked by solicitations, requests for
supporting data, and responses to customers (buyers/requestors, contracting officers, program
managers), auditors, academia, historical data users, and other government agencies/sources.
Inputs to the models include proposal/cost data, historical data, contractor data, cost element
analysis, CER’s, other parametric data, other noncost data, external data (cost comparisons),
and background data.]

FIGURE I-1
learning curve equation). Worse still, no methods were available for quickly estimating the non-
recurring costs associated with research, development, testing and evaluation (RDT&E). In the
defense business in the early to mid 1950’s, RDT&E had suddenly become a much more
important consideration. There were two reasons for that fact. First, a shrinking defense budget
(between World War II and the Korean War) had cut the number of production units for most
military programs, and second, the cost of new technology had greatly magnified the cost of
development. The inability to quickly, and accurately, estimate RDT&E and first unit production
costs had become a distinct problem.
Fortunately, within Rand, a cost analysis department had been started in 1950. This
group proved to be prolific contributors to the art and science of cost analysis -- so much so that
the literature of aerospace cost estimating of the 1950’s and 1960’s is dominated by the scores of
Rand cost studies that were published during that time. In the mid 1950’s, Rand developed the
most basic tool of the cost estimating discipline, the Cost Estimating Relationship (CER), and
merged the CER with the learning curve to form the foundation of parametric aerospace
estimating. This estimating approach is still used today.
By 1951, Rand had derived CER’s for aircraft cost as a function of such variables as speed,
range, and altitude. Acceptable statistical correlations were observed. When the data was
segregated by aircraft type (e.g., fighters, bombers, cargo aircraft, etc.), families of curves were
discovered. Each curve corresponded to a different level of product or program complexity. This
parametric stratification especially helped clarify development cost trends. Eventually, a useable
set of predictive equations was derived and quickly put to use in Air Force planning
activities.
The use of CER’s and data stratification was a basic breakthrough in cost estimating,
especially for RDT&E and first unit costs. For the first time, cost analysts saw the promise of
being able to estimate relatively quickly and accurately the cost of proposed new systems. Rand
extended the methods throughout the 1950’s, and by the early 1960’s, the techniques were being
applied to all phases of aerospace systems.
Since these rather humble beginnings, the state of the art in parametric estimating has
steadily improved, driven by explosive growth in the number of practitioners, important
methodological improvements, and greatly expanded databases. All of the major aerospace
contractors and government aerospace organizations have dedicated staffs of parametricians
who maintain and expand databases, develop parametric cost models, and utilize the tools of
parametrics to make estimates of new and ongoing programs. NASA and the DoD routinely use
parametric estimates to form the basis of new project cost commitments to Congress. The
contractor community also routinely uses parametric cost models, especially during product
concept definition. These estimates are used for decision making regarding bid strategies and
are used as submittals to the government. It is only at the production and full scale development
phase that parametrics are not commonly utilized for official proposal submissions (although
contractors still frequently use parametrics to generate target costs and cross-checks on the
labor-material/buildup estimates).
Over the past several years, industry and professional estimating associations (e.g., the
International Society of Parametric Analysts (ISPA), the Society of Cost Estimating and Analysis
(SCEA), and the Space Systems Cost Analysis Group (SSCAG)) have been actively working
with both the Defense Contract Management Command (DCMC) and the Defense Contract Audit
Agency (DCAA) to explore expanded opportunities for the use of parametric cost estimating
techniques in firm business proposals. ISPA was formed in 1978 when a parametric estimating
users’ group evolved into a more general society. The Space Systems Cost Analysis Group
formed in 1977 under the sponsorship of the U.S. Air Force Space Division, with a mission to:
1. Promote Cost Analysis Research
2. Develop new tools to improve cost estimating techniques
3. Promote sound practices, and
4. Provide a forum for government and industry cost analysts concerned with the
development and production of space-design hardware and software.
Then, in April 1994, a joint Industry and Government workshop on Parametric Cost
Estimating occurred at the Defense Contract Audit Institute in Memphis, TN. Under the
initiative and leadership of the DCMC, the DCAA, and industry proponents, a group of
knowledgeable government and industry executives, policy formulators, and parametric
practitioners was assembled to evaluate why there is not greater use of parametric cost
estimating in DoD and NASA business proposals; to identify the barriers to expanded use
of parametrics; and to plan actions to take advantage of identified opportunities.
At the conclusion of the workshop, it became clear to the participants that there were no
barriers which precluded further implementation and use of parametric cost estimating by
contractors in DoD or NASA business proposals. Rather, the barrier analysis and recommended
actions focused on the need for industry leaders to demonstrate that parametric estimating
systems can be relied upon by Government customers, and the need for the Government to train
its employees so that a clear message exists that valid parametric estimates are a useful and
often cost-effective estimating approach.
considered valid estimators by the government. However, the technique also uses cost-to-
noncost CER’s which require additional analysis to determine their validity and acceptability as
estimating tools.
Parametric techniques focus on the cost drivers, not the miscellaneous details. The
drivers are the controllable system design or planning characteristics and have a predominant
effect on system cost. Parametrics uses the few important parameters that have the most
significant cost impact on the product(s), hardware or software, being estimated.
COST REALISM
A widely used term today is “cost realism.” Now, no one expects a cost estimate to
precisely predict what a hardware or software product or a time and material service will cost.
So, cost realism is not about the exact cost estimate. It’s about the system of logic, the
assumptions about the future, and the reasonableness of the historical basis of the estimate. That
is, it’s about the things that make up the foundation of the estimate.
* If the program is competitive, has the contractor or government program office created
program expectations that are far too optimistic?
The cost estimator or analyst must ensure that they are working toward the goal of cost
realism. It doesn’t matter whether or not they are employed by private industry, or the customer
as a cost analyst or an auditor. If a contractor chooses to accept a management challenge in a
competitive procurement, that’s certainly acceptable. However, the basis for the challenge
should be clearly identified.
There is no easy answer to the cost realism dilemma we all face. Unreasonable biases
and expectations from contractor and customer have driven the cost estimating process in the
past, and personal and programmatic motivations may continue to drive it in the future. But one
thing is certain: the cost estimating process will continue to confront future unknowns. These
unknowns are what make the cost estimating job one of the most difficult there is. But sound
assumptions, high quality historical data, and unbiased analysts and estimators will improve the
process for all.
CHAPTER II
This chapter provides guidelines concerning data types, data sources, data normalization
and adjustment techniques. This chapter also includes utilization recommendations about data to
support the development and use of auditable CERs and cost models.
GENERALIZATIONS
A universal format for collecting technical and cost information is the Work Breakdown
Structure (WBS). (See Appendix B) The WBS provides for uniform definitions and collection of
cost and technical information. MIL-STD-881B provides guidelines for establishing the WBS at
DoD, Service and contractor levels.
Historical cost and labor hours data are required as a basis for cost estimating, and
parametric estimating is no exception. Data should be collected and maintained in a manner that
provides a complete audit trail, and expenditure dates should be recorded so that dollar valued costs
can be adjusted for inflation.
Also required is Technical Non-Cost Data that describes the physical, performance and
engineering characteristics of a system, sub-system or individual item. For instance, weight is a
common non-cost variable used in CER’s and parametric estimating models. (Other typical
examples of cost driver variables include horsepower, materials of construction, watts, thrust,
length, etc.)
A fundamental requirement for the inclusion of a non-cost variable in a CER is that it be a
statistically significant predictor of cost (that is, it should be a cost driver).
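This significance requirement can be sketched with an ordinary least-squares fit. The weight and hours figures below are hypothetical illustration values, and the 2.78 threshold is Student's t critical value for n - 2 = 4 degrees of freedom at the 95 percent confidence level.

```python
import math

def fit_cer(driver, cost):
    """Fit a linear CER, cost = a + b * driver, by ordinary least squares.
    Returns (a, b, t), where t is the t-statistic testing H0: b = 0."""
    n = len(driver)
    mx = sum(driver) / n
    my = sum(cost) / n
    sxx = sum((x - mx) ** 2 for x in driver)
    sxy = sum((x - mx) * (y - my) for x, y in zip(driver, cost))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(driver, cost))
    se_b = math.sqrt(sse / (n - 2) / sxx)  # standard error of the slope
    return a, b, b / se_b

# Hypothetical history: unit weight (lb) vs. manufacturing hours.
weight = [120, 150, 200, 260, 310, 400]
hours = [900, 1050, 1400, 1700, 2050, 2600]
a, b, t = fit_cer(weight, hours)

# Weight qualifies as a cost driver here only if |t| exceeds the
# critical value (about 2.78 for 4 degrees of freedom at 95%).
significant = abs(t) > 2.78
```

The same screen applies to any candidate variable (horsepower, watts, thrust, etc.): a driver whose slope cannot be distinguished from zero should not be carried into the CER.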
Relevant program data, including development and production schedules, quantities
produced, production rates, equivalent units, breaks in production, significant design changes, and
anomalies such as strikes, explosions, and natural disasters, are also necessary to fully explain
any perturbations in historical data. Such perturbations may exhibit themselves in a profile of
monthly cost accounting data as the labor hour charging may show an unusual "spike" or
"depression" in the level of charged hours. Such historical information comes from knowledgeable
program personnel or program records (also known as program "memory").
The collecting point for cost and labor hours data is, in most instances, called the general
ledger or a company accounting system. All cost and labor hours data, used in parametric CER’s or
cost models, must be consistent with, and traceable back to, the original collecting point (the
source). The data should also be consistent with accounting procedures and cost accounting
standards.
Technical non-cost data comes from engineering drawings, engineering specifications,
certification documents, or direct experience (i.e., weighing an item). Schedule, quantity and
equivalent units, and similar information comes from Industrial Engineering, Operations
Departments, program files or other program intelligence.
Inflation indices normally combine external and internal information. Examples of external
information used in these determinations include the Consumer Price Index (CPI), Producer Price
Index (PPI), Commodity Price Indices and other forecasts of inflation from various econometric
models.
There are other external sources of data including databases containing pooled and
normalized information from various places (other companies or public record information).
Although such information can often be useful, weaknesses of these sources include:
(1) The inability of the user to have knowledge of the procedures (i.e., accounting) used
by the other contributors.
(2) The treatment of anomalies (how they were handled) in the original data.
(3) Knowledge of the manufacturing processes used and how they compare to the current
scenario being estimated.
(4) The inability to accurately forecast future indices.
Internal contractor information includes analyses such as private corporate inflation studies,
or "market basket" analyses. Such internal information provides data specific to a company's
product line(s) (e.g., radar products) that could be relevant to a generic segment of the economy as a
whole (e.g., electronics). Such specific analyses would normally be prepared as part of an
exercise to benchmark government-provided indices (such as the CPI), and to compare corporate
performance to broader standards.
It is important to realize that sources of data can be almost unlimited, and all relevant
information should be considered in a parametric analysis, if practical. Although major sources are
described above, data sources should not be constrained to a specific list.
The data included in calculating parametric model parameters will vary among model
developers. However, the way in which parametric models are calculated from historical data,
and the way they are applied in the estimating process, should be consistent within an individual
estimating system.
What follows below are some of the more significant adjustments that may have to be made
to historical parametric cost data.
Consistent Scope
Adjustments are appropriate for differences in program or product scope between the
historical data and the estimate being made. For example, suppose the systems engineering
department compared five similar programs and realized that only two of the five had design-to-
cost (DTC) requirements. To normalize the data, the DTC hours would be deleted from those two
programs to create a consistent systems scope and definition for CER development.
Anomalies
Historical cost data should be adjusted for anomalies (unusual events), prior to CER
analysis, when it is not reasonable to expect these unusual costs to be present in the new projects.
The adjustments and judgments used in preparing the historical data for analysis should be fully
documented. For example, suppose the development test programs of five similar programs are
compared, and it is observed (from history and interviews) that one of the programs experienced
a major test failure (e.g., qualification, ground test, flight test). A considerable amount of labor
was required to fact-find, determine the root cause, and develop an action plan for a solution.
Should those hours be left in or deleted?
Improved Technology
Cost changes, due to changes in technology, are a matter of judgment and analysis. All
bases for such adjustments should be documented and disclosed. For example, electronic circuitry
was originally designed with discrete components, but the electronics are now implemented in
ASIC technology. A hardware enclosure was once made from aluminum and, for weight reasons,
is now made of magnesium. What is the impact on the hours? Perfect historical data may not
exist, but judgment and analysis should supply reasonable results.
For example, suppose there are four lots of production (manufacturing) hours data that look
like this:

Lot 1: 256,000 hours = 853 hours/unit
Lot 2: 332,000 hours = 738 hours/unit
Lot 3: 361,760 hours = 952 hours/unit
Lot 4: 207,000 hours = 690 hours/unit
Clearly, Lot 3's history should be investigated. It is not acceptable to merely "throw out"
Lot 3 and work with the other three lots. A careful analysis should be performed on the data to
determine why it behaved the way it did. There may have been a strike, or possibly an unusual and
serious engineering problem impacted production costs. In any event, careful analysis is important.
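The screening step above can be sketched in a few lines. The figures are the hours-per-unit values from the example, and the decline-with-learning assumption is only a first-pass screen; flagging a lot is the start of the investigation, not a license to discard it.

```python
# Hours per unit by production lot, from the example above.
lots = {1: 853, 2: 738, 3: 952, 4: 690}

# With learning, hours/unit should generally decline lot over lot;
# flag any lot whose hours/unit rise against the preceding lot.
suspect = [lot for lot in sorted(lots)
           if lot > 1 and lots[lot] > lots[lot - 1]]
# suspect -> [3], the lot whose history should be investigated
```

As the text notes, the analyst must then determine why Lot 3 behaved the way it did (a strike, an engineering problem, a production break) before deciding whether to adjust or retain it.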
Inflation
There are no fixed ways to establish universal inflation indices (past, present or future) that
fit all possible situations. Inflation indices are influenced by internal considerations as well as
external inflation rates. Therefore, while generalized inflation indices may be used, it may also be
possible to tailor indices to specific labor rate agreements (e.g., FPRAs) and to the actual
materials used on the project, and to negotiate their use on an individual basis. Inflation indices
should be based on the cost of materials and labor on a unit basis (piece, pound, hour) and should
not include other considerations such as changes in manpower loading or the amount of material
used per unit of production. The key to inflation adjustments is consistency. If cost is adjusted to
a fixed reference date for calibration purposes, the same type of inflation index must be used in
escalating the cost forward or backward from the reference date to the date of the estimate.
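As a sketch of the consistency rule, the adjustment to a fixed reference year can be written as a simple ratio of index values. The index numbers below are hypothetical.

```python
# Hypothetical price index by year (reference base arbitrary).
index = {1992: 100.0, 1993: 103.1, 1994: 106.0, 1995: 108.7}

def escalate(cost, from_year, to_year):
    """Escalate (or de-escalate) a cost between years using one
    consistent index, as the text requires."""
    return cost * index[to_year] / index[from_year]

# $1,000,000 in 1992 dollars expressed in 1995 dollars:
cost_1995 = escalate(1_000_000, 1992, 1995)

# Round-tripping recovers the original figure only because the same
# index is applied in both directions -- the consistency requirement.
cost_back = escalate(cost_1995, 1995, 1992)
```

Mixing indices (say, CPI going forward and a corporate index going back) would break the round trip and distort the normalized data.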
Learning Curve
The learning curve, as originally conceived, analyzes labor hours over successive production
units of a manufactured item. The curve is defined by the following equation:

    Hours/Unit = First Unit Hours × U^b
or
    Fixed Year Cost/Unit = First Unit Cost × U^b

where:  U = unit number
        b = slope coefficient of the curve
In parametric models, the learning curve is often used to analyze the direct cost of
successively manufactured units. Direct Cost equals the cost of both touch labor and direct
materials - in fixed year dollars. Sometimes this may be called an improvement curve. The slope is
calculated using hours or constant year dollars. A more detailed explanation of learning curve
theory is presented in Chapter III.
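The unit curve above can be sketched directly. The 85% slope and 1,000 first-unit hours are illustration values, and the conversion b = log2(slope) follows the usual convention, detailed in Chapter III, that hours fall to the slope percentage each time the unit number doubles.

```python
import math

def unit_hours(first_unit_hours, unit, slope=0.85):
    """Hours/Unit = First Unit Hours * U**b, with b = log2(slope)."""
    b = math.log(slope, 2)  # 85% curve -> b is approximately -0.234
    return first_unit_hours * unit ** b

h1 = unit_hours(1000, 1)  # 1000.0 hours for unit 1
h2 = unit_hours(1000, 2)  # 850.0 hours: 85% of unit 1
h4 = unit_hours(1000, 4)  # 722.5 hours: 85% of unit 2
```

The same function applies to fixed-year cost per unit by substituting first-unit dollars for first-unit hours.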
Production Rate
Production rate effects (changes in production rate, i.e., units/months) can be calculated in
various ways. For example, by adding another term to the learning or improvement curve equation
we would obtain:
    Hours/Unit = First Unit Hours × U^b × R^r
or
    Fixed Yr $/Unit = First Unit $ × U^b × R^r

where:  U = unit number
        b = learning curve slope
        R = production rate
        r = production rate curve slope
The net effect of adding the production rate term (R^r) is to adjust the First Unit value for
rate. The equation will also yield a different "b" value.
Rate effects may be ignored or treated in different ways in different models. If
possible, rate effects should be derived from historical program behavior patterns observed as
production rates change, while holding the learning curve constant.
The rate effect can vary considerably depending on what was required to effect the change.
For example, were new facilities required or did the change involve only a change in manpower or
overtime?
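A sketch of the rate-adjusted equation, using a hypothetical 85% learning slope and 95% rate slope, with the same log2 conversion used for the learning curve:

```python
import math

def unit_hours(first_unit_hours, unit, rate,
               lc_slope=0.85, rate_slope=0.95):
    """Hours/Unit = First Unit Hours * U**b * R**r, where b and r are
    the log2 of the learning-curve and rate-curve slopes."""
    b = math.log(lc_slope, 2)    # learning curve exponent
    r = math.log(rate_slope, 2)  # production rate exponent
    return first_unit_hours * (unit ** b) * (rate ** r)

# Holding the unit fixed, doubling the monthly rate scales hours
# by the 95% rate slope:
slow = unit_hours(1000, 10, rate=1)
fast = unit_hours(1000, 10, rate=2)
ratio = fast / slow  # about 0.95
```

Whether a rate change actually yields such savings depends, as the text notes, on what was required to effect the change (new facilities versus added manpower or overtime), so fitted rate slopes should be sanity-checked against program history.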
Once data has been collected and normalized, cost models can be developed. Although we
will discuss cost models in much more depth in Chapters IV and V, a few comments are relevant
here.
There are two general types of cost models: internal (contractor developed) and
commercially available. Internal, contractor developed models are derived from unique contractor
data and generally do not require calibration since they have been calibrated in a de facto manner.
On the other hand, commercial models are based on more universal data, and almost always need
some form of calibration to be useful.
The cost driver equation(s) utilized in a commercial cost model are based on a database
external to the specific data being used to support the current estimate. Calibration, then, is the
process of computing a multiplier(s), to be used with the general purpose equation(s), such that the
combined equation(s) will predict the cost as reflected by the technical and programmatic data
being used to support the estimate.
Specialized (Internal) cost models are based directly on the data being used to support the
estimate. Since the CER’s are derived directly from the supporting data, the model is, by definition,
calibrated.
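As a minimal sketch of the calibration described above, a single multiplier can be computed from completed-program actuals and applied to the general-purpose equation; the program costs shown are hypothetical.

```python
def calibration_factor(actual_costs, model_estimates):
    # A single multiplicative factor: total actual cost over total
    # model-predicted cost for the completed programs.
    return sum(actual_costs) / sum(model_estimates)

# Hypothetical completed programs ($M): actuals vs. uncalibrated output.
actuals   = [120.0, 95.0, 210.0]
estimates = [100.0, 90.0, 180.0]
k = calibration_factor(actuals, estimates)
calibrated = [k * e for e in estimates]  # estimates pulled onto the actuals
print(round(k, 3))  # 1.149
```

In practice a commercial model may expose several such multipliers (by cost element or WBS item), but the principle is the same.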
It is the combined use of the model with the estimating process that must achieve acceptable
results to provide a basis for the validation of the model. It may also involve disclosure of how the
model works so that the effects of scaling and heuristic analysis can be evaluated by management,
customers or auditors.
Validation implies that interested parties have agreed that the model is a valid and
acceptable estimating tool, and predicts costs within a reasonable range of accuracy.
Almost certainly, any data utilized will have to undergo a review and audit, so proper
documentation is a must. Documentation should include:
(1) A record of all mathematical formulas and "goodness of fit" and other statistics.
(2) A record of adjustments to the original cost data and why those adjustments were made.
(3) A statement of work for the historical data; judgment factors should be identified with
rationale.
(4) An audit trail for the data used for validation that will allow rederivation of the adjusted
data and/or CER from the original source(s). This information must be identified and
available upon request.
Any CERs and data used in a cost model will need to be updated periodically and/or as
agreed to with the PCO. The updating procedure should satisfy the Truth in Negotiations Act
(TINA) requirements by the time of agreement on price or another time as agreed upon by the
parties.
When preparing a cost estimate, look for all credible data sources. If at all possible, use
primary sources of data.
There are nine main sources of data and they are listed in the chart below.
Collecting the data to produce an estimate, and evaluating the data for reasonableness, is a
very critical and time-consuming step of the estimating process.
When collecting the data needed to integrate cost, schedule, and technical information for
an estimate, it is important to obtain not only cost information but also technical and schedule
information. The technical and schedule characteristics of programs are important because they
drive cost; they provide the basis for the final cost.
For example, assume the cost of another program is available and a program engineer has
been asked to relate the cost of the program to that of some other program. If the engineer is not
provided with specific technical and schedule information that defines the similar program, the
engineer is not going to be able to accurately compare the programs, nor is he or she going to be
able to respond to questions a cost estimator may have regarding the product being estimated vis-à-
vis the historical data.
The bottom line is that the cost analysts and estimators are not solely concerned with cost
data. They need to have technical and schedule data available in order to adjust, interpret, and lend
credence to the cost data being used for estimating purposes.
A cost estimator has to know the standard sources where historical cost data exists. This
knowledge comes from experience and from those people, the so-called local experts, who are
available to answer key questions.
A cost analyst or estimator should be constantly searching out new sources of data. A new
source might keep cost and technical data on some item of importance to the current estimate. Do
not hesitate to ask anyone who might know or be able to help, since it is critical to have relevant
cost, schedule and technical information at all times.
The chart below summarizes important points about data collection and evaluation.
When performing a statistical analysis, it is very important that functional specialists can
provide realistic and reliable parameters for the independent variables, given the stage of the
program being estimated. To illustrate this point, suppose a statistical relationship with very
strong correlation has been developed, and a potential cost driver has been discovered. If data
for that independent variable are not available for the estimate, the parametric model will not
help with the estimate.
Finally, knowledge of basic statistics, modeling skills and an understanding of analytical
techniques is necessary to develop parametric estimating relationships. (See also Chapter III).
The above information is summarized on the chart below.
If a CER does not estimate its own database well, then its credibility with other data points outside its
database would be questionable.
To use parametric model methodology successfully, one must obtain realistic,
most-likely range estimates for the independent variable values. Sometimes functional experts are
not sure what the real performance requirements are for a new program. In such cases, a most-likely
range will provide values that reflect an assessment of the associated uncertainties or unknowns (a
Risk Analysis).
Again, the top functional experts who know the program should identify and estimate the
range of cost driving characteristics. They should also confirm the applicability of the specific
parametric cost model from a technical perspective.
The information needed to use a parametric data model is listed on the chart that follows.
A parametric estimating methodology can be used at any stage of a program's life cycle.
For example, a general parametric model may be utilized in the early, conceptual phase of a
program, although the same parametric model could be inappropriate for use in the follow-on
phases of the program.
When a parametric model is applied to values outside its database range, the credibility of
the resulting estimate becomes questionable. In cost estimating, one rarely finds large, directly
applicable databases, and the source document has to be evaluated to determine if the parametric
can be applied to the current estimate. However, it is possible to develop parametric tools that
relate cost to generic complexity values or tables. An experienced modeler can relate such
generalized parameters to the task at hand and produce a good cost model, but a parametric
model always needs to make sense for the present estimate.
Additionally, one should validate models, based on expert opinion, before using them. This is
accomplished first by obtaining some actual, historical data points (technical, schedule, and cost) on
completed programs similar to the current program. With these data in hand, apply the model to the
actual technical and schedule information and see how well the parametric model predicts the
known cost. If the model estimates the actual costs within an acceptable margin of error, the model
can be considered validated for programs that are similar to the historical data points. Careful
validation will help ensure that cost models are appropriately used.
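The back-testing step described above can be reduced to a simple tolerance check. The ±15% margin and the (model, actual) pairs below are assumptions for illustration only.

```python
def within_tolerance(predicted, actual, tolerance=0.15):
    # True if the prediction falls within the agreed margin of error
    # (an assumed +/-15% here) of the known actual cost.
    return abs(predicted - actual) / actual <= tolerance

# Hypothetical back-test pairs (model estimate, actual cost) in $M
# for completed programs similar to the one being estimated.
history = [(52.0, 48.0), (101.0, 97.5), (73.0, 80.0)]
print(all(within_tolerance(p, a) for p, a in history))  # True
```

The acceptable margin should be agreed with the interested parties as part of validation, not chosen unilaterally by the modeler.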
Many times a parametric model needs to be adjusted if the new system has cost drivers
and/or requirements that are not reflected in the parametric's database. In some of these cases a
combination of parametric methodology with an approach taken from the analogy methodology can
be used to develop an estimate. This is accomplished by adjusting the results of the parametric
approach with scaling or complexity factors that reflect any unique requirements.
For example, parametrics and analogy approaches could be effectively combined to
estimate the costs of a new program for which technology advancement is anticipated. First, either
develop or use an existing parametric model, based on similar data points, to estimate the cost of
the program, without technology advancement.
Second, address the technology advancement by consulting with functional experts to
obtain a most-likely range for a relative scaling factor that will reflect any advancements in
technology. The relative scaling or complexity factor is applied to the result of the parametric
estimate, and adjusts it for the impact of technology advancement. This is a solid and valid
estimating approach for assessing the cost impacts of technology advancement, or other
"complexity" differences.
In such cases, the parametric model has to be adjusted so that it makes sense vis-à-vis the
current estimate.
If there exist no realistic estimates for the independent variable values for the product or
effort being estimated, then parametric models should not be used. The level of uncertainty in the
estimate will be considerable, and the validity of the estimate questionable. In cases such as this,
parametrics can only be used as a benchmark.
As stated previously, it is very difficult for functional specialists to provide a single point
estimate. Moreover, a single point estimate does not reflect the uncertainty inherent in any
functional expert's opinions. Consider requesting most likely range values rather than point
estimates, if possible.
Even after a parametric analysis has been completed it is prudent to follow it with a risk
analysis evaluation of the high risk areas and key cost drivers.
The chart below displays these points.
* Using a parametric model without adjustment when new system requirements are not
reflected in the parametric's database
* Using a parametric model without access to realistic estimates of the independent
variables' values for product/effort being estimated
* Requesting point estimates for independent variable values versus a most likely
range, if possible and practical
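Most-likely ranges can be carried through to the estimate with a simple Monte Carlo risk analysis. The triangular distribution, the weight-based CER, and every parameter value in this sketch are assumptions for illustration only.

```python
import random

def monte_carlo_estimate(cer, ranges, trials=20000, seed=1):
    # Draw each independent variable from a triangular distribution
    # over its (low, most-likely, high) range, push the draws through
    # the CER, and summarize the resulting cost distribution.
    rng = random.Random(seed)
    costs = []
    for _ in range(trials):
        draws = {name: rng.triangular(lo, hi, ml)
                 for name, (lo, ml, hi) in ranges.items()}
        costs.append(cer(draws))
    costs.sort()
    mean = sum(costs) / trials
    return mean, (costs[trials // 10], costs[-(trials // 10)])

# Hypothetical weight-based CER: cost ($K) = 40 * weight^0.8.
cer = lambda v: 40.0 * v["weight"] ** 0.8
mean, (p10, p90) = monte_carlo_estimate(cer, {"weight": (18.0, 20.0, 25.0)})
print(round(mean, 1), round(p10, 1), round(p90, 1))
```

The resulting interval, rather than a single point, reflects the uncertainty the functional experts expressed in their range.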
Illustration 1
You plan to do a parametric estimate of a system using some company history. The system
under investigation is similar to a system built several years ago. The two systems compare as
follows:
Parameter             Historical System           Prospective System
Date of Fabrication   Jul 89 - Jun 91             Jul 95 - Dec 95
Production Quantity   500                         750
Size - Weight         22 lb. external case        20 lb. external case
                      5 lb. int. chassis          5 lb. int. chassis
                      8 lb. elec parts            10 lb. elec parts
Volume                1 cu ft, roughly cubical    .75 cu ft, rect. solid
                      12.1 x 11.5 x 12.5          8 x 10 x 16.2
Other Prog Features   Manual of oper incl.        Minor chgs to manual
                      5% elec parts as spares
Normalization: In this instance, we would require adjusting for inflation factors, the
quantity difference, the rate of production effect and the added elements in the original program (the
spare parts and manual). The analyst should be careful when normalizing these data. General inflation
factors are almost certainly not appropriate for most situations. Ideally, the analyst will have a good
index of costs which is specific to the industry and will use labor cost adjustments specific to
his/her company. The quantity and rate adjustments will have to consider the quantity effects on
the company's vendors and the ratio of overhead and setup to the total production cost. Likewise,
with rate factors each labor element will have to be examined to determine how strongly the rate
affects labor costs. On the other hand, the physical parameters do not suggest that significant
adjustments or normalizations are required.
The first order normalization of the historic data would consist of:
1) Material escalation using industry or company material cost history.
2) Labor escalation using company history.
3) Material quantity price breaks using company history.
4) Possible production rate effects on touch labor (if any) and unit overhead costs.
Because both cases are single-lot batches, and are within a factor of two in quantity, only a
small learning curve or production rate adjustment is likely required.
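Escalation steps like (1) and (2) of the first-order normalization amount to restating costs with a cost index. In this sketch the index values are hypothetical; a company- or industry-specific index is assumed, per the discussion above.

```python
def escalate(cost, index_from, index_to):
    # Restate a cost from one year's dollars to another's using a
    # cost index; a company/industry-specific index is assumed.
    return cost * index_to / index_from

# Hypothetical company labor cost index by fiscal year.
labor_index = {1990: 100.0, 1995: 118.5}

cost_1990 = 250.0  # $K of labor recorded in FY1990
print(escalate(cost_1990, labor_index[1990], labor_index[1995]))  # 296.25
```

Deflating to constant dollars is the same operation run toward the base year's index value.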
Illustration 2
You are considering building some equipment for the first time. Relevant labor effort
history relates to equipment built some time ago, but reliable data are only available from the third
lot, beginning at unit 250. The available history indicates a 95% cost improvement curve from the
fourth lot on. This history is considered the best available.
Normalization here requires two steps. Unless you believe the early lot cost improvement
curve is the same as the historical lot improvement curve of the later lots (unlikely), you will need
to identify cost improvement curves which may be applicable. These data points will have to be
normalized to some common condition and the resulting improvement curve applied to the case at
hand. Suppose the relevant history is as follows:
Clearly, the selection of an appropriate learning curve will be a judgment call that will
depend upon the case at hand and how it relates to the historical data. For instance, there is some
suggestion that the fewer the number of lots, the steeper the learning curve. How does the
production rate and quantity match the data? In this situation, the analyst could break up the
production history and examine the relationship of the learning curve to the lot sequence, the
quantity, and the production rate. The date may reflect the technology involved and any historical
comparisons will have to include the impact of relevant technology.
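One way to examine the production history is to fit an improvement curve to lot data by least squares in log space. The lot midpoints and average hours below are invented solely for illustration.

```python
import math

def fit_improvement_curve(lot_midpoints, avg_unit_hours):
    # Fit Hours = T1 * U^b by least squares on log-transformed data;
    # lot plot midpoints stand in for unit numbers (a common
    # simplification). Returns (T1, slope percent), slope = 100 * 2^b.
    xs = [math.log(u) for u in lot_midpoints]
    ys = [math.log(h) for h in avg_unit_hours]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    t1 = math.exp(my - b * mx)
    return t1, 100.0 * 2.0 ** b

# Invented lot history: (midpoint unit, average hours per unit).
mids  = [275, 450, 800, 1400]
hours = [620.0, 598.0, 574.0, 552.0]
t1, slope = fit_improvement_curve(mids, hours)
print(round(slope, 1))  # roughly a 95% curve for these invented data
```

Breaking the history into subsets and refitting, as suggested above, shows how the slope varies with lot sequence, quantity, and rate.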
SUMMARY
In summary, let’s explore why there might be a database problem. Given the large amount
of data available on a hardware system, one might wonder why building a database presents such a
problem. Let's consider a few of the more important reasons:
(1) Information in the wrong format. While we do indeed possess a great deal of data, in
many cases this data is not in an appropriate format to be very useful in developing CERs. This is
primarily because the information systems which generate the required data were originally
established to meet organizational objectives that are no longer relevant or are not concerned with
cost estimating or analysis. In short, the orientation of a large number of past and existing
information systems is toward the input side, with little or no provision for making meaningful
translations into output data useful in CER development or similar types of analysis.
(2) Differences in definitions of categories and inconsistent WBS's and WBS categories.
We also have a problem in building a database when the definition of the content of cost categories
fails to correspond to the definition of analogous categories in existing databases. For example,
some analysts put engineering drawings into the data category while others put engineering
drawings into the engineering category. A properly defined WBS product tree and dictionary can
avoid or minimize these inconsistencies.
(3) The Influence of Temporal Factors. It goes without saying that historical data is
generated over time. This means that numerous dynamic factors will influence data being collected
in certain areas. For example, the definition of the content of various cost categories being used to
accumulate the historical data may change as a system evolves. Similarly, inflation changes will
occur and be reflected in the cost data being collected over time. In addition, as the DoD deals with
a rapidly changing technical environment, cost data generated for a given era or class of technology
is necessarily limited. In fact, we would consider ourselves especially lucky if we can obtain five to
ten good data points for certain types of hardware. Too few data points can lead to problems in our
attempts to develop statistically sound CER’s for forecasting costs into the near and distant future.
(4) Comparability Problems. Such problems include, but are not limited to, changes to a
company's department numbers, accounting systems and disclosure statements, and changes from
indirect to direct charge personnel for a given function, among others.
To summarize, when we develop a database, we must normalize that database to be sure we
have comparable data. For example, if we are building a database with cost data, we must first
remove the effects of inflation so that all costs are displayed in constant dollars. We must also
normalize the data for consistency in content; that is, we must ensure that a particular cost category
has the same definition in terms of content. Normalizing cost data is a challenging problem, but it
is a problem which must be resolved if a good database is to be constructed.
Resolving database problems so that an information system exists to meet our needs is not
easy. For example, cost analysis methodologies typically vary considerably from one analysis or
estimate to another. The requirements for CERs -- hence the data and information requirements --
are not constant over time. In short, the analyst who is working in support of long range forecasting
needs cannot specify information needs once and for all. These needs must be reviewed
periodically.
The maintenance and update of any database must also be considered. An outdated
database may be of very little use in forecasting future acquisition costs. However, we must also
consider the costs of maintaining and updating a database. We may find that we simply cannot
afford to provide for many current databases. Finally, since today we deal with a set of rapidly
changing technology, we may find ourselves limited to developing a CER with a relatively small
database.
Resolving the database problem must, at a minimum, focus on being provided with an
explicit SOW within the RFP and standardized cost category definitions. Then we must consider
how to manage and utilize the database we have within our individual organizations.
CHAPTER III
STATISTICAL TECHNIQUES AND
CER DEVELOPMENT
CHAPTER III
Proper CER development and application depends heavily upon an understanding of certain
mathematical and statistical techniques. Some of the simpler and more widely used techniques
are explained in this chapter. Many other techniques are available in standard statistical text
books.
CER development and cost modeling are forms of forecasting real world cost events.
How much effort will it take to build a bomber or a space sensor? An estimate has a probability
of being within a given percentage of the “correct” answer. The better the project cost data and
cost model, the closer will be the predicted cost to the final actual cost at project completion.
Since a model of the real world involves simplifications, the final actual cost will rarely equal the
estimated cost.
These modeling uncertainties translate into the “risk” of producing a realistic cost estimate
within a given percentage of the final actual project cost. Such uncertainties can be grouped into
two major categories:
1. Uncertainty of any organization to perform as planned, due to unexpected resource
or scheduling changes in the scope of effort to produce the design, prototype or product, and,
2. Uncertainty associated with the development (and, hence, usefulness) of any cost
model. This item includes:
a. Uncertainty associated with omission of a key cost driver,
b. Mis-specification of the form of the model equation,
c. Modeling limitations associated with a lack of data, and,
d. Data consistency across multiple project databases.
The first type of risk, (1), can be addressed through improved specifications in the scope
of work and an improvement in the clarity of understanding between the customer and
contractor. This would result in a decrease in the number of contract and engineering change
proposals (CCPs/ECPs) and in unnecessary rework.
One of the second types of risk, (2a), uncertainty associated with omission of key cost
drivers, is addressed through development of historical cost data by product line, defined Work
Breakdown Structure (WBS), and systematic understanding of the types of cost associated with
development, prototype, and production programs.
For risk (2b), avoiding mis-specification of the form of the model equation,
careful review of in-house and available industry data is required to create the basis of the model.
Affirmative answers are required to the following four questions. Does the model make basic
economic sense? Is the cost estimate an extrapolation within the scope of the cost database?
Is this type of cost estimate no higher a risk than an interpolation between existing
historical data in size and complexity? How well do the outputs compare to history? If poor
patterns exist between the model and history, it is likely that an important parameter has been left
out. The form of the cost equation should then be re-examined.
Risk (2c) deals with the question about the sufficiency of data points that are available to
validate the cost model. A lack of data points will increase the uncertainty of the cost model
equation. Can more data points be included? The use of additional relevant data can increase the
validity of any model. In the absence of substantial data, have experts critique the reasonableness
of the model outputs.
Usually there is a lack of data in a relatively new technological or programmatic area of
cost modeling. How can we model the costs associated with an improvement in processes or
teaming arrangements that improve producibility and lower costs relative to the old ways of
program management and control? Metric data is now being gathered within many companies,
so this data issue should become much less severe in years to come. In the meantime,
organizational process improvement curves, a variation of the quantity of production (repetition)
based on historical cost improvement curves, can be used to evaluate the benefits of process
improvements.
Risk (2d), data consistency, can be mitigated by using Work Breakdown Structure (WBS)
cost elements which are standardized across projects. (At the end of this Handbook, Appendix B
contains a treatise about developing Work Breakdown Structures). If definitions for system
engineering, project management, etc., are not consistent, then the resulting cost model and its
cost estimating capability for each of the WBS elements may be flawed. Cost analysts need to
have at their disposal a correct WBS cost estimating tool, complete with a WBS dictionary. If a
cost model is developed for similar hardware items (objects) across programs, then the definition
of these items should be standardized as much as possible. Differences in definitions from
program to program or between product lines are captured by the complexity of labor, or material
cost differences.
For CERs to be valid, they must be developed based on sound statistical concepts. Once
valid CER’s have been developed, then parametric cost modeling can proceed. The following
discussion will review some of the more commonly used statistical techniques for CER
development.
CERs are the key tool used in estimating by the cost analyst. CERs may be used at any
time in the estimating process. For example, CERs may be used in the concept or validation
phase of estimating program costs because there is insufficient system definition to use anything
else. CERs may also be used in later phases of estimating program costs as a cross check of
another estimating procedure, or as the primary basis for an estimate.
The value of a CER is dependent upon the soundness of the database from which the
CER is developed and the appropriateness of that CER to what is to be estimated. Determination
of the “goodness” of a CER and its applicability to the system being estimated requires a
thorough analysis by the cost analyst. The process of validating and selecting a CER is the
subject of this chapter. We will begin by defining CERs and then look at how a CER may be
developed. Next, we will look at when to use a CER and note the strengths and weaknesses of
CERs. Finally, we will consider the techniques for developing CERs; the linear regression model
of the Least Squares Best Fit (LSBF) model, along with a few other statistical techniques.
Appendix D describes additional statistical techniques.
We can make a few general observations about CER development. We know that CERs
are analytical equations which relate various categories of cost (either in dollars or physical units)
to cost drivers or explanatory variables. We also know that CERs can take numerous
forms, ranging from informal rules of thumb or simple analogies to formal mathematical
functions derived from statistical analysis of empirical data. If we are going to develop a CER,
however, then we need to focus on assembling and refining the data that constitute the empirical
basis for that CER.
In deriving a CER, assembling the database is especially important and, generally, a very
time-consuming activity (see Chapter II). Deriving valid CERs is a difficult task and the number
of really good, valid CERs is significantly fewer than one might expect. While there are many
reasons for the lack of valid CERs, the number one reason is the lack of an appropriate database.
When developing a CER, the analyst must first hypothesize a logical estimating
relationship. For example, does it make sense to expect that costs will increase as aircraft engine
thrusts increase? Given that it does make sense that cost and engine thrust have some direct
relationship, the analyst will need to refine that hypothesis to determine whether the relationship
is linear or curvilinear. After developing a hypothetical relationship, the analyst needs to
assemble a database.
Sometimes, when assembling a database, the analyst discovers that the raw data is at least
partially in the wrong format for analytical purposes and displays irregularities and
inconsistencies. Adjustments to the raw data, therefore, almost always need to be made to ensure
a reasonably consistent and comparable database. It is important to note that no degree of
sophistication in the use of advanced mathematical statistics can compensate for a seriously
deficient database.
Since the data problem is fundamental, typically a considerable amount of time is devoted
to collecting data, adjusting that data to help ensure consistency and comparability, and providing
for proper storage of information so that it can be rapidly retrieved when it is needed. In fact,
more effort is typically devoted to assembling a quality database than to anything else. Given the
appropriate information, however, the analytical task of deriving CER equations is often
relatively easy.
When a CER has been built from an assembled database based on a hypothesized, logical
statistical relationship, and the hypothesis has been accepted or revised, one is ready to apply the
CER. It may be used to forecast future costs or it may be used as a cross check of an estimate
done with another estimating technique. For example, one may have generated an estimate using
a grassroots approach -- a detailed build up by hours and rates -- then “benchmark” the grassroots
estimate with a CER.
A CER built for a specific forecast may be used with far more confidence than a generic
CER. One must be especially careful in using a generic CER when the characteristics of the
forecast universe are, or are likely to be, different from those reflected in the CER. One may
need to qualify a generic CER by reviewing the database and the assumptions made for its use.
The database may need to be updated with data appropriate for forecasting. One may
also find that the generic CER can be used only as a starting point.
When using a generic CER as a point-of-departure, enhance or modify the forecast in
light of any other available supplementary information. This most likely will involve several
iterations before the final forecast is determined. It is important to carefully document the
iterations so that an audit trail exists explaining how the generic CER evolved to become the
final forecast.
In using any CER -- specially built or generic -- the main theme is invariably: Be careful;
Use good judgment! CERs are a fundamental estimating tool used by cost analysts. However,
we must use them while applying a good deal of judgment. For example, we need to be concerned
about the uncertainty of a future system in terms of chance elements in the “real” world; that is,
uncertainty about the state of the world in terms of technological, strategic, and political factors.
Also, we must carefully document as we go, so that an all important audit trail is maintained.
In applying good judgment in the use of CERs, we need to remain mindful of the
strengths and weaknesses of CERs. Some of the more common ones are presented below:
Strengths
1. One of the principal strengths of CERs is that they are quick and easy to use. Given a
CER equation and the required input data, one can generally turn out an estimate
quickly.
2. A CER can be used with limited system information. Consequently, CERs are
especially useful in the RDT&E phase of a program.
3. A CER is an excellent (statistically sound) predictor if derived from a sound database,
and can be relied upon to produce quality estimates.
Weaknesses
1. CERs are sometimes too simplistic to forecast costs. Generally, if one has detailed
information, that detail may be used for reliable estimates, and another estimating
approach may be selected rather than a CER.
2. Problems with the database may mean that a particular CER should not be used.
While the analyst developing a CER should validate that CER, it is the responsibility
of any user to validate the CER by reviewing the source documentation. Read what
the CER is supposed to estimate, what data were used to build that CER, how old the
data are, how they were normalized, etc. Never use a cost model without reviewing
its source documentation.
Now that we know what a CER is, how to develop a CER, when to use a CER, and some
of a CER’s strengths and weaknesses, we can develop techniques for building CER’s. The LSBF
technique is only one mathematical transformation of the database - the linear regression model.
Other sophisticated curvilinear models can also be developed, but will not be explored in this
Handbook. An analyst should be mindful that little in the estimating world is linear.
REGRESSION ANALYSIS
The purpose of regression analysis is to improve our ability to predict the next “real
world” occurrence of our dependent variable. Regression analysis may be defined as determining
the mathematical nature of the association between two variables. The association is determined in
the form of a mathematical equation. Such an equation provides the ability to predict one
variable on the basis of the knowledge of the other variable. The variable whose value is to be
predicted is called the dependent variable. The variable about which knowledge is available or
can be obtained is called the independent variable. In other words, the dependent variable is
dependent upon the value of the independent variables.
The relationships between variables may be linear or curvilinear. By linear, we mean that
the functional relationship can be described graphically (on a common X-Y coordinate system)
by a straight line and mathematically by the common form:
y = a + bx
where y = the calculated value of the dependent variable
x = the independent variable
a = the Y-intercept of the line, the value of y when x = 0
b = the slope of the line, the change in y divided by the corresponding
change in x
a and b are constants for any value of x and y
Looking at the bi-variate regression equation -- the linear relationship of two variables --
we find that regression analysis can be described by an equation. The equation consists of two
distinctive parts, the functional part and the random part. The equation for a bi-variate regression
population is:
Y = A + BX + E
where A + BX is the functional part (a straight line) and E is the random part. A and B
are parameters of the population that exactly describe the intercept and slope of the
relationship.
E represents the random or “error” part of the equation. The random part of the equation is
always present because of the errors of assigning values, the errors of measurement, and errors
of observation. These types of errors are always with us because of our human
limitations, and the limitations associated with real world events.
For a sample drawn from that population, the corresponding regression equation is:
y = a + bx + e,
where a + bx represents the functional part of the equation and e represents the random part. Our
estimates of A and B in the population are represented by a and b, respectively, in the sample
equation. In this sense, then, a and b are statistics. That is, they are estimates of population
parameters. As statistics, they are subject to sampling errors. As such, a good random sampling
plan is important. The linear regression model rests on the following assumptions:
1. The values of the dependent variable are distributed by a normal distribution function
about the regression line.
2. The mean value of each distribution lies on the regression line.
3. The variance of each array of the independent variable is constant.
4. The error term in any observation is independent of the error term in all other
observations. When this assumption is violated, the data are said to be autocorrelated.
This assumption requires the error term to be a truly random variable.
5. There are no errors in the values of the independent variables. The regression model
specifies that the independent variable be a fixed number -- not a random variable.
For example, if you wish to estimate the cost of a new bomber aircraft at mach 2, then
mach 2 is the value of the independent variable. Mach 2 is a fixed number. The
regression model will not handle errors in the independent variables.
6. All causation in the model is one way. This simply means that if causation is built
into the model, the causation must go from the independent variable to the dependent
variable. Causation, though neither a statistical nor a mathematical requirement, is a
highly desirable attribute when using the regression model for forecasting. Causation,
of course, is what we, as cost analysts, are expected to determine. We do this in our
hypothesis of a logical mathematical relationship in either building or reviewing a
CER equation.
CURVE FITTING
There are two standard methods of curve fitting. One method has the analyst plot the data
and fit a smooth curve to the data. This is known as the graphical method. The other method
uses formulas or a “best-fit” approach where an appropriate theoretical curve is assumed and
mathematical procedures are used to provide the one “best-fit” curve; this is known as the Least
Squares Best Fit (LSBF) method.
We are going to work the simplest model to handle, the straight line, which is expressed
as:
Y = a + bx
Graphical Method
To apply the graphical method, the data must first be plotted on graph paper. No attempt
should be made to make the smooth curve actually pass through the data points which have been
plotted; rather, the curve should pass between the data points leaving approximately an equal
number of points on either side of the line. For linear data, a clear ruler or other straightedge may
be used to fit the curve. The objective in fitting the curve is to “best-fit” the curve to the data
points plotted; that is, each data point plotted is equally important and the curve you fit must
consider each and every data point.
Although considered a rather outdated technique today, plotting the data is still always a
good idea. By plotting the data, we get a picture of the relationship and can easily focus on those
points which may require further investigation. Hence, as a first step, we should plot the data and
note any data points which may require further investigation before developing a forecasting
graphical curve or mathematical equation.
LSBF Method
The LSBF method specifies the one line which best fits the data set we are working with.
The method does this by minimizing the sum of the squared deviations of the observed values of
Y and the calculated values of Y. For example, if the distances (Y1 - Yc1), (Y2 - Yc2), (Y3 - Yc3),
(Y4 - Yc4), etc., parallel to the Y-axis, are measured from the observed data points to the curve,
then the LSBF line is the one that minimizes the following sum (see Figure III-1):
(Y1 - Yc1)² + (Y2 - Yc2)² + (Y3 - Yc3)² + ... + (Yn - Ycn)²
Figure III-1. The LSBF Line (observed values y1 ... y4 and the corresponding calculated values yc1 ... yc4, plotted against x).
In other words, the sum of the squared deviations between the observed values of Y and the
calculated values Yc is a minimum; i.e., Σ(Y - Yc)² is a minimum. The distance (Y - Yc) is the
error term or residual. Therefore, the LSBF line is the one for which:
ΣE² is a minimum, because Σ(Y - Yc)² = ΣE²
For a straight line,
Y = a + bx
and, with N points, we have
(X1, Y1), (X2, Y2), (X3, Y3), ... (Xn, Yn)
Recall that the mean of the Xs is the sum of the Xs divided by the number of observations,
ΣX/n = x̄, and the mean of the Ys is the sum of the Ys divided by the number of observations,
ΣY/n = ȳ. Now, instead of considering the mean as a point when dealing with only one
variable, we are using a line -- the LSBF regression line. Note that the LSBF line always passes
through the point (x̄, ȳ). The parameters, a and b, define a unique line with a Y-intercept of a
and a slope of b. To calculate the values needed to solve for a and b, we need a spreadsheet
(see Table III-1). For example, use the values in Table III-2.
Table III-1. Table Needed to Get Sums, Squares and Cross Products
X    Y    X*Y      X²    Y²
X1   Y1   X1*Y1    X1²   Y1²
X2   Y2   X2*Y2    X2²   Y2²
X3   Y3   X3*Y3    X3²   Y3²
-    -    -        -     -
ΣX   ΣY   Σ(X*Y)   ΣX²   ΣY²
Table III-2
X Y XY X2 Y2
4 10 40 16 100
11 24 264 121 576
3 8 24 9 64
9 12 108 81 144
7 9 63 49 81
2 3 6 4 9
36 66 505 280 974
We can substitute the table values into the equations for b and a:
b = (Σxy - n x̄ ȳ) / (Σx² - n x̄²) and a = ȳ - b x̄,
where x̄ = Σx/n = 36/6 = 6 and ȳ = Σy/n = 66/6 = 11.
Solving for b, we get:
b = (505 - 6(6)(11)) / (280 - 6(6)²)
b = 109/64
b = 1.703125
Solving for a:
a = 11 - (1.703125)(6) = 0.78125
Therefore, the regression equation (calculated y) is
Yc = 0.78125 + 1.703125x
Multiple Regression
In simple regression analysis, a single independent variable (X) is used to estimate the
dependent variable (Y), and the relationship is assumed to be linear (a straight line). This is the
most common form of regression analysis used in contract pricing. However, there are more
complex versions of the regression equation that can be used to consider the effects of more than
one independent variable on Y. That is, multiple regression analysis assumes that the change in
Y can be better explained by using more than one independent variable. For example,
automobile gasoline consumption may be largely explained by the number of miles driven.
However, it might be better explained if we also considered factors such as the weight of the
automobile. In this case, the value of Y would be explained by two independent variables.
Yc = A + B1X1 + B2X2
where: Yc = the calculated or estimated value for the dependent variable
A = the Y intercept, the value of Y when X = 0
X1 = the first independent (explanatory) variable
B1 = the slope of the line related to the change in X1, the value by which Yc changes
when X1 changes by one
X2 = the second independent variable
B2 = the slope of the line related to the change in X2, the value by which Yc changes
when X2 changes by one
Multiple regression will not be considered in depth in this Handbook. Consult a statistics
text to learn more about multiple regression.
Curvilinear Regression
In some cases, the relationship between the independent variable(s) may not be linear.
Instead, a graph of the relationship on ordinary graph paper would depict a curve. For example,
improvement curve analysis uses a special form of curvilinear regression. As with multiple
regression, consult a statistics text to learn more about curvilinear regression.
Now that we have developed the LSBF regression equations, we need to determine how
good the equation is. That is, we would like to know how good a forecast we will get by using
our equation. In order to answer this question, we must consider a check for the “goodness” of
fit, the coefficient of correlation (R) and the related coefficient of determination (R2). There are
a number of other statistics we could check which would expand upon our knowledge of the
regression equation and our assurance of its forecasting capability. Some of these will be
discussed later.
Correlation Analysis
One indicator of the “goodness” of fit of a LSBF regression equation is correlation
analysis. Correlation analysis considers how closely the observed points fall to the LSBF
regression equation we develop. The assumption is that the more closely the observed values are
to the regression equation, the better the fit; hence, the more confidence we can expect to have in
the forecasting capability of our equation. It is important to note that correlation analysis refers
only to the “goodness” of fit or how closely the observed values are to the regression equation.
Correlation analysis tells us nothing about cause and effect, however.
Coefficient Of Determination
The coefficient of determination (R2) represents the proportion of variation in the
dependent variable that has been explained or accounted for by the regression line. The value of
the coefficient of determination may vary from zero to one. A coefficient of determination of
zero indicates that none of the variation in Y is explained by the regression equation; whereas a
coefficient of determination of one indicates that 100 percent of the variation of Y has been
explained by the regression equation. Graphically, when the R2 is zero, the observed values
appear as in Figure III-2 (bottom) and when the R2 is one, the observed values all fall right on the
regression line as in Figure III-2 (top).
Figure III-2. Perfect correlation, R² = 1 (top), and no correlation, R² = 0 (bottom).
R2 tells us the proportion of total variation that is explained by the regression line. Thus
R2 is a relative measure of the “goodness” of fit of the observed data points to the regression line.
For example, if we calculate R² for a data set and find that R² = 0.70, this means that
70% of the total variation in the observed values of Y is explained by the observed values of X.
Similarly, if R2 = 0.50, then 50% of the variation in Y is explained by X. If the regression line
perfectly fits all the observed data points, then all residuals will be zero, which means that R2 =
1.00. In other words, a perfect straight-line fit will always yield R2 = 1. As the level of fit
becomes less accurate, less and less of the variation in Y is explained by Y’s relation with X,
which means that R2 must decrease. The lowest value of R2 is 0, which means that none of the
variation in Y is explained by the observed values of X. Some applications require R2 of at least
0.7 or 0.8. An R2 < 0.25, which corresponds to an R < 0.5, would never be acceptable.
Coefficient Of Correlation
The coefficient of correlation (R) measures both the strength and direction of the
relationship between X and Y. The meaning of the coefficient of correlation is not as explicit as
that of the coefficient of determination.
We can determine whether R is positive or negative by noting the sign of the slope of the
line, b. In other words, R takes the same sign as the slope; if b is positive, use the positive root
of R2 and vice versa. For example, if R2 = 0.81; then R = + 0.9 and we determine whether R
takes the positive root (+) or the negative root (-) by noting the sign of b. If b is negative, then we
use the negative root of R2 to determine R. So to calculate R we need to know the sign of the
slope of the line.
It is most important to note that R does not tell us how much of the variation in Y is
explained by the regression line. R is only valuable in telling us whether we have a direct or an
inverse relationship and as a general indicator of the strength of the association.
The learning curve equation, y = Ax^b, is graphed as a straight line on log-log graph paper,
and all the regression formulae apply to this equation just as they do to the equation y = a + bx.
In order to derive a learning curve from cost
data (units or lots) the regression equations need to be used, whether or not the calculations are
performed manually or using a statistical package for your personal computer. In this sense, the
learning curve equation is a special case of the LSBF technique.
Since in learning curve methodologies cost is assumed to decrease by a fixed percentage
each time quantity doubles, this constant is called the learning curve “slope” or percentage
(e.g., 90%). For example,
For unit #1: Y1 = A(1)^b = A (First Unit Cost), and
For unit #2: Y2 = A(2)^b = Second Unit Cost
So,
Y2/Y1 = A(2)^b / A(1)^b = 2^b = a constant, the “slope”
Note that, for a 90% curve,
Y6/Y3 = .7616/.8462 = 0.9
Any good statistical package (for instance StatView) can perform all the calculations (and
many others) shown. A quality package will let you customize your results (create presentations),
save your work, and calculate all these statistics: frequency distributions, percentiles, t-tests,
variance tests, Pearson correlation and covariance, regression, ANOVA, factor analysis and
more. Graphics and tables such as scattergrams, line charts, pie charts, bar charts, histograms,
percentiles, factors, etc., should be available to the user. Statistical analysis is greatly simplified
using these tools.
When working with the LSBF technique, there are a number of limitations, errors and
caveats of which we need to be aware. The following are some of the more obvious.
A classic example of spurious correlation is that a growing city’s population increases
both the number of public telephones and liquor sales. As analysts, we must ensure that we have
chosen appropriately related data sets and CERs, and that real cause and effect is at work. Be
careful not to find relationships when they do not exist.
Summary
In this chapter we have so far discussed the development and use of Cost Estimating
Relationships (CERs). We noted that in using or developing a CER, a high quality database is
most critical (see Chapter II). Specifically, we highlighted the difficulties of assembling a good
database for CER development and indicated several reasons why there could be a problem. We
next considered the strengths and weaknesses of a CER. Finally, we developed one simple
model for generating a CER -- the LSBF model. We completed our discussion of CER’s by
identifying several limitations, errors and caveats when using CERs. Next, we’ll consider the use
of CERs.
Basically, CER’s reflect changes in prices or costs (in constant dollars) as some physical,
performance or other cost-driving parameter(s) is changed. The same parameter(s) for a new
item or service can be input to the CER model and a new price or cost can be estimated. Such
relationships may be practically applied to a wide variety of items and services.
Construction
Many construction contractors use a rule of thumb which relates floor space to building
cost. Once a general structural design is determined, the contractor or buyer can use this
relationship to estimate total building price or cost - excluding the cost of land. For example, if
we were building a brick two-story, 2,200 square foot house with a basement, we may use
$60/square foot (or whatever value is currently reasonable for the application) to estimate the
price of the house:
$60/sq ft x 2,200 sq ft = $132,000 house price
Weapons Procurement
In the purchase of an airplane, CERs are often used to estimate the cost of the various
parts of the aircraft. One item may be the price for a wing of a certain type of airplane, a
supersonic fighter for example. History may enable the analyst to develop a CER relating wing
surface area to cost. You may find that there is an estimated $40,000 of wing cost (for instance
NRE) not related to surface area and another $1000/square foot that is related to surface area to
build one wing. For a wing with 200 square feet of surface area we could estimate a price as:
estimated price = $40,000 + 200 sq ft x $1,000 per sq ft
= $40,000 + 200,000
= $240,000
Electronics
Manufacturers of certain electronic items have discovered that the cost of completed
items varies directly with the number of total electronic parts in the item. Thus, the sum of the
number of resistors, capacitors, inductors, and transistors in a specific circuit design may serve as
an independent variable (cost driver) in a predictive CER. Assume a CER analysis indicates that
$0.57 a unit is not related to the number of components with another $.11 added per part. If
evaluation of the drawing revealed that an item was designed to contain 11 capacitors, 12
resistors, 5 transistors, and 2 inductors, the total part count becomes 30. Substituting the 30 parts
into the CER:
estimated cost = $0.57 + $.11 per part * number of parts
= $0.57 + $.11 (30)
= $0.57 + $3.30
= $3.87
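All three examples above (construction, wing, electronics) are instances of the same linear CER form, cost = fixed + rate × driver, which can be captured in a tiny helper (a sketch; the figures come from the examples in the text):

```python
def linear_cer(fixed, rate, driver_value):
    """Apply a linear CER: cost = fixed + rate * driver_value."""
    return fixed + rate * driver_value

house = linear_cer(0, 60, 2200)         # $60/sq ft construction rule of thumb
wing = linear_cer(40_000, 1_000, 200)   # wing NRE plus $1,000 per sq ft of surface
board = linear_cer(0.57, 0.11, 30)      # $0.57 fixed plus $0.11 per electronic part

print(house, wing, round(board, 2))     # 132000 240000 3.87
```

Only the fixed term, the rate, and the cost driver change from one application to the next.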
Technology “turns over” today every two years. Evaluating the price effects of applying new
technology to the current requirement demands user expertise and knowledge. The cost impact
of technology surges is difficult to anticipate and quantify.
Estimate Reliability
In considering whether or not to use a specific estimate, we need to examine the “track
record” of the cost estimator or the organization providing the estimate. If, in the past, the
estimates have been close to actuals, greater reliance may be placed on the estimate. If estimates
have been significantly above or below known actuals, then lower reliability should be placed on
current estimates. Knowing the reliability of past estimates does not free the cost or price analyst
from the obligation to review the estimate and the estimating methodology as they relate to
proposal accuracy.
Cost and price analysis should be tempered with product value. Knowledge of the
product, its functions and its use is essential for sound contract pricing. Value analysis is a
systematic and objective evaluation of the function of a product and its related costs. Its purpose
is to ensure optimum value. Questions that help in the evaluation are:
1. What must the product do?
2. What does it cost now? What does it cost to operate and maintain?
3. In what other ways can the function be performed?
4. What will these alternatives cost?
Two Examples Of CER Use
One can utilize Consumer Price Index numbers to perform simple value or price analysis.
Index numbers are, quite simply, CERs that predict, from history, the effects of inflation. For
example, assume an item of equipment in 1980 cost $28,000 when an appropriate Consumer
Price Index (CPI) was 140. If the current index is now 190 and an offer to sell the equipment for
$40,000 has been suggested, how much of the price increase is due to inflation? How much of
the price increase is due to other factors?
Solution: Px/Py = Ix/Iy
$28,000/Py = 1.40/1.90
1.4 Py = 1.9 x $28,000 = $53,200
Py = $53,200/1.4 = $38,000
$38,000 now is roughly the equivalent of $28,000 in 1980. Hence the price difference due to
inflation is $38,000 - $28,000 = $10,000. The difference due to other causes is $2,000
($40,000 - $38,000).
The above example illustrates the use of CPI numbers for a material cost
analysis. The steps were:
1. If we know what the price of an item was in the past, and we know the index
numbers for both that time period and today, we can then predict what the price of
that item should be now based on inflation alone.
2. If we have the same information as above, and we have a proposed price, we can
compare that price to what it should be based on inflation alone. If the proposed
price is higher or lower than we expect with inflation, then we must investigate
further to determine why a price or cost is higher or lower.
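The two steps above can be sketched as a small routine (the figures are from the example in the text; the index values are illustrative):

```python
def inflation_split(past_price, past_index, current_index, offered_price):
    """Split a price increase into inflation and 'other causes' via index numbers."""
    expected_now = past_price * current_index / past_index  # price due to inflation alone
    inflation_part = expected_now - past_price
    other_part = offered_price - expected_now
    return expected_now, inflation_part, other_part

expected, infl, other = inflation_split(28_000, 140, 190, 40_000)
print(expected, infl, other)   # 38000.0 10000.0 2000.0
```

Any positive value of `other_part` is the portion of the increase that requires further investigation.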
Consider the purchase of a house as another example that uses the LSBF technique.
Historical data for other homes purchased may be examined during an analysis of proposed
prices for a newly designed house. Using this data, we can demonstrate a procedure for
developing a CER.
Step 1: Designate and define the dependent variable. In this case we will attempt to
directly estimate the cost of a new house.
Step 2: Select item characteristics to be tested for estimating the dependent variable. A
variety of home characteristics could be used to estimate cost. These include
such characteristics as: square feet of living area, exterior wall surface area,
number of baths, and others.
Step 3: Collect data concerning the relationship between the dependent and
independent variable. The example, Table III-3, shows data collected on five
house plans so that we can determine a fair and reasonable price for a new house
with 2,600 square feet of living area and 2.5 baths.
Table III-3.
House Model   Unit Cost   Baths   Sq. Ft. Living Area   Sq. Ft. Exterior Wall Surface
Burger        166,500     2.5     2,800                 2,170
Metro         165,000     2.0     2,700                 2,250
Suburban      168,000     3.0     2,860                 2,190
Executive     160,000     2.0     2,440                 1,990
Ambassador    157,000     2.0     1,600                 1,400
New House     Unknown     2.5     2,600                 2,100
Step 4. Explore the relationship between the independent and dependent variables. As
stated earlier, analysis of the relationship between the item characteristics and the dependent
variable may be performed using a variety of techniques.
Step 5. Determine the relationship that best predicts the dependent variable. Figure III-4
graphically depicts the relationship between the number of baths in the house and the price of the
house. From the graph, there appears to be a relationship between the number of baths and house
price. The relationship, however, may not be a good estimating tool, since three houses with a
nearly $8,000 price difference (about 5 percent of the most expensive house) have the same number of
baths.
Figure III-4. House Price ($ Thousands) vs. Number of Baths.
Figure III-5 graphically relates square feet of living area to price. In this graph, there
appears to be a strong linear relationship between house price and living area.
Figure III-5. House Price ($ Thousands) vs. Square Feet of Living Area.
Figure III-6. House Price ($ Thousands) vs. Exterior Wall Surface (Sq. Ft.).
Based on this graphic analysis, it appears that square feet of living area and exterior wall
surface have the most potential for development of a cost estimating relationship. We may now
develop a “line-of-best-fit” graphic relationship by drawing a line through the average of the x
values and the average of the y values and minimizing the vertical distance between the data
points and the line (see Figure III-7 and Figure III-8).
Viewing both these relationships, we might question whether the Ambassador model
data should be included in developing our CER. In developing a CER, you need not use all
available data if all data is not comparable. However, you should not eliminate data just to get a
better looking relationship. In this case, we find that the Ambassador’s size is substantially
different from the other houses for which we have data and the house for which we are
estimating the price. This substantial difference in size might logically affect the relative
construction cost. The trend relationship in Figure III-8 and Figure III-9, using the data for the
four other houses, would be substantially different than relationships using the Ambassador data.
Based on this information, you might decide not to consider the Ambassador data in CER
development.
Figure III-8. Linear Trend of Cost to Exterior Wall Surface (Sq. Ft.)
If you eliminate the Ambassador data, you find that the fit of a straight line relationship of
price to the exterior wall surface is improved. For the relationship of price to square feet of
living area, you find a close relationship, i.e., almost a straight line. (See Figure III-9)
Figure III-9. Linear Trend of Cost to Square Feet of Living Area (for example, the Metro model plots at 2,700 sq ft, $165,000).
If you had to choose one relationship, you would probably select square feet of living area
(Figure III-9) over the relationship involving exterior wall surface because
there is so little variance shown about the trend line. If the analysis of these characteristics did
not reveal a useful predictive relationship, you might consider combining two or more of the
characteristics already discussed, or exploring new characteristics. However, since the
relationship between living area and price is so close, we may reasonably use it for our CER.
In documenting our findings, we can relate the process involved in selecting the living
area for price estimation. We can use the graph developed as an estimating tool. The cost of the
house could be calculated by using the same regression analysis formulas discussed earlier, Yc = a + bx.
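As a sketch, the living-area CER can be fitted with the same LSBF formulas, using the Table III-3 data with the Ambassador model excluded as discussed above (the resulting figure is an illustration, not a Handbook value):

```python
# LSBF fit of house price to square feet of living area (Ambassador excluded).
area = [2800, 2700, 2860, 2440]               # sq ft living area
price = [166_500, 165_000, 168_000, 160_000]  # unit cost, dollars

n = len(area)
x_bar = sum(area) / n
y_bar = sum(price) / n
b = (sum(x * y for x, y in zip(area, price)) - n * x_bar * y_bar) / \
    (sum(x * x for x in area) - n * x_bar ** 2)
a = y_bar - b * x_bar

estimate = a + b * 2600   # the new 2,600 sq ft house
print(round(estimate))    # 163005
```

The fitted slope (roughly $18.70 per square foot of living area) prices the new house at about $163,000, consistent with the graphic analysis.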
CERs, like most other tools of cost or price analysis, must be used with judgment. Judgment is
required to evaluate the historical relationships in the light of new technology, new design, and
other similar factors. Therefore, a knowledge of the factors involved in CER development is
essential to proper application of the CER. Blind use of any tool can lead to disaster.
COMMON CERs
Table III-4 lists some common CERs used to predict prices or costs of certain items. In
addition to CERs used for estimating total cost and prices, others may be used to estimate and
evaluate individual elements of cost. CERs, for example, are frequently used to estimate labor
hours. Tooling costs may be related to production labor hours, or some other facet of production.
Other direct costs may be directly related to the labor effort involved in a program.
How good is a CER equation and how good is the CER likely to be for estimating the
cost of new projects? What is the confidence level of answers at +/- x% from the number
estimated, i.e., how likely is the estimated cost to fall within a specified range of cost outcomes?
First, certain necessary conditions for a statistical analysis of a CER need to be stated:
1. There are more data points than coefficients to be estimated.
2. Error terms do not form a systematic pattern.
3. The independent variables are not highly correlated.
4. The form of the equation to be estimated is linear or has been translated into a
linear form using logarithms.
5. The model makes sense from an economics and technical point of view.
The “F” statistic measures the ratio of the variation “explained” by the explanatory variables
(cost drivers) to the “residual” (error) variation. The F statistic should have a value greater than
4.0 or 5.0 to indicate that a good cost driver has been selected for the cost model and that the
form of the equation is acceptable. (A value of 1.0 indicates that the cost driver explains only
half of the variation in the cost; this would not be a particularly good cost driver variable.) The higher the F
value the better the prediction capability of the cost drivers. Also, the higher the “F” statistic, the
higher will be the R2 value.
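For a simple (one-driver) linear regression, the link between F and R² can be made explicit (a sketch; the R² value is the one computed earlier for the Table III-2 data):

```python
def f_statistic(r_squared, n):
    """F for a simple linear regression: explained vs. residual mean squares."""
    return (r_squared / (1.0 - r_squared)) * (n - 2)

# Table III-2 example: R-squared of about 0.7486 with n = 6 observations.
f = f_statistic(11881 / 15872, 6)
print(round(f, 2))   # 11.91 -- comfortably above the 4.0-5.0 rule of thumb
```

The formula shows directly why a higher F implies a higher R², and vice versa, for a fixed sample size.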
“Partial” F statistics can be used to examine the contribution of a single cost driver term.
The higher the value, as in the “F” statistic, the better the additional contribution of the particular
cost driver. See a statistics text or Appendix D for a more detailed explanation of the “F”
statistics.
The “t” statistic can be used to test the validity of adding a particular cost driver
variable. First, a “null” hypothesis is made that the cost parameter adds no predictive value to
the model (cost equation). That is, the value of the parameter for the cost driver term being
reviewed has a value of 0 (zero). The “t” test is used to make a decision to accept or reject the
“null” hypothesis for a given cost driver term. Generally, a “t” value greater than five leads to
the conclusion to reject the null hypothesis. A “t” value less than five leads to the acceptance of
the null hypothesis, that the cost term does not add predictive value to the CER. Each case to
accept or reject the null hypothesis depends upon the difference between the hypothesized and
estimated coefficients, the confidence interval desired, and the degrees of freedom of the data
(number of data observations minus the numbers of parameters being estimated).
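The “t” value for the slope of the Table III-2 regression can be computed directly (a sketch; the decision to accept or reject the null hypothesis then compares this value with the chosen critical value):

```python
import math

xs = [4, 11, 3, 9, 7, 2]
ys = [10, 24, 8, 12, 9, 3]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

sxx = sum((x - x_bar) ** 2 for x in xs)                            # 64
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sxx   # 1.703125
a = y_bar - b * x_bar

sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))  # residual sum of squares
s2 = sse / (n - 2)                                         # residual variance
se_b = math.sqrt(s2 / sxx)                                 # standard error of the slope

t = b / se_b     # tests the null hypothesis that the slope is zero
print(round(t, 2))   # about 3.45
```

With only four degrees of freedom here, whether t ≈ 3.45 is judged significant depends on the confidence level chosen, as the text notes.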
The “t” statistic is also used for other applications; for example, to determine whether
two groups of data are from the same population or from two different populations.
When the degrees of freedom in a data set approach 30, the statistics of the “t”
distribution approach the normal distribution. If it is not known whether a normal distribution is
justified, the “law of large numbers” can be invoked, which states that for a large enough sample
(large enough cost database), the error term involved in estimating cost will approach a normal
distribution. That is, the normal distribution can be used instead of the “t” distribution to test the
null hypothesis. The “t” distribution is similar in shape to the normal distribution, except that
there is a larger probability of lower cost or higher cost (extreme outcomes) associated with the
“t” distribution than with the normal distribution.
Also, an analysis of a plot of the residual values can be useful. If a pattern exists, then the
correlation may be explained by other factors. If the plot of residuals is a scatter with no
pattern, then the CER equation may be good, if other factors are favorable.
Another important statistical measure is the bandwidth or confidence interval associated
with the application of the CER cost estimates. The bandwidth of the cost estimate depends
upon the confidence interval required or desired, the parameter value, and the degrees of freedom
of the data.
See Appendix D for a more detailed explanation of the “t” distribution and confidence
intervals.
CER Analysis
Cost estimating relationships (CERs) relate cost to some other program element in a
definite way. Examples of CERs are per diem rates, “shop supplies”, sales tax estimates, etc.
CERs supposedly relate one cost to another cost or to a well defined parameter. When CERs are
rolled into an interlocking algorithm, analysts have to probe both the estimate and the underlying
data used to develop a CER. What distinguishes a CER from a conventional estimating approach is that
CERs define a general relationship based on a set of data rather than a specific relationship based
on a direct precedent. A CER may be less precise than a conventional estimating method, but the
cost savings resulting from the CER approach may be worth the potential loss of precision.
Within detailed cost estimates, CERs may be used for estimating small or derivative cost
elements. CERs are also commonly used for budgetary estimates, “rough order of magnitude”
estimates, and simple cost-benefit calculations when the preliminary or uncertain nature of the
project discourages a costly estimating effort. However, well built cost models in the hands of a
professional can be better cost predictors than detailed methods because judgement and other
biases are better controlled.
General Features
A CER can be a functional relationship between one variable and another and may
represent a statistical relationship between some well defined program element and some specific
cost. Since most company overhead rates are percentages of direct labor expense, these are
CERs. Computer and travel costs often show statistical relationships to engineering costs, design
is frequently closely correlated to drafting costs and these, in turn, to the number of drawings,
parts, size, weight, etc. Many costs can be related to other costs or non-cost variables in some
fashion but not all such relationships can be turned into a CER.
A CER must have two characteristics. First, it should link in some rational way the
underlying independent variable and the cost being developed. Second, the CER
should have a strong statistical fit (R2) and a tight confidence interval with the basis element.
Evaluating an Estimate
CERs are used in lieu of a direct substantive link between a cost element and some basis
of estimate (BOE). Since CERs are developed from collections of cases and represent average
values, CERs have uncertainty built into them. (A direct BOE to cost estimate extrapolation is
preferred if the cost element is significant, if a good BOE can be found, and if some well defined
extrapolation can be postulated between the BOE and the cost element.) The first step in an
analysis of a CER influenced estimate is to identify how much of the total estimate is CER
driven.
The second step in analyzing a proposal using CERs is to evaluate the CERs themselves.
Since CERs supposedly relate an element of cost to some variable or factor, the analyst must
determine whether the relationship is truly functional and statistical. If the CER is a factor
implied by a functional relationship, the analyst needs an explanation of the function and support
for the assertion of a relationship. Both deterministic and statistical support are required. In
other words, does the relationship make logical sense and is the pattern of influence regular
enough to be helpful? Base data must be available for examination, preferably original,
“unsmoothed” data. Again, the purpose of a CER is to save time and effort. If the amount of the
proposal affected by CERs is not great, the evaluation effort applied to the CERs should be
correspondingly modest.
In a “worst case” situation, the analyst may have to backtrack to the original data set
used to develop a CER. In that case the analyst should attempt to see if all the relevant cases have
been included and no “cherry picking” has occurred. In other words, what “risk” is involved by
using the CER?
Assuming the original data set is available and complete, the developer of the CER must
explain the theory of the relationship and the data processing performed. If “outliers” were
excluded, the estimator needs to explain why. If the explanation of the exclusion is
unsatisfactory, the analyst may want to develop a set of CER’s with the outliers included.
Ordinarily, outliers affect the deviation of the estimate rather than the value of the CER, but it is
useful to check. If several data points have been excluded and if these influence the CER mean
and standard deviation significantly, the CER may not be operationally useful even if theoretically
valid. To illustrate, suppose a relationship is identified between variable K and cost variable C.
Suppose the CER is 7 x K = C. If the value 7 is developed from an arithmetic average of a dozen
values but three “outliers” have been excluded, then inclusion of the outliers may spread out the
sample standard deviation to the point that confidence in the relationship may become suspect.
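The effect in the 7 x K = C illustration can be sketched with the Python standard library; the ratio values below are hypothetical numbers chosen to cluster near 7, with three outliers added:

```python
import statistics

# Hypothetical data for the 7 x K = C illustration: a dozen observed
# C / K ratios clustered near 7, plus three excluded "outliers".
core = [6.8, 7.1, 6.9, 7.2, 7.0, 6.9, 7.1, 7.0, 6.8, 7.2, 7.0, 7.0]
outliers = [11.5, 2.9, 12.4]

mean_ex, sd_ex = statistics.mean(core), statistics.stdev(core)
mean_in, sd_in = statistics.mean(core + outliers), statistics.stdev(core + outliers)
print(round(mean_ex, 2), round(sd_ex, 2))  # excluded: mean near 7, tight spread
print(round(mean_in, 2), round(sd_in, 2))  # included: similar mean, far wider spread
```

The mean barely moves, but the sample standard deviation widens dramatically, which is exactly the effect the text describes.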
The CER estimator should be able to supply the original data set and his/her analysis. The cost
analyst may need to replicate the estimate to verify the calculation. If this is done, the CER
statistics should be examined for:
1. The sample size. Confidence in the estimate will increase as the sample size
increases.
2. The standard error of the mean (or of the point estimate), which should be shown
along with the calculated mean.
3. The standard deviation of the sample set. What is the range of the majority of the
data points? Confidence in the estimate increases as more and more data points
fall within a specified range.
4. If the CER is developed from a correlation calculation, the cost analyst can
examine the coefficient of correlation. Correlation suggests a link between the two
factors, but the relationship may be coincidental. Standard statistical tests exist for
checking the likelihood that a given correlation coefficient arose by chance and
should be used if the sample set is small, or if “R” is less than .8 (R2 = .64).
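The standard test mentioned in item 4 can be sketched as follows: compute the correlation coefficient r and the statistic t = r * sqrt(n - 2) / sqrt(1 - r^2), which is compared against a t-table at n - 2 degrees of freedom. The sample data below are hypothetical.

```python
import math

def pearson_r(xs, ys):
    """Sample correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """Test statistic for H0: no correlation, with n - 2 degrees of freedom."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

# Hypothetical sample of six (driver, cost) observations.
xs = [1, 2, 3, 4, 5, 6]
ys = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2]
r = pearson_r(xs, ys)
t = t_statistic(r, len(xs))
print(round(r, 4), round(t, 1))  # compare t against a t-table at 4 dof
```

If the computed t exceeds the tabled critical value, the correlation is unlikely to be coincidental.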
The last step in evaluation of a CER is calculating the effect which reasonable variations
on the CER value can have on the estimate. If reasonable variations on the CER impact the
estimate greatly, the analyst has to be skeptical of the explanatory power of the CER. The effect
of this is to widen the actual range of an estimate.
If the CERs represent a cost-effective response to an estimating problem and if they are
rationally developed and solidly based, the CERs are valid and accurate tools for an estimate.
Assuming no “show stopper” problems are uncovered, the analyst can accept the concept of the
CERs and apply such margins of variance as seem reasonable.
This chapter has presented the concept of Cost Estimating Relationships (CERs) and the
statistical underpinnings of CER development and application. The basic mathematical
relationships were described and examples showing the use of CERs were also presented. A flow
chart of the CER development process is shown in Figure III-10.
FIGURE III-10. CER Development Process: CERs are selected (e.g., C = aX + bY), validated,
and approved; maintained in a CER database; revalidated when necessary; and provided to cost
models.
CHAPTER IV
INTRODUCTION
Hardware cost is defined as all program cost, excluding software development, for all
components, assemblies, subsystems, and systems for sea, ground, airborne or space applications.
Hardware cost includes: mechanical, electromechanical, electrical/electronic, structural or
pneumatic equipment, and may also include system test, or system operations. It is defined as the
cost to develop, design, build, test, launch, and operate a system.
Hardware cost estimating approaches are categorized into five basic methods: (a)
parametric, (b) analogy, (c) grassroots, (d) other (methods such as standard time or expert
consensus approaches), and (e) combinations of the other forms. The first three are well recognized
and clearly defined. The last two "methods" cover everything else. This section will focus on
parametric hardware cost modeling techniques.
Parametric estimating is the mathematical procedure or process in which product or service
descriptors directly yield consistent cost information. A parametric cost model is an estimating
system comprised of Cost Estimating Relationships (CERs) and other parametric estimating
functions, e.g., cost quantity relationships, inflation factors, staff skills, schedules, etc. Parametric
cost models yield product or service costs at designated Cost or Work Breakdown Structure
(CBS/WBS) levels and may provide departmentalized breakdowns of generic cost elements. A
parametric cost estimate provides a logical and repeatable relationship between input variables and
the resultant costs.
Hardware parametric cost estimating techniques are widely utilized by industry and
government. The use of these techniques expedites the development of price proposals by the
contractor, evaluation by the government, and subsequent negotiation.
The simplest form of parametric estimating tool is a CER. In this form a relationship
between cost and non-cost variables is investigated to determine any possible cause and effect
linkages. A statistical analysis is performed on representative samples of a historical database to
validate these linkages. The analysis is usually based on the investigation of the relationship
between two or more variables. The known variable (physical parameter, schedule, etc.) is called
the independent variable; the variable we are trying to predict, the cost of the item, is the dependent
variable. The resultant CER is an expression of the mathematical relationship between parameters
characterizing the item to be estimated and its cost. The strength of CERs lies in their relative
simplicity and the convenience they afford in predicting costs.
The primary limitation of CERs is that they will yield reliable cost estimates only within
the limits of the spread of independent variables that were used when the CER was developed. In
high technology areas, the limits of input parameters (independent variables) might go outside the
boundaries of the historical database. Therefore, the CERs must either be re-examined and updated
as the historical database grows, or their use must be restricted and applied appropriately by
properly trained professionals.
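One simple way to enforce that limitation is to carry the historical data span along with the CER and refuse (or at least flag) any request outside it. The class, coefficients, and limits below are illustrative only:

```python
# Guard a CER so it is applied only inside the span of the historical
# database that produced it. Coefficients and limits are illustrative.
class RangeLimitedCER:
    def __init__(self, slope, intercept, lo, hi):
        self.slope, self.intercept = slope, intercept
        self.lo, self.hi = lo, hi  # span of the driver in the historical data

    def estimate(self, x):
        if not (self.lo <= x <= self.hi):
            raise ValueError(
                f"driver value {x} is outside the calibrated range "
                f"[{self.lo}, {self.hi}]")
        return self.intercept + self.slope * x

cer = RangeLimitedCER(slope=5.0, intercept=1.1, lo=10, hi=80)
print(cer.estimate(45))  # inside the historical span
# cer.estimate(200) would raise, flagging an extrapolation for review.
```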
A more sophisticated form of parametric estimating is the computerized Parametric
Estimating Model. These models estimate costs on a broader scale and are more versatile and
much less restricted by the acceptable boundaries of individual CERs.
With parametric estimating models, system acquisition costs can be derived that are based
upon parameters such as quantity, size, weight, power consumption, environmental specification,
complexity and type of electronic packaging, etc. Some models can also provide schedule
information, or evaluate specified schedules in terms of costs. Parametric estimating models can
provide early cost measurement of system concepts with a limited description of physical or other
parameters. They are also frequently utilized for independent assessment of conventionally
prepared cost estimates.
Numerous "home grown" parametric cost models exist throughout industry and government
to cover a specific range or type of products or systems. There are also universal parametric
estimating systems that generate, internal to the model, appropriate expressions of CER's for a
broad range of products or systems. These models perform a mathematical extrapolation of past
experience in order to predict cost.
Computerized parametric cost estimating models often contain many mathematical
equations relating the input variables to cost. Each set of input parameters uniquely defines the
item of interest for cost modeling.
Universal parametric (i.e., generic) cost estimating models usually require calibration.
Considering the variety of products, the differences in accounting methods, etc., the calibration
process "fine tunes" these models to emulate the producer's organizational and product
characteristics. The calibration process simulates past product history and organizational traits.
Parametric estimating systems (models) exist to rapidly compute hardware development
and design costs, manufacturing costs of prototypes and production units along with all the
manufacturing support costs. Models are also available for computing operation and support costs
of hardware in the field. Also, models used for the development and maintenance costs of
computer software are available.
A model is a representation of a real-life system. In the case of cost estimating, a parametric
model is a system of data, equations, and inferences presented in mathematical form to represent or
describe the cost of an item. It relates knowns (system descriptions or parameters) to unknowns
(cost of systems). A model can be as simple as a single equation (CER), or as complex as an
inter-relation of many equations and functions. It can also be designed to estimate items as small as
modules or components, or as large as subsystems, systems, total programs, or large development
activities.
A Hardware Cost Model provides estimates of system acquisition costs based upon
quantitative parameters such as complexity, quantity, weight, and size; and qualitative parameters
such as environmental specification, type of packaging, and level of integration; and schedule
parameters such as months to first prototype, manufacturing rate, and amount of new design.
Hardware cost parametrics bring speed, accuracy, and flexibility to the cost estimating process.
Early cost measurement of concepts is crucial to a new venture since there is little opportunity to
change program costs significantly once a design has been detailed. Parametric cost models have
been developed to operate with limited concept description so that many alternatives can be costed
before designs and bills of material are finalized. Parametrics can also be used extensively as the
basis of a cost estimate in preparation of proposals as well as for the independent assessment of
conventionally prepared cost estimates.
of integration and test costs which are attributed to each level of integration should also be
estimated.
When a parametric model calculates a cost for manufacturing, it does not use a parts list and
labor resource build up, but rather a parametric representation of the parts and labor costs.
Fundamental input parameters to typical parametric models are listed below:
* Functional design parameters.
* Quantities of equipment to be developed, produced, modified, purchased, furnished, and
integrated and tested.
* Applications (technology of materials and processes) of structural and electronic
portions of the hardware.
* Hardware geometry consisting of size, weight of electronic and structural elements, and
electronic packaging density.
* Amount of new design required and complexity of the development engineering task.
* Operational environment and specification requirements of the hardware.
* Schedules for development, production, procurement, modification, and integration and
testing.
* Fabrication process to be used for production.
* Yield considerations for hardware development.
* Pertinent escalation rates and mark-ups for General and Administration charges, profit,
cost of money, and purchased item handling.
The model’s input parameters uniquely define the hardware for cost estimating and
modeling. The resultant cost output is determined from the model's mathematical equations alone.
Cost may be estimated with a minimal amount of hardware information. This feature
makes models a legitimate tool for cost estimation of programs in the conceptual stage of
development. Of course, it is always preferable to provide as much information to the models as
possible. In this way, statistical uncertainty associated with the input variables is reduced.
The modeling activity is basically a conversation between the model and the cost analyst.
Parametric data representing the hardware to be costed is formulated, and the analyst calls upon
embedded text, tables, and utilities of the model to help create and store the data used to drive the
model. In the course of using a parametric model, the analyst controls the output formats, the
ability to perform sensitivity and risk analysis, and of course the data that are used to estimate the
cost of the hardware. The data elements created during a session usually represent systems or
sub-systems composed of many separate sub-assemblies. For example, an aircraft might be
represented as a system composed of an airframe, propulsion units, avionics, air to ground controls,
flight controls, and ordnance. In turn, the flight controls might be a sub-system composed of
instrumentation, radar, and various processors. At an even lower level, we might find a digital
processor composed of input/output circuits, data storage memory, program memory, logic circuits,
power supply, and chassis and housing. The number of hardware elements and relative detail of the
parametric information in each is determined by the analyst. There is no limit to the level of
detailed data used for an analysis as long as the cost model contains historically verifiable
parametric cost relationships at the desired level. Nor is there a requirement that precludes a
parametric user from treating the entire aircraft as a basic assembly of the lowest order. Thus, an
analysis might be accomplished with one data set representing the aircraft as an assembly, or it
might be attained with many elements representing the aircraft as a system of sub-systems,
assemblies, and sub-assemblies. The amount of detail provided to the model is purely at the
discretion of the analyst.
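The system / sub-system / assembly decomposition just described can be sketched as a simple WBS tree whose leaf costs roll up to the system level; the element names mirror the aircraft example, and the costs are illustrative only.

```python
from dataclasses import dataclass, field

@dataclass
class WBSElement:
    name: str
    own_cost: float = 0.0                     # cost estimated at this node ($K)
    children: list = field(default_factory=list)

    def total(self) -> float:
        """Roll up this element's cost plus all lower-level elements."""
        return self.own_cost + sum(c.total() for c in self.children)

processor = WBSElement("digital processor", children=[
    WBSElement("input/output circuits", 40.0),
    WBSElement("memory", 55.0),
    WBSElement("logic circuits", 70.0),
    WBSElement("power supply", 25.0),
    WBSElement("chassis and housing", 15.0),
])
flight_controls = WBSElement("flight controls", children=[
    WBSElement("instrumentation", 120.0),
    WBSElement("radar", 300.0),
    processor,
])
aircraft = WBSElement("aircraft", children=[
    WBSElement("airframe", 2500.0),
    WBSElement("propulsion", 1800.0),
    WBSElement("avionics", 900.0),
    flight_controls,
])
print(aircraft.total())
```

The same tree could equally well be a single node representing the whole aircraft; the depth is at the analyst's discretion, as the text notes.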
Parametrics can estimate costs for the development and production phases of the acquisition
process. Outputs are categorized by such cost elements as: drafting, design, project management,
system engineering, data, manufacturing, and tooling and test equipment, etc.
Some parametric models contain the effects of phased interactions between engineering and
manufacturing. In addition to considering a normal performance period, estimates of the cost
impacts due to accelerated or protracted engineering schedules or due to an operation plan that
requires stops and restarts of production effort can also be computed.
A comprehensive parametric model should allow an analyst to:
* Calibrate (Automated is best)
* Estimate the cost of multiple lot production
* Calculate manufacturing costs of non-homogenous assemblies
* Determine the cost impact of compressing or extending development or production
schedules.
* Estimate the cost impact of the development schedule (concurrency or lapse) on
production.
* Perform Cost Risk Analysis
A graphic depiction of hardware modeling is shown in Figure IV-1.
FIGURE IV-1. Hardware Modeling: the hardware model applies common parameters (e.g.,
weight/volume) in the comparative evaluation of new requirements in light of analogous
histories.
Hardware modeling normally entails stringing together numerous CERs using an
appropriate mathematical logic. The math logic is derived during the calibration process, and
allows the math model to emulate the "real world" of program history or to other supportable data
or estimates. When the "real world" model is developed, then the new requirements can be
evaluated based on the calibrated model. Modeling offers significant advantages to the estimator in
that once CER development and the calibration process is complete, cost estimating is quick,
repeatable, and cost effective.
For instance, the first unit production missile cost (T1) may be modeled as
T1 = [(Structure CER) + (Propulsion CER) + (G&A CER)] * 1.15 (integration factor)
Then,
Total Missile H/W Cost = T1 * Q^b
where Q = quantity and b is a cost improvement exponent.
Also, for systems engineering and program management we may get
S/E + P/M = .10 * (Missile H/W Cost)
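These illustrative equations can be sketched directly; the CER outputs and the improvement exponent b below are hypothetical stand-ins, while the 1.15 integration factor and the 10% S/E + P/M factor come from the text above.

```python
# Sketch of the illustrative missile model above. CER outputs and the
# improvement exponent b are hypothetical; the 1.15 integration factor
# and 10% S/E + P/M factor follow the text.
structure_cer, propulsion_cer, ga_cer = 420.0, 610.0, 95.0  # $K, assumed

T1 = (structure_cer + propulsion_cer + ga_cer) * 1.15  # first unit cost

def missile_hw_cost(quantity, b=0.95):
    # b < 1 reflects cost improvement as quantity grows (T1 * Q^b).
    return T1 * quantity ** b

def se_pm_cost(quantity, b=0.95):
    # Systems engineering + program management as 10% of hardware cost.
    return 0.10 * missile_hw_cost(quantity, b)

print(round(T1, 2))
print(round(missile_hw_cost(100), 1), round(se_pm_cost(100), 1))
```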
Models can also be used to estimate non-recurring development costs. These costs would
be estimated using parameters such as the number of prototypes, number of tests, the amount of
new design and design repeat, design complexity factors, and others.
A simplified parametric cost estimating process is shown in Figure IV-2. The process
connects the parametric model to a post processor that in turn converts the parametric model
outputs into typical cost proposal reports. Ultimately, the customer needs proposal reports in
standardized formats so that competing contractors can be conveniently compared.
Just as the value of a house, when it is evaluated for a mortgage or a refinancing, is
calculated using parametric approaches (square feet, # rooms, zip code, # bathrooms, etc.), the
complex products we normally deal with can be evaluated in exactly the same manner. Parametric
modeling can greatly simplify the cost estimating and cost analysis processes.
FIGURE IV-2. Simplified Parametric Cost Estimating Process: parametric model outputs are
converted by a post processor into cost-by-WBS proposal reports.
Parametric cost models exist that are designed to provide quick and reliable estimates for
development and production costs of individual custom micro-circuit chips and/or electronic
modules. These models provide the capability to estimate costs and schedules associated with the
development and production of modular electronics, custom Application Specific Integrated
Circuits (ASICs), hybrid or MMIC packages, common components, and system modules. In view
of the significant cumulative expenses for custom chips, these models are invaluable for
procurement or product planning. Models can assist analysts in future product planning and
incorporate state of the art technology in the cost estimating process.
Using a WBS structure, analysts can use models to form cost estimates for individual
micro-circuits (chips), the integration of these chips, and additional electronic components on
electronic modules. Where the situation warrants, these chips may be combined on electronic
modules with additional components. The final electronic module may then be integrated into the
WBS used by the hardware models.
Micro-circuit model input parameters are determined by physically describing the
micro-circuit, other electronic components, and the electronic module. A wide range of custom
chips can be processed through use of generic input parameters which include:
* Number of pins
* Number of transistors and/or gates
* Die size
* Type of packaging
* CAD sophistication
* Amount of new design
* Manufacturing yield
* Development/Production schedules
Inputs for electronic modules include:
* Module size
* Number of sides used for component mounting
* Technology
* Function (Memory, Analog, etc.)
* Quantity produced and/or developed
Components as defined for use within these models may be custom chips as well as
off-the-shelf items. The amount of detail about the components may be very limited or as detailed
as available information will permit. A component database file may be used to catalog standard
components that are used on several different electronic modules. This file may contain
information about the standard components including the cost, if known. Whenever a component
from a database file populates a module, it can be referred to by name with only the number of
components per module required as additional information. The analyst should be allowed to
provide as much cost information as is available. This information may contain component cost,
board cost, and/or assembly and test costs. Any costs not provided will then be estimated by the
model.
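A sketch of this lookup-with-fallback behavior (component names, costs, and the fallback CER are all hypothetical):

```python
# A component database file with model fallback. Names, costs, and the
# fallback CER are hypothetical.
component_db = {
    "1553 bus transceiver": 85.0,  # $ per part, known from purchase history
    "SRAM 64K": 32.0,
}

def fallback_estimate(pin_count):
    # Stand-in parametric estimate used when no catalog cost was provided.
    return 5.0 + 0.75 * pin_count

def component_cost(name, quantity_per_module, pin_count=0):
    unit = component_db.get(name)
    if unit is None:
        unit = fallback_estimate(pin_count)  # model estimates what is missing
    return unit * quantity_per_module

print(component_cost("SRAM 64K", 4))                    # catalog cost
print(component_cost("custom ASIC", 1, pin_count=144))  # model fallback
```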
Output formats will depend on the item modeled. For a custom micro-circuit, costs might
be estimated for:
Development engineering costs such as:
* Chip specification,
* Chip design,
* Systems engineering,
* Project management,
* Data,
* Prototype manufacturing
For electronic modules, some of the costs that are estimated are:
* Development engineering:
* Drafting
* Design
* Systems engineering
* Project management
* Data
* Prototype manufacturing
* Tooling/Test Equipment
As with the hardware models, micro-circuit models have the capability to be "fine tuned" to
directly reflect a particular company and/or product through the process of calibration. In this
sequence of operation, the model uses actual cost data as inputs and provides development and
manufacturing indices directly related to the historical costs and specifications. These calibrated
indices can then be used as the basis for estimates of proposed custom micro-circuits and/or
modules.
Life cycle cost is defined as the cost associated with all the phases of a program: design,
development, prototype, production, and maintenance and operations (M&O or O&S). Models are
available to estimate life cycle costs of a variety of hardware programs. With appropriate
information, the life cycle costs of hardware systems can be evaluated. Through integration with
the WBS structure, cost factors for different equipments are calculated in order to estimate life
cycle costs. Moreover, the models provide for tailoring analyses to fit a variety of maintenance
concepts and supply systems in order to reflect specific programs and user organizations. After
the inputs of a set of descriptors, the models produce categories of life cycle cost in each of three
phases: development, production, and support.
In a typical application, analysts can instruct the model to select the most cost effective of
numerous built-in maintenance concepts. Operating features allow the analyst to specify a subset of
the maintenance concepts for model evaluation. In addition, the analyst can tailor a variety of
maintenance concepts by mixing the preestablished concepts. Some of these concepts could be:
* Discard CRU at Failure.
* Replace parts at depot.
* Replace parts at manufacturer location.
* Replace mods at equipment level. Scrap bad mods.
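Selecting the most cost-effective of the built-in concepts amounts to evaluating each concept's support-cost equation and taking the minimum; the concept names below echo the list above, while the cost functions are hypothetical placeholders for the model's internal equations.

```python
# Choosing the most cost-effective maintenance concept. The concept names
# echo the list above; the cost equations are hypothetical placeholders.
failures = 400    # expected failures over the support period
unit_cost = 2.5   # $K per replaceable unit

maintenance_concepts = {
    "discard CRU at failure": lambda: failures * unit_cost,
    "replace parts at depot": lambda: 150.0 + failures * 0.9,
    "replace parts at manufacturer": lambda: 60.0 + failures * 1.6,
}

costs = {name: concept() for name, concept in maintenance_concepts.items()}
best = min(costs, key=costs.get)
print(best, round(costs[best], 1))
```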
The strength of life cycle models is derived from three factors: 1) use of hardware
models to rapidly generate all hardware related input parameters -- particularly useful in the
decision making stages of early concept development before hardware design details are known;
2) availability of preset maintenance concepts - among which the models can be instructed to
determine the most cost effective approach; and, 3) ability to rapidly tailor program parameters to
reflect specific support conditions. The interactive feature of parametrics permits immediate
processing of sensitivity analyses to determine the effect of uncertainties in input data.
Some of the sensitivity analyses allow the determination of the effects of changes in:
* Number of equipment level maintenance and/or supply locations.
* Number of organization level maintenance and/or supply locations.
* Number of intermediate level maintenance and/or supply locations.
* Number of depot level maintenance and/or supply locations.
* Duration of the support period.
* Cost, size, and weight of units, modules, and parts.
* Cost of contractor repair.
* Test equipment costs and loading factor.
* Shipping costs.
* Crew size and labor rates.
* Dedicated vs. non-dedicated crews.
* Attrition.
Determination of the effects of the following may be addressed using risk analysis
features built into some models:
* Equipment operating time.
* Unit Mean Time Between Failure (MTBF).
Another feature is the capability to model different theaters of deployment and multi-year
specification of equipment deployment and employment (called gradual force structuring). This
capability permits more accurate modeling of remote depots sending work back to a central depot
or the manufacturing facility (contractor facility). Force structure data permits variation in force
levels for each year, as well as the planned levels of equipment operation for each year.
There are three principal categories of data required to estimate costs with Life Cycle
models: deployment and employment; hardware parameters; and program globals.
Hardware Parameters
Hardware dependent parameters should be generated by the hardware parametric models for
each element modeled in the WBS structure, and then input into the life cycle model.
Program Globals
This is the largest group of inputs, and they are used to describe the maintenance and supply
system to the model. All have preset values that may be changed by the analyst. Unique sets may
be created for use in subsequent applications of the model.
COMMERCIAL MODELS
There are several commercially available hardware cost estimating models. Three typical
models (FASTE, PRICE, SEER) that estimate one or more of the hardware types are discussed
below. Although there are many others (see Appendix E), this chapter will only describe these three
as representative. The models will allow an analyst to:
* Create an Economics File.
* Input variables to the model.
* Get needed information through the on-line Help system.
MicroFASTE
The MicroFASTE model helps the analyst develop a parametric model for estimating the
costs associated with implementing a project.
The goal of a particular project may be the production and installation of a Hardware
system, a Software system (or a combination of both). It may be the construction of nuclear power
stations, radar systems or manned space stations.
Parametric systems analysis can even produce cost estimates for the implementation of a
financial funding program, the construction and operation of an underground coal or uranium
mine, or the design and production of a highly complex module containing very advanced
micro-circuitry technology.
However, the MicroFASTE model (where the "E" on the end of "MicroFASTE" means
Equipment), is exclusively for use in performing parametric analyses for Hardware systems.
Hardware development projects involve several major phases of implementation. These
phases and activities have much in common whether the project involves production and
integration of a small hardware device, or a large, complex multi-assembly hardware system. It is
these phases in which costing studies are performed. MicroFASTE classifies these common
implementation phases into the following categories and sub-categories:
Systems Engineering
Establishes the equipment's design, performance and test specifications, predicated on the
controlling requirements.
Documentation
The recording of engineering activities, preparation of equipment manuals and required
management reports.
Prototype and Testing
Covers all charges connected with the manufacture and testing of engineering prototypes,
and includes all brassboard and breadboard models.
Project Management
Takes in the overall management of all areas connected with the engineering efforts such
as planning, budgeting, operations and financial controls.
2. Production
Manufacturing
Involves the direct production charges. This is the same cost value as calculated when
Total Production is specified without the detail breakdowns.
Engineering Support
Embodies the engineering effort that is related to the manufacturing activity such as
material design reviews, corrections of defects, etc.
Documentation
The recording of production events as well as changes to equipment manuals as
necessitated by design modifications caused by production problems.
Production Tooling
Covers special required tooling. It does not include the cost of capital equipment, or tools
that are amortized in overhead accounts.
Project Management
Takes in the management of all areas associated with production such as planning,
budgeting, operations and financial control.
System Integration
Integration is an independent study that covers the costs of consolidating various Items
into a higher order of assembly. In an Item's Acquisition study, there are variables used to
indicate the character of the Item's integration requirements. When an integration study has
been completed, the calculated integration costs may be accumulated into the project's total costs.
Equipment Installation
Equipment Installation studies are applicable to large energy producing systems such as
steam or nuclear power plants, and large hardware systems such as those installed in chemical
manufacturing plants. When a significant percentage of the project's total cost will be involved in
putting the equipment in place and making the mechanical and electrical connections necessary for
it to perform its intended task, it becomes necessary to perform Installation cost analyses.
Installation studies are not generally applicable to relatively small hardware systems, even
though the hardware system may indeed have to undergo an actual installation procedure to have it
perform its intended function. The costs associated with these activities are accounted for during
the System Integration phase of cost analysis.
Installation is a study that covers the costs of placing this type of equipment at its work site.
Installation studies are performed on an Item-by-Item basis to determine each pertinent Item's
installation costs. Installation costs, therefore, are not computed on a total system or project basis.
However, after completing an Item's Installation study, the calculated costs may be accumulated
into the project's total costs.
Normally, the Installation study should follow an Item's Acquisition study. Installation
efforts may include production and/or engineering phases, and their costs may be processed in
either a total or a detail manner.
Calibration
Typically, when a costing study is performed on some new unit of equipment, there are
various factors which may not be known about the item, such as how much one unit will weigh
after it is produced. The model provides the capability of performing a type of study called a
Calibration, which is a costing calculation in reverse, used to generate reference factors. When
weight and its complexity factor, traditional key factors in parametric cost calculations, are absent,
the model will generate a calculated or "fiducial" weight along with a corresponding weight factor
if input costs are given.
In performing calibrations for an Item, the analyst supplies as much information in
parametric form as is known about the item, and enters the previously known costs of a reasonable,
similar item. Then, after the model performs its calculations for the set of calibration data, a
calculated weight as well as other calculated reference factor values, including the complexity
factor, will be displayed. Those calculated values can be retained, and used as input values when
deriving the estimated costs of any new unit of equipment.
It may be desirable to perform an iterative series of Calibration and Forward Costing
exercises for a particular unit of equipment until reference factor values are derived that seem most
reasonable for a given project.
The processes of Calibration and Forward Costing can be applied to any or all of the program
phases: Acquisition, Life Cycle, System Integration and Installation.
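The reverse-costing idea behind Calibration can be sketched with a toy CER. The power-law form and every coefficient below are illustrative assumptions, not the model's actual internal equations:

```python
def estimate_cost(weight, complexity, a=1500.0, b=0.7):
    """Toy power-law CER: cost = a * weight^b * complexity (illustrative)."""
    return a * weight ** b * complexity

def calibrate_weight(known_cost, complexity, a=1500.0, b=0.7):
    """Reverse costing: solve the CER for a 'fiducial' weight, given an
    item's known cost and an assumed complexity factor."""
    return (known_cost / (a * complexity)) ** (1.0 / b)

# Calibrate against a similar item whose actual cost is known...
fiducial_weight = calibrate_weight(known_cost=120_000.0, complexity=2.0)

# ...then reuse the derived value in a forward-costing run; the round
# trip reproduces the known cost.
assert abs(estimate_cost(fiducial_weight, 2.0) - 120_000.0) < 1e-6
```

Iterating between the two functions, as the text suggests, lets the analyst tune the reference factors until the derived values seem reasonable for the project.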
This early-on estimating capability makes PRICE a tool for design engineers, since much of
a product's cost is locked in by early engineering decisions. PRICE can provide engineers with cost
information needed to develop minimum-cost designs.
* Weight -- tells the model the size of the product being estimated.
* Manufacturing Complexity -- a coded value that characterizes product and process
technologies and (optionally) the past performance of the organization.
* Platform -- a coded value that characterizes the quality, specification level, and reliability
requirements of the product application.
* Quantities -- the number of prototypes and production items to be estimated.
* Schedule -- the dates for the start and completion of the development and production phases
may be specified. The model will compute any dates that are not specified. Only the date for
the start of development is required.
* Development Costs -- effort associated with drafting, design engineering, systems engineering,
project management, data, prototype manufacturing, prototype tooling, and test equipment.
* Production Costs -- effort associated with drafting, design engineering, project management,
data, production tooling, manufacturing, and test equipment.
* All costs are reported at the material, labor, overhead, and dollar level.
It is generally acknowledged that most of the cost of a product is locked in by the early
engineering design decisions. This means that the early decision process should consider cost as
well as technical alternatives. The lack of product definition early in the concept stage precludes a
credible conventional cost estimate. With PRICE H, however, engineers and managers are able to
develop cost estimates for each alternative to select minimum cost-product designs. Before any
investment has been made in design engineering, PRICE H can compute the unit production cost of
a product.
PRICE H can be quickly iterated to obtain estimates under varying assumptions until the
value selected for the estimate represents acceptable risk. With PRICE H, cost sensitivity to project
uncertainties can be explored, and management's "what if" questions can be answered.
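As a purely hypothetical stand-in for such a model (the actual PRICE H relationships are proprietary and are not reproduced here), the sketch below shows how quickly iterating over input assumptions produces a cost-sensitivity picture; the functional form, coefficients, and learning-curve slope are all assumptions:

```python
import math

def average_unit_cost(weight, complexity, quantity,
                      a=2000.0, b=0.75, learning=0.90):
    """Hypothetical parametric form: first-unit cost from weight and
    manufacturing complexity, averaged over the production quantity
    with a simple 90% unit learning curve."""
    first_unit = a * weight ** b * complexity
    exponent = math.log(learning) / math.log(2.0)  # learning-curve exponent
    total = first_unit * sum((n + 1) ** exponent for n in range(quantity))
    return total / quantity

# "What if" sweep: sensitivity of unit cost to a +/-20% weight uncertainty.
for w in (80.0, 100.0, 120.0):
    print(f"weight {w:5.0f} -> unit cost {average_unit_cost(w, 1.8, 50):,.0f}")
```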
PRICE HL
The PRICE HL (Hardware Life Cycle) Model calculates costs, cost effectiveness, and
operational availability at the system, subsystem, assembly, and sub-assembly levels. It is used in
all phases of the acquisition process and is especially well suited when linked with PRICE H for
high-level analyses in the concept and system design phases. This coupling provides a system level
trade-off capability to effectively determine hardware life cycle costs during early stages of
hardware designs. PRICE HL provides cost outputs for all phases of the hardware life cycle.
PRICE M
The PRICE M (Electronic Module and Microcircuit) Model produces estimates of
development and production costs for electronic modules and application specific integrated
circuits (ASICs). It uses cost estimating relationships based on the number of transistors and/or
gates, percentage of new circuit cells and design repeat, specification level, degree of
computer-aided design, product familiarity, engineering experience, and calibration factors to
estimate ASIC specification and design costs. Additional relationships provide systems
engineering, project management, and data costs. PRICE M contains a database for frequently used
components that can contain predefined costs or that can be calculated from the component input
specification.
SEER
The SEER family of models includes the basic hardware cost estimation model (SEER-H)
and a hardware life cycle cost model. The hardware cost model estimates hardware cost and
schedules and includes a tool for risk analysis. The hardware model is sensitive to differences in
hardware technologies (ASICs, MCMs, exotic materials, miniaturization, etc.) and to different
acquisition scenarios (make, modify, customer-furnished, purchased, off-the-shelf, etc.). It is also
sensitive to differences in electronic versus mechanical parameters and makes estimates based on
each hardware item's unique design characteristics.
The SEER hardware life cycle model (SEER-HLC) evaluates the life cycle cost effects due
to variations in reliability, mean time to repair, and repair turnaround times. SEER-HLC
complements SEER-H, and both models will run on a personal computer. The models are based on
actual data, utilize user-friendly graphical interfaces, and possess built-in knowledge bases and
databases that allow for estimates from minimal inputs.
SEER-IC is an expert system for planning integrated circuit development and production.
SEER-IC provides a systematic, expert approach to cost and resource estimation and a capability
for Design to Cost (DTC) tradeoffs and risk analysis. SEER-IC includes an industry-wide
knowledge base.
SEER-DFM (Design for Manufacturability) is a tool designed to assist the engineer in
producing and assembling products efficiently, in a manner that exploits the best practices of an
organization. Diverse team members understand their own interests and how they can work
together, and provide inputs to the model accordingly. The fundamental goal of DFM is to make
engineering decisions that optimize the manufacturing process.
Two fundamental analysis steps are taken in a DFM regime: the gross and detailed tradeoff
analysis.
* Gross analysis involves product design decisions, and also fundamental process and tooling
decisions. Factors that influence gross analysis include the quantity of the planned product,
the rate at which it will need to be produced, and the investment required. There are also
machinery, assembly and setup costs to contend with.
* Detailed analysis takes place once many of the primary production parameters, such as
design and basic processes, have been fixed. Factors that can be adjusted for and balanced
at the detailed level include tolerances, the proportion of surface finishes, secondary
machining options (drilling, reaming, polishing), and the age and degree of process
intervention (how soon is a machine recalibrated or rebalanced?).
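The gross tradeoff step can be illustrated with a simple amortization sketch. All figures below are hypothetical and are not SEER-DFM outputs:

```python
def gross_unit_cost(tooling_investment, per_unit_cost, quantity):
    """Amortize a tooling investment over the planned quantity and add
    the recurring per-unit processing cost."""
    return tooling_investment / quantity + per_unit_cost

# Hypothetical options: low-tooling manual process vs. automated process.
manual = gross_unit_cost(5_000.0, 40.0, 2_000)       # light tooling
automated = gross_unit_cost(120_000.0, 12.0, 2_000)  # heavy tooling

# Break-even quantity at which the heavier tooling investment pays off.
breakeven = (120_000.0 - 5_000.0) / (40.0 - 12.0)
print(f"manual {manual:.2f}, automated {automated:.2f}, "
      f"break-even ~{breakeven:.0f} units")
```

At the planned quantity of 2,000 units the manual process is cheaper; beyond roughly the break-even quantity the automated process wins, which is exactly the quantity/rate/investment interaction the gross analysis must weigh.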
SEER-DFM explores the DFM effort by developing engineering tradeoffs that are involved
in a DFM analysis and presents the results of each as costs. SEER-DFM models the categories of
materials and processes used in modern production lines. SEER-DFM integrates the following
models:
In many cases, regression-analysis based cost models are generated by government agencies
(or program offices), government support contractors, or hardware/software contractors. In such
cases, a sufficient quantity of cost data is usually available to build statistical, i.e., regression based
CERs. The flow diagram in Figure IV-3 shows the most commonly used process.
broken into work packages which are not very suitable for determining component or subsystem
costs. Considerable effort is usually required to extract cost drivers from the cost of WBS elements
which are meaningful and "uncontaminated" by activities which are applicable to more than one
WBS element.
Preliminary CERs are conceptualized, and both cost and technical data are collected
according to the WBS elements and CER format. The cost data are then normalized with respect to
base year constant dollars, quantity of hardware, completeness of cost inclusions, etc., and statistical
regressions are performed to correlate cost (dollars or hours) to cost drivers. Mathematical
equations for the CERs resulting from the regression analyses are first checked for their statistical
validity, and next for technical credibility. An extensive review cycle is then entered, complete
with Beta testing, checking by technical and cost experts within the government and contractor
communities, and modification of the CERs based on revised information or additional data
sources. The final step of this process consists of a significant amount of documentation of the
CER equations, description of included and excluded costs, data sources, ground rules, rationale,
limits, uncertainties and applicable comments.
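The regression step can be sketched as a log-linear least-squares fit of a power-form CER to normalized data, followed by the statistical validity check the text describes. The data points below are invented for illustration:

```python
import math

# Hypothetical normalized data points: (weight in lb, cost in base-year $K).
data = [(10, 52), (25, 110), (40, 160), (80, 270), (120, 365)]

# Fit cost = a * weight^b by ordinary least squares in log-log space.
n = len(data)
xs = [math.log(w) for w, _ in data]
ys = [math.log(c) for _, c in data]
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = math.exp(ybar - b * xbar)

# First check: statistical validity (here, R^2 of the log-space fit);
# technical credibility of a and b is judged separately by the analyst.
ss_res = sum((y - (math.log(a) + b * x)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - ybar) ** 2 for y in ys)
r_squared = 1.0 - ss_res / ss_tot
print(f"CER: cost = {a:.1f} * weight^{b:.2f}  (R^2 = {r_squared:.3f})")
```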
This cost model process leads to parametric CERs which can be used by the government
and by contractors.
The Government often sponsors the development of parametric cost models (e.g., the
Unmanned Space Craft Model Version 7 by Tecolote Research, and the Missile Analysis and
Display Model by EER Systems Corporation, which was sponsored by the USA Missile Command).
These cost models are usually based on historical data from a number of systems and are used by
government and contractor cost analysts alike to develop cost estimates. They are often used as
secondary estimating techniques and as crosschecks to the primary estimating methodology.
Non-statistical cost models are those which are based on other types of mathematical
analysis, and are generally engineering models. In many cases, insufficient data exist to construct a
meaningful, regression based CER or model. For example, if only two cost data points are present
for an item (a hardware item like a transmitter, for instance) an approximation of a CER would
consist of fitting a power curve with an acceptable exponent through the two data points. The
exponent should make sense in light of the analyst's experience.
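This two-point approximation can be stated explicitly. In the sketch below the data points are hypothetical; the final check stands in for the "sanity test against experience" the text calls for:

```python
import math

def two_point_cer(point1, point2):
    """Fit cost = a * x^b exactly through two (driver, cost) data points."""
    (x1, c1), (x2, c2) = point1, point2
    b = math.log(c2 / c1) / math.log(x2 / x1)
    a = c1 / x1 ** b
    return a, b

# Hypothetical transmitter data points: (output power in W, cost in $K).
a, b = two_point_cer((50.0, 90.0), (200.0, 210.0))
print(f"cost = {a:.2f} * power^{b:.3f}")

# The analyst then judges whether the exponent is acceptable; an
# exponent below 1 (economy of scale) is often plausible for hardware.
assert 0.0 < b < 1.0
```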
Non-statistical cost models are generally more judgmental than those based on regression
and additional actual data. Uncertainties associated with a small number of actual data points could
easily lead to large inaccuracies of the cost model. However, a larger amount of data on which a
regression-based cost model is based does not necessarily improve its accuracy. It depends on the
quality of the majority of the data points (quantity does not always equate to quality).
The use of non-statistical cost models puts additional emphasis on the competence of the
model originator and will generally increase the skepticism of an auditor. Non-statistical cost
models for proposals should generally be used under the following conditions:
(1) The amount of historical data is insufficient to produce statistically based CERs and
models;
(2) The existing database is really not applicable (i.e., new technology or new products are
being estimated);
(3) The cost modeler has sufficient expertise, knowledge and competence to construct the
model (e.g., with respect to choice of variables);
(4) The model can be validated for at least one, preferably more than one, case; and,
(5) Excellent cost and technical visibility exists for the few data points which form the basis of
the model.
The auditor or reviewer needs to scrutinize the cost model with respect to the above
mentioned five criteria, and request reasonable explanations with respect to the
derivation of the cost model. If necessary, the auditor may also require an explanation of why the
particular set of variables is being used in the model.
Non-statistical cost models may not always have weight and complexity as independent
variables. Size may be implied by amount of energy, thrust level, tank volume, or parts count.
Complexity may be implied by temperature, efficiency, thermodynamic cycle type, or
manufacturing processes and packaging density. In many cases, non-statistical cost models can be
constructed by equating cost to these directly measurable parameters. Care has to be taken to
equate cost to independent technical parameters.
In general, non-statistical cost models are used at a top level, i.e., are often not divisible into
the typical WBS labor and material cost components. However, they can also cover simple
cost-to-cost ratios such as quality control labor to manufacturing touch labor hours with only one or
two data points. In this case, it is up to the contractor to convince the auditor that the historically
developed relationship (e.g., labor ratio) for one product will also hold for another (hopefully
similar) product.
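A cost-to-cost factor of this kind reduces to a simple pooled ratio. The hours below are hypothetical, with only two historical data points, as in the text:

```python
# Hypothetical history: (quality-control hours, manufacturing touch hours)
# for two prior, similar products.
history = [(1_200.0, 14_000.0), (950.0, 11_500.0)]

# The pooled historical ratio becomes the estimating factor.
qc_factor = sum(qc for qc, _ in history) / sum(touch for _, touch in history)

# Apply the factor to a new product's estimated touch labor; the
# contractor must still justify that the ratio carries over.
new_touch_hours = 16_000.0
qc_hours_estimate = qc_factor * new_touch_hours
print(f"QC factor {qc_factor:.3f} -> {qc_hours_estimate:,.0f} QC hours")
```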
MODEL CALIBRATION
The act of calibration standardizes a model. Many models are developed for specific
situations and are, by definition, calibrated to that situation. Such models usually are not useful
outside of their particular environment. However, general cost estimating models including
commercially available ones such as the FAST, PRICE and SEER Models (described earlier) are
designed to be useful as estimating tools for a wide range of situations. The act of calibration is
needed to increase the accuracy of one of these general models by making it temporarily a specific
model for whatever product it has been calibrated for. Calibration is in a sense customizing a
generic model.
Items which can be calibrated in a model are: product types, operating environments, labor
rates and factors, various relationships between functional cost items, and even the method of
accounting used by a contractor. All general models should be standardized (i.e., calibrated), unless
used by an experienced modeler with the appropriate education, skills and tools, and experience in
the technology being modeled.
Calibration is the process of determining if there is any deviation from a standard in order to
compute correction factors. For cost estimating models, the standard is considered historical actual
costs. The calibration procedure is theoretically very simple. It is simply running the model with
normal inputs against items for which the actual costs are known. These estimates are then
compared with the actual costs and the average deviation becomes the correction factor for the
model. The actual data used for the calibration runs determines what type of calibration is done. In
essence, the calibration factor obtained is really good only for the type of inputs that were used in
the calibration runs. For a general total model calibration, a wide range of components with actual
costs need to be used. Better yet, numerous calibrations should be performed with different types
of components in order to obtain a set of calibration factors for the various possible expected
estimating situations. For instance, the PRICE Hardware model addresses this situation using
internal complexity factors. These can be modified by calibration to the component level. Board
fabrication, parts costs, packaging costs, and the overall design and manufacturing costs have
factors that can be tuned to a specific product, product line, or business.
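The procedure described above (run the model against items whose actual costs are known, then take the average deviation as the correction factor) can be sketched as follows; the run data are hypothetical:

```python
# Hypothetical calibration runs: (model estimate, historical actual cost),
# both in consistent base-year dollars.
runs = [(105.0, 118.0), (88.0, 95.0), (240.0, 252.0)]

# The average actual-to-estimate ratio becomes the correction factor,
# valid only for inputs similar to those used in the calibration runs.
correction = sum(actual / estimate for estimate, actual in runs) / len(runs)

def calibrated_estimate(raw_model_output):
    """Apply the calibration correction to a new raw model output."""
    return correction * raw_model_output

print(f"correction factor = {correction:.3f}")
```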
Cost models are significantly more complex than simple CERs. Typically a cost
model uses both CERs and mathematical relationships between the CERs. Both cost models and
CERs are based on historical data. The analyst reviewing a cost model must review the model, the
basis history and the specific application. In addition, models must be evaluated heuristically by test
cases and sensitivity analyses. If a product specific model is used in an estimate, the analyst needs
access to more information and must be able to explore the model's internal relationships and data base.
The disadvantage of generic proprietary cost models is that the cost analyst sometimes does not
have access to the data base and the CERs embedded in the model's "black boxes."
FIGURE IV-4 (figure not reproduced; elements: Software Model, Heuristics, Company
Specific Factors, Pricing Factors and Formats)
When considering calibration, from an auditor's standpoint, basically three things need to be
known about an estimate that is prepared using a model. These are:
(1) The skill level of the model builder.
(2) If the model was calibrated, what kind of data was used to do the calibration? Was it an
overall general calibration using several data points or a more specific calibration with
actual cost data that represents the project being modeled? A calibration will have more
credibility if it is performed with data similar to the project being estimated.
(3) The third consideration has to do with the quality of the actual cost and non-cost data
that went into the calibration. Historical actual costs are not necessarily "real" costs as
far as a calibration activity is concerned (See Chapter II: Normalization of Data). Other
sources such as actuals from a database, vendor firm quotes, vendor ROMs,
management estimates to completion, industry published costs, and even estimates from
other products or other organizations sometimes can be used as inputs for calibration.
The appropriateness of such inputs needs to be considered. Knowing what went into the
calibration (for a specific model) and how it relates to the model's current estimate
needs to be taken into consideration when evaluating the appropriateness and quality of
the calibration.
When a product specific cost model is used to perform the actual cost estimate, the analyst
has to evaluate the model much the same as for single CERs. In addition, however, the relationship
between the cost elements has to be explored. Are the cost drivers generating only one set of costs
or are the various drivers producing overlapping costs - for example, if a factor is calculating both
program management and data, does the program management number exclude data costs? If
product specific models are used, the basis cost history is important for checking the calibration of
the model. It also has to be examined to see if the source information is clean and well defined.
Regardless of the model type, it is desirable for the model to be calibrated for the specific
situation at hand. For proprietary product specific models, the calibration has to include company
specific calibration in order to account for company peculiarities, accounting methods and rates.
Once the general features of the company are incorporated into the model, the auditor has to check
the basis of estimate (sometimes called complexity) factors. Since models usually broaden their
effective databases by incorporating functional relationships within parameters, the historical basis
data utilized by the model has to be examined for both cost validity and the mathematical
relationships between the technical parameters and cost. This step is important since the system to
be estimated rarely is exactly like (technically or functionally) the historical case and some
normalization is usually required in the value of the independent variables. The analyst must see
the documentation associated with the basis case both for the costs and the technical features. In
other words, the basis case must show the source data for the estimated engineering, data, prototype
costs, etc. The basis case documentation must also show the technical data: weight, size,
complexity, etc. Again, if the cost elements are significant, considerable detail may be necessary to
verify the basis calibration. For instance, if Printed Wiring Assembly (PWA) costs are a significant
driver for the "black box" being estimated, the basis costs and technical descriptions must be
available in some detail, perhaps down to the board and chip count level.
Once the auditor has verified the model validity both for the corporation and for the specific
basis case, he or she can then address the application of the model to the case at hand. This
involves examining the tasks being estimated and the technical features of those tasks. In
development cost estimates, the technical detail may be thin, consisting of simply performance
requirements. In this case, the basis system performance characteristics become significant. For
production estimates, hardware item drawings should exist and can be compared directly to the
drawings of the basis case - which might even be a previous version of the item being estimated. In
all cases, any extrapolation must be supported by quality information. Vague references to
"engineering judgment" are not satisfactory; this judgment has to be translated into concrete
hardware definitions.
* Is the model proprietary or non-standard? Does the overall estimate contain interface
models or algorithms which transfer data into and between other models?
* If non-standard, are the model interrelationships well described and well defined, and do
they make logical sense?
* Do the model elements deal with all the costs in the estimate? If not, is there any
overlap between the costs derived through the model and the costs developed outside
the model?
* Is the model used without adjustment? If adjusted, what is the justification and source
data for the adjustments?
* Is the basis case used for reference in the estimate a close analog to the case being
estimated? If not, how far is the basis case, functionally and physically, from the case to
be estimated? What are the functional relationships within the model parameters, and
how sensitive is cost to error in specifications? What is the source documentation for
the base case?
* What are the values of the elements in the case to be estimated? How do they compare
to the values in the basis case? What are the source data for the case to be estimated?
Given that not all these questions have unequivocal answers, the auditor can examine how
variations in the parameter values affect the estimate. If the model is computerized, the auditor can
even do "what if" tests to see if some particular cost relationship is especially important. For those
elements which show significance and which are unsatisfactorily supported, the auditor can request
further information and data.
The objective of the analysis is to see whether the estimate is fair and reasonable. In
complicated logical structures, this issue of overall reasonableness may evolve as a result of
examination of the elements of the estimate. In addition, softness in data, vagueness in
specification and bias in projection can elevate uncertainty in the cost estimate to the point that this
uncertainty becomes a reasonableness issue. In short, the cost analyst evaluating a cost model has
to assure him/herself that the model is a good simulation of the true cost influences, that the basis of
the estimate is appropriate and well founded, and that the extension of the basis through the model
is a true picture of what is proposed in the technical description of the system. If the answer to all
these questions is “yes” and if the uncertainty associated with the estimate is reasonable, the analyst
is looking at a good cost model and estimate.
Appendix E lists other examples of hardware models and describes a few of them in
some detail.
CHAPTER V
SOFTWARE PARAMETRIC COST ESTIMATING
SOFTWARE DEFINITION
Most analysts are familiar with all three types of software (operating system, application,
and utility software), as all are used in personal computers. MS-DOS (Microsoft Disk Operating
System), for example, is a trade name for the operating system for Intel X86 personal computers.
LOTUS 123 and Microsoft Excel are examples of application-based software. Listing a directory
of user files is a utility software function. The system software procured by the US Government
is usually a complex combination of all three types of software.
THE IMPORTANCE OF SOFTWARE TODAY
engineering, we are beginning to realign our software and organizational structure so that they
correspond to real world objectives and needs.
Another area where software re-engineering can aid business re-engineering is the
identification of business rules from legacy source code. Embedded within legacy systems are
the implied rules that govern how the organization was run at the time the system was developed.
By extracting these business rules, an organization can better understand and, later, codify the
rules by which they conduct business.
Improving software quality and productivity is a major challenge faced by the Department
of Defense (DoD) and the non-military community (DOE, NASA) as well. In addition,
providing affordable weapon system flexibility through software is a specific challenge for the
DoD.
Improving software quality is basic to how well software meets the requirements and
expectations of the users. It also means ensuring that software is adequate, reliable, and efficient.
Improving productivity means favorably changing the ratio between the resources required to
develop software and the size and complexity of the developed software.
The growth in computer use and computer hardware capabilities has placed demands of
increasing magnitude and complexity on computer software. Software development processes,
along with attendant methodologies, which may have worked well in the past, often break down
when applied to the development of today’s software. For example, studies show that every five
years the sizes of software projects, as measured by Source Lines Of Code (SLOC), increase by
an order of magnitude, and that the scaling of the development effort demanded by these order-
of-magnitude increases now requires fundamental changes in the development process. As the sizes
of software projects have increased, software development processes based on individual
programmers have given way to processes based on small teams and, in turn, small teams have
given way to larger teams and so on. Higher scaling of software development processes by
merely increasing team sizes reaches limits on effective project management and resource
availability.
Today’s users of software demand software applications of greater size and complexity
than before. Advances in computer hardware capabilities are more than adequate to match the
demands of users; however, software as it is developed using prevailing processes and
methodologies is not. The challenge is finding software development processes with attendant
methodologies and technologies that meet user demands and that improve software quality and
productivity.
As a percent of total cost, software cost has grown disproportionately more than hardware
cost (see Figure V-1).
The military, like the business world today, sees software providing the versatility and
leverage to achieve performance goals. For example, software demonstrated its flexibility to
quickly change weapon system capabilities in Desert Storm, the most newsworthy being the
rapid development of a new software package for the Patriot Missile system to counter the
SCUD. Because of the versatility and leverage provided by software, the DoD’s appetite for
software in the future has been described as “virtually insatiable.”
Software is an increasingly important element in military systems of all kinds. The
capabilities of current and future military systems are dependent on the performance of the
systems’ software. As a system is upgraded or improved, much of the additional capability is
achieved through new software. In fact, many of the functions essential to mission or
organizational success are partly or completely accomplished through the use of software.
Unfortunately, software development and maintenance is an error-prone, time-consuming
and complex activity. Experience has revealed that many software development efforts falter
because the management of these projects falls into several common traps. These problems are:
* Lack of adequate requirements/specifications.
* Lack of attention to user needs and constraints.
* Lack of visibility into the development process.
* Lack of control at key points during development.
* Lack of standardization.
* Lack of attention to cost of ownership considerations.
* Lack of adequate documentation.
* Lack of adequate training and skills in estimating effort and schedule.
Falling into these traps leads directly to increased development and support costs,
schedule slips, and reduced systems capability.
DOD-STD-2167A/498, Defense System Software Development, established uniform
requirements for software development and is widely applied in all software application areas by
all DOD components. DOD-STD-7935A/498, DOD Automated Information System (AIS)
Documentation Standards, provides guidelines for the development and revision of
documentation for information systems. For embedded systems, DOD-STD-2167A/498 is
appropriate. For information systems, both DOD-STD-2167A/498 and DOD-STD-7935A are
appropriate. The documentation requirements of DOD-STD-7935A take precedence over those
of DOD-STD-2167A/498. In both application areas, tailoring of these standards must be done to
reflect the unique needs and constraints of each project.
The actual software development tasks will be accomplished by a development
organization that is an element of, or subcontractor to, the system development contractor. On
occasion, this “contractor” may be a DOD agency. The development contractor has the
responsibility for delivering a software product that meets all contractual requirements.
Unfortunately, at the beginning of a contract’s development efforts, it is usually not possible to
specify precisely and completely all of the characteristics of the final software products and their
development processes. Experience has shown that the difference between successful and
unsuccessful development efforts is the vigor and timeliness of the direction given to the
contractor by the Program Manager, supported by the Project Office.
Figure V-2 indicates the relative cost penalty for delayed error correction during a
software project’s life cycle, from almost no cost during preliminary design to high cost
impacts during the validation and operational stages. BPR focuses on the later stages
(maintenance) and the reduction of errors at those times by re-engineering the early phases.
In today’s world of shrinking budgets, providing affordable, flexible software systems
requires cost control and predictability that are not found in the traditional software development
processes. Increasingly, the DoD demands that software be developed within predictable costs
and schedule. Enter, therefore, parametric cost modeling.
This section defines the basic process of software development as currently practiced by
the majority of government contractors. Major phases and software development activities are
defined, as well as key milestones for the measurement of progress.
The overall process of developing a cost estimate for software is not different from the
process for estimating any other element of cost. There are, however, aspects of the process that
are peculiar to software estimating. Some of the unique aspects of software estimating are driven
by the nature of software as a product. Other problems are created by the nature of the estimating
methodologies. Brooks, in his 1982 collection of essays, referred to large system software
programming as a “tar pit.” His description of one such project is typical of Space and Missiles
Center and NASA experience with software development. He states:
“The product was late, it took more memory than planned, the costs were several times
the estimate, and it did not perform very well until several releases after the first.”
The software WBS (see Appendix B) is an excellent tool for visualizing the software
product. The WBS need not be complex, nor does it need to be highly detailed. A simple
product tree line drawing is often adequate for initial software estimates. The hardware WBS
can be a useful tool in developing the initial WBS for software. There is usually a software
Computer Software Configuration Item (CSCI) or similar module associated with each hardware
Line Replaceable Unit (LRU). As the program evolves, the initial or draft WBS should include
all software associated with the program regardless of whether it is developed, furnished, or
purchased. If furnished or purchased software were omitted, it would not be possible to capture
the cost of integrating preexisting or purchased software with the development software.
The WBS should depict only major software functions, and major subdivisions. It should
not attempt to relate the software to the hardware it controls. Each of the major software
functional units can be modeled as a Computer Software Configuration Item (CSCI). Lower
level WBS elements can be modeled as components. Once the WBS is established, the next step
is to determine which estimating technique should be used for deriving the estimate.
One of the most challenging tasks in project management is accurately estimating needed
resources and required schedules for software development projects. The software estimation
process includes estimating the size of the software product to be produced, determining which
functions the software product must perform, estimating the effort required, developing
preliminary project schedules, and finally, estimating overall cost of the project.
Size and number of functions performed are considered major productivity
(“complexity”) factors during the software development process. Effort is divided into labor
categories and multiplied by labor rates to determine overall costs. Therefore, software
estimation is sometimes referred to as software cost estimation.
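The effort-to-cost rollup just described can be sketched in a few lines (the labor categories, hours, and rates below are hypothetical illustrations, not handbook figures):

```python
# Convert an effort estimate into cost: effort is divided into labor
# categories, and each category's hours are multiplied by its labor rate.
# All categories, hours, and rates here are hypothetical.
effort_hours = {"design": 1200, "code_and_test": 2600, "integration": 900}
labor_rates = {"design": 95.0, "code_and_test": 80.0, "integration": 85.0}  # $/hr

total_cost = sum(effort_hours[cat] * labor_rates[cat] for cat in effort_hours)
print(f"Estimated labor cost: ${total_cost:,.0f}")  # Estimated labor cost: $398,500
```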
Software life cycle models identify various phases and associated activities required to
develop and maintain software, and provide excellent input into the software estimation process.
Some of the more common and accepted life cycle models include: (1) Waterfall Model; (2)
Rapid Prototyping; (3) Incremental Development Model; (4) Spiral Development Model; (5)
Reusable Software Model; and (6) the Transform Model [Boehm and Davis]. These models
form a baseline from which to begin the software estimation process and should be reviewed and
tailored to the proposed project.
Software identified as mission-critical and developed for the United States government
usually must be developed in accordance with DOD-STD-2167A/498. This standard establishes
uniform requirements for software development, and does not specify or discuss any particular
method of software development. However, it requires the inclusion and documentation of the
major software development life cycle activities. The standard also requires that reviews and
audits be held in accordance with MIL-STD-1521B, “Technical Reviews and Audits for Systems,
Equipment, and Computer Programs.”
Additionally, structured approaches to sub-task identification are extremely beneficial in
determining tasks and the required effort for each task. The WBS is a method which strongly
supports this process.
The software estimation activity should be approached as a major task and therefore
should be well planned, reviewed and continually updated. The basic steps required to
accomplish software estimation are described in the following paragraphs.
The project should be organized into a hierarchical set of tasks to the lowest level of
detail that available information will allow. Additionally, a breakdown of documentation
requirements and associated tasks should be defined (the detailed WBS).
The WBS helps establish a hierarchical view and organization of the project. The top
level is the software system or final software product, and subsequent levels help identify tasks
and associated sub-tasks and are used to define and encapsulate system functions. Each of these
tasks is divided into software development phases such as design, code and test, and integration.
All activities associated with each level must be defined including: project planning and control,
configuration management, product assurance and documentation.
In addition to early development of detailed knowledge about the project, the WBS
provides an excellent methodology for project data collection, tracking, and reporting. During
development of the project, each of the WBS tasks can be given a project budget, and a job
number which is used for reporting time spent on each project phase or activity. This provides
an excellent project tracking and history data collection method. Most government contracts
require that such a Cost/Schedule Control System Criteria (C/SCSC) be established. When the
data are collected to an established WBS, the information can be placed in a database to be used
in refining, tailoring, and customizing the software estimation process. This information
becomes an invaluable input to the software estimation process for future projects.
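As a minimal sketch of this tracking scheme, a WBS can be held as nested tasks, each leaf carrying a job number, budget, and actuals that roll up to any level (the structure, job numbers, and hours are hypothetical):

```python
# Sketch: a WBS as nested tasks, each leaf with a job number used for
# time reporting, plus budgeted and actual hours rolled up bottom-up.
wbs = {
    "1.0 Flight Software": {
        "1.1 Design": {"job": "J-101", "budget": 120, "actual": 135},
        "1.2 Code and Test": {"job": "J-102", "budget": 300, "actual": 280},
        "1.3 Integration": {"job": "J-103", "budget": 90, "actual": 110},
    }
}

def rollup(node):
    """Sum budget and actual hours over all leaf tasks under a WBS node."""
    if "budget" in node:                       # leaf task
        return node["budget"], node["actual"]
    totals = [rollup(child) for child in node.values()]
    return sum(b for b, _ in totals), sum(a for _, a in totals)

budget, actual = rollup(wbs["1.0 Flight Software"])
print(budget, actual)  # 510 525
```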
Software project tasks/subtasks should be defined to the smallest component possible.
All technical aspects of the project should be understood as fully as possible since the more
details known about the project the more accurate the estimates will be. Newer methodologies
are evolving which aid software developers in identifying various functions needed to support the
project, such as Object-Oriented Analysis and Design (OOA, OOD). Therefore, organizations
should actively keep abreast of evolving technologies.
Experience has shown that four problems commonly undermine software estimates:
(1) The inability to accurately size the software project. This results in poor
implementations, emergency staffing, and cost overruns caused by underestimating
project needs.
(2) The inability to accurately specify a development environment which reflects
reality. This results in defining cost drivers which may be inappropriate,
underestimated or overestimated.
(3) The improper assessment of staff skills. This results in misalignment of skills to
tasks and ultimately miscalculations of schedules and level of effort required, as
well as either underestimating or overestimating project staffing requirements.
(4) The lack of well defined objectives, requirements, and specifications, or
unconstrained requirements growth during the software development life cycle.
This results in forever changing project goals, frustration, customer dissatisfaction,
and ultimately, cost overruns.
All potential risks associated with the proposed software development project should be
defined and weighed, and impacts to project cost should be determined. This information should
always be included in the software estimation process.
Estimation Methodologies
Several methods (if possible) should be used during the software estimation process. No
one methodology is necessarily better than another; in fact, their strengths and weaknesses are
often complementary. It is recommended that more than one software estimation
methodology be used for comparison and verification purposes. One method may overlook
system level activities such as integration, while another method may have included this, but
overlooked some key post-processing components. Five of the methods discussed in Dr.
Boehm’s book Software Engineering Economics are: analogy, bottom-up, top-down, expert
judgment, and algorithms (parametrics).
These methods are often used in conjunction with each other and have been used for
many years by managers of software projects without the use of any formal software estimation
tools. Software estimation tools have only recently been developed which incorporate these
methods, and many incorporate multiple methodologies.
Analogy Method
Estimating by analogy means comparing the proposed project to previously completed
similar projects where project development information is known. Actual data from the completed
projects are extrapolated to estimate the proposed project. Estimating by analogy can be done
either at the system level or the component level.
The main strength of this method is that the estimates are based on actual project data and
past experience. Differences between completed projects and the proposed project can be identified
and impacts estimated. One problem with this method is in identifying those differences. This
method is also limited because similar projects may not exist, or the accuracy of available historical
data is suspect. Also, many projects for DOD weapon systems may not have historical precedents.
The analogy or comparative technique uses parametric approaches, including the use of CERs.
Bottom-Up Method
Bottom-up estimation involves identifying and estimating each individual component
separately, then combining the results to produce an estimate of the entire project.
It is often difficult to perform a bottom-up estimate early in the life cycle process because
the necessary information may not be available. This method also tends to be more time consuming
and may not be feasible when either time or personnel are limited.
Top-Down Method
The top-down method of estimation is based on overall characteristics of the software
project. The project is partitioned into lower-level components and life cycle phases beginning at
the highest level. This method is more applicable to early cost estimates when only global
properties are known.
Advantages include consideration of system-level activities (integration, documentation,
project control, configuration management, etc.), many of which may be ignored in other estimating
methods. The top-down method is usually faster, easier to implement and requires minimal project
detail. However, disadvantages are that it can be less accurate and tends to overlook lower-level
components and possible technical problems. It also provides very little detail for justifying
decisions or estimates.
Benefits
When the software estimation process is performed correctly, the benefits realized far
outweigh the cost of doing the estimating. Some of the major benefits include lowering the cost of
doing business, increasing the probability of winning new contracts, increasing and broadening the
skill-level of key staff members, acquiring a deeper knowledge of the proposed project prior to
beginning the software development effort, and understanding, refining, and applying the proper
software life cycle model.
As these software estimating components are enhanced, refined, and continually applied,
the software estimating process, associated tools, and resulting products attain higher levels of
quality and ultimately benefit all.
Accurately estimating the resources and time needed for a software development project is
essential even for the survival of the project. In the great majority of cases, the resources and time
actually expended are much more than the initial planning estimates. An approach for estimating
the resources and schedule needed for software development is the use of a software cost and
schedule model that calculates the resources and time needed as a function of some other software
parameters (such as the size of the program to be developed).
The more times an organization has developed software of the same size and complexity for
the same type of application, the more accurate the estimates for a new project will be.
Unfortunately, when the Program Manager attempts to extrapolate information from small and less
complex software development efforts to larger, more complex software in different application
areas, the results are often unreliable. The problem results from an exponential relationship
between software size and development effort.
For example, one very widely used parametric software cost model is the Constructive Cost
Model (COCOMO). The basic COCOMO model has a very simple form:
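For reference, the basic COCOMO equations for the organic (least demanding) development mode, as published by Boehm, are shown below; the coefficients differ for the semidetached and embedded modes, and size is measured in thousands of delivered source instructions (KDSI):

```latex
% Basic COCOMO, organic mode: effort E in person-months,
% development schedule T_dev in months, size in KDSI.
E = 2.4\,(\mathrm{KDSI})^{1.05}
\qquad
T_{\mathrm{dev}} = 2.5\,E^{0.38}
```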
Estimates from the basic COCOMO model can be made more accurate by taking into
account other factors concerning the required characteristics of the software to be developed, the
qualification and experience of the development team, and the software development environment.
Some of these factors are:
* Complexity of the software
* Required reliability
* Size of data base
* Required efficiency (memory and execution time)
* Analyst and programmer capability
* Experience of team in the application area
* Experience of team with the programming language and computer
* Use of tools and software engineering practices
* Required schedule
Many of these factors affect the person months required by an order of magnitude or more.
COCOMO assumes that the system and software requirements have already been defined, and that
these requirements are stable. This is often not the case.
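A sketch of how such factor adjustments enter the calculation, in the style of the intermediate model (the multiplier values shown are illustrative only, not Boehm’s published tables):

```python
# Sketch of an intermediate-COCOMO-style adjustment: a nominal effort
# estimate is scaled by effort multipliers for selected cost drivers.
# The multiplier values below are illustrative, not Boehm's tables.
def adjusted_effort(kdsi, multipliers):
    """Person-months: organic-mode nominal effort times all multipliers."""
    nominal = 2.4 * kdsi ** 1.05
    factor = 1.0
    for m in multipliers.values():
        factor *= m
    return nominal * factor

drivers = {"required_reliability": 1.15, "analyst_capability": 0.86,
           "language_experience": 1.07}
print(round(adjusted_effort(32.0, drivers), 1))  # effort for a 32-KDSI project
```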
Another popular software cost model is the Putnam model. The form of this model is:
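The Putnam software equation is commonly written as follows, relating size S (in SLOC) to total life-cycle effort K (person-years) and development time t_d (years) through a technology constant C_k:

```latex
% Putnam software equation, and its solution for effort.
S = C_k\,K^{1/3}\,t_d^{4/3}
\quad\Longrightarrow\quad
K = \frac{S^{3}}{C_k^{3}\,t_d^{4}}
```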
The Putnam model is very sensitive to the development time: decreasing the development
time can greatly increase the person-months needed for development.
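Solving the software equation for effort shows why: effort varies as the inverse fourth power of development time. A sketch (the size and technology constant C_k are hypothetical):

```python
# Putnam software equation solved for effort: K = S**3 / (Ck**3 * td**4).
# Effort scales as td**-4, so schedule compression is extremely expensive.
# The Ck value and size below are hypothetical.
def putnam_effort(sloc, ck, td_years):
    return sloc ** 3 / (ck ** 3 * td_years ** 4)   # person-years

nominal = putnam_effort(100_000, ck=5000, td_years=2.5)
compressed = putnam_effort(100_000, ck=5000, td_years=2.0)
print(round(compressed / nominal, 2))  # (2.5/2.0)**4 ≈ 2.44
```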
One significant problem with the COCOMO, Putnam, and similar models is that they are
based on knowing, or being able to estimate accurately, the size (in lines of code) of the software to
be developed. There is often great uncertainty in the software size. Computer programs, several of
which are available for PCs, have been developed to implement variations of the COCOMO and
other cost models.
Another estimating approach, called Function Point Analysis (FPA), is used to estimate
both the size of the software to be developed and the effort required for the development. In FPA,
you determine the number and complexity of inputs, outputs, user queries, files, and external
interfaces of the software to be developed. The sum of these numbers, weighted according to the
complexity of each, is the number of function points in the system.
This function point number is directly related to the number of end-user business functions
performed by the system. Using data from past projects, it is possible to estimate the size of the
software needed to implement these function points (typically about 100 source language
statements are needed for each function point) and the labor needed to develop the software
(typically about 1 to 5 function points per person month). FPA has been developed and applied
almost exclusively in Information System applications. A variation, called feature point analysis,
has been defined for other application areas. The chief difference between feature point analysis
and FPA is that the number and complexity of the algorithms to be implemented are considered in
calculating the number of feature points.
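The weighted-sum count described above can be sketched as follows; the weights are the commonly published average-complexity values (treat them as illustrative), and the size and effort conversions use the rules of thumb from the text (about 100 source statements per function point, and the midpoint of the 1-to-5 function points per person-month range):

```python
# Sketch of an (unadjusted) function point count: counts of each
# component type, weighted by complexity. Weights shown are the commonly
# published average-complexity values; treat them as illustrative.
WEIGHTS = {"inputs": 4, "outputs": 5, "queries": 4, "files": 10, "interfaces": 7}

def function_points(counts):
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

fp = function_points({"inputs": 20, "outputs": 12, "queries": 8,
                      "files": 6, "interfaces": 3})
sloc_estimate = fp * 100            # ~100 source statements per function point
effort_pm = fp / 3                  # midpoint of 1-5 function points per PM
print(fp, sloc_estimate, round(effort_pm, 1))  # 253 25300 84.3
```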
Do not depend on a single cost or schedule estimate. Use several estimating techniques or
cost models, compare the results, and determine the reasons for any large variations. Document the
assumptions made when making the estimates. Monitor the project to detect when assumptions
that turn out to be wrong jeopardize the accuracy of the estimate.
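A minimal sketch of that comparison step (the estimates and the 25% threshold are hypothetical judgment calls, not standards):

```python
# Compare independent estimates and flag large variation for investigation.
# The estimates and the 25% threshold below are hypothetical.
estimates_pm = {"analogy": 95, "top_down": 110, "parametric": 160}

low, high = min(estimates_pm.values()), max(estimates_pm.values())
spread = (high - low) / low
if spread > 0.25:
    print(f"Variation of {spread:.0%} -- revisit sizing and assumptions")
```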
As mentioned earlier, one of the critical problems facing software development project
managers is determining accurate software estimations for level of effort, schedules, SLOC, and
overall costs. Since trends indicate that the cost of producing software products is escalating and
consuming an ever increasing percentage of budgets, the need to quickly generate more reliable
estimates is becoming even more important.
For many years, project managers have relied on software development teams to estimate
the cost of producing software products. This has always been a subjective and intuitive process
influenced by such factors as personality, opinions, and pressure to win contracts. The process has
encouraged low estimates and short schedules, the results of which have been devastating to
companies and projects, including projects of national importance.
For these reasons, software parametric cost estimating tools have been developed since the
late 1970's to provide a better defined and more consistent software estimating process. These tools
have been developed from historical data collected from thousands of software projects, as well as
research performed to identify key productivity factors. Early tools were hampered by the scarcity
of reliable data; however, as more data became available, estimating tools were improved and
continued to evolve. Most parametric software estimation tools use algorithms, and some of the
more advanced tools are rule-based or knowledge-based as well as interactive.
Good software estimating tools do not always guarantee reliable software estimates. If
inaccurate software size estimates and attribute ratings are input, then inaccurate estimates will
result. Additionally, organizations need to customize the software estimation tools to their own
development environment (the calibration process was discussed previously in Chapter IV).
This customization requires collecting and maintaining historical data from current and past
projects to provide the necessary inputs required for the software estimating tools. The software
development organization should establish a staff that is thoroughly trained in the software
estimating process and available estimating tools, and they should perform all the software
estimating activities. Experience and existing tools dictate what project development information
needs to be maintained.
(3) Provide early estimates - The tool should be capable of generating estimates early and
quickly in the life cycle process when requirements and development environments
are not fully defined. The tool should also allow task detail to be added incrementally
as functions, activities, and other information becomes more completely defined.
Since there are many unknowns early in the estimating process, the tool should reflect
degrees of uncertainty based on the level of detail input (risk analysis). In general, the
tool should provide sufficient information to allow initial project resource planning as
well as reasonably early "go/no go" decisions.
(4) Be based on software life cycle phases and activities - The tool should be capable of
providing estimates for all phases and activities of the most commonly used software
life cycle models. It should allow the organization to decompose and map software
development tasks into those phases and activities, as well as support a program
WBS. In addition, it should allow for "what if" situations and include factors for
design trade-off studies.
(5) Allow for variations in application languages and application function - It is very
important that the tool provide estimates specific to the application of the software
project since the associated estimating equations, cost drivers, and life cycle phases
should be unique to each application area. General application categories include
Information Systems (IS), simulation and modeling systems, real-time systems,
accounting systems, and systems based on higher-order languages.
(6) Provide accurate size estimates - The size of a software development project is a
major cost driver in most estimating tools, yet size is one of the most difficult input
parameters to estimate accurately. The tool should include the capability to help
estimate the size of the software development project, or at least help define a method
for estimating the size.
(7) Provide accurate schedule estimates - As previously mentioned, schedule overruns are
common and can be extremely costly. The software estimating tool should be able to
provide schedule estimates accurately. The purpose of scheduling is not only to
predict task completion given task sequence and available resources, but also to
establish starting and ending dates for the associated work packages and life cycle
phases.
(8) Provide maintenance estimates separately - The software estimating tool should be
able to provide software maintenance estimates as a separate item. Software
maintenance includes such activities as correcting errors, modifying the software to
accommodate changes in requirements, or extending and enhancing software
performance.
The list that follows is not all inclusive and should not be considered complete. The
tools discussed are representative only, since there are many others that could be used.
REVIC
The Revised Enhanced Version of Intermediate COCOMO (REVIC) model was developed
by Mr. Raymond L. Kile, formerly of Hughes Aerospace. The Air Force Contract Management
Division, Air Force System Command, Kirtland Air Force Base, New Mexico, sponsored the
development for use by its contract administrator.
The main difference between REVIC and COCOMO is the coefficients used in the effort
equations. REVIC changed the coefficients based on a database of recently completed DOD
projects. It also uses a different method of distributing effort and schedule to each phase of product
development, and applies an automatic calculation of standard deviation for risk assessment.
REVIC provides a single "weighted average" distribution for effort and schedule along with
the ability to let the user vary the percentages in the system engineering and development test and
evaluation phases. REVIC employs a different Ada model than Ada COCOMO. The REVIC
model has also been enhanced by using a Program Evaluation and Review Technique (PERT)
statistical method for determining the lines of code to be developed.
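The PERT technique mentioned here combines low, most-likely, and high size judgments into an expected value and a standard deviation; a minimal sketch (the SLOC inputs are hypothetical):

```python
# PERT (beta-distribution) size estimate from low / most-likely / high
# SLOC judgments, as used for lines-of-code inputs in REVIC-style tools.
def pert_size(low, likely, high):
    expected = (low + 4 * likely + high) / 6
    std_dev = (high - low) / 6
    return expected, std_dev

size, sigma = pert_size(20_000, 30_000, 55_000)
print(size, sigma)  # expected value 32500.0, sigma ~5833
```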
In addition to providing estimates for cost, manpower, and schedule, the program creates
estimates for typical DOD-STD-2167A/498 documentation sizing and long term software
maintenance. REVIC's schedule estimates are often considered lengthy because it assumes that a
project's documentation and reviews comply with the full requirements of DOD-STD-2167A/498.
REVIC operates on PC compatible systems.
PRICES
The PRICES tool is distributed by Lockheed - Martin PRICE Systems. This tool was first
developed in 1977, and is considered one of the first complex commercially available tools used for
software estimation. The equations used by this tool are proprietary. However, descriptions of the
methodology algorithms used can be found in papers published by PRICE Systems. A model
describing the PRICES parametric tool is shown in Figure V-4.
FIGURE V - 4
The PRICES tool is based on Cost Estimation Relationships (CERs) that make use of
product characteristics in order to generate estimates. CERs were determined by statistically
analyzing completed projects where product characteristics and project information were known, or
developed with expert judgment.
A major input to PRICES is Source Lines of Code (SLOC). Software size may be input
directly, or automatically calculated from quantitative descriptions (function point sizing). Other
inputs include software function, operating environment, software reuse, complexity factors,
productivity factors, and risk analysis factors. Successful use of the PRICES tool depends on the
ability of the user to define inputs correctly. It can be customized and calibrated to the needs of the
user. It is now available for Windows and UNIX/Motif.
SASET
The Software Architecture, Sizing and Estimating Tool (SASET) was developed for DOD
by the Martin Marietta Corporation. SASET is a forward-chaining, rule-based expert system
utilizing a hierarchically structured knowledge database. The database is composed of projects
with a wide range of applications.
SASET provides functional software sizing values, development schedules, and associated
man-loading outputs. It provides estimates for all types of programs and all phases of the
development cycle. It also provides estimates for maintenance support and performs a risk
assessment on sizing, scheduling, and budget data.
SASET uses a five-tiered approach for estimating including class of software, source lines
of code, software complexity, maintenance staff loading, and risk assessment. The user can either
input the program size directly or allow SASET to compute size, based on function-related inputs.
The tool also has an extensive customization file in which the user can adjust many parameters. It
operates on PC compatible systems.
SEER-SEM
System Evaluation and Estimation of Resources - Software Estimation Model (SEER-SEM)
is distributed by Galorath Associates and is currently under a five year Air Force wide license
agreement. It provides software estimates with knowledge bases developed from many years of
completed projects.
The knowledge base allows estimates with only minimal high level inputs. The user need
only select the platform (i.e. ground, unmanned space, etc.), application (i.e. command and control,
diagnostic), development methods (i.e. prototype, incremental), and development standards (i.e.
2167A/498). SEER-SEM is applicable to all types of software projects and considers all phases of
software development.
SEER-SEM is designed to run on PC compatible systems running Microsoft Windows
3.0/3.1 (Air Force license includes MS-DOS version). It is also available for the Apple Macintosh
running system 6.0.3 and above and the UNIX/SUN work station.
A companion tool called the SEER-Software Sizing Model (SSM) is also distributed by
Galorath Associates and is used to estimate the size of the software product.
SLIM
The Software Life Cycle Model (SLIM) is marketed by Quantitative Software Management (QSM).
SLIM was developed in 1979 by Mr. Larry Putnam. Originally developed from analyses of ground-
based radar programs, the SLIM tool has been expanded to include other types of programs. It can
be customized for the user's development environment.
SLIM supports all phases of software development, except requirements analysis, as well as
all sizes of software projects, but was especially designed to support large projects.
Success in using SLIM depends on the user's ability to customize the tool to fit the software
development environment and to estimate both a Productivity Index (a measure of the software
developer's efficiency) and a Manpower Buildup Index (a measure of the software developer's
staffing capability). SLIM also provides a life cycle option which extrapolates development costs
into the maintenance phase.
A companion tool named SIZE Planner is also distributed by QSM and is used to estimate
the size of the software product.
QSM provides a training course and leases the tool via a time sharing service. There is also
a PC compatible version of SLIM that can be leased for a yearly fee.
SOFTCOST-R
SOFTCOST-R is a software estimating tool developed by Reifer Consultants, Inc. (RCI).
SOFTCOST-R is based upon the modeling work done by Dr. Robert Tausworthe of the Jet
Propulsion Laboratory. It contains a database of over 1500 data processing, scientific and real-time
programs. A key input is SLOC, which can be input directly or computed from Function Points.
SOFTCOST-R is applicable to all types of programs; however, it was specifically configured to
estimate real-time and scientific software systems, and considers all phases of the software
development cycle.
The tool’s primary input is SLOC; however, it also uses the same inputs and provides the
same outputs as COCOMO which allows direct comparisons to be made.
SOFTCOST-R has some unique inputs such as user and peer reviews, customer experience,
and degree of standardization. It also supports a standard WBS for task planning and scheduling.
RCI provides SOFTCOST-Ada, which is a tool to estimate Ada and C++ development
costs. Softcost-Ada is a cost estimation tool specifically developed to estimate systems using
object-oriented techniques.
RCI also has a separate estimating tool called ASSET-R to estimate the size of the software
product. SOFTCOST-R, SOFTCOST-Ada, and ASSET-R are leased on an annual license basis,
and require a PC compatible running DOS 2.3 or higher.
SYSTEM-4
SYSTEM-4 is marketed by Computer Economics, Inc. (CEI). It contains a proprietary
model that is based on the work of Jensen, Boehm, Putnam, and other noted software experts.
SYSTEM-4 is applicable to all types of programs and all phases of the software life cycle.
Inputs consist of size (SLOC), twenty environmental factors, seven development factors, software
type, and constraints. This tool comes with 23 predefined default parameter files. The default sets
provide typical values for all parameters except size. There are also seven parameter subset files for
various implementations of DOD-STD-1703, DOD-STD-2167A/498, and varying degrees of Ada
experience.
The user must select one of the default sets and input the SLOC estimate to perform a quick
estimate. SYSTEM-4 can accommodate multiple CSCIs or tasks, and each task can be broken
down into elements and units. There is a limit of 64 tasks, 64 elements, and 64 units. SYSTEM-4
can be customized to reflect the user's software development environment.
CEI has a companion software size estimating tool called Computer Economics
Incorporated Sizing (CEIS) System. These tools operate on PC compatible systems.
project managers as a major technical performance or productivity indicator that allows them to
track the project during software development.
The most commonly used method to estimate the size of a software product is by using both
expert judgment and the analogy method. The experts determine the functions the software will
perform and estimate the size by comparing the new system to completed projects with similar
characteristics. Often, parametric methods are then used to size the software more precisely.
Estimating tools using analogy methods compare the new program to similar programs of
known size. Because past projects are not always exactly like the new project, the estimate is
adjusted by a factor determined from experience. These tools accept characteristics of new
programs as inputs, then search a database for similar programs. The tools either list the similar
programs or provide an estimate of size based on an average of the size of the similar programs
selected from the database.
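The search-and-average step just described can be sketched in a few lines of Python; the attributes, database entries, and adjustment factor below are hypothetical and are not taken from any actual tool.

```python
# Sketch of an analogy-based size estimator. The attribute set,
# database entries, and adjustment factor are hypothetical.

# Historical database: (language, domain, delivered SLOC) of completed projects.
HISTORY = [
    ("Ada", "real-time",  42_000),
    ("Ada", "real-time",  55_000),
    ("C",   "scientific", 30_000),
    ("Ada", "real-time",  48_000),
]

def estimate_size_by_analogy(language, domain, adjustment=1.0):
    """Average the sizes of similar completed programs, then apply an
    experience-based adjustment factor for known differences."""
    similar = [sloc for lang, dom, sloc in HISTORY
               if lang == language and dom == domain]
    if not similar:
        raise ValueError("no similar programs in the database")
    return adjustment * sum(similar) / len(similar)

# A new Ada real-time program judged about 10% larger than its analogues:
print(round(estimate_size_by_analogy("Ada", "real-time", adjustment=1.10)))
# -> 53167
```

Actual tools match on many more attributes and may weight closer analogues more heavily; this sketch shows only the core lookup-and-average step.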
Expert judgment tools use the opinion of one or more experts to estimate the size of the
program, or use structured questions designed to extract judgment from the experts. These are the
rule-based or expert system tools.
Many tools use the algorithmic method by applying equations to determine size estimates.
A technique that is becoming very widely used is Function Point Analysis (FPA). One problem
with FPA is that it was developed for IS-oriented programs and does not take into consideration the
number or complexity of algorithms in scientific and real-time programs.
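To make the FPA method concrete, the sketch below computes an unadjusted function point count using the standard IFPUG average-complexity weights. A full FPA count also classifies each function as low, average, or high complexity and applies a value adjustment factor; the counts here are purely illustrative.

```python
# Unadjusted function point count using the IFPUG average-complexity
# weights. A full FPA count also rates each function's complexity and
# applies a value adjustment factor; this is a simplified sketch.

WEIGHTS = {
    "external_inputs":     4,
    "external_outputs":    5,
    "external_inquiries":  4,
    "internal_files":      10,
    "external_interfaces": 7,
}

def unadjusted_function_points(counts):
    """Sum each function type's count times its complexity weight."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

# Illustrative profile of an IS application:
counts = {
    "external_inputs":     20,
    "external_outputs":    15,
    "external_inquiries":  10,
    "internal_files":      6,
    "external_interfaces": 2,
}
print(unadjusted_function_points(counts))  # -> 269
```

The count is then converted to SLOC through a language expansion factor, which is where the limitation noted above bites: the count says nothing about the number or complexity of algorithms.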
Many software estimating tools such as REVIC and SLIM use extensions of the Program
Evaluation and Review Technique (PERT). PERT is based on a beta distribution of estimates
provided by the user and calculates expected size according to the equation:
Expected Size = (S + 4(M) + L)/6
where S, M, and L are estimates of the smallest size, most likely size, and the largest size,
respectively.
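The PERT expected-size equation above is simple enough to compute directly; the three-point estimates in this sketch are illustrative.

```python
def pert_expected_size(smallest, most_likely, largest):
    """Expected size under PERT's beta-distribution assumption:
    (S + 4M + L) / 6."""
    return (smallest + 4 * most_likely + largest) / 6

# Three-point SLOC estimates for a new program:
print(round(pert_expected_size(8_000, 12_000, 20_000)))  # -> 12667
```

Note that the result is pulled toward the most likely value but still reflects the spread between the optimistic and pessimistic estimates.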
ASSET-R
ASSET-R is a function point sizing tool developed to estimate the size of data processing,
real-time, and scientific software systems, and is marketed by Reifer Consultants, Inc. It utilizes a
knowledge-based system which extends the theory of function points into scientific and real-time
systems by considering issues like concurrence, synchronization, and reuse in its mathematical
formulation. The formulas use as many as nine parameters to develop function point counts. It also
couples function point and operand/operator counts with architectural, language expansion, and
technology factors to generate the size estimate. ASSET-R works with RCI's SOFTCOST-R and
SOFTCOST-Ada software estimation tools. It operates on PC compatible systems.
CA-FPXpert
CA-FPXpert is distributed by Computer Associates International, Inc. It uses FPA for size
estimation of IS type software projects, and conforms to accepted IFPUG standard counting
practices. It includes an on-line tutor to help with the function point counting process. CA-FPXpert
works in conjunction with CA-ESTIMACS to provide software size estimation input, and operates
on PC compatible systems.
CEIS
CEIS is marketed by Computer Economics, Inc. Estimates are generated by comparing the
attributes of the new project to the attributes of three reference projects of known size. The user
determines any six attributes that contribute to the number of lines of code and ranks them in order
of importance, then selects three reference projects of known size. Separate algorithms are used to
produce four independent estimates and to determine a level of confidence. CEIS works in
conjunction with SYSTEM-4, and operates on PC compatible systems.
SIZEEXPERT
SIZEEXPERT was developed by the Institute for Systems Analysis and is marketed by
Technology Application/Engineering Corporation. It is an expert judgment tool that
produces estimates of lines of code based on questions asked by COSTEXPERT. Both tools are
packaged and distributed together, and operate on PC compatible systems.
SEER-M
SEER-M is marketed by Galorath Associates and is available to government personnel
under an Air Force-wide contract. It produces software size estimates in lines of code or function
points. It also uses its own historical database in producing the size estimate. SEER-M works
with the SEER-SEM software estimating tool, and operates on PC compatible systems.
Appendix F includes other currently available cost models, and discusses in more detail the
most popular software estimating tools.
GLOSSARY OF TERMS
MODEL CALIBRATION
The act of calibration standardizes a model. Many models are developed for specific
situations and are, by definition, calibrated to that situation. Such models usually are not useful
outside of their particular environment. However, general cost estimating models including
commercially available ones such as the FAST, PRICE and SEER models (described earlier) are
designed to be useful as estimating tools for a wide range of situations. Calibration increases the
accuracy of one of these general models by temporarily making it a specific model for the product
to which it has been calibrated. Calibration is, in a sense, the customizing of a generic
model.
Items which can be calibrated in a model are: product types, operating environments, labor
rates and factors, various relationships between functional cost items, and even the method of
accounting used by a contractor. All general models should be standardized (i.e. calibrated), unless
used by an experienced modeler with the appropriate education, skills and tools, and experience in
the technology being modeled.
Calibration is the process of determining the deviation from a standard in order to compute
correction factors. For cost estimating models, the standard is historical actual cost.
The calibration procedure is theoretically very simple: the model is run with normal
inputs (known parameters such as software lines of code) against items for which the actual costs are
known. These estimates are then compared with the actual costs, and the average deviation
becomes a correction factor for the model. In essence, the calibration factor obtained is valid
only for the type of inputs that were used in the calibration runs. For a general total model
calibration, a wide range of components with actual costs need to be used. Better yet, numerous
calibrations should be performed with different types of components in order to obtain a set of
calibration factors for the various possible expected estimating situations. For instance, the PRICE
Software Model addresses this situation using internal productivity factors. These can be modified
by the calibration process.
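The calibration procedure described here reduces to a small loop: run the model against items with known actual costs, and turn the average deviation into a correction factor. The model and historical figures in this sketch are hypothetical.

```python
# Sketch of model calibration: the average ratio of actual cost to
# estimated cost over known items becomes a multiplicative correction
# factor. Model and data are hypothetical.

def calibration_factor(model, history):
    """history: list of (model_input, actual_cost) pairs."""
    ratios = [actual / model(x) for x, actual in history]
    return sum(ratios) / len(ratios)

# Toy uncalibrated model: cost ($K) proportional to SLOC.
model = lambda sloc: 0.05 * sloc

# Completed items: (SLOC, actual cost in $K).
history = [(40_000, 2_500), (60_000, 3_300), (25_000, 1_400)]

factor = calibration_factor(model, history)
calibrated = lambda sloc: factor * model(sloc)
print(round(factor, 3))  # -> 1.157
```

As the text cautions, the resulting factor is good only for items like those used in the calibration runs; a different class of inputs needs its own factor.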
Trends
Advances in languages, development methodologies, and other areas will have to be
addressed by future software cost estimating models and associated methodologies. As software
technology matures, changes in development and support concepts occur which will impact
software cost estimating. Concepts such as prototyping and spiral development present a challenge
to cost estimation since normal software development cycles are altered.
Artificial Intelligence (AI) represents a growing area of modern technology. Since AI is
software-intensive, proper management of AI software, including cost estimating, will be a
challenge for software managers. The development of software for expert system and other AI
applications will probably require a different development process.
The trend for the future will include better and more accurate ways of developing software
estimating methodologies for:
* Software size estimates.
* Resource Estimates for maintenance and support.
* Incorporating the effects of Ada and new paradigms such as rapid prototyping and fourth
generation languages.
* Modeling the dynamic interaction of variables that affect productivity, cost, and quality.
* Object-Oriented Design.
The trend in software estimating tools is to provide a whole family of models which not
only estimate the cost and effort of software development, but of hardware as well. The tools are being
upgraded to support higher-order languages such as Ada and C++. The most significant
improvement is the use of data collected from past software projects to customize the tool to an
organization's environment. This is especially true within agencies of DOD and NASA.
Conclusions
This chapter has discussed the software estimating process and the various methodologies
used in software estimation. The basic software estimating functional capabilities were also
discussed. A few different software estimating tools were examined.
A review of product literature and user manuals indicates that many tools will perform most
of the functional capabilities outlined in this chapter. The users generally agree that the tools they
are using satisfy their requirements.
The software estimating organization must be able to customize the software estimating tool
to its own software development environment. This requires collecting historical data from past
completed projects to accurately provide the inputs that the software tool requires. The software
development organization should establish a staff that is thoroughly trained in the use of the tools.
This staff should do all the software estimating activities and determine what data should be
collected to provide a historical database for future reference.
The use of two or more software estimating tools using different methodologies is
recommended. The software development organization should select a primary tool for software
estimating and an alternate tool for comparison and validation. These tools should be used
throughout the software development process. Parametric tools are considered to be the best for
software estimating for the following reasons:
* Equations are based on previous historical development projects.
* Outputs are repeatable and formulas can be analyzed.
* They can be customized to fit the user's environment.
* They require minimal time and effort to use.
* They are particularly useful in earlier phases of software development.
* They are most frequently used by DOD agencies.
CHAPTER VI
A MANAGEMENT VIEWPOINT
CHAPTER VI
The following paper written by Michael Thibault, Deputy Director, DCAA, provides
background from an auditor's point of view on auditing parametric cost estimating techniques.
Introduction
The U.S. Defense Contract Audit Agency (DCAA) was established in 1965. DCAA's
mission is to perform all necessary contract audits for the Department of Defense and, when
requested, to perform contract audit services on a reimbursable basis for other government
agencies. In essence, DCAA provides accounting and financial advisory services for procurement
and contract administration activities. Contract audit activities include providing professional
advice on accounting and financial matters to assist in the negotiation, award, administration,
repricing, and settlement of contracts. DCAA audits are conducted in accordance with generally
accepted government auditing standards established by the General Accounting Office.
This paper discusses the government's expectations when defense contractors use
parametric cost-estimating relationships for estimating government contract costs. The paper also
emphasizes that companies must meet all the adequacy criteria set out in the Federal Acquisition
Regulations (FAR) and applicable supplements to obtain approval for their estimating systems.
Companies must apply the same criteria to their parametric cost-estimating relationships to ensure
they are acceptable for use in estimating systems.
DCAA believes now, as it has always believed, that parametric estimating techniques using
cost-estimating relationships are acceptable in the appropriate circumstances for proposing costs on
government contracts. DCAA is ready and willing to work with industry in the evolution of
parametrics. Operation Desert Shield and Operation Desert Storm dramatically demonstrated that
our government must be capable of responding quickly to changing procurement requirements.
Parametric systems can help us do just that. Future estimating systems must be responsive,
accurate, and cost effective.
Background
DCAA was issuing official guidance on parametric systems as early as 1978. Parametrics
was broadly defined as a technique that employs one or more cost-estimating relationships to
estimate costs associated with developing, manufacturing, or modifying an end item. In the 1980s,
DCAA auditors reported an increase in the number of contractors using parametric cost estimating.
DCAA developed and issued audit guidance to assist the field auditors in this new area. Studies
conducted by DCAA; the Office of the Secretary of Defense Cost Analysis Improvement Group;
Headquarters, Air Force Contract Management Division; Headquarters, Aeronautical Systems
Division; and the Space Systems Cost Analysis Group provided the basis for DCAA audit guidance
issued in 1982.
Parametric Criteria
This guidance was also the subject of an article written for the Spring 1982 issue of Journal
of Parametrics published by the International Society of Parametric Analysts. Charles O. Starret, Jr.,
then-Director of the Defense Contract Audit Agency, wrote the article entitled "Parametric Cost
Estimating -- An Audit Perspective." The guidance contained in that article is essentially the same
as the guidance given to DCAA auditors today. It reiterates DCAA's long-held view that
parametrics is an acceptable estimating technique. The 1982 article included the criteria a
contractor should apply before submitting a contract price proposal using parametrics. The criteria
are still on point today, and they are:
1) Logical relationships
2) Significant statistical relationships
3) Verifiable data
4) Reasonably accurate predictions
5) Proper system monitoring
Logical Relationships
Verifiable Data
There must be a system in place for verifying data used for parametric cost-estimating
relationships. In many instances, the auditor will not have previously evaluated the accuracy of
noncost data used in parametric estimates. For monitoring and documenting noncost variables,
contractors may have to modify existing information systems or develop new ones. Information
that is adequate for day-to-day management needs may not be reliable enough for contract pricing.
Data used in parametric estimates must be accurately and consistently available over a period of
time, and easily traced to or reconciled with source documentation.
system is as accurate as a discrete estimating system, then the government has increased assurance
of receiving a fair and reasonable price.
As with any estimating relationship derived from prior history, it is essential for the
contractor to document that the work being estimated using parametric cost-estimating relationships
is comparable to the prior work from which the parametric data base was developed.
3) Determine that the supporting cost and pricing data for the individual proposal was derived
in accordance with the contractor's estimating system and is in compliance with applicable
regulations.
The auditor plans the audit scope using what is known about the contractor. For example,
the audit scope will vary depending upon the estimating methods the contractor uses. In addition,
the auditor will consider the following types of questions:
* What is the dollar amount and type of contract contemplated?
* Has the contractor established strong internal controls and sound accounting and
estimating systems?
* What kinds of testing does the contractor do to ensure compliance with these systems?
* What does our prior audit experience tell us about the contractor's internal controls or
estimating practices?
Audit planning requires the auditor to answer all of these questions and to make a
determination regarding the government's risk. Judgment is then exercised in deciding the degree
of risk that the estimate could be materially misstated. This assessment of risk will be used to
decide what audit procedures to employ.
The auditor identifies the method of estimating the contractor uses to determine the kind of
support that should be available. A contractor could be using any or all of the following methods:
1) Detailed -- also known as the bottom-up approach. This method divides proposals into
their smallest component tasks, which are normally supported by detailed bills of material.
2) Comparative -- develops proposed costs using like items produced in the past as a
baseline. Allowances are made for product dissimilarities and changes in such things as
complexity, scale, design, and materials.
3) Judgmental -- subjective method of estimating costs using estimates of prior experience,
judgment, memory, informal notes, and other data. It is typically used during the
research and development phase when drawings have not yet been developed.
Parametric estimating techniques may be used in conjunction with any of these methods.
Whatever the method selected by the contractor, it must comply with applicable laws and
regulations. The laws and regulations most often encountered in dealing with parametrics are:
1) CAS 401, "Consistency in Estimating, Accumulating, and Reporting Costs"
2) Truth in Negotiations Act (TINA)
3) FAR 15.800, Price Negotiation
4) DFARS 215.811, "Estimating Systems"
Cost Accounting Standards (CAS) provides guidance in accounting for contract costs at
larger contractors. CAS 401 requires that a contractor's estimating practices be consistent with
those governing the accumulation and reporting of costs during contract performance. Some
contractors see parametrics as being inconsistent with CAS 401. Contractors must ensure both cost
and noncost information used in estimating is separately accumulated and reported as required by
CAS 401.
The purpose of the Truth in Negotiations Act, 10 U.S.C. 2306(f), is to provide the
government with all facts available to the contractor at the time it certified the cost or pricing data
was accurate, complete, and current. Parametric estimates must meet the same basic disclosure
requirements under the act as discrete estimates. Although the principles are no different, proposals
supported in whole or in part with parametric estimating will have different types of cost or pricing
data than traditional discrete cost estimates.
Fundamental to the definition of cost or pricing data are "all facts ... which prudent buyers
and sellers would reasonably expect to have a significant effect on price negotiations" (FAR
15.801). Reasonable parallels may be drawn between the data examples provided in FAR for
discrete estimating approaches and the type of data pertinent to parametric estimating approaches.
The contractor is also expected to provide all factual data for the parametric cost estimates. This
data must be accurate, complete, and current.
Many contractors use parametric cost estimating for supplementary support or validation of
estimates developed using other methods. This requires judgment in selecting which data will be
used in developing the total cost estimate relied upon for the price proposal. In distinguishing
between fact and judgment, FAR states the certificate of cost or pricing data "does not constitute a
representation as to the accuracy of the contractor's judgment on the estimate of future costs or
projections. It does apply to the data upon which the contractor's judgment or estimate was based"
(FAR 15.804-4b). Thus, if a contractor develops a proposal using both parametric data and discrete
estimates, it would be prudent to disclose all pertinent facts to avoid later questions about
completeness of the submission.
Auditors are also required to evaluate estimating systems of major Department of Defense
(DoD) contractors. This includes ensuring that, if parametric estimating procedures are part of the
estimating system, they are properly disclosed. Of key concern to the auditor in evaluating the
estimating systems' disclosure of parametric procedures are the following:
1) Do the procedures clearly establish guidelines for when parametric techniques would be
appropriate?
2) Are there guidelines to ensure the consistent application of estimating techniques?
3) Is there proper identification of sources of data and the estimating methods and the
rationale used in developing cost estimates?
4) Do the procedures ensure that relevant personnel have sufficient training, experience,
and guidance to perform estimating tasks in accordance with the contractor's established
procedures?
5) Is there internal review of and accountability for the adequacy of the parametric
estimating techniques, including the comparison of projected results to actual results
and an analysis of any differences?
DCAA believes parametric estimating approaches are acceptable when they are properly
implemented. Auditors encounter it most often as a technique used in conjunction with other
estimating methods. For example, parametrics are often used for estimating costs of scrap and other
such factors. The majority of the proposals audited are not developed solely based on parametric
estimating techniques. The use of parametric estimating is most appropriate in circumstances
where historical data is not available, such as when the program is at the engineering concept stage,
or when no bill of materials exists and the program definition is unclear. Contractors with good
parametric cost estimating systems analyze their prospective proposal to determine the appropriate
estimating technique for each part of the work breakdown structure.
Contractors should be satisfied that implementation and monitoring costs do not outweigh the
benefit of reduced estimating costs. Moreover, it is critical that the environment is appropriate for
the use of parametrics. It would not be prudent to rely exclusively on parametric techniques to
estimate costs when directly applicable historical cost data are available. Such is the case of
follow-on production for the same or similar hardware. Contractors manufacturing mature weapon
systems already have a record of the actual costs. The use of parametric estimating may be
appropriate for certain aspects of follow-on production, however, the contractor should disclose any
data that may have a significant impact on cost. The exclusive use of parametrics is generally not
appropriate for economic forecasting of such elements as labor and indirect cost rates. For
parametric estimates, the contractor must ensure that any changes in accounting practices are
accounted for in the estimate and that labor and indirect cost rates are appropriately applied.
Another problem encountered is contractors failing to properly disclose their parametric
estimating practices. Auditors have experienced instances where the first time a parametric model is
disclosed is during the evaluation of the proposal. This is often too late. As mentioned earlier, larger
DoD contractors have an obligation to disclose in writing their estimating procedures. Making
proper and timely disclosure will minimize problems and expedite the negotiation process.
Contractors can take the lead in helping to streamline the oversight process. In a few words,
they should practice self-governance! DCAA has been a leading proponent of the self-governance
program. Self-governance is intended to encourage contractors to establish and maintain good
systems of internal control in key areas, including estimating systems. It requires contractors to
provide their own oversight -- to detect system weaknesses and take corrective action as necessary.
This initiative recognizes that prudent contractors already have the means in place to ensure their
operations are efficient and cost effective.
DCAA has another initiative that contractors should consider as a part of streamlining the
oversight process. It is called "coordinated audit planning." DCAA defines coordinated audit
planning as a voluntary process wherein the DCAA auditor and the contractor's internal and external
auditors consider each other's work in determining the nature, timing, and extent of his or her own
auditing procedures. Coordinated audit planning considers the extent to which reliance can be placed
upon work performed by the other auditor to minimize duplication of audit effort. In addition, this
process strengthens the evaluation of internal control systems.
Understanding estimating systems controls, assessing risk, and transaction testing are
common objectives of DCAA, and the internal and external auditors associated with contractors.
This often results in duplicative audit procedures. In the coordinated audit planning conducted to
date, DCAA found that the instances of duplication are significant. DCAA auditors are more than
willing to rely on the work done by contractor internal and external auditors, providing DCAA has
the opportunity to evaluate and test their work.
Contractors are expected to establish and maintain reliable estimating systems. Departmental
procurement officials, Members of Congress, and the average American citizen hold Defense
contractors to high and exacting standards. The funds involved can be enormous.
The expectations are equally high for the auditor whose job it is to protect the taxpayer's
interest. The DCAA auditor must comply with the American Institute of Certified Public
Accountants' Statements on Auditing Standards and the GAO's Government Auditing Standards.
These standards require that the auditor be independent in fact and in appearance. They also require
the auditor exercise a healthy degree of professional skepticism. These requirements, however,
should not render the DCAA auditor and the contractor enemies. Such polarized and adversarial
relationships are dysfunctional and not in either party's best interest.
Both parties are taking significant actions to improve relationships. These changes are
producing a culture change that is very positive: positive because contractors are beginning to more
fully accept their responsibilities; positive because auditors are more effectively communicating audit
plans and objectives.
Defense procurement is taking on a new, streamlined look in the 1990s. Government and
industry are both concerned with quicker, less costly means of procuring goods and services. This is
one reason why parametric methods continue to stir up so much interest. Parametric techniques,
properly applied, can assist contractors and the government alike in streamlining the acquisition
process. In addition, the ability of the government to place greater reliance on contractor oversight of
contractor systems will also help meet procurement needs in a more timely manner.
Summary
In today's and tomorrow's procurement environment, a great challenge facing us all is the
development of a cooperative work climate conducive to quickly acquiring quality products and
services at fair and reasonable prices. New estimating techniques such as parametrics can cut
estimating costs.
Adequate estimating systems, fully supported and self-governed by industry, can cut audit
costs. Quick estimating and audit turnaround times can cut procurement costs. All of this, however,
requires communication and teamwork. Whether our buying effort is for innovative space equipment
or for recurring maintenance, we must all work to meet the challenge.
Mike Thibault's message to the parametric community is clear: Parametric techniques are
acceptable cost estimating methodologies provided the following five criteria are met: the
relationships are logical, the data are verifiable, a significant statistical relationship (correlation)
exists, the techniques produce reasonably accurate predictions, and the system is properly monitored.
He goes on to say that the auditor needs to consider the adequacy of the parametrics cost
estimating system and related internal controls, which includes: the audit trail, sufficiency of
documentation, currency and sources of data, procedures for calibration and validation, and the
appropriateness of parametrics use. Since historical data is normally used as the basis for all
estimating, the auditor will use basic auditing techniques to verify that costs are current, accurate, and
complete. Mr. Thibault completely dispels the myth that the audit community would not accept the
use of appropriate parametric cost estimating techniques for firm business proposals.
DCAA's complete audit guidance is included in its Contract Audit Manual, Chapter 9-1000.
This chapter is included in its entirety in Appendix C.
In any case, when an offeror uses estimating techniques other than a consolidated priced bill of
material to estimate material costs, the offeror must adequately describe the techniques being used,
provide sufficient support to allow for an independent evaluation, and explain why the techniques
used are the best in the circumstances in order to comply with the material cost criteria included in
Table 15-2.
DFARS 215.811-70 delineates attributes of an adequate estimating system. These are:
(1) Establishes clear responsibility for the preparation, review, and approval of cost
estimates.
(2) Provides a written description of the organization and duties of personnel responsible
for ... contributing to the estimating process ...
(3) Ensures that relevant personnel have sufficient training, experience and guidance ...
(4) Identifies sources of data and the estimating methods and rationale used in developing
cost estimates.
(5) Provides for appropriate supervision ...
(6) Provides for consistent application of estimating techniques.
(7) Provides for detection and timely correction of errors.
(8) Protects against cost duplication and omissions.
(9) Provides for the use of historical experience, including vendor pricing information
where appropriate.
(10) Requires use of appropriate analytical methods.
(11) Integrates information available from other management systems as appropriate.
(12) Requires management review [of the estimating system]
(13) Provides for internal review of and accountability for the adequacy of the estimating
system, including the comparison of projected results to actual results and an analysis of
any differences.
(14) Provides procedures to update cost estimates in a timely manner.
(15) Addresses responsibility for review and analysis ... of subcontract prices.
Act) issue. A well calibrated and validated parametric estimating system will be compliant in all
respects. The DFARS goes on to list indicators of or conditions that may cause significant estimating
deficiencies (from 215.811-77). Three in particular stand out:
(1) Failure to ensure that relevant historical experience is available to and utilized by cost
estimators, as appropriate.
(2) Consistent absence of analytical support for significant proposed cost amounts.
(3) Excessive reliance on individual personal judgment where historical experience or
commonly used standards are available.
and is truly a CER. Parametric costing is done all the time. An FPRA for a parametric model is a
natural next step.
Therefore, when repetitive use of single CERs or parametric cost models is envisioned by the
contractor, an FPRA should be established. Procedures for establishing such a FPRA should follow
the basic guidelines listed in FAR Part 15.809-3 and should include the following:
a. Timely submission by the contractor, at least yearly, of an adequately supported FPRA
proposal.
b. Requirement for clear definitions of all elements of cost (dependent variables) which
will be developed by the CERs included in the FPRA and the applicable bases
(independent variables). Additionally, the proposal must indicate the period of
applicability of the proposed CERs and the conditions under which the CERs will be
used in price proposals.
c. The proposed CERs or parametric cost models should be evaluated in terms of the ten
criteria listed in Table VI-1. The answer to the majority of these questions should
be affirmative to support the need for a CER or cost model.
d. Tracking of the negotiated CERs and parametric cost models included in the FPRA is
required to test their validity as estimating tools. Guidelines for monitoring FPRAs are
included in Part 15.809-5 and the DCAA Contract Audit Manual (Vol. 1, Chapter
9-200).
It is the contractor's responsibility to identify the rate(s) to be included under the umbrella of
the FPRA or FPA and to specify the latest data already submitted in accordance with the agreement.
All data submitted in connection with the agreement, updated as necessary, form a part of the total
data that the offeror certifies to be current, accurate, and complete at the time of agreement on price
for an initial contract or for a contract modification.
TABLE VI-1
Cost Benefit             Is it more cost beneficial to have a CER for the cost
                         element than to generate a discrete estimate?
Customer Acceptance      Is the CER accepted by the PCO, DPRO, the CER owner,
                         etc.?
Effective Implementation Are the persons doing the modeling/analysis sufficiently
                         trained and experienced to use the tools correctly?
With the proliferation of parametric cost estimating models and tools, both commercial
models and “home grown” versions, it is impossible to describe what to look for in every model
and cost parameter. However, some generalizations can be made. The following is a summary,
since most of this material has already been covered elsewhere in the Handbook.
As a cost estimator or analyst, knowing and understanding the audit guidance described in
this chapter is a good start. If you follow the steps delineated earlier in this chapter, the cost
analysis or audit battle is more than half won.
If, as an analyst, you are confronted with a parametric cost estimate, you should take a
few steps to ensure a fair review. These are:
♦ Understand the cost model used. Do not hesitate to ask questions and to consult with the
experts. Most commercial model builders welcome calls from users, analysts and
auditors. You do not have to have a user’s understanding of the model, just a general
knowledge about how it works. Remember, there’s no such thing as a stupid question.
♦ Review the program inputs to the model. Is the schedule correct? Does the WBS
adequately describe the product being estimated? Is anything missing? Are there
duplications? Does the WBS follow the statement of work?
♦ Review the technical inputs to the model with your own engineers or technical
department, and your program office. Check them for reasonableness and benchmark
them against your own experience and the experience of the resident experts.
♦ Understand the model’s cost drivers. Generally, a few select parameters are the
predominant factors that drive cost. Many of the others are just
“tweaks” to the model. Concentrate on the cost drivers when performing your analysis.
You may wish to combine the other minor “environmental” factors together as one
factor.
♦ Be aware of the assumptions the cost estimator made when he/she built the model. Are
they reasonable? Would you have made the same assumptions? If not, why not?
♦ Be knowledgeable of the historical cost basis for the model, if any. Be sure to review the
source documentation. Be wary of any model used that has no basis in historical cost.
How was the data “tuned” or normalized? Were data outliers disregarded? If so, why?
Was the model calibrated? How was the calibration performed? Were any universal or
“standard” cost factors used? Would they be applicable in this case?
♦ Question how the future environment might be different from the historical case. Have
these differences been accounted for? Would you account for them differently? If so,
how?
♦ Is the estimated program expected to push the state of the technology (state of the art)?
How is this planned to be accomplished? How is it accounted for in the estimate? Is it
seriously considered, or glossed over? Advances in the state of the art normally have
significant impacts on the cost drivers. Does this show up in the parametric estimate, or
are the historical factors used without adjustment? It is important to use technical experts
on this item. For instance, if a contractor proposes using a technology that has yet to be
proven in the laboratory, program trouble may lie ahead and result in inadequate
estimates. Consult with your technical people on this and related technical concerns.
♦ Review the track record of the estimator or the estimating organization. What is their
past performance? Have their estimates been reliable?
♦ Understand the economics involved with the model. What are the effective costing rates
used? Are they reasonable? Do they reflect the organization and skill levels being
estimated?
♦ Identify what cost factors have been “tweaked” and why. Focus on the “big ticket” items.
Using expert opinion and “rules of thumb,” are any significant cost factors outside the
range of reasonableness? For instance, it is very easy to calibrate a software cost model’s
cost (hours or dollars) per line of software code. However, different models may define a
software line of code differently, so it is vital to understand the definitions used in the
model you are evaluating. Is the CER reasonable for the estimate? If the model
violates your concept of reasonableness, then question it.
♦ Last, don’t be intimidated. Most (especially commercial) models are quite user friendly
and the model builders are helpful. Other people have learned how to use them, and you
are as smart as they are. The most difficult part is learning the language and gaining the
experience over time to develop your skills.
Rules Of Thumb
Rules of thumb are general rules, CERs or other cost factors that are widely used, but do
not apply precisely to any specific case. Each estimating organization will have its own peculiar
set. Most long-time estimators can develop a rapid estimate (a ROM) based on their own
personal rules of thumb. These rules come from their own experience. Examples of rules of
thumb are the “default” factors that populate many parametric models including the commercial
ones. If no factor input is available, the model uses a universally derived factor - one taken from
a universal data base or expert opinion.
Although there is nothing wrong with the use of these rules per se, the danger is evident.
The rules are only norms or benchmarks, and they will rarely apply exactly to a specific estimate. Their
value lies in the comparison of a factor in a model to the universal case. Too much deviation
should be investigated. Why is the actual case so different from the rule? Or, stated differently,
why doesn’t the rule apply here?
For instance, if you believe that a line of software code (for a particular type of software)
should take about one-half hour to write (on average), and the parametric model indicates two
hours, then some investigation is in order. Although either CER could be the correct one, there is
far too much deviation from the norm, and suspicions are aroused. However, if the model
indicates six tenths of an hour that compares to your one-half hour, the rule has performed a
“sanity” check for you. But the use of the rule does not preclude a thorough cost analysis if you
have identified a cost driver.
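One way to apply such a benchmark comparison is a simple ratio test. The 2x threshold and the factor values below are illustrative assumptions, not fixed guidance — each organization should set its own tolerance:

```python
# Hypothetical sanity check: flag a model factor that deviates too far
# from a benchmark rule of thumb. The 2x threshold is an assumption.
def sanity_check(model_factor, rule_of_thumb, max_ratio=2.0):
    """Return True if the factor is within max_ratio of the benchmark."""
    ratio = max(model_factor, rule_of_thumb) / min(model_factor, rule_of_thumb)
    return ratio <= max_ratio

print(sanity_check(0.6, 0.5))   # True  -- 0.6 hrs/DSI vs a 0.5 benchmark
print(sanity_check(2.0, 0.5))   # False -- a 4x deviation warrants investigation
```

This mirrors the text's example: six tenths of an hour against a half-hour benchmark passes the check, while two hours does not.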
Rules of thumb are important, but they must be used wisely. Develop your own list. This
can be accomplished through experience, or by consulting with experts in the field. Many rules
exist embedded within commercial parametric models, and are available to the users and
modelers. In any event, it is important to have a list of benchmark factors. You’ll probably need
a different list for each organization you deal with. Organizational and product differences will
demand it. But remember that there are no shortcuts or magic formulas, so use your rules as a
quality check, or for quick, rough order of magnitude estimates.
An Example
What follows is an example of some of the above discussion. The example here uses the
software cost estimating model, REVIC.
The fundamental REVIC equation is: Man Months = A × (KDSI)^B × ΠFi, where,
A and B are coefficients of several Software Development Modes (for instance,
embedded, organic, Ada);
KDSI is Thousand Delivered Source Instructions; and
ΠFi is the product of various environmental factors the model uses to adjust the
cost estimates. These environmental factors include, for example: analyst and
programmer capabilities, applications experience, programming language, storage
constraints, requirements volatility, reliability requirements, data base size,
product complexity, the use of modern programming practices and software tools,
platform (airborne, space, etc.), schedule compression, etc. Each factor is given a
factor value from very low through nominal to high.
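The equation above can be sketched in a few lines of code. The coefficients A and B and the factor product below are illustrative placeholders, not REVIC's published values:

```python
import math

# Sketch of the Man Months equation: A * (KDSI)^B * product of factors.
# A and B here are assumed mode coefficients, not REVIC's actual values.
def man_months(A, B, kdsi, factors):
    """Effort in man-months from size (KDSI) and environmental factors."""
    return A * (kdsi ** B) * math.prod(factors)

# 1.337 KDSI with an environmental factor product of 0.87
# ("13% less than nominal"), under the assumed coefficients.
effort = man_months(A=3.6, B=1.2, kdsi=1.337, factors=[0.87])
print(round(effort, 1))  # 4.4
```

Because B is greater than 1, effort grows faster than size — which is why the DSI input is such a dominant cost driver in models of this family.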
What would you do with it? The basis of estimate (BOE) tells you this: “The Integrated
Management System Evaluation Package for Release 1 (IMS EP 1) requires 1337 new lines of
embedded source code of new algorithms and uses complex interfaces. Our superior
programming staff using state-of-the-art software tools, modern programming practices, and
possessing significant application experience allows the product of the environmental factors to
be less (13%) than nominal.” What do you notice about this estimate?
First, this particular estimate is relatively small (4.1MM). You may wish to move on to
the more important cost issues. However, you decide to “spot check” this one. You know very
little about REVIC, so you turn to the resident expert for advice. He/She confirms that the
technical description in the BOE is correct, so the model’s values for A and B are the correct
ones. You make a quick calculation to compare the estimate with a rule of thumb supplied by
your resident expert.
4.1 MM × 155 hrs/MM ÷ 1337 DSI = 0.475 hrs/DSI
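The conversion above, spelled out (the 155 hours-per-man-month factor is the resident expert's rule of thumb, not a universal constant):

```python
# Convert the 4.1 man-month estimate to hours per delivered source
# instruction, using an assumed 155 hours per man-month.
effort_mm = 4.1
hours_per_mm = 155
dsi = 1337

hrs_per_dsi = effort_mm * hours_per_mm / dsi
print(round(hrs_per_dsi, 3))  # 0.475
```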
Your resident expert informs you that he believes this factor to be quite low, possibly by a
factor of two. He also indicates that the DSI number needs to be verified for accuracy, since it is
a major cost driver for this estimate. He is also suspicious of the “....less (13%) than nominal”
environmental factors product. (He believes nominal here should equal at least 1.5 because a
space platform is partially involved.) Therefore 13% appears to be a mistake of some kind, but it will
have to be investigated. He also argues that since this is a competitive proposal that’s almost
80% software, all software estimates may be “aggressive,” and should be carefully reviewed.
The model does not appear to have been calibrated except for the vague references “....superior
programming staff....” in the BOE. The product of the environmental factors seems to be based
upon judgment, although likely an educated one.
You decide to carefully review the other REVIC cost estimates in the proposal, so you
borrow a REVIC User’s Guide in order to become more proficient.
Although this is a relatively easy example, it demonstrates the general idea of what to
look for in a parametric estimate.
CHAPTER VII
BUSINESS APPLICATIONS OF
PARAMETRIC ESTIMATING
INTRODUCTION
Why do programs die? Why do CEOs, Boards of Directors, DoD, or Congress kill or
cancel technically superior projects? There are many examples, and, generally, the cause is not
technical viability. Even though millions or even billions are lost, the program graveyard claims
many projects. Sommers and Fad suggest that these events happen partly because of inattention
to future cost issues during program development. There is a need to recognize and react early to
help solve cost problems. Technical success can often mask future cost issues.
The authors propose that an organization responsible for costing use parametric techniques
early on, and that cost and technical objectives be treated together. Parametrics are perfect for cost
estimating during a program's conceptual phase, and the tools connect technical and cost parameters
together in a very effective way.
New business development actions can attack the problem. These actions include:
1) Parallel and coordinated development of technical and programmatic (cost, schedule and
quantity) concepts from requirements.
2) Iterative and continuous cost improvement.
3) Development of cost targets (bogeys) as the result of 1 and 2, above.
4) A review/reconciliation process that focuses on cost/risk analysis.
The authors go on to suggest that organizations not put off the cost estimating activity, and
that cost analysts immediately begin to explore technical/cost alternatives. They also suggest that
the parametric function belong to an independent department rather than reside within a function
outside the organization that owns new business development.
To effectively integrate parametrics into an organization, upper management needs to be
involved. Upper management involvement will ensure the creation of a cooperative team spirit, and
more effective management and customer acceptance, including estimate credibility.
A parametric approach can be used to calibrate production cost and then estimate future
production buy(s) and options using a commercial model. The benefit here is a simpler cost
proposal with no bill of material and no labor hours roll-up. The model can be calibrated and
estimates derived at any level of the WBS, hence useful for spare parts pricing. In either case, the
CERs or cost models can be used as the basis of estimate, and delivered to a proposal pricing
system via a post processor, or a spread sheet. In either event, the cost estimating process is greatly
simplified, and resource economies of scale can be realized.
Manufacturing Support Labor: 17,280 hours, generated by 1 Mechanical Engineer, 1
Quality Engineer, and 1 Test Engineer at each of the three plant locations.
Engineering Support Labor: 7,680 hours generated by 2 Engineers at Plant A, and 1 Engineer
each at Plants B and C.
Tooling Costs: $15,000
Test Equipment Costs: $25,000
* The 15 hours for the 5 parts were adjusted to account for performance differences between Plants B and C.
Performance factor (inverse of efficiency) at Plant B was 1.5. The current performance factor at Plant C is
1.75. Adjustment: 15 hours divided by 1.5 times 1.75 equals 17.5.
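The footnote's adjustment generalizes to a small helper: divide out the performance factor of the plant where the history was collected, then apply the factor of the plant doing the new work. The plant factors are those given in the footnote:

```python
# Re-base historical hours from one plant's performance factor
# (inverse of efficiency) to another's, per the footnote above.
def adjust_hours(hours, from_factor, to_factor):
    """Remove the source plant's factor, then apply the target plant's."""
    return hours / from_factor * to_factor

print(adjust_hours(15.0, from_factor=1.5, to_factor=1.75))  # 17.5
```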
The proposed 84-lot represents 60% of a halving of the baseline 120-lot (84 units is 60% of
the way from 120 down to 60). Therefore, the support labor adjustment is -15% (60% × 25% = 15%).
9.00 People Baseline
7.65 People Baseline Adjusted (-15%)
14,688 Hours Proposed support labor @ 1920 hours per person/year
Purchased Parts:
Base $78,609 × 84 Systems = $6,603,156
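The lot-size arithmetic above can be sketched as follows, assuming, as the example implies, a 25% support labor reduction for a full halving of the lot:

```python
# Lot-size adjustment: 84 units is 60% of the way from the 120-unit
# baseline to a full halving (60 units). A full halving is assumed
# to reduce support labor by 25%.
baseline_people = 9.00
halving_reduction = 0.25                        # assumed, per the example
progress_to_halving = (120 - 84) / (120 - 60)   # 0.60

adjustment = progress_to_halving * halving_reduction   # ~0.15
people = baseline_people * (1 - adjustment)            # ~7.65 people
hours = people * 1920                                  # ~14,688 hours
parts = 78609 * 84                                     # $6,603,156

print(people, hours, parts)
```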
Parametric techniques can sometimes be used for estimating spares and change order
proposals. Using a calibrated system model for spares, select the WBS element that represents the
spare part(s) that need to be estimated. Then exercise the model at that point, either a commercial
version or one developed for the specific program. The estimate can be exercised very quickly, and
the calibrated model will yield accurate results.
Value Engineering Change Proposals (VECPs) or other change proposals have to be accounted
for by adjusting the actuals or applying a delta to the current estimate. The comparative approach
is normally well supported by history and as accurate as any other technique. It is much easier to
do "what if" analysis and compute option prices, and the proposals are less complex, smaller in
volume and more "user friendly", with significantly reduced preparation, review, audit and
negotiation time.
Table VII-1 indicates the potential parametric applicability for various firm business
proposals. Appendix G contains a parametric estimating system checklist.
KEY:
VA = Very Applicable; NVA = Not Very Applicable; MBA = May Be Applicable
TABLE VII - 1
APPENDIX A
Algorithmic Models - (also known as Parametric Models) produce a cost estimate using one or
more mathematical algorithms that use a number of variables considered to be the major cost
drivers. These models estimate effort or cost based primarily on hardware/software size and
other productivity factors known as cost driver attributes.
Analogy - A method of estimating developed on the basis of similarities between two or more
programs, systems, items, etc.
Analogy Models - use a method of estimating that involves comparing a proposed project with
one or more similar and completed projects where costs and schedules are known. Then,
extrapolating from the actual costs of completed projects, the model(s) estimates the cost of a
proposed project.
Analometric - A recent term meaning a method of combining the analogy and parametric
estimating methods to form a cost estimate when only 2 relevant data points are available. It is
usually combined with the "D" Factor to adjust (up or down) complexity from the 2 point CER
regression line.
Analysis - a decision-making context involving time horizons extending into the future. A
concrete set of specifications is not usually available. There could be many major uncertainties,
and a wide range of alternatives, each having several configurations. Analysis is usually
concerned with new equipment proposals and new methods of operation of systems never
produced before. The objectives of analysis are to find significant differences in resource
requirements among alternatives, and to determine how resource requirements for any
alternative change as key configuration characteristics vary over their relevant ranges. It is often
a "sensitivity" type of investigation.
Analysis (NES Dictionary) - a systematic approach to problem solving. Complex problems are
simplified by separating them into more understandable elements.
Annual Change Traffic (ACT) - the fraction of a software product's source instructions which
undergo change during a year, either through addition or modification. The ACT is the
quantity used to determine the product size for software maintenance effort estimation.
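A minimal sketch of the definition, using hypothetical size figures:

```python
# ACT: fraction of source instructions added or modified in a year.
# The KDSI figures below are hypothetical.
def annual_change_traffic(total_kdsi, added_kdsi, modified_kdsi):
    """Fraction of the product's source instructions changed in a year."""
    return (added_kdsi + modified_kdsi) / total_kdsi

act = annual_change_traffic(total_kdsi=100.0, added_kdsi=8.0, modified_kdsi=7.0)
maintenance_kdsi = act * 100.0  # size input for maintenance effort estimation
print(round(act, 2), round(maintenance_kdsi, 1))  # 0.15 15.0
```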
Anomalies - variances in cost related data caused by an unusual event(s) not expected to recur in
the future.
As Spent Dollars - the cost, in real year dollars, of a project recorded as the dollars were spent
without normalization for inflation.
Audit - the systematic examination of records and documents and the securing of evidence by
confirmation, physical inspection, or examination.
Audit Trail - information allowing the data being used in an estimate to be tracked back to the
original source for verification.
Benefit (NES Dictionary) - Result(s) attained in terms of the goal or objective rather than in
terms of output.
Benefit Cost Analysis (NES Dictionary) - an analytical approach to solving problems of choice.
It requires: (a) the definition of objectives; (b) identification of alternative ways of achieving
each objective; and (c) the identification for each objective or alternative which yields the
required level of benefits at the lowest cost. It is often referred to as cost-effectiveness analysis
when the benefits of the alternatives cannot be quantified in terms of dollars.
Bottom-Up Models - use a method of estimation that estimates each component of the project
separately, and the results are combined ("Rolled Up") to produce an estimate of the entire
project.
Calibration - in terms of a cost model, a technique used to allow application of a general model
to a specific set of data. This is accomplished by calculating adjustment factor(s) to compensate
for differences between the referenced historical costs and the costs predicted by the cost model
using default values.
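A minimal sketch of this idea — the calibration factor reconciles the model's default-value prediction with referenced history, and is then applied to future outputs. The cost figures below are hypothetical:

```python
# Calibration: adjustment factor compensating for the difference between
# referenced historical cost and the model's default-value prediction.
def calibration_factor(historical_cost, predicted_cost):
    """Multiplier applied to future outputs of the general model."""
    return historical_cost / predicted_cost

factor = calibration_factor(historical_cost=1200.0, predicted_cost=1000.0)
print(factor)            # 1.2
print(factor * 500.0)    # calibrated estimate for a new model prediction
```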
Constant Dollars - Computed values which remove the effect of price changes over time
(inflation). An estimate is said to be in constant dollars if costs for all work are adjusted to
reflect the level of prices of a base year.
Contract Work Breakdown Structure - Contract work breakdown structure is defined as the
complete work breakdown structure for a contract, i.e., the DoD approved work breakdown
structure for reporting purposes and its discretionary extension to the lower levels by the
contractor, in accordance with this standard and the contract work statement.
Cost (Fisher) - "Economic costs" are benefits lost. It is for this reason that economic costs are
often referred to as "alternative costs" or "opportunity costs." It is in alternatives, it is in
foregone opportunities, that the real meaning of "cost" must always be found. The only reason
you hesitate to spend a dollar, incidentally, is because of the alternative things that it could buy.
Some use the word "cost" when referring to resources. Cost of something is measured by the
resources used to attain it. Cost of attaining an objective at some point in time is measured by
the resources not available for use in attaining alternative objectives. Costs are a measure of
other defense capabilities foregone. Money cost is not necessarily the same as economic cost.
"Economic cost" implies the use of resources - manpower, raw materials, etc. Dollars are used
merely as a convenient common denominator for aggregating numerous heterogeneous physical
quantities into meaningful "packages" for purposes of analysis and decision making.
Cost (NES Dictionary) - the amount paid or payable for the acquisition of materials, property, or
services. In contract and proposal usage, denotes dollars and amounts exclusive of fee or profit.
Also used with descriptive adjectives such as “acquisition cost," or "product cost," etc. Although
dollars normally are used as the unit of measure, the broad definition of cost equates to economic
resources, i.e., manpower, equipment, real facilities, supplies, and all other resources necessary
for weapon, project, program, or agency support systems and activities.
Cost Analysis (NES Dictionary) - the accumulation and analysis of actual costs, statistical data,
and other information on current and completed contracts or groups of contracts (programs).
Cost analysis also includes the extrapolation of these cost data to completion, comparisons and
analyses of these data, and comparison of cost extrapolations on a current contract with the cost
data in the contract value for reports to customers, program and functional managers, and price
estimators.
In the procurement organizations of the Department of Defense, cost analysis is the review and
evaluation of a contractor's cost or pricing data and of the judgmental factors applied in projecting
from the data to the estimated costs, in order to form an opinion on the degree to which the
contractor's proposed costs represent what performance of the contract should cost, assuming
reasonable economy and efficiency.
Cost Analysis (Large) - the primary purpose of cost analysis is comparison - to provide estimates
of the comparative or relative costs of competing systems, not to forecast precisely accurate costs
suitable for budget administration. In this context consistency of method is just as important as,
perhaps more so than, accuracy in some absolute sense. In comparing the costs of military systems,
we prefer to speak of "cost analysis" rather than "cost estimation," because the identification of
the appropriate elements of cost -- the analytical breakdown of many complex interrelated
activities and equipment -- is so important a part of the method. Weapon system cost analysis is
much more than an estimate of the cost of the weapon itself. Weapon procurement costs may be
relatively small compared to other necessary costs, such as base facilities, training of personnel,
and operating expenses; and these other costs may vary greatly from system to system.
Cost Benefit Analysis (NES Dictionary) - a technique for assessing the range of costs and
benefits associated with a given option, usually to determine feasibility. Costs are generally in
monetary terms.
Cost Driver Attributes - productivity factors in the software product development process that
include software product attributes, computer attributes, personnel attributes, and project
attributes.
Cost Drivers - The controllable system design or planning characteristics that have a
predominant effect on the system's costs. Those few items, using Pareto's law, that have the most
significant cost impact.
Cost Effectiveness (NES Dictionary) - the measure of the benefits to be derived from a system
with cost as a primary or one of the primary measures.
Cost Effectiveness Analysis (NES Dictionary) - a method for examining alternative means of
accomplishing a desired military objective/mission for the purpose of selecting weapons and
forces which will provide the greatest military effectiveness for the cost.
Cost Estimating (Fisher) - "making an estimate" of the cost of something implies taking a rather
detailed set of specifications and "pricing them out."
Cost Estimating (NES Dictionary) - cost and price estimating is defined as the art of
predetermining the lowest realistic cost and price of an item or activity which assure a normal
profit.
Cost Factor - a brief arithmetic expression wherein cost is determined by the application of a
factor as a proportion.
Cost Model - an estimating tool consisting of one or more cost estimating relationships,
estimating methodologies, or estimating techniques used to predict the cost of a system or one of
its lower level elements.
Cost/Schedule Control System Criteria: C/SCSC - a set of criteria specified by the Federal
Government for reporting project schedule and financial information.
Current Dollars - Level of costs in the year actual cost will be incurred. When prior costs are
stated in current dollars, the figures given are the actual amounts paid. When future costs are
stated in current dollars, the figures given are the actual amounts expected to be paid including
any amount due to future price changes.
Deflators - The de-escalation factors used to adjust current cost/price to an earlier base year for
comparison purposes. A deflator is the inverse of an escalator.
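The relationship can be sketched as follows, with a hypothetical 10% escalation between the base year and the current year:

```python
# A deflator is the inverse of the escalator: it carries a current-dollar
# amount back to base-year (constant) dollars for comparison.
def to_base_year(current_dollars, escalator):
    """Deflate a current-dollar amount to base-year dollars."""
    deflator = 1.0 / escalator
    return current_dollars * deflator

# $110 spent in a year whose prices are 10% above the base year.
print(round(to_base_year(110.0, escalator=1.10), 2))  # 100.0
```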
Delivered Source Instructions: DSIs - the number of source lines of code developed by the
project. The number of DSIs is the primary input to many software cost estimating tools. The
term DELIVERED is generally meant to exclude non-delivered support software such as test
drivers. The term SOURCE INSTRUCTIONS includes all program instructions created by
project personnel and processed into machine code by some combination of preprocessors,
compilers, and assemblers. It excludes comments and unmodified utility software. It includes
job control language, format statements, and data declarations.
Delphi Technique - a group forecasting technique, generally used for future events such as
technological developments, that uses estimates from experts and feedback summaries of these
estimates for additional estimates by these experts until a reasonable consensus occurs. It has
been used in various software cost-estimating activities, including estimation of factors
influencing software costs.
Detail Estimating: Grass Roots, Bottom-Up - The logical buildup of estimated hours and
material by use of blueprints, production planning tickets, or other data whereby each operation
is assigned a time value.
Design to Cost: DTC - An acquisition management technique to achieve defense system designs
that meet stated cost requirements. Cost is addressed on a continuing basis as part of a system's
development and production process. The technique embodies early establishment of realistic
but rigorous cost objectives, goals, and thresholds and a determined effort to achieve
them. Cost, as a key design parameter, is addressed on a continuing basis and as an inherent part
of the development and production process. Cost goals are established early in the development
phase, which reflect the best balance among life cycle cost, acceptable performance and schedule.
Domain - a specific phase or area of the software life cycle in which a developer works.
Domains define developers' and users' areas of responsibility and the scope of possible
relationships between products. The work can be organized by domains such as Software
Engineering Environments, Documentation, Project Management, etc.
DTC Goals - A cost bogey, in constant dollars, based upon a specified production quantity and
rate, established early during system development as a management objective and design
parameter for subsequent phases of the acquisition cycle. (see DTC Target).
DTC Parameter - Approved measurable values for selected cost elements established during
system development as design considerations and management objectives for subsequent life
cycle phases. A DTC parameter may be an objective, goal, or threshold. Values will be
expressed in constant dollars, resources required, or other measurable factor(s) that influence
costs.
DTC Targets - Approved DTC parameters divided into smaller, identifiable tasks or areas of
responsibility that serve as requisites for contractors or government activities.
DTC Threshold - Costs or values that, if exceeded, will cause a program review.
Economic Analysis (Hitch and McKean) - economic analysis ranges from straight thinking
about alternative courses of action to systematic quantitative comparisons, to identify or
achieve near "optimal" solutions.
Economic Analysis (DoDI 7041.3) - a systematic approach to the problem of choosing how to
employ scarce resources and an investigation of the full implications of achieving a given
objective in the most efficient and effective manner.
Effectiveness (DoDI 7041.3) - the performance or output received from an approach or program.
Effort Adjustment Factor: EAF - a term used in COCOMO to calculate the cost driver
attributes' effect on the project. It is the product of the effort multipliers corresponding to each of
the cost drivers for the project.
Estimating (NES Dictionary) - to predict costs. Generation of detailed and realistic forecasts of
hours, material costs, or other requirements for a task, subtask, operation, or a part or groups
thereof - generally in response to a request for proposal or specific statement of work. The art of
approximating the probable worth (or cost), extent, quantity, quality, or character of something
based on information available at the time. It also covers the generation of forecasted costing
rates and factors with which estimate numbers may be converted to costs, and the application of
these costing rates and factors to "estimate numbers" to establish forecasted costs.
Expert Judgment Models - use a method of software estimation that is based on consultation
with one or more experts that have experience with similar projects. An expert-consensus
mechanism such as the DELPHI TECHNIQUE may be used to produce the estimate.
Fixed Year Dollars - dollars that have been adjusted for inflation to a specific year.
Historical Data - a term used to describe a set of data reflecting actual cost or past experience of
a product or process.
Inflation - an increase in the level of prices for the same item(s). Examples of indices that
reflect inflation include Consumer Price Index (CPI), Wholesale Price Index, and Producer Price
Index. When prices decline it is called deflation.
Life Cycle - the stages and process through which hardware or software passes during its
development and operational use. The useful life of a system. Its length depends on the nature
and volatility of the business, as well as the software development tools used to generate the
databases and applications.
Metric - Quantitative analysis values calculated according to a precise definition and used to
establish comparative aspects of development progress, quality assessment or choice of options.
Normalize - to adjust data (normally cost data) for effects like inflation, anomalies, seasonal
patterns, technology changes, accounting system changes, reorganizations, etc.
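As a simple illustration, cost data can be normalized to constant dollars of a base year by means of an inflation index. The index values below are invented for illustration:

```python
# Illustrative normalization of then-year costs to constant base-year dollars
# using an inflation index. The index values are made-up placeholders.
index = {1991: 0.88, 1992: 0.91, 1993: 0.95, 1994: 1.00}  # base year 1994 = 1.00

def normalize(cost, cost_year, base_year=1994):
    """Convert a then-year cost to constant base-year dollars."""
    return cost * index[base_year] / index[cost_year]

# A $500K cost incurred in 1991, stated in constant 1994 dollars:
print(round(normalize(500_000, 1991)))   # 568182
```

Adjustments for anomalies, accounting changes, and the like follow the same principle: express every data point on a common basis before drawing comparisons.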
Operation (Philip Morse: Notes on Operations Research, MIT Press 1962) - an operation is a
pattern of activity of men or of men and machines, engaged in carrying out a cooperative and
usually repetitive task, with pre-set goals and according to specified rules of operation.
Operational Research (Operational Research Quarterly, Vol. 13, No. 3, 282 (Sept. 1962)) -
operational research is the attack of modern science on complex problems arising in the direction
and management of large systems of men, machines, materials, and money in industry, business,
government, and defense. The distinctive approach is to develop a scientific model of the
system, incorporating measurements of factors such as chance and risk, with which to predict and
compare the outcomes of alternative decisions, strategies, or controls. The purpose is to help
management determine its policies and actions scientifically.
Organic Mode - a term used by COCOMO to describe a project that is developed in a familiar,
stable environment. The product is similar to previously developed products. Most people
connected with the project have extensive experience in working with related systems and have a
thorough understanding of the project. The project contains a minimum of innovative data
processing architectures or algorithms. The product requires little innovation and is relatively
small, rarely greater than 50,000 DSIs.
Parametric Cost Estimate - Estimate derived from statistical correlation of historic system costs
with performance and/or physical attributes of the system.
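As a sketch of how such a statistical correlation might be derived, the following fits a simple weight-based CER of the form Cost = a * Weight**b by least-squares regression on the logarithms of historical data points. The data points are invented for illustration:

```python
# Sketch of deriving a weight-based CER, Cost = a * Weight**b, by fitting a
# straight line to the logs of (weight, cost) pairs. Data are invented.
import math

history = [(100, 2.1), (250, 4.0), (400, 5.8), (800, 9.9)]  # (weight lb, cost $M)

xs = [math.log(w) for w, _ in history]
ys = [math.log(c) for _, c in history]
n = len(history)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)

def cer(weight):
    """Estimated cost ($M) for a given weight (lb)."""
    return a * weight ** b

print(round(cer(500), 2))
```

In practice the fit would be tested for statistical significance and the CER applied only within the range of the historical data.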
Procedures - manual procedures are human tasks. Machine procedures are lists of routines or
programs to be executed, such as described by the Job Control Language (JCL) in a mini or
mainframe, or the batch processing language in a personal computer.
Process - the sequence of activities (in software development) described in terms of the user
roles, user tasks, rules, events, work products, resource use, and the relationships between them.
It may include the specific design methodology, language, documentation standards, etc.
Production Rate - the number of items produced in a given time period (e.g., items/month).
Program Evaluation (DoDI 7041.3) - is the economic analysis of on-going actions to determine
how best to improve an approved program/project based on actual performance. Program
evaluation studies entail a comparison of actual performance with the approved program/project.
Program Evaluation and Review Technique: PERT - a method used to size a software or
hardware product and calculate the Standard Deviation (SD) for risk assessment. For example,
the PERT equation (beta distribution) estimates the Equivalent Delivered Source Instructions
(EDSIs) and the SD based on the analyst's estimates of the lowest possible size, the most likely
size, and the highest possible size of each Computer Program Component (CPC).
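The beta-distribution calculation described above can be sketched as follows; the size figures are invented for illustration:

```python
# PERT (beta-distribution) size estimate and standard deviation:
#   E  = (a + 4m + b) / 6
#   SD = (b - a) / 6
# where a, m, and b are the lowest, most likely, and highest size estimates.
def pert_size(lowest, likely, highest):
    expected = (lowest + 4 * likely + highest) / 6
    sd = (highest - lowest) / 6
    return expected, sd

# Example: a component sized between 4,000 and 10,000 DSI, most likely 5,500.
e, sd = pert_size(4_000, 5_500, 10_000)
print(round(e), round(sd))   # 6000 1000
```

Summing the expected sizes (and combining the standard deviations) across all Computer Program Components yields a total size estimate with an associated risk measure.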
Program Work Breakdown Structure - A program work breakdown structure covers the entire
acquisition cycle of a program, consists of at least the first three levels of a work breakdown
structure, as prescribed by MIL-STD-881B, and its extension by the DoD Component and/or
contractor(s). The program work breakdown structure has uniform element terminology,
definition, and placement in the family-tree structure.
Rapid Prototyping - the creation of a working model of a software module to demonstrate the
feasibility of the function. The prototype is later refined for inclusion in a final product.
Rayleigh Distribution - a curve that yields a good approximation to the actual labor curves on
software projects.
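One common form of this curve, the Norden/Putnam staffing profile, can be sketched as follows; the effort and schedule values are illustrative only:

```python
# Rayleigh-shaped labor curve (Norden/Putnam form): staffing at time t is
#   m(t) = (K / td**2) * t * exp(-t**2 / (2 * td**2))
# where K is total effort and td is the time of peak staffing. Values below
# are illustrative placeholders.
import math

def rayleigh_staff(t, total_effort, t_peak):
    """Staffing rate at time t for a Rayleigh-shaped labor curve."""
    return (total_effort / t_peak ** 2) * t * math.exp(-t ** 2 / (2 * t_peak ** 2))

K, td = 200.0, 12.0   # 200 person-months total, peak staffing at month 12
print(round(rayleigh_staff(12, K, td), 2))   # 10.11
```

The curve rises quickly, peaks at t = td, and tails off slowly, which is why it approximates the buildup and wind-down of labor on many software projects.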
Real-Time - (1) Immediate response. The term may refer to fast transaction processing systems
in business; however, it is normally used to refer to process control applications. For example, in
avionics and space flight, real-time computers must respond instantly to signals sent to them.
(2) Any electronic operation that is performed in the same time frame as its real-world
counterpart. For example, it takes a fast computer to simulate complex, solid models moving on
screen at the same rate they move in the real world. Real-time video transmission produces a live
broadcast.
Resources (Fisher) - raw materials, skilled personnel, scarce materials or chemicals, etc.
Reusability - ability to use all or the greater part of the same programming code or system design
in another application.
Reuse - software development technique that allows the design and construction of reusable
modules, objects, or units, that are stored in a library or database for future use in new
applications. Reuse can be applied to any methodology in the construction phase, but is most
effective when object oriented design methodologies are used.
Schedule - a time display of the milestone events and activities of a program or project.
Scope - a definition of how, when, where, and what a project is expected to include and
accomplish.
Security - the protection from accidental or malicious access, use, modification, destruction, or
disclosure. There are two aspects to security, confidentiality and integrity.
Semidetached Mode - a term used by COCOMO to describe a project developed by a team with a
mixture of experienced and inexperienced personnel. The software to be developed has some
characteristics of both the organic and embedded modes. Semidetached software can be as large
as 300K DSIs.
Should Cost Estimate - An estimate of contract price which reflects reasonably achievable
economy and efficiency. It is generally accomplished by a team of procurement, contract
administration, cost analysis, audit and engineering representatives performing an in-depth
analysis of cost and cost effects. Its purpose is to develop realistic cost objectives.
Software Development Life Cycle - the stages and process through which software passes
during its development. This includes requirements definition, analysis, design, coding, testing,
and maintenance.
Software Development Life Cycle Methodology - application of methods, rules, and postulates
to the software development process to establish completeness criteria, assure an efficient
process, and develop a high quality product.
Software Method - (or Software Methodology) - focuses on how to navigate through each phase
of the software process model (determining data, control, or uses hierarchies; partitioning
functions; and allocating requirements) and how to represent phase products (structure charts;
stimulus-response threads; and state transition diagrams).
Software Tool - program that aids in the development of other software programs. It may assist
the programmer in the design, code, compile, link, edit, or debug phases.
Systems Analysis (Fisher) - Systems analysis may be defined as inquiry to assist decisionmakers
in choosing preferred future courses of action by (1) systematically examining and reexamining
the relevant objectives and the alternative policies or strategies for achieving them; and (2)
comparing quantitatively where possible the economic costs, effectiveness (benefits), and risks
of alternatives.
Technical Non-Cost Data - engineering variables that describe the item, subsystem or system.
The source of this data includes engineering documents, engineering drawings and works of a
technical nature which are specified to be delivered pursuant to the contract.
Top-Down Models - use a method of estimation that derives the overall cost and effort of the
proposed project from its global properties. The total cost and schedule are then partitioned into
components for planning purposes.
Update - to update an estimate or CER means to utilize the most recent data to make it current,
accurate and complete.
Validation - in terms of a cost model, a process used to determine whether the model selected
for a particular estimate is a reliable predictor of costs for the type of system being estimated.
APPENDIX B
BY:
NEIL F. ALBERT
Neil F. Albert
INTRODUCTION
This paper presents a guide for preparing, understanding and presenting a Work Breakdown
Structure (WBS). It discusses what determines a good Work Breakdown Structure, provides a
general understanding for developing a program work breakdown structure, and shows how to
develop and implement a contract work breakdown structure. The primary objective of the paper is
to achieve consistent application of work breakdown structures using MIL-STD-881 as a
background source.
This paper is directed primarily at preparation of a defense program work breakdown structure.
However, the guidance is also appropriate for use with any work breakdown structure developed at
any phase during the acquisition process, including the concept exploration and definition,
demonstration and validation, engineering and manufacturing development, and production phases.
The guidelines are directed at both contractors and Government Activities in the development of
work breakdown structures for acquisition of defense material items.
DEFINITIONS
Several definitions are critical to the discussion in the following sections. These definitions are:
Level 1: Level 1 is the entire defense material item; for example, electronic system refers to an
electronics capability such as a command and control system, radar system, communications
system, information system, sensor system, navigation/guidance system, electronic warfare
system, etc. Level 1 is usually directly identified as a major program or as a project or
subprogram within an aggregated program.
Level 2: Level 2 elements are major elements of the defense material item; for example, the
prime mission product which includes all hardware and software elements, aggregations of
system level services (e.g., system test and evaluation, and system engineering/program
management) and of data.
Level 3: Level 3 elements are elements subordinate to Level 2 major elements; for example,
radar data processor, signal processor, antenna, type of service (e.g. development test and
evaluation, contractor technical support, training services), or types of data (e.g., technical
publications). Lower levels follow the same process.
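These levels can be illustrated with a small, hypothetical family tree in which costs roll up from lower-level elements to Level 1. The element names and cost figures below are invented for illustration only:

```python
# A minimal sketch of a product-oriented WBS as a nested structure, with
# costs rolled up through the family tree. Names and costs are illustrative.
wbs = {
    "Fire Control System": {                      # Level 1
        "Fire Control Radar": {                   # Level 2 (prime mission product)
            "Radar Data Processor": 4.0,          # Level 3 costs ($M)
            "Signal Processor": 3.2,
            "Antenna": 1.8,
        },
        "System Test and Evaluation": 2.5,        # Level 2 services
        "System Engineering/Program Management": 1.5,
    }
}

def rollup(element):
    """Sum costs upward through the WBS family tree."""
    if isinstance(element, dict):
        return sum(rollup(child) for child in element.values())
    return element

print(round(rollup(wbs), 1))   # 13.0
```

The same summarization logic is what allows cost accounts established at lower levels to be reported consistently at any higher level of the structure.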
BACKGROUND
When the decision is made to develop and acquire a new or updated system, several factors are
considered when planning or monitoring efforts. One of these factors is determining the work
breakdown structure to use for the systems. A work breakdown structure is a product-oriented
family tree, composed of hardware, software, services, data and facilities, which results from
system engineering efforts during the development and production of a defense material item, and
which completely defines the program. A work breakdown structure displays and defines the
product(s) to be developed or produced and relates the elements of work to be accomplished to each
other and to the end product. Therefore, the work breakdown structure plays a significant role in
planning and assigning management and technical responsibilities; and monitoring and controlling
the progress and status of (a) engineering efforts, (b) resource allocations, (c) cost estimates, (d)
expenditures, and (e) cost and technical performance. Seven defense material items are identified.
These are: Aircraft Systems, Electronic/Automated Software Systems, Missile Systems, Ordnance
Systems, Ship Systems, Space Systems, and Surface Vehicle Systems.
a. Technical Management
The work breakdown structure provides a framework for defining the technical objectives of the
program. Together with the contract statement of work and product specification, the work
breakdown structure aids in establishing a specification tree, defining configuration items, and
planning support tasks.
c. Specification Tree
A specification tree, developed by system engineering, structures the performance parameters
for the system or systems being developed. It subdivides the system(s) and its elements. The
performance characteristics are explicitly identified and quantified. When completed, the
specification tree represents a hierarchy of performance requirements for each component
element of the system for which design responsibility is assigned. Because specifications may
not be written for each product on the work breakdown structure, the specification tree may not
match the work breakdown structure completely.
d. Configuration Management
Configuration management is the process of managing the technical configuration of items
being developed. In establishing the requirement for configuration management on a program,
the Government Activity needs to designate the contract deliverables subject to configuration
management; each such deliverable is called a Configuration Item. For software, this item is
called a Computer Software Configuration Item (CSCI). Configuration management involves defining
the baseline configuration for the configuration items, controlling the changes to that baseline,
and accounting for all approved changes. The framework for designating the configuration
items on a program is the work breakdown structure, which needs to be extended sufficiently to
clearly define all elements subject to configuration management.
e. Financial Management
The work breakdown structure assists management in measuring cost and schedule
performance. By breaking the total product into successively smaller entities, management can
ensure that all required products are identified in terms of cost and schedule performance goals.
The planning of work by work breakdown structure elements serves as the basis for estimating
and scheduling resource requirements. The assignment of performance budgets to scheduled
segments of contract work identified to responsible organization units produces a time phased
plan against which actual performance can be compared and appropriate corrective action taken
when deviations from the plan are identified. This integrated approach to work planning also
simplifies the identification of potential cost and schedule impacts of proposed technical
changes.
f. Contract Budgeting
Funds management involves periodic comparison of actual costs with time phased budgets,
analysis of performance variances, and follow-up corrective actions as required. When work
breakdown structure product elements and the supporting work are scheduled, a solid base for
time phased budgets is established. Assignment of planned resource cost estimates to scheduled
activities (tasks) and summarization by work breakdown structure element by time period
results in a time phased program/contract budget, which becomes the performance
measurement baseline.
g. Cost Estimating
Use of the work breakdown structure for cost estimating facilitates program and contract
management. The work breakdown structure helps the Government program office to plan,
coordinate, control and estimate the various program activities that the Government and the
contractors are conducting. It provides a common framework for tracking the estimated and
actual costs during the performance of each contract. The data from the various program
contracts support the Government program manager in evaluating contractor performance,
preparing budgets, and preparing program life-cycle costs. For example, as programs move through the
various phases of the acquisition process (conceptual design, development, and production), the
actual experience to date and the estimates for the remaining phases provide the basis for
reassessment of the total program costs.
h. Data Bases
Cost information accounted for by work breakdown structure element can be used for pricing
and negotiating contracts and contract changes, and for follow-on procurement. Over time, the
Government is accumulating a growing cost data base of similar work breakdown structure
elements from different programs. Such historical cost can be used to develop learning curves,
regression analyses and other techniques to estimate the cost requirements for like elements of
new programs. Actual cost data collected by the Government on each program can be
compared to the original estimates to identify trends, and establish validity of estimating
techniques. Contractors will similarly benefit from such data bases. Since contractors tend to
provide similar products on similar programs, the cost history accumulated on their programs
can assist them in bidding future contracts and budgeting new work.
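As an illustration of how such historical cost can feed a learning curve, the unit-theory formulation can be sketched as follows. The first-unit cost and 80 percent slope are invented for illustration:

```python
# Unit-theory learning curve: the cost of unit n is
#   c(n) = T1 * n**b,  where b = log2(slope).
# An 80% slope means each doubling of quantity cuts unit cost to 80%.
# The first-unit cost and slope below are illustrative placeholders.
import math

def unit_cost(n, first_unit_cost, slope=0.80):
    """Cost of the nth unit under a unit-theory learning curve."""
    b = math.log2(slope)
    return first_unit_cost * n ** b

print(round(unit_cost(1, 100.0), 1))   # 100.0
print(round(unit_cost(2, 100.0), 1))   # 80.0
print(round(unit_cost(4, 100.0), 1))   # 64.0
```

Fitting the slope to actual lot costs from prior programs, rather than assuming it, is what a growing work breakdown structure cost data base makes possible.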
Work is effort performed by people to transform or create products to solve identified problems
in order to verifiably meet specified objectives. Just as the organization hierarchically structures the
people who perform work, so the work breakdown structure hierarchically structures the products
to be produced and on which the people work. Examples of these products include equipment
(hardware/software), data, services and facilities for such systems as missile systems, helicopter
systems, automated systems, etc.
In order to use the work breakdown structure as a framework for structuring the technical
objectives of a program, in addition to its use as a management tool for cost and schedule control, it
is important that the work breakdown structure be product-oriented. Its elements should represent
identifiable work products, whether they be equipment (hardware, software), data, or relatable service
products. Because any work breakdown structure is a product structure, not an organization
structure, complete definition of the effort encompasses the work to be performed by all
participants. Figure 1 provides an overview of the work breakdown structure development process.
Figure 1
The program work breakdown structure should be developed in the conceptual stages of the
program. The program work breakdown structure evolves during conceptual design from an
iterative analysis of the program objective, functional design criteria, program scope, technical
performance requirements, proposed methods of performance, including procurement strategy, as
well as drawings, process flow charts, and other technical documentation. Before developing the
program work breakdown structure, the Government Activity must know the contractual structure
of the contract (e.g., the prime/subcontract relationship, make vs. buy plan, etc.). It is, therefore,
important that the development documentation detail the Government plan to build, integrate, and
field the system. Through this process the levels of reporting and elements for appropriate Request
for Proposal (RFP) selection are determined.
Figure 2
Figure 3
(3) Rework, retesting, and refurbishing should be treated as work on the appropriate work
breakdown structure element affected, not as separate elements of a work breakdown
structure.
(4) Non-recurring and recurring classifications are not work breakdown structure elements.
The government reporting requirements should require segregation of each work breakdown
structure element into its non-recurring and recurring parts.
(5) Cost saving efforts such as total quality management initiatives, should cost, warranty, etc.,
are not work breakdown structure elements. These efforts should be included in the cost of
the item they affect and not captured separately.
(6) The organization structure of the program office or the contractor should not be the basis for
development of a work breakdown structure. The work breakdown structure should always
retain its product orientation.
(7) Costs for meetings, travel, computer support, etc., are to be included in the work breakdown
structure elements for which they are associated. They are not to be treated as separate
work breakdown structure elements.
(8) The use of generic terms in a work breakdown structure is improper; the system's name
and/or nomenclature is appropriate. The work breakdown structure elements should be
clearly named to indicate the character of the product to avoid semantic confusion. For
example, if the level 1 system is Fire Control, then the Level 2 item (prime mission product)
is Fire Control Radar.
(9) Tooling (e.g., special test equipment and factory support items such as assembly tools, dies,
jigs, fixtures, master forms, and handling equipment) should be included in the cost of the
equipment being produced. It is a functional cost, not a work breakdown structure element.
If the tooling cannot be assigned to an identified subsystem or component, it should be
included in the cost of integration, assembly, test, and checkout. Any additional quantities
produced for equipment support or maintenance in the field should be included and reported
under Peculiar Support Equipment. This same philosophy applies to software. For
example, when a software development facility/environment is created to support the
development of software, the cost associated with this element is considered part of the
CSCI it supports, or if more than one CSCI is involved, it should be included in integration,
assembly, test and checkout.
(10) The definition of integration, assembly, test, and checkout includes production acceptance
testing of R&D (including first article test) and production units, but excludes all system
engineering/program management, and system test and evaluation which are associated
with the overall system.
Figure 4 provides an example of both a correct and an incorrect work breakdown structure.
Figure 4
f. WBS Dictionary - When developing a program work breakdown structure, the Government
Activity should also develop a WBS Dictionary. The program work breakdown structure
dictionary lists and defines the work breakdown structure elements. Although initially prepared
for the program work breakdown structure, it can be expanded in greater detail at lower levels
by contractors as they develop their contract work breakdown structure.
The dictionary lists elements to show their hierarchical relationship to each other, describes
each work breakdown structure element and the resources and processes required to produce it, and
provides a link to the detailed technical definition documents. The work breakdown structure
dictionary should be revised to reflect changes and should be maintained in a current status
throughout the life of the program.
Figure 5
Figure 6
other applicable Level 2 Common WBS elements will be selected. The result is the contract work
breakdown structure. Figure 7 depicts the development and relationship of the Program Work
Breakdown Structure with the Contract Work Breakdown Structure. Each RFP should include the
contract work breakdown structure and the initial WBS Dictionary prepared by the Government
Activity. The RFP should instruct potential contractors to extend the selected contract work
breakdown structure elements to define the complete contract scope.
Figure 7
structure planning, work breakdown structure revisions may result from expansion or
contraction of program/contract scope and the movement of a program through its various
stages. Normally, changes to the work breakdown structure should not be made after contracts
are awarded and work is underway unless major rescoping of the program occurs. NOTE: The
sequence shown in the preceding paragraphs may be iterative as the program evolves, contracts
are awarded, and the work effort progresses through major program phases. Whenever the
work breakdown structure is revised, the ability to crosswalk and track back to the previous
work breakdown structure must be maintained.
accomplish the work have been organized. To assign specific work tasks, the organizational
structure must be linked effectively with the work breakdown structure. This linkage can occur
at any level of the work breakdown structure. Figure 8 depicts the linkage between the work
breakdown structure and the contractor's organizational structure.
b. Cost Account Level -- To provide the responsible (cost account) manager with the technical,
schedule, and cost information needed to manage the organization’s work on the work
breakdown structure element for which he is responsible, the management control system must
be keyed to the same work breakdown structure element and organization unit. The appropriate
work breakdown structure level at which a cost account is established is purely a function of the
magnitude of the program and the type of product. The responsible organization level is a
function of the management span of control and upper management’s desire to delegate
technical, schedule, and cost responsibility for product/contract work breakdown structure
elements to lower management levels. In identifying cost accounts, the contractor must be
allowed to establish organizational responsibilities at meaningful and appropriate levels;
otherwise, the contractor's existing management control systems and responsibility assignments
may be affected adversely. For example, when software is a major component of cost and the
Government wants it identified separately, care must be taken to not unnecessarily complicate
the contractor work breakdown structure and contractor management system. To meet these
needs, special reporting requirements are specified in the SOW. In this example, Figure 9
shows how the cost management system with job coding (SW) and the work breakdown
structure can provide needed detail and visibility without extending the work breakdown
structure to extremely low levels.
Virtually all aspects of the contractor's management control system, including technical
definition, budgets, estimates, schedules, work assignments, accounting, progress assessments,
problem identification, and corrective actions, come together at the cost account. Performance
visibility is directly relatable to the level and content of the cost account.
SUMMARY
The development of any work breakdown structure is intended to achieve a clear understanding
and statement of the technical objectives and the end item(s) (or end product(s)) of the work to be
performed. The process of identifying these objectives assists in structuring the product elements
during the work breakdown structure development. Objectives derived from the overall program
objective are identified in such a way that identifiable products support economically and
technically identifiable subsystems of the program objectives. This process may be repeated until
the component level is reached. In this manner, subsystems support a total system capability.
When properly implemented, a good work breakdown structure can effectively satisfy the needs
of both the Government and contractor program managers. By coordinating the development of the
work breakdown structure, the statement of work, the contract line item structure, and the product
specification, cost and technical requirements and performance can be better translated, managed
and maintained. Without this generic framework, the acquisition process becomes difficult and
complicated.
Figure 8
Figure 9
Neil F. Albert is Director of the Cost Analysis Division at Management Consulting & Research,
Inc. (MCR), Falls Church, Virginia. He has over sixteen years of experience in directing and
performing cost estimating/analysis tasks. His experience includes engineering and parametric
estimating, cost trade-off analysis, design-to-cost, and lifecycle cost analysis. He has held various
corporate positions as a consultant, senior manager and cost analyst including Manager of Cost
Analysis/Estimating for Textron Defense Systems. He holds an M.B.A. in Financial Management
and a B.A. in Finance. Mr. Albert is the Administrator for MIL-STD-881B Working Group. In this
capacity he was responsible for supporting the revision of MIL-STD-881A and for organizing and
generating MIL-STD-881B. Mr. Albert was a principal author of the Work Breakdown Structure
User Guide, which can be found in Appendix I of MIL-STD-881B.
REFERENCES
2. MIL-STD-881A, Work Breakdown Structures for Defense Material Items, 25 April 1975.
10. MIL-HDBK-171 (DRAFT), Work Breakdown Structures for Software Elements, 8 December
1992.
APPENDIX C
9-1001 INTRODUCTION
a. Although the basic criteria for cost-to-cost and cost-to-noncost CERs are generally
comparable, the supplementary criteria in this section pertain to cost-to-noncost CERs. Audits of
traditional cost-to-cost estimating rates and factors are covered in other sections of this chapter
and in referenced appendixes.
b. Cost-to-noncost CERs are CERs which use something other than cost or labor hours as
the independent variable. Examples of noncost independent variables include end-item weight,
performance requirements, density of electronic packaging, number or complexity of engineering
drawings, production rates or constraints, and number of tools produced or retooled. CERs
involving such variables, when significant, require that the accuracy and currency of the noncost
variable data be audited. Special audit considerations are described in 9-1003 and 9-1004.
When a contractor uses parametric cost estimating techniques in a price proposal, the
auditor will apply all pertinent criteria applicable to any proposal along with the supplemental
criteria provided in 9-1004.
b. Although the principles are no different, proposals supported in whole or in part with
parametric estimating will present new fact situations concerning cost or pricing data which is
required to be submitted. A fundamental part of the definition of cost or pricing data is "all facts
... which prudent buyers and sellers would reasonably expect to have a significant effect on price
negotiations" (FAR 15.801). Reasonable parallels may be drawn between the data examples
provided in FAR for discrete estimating approaches and the type of data pertinent to parametric
estimating approaches. For example, if a contractor uses a cost-to-noncost CER in developing an
estimate, the data for the CER should be current, accurate, and complete.
c. Many contractors use parametric cost estimating for supplementary support or for cross-
checking estimates developed using other methods. Judgment is necessary in selecting the data
to be used in developing the total cost estimate relied upon for the price proposal. In
distinguishing between fact and judgment, FAR states the certificate of cost or pricing data "does
not make representations as to the accuracy of the contractor's judgment on the estimated portion
of future costs or projections. It does, however, apply to the data upon which the contractor's
judgment is based" (FAR 15.804-4[b]). Therefore, if a contractor develops a proposal using both
parametric data and discrete estimates, it would be prudent to disclose all pertinent facts to avoid
later questions about full disclosure.
d. The auditor should address the following questions during the evaluation of parametric
cost estimates:
* Do the procedures clearly establish guidelines for when parametric techniques would be
appropriate?
* Are there guidelines for the consistent application of estimating techniques?
* Is there proper identification of sources of data and the estimating methods and
rationale used in developing cost estimates?
* Do the procedures ensure that relevant personnel have sufficient training, experience,
and guidance to perform estimating tasks in accordance with the contractor's established
procedures?
* Is there an internal review of and accountability for the adequacy of the estimating
system, including the comparison of projected results to actual results and an analysis of
any differences?
The auditor should also consider the following supplemental criteria when evaluating
parametric cost estimates.
The contractor should demonstrate that data used for parametric cost estimating
relationships can be verified. In many instances the auditor will not have previously evaluated
the accuracy of noncost data used in parametric estimates. For monitoring and documenting
noncost variables, contractors may have to modify existing information systems or develop new
ones. Information that is adequate for day-to-day management needs may not be reliable enough
for contract pricing. Data used in parametric estimates must be accurately and consistently
available over a period of time and easily traced to or reconciled with source documentation.
a. Contractors may submit proposals for forward pricing rate agreements (FPRAs) or
formula pricing agreements (FPAs) for parametric cost estimating relationships to reduce
proposal documentation efforts and enhance government understanding and acceptance of the
contractor's system. Government and contractor time can be saved by including the contractor's
most commonly used CERs in FPRAs or FPAs. (See FAR 15.809 for basic criteria.) However,
such an agreement is not a substitute for contractor compliance at the time of submitting a
specific price proposal. FAR requires that the contractor describe any FPRAs in each specific
pricing proposal to which the rates apply and identify the latest cost or pricing data already
submitted in accordance with the agreement. All data submitted in connection with the
agreement are certified as being accurate, complete, and current at the time of agreement on
price for each pricing action in which the rates are used, not at the time of negotiation of the
FPA or FPRA (FAR 15.809(d)).
b. Key considerations in auditing FPRA/FPA proposals for parametric CERs follow:
(1) FPRAs/FPAs do not appear practicable for CERs that are intended for use on only
one or a few proposals.
(2) Comparability of the work being estimated to the parametric data base is critical.
FPRA proposals for CERs must include documentation clearly describing the
circumstances in which the rates should be used, and the data used to estimate the
rates must be clearly related to those circumstances.
(3) Validation of all the parametric criteria (9-1003) is especially important if a single
CER or family of CERs is to be used repetitively on a large number of proposals.
statistics but should be relevant and verifiable to the experience of the particular contractor using
them.
APPENDIX D
In this Appendix we will discuss, in additional detail, some statistical topics including
error, sample size, the Student's "t" distribution, the "F" distribution, and confidence intervals.
STATISTICAL INFERENCE
Z-scores
In general, if X is a measurement belonging to a set of data having the mean x̄ (or µ for
a population) and the standard deviation s (or σ for a population), then its value in standard
units, denoted by Z, is
Z = (X − x̄) / s    or,    Z = (X − µ) / σ
depending on whether the data constitute a sample or a population. In these units, Z tells us how
many standard deviations a value lies above or below the mean of the set of data to which it
belongs.
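The standardization above is easy to compute; the sketch below (in Python, with invented sample values chosen only for illustration) shows the calculation:

```python
def z_score(x, mean, sd):
    """Value of x in standard units: (x - mean) / sd."""
    return (x - mean) / sd

# Hypothetical sample: mean effort of 500 hours with sample standard deviation 50.
z = z_score(585.0, 500.0, 50.0)
print(z)  # 1.7 -> the observation lies 1.7 standard deviations above the mean
```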
Error
An estimate is generally called a point estimate, since it consists of a single number, or a
single point on the real number scale. Although this is the most common way in which estimates
are expressed, it leaves room for many questions. For instance, it does not tell us on how much
information the estimate is based, and it does not tell us anything about the possible size of the
error. And, of course, we must expect an error. An estimate's reliability depends upon two
things - the size of the sample and the size of the population standard deviation, σ . Any
statistics textbook will show that the error term is,
E = Z(α/2) ∗ σ/√n , where
218
Parametric Cost Analysis Handbook
Z(α/2) represents the number of standard deviations from the mean that we are willing to allow
our estimate to be "off" either way, with probability 1 − α. This result applies when n is large
and the population is infinite. The two values which are most commonly used for 1 − α are
0.95 and 0.99, with corresponding Z scores, Z ( 0.025) = 1.96 (standard deviations) for 1 − α =
0.95 and Z (0.005) = 2.575 for 1 − α = 0.99, respectively.
There is one complication with this result. To be able to judge the size of the error we
might make when we use x as an estimate of µ , we must know the value of the population
standard deviation, σ . Since this is not the case in most practical situations, we have no choice
but to replace σ with an estimate, usually the sample standard deviation, s. In general, this is
considered to be reasonable provided the sample is sufficiently large (n ≥ 30).
Sample Size
The formula for E can also be used to determine the sample size that is needed to attain a
desired degree of precision. Suppose that we want to use the mean of a large random sample to
estimate the mean of a population, and we want to be able to assert with probability 1 − α that
the error of this estimate will be less than some prescribed quantity E. Solving the previous
equation for n, we get,
n = ( Z(α/2) ∗ σ / E )²
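The sample-size calculation can be sketched in a few lines; the σ and E values below are invented for illustration, and the result is rounded up to the next whole observation:

```python
import math

def sample_size(z_alpha2, sigma, e):
    """n = (Z(alpha/2) * sigma / E)^2, rounded up to the next integer."""
    return math.ceil((z_alpha2 * sigma / e) ** 2)

# 95% confidence (Z(0.025) = 1.96), assumed sigma = 20, desired maximum error E = 5:
print(sample_size(1.96, 20.0, 5.0))  # 62
```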
Confidence Intervals
For large random samples from infinite populations, the sampling distribution of the
mean is approximately normal with the mean µ and the standard deviation σ/√n , namely,
that,
Z = (x̄ − µ) / (σ/√n)
is a random variable having approximately the standard normal distribution. Since the
probability is 1 − α that a random variable having the standard normal distribution will take on
a value between -Z( α / 2 ) and Z( α / 2 ), namely, that −Z (α / 2 ) < Z < Z (α / 2 ) , we can
substitute into this inequality the foregoing expression for Z, and it yields
−Z(α/2) < (x̄ − µ)/(σ/√n) < Z(α/2)
219
Parametric Cost Analysis Handbook
and we can assert with probability 1 − α that it will be satisfied for any given sample. In other
words, we can assert with 100(1 − α)% confidence that the interval, above, determined on the basis
of a large random sample, contains the population mean we are trying to estimate. When σ is
unknown and n is at least 30, we replace σ by the sample standard deviation, s.
An interval such as this is called a confidence interval, its endpoints are called confidence
limits, and the probability 1 − α is called the degree of confidence. Again, the values most
commonly used for 1 − α are 0.95 and 0.99, the corresponding values of Z (α / 2 ) are 1.96 and
2.575, and the resulting confidence intervals are referred to as 95% and 99% confidence intervals
for µ.
x̄ − t(α/2) ∗ s/√n < µ < x̄ + t(α/2) ∗ s/√n
The degree of confidence is 1 − α, and the only difference between this formula and the
large-sample formula is that t(α/2) takes the place of Z(α/2).
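Both the large-sample (Z) and small-sample (t) intervals have the same shape, mean ± critical value × s/√n. A minimal sketch, with invented sample statistics:

```python
import math

def confidence_interval(mean, s, n, crit):
    """Two-sided interval: mean +/- crit * s / sqrt(n).
    Pass crit = Z(alpha/2) for large samples or t(alpha/2) for small ones."""
    half_width = crit * s / math.sqrt(n)
    return mean - half_width, mean + half_width

# Hypothetical large sample (n = 36 >= 30), so Z(0.025) = 1.96 gives a 95% interval:
lo, hi = confidence_interval(100.0, 15.0, 36, 1.96)
print(lo, hi)  # approximately 95.1 and 104.9
```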
and is called a variance ratio. The F distribution is a theoretical distribution which depends on
two parameters called the numerator and denominator degrees of freedom. When the F statistic
is used to compare the means of k samples of size n, the numerator and denominator degrees of
freedom are, respectively, k-1 and k(n-1).
This is a simple form of an analysis of variance. The basic idea of an analysis of variance
is to express a measure of the total variation of a set of data as a sum of terms, which can be
attributed to specific sources, or causes of variation. Two such sources of variation could be 1)
actual differences, and, 2) chance differences. As a measure of the total variation of an
observation consisting of k samples of size n, we use the total sum of squares,
220
Parametric Cost Analysis Handbook
SST = Σᵢ₌₁ᵏ Σⱼ₌₁ⁿ ( Xij − X̄.. )² ,
where Xij is the jth observation of the ith sample, i = 1, 2, ... , k, and j = 1, 2, ... , n, and X̄.., the
mean of all kn observations, is called the grand mean. If we divide the total sum of squares by
kn − 1, we get the variance of the combined data.
Letting X̄i denote the mean of the ith sample, i = 1, 2, ..., k, we can write the following
identity:
SST = n ∗ Σᵢ₌₁ᵏ ( X̄i − X̄.. )² + Σᵢ₌₁ᵏ Σⱼ₌₁ⁿ ( Xij − X̄i )²
Looking closely at the two terms into which the total sum of squares SST has been
partitioned, we find that the first term is a measure of the variation among the means. Similarly,
the second term is a measure of the variation within the individual samples, or chance variation.
Dividing the first term by k-1 and the second by k (n-1), we get the numerator and the
denominator of the F statistic as defined above. The first term is often referred to as the
treatment sum of squares, SS(Tr), and the second term as the error sum of squares, SSE,
experimental error, or chance.
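The partition of SST can be verified numerically; the sketch below uses invented data (k = 3 samples of size n = 3) and computes both terms and the resulting F statistic:

```python
def anova_partition(samples):
    """One-way ANOVA sums of squares for k equal-size samples.
    Returns (SST, SS_treatment, SSE, F)."""
    k, n = len(samples), len(samples[0])
    grand_mean = sum(x for s in samples for x in s) / (k * n)
    sst = sum((x - grand_mean) ** 2 for s in samples for x in s)
    means = [sum(s) / n for s in samples]
    ss_tr = n * sum((m - grand_mean) ** 2 for m in means)              # among means
    sse = sum((x - m) ** 2 for s, m in zip(samples, means) for x in s)  # within samples
    f = (ss_tr / (k - 1)) / (sse / (k * (n - 1)))
    return sst, ss_tr, sse, f

sst, ss_tr, sse, f = anova_partition([[1, 2, 3], [2, 3, 4], [6, 7, 8]])
print(sst, ss_tr, sse, f)  # 48.0 42.0 6.0 21.0 -- note SST = SS(Tr) + SSE
```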
Refer to a statistics textbook for a further explanation of these and other statistical
subjects.
APPENDIX E
Data Source: A large historical 50-year database of land, water, air and space
systems. The data include 54 spacecraft, 22 space transportation
systems, 61 aircraft, 86 missiles, 29 ships, and 18 ground vehicles.
All of the data points are from programs that were completed
through IOC.
Principal Ground Rules/ The AMCM development model is primarily based on raw
Assumptions/Limitations: data from contractors, government reports and published sources.
The original data was adjusted to constant year 1991 dollars using
various inflation factors. The basic model does not distinguish
between non-recurring and recurring cost because of the
difficulties associated with separating the costs in development
programs with very small production runs. The recurring cost of
the system can be computed from the CER’s by calculating the
marginal cost. The mean absolute percentage error (MAPE) for the
model is approximately 45%.
Equipment: The Microsoft Excel add-in application will be able to run on any
platform that supports Microsoft Excel.
Q is the quantity
W is the weight
s is the Specification
IOC is the IOC year
B is the Block Number
D is the Difficulty Factor
a-g are model parameters
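The extraction of recurring cost as the marginal cost of the CER, mentioned above, can be sketched generically. The cumulative CER below is a hypothetical multiplicative form with a single learning exponent, chosen only for illustration; it is not AMCM's actual equation or coefficients:

```python
def marginal_cost(total_cost, q):
    """Recurring cost of unit q, taken as C(q) - C(q - 1) from a cumulative-cost CER."""
    return total_cost(q) - total_cost(q - 1)

# Hypothetical cumulative CER: total cost of q units = a * q**b (illustrative values).
a, b = 100.0, 0.9
cer = lambda q: a * q ** b

first_unit = marginal_cost(cer, 1)   # equals a, here 100.0
tenth_unit = marginal_cost(cer, 10)  # smaller than the first unit when b < 1
```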
Model Description: SSCM estimates the first-unit development and production cost by
computing a weighted average of parametric CER’s derived from
actual Small-satellite cost and technical databases. Included are
production, integration, assembly, and testing. Excluded are any
payload development or production costs.
Principal Ground Rules/ SSCM estimates bus costs only; payload development and
Assumptions/Limitations: production costs must be separately determined.
Data Source: Proprietary data from a variety of DoD launch vehicle programs.
Principal Ground Rules/ Estimates are in 1986 dollars excluding fee. Costs cover all
Assumptions/Limitations: nonrecurring (NR) and recurring (REC) spacecraft cost, including
launch operations (relating to the S/C only) and orbital support.
CER Format: Both single-variable and multivariate, using linear and log/log
relationships. An interesting note is the introduction of the
“PING” factor with the log/log relationship.
[Figure, reconstructed from a block diagram: model inputs (left) mapped to spacecraft CER elements (right).]

S/C INPUTS → CER ELEMENTS
  Space Veh Dry Wt/Mission → Integration & Assembly
  Weight/Manner → Structure
  Attitude Control Subsystem:
    Sensor Types, Wt → Attitude Determination
    Vol/Design Life/Wt → RCS
  Wt/Manned → Thermal Control
  Electrical Power Subsystem:
    Area/BOL Pwr/# cells → Generation
    Wt*BOL Pwr → Storage
    Wt*BOL Pwr → Distribution
    Wt*BOL Pwr → Conditioning
  Telemetry, Tracking & Command (Microwave/Ferrite Devices):
    Weight → Transmitter/Amplifier
    Weight/Sync Orbit → Receiver/Exciter
    Weight/Output Power → Transponder
    → Digital Electronics
    Suite Weight → Analog Electronics
    Weight → Antenna
  Total Impulse → Propulsion (AKM)

COMMUNICATIONS PAYLOAD INPUTS → CER ELEMENTS (Microwave/Ferrite Devices)
  Weight → Communications Payload
  Weight/Sync Orbit → Transmitter/Amplifier
  Weight/Output Power → Receiver/Exciter
  Component Power → Transponder
  Suite Weight → Digital Electronics
  Weight → Analog Electronics
  → Antennas

PROGRAM LEVEL (+ Quantity Costs → Total $): Prog Mgmt, Sys Engr, Sys Test, Data
APPENDIX F
This section incorporates inputs from a variety of sources, primarily from model
developers. No attempt has been made to substantiate the information at this point. Comments
and updates are welcome, as are full descriptions of other models not discussed here.
Contents:
PRICE-S
REVIC
SASET
SEER-SEM
SLIM
Other Models
PRICE-S
This model was developed originally by Martin Marietta Price Systems (initially RCA
Price, then GE Price) as one of a family of models for hardware and software cost estimation.
Developed in 1977 by F. Freiman and Dr. R. Park, it was the first commercially available
detailed parametric software cost model to be extensively marketed and used. In 1987, the model
was modified and re-validated for modern software development practices. The PRICE-S model
is proprietary; it can be leased for yearly use on an IBM or compatible PC and operates within
Microsoft Windows. It is also available for use on a UNIX workstation. The model is applicable
to all types of software projects, and considers all DoD-STD-2167A development phases.
Inputs
One of the primary inputs for the PRICE-S model is source lines of code (SLOC). This may be
input by the user or computed using either object-oriented or function point sizing models. Both
sizing models are included in the PRICE-S package. Other key inputs include:
1. Application: a measure of the type (or types) of software, described by one of seven
categories (mathematical, string manipulation, data storage and retrieval, on-line, real-
time, interactive, or operating system).
2. Productivity Factor: A calibratable parameter which relates the software program to
the productivity, efficiency/inefficiencies, software development practices and
management practices of the development organization.
3. Complexities: Three complexity parameters which relate the project to the expected
completion time, based on organizational experience, personnel, development tools,
hardware characteristics, and other complicating factors.
4. Platform: the operating environment, in terms of specification, structure and
reliability requirements.
5. Utilization: Percentage of hardware memory or processing speed utilized by the
software.
6. New Design/New Code: Percentage of new design and new code.
7. Integration (Internal): Effort to integrate various software components together to
form an integrated and tested CSCI.
8. Integration (External): Effort to integrate various software CSCI’s together to form an
integrated and tested software system.
9. Schedule: Software project start and/or end dates.
10. Optional Input Parameters: Financial factors, escalation, risk simulation.
Processing
The PRICE-S algorithms are published in the paper entitled "Central Equations of
PRICE S" which is available from PRICE Systems. It states that PRICE-S computes a "weight"
of software based on the product of instructions and application inputs. The productivity factor
and complexity inputs are very sensitive parameters which affect effort and schedule
calculations. Platform is known to be an exponential input; hence, it can be very sensitive. New
weighted design and code values are calculated by the model based on the type or category of
instructions. Both new design and code affect schedule and cost calculations. Internal
integration input parameters affect the CSCI cost and schedule for integrating and testing the
CSCI. The external integration input parameter is used to calculate software to software
integration cost and schedule.
Outputs
PRICE-S computes an estimate in person effort (person hours or months). Effort can be
converted to cost in dollars or other currency units using financial factors parameters. Software
development schedules are calculated for nine DoD-STD-2167A phases: System Concept
through Operational Test and Evaluation. Six elements of costs are calculated and reported for
each schedule phase: Design Engineering, Programming, Data, Systems Engineering Project
Management, Quality Assurance, and Configuration Management. The PRICE-S model also
contains several optional outputs including over thirty graphs, Gantt charts, sensitivity matrices,
resource expenditure profiles, schedule reports. In addition, Microsoft Project files, spreadsheet
files, and risk analysis reports can be generated. The risk analysis report is a Cumulative
Probability Distribution and is generated using either Monte Carlo or Latin Hypercube
simulation.
Calibration
The PRICE-S model can be run in ECIRP (PRICE backwards) mode to calibrate selected
parameters. The most common calibration is that of the productivity factor, which, according to
the PRICE-S manual, tends to remain constant for a given organization. It is also possible to
calibrate platform, application, and selected internal factors.
Risk Analysis
The PRICE-S model contains a robust Monte Carlo simulation utility, which facilitates
rigorous risk analysis. Uncertainty can be characterized using probability distributions to define
input parameters. Normal, Beta, Triangular and Uniform distributions are among those
available. Simulation results are consolidated and reported as a probabilistic estimate.
Contact
Lockheed-Martin PRICE Systems
700 East Gate Drive, Suite 200
Mt. Laurel, NJ 08054
(800) 437-7423 a.k.a (800) 43PRICE
REVIC
The Revised Intermediate COCOMO (REVIC) model was developed by Ray Kile and the
U.S. Air Force Cost Analysis Agency. It is a copyrighted program that runs under DOS on an
IBM PC or compatible computer. The model predicts the development costs for software
development from requirements analysis through completion of the software acceptance testing
and maintenance costs for fifteen years. REVIC uses the intermediate COCOMO set of
equations for calculating the effort (man-power in staff-months and staff-hours) and schedule
(elapsed time in calendar months) to complete software development projects based on an
estimate of the lines of code to be developed and a description of the development environment.
The forms of the basic equations are:
(1) MM = A ∗ (KDSI)^B ∗ Π(Fi)
(2) TDEV = C ∗ (MM)^D
Equation (1) predicts the manpower in man-months (MM) based on the estimated lines of
code to be developed (KDSI = Delivered Source Instructions in thousands) and the product of
a group of environmental factors (Fi). The coefficients (A,C), exponents (B,D) and the factor
(Fi) are determined by statistical analysis from a database of completed projects. These
variables attempt to account for the variations in the total development environment (such as
programmer's capabilities or experience with the hardware or software) that tend to increase
or decrease the total effort and schedule. The results from equation (1) are input to equation
(2) to determine the schedule (TDEV = Development Time) in months needed to complete
the development.
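Equations (1) and (2) can be sketched directly. The coefficients, exponents, and factor values below are placeholders chosen for illustration; they are not REVIC's calibrated constants:

```python
def effort_mm(a, b, kdsi, factors):
    """Eq. (1): staff-months = A * (KDSI)^B * product of environmental factors Fi."""
    product = 1.0
    for f in factors:
        product *= f
    return a * kdsi ** b * product

def schedule_months(c, d, mm):
    """Eq. (2): development time in calendar months = C * (MM)^D."""
    return c * mm ** d

# Placeholder calibration and environment (illustrative only):
mm = effort_mm(3.0, 1.12, 32.0, [1.15, 0.91])   # 32 KDSI, two adjustment factors
tdev = schedule_months(2.5, 0.35, mm)
```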
Addition of the first and last development phases. COCOMO provides a set of tables
distributing the effort and schedule to the phases of development (system engineering,
preliminary design, critical design, etc.) and activities (system analysis, coding, test planning,
etc.) as a percentage of the total. COCOMO covers four development phases (preliminary design,
critical design, code and unit test, and integration and test) in the estimate. REVIC adds two
more development phases: software requirements engineering, and integration & test after FQT.
REVIC predicts the effort and schedule in the software requirements engineering phase
by taking a percentage of the development phases. It provides a default value (12% for effort,
30% for schedule) for this percentage based on average programs, but allows the user to change
the percentage.
COCOMO development phase ends at completion of the integration & test phase (after
successful FQT). This phase covers the integration of software CSC's into CSCI's and testing of
the CSCI's against the test criteria developed during the program. It does not include the system-
level integration (commonly called software builds) of CSCI's, and the system-level testing to
ensure that system-level specification requirements are met. The software to software and
software to hardware integration and testing is accounted for in the Development Test and
Evaluation (DT&E) phase. REVIC predicts the effort and schedule for this phase by taking a
percentage of the development effort. REVIC provides a default percentage of 22% for effort
and 26% for schedule based on average programs. It allows the user to modify these percentages
if desired.
Complete interface between the model and the user. Users are not required to have extensive
knowledge about the model or detailed knowledge of algorithms. REVIC contains extensive
prompting and help screens. REVIC also removes the typical intimidation factor that prevents
analysts from successfully using models.
Provide the capability to interactively constrain the schedules and staffing levels. Schedules
can be constrained either in the aggregate or by phase of the development effort. Staffing can be
constrained by phase. Using these features, the analyst can estimate cost overruns, underruns,
and schedule slips at any major milestone by entering the actuals-to-date at any milestone, and
letting the program calculate the remaining effort and schedule.
Processing
While REVIC processing is mostly the same as Intermediate COCOMO, it provides a
single weighted "average" distribution for effort and schedule, along with the ability to allow the
user to vary the percentages in the system engineering and DT&E phases. On the other hand,
COCOMO provides a table for distributing the effort and schedule over the development phases,
depending on the size of the code being developed. REVIC has been enhanced by using
statistical methods for determining the lines of code to be developed. Low, high, and most
probable estimates for each CSC are used to calculate the effective lines of code and standard
deviation. The effective lines of code and standard deviation are then used in the equations,
rather than the linear sum of the estimates. This quantifies, and to some extent, reduces the
existing uncertainties regarding software size. [see Boehm's Software Engineering Economics for
a discussion of effective lines of code]. REVIC automatically performs sensitivity analysis
showing the plus and minus three sigma values for effort, and the approximate resultant
schedule.
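The low/most-probable/high sizing described above is commonly implemented with the PERT (beta) approximation shown below. This is the standard textbook formula, offered as an assumption about how such a calculation works, not as a quote of REVIC's internal code:

```python
def effective_size(low, likely, high):
    """PERT-style effective lines of code and standard deviation
    from low / most-probable / high three-point estimates."""
    effective = (low + 4 * likely + high) / 6.0   # beta-distribution mean approximation
    sigma = (high - low) / 6.0                     # standard deviation approximation
    return effective, sigma

eff, sigma = effective_size(8000, 10000, 18000)
print(eff, sigma)  # 11000.0 and about 1666.7
```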
Outputs
The user is presented with a menu allowing full exercise of the analytical features and displays of
the program. All inputs are easily recalled and changed to facilitate analyses and the user can
constrain the solution in a variety of ways.
REVIC's coefficients have been calibrated using recently completed DoD projects
(development phase only) by using the techniques in Dr. Boehm's book. On the average, the
values predicted by the effort and schedule equations in REVIC are higher than in COCOMO. A
study validating REVIC equations using a database different from that used for initial calibration
was published by the Air Force's HQ AFCMD/EPR. In terms of accuracy, the model compares
favorably with expensive commercial models.
The level of accuracy provided by the model is directly proportional to the user's
confidence in the lines-of-code (LOC) estimates and a description of the development
environment. When little is known about the project or environment, the model can be run
leaving all environment parameters at their default (nominal) settings. The only required input is
LOC. As the details of the project become known, the data file can be reloaded into the program,
and the nominal values can be changed to reflect the new knowledge permitting continual
improvement of the accuracy of the model.
Maintenance Estimate
REVIC provides an estimate for software maintenance over a fifteen year period by using
Boehm's COCOMO maintenance equation:
(3) MMam = MMnom ∗ ACT ∗ Π(MFi)
where MMnom is the result of equation (1) without multiplying by the environmental factors
(Fi), ACT is the annual change traffic as a percentage, and MFi are the environmental factors
for maintenance.
REVIC provides a default percentage of ACT and allows it to be changed. REVIC also
assumes a transition period after delivery of the software, during which residual errors are found
before reaching a steady state condition. This provides a declining, positive delta to the ACT
during the first three years. Beginning the fourth year, REVIC assumes the maintenance activity
consists of both error correction and new software enhancements.
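Equation (3) reduces to a simple product. The sketch below uses invented values (150 nominal staff-months, 15% annual change traffic, neutral maintenance factors) purely for illustration:

```python
def maintenance_effort(mm_nom, act, m_factors):
    """Eq. (3): MM_am = MM_nom * ACT * product of maintenance factors MFi."""
    product = 1.0
    for f in m_factors:
        product *= f
    return mm_nom * act * product

print(maintenance_effort(150.0, 0.15, [1.0]))  # 22.5 staff-months per year
```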
Contact
The Air Force Cost Analysis Agency has assumed responsibility for REVIC upkeep and
distribution. For suggestions for upgrades or problem reporting, contact:
Air Force Analysis Agency
AFCSTC/IS (REVIC)
1111 Jefferson Davis Highway, Suite 403
Arlington, VA 22202
[information current as of 5/92]
(703) 604-0412
REVIC is available on the Air Force cost bulletin board at (800) 344-3602 or, from Washington,
D.C., at (703) 604-0412. Connection should be at 1200 or 2400 baud with 8 data bits, no parity,
and one stop bit. [information current as of 5/92]
REVIC and its user manual are also available on the Air Force Software Technology
Support Center (STSC) bulletin board at autovon 458-7653 or (801) 777-7653. Connection
at 2400 BAUD with 8 bits, no parity, and one stop bit. For access problems call Vern
Phipps or Rick Dahl at (801) 777-7703. [information current as of 5/92]
SASET
Inputs
SASET starts from an initial state of no information about the software project, and then
prompts the user, in an orderly fashion, for the information needed to characterize the project
environment, complexities, and sizing for up to 50 CSCI's, producing schedule and effort
outputs.
Processing
SASET is a database-derived expert system. Calibrated equations for effort, budget, and
schedule are computed from the system environment and software complexity factors, combined
into a common set, and then multiplied against the equivalent new HOL SLOC.
Outputs
The computerized model has the capability of sizing, costing, scheduling and integrating
up to 50 multiple CSCI's simultaneously. The model determines a labor estimate expressed in
staff months and spread over the phases and subphases of software development life cycle for
each individual CSCI and the total project and helps to establish a logical development sequence.
The model also identifies an optimal schedule and effort.
Calibration
All input variables and calibration constants of SASET are open and capable of being
changed. The calibration process requires a set of software development records as input. The
required data from each record includes staff hours, source lines of code, schedule, and project
complexity. Records are sorted by class (manned flight, avionics, ground, etc.) and type of
software (system, application, support), then run through multiple regression analysis for
productivity constants for each type of software.
Contact
Air Force Analysis Agency
AFCSTC/IS (REVIC)
1111 Jefferson Davis Highway, Suite 403
Arlington, VA 22202
[information current as of 5/92]
(703) 604-0412
SEER-SEM
SEER-SEM is part of a family of software and hardware cost, schedule and risk
estimation tools. SEER models run on IBM, Macintosh, and Sun/UNIX platforms with no
special hardware requirements. SEER-SEM is used throughout the aerospace and defense
industry on two continents. All issues found in today's software environments are addressed.
Inputs
SEER-SEM accepts source lines of code (SLOC) or function points or both. When
selecting function points, the user may use IFPUG standard function points or SEER function-
based inputs which include internal functions. Users follow a Work Breakdown Structure
(WBS) describing each CSCI, CSC, and CSU (module or element) to be estimated. Knowledge
bases are used to provide fast and consistent inputs describing complexity, personnel capabilities
and experience, development support environment, product development requirements, product
reusability requirements, development environment complexity, target environment, schedule,
staffing and probability. Users can modify all inputs to their specifications at any time.
There are four sets of knowledge bases that automatically supply environment factors.
These knowledge bases cover a wide variety of scenarios, help users produce fast and reliable
estimates, and are easily calibrated to user environments to give quick and accurate estimates
for the entire life cycle. Knowledge bases include the following:
Platform describes the primary operating platform. Platform knowledge bases include avionics,
business, ground-based, manned space, unmanned space, shipboard, and more.
Application describes the overall function of the software under estimation. Application
knowledge bases include computer-aided design, command & control, database, MIS, office
automation, radar, simulation, and more.
Development Method describes the development methods to be used during the development.
These knowledge bases include Ada, spiral, prototyping, object oriented design, evolving,
traditional waterfall, and more.
Development Standard describes the development documentation, quality, test standards and
practices which will be followed during the development. These knowledge bases include
commercial, ISO-9000, 2167A, 1703, 1679, 7935A and more.
Processing
SEER-SEM uses proprietary algorithms which are found in the back of the User's
Manual. Parameter (input) sensitivities and other insights into the model are also found in the
user's documentation. Knowledge bases can be printed out by users. SEER-SEM utilizes a
unique process that simulates a 10,000-iteration Monte Carlo for risk analysis.
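The general idea of a Monte Carlo risk analysis can be sketched as follows. The triangular size distribution and the effort relation below are illustrative assumptions only, not SEER-SEM's proprietary algorithms:

```python
import random
import statistics

random.seed(7)  # reproducible illustration

# Uncertain size input: minimum, most likely, and maximum SLOC.
LOW, LIKELY, HIGH = 20_000, 35_000, 60_000

def effort_months(sloc):
    """Illustrative effort relation (NOT SEER-SEM's actual equation)."""
    return 3.0 * (sloc / 1000.0) ** 1.1

# Draw 10,000 size samples and map each through the effort relation,
# yielding a distribution of effort outcomes rather than a point estimate.
samples = sorted(
    effort_months(random.triangular(LOW, HIGH, LIKELY))
    for _ in range(10_000)
)

mean = statistics.fmean(samples)
p50 = samples[len(samples) // 2]
p80 = samples[int(len(samples) * 0.80)]
print(f"mean effort: {mean:.0f} staff-months")
print(f"50% confidence: {p50:.0f} staff-months, 80% confidence: {p80:.0f} staff-months")
```

Reading an estimate off the resulting distribution at a chosen confidence level is what turns the risk simulation into decision-support information.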
Outputs
SEER-SEM has almost 30 informative reports and charts covering all aspects of software
costs, schedules and risk. The Quick Estimate Report is easily tailored to instantly give the user
specific details for trade-off analyses and decision support information. A Detailed Staffing
Profile follows SEI suggested staffing categories. Risk reports and graphs based on person
months, costs, and schedule are standard features. SEER-SEM gives a minimum schedule
output. However, schedules, personnel, and other factors can be changed to give effort and cost
tradeoffs.
Calibration
Calibration of SEER-SEM involves the effort to customize input values to more closely
reflect particular program development characteristics. SEER-SEM's Design To Technology and
Design To Size functions provide the tools for calibration activities. The SEER knowledge bases
are flexible and easy to create and modify, providing the user with a mechanism for building
custom calibrated knowledge bases.
Maintenance
SEER-SEM baseline maintenance includes all adaptive, perfective and corrective
maintenance. Additionally, users may add annual change rates and growth percentages to anticipate
any functional growth or enhancements over the software maintenance period. Enhancements
and block upgrades can also be estimated.
Contact
Galorath Associates, Inc.
SEER Technologies Division
P.O. Box 90579
Los Angeles, CA 90009
(310) 670-3404
SLIM
SLIM for Windows 3.1 was developed by Quantitative Software Management, Inc.
(QSM). All of the theory behind the model was published by Prentice Hall in 1992 in the
book Measures for Excellence: Reliable Software on Time, Within Budget by Lawrence H.
Putnam and Ware Myers. SLIM is based on QSM's Software Equation. This was derived from
the Rayleigh-Norden model and has been validated over a 15-year period with thousands of
real, completed projects. The equation takes the conceptual form:

    Quantity of Function = Process Productivity x Effort x Schedule
This means that the product of the time and effort coupled with the process productivity
of the development organization determines how much function can be delivered. Extensive
empirical study of software data has shown that very strong non-linearities exist in software
behavior. These are taken into account by the calculational form of the software equation, which
discloses how the non-linearities can be exploited by management. The calculational form of the
equation is expressed:

    Size = (Process Productivity) x (Effort/B)^(1/3) x (Time)^(4/3)

where,

    Size = the quantity of function to be delivered, usually in source lines of code,
    Process Productivity = the development organization's demonstrated productivity parameter,
    Effort = total development effort in staff years,
    B = a special skills factor that increases with size, and
    Time = development time in years.
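As a numerical illustration, the published form of QSM's software equation, Size = Process Productivity x (Effort/B)^(1/3) x Time^(4/3), can be rearranged to solve for effort. The size, productivity, and skills-factor values below are hypothetical, chosen only to show the schedule-effort tradeoff:

```python
# Rearranging the software equation to solve for effort:
#   Effort = B * (Size / (PP * Time**(4/3)))**3
# SIZE, PP, and the default B below are made-up illustrations, not QSM data.

def effort_staff_years(size_sloc, time_years, pp, b=3.0):
    """Effort (staff years) implied by the software equation."""
    return b * (size_sloc / (pp * time_years ** (4.0 / 3.0))) ** 3

SIZE = 75_000   # SLOC to be delivered
PP = 15_000     # assumed process productivity parameter
for t in (1.5, 2.0, 2.5):
    e = effort_staff_years(SIZE, t, PP)
    print(f"schedule {t:.1f} yr -> effort {e:.1f} staff-years")
```

Because effort varies with the cube of size divided by a 4/3 power of time, even modest schedule compression drives effort (and cost) up steeply, which is exactly the management tradeoff SLIM is designed to expose.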
SLIM is applicable to all types and sizes of projects. It computes schedule, effort, cost,
staffing for all software development phases and reliability for the main construction phase.
Because the software equation effectively models design intensive processes and is not
methodology dependent, SLIM works well with waterfall, spiral, incremental, and prototyping
development methodologies. It works with all languages, and function points as well as other
sizing metrics. It is specifically designed to address the concerns of senior management, such as:
1. What options are available if the schedule is accelerated by four months to meet a
tight market window?
2. How many people must be added to get two months schedule compression and how
much will it cost?
3. When will the defect rate be low enough so I can ship a reliable product and have
satisfied customers?
4. If the requirements grow or substantially change, what will be the impact on schedule,
cost, and reliability?
5. How can I quantify the value of my process improvement program?
Computing Platforms
SLIM is available for use on IBM PC or compatible machines running Windows 3.1.
SLIM is licensed on an annual basis with full support and free upgrades.
Inputs
The primary input for SLIM is SLOC, function points, objects, CSCI, or any valid
measure of function to be created. The model uses size ranges for input: minimum, most likely,
and maximum. Other important inputs include:
Processing
There are three primary modes of operation: building and using an historical database,
performing estimating and analysis, and creating presentations and reports.
For estimation, SLIM uses the software equation in conjunction with management
constraints for schedule, cost, staffing and required reliability to determine an optimal solution
with the highest probability of successful completion. Through Monte Carlo simulation
techniques, the size range estimates are mapped through the software equation to provide
estimates of the uncertainty in schedule, cost, staffing, and reliability. The solution obtained can
be compared with the user's historical data and QSM's historical data to test its reasonableness.
This discloses impossible or highly improbable solutions so that expensive mistakes are avoided.
Outputs
The primary output of SLIM is the optimal solution, which provides development time,
cost, effort and reliability expected at delivery. It also provides comprehensive sensitivity and
risk profiles for all key input and output variables, and a consistency check with similar projects.
SLIM's graphical interactive user interface makes it easy to quickly explore extensive tradeoff
and "what if" scenarios including design to cost, schedule, effort and risk. It has 181 different
output tables and graphs from which the user can choose. These outputs constitute a
comprehensive set of development plans to measure and control the project while it is underway.
Calibration
The process productivity parameter for SLIM can (and should) be obtained by calibration
using historical data. All that is required are project size, development time and effort. These
numbers are input into the software equation to solve for the process productivity. The historical
data can also be compared with any current solution as a check on reasonableness.
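This calibration amounts to inverting the software equation for each completed project. The sketch below uses the published equation form with hypothetical project figures:

```python
# Inverting the published software equation
#   Size = PP * (Effort/B)**(1/3) * Time**(4/3)
# to recover process productivity from a completed project:
#   PP = Size / ((Effort/B)**(1/3) * Time**(4/3))
# The project figures below are hypothetical.

def process_productivity(size_sloc, effort_staff_years, time_years, b=3.0):
    return size_sloc / ((effort_staff_years / b) ** (1.0 / 3.0)
                        * time_years ** (4.0 / 3.0))

history = [
    # (SLOC delivered, effort in staff years, schedule in years)
    (75_000, 23.4, 2.0),
    (40_000, 11.0, 1.6),
    (120_000, 48.0, 2.4),
]

pps = [process_productivity(*p) for p in history]
for (sloc, _, _), pp in zip(history, pps):
    print(f"{sloc:>7} SLOC -> process productivity ~ {pp:,.0f}")
print(f"representative productivity for future estimates: {sum(pps) / len(pps):,.0f}")
```

The calibrated productivity parameter, not an industry average, is what makes subsequent estimates reflect the organization's demonstrated performance.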
Contact
Quantitative Software Management, Inc.
1057 Waverly Way
McLean, VA 22102
(703) 790-0055
SOFTCOST-R
The SoftCost-R model was developed by Dr. Don Reifer based on the work of Dr. Robert
Tausworthe of the NASA Jet Propulsion Laboratory. SoftCost is now marketed by Resource
Calculations, Inc. of Englewood, Colorado. It contains a database of over 1500 data processing,
scientific and real-time programs. SoftCost-R is applicable to all types of programs and
considers all phases of the software development cycle. The model is available for lease on IBM
PC's. A separate model SoftCost-Ada is available to model Ada language and other object-
oriented environments.
SoftCost-Ada has been developed to match the new object-oriented and reuse paradigms
which are emerging not only in Ada, but also in C++ and other object-oriented environments. It
contains a database of over 150 completed projects, primarily Ada.
SoftCost-R Inputs
A key input of SoftCost-R is size, which can either be directly input in SLOC or
computed from function points. SoftCost-R uses a more sophisticated sizing model than
COCOMO; besides reused code, sizes of modules added or deleted may be included. The other
inputs are in four categories like COCOMO. Some SoftCost-R inputs are similar to COCOMO,
but many of the more than thirty inputs are unique. Examples of unique inputs are use of peer
reviews, customer experience, and degree of standardization. Each input except size requires a
rating ranging from "very low" to "extra high", with "nominal" ratings having no effect on effort
calculations. SoftCost-R also uses COCOMO inputs to compare the results of SoftCost-R with
those of an updated version of COCOMO.
SoftCost-Ada Inputs
In the main, the inputs are the same as SoftCost-R, with some changes to reflect the new
paradigm. There is no COCOMO comparison.
Processing
SoftCost-R is not a simple regression model. It uses powerful multivariable differential
calculus to develop solutions relying on the T-W probability distribution. This provides the
ability for the user to perform "what-if" analysis and look at what would happen to schedule if
effort were constrained. Such a capability is not present in COCOMO. SoftCost-R is one of the
few models for which the mathematical algorithms are completely described in the user's manual.
The SoftCost-R equation is:
(1) PM = P0 * A1 * A2 * (SLOC)^B

where,

1. PM = number of person-months,
2. P0 = a constant factor that may be calibrated,
3. A1 = the "Reifer cost factor," an exponential product of nine inputs,
4. A2 = a productivity factor computed from 22 inputs,
5. B = an exponent that may be calibrated.
The user's manual illustrates values assigned to ratings for all model inputs to help the
user understand the effect of each input on effort and schedule.
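The form of equation (1) can be exercised numerically. The constant values below are placeholders chosen for illustration, not the calibrated constants shipped with SoftCost-R:

```python
# Exercising the SoftCost-R effort equation form:
#   PM = P0 * A1 * A2 * SLOC**B
# P0 and B below are placeholder values, not RCI's calibrated constants.

P0 = 0.003  # calibratable constant factor (placeholder)
B = 1.05    # calibratable size exponent (placeholder)

def person_months(sloc, a1=1.0, a2=1.0):
    """a1 is the 'Reifer cost factor' and a2 the productivity factor;
    1.0 is the no-effect value corresponding to nominal ratings."""
    return P0 * a1 * a2 * sloc ** B

nominal = person_months(50_000)
degraded = person_months(50_000, a1=1.2, a2=1.15)  # worse-than-nominal ratings
print(f"nominal:  {nominal:.1f} PM")
print(f"degraded: {degraded:.1f} PM ({degraded / nominal:.0%} of nominal)")
```

Because the cost factors multiply the whole estimate, the ratio between two runs isolates the effect of the ratings, independent of size and of the calibrated constants.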
Outputs
SoftCost-R computes an estimate in person-months of effort and schedule for each
project, plus a productivity value. Other outputs include a side-by-side comparison with a recent
version of COCOMO, several "what if" analysis options, a resource allocation summary for any
of three development methods (traditional waterfall, incremental development, or Ada object-
oriented), and schedule outputs for Gantt and PERT charts. SoftCost-Ada output formats are
similar, and can interface with project planning tools in the same way.
Calibration
The model contains a calibration file which contains values for multiple calibration
constants and cost drivers. The user may change these values to better describe the user's unique
environment, and store alternative calibration and WBS files for different jobs. SoftCost-Ada
and SoftCost-R are similar.
Contact
Mr. A.J. (Tony) Collins
Resource Calculations, Inc.
7853 East Arapahoe Court, Suite 2500
Englewood, CO 80112-1361
Telephone: (303) 267-0379
Facsimile: (303) 220-5620
OTHER MODELS
CHECKPOINT
Software Productivity Research, Inc.
1 New England Park
Burlington, MA 01803
(617) 273-0140
COSTAR
SoftStar Systems
28 Ponemah Road
Amherst, NH 03031
(603) 672-0987
SOFTSTAR
Softstar Systems
28 Ponemah Road
Amherst, NH 03031
(603) 672-0987
Sells COSTAR, a COCOMO variant; supports Ada COCOMO and function points
IBM PC, VAX/VMS
Dan Ligett (personal letter - Jan 16, 1987) [info checked brochure 10/92]
SWAN
IIT Research Institute
201 Mill Street
Rome, NY 13440
(315)336-2359
Fax: (315) 339-7002
Person to Contact: Anthony H. Williams (315) 339-7105 [ok 12/13/93]
A COCOMO variant developed for US Army, PM Training Devices, 12350 Research
Parkway, Orlando, FL 32826
Written in Ada; runs on IBM AT, MS-DOS 3.0 or higher
Includes Ada COCOMO & Function Point Analysis
Target Software
Dr. George Bozoki
552 Marine Parkway, #1202
Redwood City, CA 94065
(415)592-2560
Computer Economics
4560 Admiralty Way, Suite 109
Marina Del Rey, CA 90292
(213) 827-7300
Jensen Model ("System 4"), CEIS (CEI Sizer), training
[ref. 1987 IITRI Software Sizing model paper]
COSTMODL
Developed at the Software Technology Branch at NASA/JSC to be used for Space Station
Freedom Software Support Environment and will be available from COSMIC.
Parametric Cost Estimating Handbook
APPENDIX G
The list that follows identifies elements of good practice that support reliable estimating
processes.
Questions:
Are there other elements of good estimating practice that should be listed?
Are there other items of evidence that indicate good practice that should be listed?
Note: The objective here is to identify elements of good practice that are distinguished, if
possible, from elements of good estimating processes. This means, among other things, that the
elements in the list should be independent of the estimating methods or models used.
The elements in this list should not tell people how to estimate. They should simply provide
guidelines and testable criteria for doing it well.
2. The product to be produced is clearly described.
   Evidence: Estimates for product size and content are supported by systematic engineering analyses.

3. The tasks to be estimated are clearly identified.
   Evidence: Checklists are used to identify the elements and activities included in (and excluded from) the estimate.

4. People from related but different projects or disciplines are involved in the estimating process.
   Evidence: Estimating and analysis team members are found in the historical database.

6. More than one cost model or estimating approach is used.
   Evidence: Differences in results are analyzed and accounted for.

7. Potential cost and schedule impacts are estimated for all identified tasks.
   Evidence: A structured process is used to identify and scope technical risks.
10. The estimating organization has a method (such as a database) for organizing and retaining information on completed projects.
    Evidence: Elements in the database are recorded and described in ways that avoid misinterpretation.
    Evidence: Schedule milestones (start and finish dates) are described in terms of criteria for initiation or completion.
11. Postmortems are held at the completion of each project.
    Evidence: Technical seminars and training programs incorporate this information in a timely manner.

12. The emphasis throughout is on developing consistency in describing completed projects and in relating new work to demonstrated performance on those projects.
    Evidence: The consistency achieved when fitting cost models to completed projects is measured and tracked.
    Evidence: The term "model accuracy" (as in "the model made the estimate") is never used.
What organizational support is needed to successfully establish and sustain a reliable estimating
capability?
The following checklist identifies seven indicators of good support. For each indicator, it lists
items of evidence one can look for to help evaluate the quality of this support.
Questions:
Note: The objective of this checklist is to provide guidance to organizations that are seeking to
establish and sustain a successful estimating capability.
Independence/objectivity
5. The estimating capability of the organization is quantified, tracked, and evaluated.
   Evidence: Management tracks and reviews the effectiveness of its estimating capability:
   - what is working well
   - what is not working well
   - opportunities for improvement

7. Estimators help quantify and track progress in software process improvement.
   Evidence: Cost models are used to account for factors that make projects different, so that process improvements can be meaningfully measured and compared.
   Evidence: Trend analyses derived from cost model calibrations are used to track changes in the organization's process and performance parameters.
When you (or your boss) receive a cost or schedule estimate, what should you look for to
determine your willingness to rely on that estimate?
The checklist on the following pages identifies key issues that have been identified as important
to estimates we can trust. Each issue is associated with elements of evidence that, if present,
support the credibility of the estimate.
Question:
Are there other issues that you (or your boss) look at?
What evidence relative to any of these issues would support your willingness to rely on an
estimate?
Note: The objective of this checklist is to provide operational guidelines that help people
evaluate the credibility of cost and schedule estimates for software projects.