
Chapter 12

Software Measurement
Draft <February 4, 2008>
Note: texts in blue come from other SWEBOK Knowledge Areas – 2004 version

ACRONYMS

BSC Balanced Scorecard
CMMI Capability Maturity Model Integration
EFQM European Foundation for Quality Mgmt
FPA Function Point Analysis
FSM Functional Size Measurement
FSMM FSM Method
GQM Goal-Question-Metric
IPO Input-Processing-Output
ISBSG International Software Benchmarking Standards Group
KA Knowledge Area
ODC Orthogonal Defect Classification
QIP Quality Improvement Paradigm
RCA Root-Cause-Analysis
SCM Software Configuration Management
SLC Software Life Cycle
SPC Statistical Process Control
SQM Software Quality Management
SRS Software Requirement Specification
VIM Vocabulary of International Metrology

INTRODUCTION

The importance of measurement and its role in better management practices is widely acknowledged, and so its importance can only increase in the coming years. Effective measurement has become one of the cornerstones of organizational maturity.

Measurement can be performed to support the initiation of process implementation and change, or to evaluate the consequences of process implementation and change, or it can be performed on the product itself.

While the application of measurement to software engineering can be complex, particularly in terms of modeling and analysis methods, there are several aspects of software engineering measurement which are fundamental and which underlie many of the more advanced measurement and analysis processes [Zus97]. Furthermore, achievement of process and product improvement efforts can only be assessed if a set of baseline measures has been established.

Measurement is fundamental to the engineering disciplines and, at the inception of the SWEBOK, it was given to all the Knowledge Area (KA) associate editors as a criterion for identifying relevant measurement-related knowledge in their respective KA. It is therefore one of the three common themes (with Quality and Tools) cutting across most KAs. However, no attempt had been made to ensure that the measurement-related topics in software engineering had full coverage. The purpose of this chapter is to provide an integrated view of the measurement-related knowledge spread throughout all other chapters, and to improve the coverage of the information about measurement as it is actually understood in software engineering. This chapter presents in blue characters the measurement-related information present in the other chapters and, in black characters, the information added to complete the taxonomy of measurement-related concepts in software engineering.

BREAKDOWN OF TOPICS FOR SOFTWARE MEASUREMENT

The breakdown of the Software Measurement KA is presented in Figure 1, together with brief descriptions of the major topics associated with it. Appropriate references are also given for each of the topics. Figure 1 gives a graphical representation of the top-level decomposition of the breakdown for this KA.
Figure 1. Breakdown of topics for the Software Measurement KA

1. Basic Concepts

This section presents the foundations of software measurement, the main definitions and concepts, and the ISO measurement information model.

1.1 Foundations
In sciences and engineering disciplines, it is the domain of knowledge referred to as 'metrology' that focuses on the development and use of measurement instruments and measurement processes. The international consensus on metrology is documented in the ISO Vocabulary of basic and general terms in metrology – VIM [ISO93].
It is to be noted that in the Vocabulary, the term 'measurement', when used as a single term, corresponds to the 'set of operations' for measuring.

In the VIM, 120 measurement-related terms are defined and organized into the following six categories:
1. Quantities and units
2. Measurements
3. Measurement results
4. Measuring instruments
5. Characteristics of the measuring systems
6. Measurement standards – Etalons

To represent the relationships across the elements of these six categories, the classical representation of a production process is selected: e.g. input, output and control variables, as well as the process itself inside the box [Abr02]. In Figure 2, the output is represented by the 'measurement results' and the process itself by the 'measurement' in the sense of measurement operations, while the control variables are the 'Etalons' and the 'quantities and units'. This set of concepts is then completed by the 'measuring instrument', and the measurement operations are themselves also influenced by the 'characteristics of the measuring instruments'.

Figure 2. ISO Process model of the categories of metrology terms in the VIM93 [Abr02]

As in any new domain of application, empirical trials represent the starting point to develop a measure, not necessarily following a rigorous process. But after a community of interest accepts a series of measures to quantify a concept (or a set of concepts through a modeling of the relationships across the concepts), it is usual to ascertain that the proposed measures are verified and validated. Kitchenham [Kit95] has proposed a software measurement validation framework, and this framework has been discussed and analyzed in Sellami et al. [Sel03].

1.2 Definitions and Concepts

1.2.1 Definitions
Key terms on measurement processes have been defined in ISO/IEC 15939 [ISO15939-07] on the basis of the ISO international vocabulary of metrology [ISO93], and the key terms on measures, measuring instruments and measuring systems in the VIM itself. Considering the six categories mentioned above and shown in Figure 2 using the IDEF0 process notation:

• Quantities and units: this section includes definitions about the possible units of measure and coherent systems, as well as terms related to the 'dimension' concept (i.e. base or derived quantity);
• Measurements: this section includes all terms related to the measurement process, including for instance the measurand, measurement signal, transformed value, measurement method, measurement, as well as the influence quantities;
• Measurement results: this section presents the terms related to the types of measurement results, the modes of verification of the measurement results, and information about the uncertainty of measurement;
• Measuring instruments: this section presents a series of terms related to the instruments/devices for measurement;
• Characteristics of the measuring instruments: this section proposes a series of definitions about intervals, conditions, thresholds and accuracy in measurement;
• Measurement standards – Etalons: the VIM mentions that "In science and technology, the English word 'standard' is used with two different meanings: as a widely adopted written standard, specification, technical recommendation or similar document (in French 'norme') and as a measurement standard (in French 'étalon'). This Vocabulary is concerned solely with the second meaning".

1.2.2 Concepts
ISO/IEC 15939 also provides a standard process for measuring both process and product characteristics. Nevertheless, readers will encounter terminological differences in the literature; for example, the term "metric" is sometimes used in place of "measure" [Abr02].

The quality of the measurement results (accuracy, reproducibility, repeatability, convertibility, random measurement errors) is essential for measurement programs to provide effective and bounded results. Key characteristics of measurement results and of the related quality of measuring instruments have been defined in the ISO International vocabulary on metrology [ISO93]. The theory of measurement establishes the foundation on which meaningful measurements can be made. The theory of measurement and scale types is discussed in [Kan02].

Measurement is defined in the theory as "the assignment of numbers to objects in a systematic way to represent properties of the object." An appreciation of software measurement scales and the implications of each scale type in relation to the subsequent selection of data analysis methods is especially important. [Abr96; Fen98: c2; Pfl01: c11]

Meaningful scales are related to a classification of scales. For those, measurement theory provides a succession of more and more constrained ways of assigning the measures. If the numbers assigned are merely to provide labels to classify the objects, they are called nominal. If they are assigned in a way that ranks the objects (for example, good, better, best), they are called ordinal. If they deal with magnitudes of the property relative to a defined measurement unit, they are called interval (and the intervals are uniform between the numbers unless otherwise specified, and are therefore additive). Measurements are at the ratio level if they have an absolute zero point, so ratios of distances to the zero point are meaningful.
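To illustrate how the scale type constrains the subsequent selection of data analysis methods, the short sketch below (an illustrative example added here; the mapping follows the usual measurement-theory conventions and is not quoted from the cited references) lists the summary statistics generally regarded as meaningful for each scale type.

# Illustrative sketch: which summary statistics are usually considered
# admissible for each measurement scale type (assumption: the usual
# measurement-theory conventions; not taken from the cited standards).
ADMISSIBLE_STATISTICS = {
    "nominal":  {"mode", "frequency counts"},
    "ordinal":  {"mode", "frequency counts", "median", "percentiles"},
    "interval": {"mode", "frequency counts", "median", "percentiles",
                 "arithmetic mean", "standard deviation"},
    "ratio":    {"mode", "frequency counts", "median", "percentiles",
                 "arithmetic mean", "standard deviation",
                 "geometric mean", "ratios"},
}

def is_meaningful(statistic: str, scale_type: str) -> bool:
    """Return True if the statistic is usually regarded as meaningful
    for data measured on the given scale type."""
    return statistic in ADMISSIBLE_STATISTICS[scale_type]

# Example: averaging ordinal severity codes is not meaningful,
# whereas averaging effort in person-hours (ratio scale) is.
assert not is_meaningful("arithmetic mean", "ordinal")
assert is_meaningful("arithmetic mean", "ratio")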
1.2.3 Measurement ontology
An initial software measurement ontology is presented in [Garc06]: it is based on ISO 15939 and other international standards and proposals, and aims at contributing to the harmonization and integration of the different approaches to software measurement by providing a set of common concepts.

1.2.4 Design of a software measurement method
The products associated with the design of a software measurement method are [Hab08]:
1. The measurement principle: this is a precise definition of the entities concerned and the attribute to be measured. It involves a description of the empirical world and the numerical world to which the entities are to be mapped:
   1.a. The empirical world can be described through conceptual modeling techniques or through mathematical axioms, or both.
   1.b. The numerical world can be, in the general case, any mathematical set, together with the operations performed on it. It can also be defined through the selection of one scale type (ordinal, interval or ratio). This also includes the definition of units and other allowed composition operations on the mathematical structure.
2. The measurement method: this is a description of the mapping making it possible to obtain a value for a given entity. It involves some general properties of the mapping (mathematical view), together with a collection of assignment rules (operational description):
   2.a. Mapping properties: In addition to the homomorphism of the mapping, this can also include a description of other mapping properties; for instance, a unit axiom (the mandatory association of the number 1 with an entity of the empirical set) or, more generally, an adequate selection of a small finite representative set of elements ranked by domain practitioners.
   2.b. Numerical assignment rules: These correspond to an operational description of the mapping, i.e. how to map empirical entities to numerical values: identification rules, aggregation rules, procedural modeling of a family of measurement instruments, usage rules, etc.
3. The measurement procedure: This corresponds to a complete technical description of the modus operandi of the measurement method in a particular context (goal, precision, constraints, etc.) and with a particular measuring instrument.

Verification activities for the design of a software measure refer to the verification of each of the above design products.
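To make these design products more concrete, the following sketch shows a deliberately trivial measurement method (a hypothetical example, not taken from [Hab08]): the entities are source files, the attribute is size, and the numerical assignment rule maps each file to its count of non-blank lines, on a ratio scale with 'line' as the unit.

# Hypothetical sketch of a measurement method design:
# - measurement principle: entity = source file, attribute = size,
#   numerical world = non-negative integers, unit = "line" (ratio scale)
# - measurement method: map a file to its number of non-blank lines
# - measurement procedure: how the method is applied in a given context
def measure_size_in_lines(path: str) -> int:
    """Assignment rule: count the non-blank lines of one source file."""
    with open(path, encoding="utf-8", errors="replace") as f:
        return sum(1 for line in f if line.strip())

def measurement_procedure(paths: list[str]) -> dict[str, int]:
    """Procedure: apply the method to every file selected for a
    particular measurement context and record the results."""
    return {path: measure_size_in_lines(path) for path in paths}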

1.3 Software Measurement Information Models


The concept of Measurement Information Model is
presented and discussed in ISO 15939 (see Figure 3) to
help in determining what must be specified during
measurement planning, performance and evaluation.
Figure 3 shows that a specific measurement method is used to collect a base measure for a specific attribute. The values of two or more base measures can then be used in a computational formula (by means of a measurement function) to produce a specific derived measure. These derived measures are in turn used in an analysis model to arrive at an indicator, which is a value. The indicator's value is then interpreted, in the language of the measurement user, to explain its relationship to the information need, producing an Information Product that addresses the user's Information Needs, as shown in Figure 3.

Figure 3. ISO 15939 Measurement Information Model

The bottom section of Figure 4 represents the 'Data Collection' steps of this Measurement Information Model (e.g. the measurement methods and the base measures), the middle section the 'Data Preparation' steps, using agreed-upon mathematical formulas and related labels (e.g. measurement functions and derived measures), and the top section the 'Data Analysis' steps (e.g. analysis model, indicator and interpretation).
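The following sketch illustrates this chain with invented names and values (the defect count, the size in KLOC, and the 0.5 defects/KLOC threshold are hypothetical, not prescribed by ISO 15939): two base measures feed a measurement function that yields a derived measure, and an analysis model turns that derived measure into an indicator for an information need.

# Hypothetical sketch of the ISO 15939 chain:
# base measures -> measurement function -> derived measure
# -> analysis model -> indicator (interpreted for an information need).
def derived_defect_density(defects_found: int, size_kloc: float) -> float:
    """Measurement function combining two base measures."""
    return defects_found / size_kloc

def indicator_quality_status(defect_density: float,
                             threshold: float = 0.5) -> str:
    """Analysis model: compare the derived measure against an agreed
    (hypothetical) threshold and produce an indicator."""
    return "acceptable" if defect_density <= threshold else "needs review"

# Information product for the information need "is the product ready?"
density = derived_defect_density(defects_found=12, size_kloc=30.0)
print(f"defect density = {density:.2f} defects/KLOC "
      f"-> {indicator_quality_status(density)}")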

An example of a measurement information model also applied to the software engineering domain is the Balanced Scorecard (BSC) framework [Kapl96; Iban98; Ferg99]. A BSC represents both a process for building a scorecard and an audit of the corporate measurement system at different levels. A BSC should include both lead and lag measures for each selected perspective, as well as links to the organizational strategic map.

2. Measurement Process
This topic follows the international standard ISO/IEC 15939, which describes a process defining the activities and tasks necessary to implement a software measurement process, and which also includes a measurement information model.

Figure 4: Measurement steps within the Measurement Information Model

2.1 Establish and Sustain Measurement Commitment
• Accept requirements for measurement. Each measurement endeavor should be guided by organizational objectives and driven by a set of measurement requirements established by the
organization and the project. For example, an organizational objective might be "first-to-market with new products." [Fen98: c3,c13; Pre04: c22] This in turn might engender a requirement that factors contributing to this objective be measured so that projects might be managed to meet this objective.
- Define scope of measurement. The organizational unit to which each measurement requirement is to be applied must be established. This may consist of a functional area, a single project, a single site, or even the whole enterprise. All subsequent measurement tasks related to this requirement should be within the defined scope. In addition, the stakeholders should be identified.
- Commitment of management and staff to measurement. The commitment must be formally established, communicated, and supported by resources (see next item).
• Commit resources for measurement. The organization's commitment to measurement is an essential factor for success, as evidenced by assignment of resources for implementing the measurement process. Assigning resources includes allocation of responsibility for the various tasks of the measurement process (such as user, analyst, and librarian) and providing adequate funding, training, tools, and support to conduct the process in an enduring fashion.

2.2 Plan the Measurement Process
• Characterize the organizational unit. The organizational unit provides the context for measurement, so it is important to make this context explicit and to articulate the assumptions that it embodies and the constraints that it imposes. Characterization can be in terms of organizational processes, application domains, technology, and organizational interfaces. An organizational process model is also typically an element of the organizational unit characterization [ISO15939-07: 5.2.1].
• Identify information needs. Information needs are based on the goals, constraints, risks, and problems of the organizational unit. They may be derived from business, organizational, regulatory, and/or product objectives. They must be identified and prioritized. Then, a subset to be addressed must be selected and the results documented, communicated, and reviewed by stakeholders [ISO 15939-07: 5.2.2].
• Select measures. Candidate measures must be selected, with clear links to the information needs. Measures must then be selected based on the priorities of the information needs and other criteria such as cost of collection, degree of process disruption during collection, ease of analysis, ease of obtaining accurate, consistent data, and so on [ISO15939-07: 5.2.3 and Appendix C].
• Define data collection, analysis, and reporting procedures. This encompasses collection procedures and schedules, storage, verification, analysis, reporting, and configuration management of data [ISO15939-07: 5.2.4].
• Define criteria for evaluating the information products. Criteria for evaluation are influenced by the technical and business objectives of the organizational unit. Information products include those associated with the product being produced, as well as those associated with the processes being used to manage and measure the project [ISO15939-07: 5.2.5 and Appendices D, E].
• Review, approve, and provide resources for measurement tasks.
- The measurement plan must be reviewed and approved by the appropriate stakeholders. This includes all data collection procedures, storage, analysis, and reporting procedures; evaluation criteria; schedules; and responsibilities. Criteria for reviewing these artifacts should have been established at the organizational unit level or higher and should be used as the basis for these reviews. Such criteria should take into consideration previous experience, availability of resources, and potential disruptions to projects when changes from current practices are proposed. Approval demonstrates commitment to the measurement process [ISO15939-07: 5.2.6.1 and Appendix F].
- Resources should be made available for implementing the planned and approved measurement tasks. Resource availability may be staged in cases where changes are to be piloted before widespread deployment. Consideration should be paid to the resources necessary for successful deployment of new procedures or measures [ISO15939-07: 5.2.6.2].
• Acquire and deploy supporting technologies. This includes evaluation of available supporting technologies, selection of the most appropriate technologies, acquisition of those technologies, and deployment of those technologies [ISO 15939-07: 5.2.7].

2.3 Perform the Measurement Process
• Integrate measurement procedures with relevant processes. The measurement procedures, such as data collection, must be integrated into the processes they are measuring. This may involve changing current processes to accommodate data collection or generation activities. It may also involve analysis of current processes to minimize additional effort and evaluation of the effect on employees to ensure that the measurement procedures will be accepted. Morale issues and other human factors need to be considered. In addition, the measurement procedures must be communicated to those providing the data, training may need to be provided, and support must typically be provided. Data analysis and reporting
procedures must typically be integrated into organizational and/or project processes in a similar manner [ISO 15939-07: 5.3.1].
• Collect data. The data must be collected, verified, and stored [ISO 15939-07: 5.3.2].
• Analyze data and develop information products. Data may be aggregated, transformed, or recoded as part of the analysis process, using a degree of rigor appropriate to the nature of the data and the information needs. The results of this analysis are typically indicators such as graphs, numbers, or other indications that must be interpreted, resulting in initial conclusions to be presented to stakeholders. The results and conclusions must be reviewed, using a process defined by the organization (which may be formal or informal). Data providers and measurement users should participate in reviewing the data to ensure that they are meaningful and accurate and that they can result in reasonable actions [ISO 15939-07: 5.3.3 and Appendix G].
• Communicate results. Information products must be documented and communicated to users and stakeholders [ISO 15939-07: 5.3.4].

2.4 Evaluate Measurement
• Evaluate information products. Evaluate information products against specified evaluation criteria and determine strengths and weaknesses of the information products. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO 15939-07: 5.4.1 and Appendix D].
• Evaluate the measurement process. Evaluate the measurement process against specified evaluation criteria and determine the strengths and weaknesses of the process. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO 15939-07: 5.4.1 and Appendix D].
• Identify potential improvements. Such improvements may be changes in the format of indicators, changes in units measured, or reclassification of categories. Determine the costs and benefits of potential improvements and select appropriate improvement actions. Communicate proposed improvements to the measurement process owner and stakeholders for review and approval. Also communicate lack of potential improvements if the analysis fails to identify improvements [ISO 15939-07: 5.4.2].

3. Measures by Life Cycle Phase
The current version of the Guide to the SWEBOK documents the consensus on the structure of the software engineering knowledge, which consists of ten (10) Knowledge Areas (KA): the first five representing what ISO calls the primary processes in the 12207 standard [ISO95], and the other five the support and organizational processes. This structure has been chosen for listing measures for each one of the 10 KAs.

3.1 Primary Processes

3.1.1. Software Requirements
As a practical matter, it is typically useful to have some concept of the "volume" of the requirements for a particular software product. This number is useful in evaluating the "size" of a change in requirements, in estimating the cost of a development or maintenance task, or simply for use as the denominator in other measurements. Functional Size Measurement (FSM) is a technique for evaluating the size of a body of functional requirements. IEEE Std 14143.1 defines the concept of FSM [IEEE14143.1-00]. Standards from ISO/IEC and other sources describe particular FSM methods.
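As a simplified illustration of the idea behind FSM, the sketch below credits one size unit per data movement of each functional process, in the style of a COSMIC-like method (the process list is invented and the rule is greatly simplified; see ISO 19761 for the authoritative measurement rules).

# Simplified, hypothetical FSM illustration: each functional process is
# described by its data movements, and one size unit is credited per
# data movement (a COSMIC-like convention; see ISO 19761 for the
# authoritative rules).
functional_processes = {
    "register customer": {"entries": 1, "exits": 1, "reads": 1, "writes": 1},
    "list open orders":  {"entries": 1, "exits": 1, "reads": 2, "writes": 0},
}

def functional_size(processes: dict[str, dict[str, int]]) -> int:
    """Sum the data movements of all functional processes."""
    return sum(sum(movements.values()) for movements in processes.values())

print(functional_size(functional_processes))  # 8 size units in this example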
3.1.2. Software Design
Measures can be used to assess or to quantitatively estimate various aspects of a software design's size, structure, or quality. Most measures that have been proposed generally depend on the approach used for producing the design. These measures are classified in two broad categories:
• Function-oriented (structured) design measures: the design's structure, obtained mostly through functional decomposition, is generally represented as a structure chart (sometimes called a hierarchical diagram) on which various measures can be computed [Jal97: c5,c7; Pre04: c15]
• Object-oriented design measures: the design's overall structure is often represented as a class diagram, on which various measures can be computed. Measures on the properties of each class's internal content can also be computed [Jal97: c6,c7; Pre04: c15]

3.1.3. Software Construction
Numerous construction activities and artefacts can be measured, including code developed, code modified, code reused, code destroyed, code complexity, code inspection statistics, fault-fix and fault-find rates, effort, and scheduling. These measurements can be useful for purposes of managing construction, ensuring quality during construction, improving the construction process, as well as for other reasons [McC04].
Even if the measures applied to software source code evolve with new programming languages and styles, some 'older' measures continue to be discussed and revisited with new technologies (for instance, the cyclomatic complexity number [McCA 76]).
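For instance, McCabe's cyclomatic complexity can be computed from the control-flow graph as V(G) = E - N + 2P, where E is the number of edges, N the number of nodes and P the number of connected components; the sketch below applies this formula to an invented example graph.

# Cyclomatic complexity from a control-flow graph: V(G) = E - N + 2P.
def cyclomatic_complexity(num_edges: int, num_nodes: int,
                          num_components: int = 1) -> int:
    return num_edges - num_nodes + 2 * num_components

# Hypothetical control-flow graph of a small routine:
# 7 nodes, 8 edges, one connected component -> V(G) = 8 - 7 + 2 = 3,
# i.e. three linearly independent paths through the code.
print(cyclomatic_complexity(num_edges=8, num_nodes=7))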
3.1.4. Software Testing
• Program measurements to aid in planning and designing testing: Measures based on program size (for example, source lines of code or function points) or on program structure (like complexity) are used to guide testing. Structural measures can also include measurements among program modules in terms of the frequency with which
modules call each other [Bei90: c7s4.2; Jor02: c9] (Ber96; IEEE982.1-88).
• Fault density: A program under test can be assessed by counting and classifying the discovered faults by their types. For each fault class, fault density is measured as the ratio between the number of faults found and the size of the program (a short computational sketch of this and of related test measures follows this list). [Per95: c20] (IEEE982.1-88; Lyu96: c9)
• Life test, reliability evaluation: A statistical estimate of software reliability, which can be obtained by reliability achievement and evaluation (see Chapter 5, sub-topic 2.2.5), can be used to evaluate a product and decide whether or not testing can be stopped. [Pfl01: c9] (Pos96: p146-154)
• Reliability growth models: Reliability growth models provide a prediction of reliability based on the failures observed under reliability achievement and evaluation (see Chapter 5, sub-topic 2.2.5). They assume, in general, that the faults that caused the observed failures have been fixed (although some models also accept imperfect fixes), and thus, on average, the product's reliability exhibits an increasing trend. There now exist dozens of published models. Many share some common assumptions, while others differ. Notably, these models are divided into failure-count and time-between-failures models. [Lyu96: c7; Pfl01: c9] (Lyu96: c3, c4)
• Coverage/thoroughness measures: Several test adequacy criteria require that the test cases systematically exercise a set of elements identified in the program or in the specifications (see Chapter 5, sub-area 3). To evaluate the thoroughness of the executed tests, testers can monitor the elements covered, so that they can dynamically measure the ratio between covered elements and their total number. For example, it is possible to measure the percentage of covered branches in the program flow-graph, or that of the functional requirements exercised among those listed in the specifications document. Code-based adequacy criteria require appropriate instrumentation of the program under test. [Jor02: c9; Pfl01: c8] (IEEE982.1-88)
• Fault seeding: Some faults are artificially introduced into the program before testing. When the tests are executed, some of these seeded faults will be revealed, and possibly some faults which were already there will be as well. In theory, depending on which of the artificial faults are discovered, and how many, testing effectiveness can be evaluated, and the remaining number of genuine faults can be estimated. In practice, statisticians question the distribution and representativeness of seeded faults relative to genuine faults and the small sample size on which any extrapolations are based. Some also argue that this technique should be used with great care, since inserting faults into software involves the obvious risk of leaving them there. [Pfl01: c8] (Zhu97: s3.1)
• Mutation score: In mutation testing (see Chapter 5, sub-topic 3.4.2), the ratio of killed mutants to the total number of generated mutants can be a measure of the effectiveness of the executed test set. [Zhu97: s3.2-s3.3]
• Cost/effort estimation and other process measures: Several measures related to the resources spent on testing, as well as to the relative fault-finding effectiveness of the various test phases, are used by managers to control and improve the test process. These test measures may cover such aspects as number of test cases specified, number of test cases executed, number of test cases passed, and number of test cases failed, among others. Evaluation of test phase reports can be combined with root-cause analysis to evaluate test process effectiveness in finding faults as early as possible. Such an evaluation could be associated with the analysis of risks. Moreover, the resources that are worth spending on testing should be commensurate with the use/criticality of the application: different techniques have different costs and yield different levels of confidence in product reliability. [Per95: c4,c21] (Per95: Appendix B; Pos96: p139-145; IEEE982.1-88)
• Termination: A decision must be made as to how much testing is enough and when a test stage can be terminated. Thoroughness measures, such as achieved code coverage or functional completeness, as well as estimates of fault density or of operational reliability, provide useful support, but are not sufficient in themselves. The decision also involves considerations about the costs and risks incurred by the potential for remaining failures, as opposed to the costs implied by continuing to test. See also Chapter 5, sub-topic 1.2.1, Test selection criteria/Test adequacy criteria. [Bei90: c2s2.4; Per95: c2]
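The sketch below expresses a few of the test measures above as simple ratios and estimates (all input values are invented for illustration, and the fault-seeding estimator shown is the usual proportional extrapolation rather than a formula prescribed by the cited references).

# Illustrative computations for some of the test measures above
# (all input values are hypothetical).
def fault_density(faults_found: int, size_kloc: float) -> float:
    """Faults found per KLOC for a given fault class."""
    return faults_found / size_kloc

def branch_coverage(covered_branches: int, total_branches: int) -> float:
    """Ratio of covered branches to total branches in the flow-graph."""
    return covered_branches / total_branches

def mutation_score(killed_mutants: int, generated_mutants: int) -> float:
    """Ratio of killed mutants to generated mutants."""
    return killed_mutants / generated_mutants

def estimated_total_genuine_faults(seeded_total: int, seeded_found: int,
                                   genuine_found: int) -> float:
    """Proportional extrapolation used in fault seeding: if the same
    fraction of genuine faults as of seeded faults has been found, then
    total genuine faults ~= genuine_found * seeded_total / seeded_found."""
    return genuine_found * seeded_total / seeded_found

print(fault_density(faults_found=18, size_kloc=12.0))              # 1.5 faults/KLOC
print(branch_coverage(covered_branches=170, total_branches=200))   # 0.85
print(mutation_score(killed_mutants=90, generated_mutants=120))    # 0.75
print(estimated_total_genuine_faults(seeded_total=20,
                                     seeded_found=15,
                                     genuine_found=30))             # 40.0 faults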
3.1.5. Software Maintenance
Abran [Abr93] presents internal benchmarking techniques to compare different internal maintenance organizations. The maintainer must determine which measures are appropriate for the organization in question. [IEEE1219-98; ISO9126-01; Sta94] suggest measures which are more specific to software maintenance measurement programs. That list includes a number of measures for each of the four sub-characteristics of maintainability:
• Analyzability: Measures of the maintainer's effort or resources expended in trying to diagnose deficiencies or causes of failure, or in identifying parts to be modified
• Changeability: Measures of the maintainer's effort associated with implementing a specified modification
• Stability: Measures of the unexpected behavior of software, including that encountered during testing
• Testability: Measures of the maintainer's and users' effort in trying to test the modified software

Certain measures of the maintainability of software can be obtained using available commercial tools (Lag96; Apr00). See also [Car90: s2-s3; IEEE1219-98: Table 3; Sta94: p239-249].

3.2 Supporting Processes
According to ISO 12207 [ISO95], a supporting process "supports another process as an integral part with a distinct purpose and contributes to the success and quality of the software project. A supporting process is employed and executed, as needed, by another process".

3.2.1. Software Configuration Management
SCM measures can be designed to provide specific information on the evolving product or to provide insight into the functioning of the SCM process. A related goal of monitoring the SCM process is to discover opportunities for process improvement. Measurements of SCM processes provide a good means for monitoring the effectiveness of SCM activities on an ongoing basis. These measurements are useful in characterizing the current state of the process, as well as in providing a basis for making comparisons over time. Analysis of the measurements may produce insights leading to process changes and corresponding updates to the SCMP. [Buc96: c3; Roy98]
Software libraries and the various SCM tool capabilities provide sources for extracting information about the characteristics of the SCM process (as well as providing project and management information). For example, information about the time required to accomplish various types of changes would be useful in an evaluation of the criteria for determining what levels of authority are optimal for authorizing certain types of changes. Care must be taken to keep the focus of the surveillance on the insights that can be gained from the measurements, not on the measurements themselves. Discussion of process and product measurement is presented in the Software Engineering Process KA. The software measurement program is described in the Software Engineering Management KA.

3.2.2. Software Engineering Management
Evaluate the measurement process against specified evaluation criteria and determine the strengths and weaknesses of the process. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO 15939-07: 5.4.1 and Appendix D].
When incomplete or insufficient data are available, appropriate techniques should be used for the analysis of historical databases [Stri00].

3.2.3. Software Engineering Process
As a multi-dimensional attribute, quality measurement is less straightforward to define than those above. Furthermore, some of the dimensions of quality are likely to require measurement in qualitative rather than quantitative form. A more detailed discussion of software quality measurement is provided in the Software Quality KA, topic 3.4. ISO models of software product quality and of related measurements are described in ISO 9126, parts 1 to 4 [ISO9126-01]. [Fen98: c9,c10; Pre04: c15; Som05: c24]

3.2.4. Software Tools & Methods
Software tools are becoming more and more crucial for achieving proper results, because projects present growing complexity due to new technologies, new programming languages, and the need to integrate several huge flows of data. Because of their continuous evolution, tool evaluation is an essential topic [Pfl01] (IEEE1209-92; IEEE1348-95; Mos92; Val97). The ISO JTC1/SC7/WG4 is developing a series of standards related to the evaluation of software tools (including criteria and measures) [ISO14471].

On software methods, three groups are discussed in Chapter 10 (heuristic, formal, prototyping), each one dealing with a series of approaches. In particular, the added value for measurement that these methods can bring is, respectively, methods emphasizing the non-functional view of the project, refinement of specifications, and prototyping evaluation techniques.

3.2.5. Software Quality
The models of software product quality often include measures to determine the degree of each quality characteristic attained by the product [Gra92]. If they are selected properly, measures can support software quality (among other aspects of the software life cycle processes) in multiple ways. They can help in the management decision-making process. They can find problematic areas and bottlenecks in the software process; and they can help the software engineers assess the quality of their work for SQA purposes and for longer-term process quality improvement. With the increasing sophistication of software, questions of quality go beyond whether or not the software works to how well it achieves measurable quality goals. There are a few more topics where measurement supports SQM directly. These include assistance in deciding when to stop testing. For this, reliability models and benchmarks, both using fault and failure data, are useful. The cost of SQM processes is an issue which is almost always raised in deciding how a project should be organized. Often, generic models of cost are used, which are based on when a defect is found and how much effort it takes to fix the defect relative to finding the defect earlier in the development process. Project data may give a better picture of cost. Discussion on this topic can be found in [Rak97: pp. 39-50]. Related information can be found in the Software Engineering Process and Software Engineering Management KAs. Finally, the SQM reports themselves provide valuable information not only on these processes, but also on how all the software life cycle processes can be improved. Discussions on these topics are found in [McC04] and (IEEE1012-98).

While the measures for quality characteristics and product features may be useful in themselves (for example, the number of defective requirements or the proportion of defective requirements), mathematical and graphical techniques can be applied to aid in the interpretation of the measures. These fit into the following categories and are discussed in [Fen98, Jon96, Kan02, Lyu96, Mus99]:
• Statistically based (for example, Pareto analysis, run charts, scatter plots, normal distribution)
• Statistical tests (for example, the binomial test, chi-squared test)
• Trend analysis
• Prediction (for example, reliability models)

The statistically based techniques and tests often provide a snapshot of the more troublesome areas of the software product under examination. The resulting charts and graphs are visualization aids which the decision-makers can use to focus resources where they appear most needed. Results from trend analysis may indicate that a schedule has not been respected, such as in testing, or that certain classes of faults will become more intense unless some corrective action is taken in development. The predictive techniques assist in planning test time and in predicting failure. More discussion on measurement in general appears in the Software Engineering Process and Software Engineering Management KAs. More specific information on testing measurement is presented in the Software Testing KA.
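As an example of the statistically based techniques, the following sketch performs a simple Pareto analysis of defect counts by category (the categories and counts are invented for illustration).

# Hypothetical Pareto analysis: rank defect categories by frequency and
# report the cumulative share, to show where most defects concentrate.
defect_counts = {
    "interface": 42, "logic": 77, "data handling": 23,
    "documentation": 9, "configuration": 14,
}

def pareto(counts: dict[str, int]) -> list[tuple[str, int, float]]:
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    cumulative, rows = 0, []
    for category, count in ranked:
        cumulative += count
        rows.append((category, count, cumulative / total))
    return rows

for category, count, cum_share in pareto(defect_counts):
    print(f"{category:15s} {count:4d}  cumulative {cum_share:5.1%}")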
References [Fen98, Jon96, Kan02, Pfl01] provide discussion on defect analysis, which consists of measuring defect occurrences and then applying statistical methods to understand which types of defects occur most frequently, that is, answering questions in order to assess their density. Defect analysis also aids in understanding the trends, how well detection techniques are working, and how well the development and maintenance processes are progressing. Measurement of test coverage helps to estimate how much test effort remains to be done and to predict possible remaining defects. From these measurement methods, defect profiles can be developed for a specific application domain. Then, for the next software system within that organization, the profiles can be used to guide the SQM processes, that is, to expend the effort where the problems are most likely to occur. Similarly, benchmarks, or defect counts typical of that domain, may serve as one aid in determining when the product is ready for delivery. Discussion on using data from SQM to improve development and maintenance processes appears in the Software Engineering Management and the Software Engineering Process KAs.

4. Techniques & Tools
Measurement techniques and tools are useful to support decision-makers in the use of quantitative information. Additional techniques to the ones in Section 4.1 can be found in the Total Quality Management (TQM) approach.

4.1 Measurement Techniques
Measurement techniques may be used to analyze software engineering processes and to identify strengths and weaknesses. This can be performed to initiate process implementation and change, or to evaluate the consequences of process implementation and change. The quality of measurement results, such as accuracy, repeatability, and reproducibility, is an issue in the measurement of software engineering processes, since there are both instrument-based and judgmental measurements, as, for example, when assessors assign scores to a particular process. A discussion of, and a method for achieving, quality of measurement are presented in [Gol99]. Process measurement techniques have been classified into two general types: analytic and benchmarking. The two types of techniques can be used together since they are based on different types of information. (Car91)
• Analytical techniques: The analytical techniques are characterized as relying on "quantitative evidence to determine where improvements are needed and whether an improvement initiative has been successful." The analytical type is exemplified by the Quality Improvement Paradigm (QIP) consisting of a cycle of understanding, assessing, and packaging [SEL96]. The techniques presented next are intended as other examples of analytical techniques, and reflect what is done in practice. [Fen98; Mus99] (Lyu96; Wei93; Zel98) Whether or not a specific organization uses all these techniques will depend, at least partially, on its maturity.
o Experimental Studies: Experimentation involves setting up controlled or quasi experiments in the organization to evaluate processes. (McG94) Usually, a new process is compared with the current process to determine whether or not the former has better process outcomes.
Another type of experimental study is process simulation. This type of study can be used to analyze process behaviour, explore process improvement potentials, predict process outcomes if the current process is changed in a certain way, and control process execution. Initial data about the performance of the current process need to be collected, however, as a basis for the simulation.
o Process Definition Review is a means by which a process definition (either a descriptive or a prescriptive one, or both) is reviewed, and deficiencies and potential process improvements identified. Typical examples of this are presented in (Ban95; Kel98). An easy operational way to analyze a process is to compare it to an existing standard (national, international, or professional body), such as IEEE/EIA 12207.0 [IEEE12207.0-96]. With this approach, quantitative data are not collected on the process, or, if they are, they play a
supportive role. The individuals performing the analysis of the process definition use their knowledge and capabilities to decide what process changes would potentially lead to desirable process outcomes. Observational studies can also provide useful feedback for identifying process improvements. (Agr99)
o Root Cause Analysis is another common analytical technique which is used in practice. This involves tracing back from detected problems (for example, faults) to identify the process causes, with the aim of changing the process to avoid these problems in the future. Examples for different types of processes are described in (Col93; Ele97; Nak91).
o Orthogonal Defect Classification (ODC) is a technique which can be used to link faults found with potential causes. It relies on a mapping between fault types and fault triggers. (Chi92; Chi96) The IEEE standard on the classification of faults (or anomalies) may be useful in this context (IEEE Standard for the Classification of Software Anomalies - IEEE1044-93). ODC is thus a technique used to make a quantitative selection of where to apply Root Cause Analysis.
o Statistical Process Control is an effective way to identify stability, or the lack of it, in the process through the use of control charts and their interpretations (a small computational sketch follows this list). A good introduction to SPC in the context of software engineering is presented in (Flo99).
o The Personal Software Process defines a series of improvements to an individual's development practices in a specified order [Hum95]. It is 'bottom-up' in the sense that it stipulates personal data collection and improvements based on the data interpretations.
• Benchmarking techniques: The second type of technique, benchmarking, "depends on identifying an 'excellent' organization in a field and documenting its practices and tools." Benchmarking assumes that if a less-proficient organization adopts the practices of the excellent organization, it will also become excellent. Benchmarking involves assessing the maturity of an organization or the capability of its processes. It is exemplified by the software process assessment work. A general introductory overview of process assessments and their application is provided in (Zah98).
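As an illustration of the calculations behind a control chart, the sketch below computes three-sigma limits for an individuals (XmR) chart and flags observations outside them (the data, and the choice of this particular chart, are assumptions made for illustration).

# Hypothetical SPC sketch: XmR (individuals) chart with three-sigma limits
# estimated from the average moving range (d2 = 1.128 for ranges of two).
from statistics import mean

def control_limits(observations: list[float]) -> tuple[float, float, float]:
    """Return (lower limit, centre line, upper limit)."""
    centre = mean(observations)
    moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
    sigma_hat = mean(moving_ranges) / 1.128
    return centre - 3 * sigma_hat, centre, centre + 3 * sigma_hat

def out_of_control(observations: list[float]) -> list[float]:
    lcl, _, ucl = control_limits(observations)
    return [x for x in observations if x < lcl or x > ucl]

# Invented data: review effort (hours) for successive inspections.
effort = [4.2, 3.9, 4.5, 4.1, 15.0, 4.0, 4.3, 3.8]
print(control_limits(effort))
print(out_of_control(effort))   # the 15.0 observation signals instability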
4.2 Measurement Tools
Software engineering management tools are subdivided into three categories [Dor02]: project planning and tracking, risk management, and measurement.
• Project planning and tracking tools. These tools are used in software project effort measurement and cost estimation, as well as project scheduling.
• Risk management tools. These tools are used in identifying, estimating, and monitoring risks.
• Measurement tools. The measurement tools assist in performing the activities related to the software measurement program.

5. Quantitative Data
Quantitative data is identified by Vincenti as one of the engineering knowledge types [Vin90]. According to Vincenti, quantitative data should be represented in tables and graphs, proposing measurement references and codified experimental data, in order to show to those interested practical results from the consistent application of a shared practice in the domain. Such quantitative data is currently not referenced in any of the SWEBOK chapters.

5.1 Types of Entities
Quantitative data are typically collected based on the following entities being measured.

5.1.1. Organization
Data on organizations include, for example, the semi-annual results on CMMI staged appraisals [SEMA06a][SEMA06b], as well as the results from awards associated with Performance Management Models such as Malcolm Baldrige in the U.S. and EFQM in Europe.

5.1.2. Project
Data on projects are available from project benchmarking repositories such as that of the ISBSG (International Software Benchmarking Standards Group) [ISBSG07a], with more than 4,000 projects gathered worldwide as of 2006; a characteristic of this repository is that the size of all of these projects is measured with one of the ISO-recognized FSM methods - see www.isbsg.org.

5.1.3. Resource
For the 'resource' entity, the most relevant one is people. P-CMM [PCMM-01] appraisal data could be a good source of information about people management [PCMM-06]. Other information on resources can be retrieved from the ISBSG repository [ISBSG07a].

5.1.4. Process
Process data are typically available from project repositories (see 5.2) or from process assessment appraisals [SEMA06a][SEMA06b]. Repositories of process assessment appraisal data are typically proprietary.

5.1.5. Product
The data collected on software products are based either on individually defined measures and models, or on measures and models proposed by standards organizations: ISO 9126 (to be re-labeled as the ISO 25000 series) contains examples of both measures and models to quantify software product quality. There are some publications on data based on such measures and models, but with limited scope. An example of the use of the ISO 9126 approach is documented in [Fra03] for
Web Environment and software package selection. A mapping of ISO 9126 to ISBSG is provided in [Che07].

5.2 Repositories
In 2008, two publicly available data repositories are available to the software engineering community: the ISBSG repository [ISBSG07] and the PROMISE repository for quality studies (Boett07). The ISBSG repository has, as of the beginning of 2008, more than 4,100 projects of several types. These repositories are sources of data for external benchmark analysis as well as examples of data collection procedures, including standardized definitions for the data collected.

6. Measurement Standards
A de jure standard is typically defined and adopted by an (independent) standardization body such as ISO and IEC, ANSI (USA), AFNOR (France), etc. A de facto standard is typically defined and adopted by a community of interest, such as IEEE, ISBSG, etc.

6.1. De jure standards

6.1.1 Software Product Quality
• ISO defines a standard model for the quality of a software product [ISO9126-01], with three views of software quality: internal, external and quality in use. While ISO 9126 Parts 2 to 4 propose candidate measures for each of these views, none of these measures is yet considered by ISO as a measurement standard. ISO 9126 is currently under revision at ISO and will be re-published as the ISO 25000 series.
• In addition, ISO proposes a software usability model [ISO9241].

6.1.2 Software functional size
• Functional Size Measurement (FSM): key concepts have been specified in ISO 14143-1 for a method to be recognized as a Functional Size Measurement Method. The ISO 14143 series includes five additional technical reports on various aspects related to functional size measurement methods, such as verification criteria and functional domains.
• Five functional size measurement methods have been adopted as ISO standards in conformity with ISO 14143-1:
  • COSMIC-FFP v2.1 [ISO19761-03],
  • IFPUG FPA v4.1 Unadjusted [ISO20926-03],
  • Mk-II FPA v1.3.1 [ISO20968-02],
  • NESMA FPA v2.1 [ISO24570],
  • FISMA FPA v1.1 [ISO29881].

6.1.3 Software measurement process
• ISO 15939 [ISO15939-07] documents the software measurement process – see Section 2.

6.1.4 Software process
• ISO 19011 documents a generic process appraisal (ISO 9001, ISO 14001).
• ISO 15504 documents …, including ISO 15504-5 for other exemplar process models.

6.2. De facto standards

6.2.1 SEI
?????

6.2.2 ISBSG
The ISBSG (International Software Benchmarking Standards Group) has put in place a data collection standard on software projects [ISBSG07a]. The ISBSG data collection questionnaire [ISBSG07b] proposes 7 sections subdivided into several subsections:
• Submitter Information: in this section, information is collected about the organization and the people filling out the questionnaire.
• Project Process: information is collected about the project process. The information collected can be detailed along the various phases of a software life cycle (SLC).
• Technology: information is collected about the tools used for developing and carrying out the project and for each stage of the software life cycle.
• People and Work Effort: three groups of people are considered: the development team, customers and end-users, and IT operations. Collected information includes information about the various people working on the project, their roles and their expertise, and the effort expended for each phase of the software life cycle.
• Product: information is collected about the software product itself (e.g. software application type and deployment platform, such as client/server, etc.). This also includes functional size.
• Project Completion: this last section of the questionnaire provides an overall picture of the project, including project duration, defects, number of lines of code, user satisfaction and project costs, including cost validation.

6.2.3 Others
• Defects Measurement: some organizations have proposed standards for measuring defects, such as [UKSMA00].
MATRIX OF TOPICS VS. REFERENCE MATERIAL

(Columns: Measurement Topics Breakdown | Source/Item in SWEBOK (1) | International Standards | Books | Papers & Book chapters: empirical method used. Empty cells are shown as "—".)

(1) The KAs are introduced by their initial letters: e.g. Software Engineering Management = SEM; Software Quality = SQ; and so on.

1.0. Basic Concepts | New | — | — | —
1.1. Foundations | SEP, §8.4 | [ISO93] | [Zus97] | [Abr03]: Legacy (B2); [Shep95]
1.2. Definitions and concepts | — | — | — | —
1.2.1. Definitions | SEP, §8.4.2 | [ISO15939-07]; [ISO93] | — | —
1.2.2. Concepts | SEM, §7.6 | [ISO93] | [Kan02] | [Abr96]: Legacy (B2); [Fen98: c2]: Literature Search (B1); [Pfl01: c11]: Literature Search (B1); [Abr02]; [Garc06]
1.3. Software Measurement Models | SEM, §7.2.6 | [ISO15939-07] | — | [Abr02]: Literature Search (B1); [Jac97]: Static Analysis (B4)
1.4. Types of Entities | New | — | [Kapl96] | —
1.4.1. Fenton & Pfleeger | — | — | [Fent98] | —
1.4.2. ISBSG | — | — | [ISBSG07a] | —
1.4.3. Further groupings | — | — | — | [Bug02]: Literature Search (B1)
2.0. Measurement Process | — | — | — | —
2.1. Establish and Sustain Measurement Commitment | SEM, §2.6.1 | [ISO15939-07]; [PSM03] | — | [Fen98: c3,c13]: Literature Search (B1); [Pre04: c22]: Literature Search (B1)
2.2. Plan the Measurement Process | SEM, §2.6.2 | [ISO15939-07]; [PSM03] | — | —
2.3. Perform the Measurement Process | SEM, §2.6.3 | [ISO15939-07]; [PSM03] | — | —
2.4. Evaluate Measurement Process | SEM, §2.6.4 | [ISO15939-07]; [PSM03] | — | —
3.0. Measures by SLC phase | — | — | — | —
3.1. Primary Processes | — | — | — | —
3.1.1. Software Requirements | SR, §1.7.5 | [IEEE14143.1-00]; [ISO19761-03]; [ISO20926-03]; [ISO20968-02] | — | —
3.1.2. Software Design | SD, §2.4.3 | — | — | [Jal97: c5,c6,c7]: Literature Search (B1); [Pre04: c15]: Literature Search (B1)
3.1.3. Software Construction | SC, §3.2.3 | — | [McC04] | [McCA76]: Static Analysis (B4)
3.1.4. Software Testing | ST, §4.4.1.1 | — | — | [Bei90: c7s4.2]: Literature Search (B1); [Jor02: c9]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.4.1.3 | — | — | [Per95: c20]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.4.1.4 | — | — | [Pfl01: c9]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.4.1.5 | — | — | [Lyu96: c7]: Literature Search (B1); [Pfl01: c9]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.4.2.1 | — | — | [Jor02: c9]: Literature Search (B1); [Pfl01: c8]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.4.2.2 | — | — | [Pfl01: c8]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.4.2.3 | — | — | [Zhu97: s3.2-s3.3]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.5.1.6 | — | — | [Per95: c4,c21]: Literature Search (B1)
3.1.4. (cont.) | ST, §4.5.1.7 | — | — | [Bei90: c2s2.4]: Literature Search (B1); [Per95: c2]: Literature Search (B1)
3.1.5. Software Maintenance | SM, §5.2.4.1 | [IEEE1219-98: Table 3]; [IEEE1219-98]; [ISO9126-01]; [ISO19761-03] | — | [Abr93]: Case Study (A3); [Car90: s2-s3]: Literature Search (B1); [Sta94: 239-249]: Field Study (A4)
3.2. Supporting Processes | — | — | — | —
3.2.1. Software Configuration Management | SCM, §6.1.5.1 | — | — | [Buc96: c3]: Literature Search (B1); [Roy98: 188-202, 283-298]
3.2.2. Software Engineering Management | SEM, §7.6.4 | [ISO15939-07: s5.4.1, s5.4.2 + App. D] | — | [Stri00]: Legacy (B2)
3.2.3. Software Engineering Process | SEP, §8.4.1 | [ISO15939-07] | — | [Fen98: c3,c11]: Literature Search (B1); [Som05: c25]: Literature Search (B1)
3.2.4. Software Engineering Tools | New | — | — | —
3.2.5. Software Quality | SQ, §10.3.4 | — | [Gra92]; [Fen97]; [Jon96]; [Kan02]; [Lyu96]; [Mus99]; [Pfl01] | [Rak97: pp. 39-50]: Literature Search (B1)
3.2.6. Software Measurement | SEM, §7.6.4 | [ISO15939-07: s5.4.1 + App. D] | — | —
4.0. Techniques & Tools | — | — | — | —
4.1. Measurement Techniques | SEP, §8.4.5; §8.4.5.1; §8.4.5.2 | [IEEE12207.0-96] | [Gol99]; [SEL96]; [Hum95] | [Fen98]; [Mus99]
4.2. Measurement Tools | SETM, §9.1.7 | — | [Dor02] | —
5.0. Quantitative Data | New | — | — | —
5.1. Types of Entities | — | — | — | —
5.1.1. Organization | Appraisals (CMMI, Sw-CMM, SPICE, …); Performance Mgmt Models (MBQA, EFQM, BSC, …) | — | — | [SEMA06a]: Field Study (A4); [SEMA06b]: Field Study (A4)
5.1.2. Project | Benchmark (ISBSG r9) | — | — | [ISBSG04]: Field Study (A4)
5.1.3. Resource | P-CMM, … | — | — | [PCMM-01]: Literature Search (B1)
5.1.4. Process | Appraisals (CMMI, Sw-CMM, SPICE, …) | — | — | [SEMA06a]: Field Study (A4); [SEMA06b]: Field Study (A4)
5.1.5. Product | ISO/IEC 9126 profiles, … | — | — | [Fra03]: Literature Search (B1)
5.2. Repositories | New | — | [ISBSG07] | (Mend03)
6.0. Measurement Standards | New | — | — | —
6.1. By Entity | — | — | — | —
6.1.1. Resource | — | [IEEE830-98] | — | —
6.1.2. Process | SEP, App. B | [ISO15939-07]; [IEEE1219-98]; [IEEE12207.0-96]; [ISO15288-02]; [ISO95]; [IEEE1045-92] | — | —
6.1.3. Product | SEP, §8.4.2; App. B | [ISO9126-01]; [IEEE14143.1-00]; [ISO19761-03]; [ISO20926-03]; [ISO20968-02]; [ISO14598]; [ISO9241]; [ISO24570]; [IEEE1061-98] | [Jon96] | —
6.2. By Type | — | — | — | —
6.2.1. De Jure | All the IEEE/ISO standards on software measurement previously listed in Section 3.1 | — | — | —
6.2.2. De Facto | GQM; [PSM03] | — | [Sol99] | [Bas94]: Assertion (A3)
RECOMMENDED REFERENCES FOR SOFTWARE MEASUREMENT

[Abr93] A. ABRAN & H. NGUYENKIM, "Measurement of the Maintenance Process from a Demand-Based Perspective," Journal of Software Maintenance: Research and Practice, vol. 5, iss. 2, pp. 63-90, 1993, URL: http://www.gelog.etsmtl.ca/publications/pdf/7.pdf

[Abr96] A. ABRAN AND P. N. ROBILLARD, "Function Points Analysis: An Empirical Study of its Measurement Processes," IEEE Transactions on Software Engineering, vol. 22, pp. 895-909, 1996, URL: http://www.gelog.etsmtl.ca/publications/pdf/64.pdf

[Abr02] A. ABRAN & A. SELLAMI, "Initial Modeling of the Measurement Concepts in the ISO Vocabulary of Terms in Metrology," Proc. of the 12th International Workshop on Software Measurement (IWSM 2002), October 7-9, 2002, Magdeburg, Shaker Publ., Aachen, pp. 9-20, URL: http://www.gelog.etsmtl.ca/publications/pdf/756.pdf

[Abr03] A. ABRAN, A. SELLAMI & W. SURYN, "Metrology, Measurement and Metrics in Software Engineering," Proc. of the 9th IEEE International Symposium on Software Metrics (METRICS 2003), 3-5 September 2003, Sydney (Australia), pp. 2-11, URL: http://www.gelog.etsmtl.ca/publications/pdf/798.pdf

[Bei90] B. BEIZER, Software Testing Techniques, International Thomson Press, 1990

[Buc96] F. J. BUCKLEY, Implementing Configuration Management: Hardware, Software, and Firmware, Second ed., Los Alamitos, CA: IEEE Computer Society Press, 1996

[Car90] D. N. CARD & R. L. GLASS, Measuring Software Design Quality, Prentice Hall, 1990

[Che07] L. CHEIKHI, A. ABRAN & L. BUGLIONE, "The ISBSG Software Projects Repository from the ISO 9126 Quality Perspective," Journal of Software Quality, American Society for Quality, Vol. 9, No. 2, March 2007, pp. 4-16

[Dor02] M. DORFMAN & R. H. THAYER, Eds., Software Engineering (Vol. 1 & Vol. 2), IEEE Computer Society Press, 2002

[Fen98] N. E. FENTON AND S. L. PFLEEGER, Software Metrics: A Rigorous & Practical Approach, Second ed., International Thomson Computer Press, 1998

[Gol99] D. GOLDENSON, K. EL-EMAM, J. HERBSLEB & C. DEEPHOUSE, "Empirical Studies of Software Process Assessment Methods," presented at Elements of Software Process Assessment and Improvement, 1999, URL: http://www2.umassd.edu/SWPI/McGill/spai.pdf

[Gra92] R. B. GRADY, Practical Software Metrics for Project Management and Process Management, Prentice Hall, Englewood Cliffs, NJ 07632, 1992

[Hab08] N. HABRA, A. ABRAN, M. LOPEZ & A. SELLAMI, "A Framework for the Design and Verification of Software Measurement Methods," Journal of Systems and Software, vol. …, no. …, 2008, pp. …

[Hum95] W. HUMPHREY, A Discipline for Software Engineering, Addison-Wesley, 1995

[Jal97] P. JALOTE, An Integrated Approach to Software Engineering, Second ed., New York: Springer-Verlag, 1997

[Jon96] C. JONES, Applied Software Measurement: Assuring Productivity and Quality, Second ed., McGraw-Hill, 1996

[Jor02] P. C. JORGENSEN, Software Testing: A Craftsman's Approach, Second ed., CRC Press, 2004

[Kan02] S. H. KAN, Metrics and Models in Software Quality Engineering, Second ed., Addison-Wesley, 2002

[Kapl96] R. S. KAPLAN & D. P. NORTON, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996

[Kit95] B. KITCHENHAM, S. L. PFLEEGER & N. FENTON, "Towards a Framework for Software Measurement Validation," IEEE Transactions on Software Engineering, vol. 21, no. 12, 1995, pp. 929-944

[Lyu96] M. R. LYU, Handbook of Software Reliability Engineering, McGraw-Hill/IEEE, 1996

[McC04] S. MCCONNELL, Code Complete: A Practical Handbook of Software Construction, Microsoft Press, 1993

[Mus99] J. MUSA, Software Reliability Engineering: More Reliable Software, Faster Development and Testing, McGraw-Hill, 1999

[Per95] W. PERRY, Effective Methods for Software Testing, Wiley, 1995

[Pfl01] S. L. PFLEEGER, Software Engineering: Theory and Practice, Second ed., Prentice-Hall, 2001

[Pre04] R. S. PRESSMAN, Software Engineering: A Practitioner's Approach, Sixth ed., McGraw-Hill, 2004

[Rak97] S. R. RAKITIN, Software Verification and Validation: A Practitioner's Guide, Artech House, Inc., 1997

[Roy98] W. ROYCE, Software Project Management: A Unified Framework, Addison-Wesley, 1998

[SEL96] SOFTWARE ENGINEERING LABORATORY, "Software Process Improvement Guidebook," NASA/GSFC, Technical Report SEL-95-102, April 1996, URL: http://sel.gsfc.nasa.gov/website/documents/online-doc/95-102.pdf

[Sel03] A. SELLAMI & A. ABRAN, "The Contribution of Metrology Concepts to Understanding and Clarifying a Proposed Framework for Software Measurement Validation," 13th International Workshop on Software Measurement – IWSM 2003, Montréal (Canada), Springer-Verlag, Sept. 23-25, 2003, pp. 18-40

[Som05] I. SOMMERVILLE, Software Engineering, Seventh ed., Addison-Wesley, 2005

[Sta94] G. E. STARK, L. C. KERN & C. V. VOWELL, "A Software Metric Set for Program Maintenance Management," Journal of Systems and Software, vol. 24, iss. 3, March 1994, URL: http://members.aol.com/geshome/ibelieve/sus.pdf (draft version)

[Zhu97] H. ZHU, P. A. V. HALL & J. H. R. MAY, "Software Unit Test Coverage and Adequacy," ACM Computing Surveys, vol. 29, iss. 4, pp. 366-427, Dec. 1997, URL: http://laser.cs.umass.edu/courses/cs521-621/papers/ZhuHallMay.pdf
ADDITIONAL REFERENCES FOR SOFTWARE MEASUREMENT

(Agr99) W. AGRESTI, "The Role of Design and Analysis in Process Improvement," presented at Elements of Software Process Assessment and Improvement, 1999.
(Apr00) A. APRIL & D. AL-SHUROUGI, "Software Product Measurement for Supplier Evaluation," presented at FESMA 2000, 2000.
(Ban95) S. BANDINELLI ET AL., "Modeling and Improving an Industrial Software Process," IEEE Transactions on Software Engineering, vol. 21, iss. 5, 1995, pp. 440-454.
[Bas94] V. BASILI, G. CALDIERA & D. ROMBACH, "The Goal-Question-Metric Paradigm," in Encyclopedia of Software Engineering (2-Volume Set), John Wiley & Sons, Inc., 1994, pp. 528-532, URL: http://www.cs.umd.edu/projects/SoftEng/ESEG/papers/gqm.pdf
(Ber96) A. BERTOLINO & M. MARRÈ, "How Many Paths Are Needed for Branch Testing?" Journal of Systems and Software, vol. 35, iss. 2, 1996, pp. 95-106.
(Boett07) G. BOETTICHER, T. MENZIES & T. OSTRAND, PROMISE Repository of Empirical Software Engineering Data, http://promisedata.org/ repository, West Virginia University, Department of Computer Science, 2007.
[Bug02] L. BUGLIONE & A. ABRAN, "ICEBERG: A Different Look at Software Project Management," in Software Measurement and Estimation, Proceedings of the 12th International Workshop on Software Measurement (IWSM 2002), October 7-9, 2002, Magdeburg (Germany), Shaker Verlag, ISBN 3-8322-0765-1, pp. 153-167, URL: http://www.gelog.etsmtl.ca/publications/pdf/757.pdf
(Car91) R. H. CARVER & K. C. TAI, "Replay and Testing for Concurrent Programs," IEEE Software, March 1991, pp. 66-74.
(Chi92) R. CHILLAREGE ET AL., "Orthogonal Defect Classification - A Concept for In-Process Measurement," IEEE Transactions on Software Engineering, vol. 18, iss. 11, 1992, pp. 943-956.
(Chi96) R. CHILLAREGE, "Orthogonal Defect Classification," in Handbook of Software Reliability Engineering, M. Lyu, ed., IEEE Computer Society Press, 1996.
(Col93) J. COLLOFELLO & B. GOSALIA, "An Application of Causal Analysis to the Software Production Process," Software Practice and Experience, vol. 23, iss. 10, 1993, pp. 1095-1105.
(Ele97) K. EL-EMAM, D. HOLTJE & N. MADHAVJI, "Causal Analysis of the Requirements Change Process for a Large System," presented at Proceedings of the International Conference on Software Maintenance, 1997.
[Ferg99] P. FERGUSON, G. LEMAN, P. PERINI, S. RENNER & G. SESHAGIRI, "Software Process Improvement Works!," SEI Technical Report CMU/SEI-TR-99-27, November 1999.
(Flo99) W. FLORAC & A. CARLETON, Measuring the Software Process: Statistical Process Control for Software Process Improvement, Addison-Wesley, 1999.
[Fra03] X. FRANCH & J. P. CARVALLO, "Using Quality Models in Software Package Selection," IEEE Software, vol. 20, no. 1, pp. 34-41, January/February 2003.
[Garc06] F. GARCIA, M. BERTOA, C. CALERO, A. VALLECILLO, F. RUIZ, M. PIATTINI & M. GENERO, "Towards a Consistent Terminology for Software Measurement," Information and Software Technology, vol. 48, no. 8, 2006, pp. 631-644.
[Iban98] M. IBÁÑEZ, Balanced IT Scorecard Generic Model Version 1.0, European Software Institute, Technical Report ESI-1998-TR-009, May 1998.
[ISBSG07a] ISBSG, Data Repository Release 10, January 2007, www.isbsg.org
[ISBSG07b] ISBSG, Data Collection Questionnaire – New Development, Redevelopment, or Enhancement Sized Using IFPUG or NESMA Function Points, version 5.10, September 2007, www.isbsg.org
[Jacq97] J. P. JACQUET & A. ABRAN, "From Software Metrics to Software Measurement: A Process Model," 3rd IEEE International Software Engineering Standards Symposium (ISESS'97), Walnut Creek, CA, June 2-6, 1997, IEEE Computer Society Press, URL: http://www.gelog.etsmtl.ca/publications/pdf/208.pdf
(Kel98) M. KELLNER ET AL., "Process Guides: Effective Guidance for Process Participants," presented at the 5th International Conference on the Software Process, 1998.
(Lag96) B. LAGUË & A. APRIL, "Mapping for the ISO 9126 Maintainability Internal Metrics to an Industrial Research Tool," presented at SESS, 1996.
[McCA76] T. J. MCCABE, "A Complexity Measure," IEEE Transactions on Software Engineering, SE-2(4):308-320, December 1976, URL: http://www.dsi.unive.it/~marzolla/didattica/IngSoftware2005/mccabe.pdf
(McG94) F. MCGARRY ET AL., "Software Process Improvement in the NASA Software Engineering Laboratory," Software Engineering Institute, CMU/SEI-94-TR-22, 1994, available at http://www.sei.cmu.edu/pub/documents/94.reports/pdf/tr22.94.pdf
(Mend03) E. MENDES, N. MOSLEY & S. COUNSELL, "Investigating Early Web Size Measures for Web Cost Estimation," Proc. Int'l Conf. Empirical Assessment in Software Eng., pp. 1-22, 2003.
(Mos92) V. MOSLEY, "How to Assess Tools Efficiently and Quantitatively," IEEE Software, vol. 9, iss. 3, May 1992, pp. 29-32.
(Nak91) T. NAKAJO & H. KUME, "A Case History Analysis of Software Error Cause-Effect Relationship," IEEE Transactions on Software Engineering, vol. 17, iss. 8, 1991.
[PCMM-01] B. CURTIS, B. HEFLEY & S. MILLER, People Capability Maturity Model (P-CMM) Version 2.0, Software Engineering Institute, CMU/SEI-2001-MM-001, URL: http://www.sei.cmu.edu/pub/documents/01.reports/pdf/01mm001.pdf
[PCMM-06] S. MILLER, People Capability Maturity Model Maturity Profile, January 1, 2006.
(Pos96) R. M. POSTON, Automating Specification-Based Software Testing, IEEE Press, 1996.
[PSM03] DOD, Practical Software Measurement: A Foundation for Objective Project Management, v. 4.0b1, 2003, URL: http://www.psmsc.com/PSMGuide.asp
[SEMA06a] SOFTWARE ENGINEERING INSTITUTE, Process Maturity Profile: Sw-CMM 2006 Mid-Year Update, Software Engineering Measurement and Analysis Team, Carnegie Mellon University, August 2006, URL: http://www.sei.cmu.edu/sema/pdf/SW-CMM/2006aug.pdf
[SEMA06b] SOFTWARE ENGINEERING INSTITUTE, Process Maturity Profile: CMMI v1.1 SCAMPI v1.1 Class A Appraisal Results, 2006 Mid-Year Update, Software Engineering Measurement and Analysis Team, Carnegie Mellon University, August 2006, URL: http://www.sei.cmu.edu/sema/pdf/CMMI/2006aug.pdf
[Shep95] M. SHEPPERD, Foundations of Software Measurement, Prentice-Hall, 1995, ISBN 0133361993.
[Sol99] R. VAN SOLINGEN & E. BERGHOUT, The Goal/Question/Metric Method: A Practical Guide for Quality Improvement of Software Development, McGraw-Hill, 1999, ISBN 0-07-709553-7.
[Stri00] K. STRIKE, K. EL-EMAM & N. MADHAVJI, Software Cost Estimation with Incomplete Data, NRC 43618, Technical Report, National Research Council Canada, January 2000, URL: http://www.software-metrics.org/documents/1071.pdf
(Val97) L. A. VALAER & R. G. BABB II, "Choosing a User Interface Development Tool," IEEE Software, vol. 14, no. 4, 1997, pp. 29-39.
[Vin90] W. G. VINCENTI, What Engineers Know and How They Know It: Analytical Studies from Aeronautical History, Johns Hopkins University Press, 1990.
(Wei93) G. M. WEINBERG, Quality Software Management, vol. 2: First-Order Measurement (Ch. 8, "Measuring Cost and Value"), 1993.
(Zah98) S. ZAHRAN, Software Process Improvement: Practical Guidelines for Business Success, Addison-Wesley, 1998.
[Zel98] M. V. ZELKOWITZ & D. R. WALLACE, "Experimental Models for Validating Technology," Computer, vol. 31, no. 5, 1998, pp. 23-31.
[Zus97] H. ZUSE, A Framework of Software Measurement, De Gruyter, Berlin, 1997, ISBN 3-11-015587-7.
APPENDIX A. LIST OF STANDARDS

(IEC 60050) IEC, International Electrotechnical Vocabulary - Electrical and electronic measurements and measuring instruments - Part 311: General terms relating to measurements - Part 312: General terms relating to electrical measurements - Part 313: Types of electrical measuring instruments - Part 314: Specific terms according to the type of instrument, International Electrotechnical Commission, Genève, 2001.
(IEEE 982.1-88) IEEE Std 982.1-1988, IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE, 1988.
(IEEE 1012-98) IEEE Std 1012-1998, Software Verification and Validation, IEEE, 1998.
(IEEE1044-93) IEEE Std 1044-1993 (R2002), IEEE Standard for the Classification of Software Anomalies, IEEE, 1993.
[IEEE1045-92] IEEE Std 1045-1992, IEEE Standard for Software Productivity Metrics, IEEE, 1992.
[IEEE1061-98] IEEE Std 1061-1998, IEEE Standard for a Software Quality Metrics Methodology, IEEE, 1998.
[IEEE1074-97] IEEE Std 1074-1997, IEEE Standard for Developing Software Life Cycle Processes, IEEE, 1997.
(IEEE 1209-92) IEEE Std 1209-1992, Recommended Practice for the Evaluation and Selection of CASE Tools (ISO/IEC 14102, 1995), IEEE Press, 1992.
[IEEE1219-98] IEEE Std 1219-1998, IEEE Standard for Software Maintenance, IEEE, 1998.
(IEEE 1348-95) IEEE Std 1348-1995, Recommended Practice for the Adoption of CASE Tools (ISO/IEC 14471), IEEE Press, 1995.
[IEEE830-98] IEEE Std 830-1998, IEEE Recommended Practice for Software Requirements Specifications, IEEE, 1998.
[IEEE12207.0-96] IEEE/EIA 12207.0-1996 // ISO/IEC 12207:1995, Industry Implementation of International Standard ISO/IEC 12207:95, Standard for Information Technology - Software Life Cycle Processes, IEEE, 1996.
[IEEE14143.1-00] IEEE Std 14143.1-2000 // ISO/IEC 14143-1:1998, Information Technology - Software Measurement - Functional Size Measurement - Part 1: Definitions of Concepts, IEEE, 2000.
[ISO93] ISO, International Vocabulary of Basic and General Terms in Metrology (VIM), Geneva, Switzerland: ISO, 1993.
[ISO95] ISO/IEC JTC1/SC7/WG10, TR 15504-5, Software Process Assessment - Part 5: An Assessment Model and Indicator Guidance, v. 3.03, International Organization for Standardization, 1998.
[ISO9126-01] ISO/IEC 9126-1:2001, Software Engineering - Product Quality - Part 1: Quality Model, ISO and IEC, 2001.
[ISO9241] ISO/IEC, Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs) - Part 11: Guidance on Usability, International Organization for Standardization, 1998.
[ISO14471] ISO/IEC TR 14471, Information Technology - Software Engineering - Guidelines for the Adoption of CASE Tools, International Organization for Standardization, Geneva, 1999.
[ISO14598] ISO/IEC 14598, Information Technology - Software Product Evaluation, Parts 1-6, International Organization for Standardization, 1999-2001.
[ISO15288-02] ISO/IEC 15288:2002, Systems Engineering - System Life Cycle Processes, International Organization for Standardization, Geneva, 2002.
[ISO15939-07] ISO/IEC 15939:2007, Software Engineering - Software Measurement Process, ISO and IEC, 2007.
[ISO19761-03] ISO/IEC 19761:2003, Software Engineering - COSMIC-FFP - A Functional Size Measurement Method, ISO and IEC, 2003.
[ISO20926-03] ISO/IEC 20926:2003, Software Engineering - IFPUG 4.1 Unadjusted Functional Size Measurement Method - Counting Practices Manual, ISO and IEC, 2003.
[ISO20968-02] ISO/IEC 20968:2002, Software Engineering - MK II Function Point Analysis - Counting Practices Manual, ISO and IEC, 2002.
[ISO24570] ISO/IEC 24570:2005, Software Engineering - NESMA Functional Size Measurement Method Version 2.1 - Definitions and Counting Guidelines for the Application of Function Point Analysis, International Organization for Standardization, 2005.
[ISO29881] ISO/IEC DIS 29881, Information Technology - Software and Systems Engineering - FiSMA 1.1 Functional Size Measurement Method, International Organization for Standardization, Geneva, 2007.
[UKSMA00] UKSMA, Quality Standards: Defect Measurement Manual, Release 1.a, United Kingdom Software Metrics Association, October 2000.
APPENDIX

Text from the SWEBOK Guide (2004 version) that has not been retained, or that needs to be modified.

