
A METHOD FOR EVALUATING BUSINESS VALUE OF IT SYSTEMS

A CASE STUDY ON WORK AND MAINTENANCE APPLICATIONS WITHIN ASSET MANAGEMENT AT VATTENFALL AB

ANDREAS ANDERSSON

Master Thesis
Stockholm, Sweden 2006

XR-EE-ICS 2006:015

Executive Summary
This thesis report presents a method for evaluating the business value of IT systems. The report also
includes an extensive case study where the developed method is employed at two business units at
Vattenfall AB, more specifically Vattenfall Hydro Power Sweden (hydropower generation) and Vattenfall
Distribution Sweden (electrical distribution), within the work and maintenance domain. Also included in the report are accompanying instructions on how to apply the developed method, as well as results for the aforementioned business units.

The thesis is part of a larger research project entitled the IT Investment Evaluation Method (ITIEM),
a joint project between Vattenfall and KTH. The ultimate goal is to provide decision makers with a
tool to evaluate potential IT investment scenarios with respect to the value the investment would
generate to the business [10] [9]. For an overview of the overall ITIEM project, see section 1.2.

The evaluation method developed in this thesis consists of three central components. In the first part,
25 different benefits that IT can generate (such as improved decision making, efficiency and information) are
prioritized according to what the management considers to be of most value to their organization.
The second part attempts to identify how much business value the evaluated IT system
functionality contributes to the organization, in terms of the 25 benefits of IT. This part also tries to
establish how much non-functional system aspects (such as usability and availability) affect the
contribution to business value that the functionality has. Finally, in the third part the results are
analyzed according to the coverage of system functionality, the consistency of answers and the
credibility of respondents. See section 5.1 for an overview of the method, and section 5.2 for
instructions on how to apply it.

The case study in total involved 30 respondents from Vattenfall. 18 persons participated in the
business value prioritization (10 from Distribution Sweden and 8 from Hydro Power Sweden). Furthermore,
15 respondents were interviewed (7 from Distribution Sweden, 7 from Hydro Power Sweden and 1 from
Vattenfall Service) for a total of 19 interviews, requiring approximately 30 hours. See section 6.1 for
more details about the data collection.

Overall, the prioritizations are in line with what could be expected from the respective business units.
The most apparent differences between the assessed business units are that customer relations are highly prioritized for Distribution Sweden while technology/tools are at the lower end, and that the opposite is true for Hydro Power Sweden. The differences between the business units could be a result of different
strategic goals and different business conditions for the respective business units. Regarding
similarities, there are four benefits that clearly have the lowest priority for both business units, i.e.
competitor relations, lock-in effect/switching costs, differentiations in products/services and new products/services. For
more details on the prioritization results, see section 6.2.1.

The overall business value contribution from most functional areas is relatively similar, although some functional areas contribute less business value than others. The real difference between functional areas is that their contributions are focused on different business value dimensions. For
example, one functional area contributes a great deal to decision making and strategy formulation and
planning, whereas another functional area contributes the most to information and efficiency. Most
functional areas contribute highly to information, decision making, efficiency and control and follow up. In
contrast, most functional areas contribute little to lock-in effect/switching costs, new products/services,
technology/tools, competitor relations and differentiations in products/services.

The impact of the different quality dimensions is very similar regardless of which functional area
is evaluated. For example, usability is consistently ranked as one of the most important non-functional
factors in terms of the effect it has on the business value contribution, and modifiability is frequently
ranked as the least important. For more details on the results for business value contribution and
quality impact, see section 6.2.2.

The development of the evaluation method has resulted in future recommendations on how to further
improve the method. In short, the recommendations include the following:
• Clearly delimiting the scope of the evaluation facilitates a high quality result
• Interest and commitment from the evaluated organization is important for success
• Phone interviews can be used in order to save time and money
• Understanding and communicating ITIEM frameworks is essential for achieving a good result
For more details on the future recommendations, see chapter 8.


Glossary
The following chapter provides explanations of abbreviations, central terms and expressions that are
used throughout the report.

Abstract Component – A description of a system function in the functional reference model. An abstract component contains several measurement points.

AIP – Asset Investment Planning, business sub-function in the functional reference model. Consists
of functions responsible for a long-term approach to investments with the objective of maintaining an
optimal cost-benefit ratio.

Asset Management – The process of maximizing the return on investment of equipment by maximizing performance and minimizing cost over the entire life cycle of that equipment. Involved activities are for instance maintenance, inspection and construction of assets.

Business Function – The area of Distribution Management is segmented into business functions,
each of which generally is performed at one department at the utility. Every business function consists
of several business sub-functions.

Business Sub-function – Further breakdown of the business function. Every business sub-function
consists of several abstract components.

Business Unit – Organizational unit within Vattenfall. A business unit is in charge of a delimited
business area, as for instance power distribution.

Business Value Dimension – Describes one type of business value that can be attained from IT.
Examples of business value dimensions are improved decision making and higher efficiency. See
section 3.2.

CALC – Network Calculations. Contains functions related to load forecast, power flows, short circuit
analysis and energy loss calculations. Belongs to the sub-function AIP in the adopted functional
reference model, but has for the purpose of this thesis been considered as a separate functional area.

CON – Construction and Design, business sub-function in the functional reference model. Consists
of functions related to construction of assets.

Credibility Dimension – Describes a facet of how a respondent’s credibility can be assessed. See
section 3.4.

DMS – Distribution Management Systems. Systems needed to operate an electrical distribution network.

DOC – Document Management, business sub-function in the functional reference model. Contains
functions related to managing documents.

DSP – Work Dispatch, business sub-function in the functional reference model. Contains functions
needed for the dispatch of work crews.

EINV – Equipment and Network Inventory, business sub-function in the functional reference
model. Consists of functions related to the management of asset repository data.

FRD – Field Recording and Design, business sub-function in the functional reference model.
Consists of functions needed for efficient data collection from field crews.

Functional Reference Model – A list of functions that cover a certain functional area. The model is
in this thesis divided into business functions, business sub-functions, abstract components and
measurement points. See section 3.1.

GINV – Geographical Inventory, business sub-function in the functional reference model. Contains
functions for using geographical data combined with asset repository.


IEC – International Electrotechnical Commission, organization responsible for the development of, among other things, standards within the electrical distribution and transmission area.

IRM – Interface Reference Model, used by the IEC to define the functional area of DMS.

IT System Scenario – A set of IT systems, together covering the functions needed for an
organization.

ITIEM – IT Investment Evaluation Method. Joint research project between The Royal Institute of
Technology (KTH) and Vattenfall AB. See section 1.2.

Maintenance – Activities to uphold pre-defined asset properties. Maintenance is a subset of asset management.

MAI – Maintenance and Inspection, business sub-function in the functional reference model.
Contains functions needed for the maintenance and inspection at a power company.

Measurement Point – A description of a concrete IT system function (the most granular component
of the functional reference model).

PRM – Premises, business sub-function in the functional reference model. Contains functions needed
for managing all issues related to real estate, i.e. self-owned and customer-owned premises.

Quality Dimension – A non-functional aspect of a system, for example usability and information
security. See section 3.3.

SCM – Supply Chain and Logistics, business sub-function in the functional reference model. Contains
functions needed for supply and storage of needed spare parts and materials.

SCHD – Work Scheduling, business sub-function in the functional reference model. Contains
functions for detailed work scheduling.

SWA – Switch Action Scheduling/Operation Work Scheduling, business sub-function in the functional reference model. Contains functions needed for the co-ordination of switching of power (or water) with planned maintenance/construction work.

TCM – Trouble Call Management, business sub-function in the functional reference model. Contains
functions for the management of trouble calls, i.e. calls from customers that are affected by outages or
other power quality issues.

Vattenfall Hydro Power Sweden – Swedish Business Unit within Business Group Nordic. The
business area is generation of electricity in hydropower plants.

Vattenfall Distribution Sweden – Swedish Business Unit within Business Group Nordic. The
business area is distribution of electrical power.

Vattenfall Service – Service Provider within Business Group Nordic. Provides contract services both
to internal, Vattenfall business units, and to external customers.

Please note that the terms “non-functional” and “quality” are used synonymously in the report.
Similarly, the terms “functional area” and “group of abstract components” are used synonymously.


Outline
Chapter 1 presents the background to the thesis and how it fits into a larger research project. Chapter
2 provides the purpose of the thesis, the more specific objectives of it and the scope of the report.
Chapter 3 covers the theory that forms the foundation for the thesis and is necessary in order to
understand the rest of the report. Chapter 4 gives an overview of the work carried out during the
thesis degree project. The structure of chapters 5 and 6 follows the two objectives of the thesis, stated in
section 2.2. More specifically, chapter 5 presents the evaluation method developed during the thesis
work, and goes into details about how to apply it in an organization. Chapter 5 also provides
justifications for the design choices made when developing the evaluation method. Chapter 6
describes the extensive case study performed during the thesis work, and presents the results of the
data collection. Chapter 7 analyzes the validity and reliability of the thesis, and finally chapter 8
suggests a number of future recommendations for how to improve the evaluation method further.

Please note that this thesis has been carried out in parallel with master thesis phase III of the IT
Investment Evaluation Method project (see section 1.2 for more details), and that there has been a
high level of co-operation between the two thesis workers. This is because the two thesis phases share
a number of frameworks and prerequisites, and the evaluation methods developed in the two phases
need to be compatible with each other, to ensure that the results can be aggregated and fit into the
overall ITIEM project. As a consequence, a few sections in this report are similar to the
corresponding sections in the thesis report written for phase III of the ITIEM project by Martin
Johnson [16]. More specifically, this concerns sections 1, 3.1, 3.3, 3.4, 3.5, 4, 5.3.3, 7 and 8.2.


Acknowledgements
There are several people I would like to take the opportunity to thank for helping me complete this
master thesis report.

At KTH, I would first and foremost like to extend my gratitude to Magnus Gammelgård and Mathias
Ekstedt for entrusting me with the thesis project in competition with 20 other applicants. Their
support and encouragement throughout the project have been very helpful, and without our many
fruitful discussions the result would surely not have been as good. The same holds true for Per
Närman, who carried out a lot of the groundwork for my thesis, establishing the functional reference
model that was a cornerstone in my thesis project. Always ready to lend a hand, he has been of
invaluable help, and is worth praise especially for reviewing my thesis report on more than one
occasion.

At Vattenfall, I would like to express my gratitude to Georg Karlén for sponsoring my thesis and
compensating me for my efforts. I am also grateful for Erik Biberg’s assistance with administrative
matters at Vattenfall. Moreover, I appreciate the help with finding and contacting respondents that
Hans Andersson, Claes Wange, Fredrik Backlund and Jan Gugala all have contributed. Thanks also to
Peter Söderström for directing me in order for the results to be relevant and usable at Vattenfall.

Furthermore, I am thankful to everyone that has participated in any way in my thesis project, above all
in the interviews. You have been an integral part of my report.

Last, but certainly not least, I would like to thank Martin Johnson, my fellow thesis worker and close
friend. Without his co-operation, company and invaluable assistance the work would not have been as
enjoyable and successful.


Table of Contents
1 Background
1.1 Choosing IT Systems for Investment
1.2 IT Investment Evaluation Method
2 Purpose, Objectives and Scope
2.1 Purpose
2.2 Objectives
2.3 Scope
3 Theory
3.1 Functional Reference Model
3.2 Business Value Dimensions
3.3 Quality Dimensions
3.4 Credibility Dimensions
3.5 Categorization of Respondents
4 Master Thesis Project Phases
4.1 Initiation
4.2 Testing
4.3 Data Collection
4.4 Analysis
4.5 Finalization
5 Evaluation Method
5.1 Overview
5.2 Instructions
5.3 Justifications
6 Case Study
6.1 Data Collection
6.2 Results
7 Validity and Reliability
7.1 Validity
7.2 Reliability
8 Future Recommendations
8.1 Recommendations for Phase IV of the IT Investment Evaluation Method
8.2 General Recommendations for the IT Investment Evaluation Method
9 References
10 Appendices
Appendix 1 – Functional Reference Model Vattenfall Distribution Sweden
Appendix 2 – Functional Reference Model Vattenfall Hydro Power Sweden
Appendix 3 – Estimated Resource Requirements for Different Functional Levels
Appendix 4 – Descriptions of Functional Areas
Appendix 5 – Form for Business Value Contribution from Functionality
Appendix 6 – Form for Quality Impact on Business Value Contribution
Appendix 7 – Form for Credibility Evaluation
Appendix 8 – Contact List for Respondents
Appendix 9 – List of Interview Data and Respondents


1 Background
This thesis presents a Master of Science degree project conducted at the department of Industrial
Information and Control Systems (ICS), at the Royal Institute of Technology (KTH). It is part of a
larger research project called the IT Investment Evaluation Method (ITIEM), with the objective to
develop a method which can be used to evaluate and rank IT systems scenarios based on the business
value they provide to the company. For more details concerning this research project, see section 1.2
below.

1.1 Choosing IT Systems for Investment


With the newly deregulated electricity market in northern Europe, utilities are forced to make more
efficient use of their assets. As a result, many business units of Vattenfall (where the case study in this thesis was performed) are in the process of deciding whether or not to invest in new asset management systems and, if so, which IT system scenario is the best one to choose [13].

It is rarely evident what the criteria for being the “best” system are, and even when this is agreed upon, identifying and collecting the information needed for performing this evaluation is often very resource demanding. For this type of challenge, an Enterprise Architecture (EA) model can be used as decision support, since it can offer a structured approach to a complex situation such as the one above. [5]

EA has to a large degree grown from an ever-increasing need for abstracting software systems. Basically, EA models describe high-level abstractions of entities of the enterprise system and how they relate to each other, for example technical entities such as functionality and applications, as well as organizational entities such as business processes, organizational units and work flows. In short, different EA models depict the same piece of reality, but from different perspectives and for different purposes. Hence, there are various EA frameworks, each with its own focus, that complement each other. [5] Examples are the Zachman framework [28] [24], TOGAF [26], and
DoDAF [4].

These frameworks are typically fairly wide-ranging and general in their approach [13], and the information needed for enterprise architecture analyses can be quite costly to collect, as it encompasses many different aspects and requires much time for the evaluation. Thus, it is important to be clear about what type of analyses the information will be used for, in order not to collect data that is of no use [15]. Hence, the information required has to be carefully selected for the purposes in question [17]. Some support for this is found in the concept of viewpoints, i.e. using different perspectives to analyze the enterprise, but these are far from always clear about what types of analysis can be performed and how they should be done. In addition, if the purpose is to make an overall analysis of the enterprise architecture, it is not clear how the analyses from the different views should be combined. [13]

The ITIEM presents a way of combining different viewpoints related to functional and non-
functional aspects, in order to estimate how much business value different IT system scenarios
provide an organization. The method can be of valuable help as a decision support tool in deciding
which IT system scenario to choose (see section 1.2 below).

1.2 IT Investment Evaluation Method


As can be seen above, this Master of Science thesis is part of a larger research project entitled the IT
Investment Evaluation Method (ITIEM), a joint project between Vattenfall and KTH. The ultimate
goal is to provide decision makers with a tool to evaluate potential IT investment scenarios with
respect to the value the investment would generate to the business [10] [9]. The project was initiated
by the CIO Group at Vattenfall AB and involves two Swedish business units, more precisely Vattenfall
Hydro Power Sweden (hydropower generation) and Vattenfall Distribution Sweden (electrical distribution)
within the work and maintenance domain. For an overview of the ITIEM project, see Figure 1 below.


Figure 1 – Overview of the ITIEM project [10]. Please note that the data is
fictitious.

Phase I of the project concerns the development of a functional reference model that provides a
categorized list of functions required for work and maintenance applications within asset management
[19]. The model is based on the IEC 61968 standard, developed specifically for inter-application
integration at electrical utilities [14]. This functional reference model is adopted in both phase III and
IV of the ITIEM. For more details on the model, see section 3.1.

Phase II of the project concerns the development of frameworks for the measurement of certain non-
functional qualities in IT systems, such as modifiability, data quality and interoperability [11]. These
frameworks are adopted in both phase III and IV of the ITIEM. For more details on the non-
functional qualities, see section 3.3.

In phase III of the project, a method is developed for evaluating IT systems scenarios according to
their functional and non-functional qualities [16]. These measurements are matched with the
mappings and prioritizations from phase IV, when determining the business value of the IT systems
scenarios.

In phase IV of the project (i.e. this thesis), mappings are established from information system
properties (functionality and quality) to business value dimensions, in order to identify contribution to
business value from different functional areas. This phase also includes a method for prioritizing
business value dimensions, pointing out what the management considers to be of most value to the
business unit.

After all the phases have been completed, i.e. all the functional and non-functional qualities have been
measured and the business value connection has been established, the different IT systems scenarios
can be ranked according to the total business value they provide. The results also include the value of
corresponding confidence levels and confidence intervals, indicating how certain the results are (see
Figure 2 below) [10] [9].

Figure 2 – Example of final results of the IT Investment Evaluation Method. Please note that the data is fictitious.
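To make the intended end result above more concrete, the following minimal sketch (in Python) shows one possible way of ranking scenarios from the kind of data the ITIEM produces. All names and numbers are fictitious, and the plain weighted sum is a simplifying assumption for illustration, not the aggregation actually used in the ITIEM project; the confidence levels and intervals shown in Figure 2, which in the full method come from the uncertainty in the underlying answers, are omitted here.

# Illustrative sketch only: weights and scores are fictitious, and the aggregation is a
# plain weighted sum rather than the ITIEM project's own calculation.

# Management's prioritization of business value dimensions, expressed as normalized weights.
priorities = {"information": 0.30, "decision making": 0.25,
              "efficiency": 0.25, "control and follow up": 0.20}

# Assessed contribution of each IT system scenario to each dimension (0-1 scale, fictitious).
scenarios = {
    "Scenario A": {"information": 0.8, "decision making": 0.6,
                   "efficiency": 0.7, "control and follow up": 0.5},
    "Scenario B": {"information": 0.6, "decision making": 0.7,
                   "efficiency": 0.5, "control and follow up": 0.8},
}

def total_business_value(contribution):
    """Weighted sum of a scenario's contributions over the prioritized dimensions."""
    return sum(weight * contribution.get(dimension, 0.0)
               for dimension, weight in priorities.items())

# Rank the scenarios by their (fictitious) total business value.
for name in sorted(scenarios, key=lambda s: total_business_value(scenarios[s]), reverse=True):
    print(f"{name}: {total_business_value(scenarios[name]):.2f}")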


2 Purpose, Objectives and Scope


The following chapter outlines the overall purpose and the specific objectives of this thesis, as well as
the scope of the degree project.

2.1 Purpose
The overall purpose of this thesis is to develop a method for evaluating the business value of IT
systems, i.e. how much the systems provide to the organization in terms of business value.

2.2 Objectives
The goal of this thesis can be expressed as two distinct, but coherent objectives:

1) Evaluation Method
Developing a method to evaluate how much business value IT systems contribute to an
organization. This includes mapping information system functionality to business value
dimensions, and evaluating the impact that non-functional aspects (quality dimensions) have
on these mappings. The method also includes prioritizing business value dimensions,
indicating what the management considers to be of most value to their organization. In
addition, the method contains credibility assessments, indicating the level of certainty in the
results. For details, see chapter 5.

2) Case study
Applying the developed method at Vattenfall Hydro Power Sweden and Distribution Sweden,
validating that the method works and providing results for these two business units, intended
for further processing in the ITIEM project. The case study is also part of validating that the
overall ITIEM is applicable. For details, see chapter 6.

For further explanations of system functionality, business value dimensions, quality dimensions and
credibility, see chapter 3.

2.3 Scope
As mentioned above, the method has been developed at two business units (hydropower generation
and electrical distribution) at Vattenfall AB in the utilities industry, within the work and maintenance
domain. This does not mean, however, that the method is limited to this particular company, nor to
this industry or domain. The method is of general nature, and provided that the required functionality
is specified, it is thought to be applicable to a number of different types of organizations.


3 Theory
The following chapter presents the theoretical foundation that the method is based on, including a
functional reference model, business value dimensions, quality dimensions, credibility dimensions and
a framework for categorizing respondents.

3.1 Functional Reference Model


A functional reference model is a list of functions that cover a certain functional area. The functional
reference model used in this thesis was developed during phase I of the ITIEM project [19]. It is
based on the so-called Interface Reference Model (IRM), which is presented in the IEC 61968-1 standard [14]. This is a standard specifying the architecture for Distribution Management Systems (DMS) of
utilities – systems that are used when operating an electric grid for electrical distribution.

More specifically, the IRM specifies a list of business functions applicable to electrical distribution
utilities (see Figure 3 below). These business functions are broken down into subcomponents called
business sub-functions, which in turn are broken down into more delimited functional units, called
abstract components. An abstract component is a system function, which may or may not be realized in a
specific application. Measurement points are subordinate to abstract components and describe concrete system functions. Furthermore, measurement points are intended to be used as indicators when evaluating whether the functionality of an abstract component is implemented in an IT system or not. [19]

Figure 3 – The business functions used from the Interface Reference Model [14]. The grey boxes are
not part of the functional reference model in this thesis.

Moreover, the functional breakdown of the IRM is based on business requirements rather than
existing systems. This ensures a reasonable degree of vendor independence in the functional reference
model. For a structural overview, see Figure 4 below.


Figure 4 – The structure of the functional reference model: the business function level (e.g. Maintenance and Construction) is broken down into the business sub-function level (e.g. Maintenance & Inspection, Work Scheduling, Construction & Design, Field Recording & Design, Work Dispatch), which contains the abstract component level (e.g. Design, Field Inspection Results, Crew Time Entry, Field Actual Materials), which is in turn described at the measurement point level (e.g. function to change blueprints in the field, function to display asset repository in the field).

For more information about the development of the functional reference model and further details
about its structure, refer to Per Närman’s master thesis report A Functional Reference Model For Work-
And Maintenance Applications Within Asset Management At Vattenfall [19]. For an overview of the
functional reference models for Vattenfall Distribution Sweden and Hydro Power Sweden, see Appendix
1 and Appendix 2.
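As a small illustration of the four-level structure described in this section, the sketch below represents a fragment of a functional reference model as nested data. The class names and the example entries are illustrative only; the complete models for the two business units are given in Appendix 1 and Appendix 2.

from dataclasses import dataclass, field

# Minimal sketch of the hierarchy: business function -> business sub-function ->
# abstract component -> measurement point. The example entries below are illustrative.

@dataclass
class AbstractComponent:
    name: str
    measurement_points: list = field(default_factory=list)  # concrete system functions

@dataclass
class BusinessSubFunction:
    name: str
    abstract_components: list = field(default_factory=list)

@dataclass
class BusinessFunction:
    name: str
    sub_functions: list = field(default_factory=list)

maintenance_and_construction = BusinessFunction(
    name="Maintenance and Construction",
    sub_functions=[
        BusinessSubFunction(
            name="Field Recording and Design (FRD)",
            abstract_components=[
                AbstractComponent(
                    name="Field Design",
                    measurement_points=[
                        "Function to change blueprints in the field",
                        "Function to display asset repository in the field",
                    ],
                ),
            ],
        ),
    ],
)

# A measurement point indicates whether the functionality of its abstract component
# is implemented in an evaluated IT system.
print(maintenance_and_construction.sub_functions[0].abstract_components[0].measurement_points)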


3.2 Business Value Dimensions


In this thesis, the different types of business value that can be attained from IT are classified into 25
different categories. These categories have been identified from a large literature survey, covering
more than 80 references [8] [12]. They are referred to as business value dimensions in this report, and
include benefits related to the various parts of the business [10] [12]:
• The input to the organization, i.e. supplies and materials
• The outputs from the organization, i.e. improvements in product or service quality
• The structure of the organization, e.g. improved flexibility and communications
• The resources used, e.g. better information available
• The actors of the organization, e.g. better decision-making and benefits related to improved
organizational culture

The business value dimensions represent generalized types of business value, since the purpose is to
express the benefits of IT in terms understood by business managers. For a full list of the business
value dimensions and their explanations, see Figure 5 below. The business value dimensions are
derived from the working paper Dimensions of Benefits from IT/IS [8] and the research paper A
Categorization of Benefits from IS/IT Investments [12].

Business Value Dimension – Explanation

Decision making – Improved decision making, e.g. better decision support, more well-informed decisions, decisions taken closer to operations, shorter time to decisions, increased reliability in decisions, less uncertainty and complexity in decision making etc.
Differentiations in products/services – Improved ability to change/differentiate products/services that are offered, e.g. adaptations of existing products to other parts of the market, ability to tailor products to customer needs etc.
Efficiency – "Doing as much as before with a smaller amount of resources", e.g. shortening of manufacturing times/lead times/cycle times/work times, simplified/reduced paperwork and administrative tasks, automation of work tasks, staff reductions etc.
Flexibility – Improved organizational ability to adapt to changes in market conditions/requirements, e.g. changed demand, political and economic factors etc.
Flow of products/services – Improved flow of products/services within/between processes or departments in the organization, e.g. improved flow within the distribution department or between the production department and the distribution department etc.
Change management – Improved ability to deliberately make changes in the organization, e.g. to replace people or roles, to restructure, to add/remove departments or processes etc.
Information – Improved information and information support, e.g. more information is collected, the information is more readily available, more detailed, more accurate, more objective, easier to interpret etc.
Lock-in effect/switching costs – Improved ability to prevent customers from choosing competing products or to make it more costly to switch to competing products, e.g. license agreements, loyalty programs etc.
Integration and coordination – Improved ability to coordinate and integrate different parts of the organization, e.g. coordination of the production and the distribution department, of the sales and production planning departments etc.


Communication – Improved/increased communication within/between processes or departments in the organization, e.g. more types of communication channels, more open dialogue etc.
Control and follow up – Improved ability to control and follow up, e.g. improved reporting possibilities, increased evaluation etc.
Organizational learning and knowledge – Improved learning and/or increased knowledge of persons in the organization, e.g. promotion of competence development, retention of knowledge etc.
Quality of products/services – Improved quality of the products and/or services offered, e.g. fewer errors in products/services, increased reliability of products/services etc.
Deliveries – Improved delivery of products and/or services to customers, e.g. new ways to distribute products, faster response to customer orders, reduced delivery times and less uncertainty in delivery times to customers etc.
Technology/tools – Improved non-IT tools and machinery used to produce products and/or services, e.g. improvements in production technology, new/improved maintenance methods etc.
Inbound logistics – Improvements related to the products/services that the company purchases from suppliers, e.g. increased ability to control the quality of products and services from suppliers, reduced delivery times and less uncertainty in delivery times from suppliers.
Cost reductions – Reduced costs related to the organization as a whole, as well as to departments, product lines, processes, administrative expenses, material usage, stock keeping, travelling costs etc.
New products/services – New products or services that the company does not offer today and that can be offered to customers, e.g. product innovations, added services etc.
Organizational culture – Improved organizational culture, e.g. increased involvement/interest from management, higher job satisfaction, stress reduction etc.
Productivity – "Doing more than before with the same amount of resources", e.g. to produce more units with the same machinery or personnel etc.
Competitor relations – Improved relations with competitors, e.g. increased co-operation, increased bargaining power towards competitors etc.
Customer relations – Improved relations with customers, e.g. being able to offer better support/service on sold products/services, better and faster information to customers, increased customer satisfaction, increased bargaining power towards customers etc.
Supplier relations – Improved relations with suppliers, e.g. new forms of co-operation, simplified ways of doing business, increased bargaining power towards suppliers etc.
Third party relations – Improved relations with external parties that are not customers, competitors or suppliers, e.g. various organizations, authorities, society etc.
Strategy formulation and planning – Improved ability to develop long-term business strategies and also to plan activities.

Figure 5 – Business value dimensions and explanations


3.3 Quality Dimensions


A common approach to assessing the technical quality of IT systems is to separate the functional from
the non-functional (quality) attributes [2] [20]. The quality dimensions used in this thesis are based on
several research papers [11], and have for the purpose of the ITIEM project been narrowed down to
seven dimensions. The explanations in Figure 6 below are summaries of the most relevant
characteristics for each of the quality dimensions derived from the working paper The Total Quality
ATD [11].

Quality Dimension – Explanation

Availability and reliability – Availability is a measure of the degree to which a system is accessible when a user needs to use it. Reliability is a measure of a system’s ability to perform its required function under stated conditions for a specified period. In other words, availability and reliability concern that the system works as it is supposed to when a user needs it. Availability and reliability are also related to how long it takes between failures and how long it takes until the system is up and running again following a failure.

Data quality – Data quality is a measure of how accurate (correct data), complete (no data missing), current (up-to-date data), and consistent (the same data for the same entity throughout the system) the data is.

Information security – Information security is a measure of how well protected a system (and the information in it) is against unauthorized access, use and modification. Information security concerns, amongst other things, the level of control for access to a system (for example login with a password) and levels of authority (for example an administrator level for the most critical functions). It also includes encryption of the information in the system, as well as logging (supervision) of what is done in the system and the communication to and from the system.

Interoperability – Interoperability is a measure of a system’s ability to interact with other systems outside the system itself, for example that information can be exchanged between systems. This also depends on the availability of persons (both internally and externally) who can integrate the system with other systems, and on the skills and experience of these persons. Furthermore, the amount and quality of documentation supporting integration of systems is an issue.

Modifiability – Modifiability is a measure of how easy it is to change a system or how much can be changed, for example how easy it is to modify or add functions in a system or to update a system. This also depends on the availability of persons (both internally and externally) who can modify the system, and on the skills and experience of these persons. Furthermore, the amount and quality of documentation supporting modification of the system is an issue.

Performance – Performance is a measure of how fast a system runs, for example how long the system takes to respond to an input and how long it takes to perform operations. Performance is also a measure of the level of resource requirements (for example in terms of hardware) for a system, and of scalability (i.e. the number of users the system can support and still function properly).

Usability – Usability is a measure of how easy a system is to use for a user, for example how easy it is to learn to use the system, how easy it is to remember how to use the system, that the use of the system is effective and efficient, and that the system prevents user errors.

Figure 6 – Quality dimensions


3.4 Credibility Dimensions


The credibility indicates the level of quality in the answers as well as the level of certainty in the
results. The working paper Credibility Assessment Methodology [18] outlines eleven credibility assessment
dimensions. These are not all practically applicable, however, when assessing the credibility of answers
in the ITIEM [7]. For some of the dimensions it is simply not possible to obtain assessments through
a reasonable effort, and some are not relevant for the type of questions asked in the ITIEM. Hence,
the original eleven credibility dimensions have been narrowed down to six. The explanations in Figure
7 below are summaries of what the credibility dimensions represent, derived from the Credibility
Assessments in the ITIEM Project working paper [7], in which more detailed explanations of the
credibility dimensions can be found.

Credibility Dimension – Explanation

Reflected Credibility – An assessment of how certain the respondent himself/herself is of his/her answer. The higher the certainty, the higher the credibility. The question asked to the respondent is: how certain are you that your assessment is correct? The alternatives to choose from are the following:
• Very certain
• Fairly certain
• Fairly uncertain
• Uncertain

Source Proximity – An assessment of how close the respondent is to the source of his/her answer, for example if the respondent has experienced the answer himself/herself, or if the respondent has heard it from someone. The closer to the source, the higher the credibility. In this thesis, Source Proximity concerns how close the respondent is to the work tasks which require support from the evaluated functionality. The question asked to the respondent is: how close to you are the work tasks which use the different parts of the functional reference model (the functional area) as tools and support? The alternatives to choose from are the following:
• Respondent works with the tasks him/herself
• Respondent asked a person working with the tasks
• Respondent referred to a person working with the tasks
• Respondent referred to others

Age of Answer – An assessment of the age of the information the respondent uses to answer the question. The more recent the information, the higher the credibility. In this thesis, Age of Answer concerns how long it has been since the respondent performed work tasks which require support from the evaluated functionality. The question asked to the respondent is: how long has it been since you performed work tasks that required the IT systems support described in the functional reference model (the functional area)? The alternatives to choose from are the following:
• Days
• Weeks
• Months
• Years

Question Domain & Answer Domain Match – An assessment of how well matched the evaluated domain (answer domain) is with the desired domain of evaluation (question domain). The better the match, the higher the credibility. In this thesis, Question Domain & Answer Domain Match concerns how well the respondent’s actual work tasks (which require support from the evaluated functionality) match the work tasks that are meant to be evaluated. Examples of work tasks are maintenance work, asset investment planning and procurement. Question Domain & Answer Domain Match also concerns how well the evaluated type of business matches the sought-after business. Examples of types of businesses are power generation and electric distribution. The question that is assessed by the interviewer is: to what extent does the answer domain match the question domain? The alternatives to choose from are the following:
• The same type of business
• The same type of work tasks
• Comparable type of business
• Comparable types of work tasks
Please note that two alternatives are to be chosen, for example “the same type of business” as well as “comparable types of work tasks”.
Match with Area of Expertise – An assessment of how well the respondent’s expertise matches the wanted respondent profile (the role that is most suited to answer the question). The respondent’s expertise is categorized according to the framework developed in phase I of the ITIEM project [19], see section 3.5. Please note that this dimension refers to whether the respondent is correctly chosen, and it differs from the previous dimensions in that it refers to the credibility of the respondent per se, rather than the individual answers given by the respondent. The better the match, the higher the credibility. In this thesis, Match with Area of Expertise concerns how well the area of knowledge and the organizational level of the respondent match the wanted profile. The question asked to the respondent is: what profile do you consider yourself to have? The alternatives to choose from are the following:
• Area of Knowledge (Business, Business/IT or IT)
• Organizational Level (Process performer, process manager, business unit or enterprise)
• Relation to Organization (Employee, consultant or supplier)
Please note that one alternative is chosen from each of the three categories.

Years of Experience – An assessment of how many years of experience the respondent has in the field that is being evaluated. The more experience, the higher the credibility. In this thesis, Years of Experience concerns the respondent’s experience in the functional area that is being evaluated. The question asked to the respondent is: how much experience does the respondent have in the question domain? The alternatives to choose from are the following:
• Less than 1 year
• 1 to 3 years
• More than 3 years

Figure 7 – Credibility dimensions used to evaluate respondents

In addition to the dimensions above, another dimension called Interviewer’s Assessment is used to assess
the credibility of the respondent (see Figure 8 below). The purpose of this dimension is to enable the
interviewer to balance the credibility of a respondent who fits the desired profile but does not give a credible impression.

Credibility Dimension – Explanation

Interviewer’s Assessment¹ – An assessment of how certain the interviewer is of the respondent’s answer. The higher the certainty, the higher the credibility. To more easily assess a respondent, certain criteria can be used: for example, if the respondent frequently answers “don’t know”, the credibility is lowered, and the same is true if the respondent does not give solid reasons for his/her evaluations. The question that is assessed by the interviewer is: how certain are you, the interviewer, that the respondent’s assessment is correct? The alternatives to choose from are the following:
• Very certain
• Fairly certain
• Fairly uncertain
• Uncertain

¹ Note that the credibility dimension Interviewer’s Assessment is not part of the form used during the interviews. This is to ensure that the respondent cannot see the interviewer’s assessment if the interview is carried out in person.

Figure 8 – Additional credibility dimension used only by the interviewer to evaluate respondents

For details on which questions are asked to respondents and the scales used, see the credibility form in Appendix 7.
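As an illustration of how the credibility answers can be recorded for one respondent, the sketch below stores fictitious answers on the dimensions above and converts a few of the ordered scales into a rough numeric indication. The numeric mapping and the averaging are assumptions made purely for illustration; how the credibility dimensions are actually weighed together in the ITIEM is not specified here.

# Illustrative sketch: one (fictitious) respondent's credibility answers, using the answer
# alternatives listed above. The numeric mapping and the averaging are assumptions for
# illustration only, not the ITIEM project's own scoring.

CERTAINTY_SCALE = {"Uncertain": 1, "Fairly uncertain": 2, "Fairly certain": 3, "Very certain": 4}
AGE_SCALE = {"Years": 1, "Months": 2, "Weeks": 3, "Days": 4}
EXPERIENCE_SCALE = {"Less than 1 year": 1, "1 to 3 years": 2, "More than 3 years": 3}

respondent_answers = {
    "reflected_credibility": "Fairly certain",
    "source_proximity": "Respondent works with the tasks him/herself",
    "age_of_answer": "Weeks",
    "domain_match": ("The same type of business", "Comparable types of work tasks"),
    "area_of_expertise": {"area_of_knowledge": "Business",
                          "organizational_level": "Process performer",
                          "relation_to_organization": "Employee"},
    "years_of_experience": "More than 3 years",
    "interviewers_assessment": "Very certain",
}

# A very rough aggregate over the dimensions that have a simple ordered scale.
scores = [
    CERTAINTY_SCALE[respondent_answers["reflected_credibility"]] / 4,
    AGE_SCALE[respondent_answers["age_of_answer"]] / 4,
    EXPERIENCE_SCALE[respondent_answers["years_of_experience"]] / 3,
    CERTAINTY_SCALE[respondent_answers["interviewers_assessment"]] / 4,
]
print(f"Rough credibility indication: {sum(scores) / len(scores):.2f}")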


3.5 Categorization of Respondents


The ITIEM project phase I includes a framework for categorization of respondents [19] (see Figure
9). The framework is comprised of three dimensions, more precisely scope of knowledge, type of knowledge,
and relation of knowledge. In this thesis, however, the dimensions are referred to as organizational level (see
Figure 10), area of knowledge (see Figure 11), and relation to organization (see Figure 12). The tables below
give detailed explanations of the respective categories of the three dimensions.

Figure 9 – The respondent categorization framework

Organizational Level – Explanation
Process Performer – A person responsible for carrying out certain tasks within a business process (narrow scope, deep level of knowledge).
Process Manager – A person responsible for a business process within a business unit.
Business Unit – A person with business unit wide responsibility.
Enterprise – A person at a relatively senior position with enterprise-wide responsibility (widest scope of knowledge).
Figure 10 – The dimension organizational level

Area of Knowledge – Explanation
Business – An expert in some business process.
IT – An expert in IT systems.
Business & IT – An expert with knowledge in both some business process and IT systems.
Figure 11 – The dimension area of knowledge

Relation to Organization – Explanation
Employee² – A person employed at the business unit in question.
Consultant³ – A person who is a consultant, in-house or external, in relation to the business unit in question.
Supplier – A person in an organization supplying an IT system for the business unit in question.
Figure 12 – The dimension relation to organization

² Note that in this thesis, the category employee is used instead of the original category user [19].
³ Note that in this thesis, the category consultant is used instead of the original category implementer [19].


4 Master Thesis Project Phases


The following chapter outlines the different phases of the master thesis project. The purpose of the
chapter is to give the reader an overview of the method development process. The intent is not to go
into details about the various design choices of the method, but to complement the more detailed
explanations in section 5.3. A summary of the phases and activities included in the master thesis
project can be seen in Figure 13 below.

Master Thesis Project Phase – Activities
Initiation – Literature Study; Evaluation Method Development; Questionnaire Development
Testing – Pilot Interviews; Adjustments of the Method
Data Collection – Respondent Selection; Data Collection Interviews
Analysis – Data Coverage; Data Consistency; Respondent Credibility; Reasonableness
Finalization – Conclusions; Future Recommendations
Figure 13 – Main phases and activities in the master thesis method development process.

4.1 Initiation
The initiation phase was the opening part in the method development process. It included a literature
study, development of the evaluation method and development of the questionnaires.

4.1.1 Literature Study


To become acquainted with the ITIEM framework a literature study was conducted early in the
project. The functional reference model, developed in ITIEM project phase I, was studied to grasp
how the functionality was delimited and described in the context of the ITIEM [19]. Moreover, the
working paper The Total Quality ATD was studied to understand the quality dimensions being used for
evaluating non-functional requirements [11]. Also, the working paper Dimensions of Benefits from IT/IS
was studied to understand how business value was used in the ITIEM and what specific business value
dimensions existed in the framework [8]. Of course, other information sources were used as well in
the literature study. The ones mentioned, however, were central for the method development process
since they were specifically written for the ITIEM project dealing with functionality, quality, and
business value – three cornerstones of the ITIEM.

4.1.2 Evaluation Method Development


The development of the measurement method began in connection with the literature study.

Quantitative Approach and Structured Interviews


As described further in section 5.3.3, a quantitative approach, focusing the data collection on
measurements and the analysis on statistical processing, was chosen [3].

Granularity
With relation to the quantitative approach, a suitable level of detail was chosen for the evaluation. The
line of argument for the specific granularity level is described further in section 5.3.2.

Scope
The scope of the thesis was more or less given beforehand (since the thesis is part of the ITIEM
project). As described in section 2.3, the method as such has a wider scope than the case study at
Vattenfall Distribution Sweden and Hydro Power Sweden. The scope of the applied measurements in the
case study was narrowed down to work and maintenance applications within asset management at the
two mentioned business units at Vattenfall AB.


Respondent Profile
A preferred respondent profile, based on the respondent categorization framework [19], was
selected. For more details, see sections 3.5 and 5.2.

Number of Respondents
A preliminary estimation of the number of respondents needed was made in the initiation phase. This
was done both to facilitate the project planning and to prepare the involved business units, Distribution
Sweden and Hydro Power Sweden, regarding the resources needed.

4.1.3 Questionnaire Development


Questionnaires were developed to be used as supportive tools during the structured interviews as well
as a means for easy data registration. For more details on the reasons, see section 5.3.3.

4.2 Testing
Before the actual data collection interviews were conducted, pilot interviews were held. The pilot
interviews resulted in valuable feedback which led to a number of adjustments of the measurement
method before the data collection phase took place. For more details on adjustments, see section
5.3.3.

4.3 Data Collection


In the data collection phase, a number of respondents were selected and interviewed. The procedures
(respondent selection, interviews etc.) in the data collection phase were established in the initiation
phase and adjusted in the test phase. Measurement data as well as credibility data about the
respondents were collected. The data collection phase was one of the biggest parts of the thesis work
and is described in more detail in section 6.1.

4.4 Analysis
The analysis phase involved an assessment of the collected data. Firstly, the data coverage was
evaluated to assess if enough data was collected. Secondly, the data consistency was examined to
assess whether the respondents (who answered the same questions) agreed with each other. Thirdly,
the respondent credibility was inspected to assess if the profiles of the participating respondents were
suitable enough. Also, an aggregated analysis of the collected data was conducted to assess if the
overall results were reasonable. For more details on analysis and results, see sections 5.1.3 and 6.2.

4.5 Finalization
The final phase concerned concluding the experiences of the project. It also included formulating
recommendations for how the measurement method could be developed further in the future. For
future recommendations, see chapter 8.


5 Evaluation Method
The following chapter first presents an overview of what constitutes the evaluation method developed
in this thesis with brief descriptions of its main components, as well as the prerequisites for the
method. More detailed descriptions are then provided on how to apply the method, and lastly
justifications for the design of the method are given.

5.1 Overview
The method consists of the following three main steps:
1. Business Value Prioritization
2. Business Value Contribution
3. Measurement Data Analysis

The first two steps can be performed in parallel, and the resulting data is then analyzed in the third
step (see Figure 14 below).

Figure 14 – Evaluation method overview

5.1.1 Business Value Prioritization

Respondent Selection
Finding suitable respondents is key to receiving credible answers, and ultimately to the credibility of
the calculated business value for the IT system scenarios. Hence, respondent profiles are presented
which act as guidelines when choosing appropriate respondents. For more details, see section 5.2.1.

Prioritization of Business Value Dimensions


In order to establish what is most important for the specific business unit, business value dimensions
are prioritized according to what the management considers to be of most value to their organization
[10]. For more details, see section 5.2.1.

5.1.2 Business Value Contribution

Respondent Selection
Finding suitable respondents is as important for evaluating contribution from functionality to business
value as it is when prioritizing business value dimensions. The profile of the typical respondent,
however, differs somewhat between the two activities. For more details, see section 5.2.2.

Establishing Business Value Contribution from Functionality


In order to identify how much business value the IT system functionality contributes to the
evaluated business unit, the functionality is mapped to a number of benefits of IT, i.e. the business
value dimensions. For more details, see section 5.2.2.

Establishing Quality Impact on Business Value Contribution


Non-functional aspects also affect the contribution to business value. In order to identify how much
impact they have, a number of quality dimensions are applied to the established mappings (from the
step above). For more details, see section 5.2.2.


5.1.3 Measurement Data Analysis

Coverage
There are several criteria that can be used for assessing coverage. For business value prioritizations,
the number of respondents can be used as a measurement of coverage. It is also important that the
respondents have different areas of responsibility, to ensure representative results for the business
unit. For the business value contribution, the coverage can be assessed by the proportion of functional
areas that have been covered, and the number of respondents that have been interviewed for the
functional areas. If the collected measurement data include too few respondents or functional areas,
additional measurements should be considered. There is, however, no definite limit for sufficient
coverage. For the business value prioritizations, it depends on the size of the assessed business unit.
For the business value contribution, a guiding principle is to cover all the functional areas with at least
two respondents, and the most contributing areas with a few complementary measurements.

Consistency
Consistency refers to how coherent a respondent’s answers are, i.e. how similar the answers to comparable questions are. For further details regarding prioritization consistency, see section 5.3.1. For business value contribution, the consistency also refers to how similarly respondents have evaluated the same functional areas or impact from quality dimensions. The higher the
discrepancy between the respondents’ answers, the lower the data consistency. As an example, if two
different respondents assert the contribution for a functional area to be very different, such as 0 and 4,
respectively, additional measurements should be considered (with other respondents).

Credibility
An examination of the credibility of respondents should be made to ensure that the collected data do
not contain answers from respondents with too low credibility. If the data do contain answers with
low credibility, additional measurements with more credible respondents should be considered. Please
note that credibility refers primarily to how well a respondent matches a predetermined profile (see
sections 3.5 and 5.2).

The analysis should yield a recommendation on whether the data coverage, data consistency and respondent credibility are sufficient for the measurement data collected and analyzed so far. If the analysis shows that the collected data are adequate, the measurement data can be used for further processing in the ITIEM project. Otherwise, an additional data collection step should follow to supplement the data collected so far.
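As an illustration of how these checks could be supported in practice, the sketch below flags functional areas with too few respondents and connections where respondents' grades diverge widely. This is only a minimal sketch; the thresholds (two respondents per area, a maximum grade gap of two) and the data layout are assumptions made for the example and are not prescribed by the method.

```python
# Illustrative sketch of the coverage and consistency checks described above.
# The thresholds (min_respondents=2, max_gap=2) are assumptions for demonstration,
# not values prescribed by the method.
from collections import defaultdict

def analyze_measurements(measurements, min_respondents=2, max_gap=2):
    """measurements: list of dicts with keys
    'respondent', 'functional_area', 'bv_dimension', 'grade' (0-4)."""
    grades = defaultdict(list)       # (area, dimension) -> list of grades
    respondents = defaultdict(set)   # area -> set of respondents
    for m in measurements:
        grades[(m['functional_area'], m['bv_dimension'])].append(m['grade'])
        respondents[m['functional_area']].add(m['respondent'])

    coverage_gaps = [area for area, resp in respondents.items()
                     if len(resp) < min_respondents]
    inconsistencies = [(area, dim) for (area, dim), g in grades.items()
                       if max(g) - min(g) > max_gap]
    return coverage_gaps, inconsistencies

# Example: two respondents grade the same connection 0 and 4 -> flagged as inconsistent.
data = [
    {'respondent': 'V1', 'functional_area': 'AIP', 'bv_dimension': 'Decision making', 'grade': 4},
    {'respondent': 'V2', 'functional_area': 'AIP', 'bv_dimension': 'Decision making', 'grade': 0},
]
print(analyze_measurements(data))
```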

5.1.4 Business Value Aggregation


It is important to keep in mind that both the business value prioritization and the business value
contribution affect the final business value. In the overall ITIEM project the two elements are
aggregated to form a combined business value, taking into consideration both functional and quality
aspects together with a management view and credibility assessments.

This is most easily shown through an example. Please note that this is not the exact way of calculating the business value, but merely a way to show how the different elements affect the overall business value. Take a certain functional area (for example Asset Investment Planning) that contributes to a certain degree to, for example, decision making (business value contribution), where the business value dimension decision making is also ranked by management (business value prioritization). In addition, the
average effect that all the quality dimensions have on this functional area is taken into consideration
(quality impact on business value contribution). Altogether, the total contribution that the functional
area Asset Investment Planning has on the business value dimension decision making can be calculated. This
is done for all business value dimensions for a certain functional area, in order to establish how much
the functional area contributes in general to the business value. This is in turn done for all the
functional areas in the functional reference model, in order to identify how much each functional area
contributes to the business value. Finally, the results from the calculations above are combined with
the rest of the results from the ITIEM method, in order to determine the total business value for the
evaluated IT system scenarios.
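To make the reasoning above concrete, the following minimal sketch combines the three elements (contribution grade, management priority and an averaged quality factor) into a single number for one functional area. As stated above, this is not the exact ITIEM calculation; the scaling of the quality factor and the numerical values are assumptions chosen for illustration (the example priorities correspond to the Distribution Sweden percentages in Figure 30).

```python
# Illustrative aggregation of business value for one functional area
# (not the exact ITIEM formula): sum over business value dimensions of
#   contribution grade (0-4) * management priority (0-1) * average quality factor.

def functional_area_value(contributions, priorities, quality_impacts):
    """contributions: {dimension: grade 0-4}
    priorities: {dimension: weight 0-1 from the prioritization}
    quality_impacts: list of quality impact grades 0-4 for this area."""
    # Average quality impact, scaled to the interval 0-1 (an assumed scaling).
    quality_factor = sum(quality_impacts) / (4 * len(quality_impacts))
    return sum(grade * priorities.get(dim, 0) * quality_factor
               for dim, grade in contributions.items())

# Hypothetical numbers for Asset Investment Planning:
contributions = {'Decision making': 4, 'Efficiency': 2, 'Information': 3}
priorities = {'Decision making': 0.72, 'Efficiency': 0.78, 'Information': 0.54}
quality_impacts = [4, 3, 3, 2, 4, 3, 2]   # one grade per quality dimension
print(round(functional_area_value(contributions, priorities, quality_impacts), 2))
```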


5.1.5 Prerequisites
The measurement method requires a functional reference model to delimit the evaluation scope. Also,
there is a need for descriptions of the functional areas that are to be evaluated (based on the
functional reference model). The descriptions are used for communicating the model to the
respondents. Moreover, a list of relevant business value dimensions along with their explanations is
required, as well as a list of quality dimensions and their explanations.

5.2 Instructions
The following chapter provides directions for how the method is applied in practice. This includes
how to find suitable respondents, how to prioritize business value dimensions and how to establish
the business value contribution.

5.2.1 Business Value Prioritization


The purpose of this component of the method is to establish what is most important for the specific
business unit, according to the managers. In this thesis, the web based software FocalPoint, an application that adopts the Analytic Hierarchy Process (AHP) [21], was used for prioritizing the business value dimensions. Note, however, that any other method that generates a list of prioritized alternatives can be used. The foundation of the AHP method is pairwise comparisons between
alternatives can be used. The foundation of the AHP method is pairwise comparisons between
different alternatives, whereby the relative importance of one alternative over another can be
expressed. In this way, by comparing a number of different pairs, a list is generated with all the
alternatives prioritized according to their relative importance. The alternatives in this case correspond
to the business value dimensions, which are explained in section 3.2. For details on how the pairwise
comparisons are performed, see the Prioritization section below. Moreover, the desired profile for
respondents can be found below.
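As an illustration of the AHP principle, the sketch below derives a priority vector from a small pairwise comparison matrix using the common geometric-mean approximation (the original AHP uses the principal eigenvector). It is only meant to show the idea behind the pairwise comparisons; it is not a description of the algorithm implemented in FocalPoint, and the comparison values are made up.

```python
# Minimal AHP sketch: derive priorities from a pairwise comparison matrix
# using the geometric-mean approximation. FocalPoint's internal algorithm may differ.
import math

def ahp_priorities(matrix):
    """matrix[i][j] = how many times more important alternative i is than j."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Three alternatives, e.g. decision making, efficiency and customer relations.
# The matrix is reciprocal: matrix[j][i] == 1 / matrix[i][j].
matrix = [
    [1,   2,   1/3],
    [1/2, 1,   1/4],
    [3,   4,   1],
]
print([round(p, 2) for p in ahp_priorities(matrix)])
```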

Respondent Selection
The typical respondent profile for prioritizing business value dimensions looks like the following:
• The respondent should have a senior management position at the top of the business unit, in
order to have a more strategic view of the business. Hence, the organizational level should be
“Business Unit”, according to the framework used to categorize respondents (see section
3.5).
• The respondent should have a business perspective (rather than IT), in order to be familiar
with the business value dimensions. Hence, the area of knowledge should be “Business”.
• The respondents should have different areas of responsibility, in order to reflect a
representative picture for the overall business unit (see section 5.3.1).

Prioritization
In order to prioritize the business value dimensions, a number of pairwise comparisons are required.
More specifically, the question asked to the respondent is which business value dimension is in the long-term
(more than 5 years) most important for the success of your business unit? The respondent is presented with two
different business value dimensions along with their descriptions, and is then asked to compare them
on a relative scale (see Figure 15 below).


Figure 15 – A screenshot of a pairwise comparison from FocalPoint [25]

As can be seen in Figure 16 below, the prioritization is made on a scale ranging from equal importance
to increasing importance in four levels in both directions.

Figure 16 – Explanation of the pairwise comparison scale

For example, the alternatives could be efficiency and decision making, where efficiency is described as
shortening of manufacturing times or reduced paperwork and administrative tasks and decision making is
described as better decision support, shorter time to decisions or increased reliability in decisions. This is done for
different pairs of the business value dimensions (according to a specific algorithm in the FocalPoint
software) until the prioritized list has been generated for the respondent. In order to achieve a result
with 95 per cent confidence, the respondents have to make 53 comparisons. The results of all
participating respondents are then aggregated to form a combined list for the overall business unit.
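How FocalPoint aggregates the individual results internally is not detailed here; a common way to combine individual AHP priority vectors is to take the geometric mean per alternative and renormalize, as in the minimal sketch below. The approach and the numbers are assumptions used for illustration only.

```python
# Combining individual priority vectors into one list for the business unit.
# The geometric mean per alternative is a common AHP aggregation; whether
# FocalPoint uses exactly this is an assumption made for illustration.
import math

def aggregate(priority_vectors):
    """priority_vectors: list of normalized priority vectors, one per respondent."""
    n = len(priority_vectors[0])
    combined = [math.prod(v[i] for v in priority_vectors) ** (1 / len(priority_vectors))
                for i in range(n)]
    total = sum(combined)
    return [c / total for c in combined]

# Two respondents, three alternatives (made-up values).
print([round(p, 2) for p in aggregate([[0.5, 0.3, 0.2], [0.4, 0.4, 0.2]])])
```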

5.2.2 Business Value Contribution


The purpose of this component of the method is to establish how much the IT system functionality
provides to the organization in terms of business value. In this thesis, the results were collected
through interviewing respondents, either in person or on the phone. For detailed directions on how
the interviews are performed, see the Interview Process section below. Moreover, the desired profile for
respondents can be found below.

Respondent Selection
The typical respondent profile for evaluating the contribution to business value looks like the
following:
• The respondent should use or be in need of IT system support for the functionality in
question, in order to be aware of the issues related to it.
• The respondent should personally perform the work tasks covered by the system
functionality, or at least know how other people carry out the work tasks, in order to be
acquainted with the work process that the functionality supports. Hence, the organizational level should be “Process Manager” or “Process Performer” according to the framework used
to categorize respondents (see section 3.5).
• The respondent should have a business perspective (rather than IT), in order to be able to
evaluate the business value of functionality. Hence, the area of knowledge should be
“Business” or “Business & IT”.
• The respondent should be able to fully evaluate at least one of the functional areas in the
functional reference model, in order to be a suitable respondent for the method.

The same respondent profile holds for assessing the quality impact, as this assessment is made in the same interview with the same respondent.

Interview Process
The following steps outline the process of interviewing a respondent. The interview process is divided
into two stages (see Figure 17 below), where the first stage is only carried out once for each
respondent in the beginning of the interview, and the second stage is repeated for each functional area
that is evaluated.

Figure 17 – Interview process overview

Concepts specific to the method are naturally explained when they come up during the interview, such
as the different business value dimensions and quality dimensions. Already in the beginning of the
interview, however, it should be pointed out to the respondent that the method is system
independent, in the sense that no system in particular is assessed when evaluating business value
contribution. The functionality is in this method considered more as a supporting tool, existing in
some system, regardless of what the system is. Hence, the method is applicable even if system support
does not yet exist, as long as the work tasks (which would be supported by the functionality) are
performed.

Interview Stage 1 (requires approximately 15 minutes)

1. Explaining Background and Purpose (approximately 10 minutes)


Please note that this step is only performed when interviewing a respondent for the first time.
The purpose is to give the respondent an understanding of how the interview fits into a larger
context, the reason for the interview and why the respondent has been selected for the interview.
It includes shortly explaining how the ITIEM works and which stage in the method that the
interview concerns. Moreover, this step encompasses familiarizing the respondent with the
functional breakdown of the functional reference model.

Then the respondent is given an opportunity to briefly present himself/herself, describing his/her background and current work tasks. Please note that this step can take longer if the respondent goes into detail about the background or work tasks. Hence, it is important to guide the respondent to follow the interview structure.

2. Collecting Respondent Credibility Data (approximately 5 minutes)


The purpose of this step is to establish the level of quality in the answers given by the
respondent. There is in fact a big difference if the answers are based on guesses made by a
respondent that just happened to be available, or if thorough arguments are provided by an
experienced expert in the evaluated area.

This step is a natural continuation of the respondent’s background description, since this
information coincides with the credibility dimensions that are being assessed in this step. More
specifically, it concerns Question Domain & Answer Domain Match, Match with Area of Expertise and
Years of Experience, as these dimensions are related to the field and the position that the
respondent is working in, as well as the amount of experience the respondent has (see section 3.4
for details and Figure 18 below for an example).

Figure 18 – Example of how the form for respondent credibility is filled in.
Please note that this is only a snapshot from the whole form, which can
be found in Appendix 7.

For some of the credibility dimensions, questions are asked directly to the respondent, such as
how much experience do you have in the question domain? For other dimensions the assessment is just
noted by the interviewer, for example to what extent does the answer domain match the question domain?
For the form used by the interviewer during the interview, see Appendix 7.

Interview Stage 2 (approximately 30 minutes)

3. Explaining Functional Area to be Evaluated (approximately 5 minutes)


The purpose of this step is to determine which areas the respondent is able to evaluate. One
of these functional areas is selected for evaluation. This area is then explained in detail and
discussed with the respondent to ensure that the respondent agrees with the description and that
the interviewer and the respondent mean the same thing. For an overview of the functional areas,
see Appendix 4 and Figure 19 below for an example.

Figure 19 – Example of a functional area description. Please note that this is only a
snapshot from the whole form, which can be found in Appendix 4.

The same descriptions of functional areas should be used for all respondents to maintain
consistency throughout the method. Please note that the interviewer needs to have a more
thorough understanding of the functional areas in order to be able to explain the brief points
shown to the respondent. Also note that it is very important to emphasize throughout the
interview that the evaluation is limited to this particular functional area and not IT systems
in general.


4. Establishing the Business Value Contribution from Functionality (approximately 15 minutes)


The purpose of this step is to identify which connections exist between the evaluated
functional area and the business value dimensions, and also how strong these connections are.
The answer is given on a predetermined scale from 1-4 (see Figure 20 below), and if there is no
connection to a certain business value dimension, this value is 0.

Figure 20 – Scale for business value contribution from functionality

Hence, for each business value dimension the respondent is asked how much does the functionality
contribute directly to the business value dimension? As an example, for the functional area Asset Investment
Planning the questions for the three first business value dimensions would be the following:
• How much does system functionality for Asset Investment Planning contribute directly to improved
decision making?
• How much does system functionality for Asset Investment Planning contribute directly to improved
ability to differentiate products/services?
• How much does system functionality for Asset Investment Planning contribute directly to improved
efficiency?

To help the respondent to start thinking in business value terms, the interviewer can:
1) Ask the respondent to contrast having the functionality against not having it at all
2) Give examples more specifically adapted to the respondent’s area of responsibility. This
concerns mainly the business value dimensions, where the original examples are of a fairly general
character.

The answers, i.e. contribution grades and relevant comments, are filled in on a form together with
identifying respondent information (see Figure 21 below). For the whole form used by the
interviewer during the interview, see Appendix 5. A relevant comment could for example be
“there is a high contribution as the functionality summarizes data and makes it easier to spot
patterns and trends, which improves decision making”.

Figure 21 – Example of how the form for business value contribution is filled in.
Please note that this is only a snapshot from the whole form, which can be found in
Appendix 5.

This procedure is repeated for all the business value dimensions. It should be emphasized to the
respondent that the contribution grades are in relation to the other business value dimensions,
and that the highest alternative should only be given to the dimensions that stand out from the
rest, i.e. that the functional area contributes the most to. Moreover, it should be stressed to the
respondent that it is not the importance of the business value dimensions that is assessed
in this step, but the contribution that the functionality has to them. The actual importance of the
dimensions is established in the business value prioritization step. If the respondent has difficulty
with or is uncertain of some of the dimensions, the interviewer should ask him/her to continue and then come back to these dimensions after the contribution has been established to all the other dimensions.

After establishing all the connections from a functional area, the interviewer should go through
the answers with the respondent in order to ensure that the respondent obtains an overview and
sees the answers in relation to each other. Then the respondent should be asked to point out if
there are one or two business value dimensions that the functionality especially contributes to, in
order to determine the main business value focus of the functional area.

Please note that the purpose of the method is to identify so-called direct benefits that the
functionality contributes to, i.e. first order benefits. This means that the respondent should
express the direct benefits from having certain functionality, not the ultimate benefit, for example
higher revenues for the organization. This means that the respondent should not discuss causal links between the business value dimensions when identifying the business value contribution. It might be difficult for the interviewer to express this explicitly, but it is nevertheless important to assess in some form, since the existence of causal relations lowers the credibility of the results [7]. See Figure 22 below for an example of direct and indirect business value
contribution.

Figure 22 – An example of direct and indirect business value contribution. Functional area 1
contributes directly to the dimensions information, efficiency and control and follow up. The
improved information in turn leads to better quality of products and services, but this is an indirect
result and should not be counted when assessing the business value contribution in this case.
The same is true for the second-order benefits that information, efficiency and control and follow up
have in terms of reduced costs. Functional area 2, on the other hand, contributes directly to
cost reductions (for example in terms of travelling costs) and better quality of products and services
(for example improved power quality for customers), which makes it a valid connection.
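A small data-model sketch of how the direct connections could be recorded is shown below. Only first-order grades are stored for a functional area; second-order effects such as those in Figure 22 are deliberately left out. The record structure and field names are illustrative assumptions, not part of the ITIEM forms.

```python
# Sketch of how direct contributions (as in Figure 22) could be recorded.
# Only first-order connections are stored; indirect effects (e.g. improved
# information later leading to better quality) are deliberately excluded.
# The type and field names below are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class FunctionalAreaContribution:
    functional_area: str
    respondent: str
    direct_grades: dict = field(default_factory=dict)   # dimension -> grade 0-4
    comments: dict = field(default_factory=dict)        # dimension -> free-text comment

fa1 = FunctionalAreaContribution(
    functional_area='Functional area 1',
    respondent='V4',
    direct_grades={'Information': 3, 'Efficiency': 2, 'Control and follow up': 2},
    comments={'Information': 'summarizes data, easier to spot patterns and trends'},
)
print(fa1.direct_grades)
```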

5. Establishing the Quality Impact on the Business Value Contribution (approximately 10 minutes)


The purpose of this step is to identify how much impact certain quality aspects have on the
business value contribution. That is, how much each quality dimension of a system influences the
contribution that the functional area in question has on business value. The answer is given on a
predetermined scale from 1-4 (see Figure 23 below), and if the quality dimension has no impact,
this value is 0.

Figure 23 – Scale for quality impact on business value contribution

Hence, for each quality dimension the respondent is asked how much impact does the quality dimension
have on the business value contribution? As an example, for the functional area Asset Investment Planning
the questions for the first three quality dimensions would be the following:
• How much impact does the fact that the system is available and reliable have on the business value
contribution from Asset Investment Planning?
• How much impact does the fact that the data quality in the system is high have on the business value
contribution from Asset Investment Planning?
• How much impact does the fact that the information security of the system is high have on the business
value contribution from Asset Investment Planning?

The answers, i.e. grades for quality impact, priority ranks (see below) and relevant comments, are
filled in on a form together with identifying respondent information (see Figure 24 below). For
the whole form used by the interviewer during the interview, see Appendix 6. A relevant
comment could for example be “usability has high impact on the business value contribution as
the people using the system are not IT experts, and if the usability of the system is not high, the
system will not be used very frequently or effectively”.

Figure 24 – Example of how the form for quality impact is filled in. Please note that this
is only a snapshot from the whole form, which can be found in Appendix 6.

This procedure is repeated for all the quality dimensions. After asking these questions, the quality
dimensions are ranked according to their relative importance. More specifically, the quality
dimensions are prioritized from 1-7 (since there are 7 quality dimensions), where 1 is the most
important and 7 is the least important.

After establishing the quality impact and the relative importance between the quality dimensions,
the interviewer should place the answers in relation to other functional areas that the respondent
has evaluated, in order to accentuate possible differences in quality impact between functional
areas.
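A minimal sketch of how the ranking complements the impact grades is shown below: two quality dimensions with the same 0-4 grade are still separated by the explicit 1-7 rank. The dimension names and numbers are illustrative assumptions.

```python
# Sketch: ordering the quality dimensions of one functional area by the explicit
# 1-7 rank given by the respondent, keeping the 0-4 impact grade alongside.
# Dimension names, grades and ranks below are made up for illustration.
quality_answers = [
    {'dimension': 'Availability and reliability', 'impact': 4, 'rank': 1},
    {'dimension': 'Data quality',                 'impact': 4, 'rank': 2},
    {'dimension': 'Usability',                    'impact': 3, 'rank': 3},
]
for answer in sorted(quality_answers, key=lambda a: a['rank']):
    print(f"{answer['rank']}. {answer['dimension']} (impact {answer['impact']})")
```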

6. Assessing Credibility for Functional Area (<5 minutes)


The purpose of this step is to assess the credibility dimensions that differ between functional
areas, i.e. where the credibility is related to the specific functional area that is being evaluated. It
concerns the four remaining credibility dimensions, i.e. Source Proximity, Age of Answer, Reflected
Credibility and Interviewer’s Assessment (see section 3.4 for more details and Figure 25 below for an
example). These answers are collected separately for each evaluated functional area. Please note
that the Interviewer’s Assessment dimension is silently recorded by the interviewer and is not shown
to the respondent.


Figure 25 – Example of how the form for credibility is filled in. Please note that this is
only a snapshot from the whole form, which can be found in Appendix 7.

If another functional area is to be evaluated, then the process starts over at step 3.

5.3 Justifications
The following section presents the reasons for choices made during the development of the method.
It includes arguments for each of the parts included in the method, i.e. business value prioritization
and business value contribution. Also, the design choices for the questionnaires are justified.

5.3.1 Business Value Prioritization

Web Based Application


One of the reasons for choosing a web based application is that the respondents can perform the prioritization anywhere they like, at any suitable time (within a given time period). Moreover, this way of prioritizing is not very time consuming, taking approximately 30 minutes per respondent. The only thing the respondents are required to do is to pairwise compare the business value dimensions, and the prioritization is then generated automatically. Doing it this way also saves time in managing the results, as the results from all respondents are summarized electronically and are easily accessible.

A drawback with a web based method is that there is little interaction with the interviewer, and there is a risk of misunderstandings and misuse of the scales. This risk can be reduced, however, by providing clear, thorough instructions to the respondents. Also, there is little information about how
well thought-out the answers are, even though some indication comes from the level of consistency in
the answers.

Business Perspective
The reason for choosing a business perspective (rather than IT) is to identify what is most important
in a strategic business sense. This establishes what the functionality of an IT system should support,
so that it is possible to match this ambition with what the functionality actually does support. The
business perspective is incorporated through the respondent profile and through directions given to
the respondents.

Long-term Perspective
The reason for choosing a long-term perspective is to reduce the influence from day-to-day or current
problems on the prioritization. Also, this results in more of a forward-looking perspective, which is
useful as the information systems will take time to implement in the organization and will be used for
a longer period of time. The long-term perspective is incorporated through directions given to the
respondents and in the question asked to the respondents, i.e. which business value dimension is in the
long-term (more than 5 years) most important for the success of your business unit?

Participants with Different Areas of Responsibility


The reason for choosing respondents with different focus is to ensure that the results present a
representative picture for the overall business unit. The different areas of responsibility are
incorporated through the choice of respondents from different departments of the business unit.


Separate Prioritization for Each Business Unit


The reason for conducting separate prioritizations for every business unit is to recognize differences
in focus between business units. The strategy of business units differs and this can translate into
differences in prioritizations. For example, the relations with customers are not very important for a
business unit that has little contact with the end customer, whereas it can be of great importance to a
business unit that has that contact.

5.3.2 Business Value Contribution

Granularity Level for Evaluating Business Value Contribution


There are two alternatives to choose from when trying to identify the contribution to business
value from functionality. Respondents can either be interviewed for each abstract component, or the
interviews can be carried out for larger functional groups (consisting of several abstract components).
For the purpose of this method, interviews on the group level are more suitable, for several reasons
stated below.
• Respondent Requirements
Interviewing respondents for each abstract component requires 3 times as many different
respondents compared to interviews on functional groups (see Appendix 3). This is due to
the fact that the abstract components are more closely specified functionally, which requires
respondents with deeper knowledge in a more limited functional area. Because of this
limitation, there is a risk of the respondent losing the overall business value perspective. In
other words, it can be difficult for the respondent to identify the contribution to business
value from a too delimited functional area, such as a single abstract component.
• Time Requirements
Interviewing respondents for each abstract component requires 4 times as much interview
time compared to interviews on functional groups (see Appendix 3). This does not include
the administrative time it takes to find respondents and book them, nor does it include
potential travelling time. Furthermore, with more respondents involved, the overhead time
increases, i.e. the time needed to explain the background and the reason for the interview,
and also instructing the respondent on how the interview is carried out.
• Other Issues
Functional groups encompass a larger functional area, which corresponds more closely to
the work processes that organizations often follow (as opposed to strict functional
departments). This makes it easier for the respondent to relate his/her assessment to the
structure of the organization. Also, there are certain business value dimensions that
consistently receive high marks (meaning high contribution from functionality) regardless of
the functional area (see section 6.2.2). If the evaluation is carried out on single abstract
components, this leads to unnecessary time consumption and redundant results.

Level of Detail in Descriptions of Quality Dimensions


The descriptions of the quality dimensions given to the respondent have a suitable level of detail to be
understood (as tested on respondents). Too detailed descriptions, on the one hand, would increase the
time required to explain them and too many details would also confuse respondents, as the
respondents are mainly business focused (see desired profiles in chapter 5.2). Too general
descriptions, on the other hand, would make it difficult for respondents to understand exactly what
they are supposed to assess. Also, using too little detail in the descriptions would make it harder to
receive consistent evaluations from different respondents, as the respondents prior to the interview
have different opinions of what the quality dimensions represent, and must be clearly informed of
what is relevant in this method.

Ranking of Quality Dimensions


The quality dimensions are also ranked, in order to identify differences even if the grades are very
similar between quality dimensions, as it forces the respondent to decide upon a priority between
quality dimensions. This ranking of dimensions also generates a more thorough discussion with more
arguments given for the respondent’s choices.

5.3.3 Structure of Interviews and Questionnaires


Four pilot interviews were conducted to ensure the general appropriateness of the interview process
and the questionnaires. Questions, explanations, and examples were also evaluated in order to see if
Andreas Andersson Industrial Information and Control Systems, KTH

32
Master of Science Thesis 2006-09-14

they were understood as intended and that questions were answered as expected by the respondents.
Finally, the scales for evaluating contribution from functionality, and the impact of quality were tested.
The pilot interview respondents were from both Vattenfall and the department of Industrial
Information and Control Systems at KTH.

Quantitative Approach and Structured Interviews


A quantitative approach, focusing the data collection on measurements and the analysis on statistical
processing, was chosen [3]. It was considered that the quantitative perspective would facilitate making
the evaluation objects measurable and to present the evaluation results numerically [1]. Furthermore, a
considerable amount of information is collected during the interviews, which originates from many
different respondents and concerns a number of different data dimensions. To facilitate the
management of measurement data and to more easily control the data collection procedure, structured interviews were chosen [1]. Another reason for choosing a structured method is to facilitate telephone interviews, which reduces time, travelling and costs, as can be seen in section 8.2. Finally,
structured interviews are easier to learn for an interviewer that has not used the method before. All
this is in accordance with the principally quantitative method ITIEM.

The primary purpose of the disposition of the questionnaires was to facilitate the respondents’
understanding of the method. The disposition also aimed at making the data registration as effortless
and fast as possible for the interviewer in order to minimize the time consumption. Moreover, a
suitable grading scale was chosen (see Scales below). Much effort was also put into choosing words
that would suit the respondents, since it was discovered that usage of inappropriate words and
concepts could make the respondents confused during the interviews, jeopardizing the success of the
evaluation.

Scales
The scales used in the method were chosen according to comments from the pilot interviews, but also
in order to maintain a harmonious scale structure throughout the ITIEM project. There were
essentially two aspects of the scales that raised questions:
• The number of alternatives on the scale
The reason for choosing 4 alternatives is the tendency that respondents have to pick the middle alternative [21]. By using an even number of alternatives this can be avoided. An option would be to use more than 4 alternatives, but this would make it more difficult for the respondent to see the differences between alternatives. Using fewer than 4 alternatives would be practically impossible if an odd number of alternatives is to be avoided.
• The representation of the alternatives on the scale
The options to represent the alternatives on the scale are by numbers, words or both. Using
merely words confuses the respondent, as it is difficult to find words on a 4-graded scale that
convey the objective difference between adjoining alternatives in the way that numbers do,
for example between 3 and 4. To guide the respondent to what the numbers on the scale
represent, a combination of words and numbers is used, but only at the end of the scale, i.e.
on alternative 1 and 4 (see Figure 26 below).

Please note that there is an option for the respondent to state no contribution to a certain business value dimension from a functional area. The 0 option is, however, clearly marked out as not being part of the regular 1-4 scale, and has been perceived as distinct during the interviews. Thus, the 2 cannot be considered to be a middle alternative in this case.

Figure 26 – Example of a scale used in this thesis


6 Case Study
The following chapter gives an account of the extensive case study performed at two business units
at Vattenfall, altogether involving 30 people from the company. The first section deals with the data
collection for both the business value prioritization and the business value contribution, and in the
second part the results from the case study are presented.

6.1 Data collection


The following section describes in detail how the method was applied at Vattenfall. Please note that the chapter, for pedagogical reasons, follows the same structure as section 5.1, in order to facilitate understanding.

6.1.1 Business Value Prioritization

Respondent Selection
People at Vattenfall helped point out suitable respondents for the prioritization. At Distribution Sweden, the IT director in particular assisted in finding respondents within the desired profile (see section 5.2.1), and at Hydro Power Sweden the person in charge of maintenance development, together with a person from Vattenfall Research & Development, helped identify potential participants. An example of participating respondents can be seen in Figure 27 below. The entire list
of respondents can be found in Appendix 9.

Respondent   Area of Knowledge   Organizational Level   Relation to Organization
E1           Business & IT       Business Unit          Employee
E7           Business            Process Manager        Employee
E8           Business            Business Unit          Employee

Figure 27 – Example of business value prioritization respondents and their profiles

Prioritization of Business Value Dimensions


Two separate surveys were created in the software FocalPoint, one for Distribution Sweden and one for
Hydro Power Sweden. Potential respondents were added to the application and user information (user
name and password) was associated with each respondent. Emails were sent out to the potential
respondents with an inquiry about participation and instructions about the web based survey. Please
note that these emails were sent out to Distribution Sweden around one week before Hydro Power Sweden,
as the respondent list had been compiled earlier at this business unit.

The e-mail that was sent out contained the following information:
• Description of background and purpose, i.e. that the survey was part of a master thesis in
cooperation with Vattenfall, related to the ITIEM project.
• Motivation for the respondents to participate, i.e. arguments that the results of the thesis and
the ITIEM would turn out to be more relevant and usable for Vattenfall in terms of more
accurate business value estimates for the IT system scenarios.
• An overview of the prioritization process, i.e. that the participation would be followed by the
results being sent out for possible comments from the respondents, before finalizing the
results for further use in the ITIEM.
• Detailed instructions for the prioritization, i.e. how to pairwise compare business value
dimensions, which question to be answered, description of the scale used, how much time
the prioritization was estimated to take and also the reason for comparing a recommended
number of business value dimensions to ensure high quality of data. Moreover, the address
to the website was included and the login information for the respondent, as well as a note
ensuring confidentiality of individual results. Please note that the specific instructions on
how to pairwise compare dimensions were repeated as a welcoming screen when
respondents logged in to the FocalPoint software.

Altogether, inquiries were sent out to 10 persons at Distribution Sweden and to 12 persons at Hydro Power
Sweden. As can be seen in section 6.2.1, ultimately all 10 persons participated at Distribution Sweden, and
8 out of 12 persons at Hydro Power Sweden. To receive answers from all the participants took around
four weeks for both business units. The difference in participation between the business units may have been a result of help from the inside at Distribution Sweden, where the IT director assisted in
motivating the respondents to participate.

Finally, thank you emails with the results (separate for the two business units) were sent out to all the
participants, giving them a chance to comment on the results before continuing with analyzing the
results.

6.1.2 Business Value Contribution

Respondent Selection
Initial respondents were found in Per Närman’s master thesis report [19], as they had been involved in
an earlier phase of the ITIEM project and were already categorized according to the respondent
profile framework (see section 3.5). The existing categorizations of respondents notably facilitated the
respondent selection. Some respondents were also pointed out by people at Vattenfall, and during the
course of data collection interviews, referrals from respondents led to more suitable respondents. An
example of interview information can be seen in Figure 28 below and more details about the
participating respondent can be seen in Figure 29. The entire list of interviews and respondents can be
found in Appendix 9.

Interview no.: 11
Date: 2006-04-10
Respondent: V4
Interview Length: 1.5 hours
Interview Type: In person (Råcksta)

Functional Area   Source Proximity                        Age of Answer   Respondent Certainty
SCM               Works with tasks himself                Weeks           3
CON               Referred to person working with tasks   Months          3
AIP               Referred to person working with tasks   Months          3

Figure 28 – Example of business value contribution interview and credibility information

Respondent   Area of Knowledge   Organizational Level   Relation to Organization   Q&A Domain   Years of Experience   Covered Functional Areas
V4           Business            Process Performer      Employee                   STB & SWT    > 3 years             SCM, CON, AIP

Figure 29 – Example of business value contribution respondent information

Establishing Business Value Contribution and Quality Impact


The following section outlines how a typical interview was performed during the data collection. On
average, an interview took around 1 hour and 40 minutes in total and included evaluation of three
functional areas (see Appendix 3 for estimates on interview time). The interview example follows the
same structure as in section 5.2.2, where details can be found about the different forms used during
the interview. Please note that in the beginning the interviews required more time, as the interview process was not yet optimized and the interviewer did not yet have the experience to consistently guide the respondents in the right direction.

Interview Stage 1 (approximately 15 minutes)

1. Explaining Background and Purpose (approximately 10 minutes)


The interview usually started with the interviewer shortly explaining the background for the
interview, including what the ITIEM project is and where the interview fits into the overall
project. Also, the respondent was informed about why he/she had been selected for the
interview.

2. Collecting Respondent Credibility Data (approximately 5 minutes)


After that, the respondent briefly presented himself/herself, describing his/her background and current work tasks. This was connected to registering credibility data related to the respondent’s work tasks and experience, more specifically Question Domain & Answer Domain Match, Match with Area of Expertise and Years of Experience. At the beginning of the method development these credibility dimensions were assessed at the very end of the interview, but as it turned out it is more natural to make this assessment in connection with the respondent describing his/her background.

Interview Stage 2 (approximately 30 minutes x 3 = 90 minutes)

3. Explaining Functional Area to be Evaluated (approximately 5 minutes)


The interviewer started explaining the functional area according to the main points prepared and
sent out to the respondent in advance. After this, a short discussion followed in order to verify
that the respondent agreed with the description of the functional area given by the interviewer
and that both persons meant the same thing. It was discovered that the respondent often was
reasoning in general system terms and it is therefore important to emphasize throughout the
interview that the evaluation is limited to this particular functional area and not IT systems in
general.

4. Establishing the Business Value Contribution from Functionality (approximately 15 minutes)


The interviewer started by explaining the question to be answered along with an example of a
business value dimension, for example how much does system functionality for Asset Investment Planning
contribute directly to improved decision making? (Assuming that Asset Investment Planning was the
functional area that was evaluated) Also, the interviewer exemplified an answer, for instance
“there is a high contribution as the functionality summarizes data and makes it easier to spot
patterns and trends, which improves decision making”. In connection with this, the scale was
showed and explained to the respondent. More specifically, the respondent was told that the
contribution goes from 1 to 4, but that there is also an option to answer 0 in case the
contribution is irrelevant or non-existent. Also, the respondent was informed that the business
value dimensions would be explained in detail as they come up. After this the contribution was
identified to all the business value dimensions. If the respondent had difficulty with or was
uncertain at some of them, the interviewer asked him/her to continue and noted this. These
business value dimensions were then brought up again after the contribution had been
established to all the other dimensions.

5. Establishing the Quality Impact on the Business Value Contribution (approximately 10 minutes)


The interviewer started by explaining the question to be answered along with an example of a
quality dimension, for example how much impact does the fact that the system is available and reliable have
on the business value contribution from Asset Investment Planning? (Assuming that Asset Investment Planning
was the functional area that was evaluated) Also, the interviewer exemplified an answer, for
instance “usability has high impact on the business value contribution as the people using the
system are not IT experts and if the usability of the system is not high, the system will not be
used very frequently or effectively”. After this the impact was established for all the quality
dimensions. The dimensions were then prioritized from 1-7 (since there are 7 quality
dimensions), where 1 is the most important and 7 is the least important. This prioritization step
was not part of the method in the beginning of the method development, but was introduced
because all the quality dimensions received high grades consistently. Prioritizing them makes it
possible to distinguish differences even if dimensions have the same grade.

6. Assessing Credibility for Functional Area (<5 minutes)


Finally, the interviewer asked the respondent questions related to the specific functional area that
had just been covered, concerning Source Proximity, Age of Answer, Reflected Credibility and Interviewer’s
Assessment. At the beginning of the method development these credibility dimensions were
assessed at the very end of the interview, but as it turned out it is easier to assess them in
connection with the functional area that is being evaluated, i.e. at the end of Interview Stage 2. This
is because the description of the functional area is fresh in mind and the respondent can more
effortlessly relate the area to their work tasks and when they last used the type of functionality.

After this step, the interviewer started over at step 3 explaining another functional area to be
evaluated. This was repeated twice, altogether evaluating three functional areas.

Questions Asked to Verify the Method


At the end of the interview the respondent was asked if he/she thought this was a reasonable way to assess business value and whether the method related to how he/she actually performs the work. Also, the respondent was asked if there was something in the method that was particularly difficult or if he/she had any suggestions on how to improve the method. For phone interviews, the
respondent was asked if it was possible to understand the method despite not seeing the interviewer
demonstrate it. Some of the respondents were asked whether the functional level was appropriate, i.e.
if it was possible to consider the whole functional area as homogeneous when relating it to business
value and if it would have been possible to assess the business value on a lower level. This was done in
order to verify that the method is an appropriate way of identifying business value, and that it is
practically applicable.

Positive Comments from Respondents


Most respondents thought the method seemed logical and that it was possible to answer the questions
appropriately. The respondents interviewed over the phone had no particular problems with this setup, not even respondents whom the interviewer had never met in person. The level of functionality worked
well for assessing business value contribution, and most respondents seemed to be able to discuss the
functional areas as limited but homogeneous parts.

Suggested Improvements from Respondents


A few respondents, however, found the method a bit theoretical and had difficulty relating the functionality to the business value dimensions. Moreover, for the most extensive functional areas (for example Asset Investment Planning) a few respondents thought the functional scope could have been broken down into smaller functional areas. Sometimes the functionality listed under a functional area included functions that were not used by the business unit being assessed, which to a certain extent has to do with the general character of the functional reference model. A common perception was that there were too many business value dimensions, and that some of them were hard to distinguish from each other. It was mainly difficult for respondents to tell the difference between flow of products/services and integration and coordination, as well as to make a distinction between efficiency and productivity.

6.1.3 Measurement Data Analysis

Coverage
For the prioritization the coverage has to be considered sufficient, especially for Distribution Sweden
where all 10 invited respondents participated, but also for Hydro Power Sweden where 8 respondents
participated, as it is a smaller business unit. For business value contribution, the objective was to first
cover all the functional areas at least once, and then cover the most important areas more times (most
important in this case refers to the functional areas covering the largest number of abstract
components). For the resulting coverage of the case study, see section 6.2.2.

Consistency
For the prioritization the consistency was generally high, with a few exceptions (see section 6.2.1). For business value contribution, there are numerous connections from the functional areas to the business value dimensions, which makes some inconsistencies inevitable when comparing answers from different respondents. The inconsistencies were compensated for by interviewing more respondents for the larger functional areas, but some inconsistencies remain for smaller areas due to lack of resources and difficulty in finding suitable respondents. Please note, however, that the results can always be supplemented afterwards as much as necessary.

Credibility
The credibility of respondents was overall relatively high and most respondents had suitable profiles
(see Appendix 9).


6.2 Results
The following section presents the results from the case study at Vattenfall. The results from the
business value prioritization and the business value contribution are depicted in separate sections, and
the results are compared for the two assessed business units.

6.2.1 Business Value Prioritization


The resulting prioritization consists of a list of business value dimensions ranked in relation to each
other. This means that the highest prioritized dimension is given 100 per cent and all the other ones
are placed in relation to this one. For example, if a business value dimension is prioritized at 50 per
cent, this means that the dimension is half as important as the highest one.
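Expressed as a calculation, this relative presentation is obtained by dividing each aggregated priority by the largest one, as in the minimal sketch below (the weights are made up for illustration only).

```python
# Scaling aggregated priorities so that the highest dimension becomes 100 per cent
# and all others are expressed relative to it. The weights below are made up
# for illustration and are not the actual aggregated FocalPoint results.
weights = {
    'Customer relations': 0.0810,
    'Cost reductions': 0.0786,
    'Quality of products/services': 0.0745,
}
top = max(weights.values())
relative = {dim: round(100 * w / top) for dim, w in weights.items()}
print(relative)
```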

See Figure 30 and Figure 31 below for the prioritization results from Distribution Sweden and Hydro
Power Sweden separately, and Figure 32 for a comparison of the two business units.

Customer relations 100%
Cost reductions 97%
Quality of products/services 92%
Strategy formulation and planning 86%
Efficiency 78%
Decision making 72%
Flow of products/services 70%
Organizational culture 65%
Productivity 63%
Change management 61%
Deliveries 59%
Information 54%
Communication 53%
Third party relations 46%
Organizational learning and knowledge 45%
Technology/tools 45%
Inbound logistics 44%
Control and follow up 44%
Flexibility 43%
Supplier relations 38%
Integration and coordination 37%
New products/services 26%
Differentiations in products/services 25%
Lock-in effect/switching costs 22%
Competitor relations 22%

Figure 30 – Business value prioritization for Distribution Sweden


Technology/tools 100%
Productivity 98%
Decision making 98%
Organizational culture 94%
Organizational learning and knowledge 90%
Communication 88%
Third party relations 87%
Strategy formulation and planning 81%
Flow of products/services 74%
Efficiency 73%
Quality of products/services 71%
Flexibility 71%
Supplier relations 63%
Control and follow up 62%
Cost reductions 60%
Inbound logistics 60%
Change management 56%
Information 55%
Deliveries 54%
Integration and coordination 53%
Customer relations 47%
Lock-in effect/switching costs 20%
New products/services 17%
Competitor relations 17%
Differentiations in products/services 16%

Figure 31 – Business value prioritization for Hydro Power Sweden

Figure 32 – Business value prioritization comparison for Distribution Sweden (Eldistribution) and Hydro
Power Sweden (Vattenkraft); bar chart comparing the values shown in Figure 30 and Figure 31

Overall, the prioritizations are in line with what could be expected from the respective business units.
As can be seen below, there are a few notable differences between the assessed business units, but also
some clear similarities.

Respondents
10 respondents participated at Vattenfall Distribution Sweden, and 8 at Vattenfall Hydro Power Sweden.
Please note, however, that the answers of 3 respondents have been excluded when calculating the final
results due to uncertainties in the results (see below).

Differences and Similarities between Assessed Business Units


The most apparent differences between the assessed business units are that customer relations are highly
prioritized for Distribution Sweden and technology/tools are at the lower end, with 100 per cent (i.e. the
highest priority) and 45 per cent, respectively (see Figure 32 above). The opposite is true for Hydro
Power Sweden, i.e. technology/tools are highly prioritized (100 per cent) and customer relations are at the other
end of the scale (47 per cent).

The differences between business units could be a result of different strategic goals and different
business conditions for the respective business units. For example, the highly prioritized customer
relations for Distribution Sweden could be related to the customer contact that the business unit has. And
vice versa, the low prioritization for Hydro Power Sweden regarding customer relations could reflect the fact
that the business unit has little end customer contact. Similarly, that technology/tools are highly
prioritized for Hydro Power Sweden could be linked to the fact that much of its operations are
concerned with maintenance of power generation stations and dams. Distribution Sweden, on the other
hand, encompasses a wider field of operations and technology/tools are likely not predominant in the
same way as for Hydro Power Sweden.

Regarding similarities, on the other hand, there are four business value dimensions that clearly have
the lowest priority for both business units. More specifically, these are competitor relations, lock-in
effect/switching costs, differentiations in products/services and new products/services, which are ranked at 22 to 26 per
cent in relation to the highest prioritized dimension for Distribution Sweden and at 16 to 20 per cent for
Hydro Power Sweden (see Figure 32 above and Figure 33 below). The next prioritized dimension after this
low-priority group is at 37 per cent for Distribution Sweden and at 47 per cent for Hydro Power Sweden.
Hence, the difference between this particular group and the rest of the business value dimensions is
especially evident at Hydro Power Sweden.

The similarities between the business units could be a consequence of them operating in the same
industry, namely the utilities industry.

Business Value Dimension Distribution Sweden Hydro Power Sweden


Competitor relations 22% 17%
Lock-in effect/switching costs 22% 20%
Differentiations in products/services 25% 16%
New products/services 26% 17%
Next prioritized dimension 37% 47%
Figure 33 – Lowest prioritized business value dimensions for Distribution Sweden and Hydro Power
Sweden

Differences between Respondent Results


The prioritizations differ between respondents with different business focus. An example can be seen
in Figure 34 below, which shows the prioritizations for a manager responsible for asset
development and for a manager in charge of customer relations. The example illustrates that the
prioritizations reflect the managers' main areas of responsibility. For instance, the customer
relations manager prioritized the categories customer relations, quality of products/services and deliveries as
important, and conversely supplier relations, inbound logistics and change management as less important. The
manager responsible for asset development and strategy, on the other hand, ranked the categories
strategy formulation and planning, efficiency and decision making as important business value dimensions.


Figure 34 – Differences in business value prioritization for respondents with different areas of
responsibility (bar chart contrasting a manager responsible for asset development with a manager in
charge of customer relations)

Uncertainty in Results
Overall, the consistency of the answers is high, more precisely 89 per cent for Distribution Sweden and
88 per cent for Hydro Power Sweden. The consistency “is an indication of the quality of the
comparisons” [25]. As an example, if business value dimension A is prioritized higher than B and B
higher than C, then A is expected to be prioritized higher than C. But if C instead is prioritized higher
than A by the respondent, there are inconsistencies in the results. Hence a high consistency, i.e. close
to 100 per cent, indicates high quality in the results [25]. It should be noted, however, that the
consistency of two of the respondents at Distribution Sweden is so low (less than 60 per cent), that their
answers are not considered when summarizing the final results. At Hydro Power Sweden, the results of
one of the respondents include too much statistical deviation and have to be discarded. Thus, the
number of respondents included in the results is de facto 8 at Distribution Sweden and 7 at Hydro Power
Sweden.
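As an illustration of the transitivity idea behind the consistency figure, the following Python sketch counts how many ordered triples of dimensions violate transitivity in a set of pairwise preferences. The preference data is hypothetical and the computation is only an analogue of, not identical to, the consistency measure reported by the prioritization tool [25].

    from itertools import permutations

    # Illustrative transitivity check (hypothetical data; not the tool's own formula).
    # prefer[(x, y)] is True if the respondent prioritized dimension x over dimension y.
    prefer = {
        ("A", "B"): True,  ("B", "A"): False,
        ("B", "C"): True,  ("C", "B"): False,
        ("C", "A"): True,  ("A", "C"): False,  # C over A violates transitivity (A>B, B>C)
    }

    dimensions = ["A", "B", "C"]
    triples = list(permutations(dimensions, 3))
    violations = sum(
        1 for a, b, c in triples
        if prefer[(a, b)] and prefer[(b, c)] and not prefer[(a, c)]
    )
    consistency = 1 - violations / len(triples)
    print(f"Consistency: {consistency:.0%}")  # lower values signal contradictory answers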

6.2.2 Business Value Contribution


The contribution from functionality to business value is represented as a percentage of the maximum
contribution that can be reached from a certain functional area to a specific business value dimension.
For example, 100 per cent means that all the respondents have answered that the functional area has a
high contribution (i.e. 4 on the scale) to a certain business value dimension. And 25 per cent, in turn,
means that the average contribution according to respondents is 1 (as 1 out of 4 equals 25 per cent).
As an example, if out of four respondents one of them answers 0, two of them answer 1, and one of
them answers 2, the average contribution is 1, i.e. the equivalent of 25 per cent. The quality impact on
business value contribution is expressed in the same way, i.e. as a percentage of the maximum impact
that a quality dimension can have.
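The mapping from the interview scale to percentages can be written out as a short Python sketch; the answers reproduce the worked example above (0, 1, 1 and 2), and the 0–4 answer scale is the only assumption carried over from the method description.

    # Average respondent answer on the 0-4 scale, expressed as a percentage of the
    # maximum score (4); reproduces the worked example in the text above.
    MAX_ANSWER = 4

    def contribution_percent(answers):
        return 100 * (sum(answers) / len(answers)) / MAX_ANSWER

    print(contribution_percent([0, 1, 1, 2]))   # 25.0 (average answer 1 out of 4)
    print(contribution_percent([4, 4, 4, 4]))   # 100.0 (all respondents answered 4)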

See Figure 35 below for an overview of the strongest business value contribution for the evaluated
functional areas for Distribution Sweden and Hydro Power Sweden combined. Please note that connections
with strength less than 75 per cent of the maximum contribution have been left out.


Figure 35 – Strongest business value contribution from functionality for the evaluated business units
combined (diagram mapping functional areas to business value dimensions; connections weaker than
75 per cent of the maximum contribution are omitted, as noted above)

See Figure 36 and Figure 37 below for detailed business value contribution results for Distribution
Sweden and Figure 38 and Figure 39 for the results for Hydro Power Sweden, separately. Also, see Figure
40 and Figure 41 for the two business units combined. Please note that a very high contribution (i.e.
90-100 per cent) is marked with blue, and that high contribution (i.e. 75-89 per cent) is marked with
red. Also note that there is too little data for the ranking of quality dimensions to be presented (as it
was introduced into the method at a later stage).

Functional Areas (all numbers are in per cent)

Business Value Dimensions AIP MAI CON EINV FRD DSP SCHD GINV SCM PRM DOC SWA TCM Avg.
Decision making 100 100 100 100 100 100 100 100 100 92 83 83 75 81%
Differentiations in products/services 67 33 8 8 8 17 25 25 17 17 17 25 17 19%
Efficiency 58 25 17 17 25 50 58 83 92 100 100 92 75 68%
Flexibility 25 17 8 0 8 25 50 67 58 58 33 33 25 35%
Flow of products/services 25 25 17 17 42 67 58 58 42 42 25 17 17 31%
Change management 58 42 42 50 50 42 25 42 50 67 50 42 50 43%
Information 75 83 83 83 75 83 83 75 75 83 92 92 92 82%
Lock-in effect/switching costs 0 0 0 0 0 0 0 0 0 0 0 0 0 1%
Integration and coordination 42 25 33 42 67 67 67 58 58 50 67 58 83 52%
Communication 25 50 75 92 75 50 25 33 42 67 58 50 42 43%
Control and follow up 58 75 83 100 92 92 83 92 92 83 75 58 67 65%
Organizational learning and knowledge 75 67 67 58 42 25 17 33 50 83 67 58 42 43%
Quality of products/services 50 83 75 58 33 8 8 8 17 33 42 58 50 44%
Deliveries 50 67 100 75 58 42 33 17 0 17 17 33 25 31%
Technology/tools 8 42 33 33 0 0 0 0 0 0 0 8 17 7%
Inbound logistics 25 58 33 33 25 33 42 42 42 58 33 42 17 27%
Cost reductions 42 75 75 67 42 42 58 83 75 92 75 58 25 57%
New products/services 0 0 8 8 8 0 0 0 0 0 0 0 0 6%
Organizational culture 33 42 25 25 17 25 25 33 33 67 42 33 0 28%
Productivity 58 58 75 75 50 33 42 42 50 50 67 67 58 50%
Competitor relations 8 25 17 17 0 0 0 0 17 25 25 8 8 10%
Customer relations 58 67 75 67 58 33 25 8 8 25 25 25 17 51%
Supplier relations 8 33 50 50 42 33 50 58 58 42 17 8 17 33%
Third party relations 25 42 25 33 33 42 50 42 50 25 33 33 42 35%
Strategy formulation and planning 92 100 100 100 75 75 50 50 42 42 33 17 17 53%
Figure 36 – Business value contribution from functionality for Distribution Sweden


Functional Areas (all numbers are in per cent)


Quality Dimensions AIP MAI CON EINV FRD DSP SCHD GINV SCM PRM DOC SWA TCM Avg.
Usability 83 83 92 100 100 100 100 100 92 92 92 92 83 92%
Data quality 83 92 92 92 83 67 67 75 92 92 83 83 92 90%
Information security 75 92 92 92 92 83 83 75 75 75 75 83 92 85%
Interoperability 67 83 92 92 83 83 83 83 75 75 75 83 92 80%
Modifiability 58 75 75 75 58 67 67 75 67 67 67 75 75 68%
Performance 83 92 75 83 83 92 92 83 83 75 83 83 92 86%
Availability and reliability 83 75 75 83 92 92 92 83 75 75 75 75 75 86%
Figure 37 – Quality impact on business value contribution for Distribution Sweden

Functional Areas (all numbers are in per cent)

Business Value Dimensions AIP MAI CON SCM DOC EINV FRD SWA Avg.
Decision making 100 100 100 100 92 92 83 92 90%
Differentiations in products/services 0 0 0 0 0 0 8 8 6%
Efficiency 75 67 58 67 75 75 75 67 72%
Flexibility 33 33 42 33 33 17 25 17 29%
Flow of products/services 33 42 67 75 75 67 58 42 43%
Change management 42 42 58 58 58 42 58 58 56%
Information 83 75 75 75 83 83 75 75 85%
Lock-in effect/switching costs 0 0 0 0 0 0 0 0 0%
Integration and coordination 58 58 58 58 50 58 58 75 56%
Communication 42 42 42 50 58 50 42 25 44%
Control and follow up 75 83 92 92 100 100 100 100 81%
Organizational learning and knowledge 33 33 50 42 58 42 50 25 46%
Quality of products/services 42 42 50 58 75 92 100 67 51%
Deliveries 8 33 50 50 50 58 83 58 29%
Technology/tools 0 0 8 8 33 25 42 17 19%
Inbound logistics 25 33 33 33 17 33 58 83 32%
Cost reductions 33 42 50 58 67 50 58 67 54%
New products/services 0 0 0 0 0 0 8 8 1%
Organizational culture 33 33 42 42 58 50 58 42 35%
Productivity 25 42 50 58 58 58 58 42 47%
Competitor relations 0 8 17 17 17 8 8 0 4%
Customer relations 17 25 25 17 17 17 42 25 21%
Supplier relations 8 25 58 83 75 50 42 42 43%
Third party relations 17 17 33 25 42 25 42 17 29%
Strategy formulation and planning 92 83 75 67 58 33 25 33 54%
Figure 38 – Business value contribution from functionality for Hydro
Power Sweden

Functional Areas (all numbers are in per cent)


Quality Dimensions AIP MAI CON SCM DOC EINV FRD SWA Avg.
Usability 83 92 92 92 100 100 100 100 96%
Data quality 67 58 75 75 92 75 83 67 84%
Information security 83 83 75 75 75 75 83 75 79%
Interoperability 67 58 67 67 92 83 83 75 75%
Modifiability 58 58 67 67 83 75 75 58 66%
Performance 67 75 75 83 100 92 83 75 79%
Availability and reliability 67 75 67 75 83 100 100 100 79%
Figure 39 – Quality impact on business value contribution for
Hydro Power Sweden


Functional Areas (all numbers are in per cent)


Business Value Dimensions AIP MAI CON EINV FRD DSP SCHD GINV SCM PRM DOC SWA TCM Avg.
Decision making 100 97 95 71 75 100 100 58 25 25 63 92 100 83%
Differentiations in products/services 33 9 10 17 8 0 0 8 25 13 0 25 50 16%
Efficiency 67 78 45 75 67 100 100 75 75 25 88 92 75 69%
Flexibility 29 44 15 29 17 75 0 50 25 13 50 42 63 33%
Flow of products/services 29 59 50 29 50 50 50 17 50 13 50 25 25 38%
Change management 50 56 55 58 42 75 50 33 25 25 38 58 38 48%
Information 79 78 75 96 83 75 50 83 75 88 88 83 100 83%
Lock-in effect/switching costs 0 0 0 0 0 0 0 0 0 13 0 0 0 1%
Integration and coordination 50 59 65 58 50 75 50 25 25 50 50 92 50 55%
Communication 33 44 60 46 33 25 50 25 75 25 38 67 25 44%
Control and follow up 67 94 95 63 92 0 50 25 75 38 25 75 88 70%
Organizational learning and knowledge 54 47 40 50 33 25 25 17 50 13 25 50 50 42%
Quality of products/services 46 41 60 46 67 25 50 17 25 25 13 75 100 48%
Deliveries 29 34 70 25 33 0 50 8 25 25 0 17 63 34%
Technology/tools 4 13 10 21 8 0 0 0 50 0 0 17 0 12%
Inbound logistics 25 34 45 29 25 0 50 0 100 0 0 17 50 30%
Cost reductions 38 72 45 46 58 50 75 50 75 25 63 92 63 57%
New products/services 0 0 10 0 17 25 0 8 25 0 13 0 13 5%
Organizational culture 33 41 30 21 25 50 75 33 25 13 13 25 13 31%
Productivity 42 53 50 63 58 75 75 50 25 13 50 50 38 51%
Competitor relations 4 16 0 8 0 0 0 8 50 0 13 8 13 9%
Customer relations 38 19 50 17 25 75 100 58 25 50 25 100 100 43%
Supplier relations 8 66 40 17 42 0 50 17 100 0 38 42 63 36%
Third party relations 21 41 30 33 33 50 50 42 25 50 38 25 38 35%
Strategy formulation and planning 92 59 50 17 42 50 75 50 75 13 13 58 88 53%
Figure 40 – Business value contribution from functionality for Distribution Sweden and Hydro Power
Sweden combined

Functional Areas (all numbers are in per cent)


Quality Dimensions AIP MAI CON EINV FRD DSP SCHD GINV SCM PRM DOC SWA TCM Avg.
Usability 83 96 100 92 100 100 100 100 75 75 100 92 100 93%
Data quality 75 79 80 92 92 75 100 100 100 100 88 100 88 88%
Information security 79 79 85 79 67 75 100 75 100 100 100 92 88 83%
Interoperability 67 79 80 92 75 50 50 88 75 88 88 67 88 78%
Modifiability 58 75 60 75 67 50 50 88 50 63 50 83 88 69%
Performance 75 86 80 92 83 75 75 100 100 88 88 83 100 85%
Availability and reliability 75 82 95 75 58 100 100 100 100 88 100 92 100 84%
Figure 41 – Quality impact on business value contribution for Distribution Sweden and Hydro Power
Sweden combined

Respondents
7 respondents from Distribution Sweden and 7 from Hydro Power Sweden participated in data collection
interviews. Also, one respondent from Vattenfall Service was involved to complement the results.
Altogether, 19 interviews have been carried out, of which 10 were in person and 9 over the phone (see
Appendix 9 for details on interviews and respondents). These interviews resulted in the coverage of
the functional areas shown in Figure 42 below.


Functional Area                                    Distribution Sweden    Hydro Power Sweden
Asset Investment Planning (AIP) 3 3
Maintenance and Inspection (MAI) 4 4
Construction and Design (CON) 3 2
Trouble Call Management (TCM)4 2 N/a
Supply Chain and Logistics (SCM) 1 1
Premises (PRM)5 2 -
Geographical Inventory (GINV) 3 -
Document Management (DOC) 2 3
Substation and Network Inventory (EINV) 4 2
Switch Action Scheduling/Work Scheduling (SWA) 2 1
Field Recording and Design (FRD) 2 1
Work Dispatch (DSP)6 1 N/a
Work Scheduling (SCHD) 1 N/a
Figure 42 – Number of respondents interviewed per functional area for the assessed business
units

Please note that for the functional areas where the coverage is low (especially with only one respondent),
the variations in the results are larger and the results are discrete. For example, if only one
respondent has been interviewed, the results jump in 25 per cent increments (i.e. from 0 per cent to 25
per cent, from 25 per cent to 50 per cent, and so forth).

Differences between Functional Areas

Business Value Contribution from Functionality


The overall business value contribution from most functional areas is relatively similar, i.e. when the
contribution to all of the business value dimensions for each functional area is added up (see Figure
36 and Figure 38 above). There are, however, some functional areas that overall contribute with less
business value than others, for example Geographical Inventory, Premises and Document Management. The
real differences between functional areas are that their contribution is focused on different business
value dimensions. For example, the functional area Asset Investment Planning contributes a great deal to
decision making (100 per cent for both assessed business units) as well as strategy formulation and planning
(92 per cent for both), whereas the functional area Substation and Network Inventory contributes the most
to information (94 per cent and 100 per cent respectively) and efficiency (81 per cent and 63 per cent
respectively).

Quality Impact on Business Value Contribution


The impact of the different quality dimensions is very similar regardless of which functional area
is evaluated (see Figure 37 and Figure 39 above). For example, usability is consistently ranked as one
of the most important non-functional factors in terms of the effect it has on the business value
contribution (92 per cent and 96 per cent on average for the assessed business units), and modifiability
is frequently ranked as the least important (68 per cent and 66 per cent respectively).

4 The functional area Trouble Call Management is not part of the functional reference model for Hydro
Power Sweden, as this business unit does not have any contact with the end customer. Neither is Work
Dispatch part of the model for Hydro Power Sweden.
5 The functional area Premises is not used very much at Hydro Power Sweden, according to the
respondents that have been interviewed. Thus, it has been difficult to find any suitable respondents
for it. The same is true for the functional area Geographical Inventory.
6 The functional area Work Dispatch is performed mainly by Vattenfall Service, but has for the purpose
of this thesis been placed under Distribution Sweden, in order to cover the entire functional reference
model. The same is true for Work Scheduling.


Differences between Assessed Business Units7

Business Value Contribution from Functionality


There are two main ways that the results of the two business units can be compared:

1) One way is by contrasting the average contribution to the business value dimensions
between the business units (see Figure 43 below). This is done by comparing one separate
business value dimension at a time. For a business value dimension, the average contribution
from every functional area is contrasted between the business units. For example, if the
average value for decision making is 81 per cent for one business unit and it is 90 per cent for
the other business unit, the difference is 9 per cent. This difference is calculated for each
business value dimension and an average difference can be computed for the business units.
Performing this comparison, there is an average difference of 7 per cent between the
business units. Please note that the difference includes values from all respondents for a certain
business value dimension (see the rightmost column in Figure 36 and Figure 38), and not just
from the functional areas listed in the figure.

Functional Areas (all numbers are in per cent)


Business Value Dimensions AIP MAI CON SCM DOC EINV FRD SWA Avg. difference
Decision making 0 6 13 75 29 13 0 13 9%
Differentiations in products/services 67 25 4 25 0 25 13 75 13%
Efficiency 17 15 50 25 13 19 13 13 4%
Flexibility 8 35 17 25 8 25 13 13 5%
Flow of products/services 8 10 21 50 42 31 0 38 11%
Change management 17 15 13 25 4 25 88 25 13%
Information 8 0 0 0 4 6 25 25 3%
Lock-in effect/switching costs 0 0 0 0 0 0 0 0 1%
Integration and coordination 17 2 4 75 25 31 38 13 3%
Communication 17 17 38 75 13 13 25 13 1%
Control and follow up 17 2 8 25 33 0 13 0 16%
Organizational learning and knowledge 42 17 4 50 58 19 13 0 3%
Quality of products/services 8 60 67 25 21 13 63 0 8%
Deliveries 42 40 29 25 0 19 50 50 2%
Technology/tools 8 25 25 50 8 25 25 50 12%
Inbound logistics 0 10 50 0 0 6 38 50 5%
Cost reductions 8 21 8 25 29 6 25 13 3%
New products/services 0 0 4 25 13 0 25 0 5%
Organizational culture 0 17 33 0 21 13 0 0 7%
Productivity 33 21 0 0 25 0 13 75 3%
Competitor relations 8 19 0 50 13 13 0 13 6%
Customer relations 42 17 21 25 17 25 38 0 30%
Supplier relations 0 17 4 50 4 13 25 88 10%
Third party relations 8 4 8 25 13 6 25 38 6%
Strategy formulation and planning 0 13 63 0 38 6 25 63 1%
Average difference 15% 16% 19% 30% 17% 14% 24% 27% 20% 7%
Figure 43 – Average difference in contribution to business value dimensions between business units.

2) The second way of comparing the business units is by contrasting the average
contribution from the functional areas between business units (see Figure 44 below). This
is done by comparing one separate functional area at a time. For a functional area, the
average contribution to every business value dimension is contrasted between the business
units. For example, if the average contribution for the functional area Maintenance and
Inspection for the business value dimension decision making is 100 per cent for one business
unit, and 94 per cent for the other business unit, the difference is 6 per cent. This difference
is calculated for each business value dimension and an average difference can be computed
for the business units for that particular functional area. This is repeated for all functional
areas and the average of these differences is the average difference between the business
units. Performing this comparison, there is an average difference of 20 per cent between
the business units. Please note, however, that the difference is lower if only functional areas
where at least two respondents have been interviewed are included in the comparison. The
average difference in this case is 16 per cent. To make this point clearer, the average
difference for functional areas where only one respondent has been interviewed is 27 per
cent. (A computational sketch of both comparisons is given after Figure 44 below.)

7 Please note that the comparison is limited to the functional areas where at least one respondent has
been interviewed at both business units.

Functional Areas (all numbers are in per cent)


Business Value Dimensions AIP MAI CON SCM DOC EINV FRD SWA Avg. difference
Decision making 0 6 13 75 29 13 0 13 9%
Differentiations in products/services 67 25 4 25 0 25 13 75 13%
Efficiency 17 15 50 25 13 19 13 13 4%
Flexibility 8 35 17 25 8 25 13 13 5%
Flow of products/services 8 10 21 50 42 31 0 38 11%
Change management 17 15 13 25 4 25 88 25 13%
Information 8 0 0 0 4 6 25 25 3%
Lock-in effect/switching costs 0 0 0 0 0 0 0 0 1%
Integration and coordination 17 2 4 75 25 31 38 13 3%
Communication 17 17 38 75 13 13 25 13 1%
Control and follow up 17 2 8 25 33 0 13 0 16%
Organizational learning and knowledge 42 17 4 50 58 19 13 0 3%
Quality of products/services 8 60 67 25 21 13 63 0 8%
Deliveries 42 40 29 25 0 19 50 50 2%
Technology/tools 8 25 25 50 8 25 25 50 12%
Inbound logistics 0 10 50 0 0 6 38 50 5%
Cost reductions 8 21 8 25 29 6 25 13 3%
New products/services 0 0 4 25 13 0 25 0 5%
Organizational culture 0 17 33 0 21 13 0 0 7%
Productivity 33 21 0 0 25 0 13 75 3%
Competitor relations 8 19 0 50 13 13 0 13 6%
Customer relations 42 17 21 25 17 25 38 0 30%
Supplier relations 0 17 4 50 4 13 25 88 10%
Third party relations 8 4 8 25 13 6 25 38 6%
Strategy formulation and planning 0 13 63 0 38 6 25 63 1%
Average difference 15% 16% 19% 30% 17% 14% 24% 27% 20% 7%
Figure 44 – Average difference in contribution from functional areas between business units.
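Both comparison procedures can be summarized in a short computational sketch. The matrices below are small hypothetical examples (rows are business value dimensions, columns are functional areas) standing in for the full matrices of Figure 36 and Figure 38; the identical logic applies to the quality impact comparison in the next subsection.

    import numpy as np

    # Hypothetical 2x3 contribution matrices for two business units
    # (rows = business value dimensions, columns = functional areas).
    bu1 = np.array([[100, 90, 80],
                    [ 40, 50, 60]], dtype=float)
    bu2 = np.array([[ 90, 95, 85],
                    [ 30, 45, 80]], dtype=float)

    # Comparison 1: per business value dimension, average over the functional areas
    # first, then take the absolute difference between the business units.
    per_dimension_diff = np.abs(bu1.mean(axis=1) - bu2.mean(axis=1))
    print(f"Comparison 1 average difference: {per_dimension_diff.mean():.1f}%")

    # Comparison 2: per functional area, average the cell-wise absolute differences
    # over the business value dimensions, then average over the functional areas.
    per_area_diff = np.abs(bu1 - bu2).mean(axis=0)
    print(f"Comparison 2 average difference: {per_area_diff.mean():.1f}%")

Running the sketch prints 0.8 per cent for comparison 1 and 9.2 per cent for comparison 2. Because comparison 1 averages within a dimension before differencing, deviations in opposite directions can cancel out, whereas comparison 2 takes cell-wise absolute differences; this partly explains why the second comparison tends to yield a larger average difference.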

Quality Impact on Business Value Contribution


The same reasoning as above applies to the comparison of quality impact. The business value
dimensions are simply replaced by the quality dimensions, and the contribution by the quality impact
on the contribution. Hence the following:

1) One way is by contrasting the average impact of the quality dimensions between the
business units (see Figure 45 below). This is done by comparing one separate quality
dimension at a time. For a quality dimension, the average impact on every functional area is
contrasted between the business units. For example, if the average value for usability is 92 per
cent for one business unit and it is 96 per cent for the other business unit, the difference is 4
per cent. This difference is calculated for each quality dimension and an average difference
can be computed for the business units. Performing this comparison, there is an average
difference of 5 per cent between the business units. Please note that the difference includes
values from all respondents for a certain quality dimension (see the rightmost column in
Figure 37 and Figure 39), and not just from the functional areas listed in the figure.

Functional Areas (all numbers are in per cent)


Quality Dimensions AIP MAI CON SCM DOC EINV FRD SWA Avg. difference
Usability 0 8 0 25 0 13 0 13 4%
Data quality 17 0 8 50 13 13 13 0 6%
Information security 8 0 17 25 8 25 25 13 6%
Interoperability 0 17 8 0 13 13 38 63 5%
Modifiability 0 8 4 0 0 0 13 25 2%
Performance 17 0 8 25 21 13 25 13 7%
Availability and reliability 17 8 8 0 25 0 13 13 7%
Average difference 8% 6% 8% 18% 11% 11% 18% 20% 12% 5%
Figure 45 – Average difference in quality impact from quality dimensions between business units.

2) The second way of comparing the business units is by contrasting the average impact on
the functional areas between business units (see Figure 46 below). This is done by
comparing one separate functional area at a time. For a functional area, the average impact
from every quality dimension is contrasted between the business units. For example, if the
average impact for the functional area Maintenance and Inspection from the quality dimension
usability is 100 per cent for one business unit, and 92 per cent for the other business unit, the
difference is 8 per cent. This difference is calculated for each quality dimension and an
average difference can be computed for the business units for that particular functional area.
This is repeated for all functional areas and the average of these differences is the average
difference between the business units. Performing this comparison, there is an average
difference of 12 per cent between the business units. Please note, however, that the
difference is lower if only functional areas where at least two respondents have been
interviewed are included in the comparison. The average difference in this case is 9 per cent.
To make this point clearer, the average difference for functional areas where only one
respondent has been interviewed is 18 per cent.

Functional Areas (all numbers are in per cent)


Quality Dimensions AIP MAI CON SCM DOC EINV FRD SWA Avg. difference
Usability 0 8 0 25 0 13 0 13 4%
Data quality 17 0 8 50 13 13 13 0 6%
Information security 8 0 17 25 8 25 25 13 6%
Interoperability 0 17 8 0 13 13 38 63 5%
Modifiability 0 8 4 0 0 0 13 25 2%
Performance 17 0 8 25 21 13 25 13 7%
Availability and reliability 17 8 8 0 25 0 13 13 7%
Average difference 8% 6% 8% 18% 11% 11% 18% 20% 12% 5%
Figure 46 – Average difference in quality impact on functional areas between business units.

Differences between Business Value Dimensions


The business value contribution differs between business value dimensions in the sense that certain
dimensions receive high or low values consistently. For example, most functional areas contribute
highly to information, decision making, efficiency and control and follow up (see Figure 47 below for values). In
contrast, most functional areas contribute little to, for example, lock-in effect/switching costs, new
products/services, technology/tools, competitor relations and differentiations in products/services (see Figure 48 below
for values).

Business Value Dimension Distribution Sweden Hydro Power Sweden


Information 82% 85%
Decision making 81% 90%
Efficiency 68% 72%
Control and follow up 65% 81%
Figure 47 – Business Value Dimensions with High Contribution

Business Value Dimension Distribution Sweden Hydro Power Sweden


Lock-in effect/switching costs 1% 0%
New products/services 6% 1%
Technology/tools 7% 19%
Competitor relations 10% 4%
Differentiations in products/services 19% 6%
Figure 48 – Business Value Dimensions with Low Contribution

Differences between Quality Dimensions


All quality dimensions have a relatively high impact on the business value contribution, the highest
being usability (92 per cent and 96 per cent on average for the assessed business units), and the lowest
being modifiability (68 per cent and 66 per cent respectively).

Uncertainty in Results
There is lower uncertainty and smaller differences between the assessed business units for functional
areas where more respondents have been interviewed. As an example, for functional areas where only
one respondent has been interviewed for either business unit, the average difference is 27 per cent
for business value contribution, and 18 per cent for quality impact. For functional areas where at least
two respondents have been interviewed for both business units, on the other hand, the average
difference is only 16 per cent for business value contribution, and 9 per cent for quality impact (see
Differences between Assessed Business Units above).


7 Validity and Reliability


7.1 Validity
Validity refers to whether what is intended to be evaluated is actually evaluated [3]. In the context of
this thesis, validity refers to whether the business value of IT systems is evaluated. Validity can
furthermore be analyzed in terms of Construct Validity and External Validity, which concern the correctness
of the operational measures and the generalizability of the results and conclusions, respectively (see
below for more details) [27].

7.1.1 Construct Validity


Construct validity refers to the principle that “subjective” judgments should not be used in the data
collection. Subjectivity is thought to be reduced by means of a sufficiently operational set of measures
when collecting data [27].

Business Value Prioritization


In this thesis, the business value prioritization is performed by assessing what managers in the assessed
business unit consider to be of most value to their organization (see section 5.2.1). The operational
measures correspond to the business value dimensions in the prioritization (see section 3.2). They
have been extracted through an extensive literature study, which resulted in a number of benefits from
IS/IT that were organized, grouped and finally categorized into the business value dimensions
used in this thesis [12]. As there are 25 resulting business value dimensions, each concerns a specific
area of the business and has to be considered detailed in that regard.

Of course, the evaluation of what the respondent “considers” to be of most value to the organization
is in itself subjective (value is by definition a subjective term). However, all the respondents have
access to the same business value dimensions, and hence the subjective judgment is limited to how
they prioritize the dimensions. The judgment is not subjective in terms of what the business value
dimensions mean, as it is clearly stated in the explanations and examples.

Business Value Contribution


The business value contribution is evaluated in two steps. First, the business value contribution from
functionality is identified, and then the quality impact on this contribution is established (see section
5.2.2). The operational measures for contribution from functionality are the functionality specified in
the functional reference model on the one hand (see section 3.1), and the business value dimensions on the
other hand (see section 3.2). The functional reference model breaks down the business into clearly
defined functional areas, which in turn are broken down into two layers of smaller components, i.e.
abstract components and measurement points. For reasons related to resource requirements and
practicality, the evaluation for business value contribution is carried out on a higher functional level,
close to the sub-function level (see section 5.3.2 for justifications). Thanks to the more detailed
breakdown, however, the functional areas that are evaluated have been thoroughly researched and the
descriptions of them are therefore to the point and very relevant. The business value dimensions are
also detailed enough (see Business Value Prioritization above). For the quality impact, the operational
measures are the quality dimensions described in section 3.3. The level of detail for them is a trade-off
between being too detailed and too general (see section 5.3.2 for justifications), but the explanations
are designed to be understood in the same way by all respondents.

Similarly to the business value prioritization, the evaluation of how much business value the functional
areas contribute with is in itself somewhat subjective. But, as mentioned for the prioritization, all the
respondents have access to the same functional area descriptions and the same business value
dimensions, and thus the subjective judgment is limited to how they value the contribution.
Consequently, the evaluation is not subjective in terms of what the functional areas and business value
dimensions mean, as it is clearly stated in the explanations and examples. The same is true for the
quality impact, i.e. the evaluation is not subjective in terms of what the quality dimensions mean, as it
is clearly stated in the explanations.

7.1.2 External Validity


External validity concerns whether a study’s results and conclusions are generalizable beyond the
immediate case study [27]. Replicating a study’s findings in other domains provides support to
generalize a specific set of results to a more general theory [27]. One of the overall objectives of this
thesis is to develop a method for evaluating business value of IT systems (see section 2.2). The
objective is expressed in general terms since the method is thought to be applicable to any domain for
which a functional reference model can be compiled. In this thesis two separate case studies have
been conducted in order to validate that the method is generalizable to different domains, i.e. to the
electric distribution domain and to the hydro power domain (see section 6.2).

7.2 Reliability
In order for a method to be reliable, it needs to be repeatable. More specifically, two investigators
following the same procedures and conducting the same case study should arrive at the same results
and conclusions [27]. In order to make procedures and case studies repeatable, they need to be
carefully documented [27]. The method described in this thesis is structured in a three-step model (see
section 5.1). Each of these steps is described in detail to ensure that other people are able to repeat the
same procedures (see section 5.2). Furthermore, the prioritization is performed using a web-based
software tool, which can be used in exactly the same way repeatedly. In addition, the interviews in the case
study are conducted according to structured interview manuals (see section 5.2.2), and structured
evaluation forms are used for registration of the case study results (see Appendix 5, Appendix 6 and
Appendix 7 for forms used). All interviews have also been documented with regard to what was
evaluated, how the interview was conducted (in person or phone interview), the respondent profile,
and which credibility assessments were registered for the respondent at the time of the interview (see
sections 6.1.1 and 6.1.2).


8 Future Recommendations
The following chapter presents recommendations on how to further improve the method developed
in this thesis (phase IV of the ITIEM), as well as the overall ITIEM.

8.1 Recommendations for Phase IV of the IT Investment Evaluation Method

The following sections provide recommendations specific to the phase of ITIEM that this thesis
covers, i.e. phase IV.

8.1.1 Reduced Number of Business Value Dimensions


During the data collection a common perception amongst respondents was that there are too many
business value dimensions, and that some of them are hard to distinguish from each other (see section
6.1.2). It might be possible to reduce their number, using the following criteria:
• Low Business Value Prioritization at Comparable Business Unit
If certain business value dimensions are prioritized low, it might indicate that they are not
applicable for the assessed business unit. For example, in the case study of this thesis, the
dimensions competitor relations, lock-in effect/switching costs, differentiations in products/services and new
products/services were ranked low at both the assessed business units (see section 6.2.1). The
similar results for both business units indicate that a low prioritization might be generalized
to comparable types of business units.
• Low Business Value Contribution at Comparable Business Unit
If certain business value dimensions generally receive low business value contribution from
functional areas, it might indicate that they are not applicable for the assessed business unit.
For example, in the case study of this thesis, the dimensions lock-in effect/switching costs, new
products/services, technology/tools, competitor relations and differentiations in products/services received
low contribution (see section 6.2.2). The similar results for both business units indicate that a
low business value contribution might be generalized to comparable types of business units.
• Lack of Distinction between Business Value Dimensions
If respondents have trouble telling the difference between certain business value dimensions,
it might be better to combine these dimensions. In the case study of this thesis, respondents
had difficulty distinguishing between the dimensions flow of products/services and integration and
coordination, as well as making a distinction between efficiency and productivity. There might be a
possibility to combine these pairs into one dimension each.

There is also a possibility of combining several of the criteria above, for example the low priority with
the low business value contribution. In the case study of this thesis the two happen to coincide, which
is an even stronger indication that the dimensions can be reduced for comparable types of business units.

8.1.2 Examples for Business Value Dimensions Adapted to Evaluated Domain


The current explanations of the business value dimensions are of a fairly general character. In order to
make it easier for respondents to understand what the dimensions stand for, the explanations could be
more adapted to the domain that is being evaluated (such as the utilities domain). Also, more
examples could be added that use terms and expressions understood by respondents.

If the instructions and examples were more detailed and more closely adapted to the domain of the
respondents, it might be possible to send out questionnaire forms to people instead of conducting
interactive interviews. This might be an appropriate approach, if the aim is a fast large coverage of the
functional areas. The credibility and consistency of the answers might be slightly lower, as the
interviewer cannot instruct the respondents in the same way as at a regular interview, but this is
balanced with the larger number of respondents participating, i.e. the higher coverage. See section
5.1.3 for further details on coverage, consistency and credibility.

8.1.3 Re-use of Business Value Contribution at Comparable Business Unit


The results of the business value contribution might to some degree be re-used at a comparable
business unit, provided that certain conditions are fulfilled. In the case study of this thesis for
example, the results from the two assessed business units are relatively similar, which indicates that the
results from one business unit might be used as a basis for the evaluation of the other business unit.
However, the re-use requires two conditions to be fulfilled:
• Comparable Functional Reference Models
The functional reference models have to be comparable for the business units. This is
because the business value contribution directly depends on which type of functionality
is being evaluated, and the functionality is explicitly specified in the functional reference
model. In the case study of this thesis, the functional reference model for Hydro Power Sweden
was a sub-set of the model for Distribution Sweden (see Appendix 1 and Appendix 2).
• Completed Business Value Prioritization
The business value prioritization has to be completed at the business unit that plans to re-use
contribution results from another business unit. This is because the prioritization might be
used to highlight potential differences between the business units – differences that could be
a result of different strategic goals and different business conditions for the respective
business units. In the case study of this thesis for example, the business value dimension
customer relations was prioritized much higher at one of the assessed business units (see section
6.2.1). As it turned out, the biggest difference between the business units regarding business
value contribution was also related to the dimension customer relations (see section 6.2.2). This
suggests that the prioritization might be a possible indicator for potential differences in
business value contribution for the business units. Please note that it is important to
especially analyze the dimensions that are prioritized the highest, as they have the biggest
impact on the end business value result when business value prioritization and contribution
are aggregated.

8.2 General Recommendations for the IT Investment Evaluation Method

The following sections provide general recommendations for the overall ITIEM project, and are
applicable to all the phases in the ITIEM.

8.2.1 Clearly Delimit Scope


The scope of the evaluation has to be clearly delimited, to ensure a focused effort on clearly
distinguishable IT system scenarios. A prerequisite to applying the method is to know which
department within the business unit is to be evaluated and which areas of responsibility the
respondents need to have.

8.2.2 Interest and Commitment from Business Unit


Interest and commitment from the assessed business unit is key in order to facilitate the selection and
booking of respondents and to receive high quality answers. This depends to a large extent on the
situation of the business unit, for example whether it is in the process of replacing existing systems or
has other incentives to participate. Regardless, participation is greatly facilitated by close contact
with at least one interested and committed person in a management position, as this makes it easier to
use a top-down strategy for accessing the rest of the business unit and for identifying other useful
people in the organization.

8.2.3 Telephone Interviews


Thanks to the structured and primarily quantitative interviews it is possible to perform telephone
interviews. This type of interview saves both time and money, mainly in the form of reduced travelling
time and travelling costs. It also facilitates the booking of respondents, as it is possible to plan the
interviews closer together and to carry out several interviews in a row during the same day from the
same office.

There is potential to make the method even faster and more convenient. If questionnaires were made
more self-explanatory with additional examples, it might be possible to send out questionnaires to the
respondents and have them fill in the questionnaires on their own. A shorter follow-up could then
be held with respondents afterwards in order to discuss and resolve any problems, as well as to validate
the answers. See section 8.1 above for more detailed suggestions for improving the
questionnaires. Another possibility might be to design web-based questionnaires with more interactive
examples, where the answers could go directly into the database used for calculating the business value
of IT system scenarios in ITIEM.

8.2.4 Understanding and Communicating the ITIEM Frameworks


One of the most important elements of the ITIEM is to thoroughly understand the frameworks used
in the ITIEM and to communicate them to respondents. The most central framework for the method
is the functional reference model, which forms the backbone of the evaluation and can be very
extensive, covering a number of functional areas. In order to fully understand it, previous knowledge
within the domain in question (such as the utilities industry and maintenance domain) is preferable.
For example, knowing the terms and expressions used by persons in the domain helps both in
understanding the model and in communicating its concepts to respondents. Also, knowing enough about
the operations of the assessed organization to be able to give relevant examples helps in
communicating the method to respondents. Regardless, the interviews should be limited to the
relevant parts of the functional reference model, as respondents can otherwise easily be overwhelmed
by all the different functional areas. Generally, theoretical terms like abstract component and sub-
function should be avoided when talking to respondents.


9 References
[1] Anderssen H. (1994) Vetenskapsteori och metodlära – En introduktion. Studentlitteratur. Lund.

[2] Bass L., Clements P., Kazman R. (1998) Software Architecture in Practice, Addison Wesley Longman.

[3] Davidsson B. & Patel R. (1991) Forskningsmetodikens grunder – Att planera, genomföra och rapportera en
undersökning. Studentlitteratur. Lund.

[4] DoD Architecture Framework Working Group, DoD Architecture Framework Version 1.0,
Department of Defense, 2003.

[5] Ekstedt M. (2004) Enterprise Architecture as a Means for IT Management. EARP Working Paper
2004-02, Department of Industrial Information and Control Systems. KTH.
Available: http://www.ics.kth.se/Publikationer/Working%20Papers/EARP-WP-2004-02.pdf
[accessed June 28, 2006]

[6] Gammelgård M. et al. A Reference Model for IT Management Responsibilities. Department of Industrial
Information and Control Systems. KTH.

[7] Gammelgård M. (2006) Credibility Assessment in the ITIEM Project. Working Paper, Department of
Industrial Information and Control Systems. KTH.

[8] Gammelgård M. and Ekstedt M. (2006) Dimensions of Benefits from IT/IS. EARP Working Paper
MG101, Department of Industrial Information and Control Systems. KTH.
Available:
http://www.ics.kth.se/Publikationer/Working%20Papers/EARP%20Working%20Paper%20Series%20MG101.pdf
[accessed June 21, 2006]

[9] Gammelgård M. et al. (2005) IT Investment Evaluation Method Project Plan. Department of Industrial
Information- and Control Systems. KTH.

[10] Gammelgård M. et al. (2006) Business Value Evaluation of IT Systems: Developing a Functional Reference
Model. In Proceedings at the Conference on Systems Engineering Research, April 2006.
Department of Industrial Information and Control Systems. KTH.

[11] Gammelgård M. and Ekstedt, M. (2006) The Total Quality ATD. Working Paper, Department of
Industrial Information and Control Systems. KTH.

[12] Gammelgård M. et al. (2006) A Categorization of Benefits from IS/IT Investments. Department of
Industrial Information and Control Systems. KTH.

[13] Gammelgård M., Ekstedt M. and Närman P. (2006) Enterprise Architecture Scenario Analysis –
Estimating the Credibility of the Results. Submitted to the 40th Annual Hawaii International
Conference on System Sciences (HICSS’07).

[14] International Standard IEC 61968-1 – Application integration at electrical utilities – System interfaces for
distribution management – Part 1: Interface Architecture and General Requirements, Technical committee
57, work group 14, 2003.

[15] Johansson E., Ekstedt M. and Johnson P. (2006) Assessment of Enterprise Information Security – The
Importance of Information Search Cost, Proceedings of the 39th Annual Hawaii International
Conference on System Sciences (HICSS’06) Track 9, 2006.

[16] Johnson M. (2006) A Method for Measuring Technical Quality of IT Systems - A Case Study on Work and
Maintenance Applications within Asset Management at Vattenfall AB. Master thesis, Department of
Industrial Information- and Control Systems. KTH.


[17] Johnson, P. et al. (2004) Using Enterprise Architecture for CIO Decision-Making: on the Importance of
Theory, Proceedings of the Conference on Systems Engineering Research, Los Angeles, USA,
2004.

[18] Johnson P. (2006) Credibility Assessment Methodology. Working Paper, Department of Electrical
Engineering, division of Industrial Information- and Control Systems. KTH.

[19] Närman P. (2006) A Functional Reference Model For Work- And Maintenance Applications Within Asset
Management At Vattenfall. Master thesis, Department of Industrial Information- and Control
Systems. KTH.

[20] Maier M. W. and Rechtin E. (2000) The Art of Systems Architecting, 2nd ed., CRC Press.

[21] Preece J. et al. (2002) Interaction Design: beyond human-computer interaction. Wiley. U.S.

[22] Saaty T.L. (1980) The Analytic Hierarchy Process, McGraw-Hill, Inc.

[23] Schekkerman J. (2004) How to survive in the jungle of Enterprise Architecture Frameworks, Trafford.

[24] Sowa J. and Zachman J. (1992) Extending and Formalizing the Framework for Information Systems
Architecture, IBM Systems Journal, Vol. 31, No 3, 1992.

[25] Telelogic AB (2006) Focal Point Product Information. [Online]
Available: http://www.focalpoint.se [accessed June 21, 2006].

[26] TOG, The Open Group (2002), The Open Group Architectural Framework Version 8, The Open
Group.

[27] Yin R. (2003) Case Study Research: Design and Methods - Third Edition. Sage Publications. U.S.

[28] Zachman J. (1987) A Framework for Information Systems Architecture, IBM Systems Journal, Vol. 26,
No 3, 1987.


10 Appendices

Appendix 1 – Functional Reference Model Vattenfall Distribution Sweden
The following appendix presents the functional reference model for Distribution Sweden down to the
abstract component level [19].

Business Function: External to DMS

Supply Chain and Logistics (SCM)
  Procurement
  Contract Management
  Warehouse Logistics
  Materials Management

Premises (PRM)
  Address
  Source substation
  Meter information
  Right of Ways, Easements, Grants
  Real Estate Management

Document Management (DOC)
  Document Management

Business Function: Records and Asset Management

Substation and Network Inventory (EINV)
  Equipment Characteristics
  Connectivity Model
  Substation Display
  Telecontrol Database

Geographical Inventory (GINV)
  Network Displays
  Cartographic Maps

Asset Investment Planning (AIP)
  Maintenance Strategy
  Life Cycle Planning
  Reliability Centered Analysis
  Engineering and design standards
  Performance Measurements
  Performance Indices
  Risk management
  Environmental management
  Decision Support
  Budget Allocation
  Maintain Work Triggers
  Asset Maintenance Groups (lists)
  Asset Failure History
  Asset Financial Performance
  Thermal Ratings of Network Equipment and Lines
  Load Forecast
  Power Flows Computation
  Contingency Analysis
  Short Circuit Analysis
  Optimal Power Flow
  Energy Loss Calculations
  Feeder Voltage Profiles

Business Function: Maintenance and Construction

Maintenance and Inspection (MAI)
  Maintenance Program Management
  Maintain Work Triggers
  Asset Maintenance Groups (lists)
  Manage Inspection Readings
  Asset Maintenance History
  Asset Failure History
  Work Order Status Tracking
  Work Order Closing
  Financial Control
  Refurbishment Processing
  Inspection
  Work Initiation

Construction and Design/Revision (CON)
  Work Initiation
  Work Design
  Work Cost Estimation
  Work Flow Management
  Work Order Status Tracking
  Work Order Closing
  Financial Control
  Project Planning and Scoping
  Phase-In Equipment
  Asset Scrapping
  Phase-Out Equipment

Work Scheduling (SCHD)
  Work Task Planning
  Crew Management
  Equipment Management
  Material Coordination
  Permit Management

Field Recording and Design (FRD)
  Field Design
  Field Inspection Results
  Crew Time Entry
  Actual Materials
  Actual Equipment
  Field status tracking

Work Dispatch (DSP)
  Real Time Communication
  Weather Monitoring

Business Function: Customer Support

Trouble Call Management (TCM)
  Outage Calls
  Power Quality
  Planned Outage Notifications
  Media Communications
  Restoration Projection/Confirmation
  Outage History

Business Function: Operational Planning and Optimisation

Switch Action Scheduling/Work Scheduling (SWA)
  Release/Clearance Remote Switch Command Scheduling
  Customer Outage Analysis and Information


Appendix 2 – Functional Reference Model Vattenfall Hydro Power Sweden
The following appendix presents the functional reference model for Hydro Power Sweden down to the
abstract component level [19].

Business Function: External to DMS

Supply Chain and Logistics (SCM)
  Procurement
  Contract Management
  Warehouse Logistics
  Materials Management

Premises (PRM)
  Real Estate Management

Document Management (DOC)
  Document Management

Business Function: Records and Asset Management

Substation and Network Inventory (EINV)
  Equipment Characteristics

Geographical Inventory (GINV)
  Cartographic Maps

Asset Investment Planning (AIP)
  Maintenance Strategy
  Life Cycle Planning
  Reliability Centered Analysis
  Engineering and design standards
  Performance Measurements
  Risk management
  Environmental management
  Decision Support
  Budget Allocation
  Maintain Work Triggers
  Asset Maintenance Groups (lists)
  Asset Failure History
  Asset Financial Performance

Business Function: Maintenance and Construction

Maintenance and Inspection (MAI)
  Maintenance Program Management
  Maintain Work Triggers
  Asset Maintenance Groups (lists)
  Manage Inspection Readings
  Asset Maintenance History
  Asset Failure History
  Work Order Status Tracking
  Work Order Closing
  Financial Control
  Refurbishment Processing
  Inspection
  Work Initiation

Construction and Design/Revision (CON)
  Work Initiation
  Work Design
  Work Cost Estimation
  Work Flow Management
  Work Order Status Tracking
  Work Order Closing
  Financial Control
  Project Planning and Scoping
  Phase-In Equipment
  Asset Scrapping
  Phase-Out Equipment

Work Scheduling (SCHD)
  Work Task Planning
  Crew Management
  Equipment Management
  Material Coordination
  Permit Management

Field Recording and Design (FRD)
  Field Inspection Results
  Crew Time Entry
  Actual Materials
  Actual Equipment

Business Function: Operational Planning and Optimisation

Switch Action Scheduling/Work Scheduling (SWA)
  Release/Clearance Remote Switch Command Scheduling

Andreas Andersson Industrial Information and Control Systems, KTH

59
Master of Science Thesis 2006-09-14

Appendix 3 – Estimated Resource Requirements for Different Functional Levels
The following calculations relate to the interviews identifying contribution to business value from
functionality. The purpose is to show how the resource requirements change, depending on which
functional level the respondents are interviewed on. Resources in this case refer to respondents and
time.

Key Figures

Number of abstract components:
• Distribution Sweden: 82
• Hydro Power Sweden: 54

For a structural overview of the Functional Reference Model and a list of abstract components, see
Appendix 1 and Appendix 2.

Number of abstract component groups:
• Distribution Sweden: 14 (see footnote 8)
• Hydro Power Sweden: 12 (see footnote 9)

For an overview of the abstract component groups, see Figure 49 below.

Business function: External to DMS
• Supply Chain and Logistics (SCM)
• Premises (PRM)
• Document Management (DOC)

Business function: Records and Asset Management
• Substation and Network Inventory (EINV)
• Geographical Inventory (GINV)
• Asset Investment Planning (AIP)
• Network Calculations (CALC)

Business function: Maintenance and Construction
• Maintenance and Inspection (MAI)
• Construction and Design (CON)
• Work Scheduling (SCHD)
• Field Recording and Design (FRD)
• Work Dispatch (DSP)

Business function: Customer Support
• Trouble Call Management (TCM)

Business function: Operational Planning and Optimisation
• Switch Action Scheduling/Work Scheduling (SWA)

Figure 49 – Groups of abstract components evaluated in the thesis

Estimates
The following estimates are based on experience gained during actual interviews.

8 The groups coincide with the structure of the subfunctions in the Functional Reference Model with one exception. The subfunction Asset Investment Planning (AIP) has been divided into two smaller groups of abstract components, since a clear distinction can be made between the regular AIP functions and functions concerning Network Calculations. The same distinction has been made for Hydro Power Sweden.
9 Trouble Call Management is not part of the model for Hydro Power Sweden, as this business unit does not have any contact with the end customer. Neither is Work Dispatch part of the model for Hydro Power Sweden.


Time
Below are the average times required to interview one respondent for the different parts.
• Contribution to business value for an abstract component10: 15 minutes
• Contribution to business value for a group of abstract components: 20 minutes
• Quality evaluation for an abstract component, or a group of abstract components: 10 minutes
• Overhead for the first interview (explanation of background and purpose for example): 15
minutes

Number of respondents
There is a need to interview at least 2 respondents for each abstract component, or for each group of abstract components (depending on which level the evaluation is carried out on). This is done in order to reduce the uncertainty in the answers and to avoid a single respondent influencing the results too much. Moreover, the credibility increases if more independent sources are used [Pontus’ credibility article].

Interview coverage
If the evaluation is carried out on the abstract component level, a respondent can on average evaluate the abstract components in 1 group. This limitation to 1 group is due to the fact that the respondent needs deeper knowledge of a more limited functional area (see section 5.3.2).
If the evaluation is carried out on the abstract component group level, a respondent can on average evaluate 3 groups of abstract components. This is because it is possible to use respondents with less detailed knowledge but a broader view of the business (see section 5.3.2).

Resource Requirements

Respondents

Abstract component level (number of respondents, see footnote 11):
• Distribution Sweden: 28
• Hydro Power Sweden: 24
• Total: 52

Abstract component group level (number of respondents):
• Distribution Sweden: 10
• Hydro Power Sweden: 8
• Total: 18

Summary: Evaluation on the abstract component level requires around 3 times as many different
respondents compared to the abstract component group level.

10 The only estimates that have not been validated thoroughly concern the time needed for an abstract component (when the evaluation is carried out on the abstract component level). These times are most likely not shorter than 15 minutes (for contribution to business value) and 10 minutes (for quality evaluation), but could well be longer. For example, the step in the interview that requires the most time is to go through the business value dimensions (when identifying business value contribution), something that takes a similar amount of time regardless of the size of the functional area that is evaluated.
11 For the abstract component level, the number is reached by multiplying the number of abstract component groups by the number of needed respondents for each group (as a respondent on average can evaluate the abstract components in 1 group). For example, 14 groups x 2 respondents needed = 28 respondents needed in total for Distribution Sweden.
For the abstract component group level, the number is reached by multiplying the number of abstract component groups divided by three (as a respondent on average can evaluate 3 groups) by the number of needed respondents for each group. For example, (14 groups / 3) x 2 = 10 respondents needed in total for Distribution Sweden (the result is rounded up).

Interview time (hours)

Abstract component level (overhead / interview / total, hours):
• Distribution Sweden: 7,0 / 68,3 / 75,3
• Hydro Power Sweden: 6,0 / 45,0 / 51,0
• Total: 13,0 / 113,3 / 126,3

Abstract component group level (overhead / interview / total, hours):
• Distribution Sweden: 2,5 / 14,0 / 16,5
• Hydro Power Sweden: 2,0 / 12,0 / 14,0
• Total: 4,5 / 26,0 / 30,5

This means an average interview time per respondent of just under 2,5 hours for evaluation on the abstract component level, and of around 1 hour and 40 minutes for the group level.

Summary: Evaluation on the abstract component level requires around 4 times as much interview
time compared to the abstract component group level.

Please note that the actual number of respondents that were interviewed for this thesis is 8 for Distribution Sweden (including one respondent from Vattenfall Service who covered certain functional areas), and 7 for Hydro Power Sweden. The interviews required 30,5 hours in total, which is in line with the estimates and verifies that this way of approximating resource requirements is feasible.
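The arithmetic behind the figures above can also be expressed as a short calculation. The following is a minimal, illustrative sketch (not part of the evaluation method itself): all constants are taken from the key figures and average times stated in this appendix, while the function and variable names are chosen here purely for illustration.

```python
import math

# Key figures from this appendix.
COMPONENTS = {"Distribution Sweden": 82, "Hydro Power Sweden": 54}
GROUPS = {"Distribution Sweden": 14, "Hydro Power Sweden": 12}

RESPONDENTS_PER_AREA = 2     # at least 2 respondents per (group of) abstract components
GROUPS_PER_RESPONDENT = 3    # on the group level a respondent covers ~3 groups
OVERHEAD_MIN = 15            # overhead for the first interview, per respondent
QUALITY_MIN = 10             # quality evaluation per abstract component or group
VALUE_MIN = {"component": 15, "group": 20}  # business value contribution


def respondents_needed(unit: str, level: str) -> int:
    """Respondents needed for a business unit, on a given functional level."""
    if level == "component":
        # one respondent covers the components of a single group
        return GROUPS[unit] * RESPONDENTS_PER_AREA
    # group level: one respondent covers roughly 3 groups (rounded up)
    return math.ceil(GROUPS[unit] / GROUPS_PER_RESPONDENT) * RESPONDENTS_PER_AREA


def interview_hours(unit: str, level: str) -> tuple[float, float]:
    """Return (overhead hours, interview hours) for a business unit and level."""
    areas = COMPONENTS[unit] if level == "component" else GROUPS[unit]
    interview_min = areas * RESPONDENTS_PER_AREA * (VALUE_MIN[level] + QUALITY_MIN)
    overhead_min = respondents_needed(unit, level) * OVERHEAD_MIN
    return overhead_min / 60, interview_min / 60


if __name__ == "__main__":
    for level in ("component", "group"):
        for unit in COMPONENTS:
            overhead, interview = interview_hours(unit, level)
            print(level, unit, respondents_needed(unit, level),
                  round(overhead, 1), round(interview, 1), round(overhead + interview, 1))
```

Running the sketch reproduces the figures above, for example 28 respondents and 75,3 hours in total for Distribution Sweden on the abstract component level.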


Appendix 4 – Descriptions of Functional Areas


The descriptions of the functional areas are based on the functional reference model from A Functional
Reference Model For Work- And Maintenance Applications Within Asset Management At Vattenfall [19].

Vattenfall Distribution Sweden

Asset Investment Planning


Functionality for planning and optimizing investments.

Includes functionality for:


• storing and implementing the maintenance strategy, for example applying guidelines outlined in
the strategy regarding what triggers investments
• life cycle analysis (for different investment alternatives)
• RCM analysis (analysis for Reliability Centered Maintenance)
• performance measurements on components and the network (for example different
performance indices and availability measures)
• risk analysis and management (for example financial, environmental, human and property risks)
• decision support (collecting and analyzing all available relevant data regarding, for example,
risks, the environment and the financial situation)
• budget allocation (both for short-term investment projects and long-term future projects)
• statistically analyzing trends and patterns (for example Asset Maintenance Groups)
• asset failure history (storing and analyzing previous failures of components and component
types)

Maintenance and Inspection


Functionality for planning, managing and controlling the inspection and maintenance process.

Includes functionality for:


• managing the maintenance program and maintenance plans (for example, breaking down and
specifying inspection intervals and preventive maintenance activities)
• co-ordinating maintenance activities with network extension planning and other investment
projects
• managing work trigger levels (making sure they are in sync with the overall maintenance plans
and that they are set optimally)
• planning inspections and handling data collected from inspections
• asset maintenance history (storing earlier maintenance activities)
• statistically analyzing trends and patterns (for example asset maintenance groups)
• asset failure history (storing and analyzing previous failures of components and component
types)
• managing work orders (for example creating work orders, tracking work order status and
closing work orders)
• financial control and follow up (for example estimating the cost of work, financially following
up work orders and paying invoices)

Construction and Design


Functionality for planning and performing construction of primarily new parts of the network (more
extensive work than normal maintenance).

Includes functionality for:


• project planning and scoping (including who is to perform the work, i.e. in-house or
contractor)
• managing work orders (for example creating work orders, tracking work order status and
closing work orders)
• financial control and follow up (for example estimating the cost of work, financially following
up work orders and paying invoices)


• phasing in and out equipment as well as scrapping phased out equipment (connected to asset
repository)

Premises
Functionality for managing the premises that the organization owns and those of customers.

Includes functionality for:


• recording and displaying addresses of customers and addresses of premises owned by the
organization
• storing information about meter locations
• legal aspects of running and extending an electrical network (for example agreements with
landowners on whose property the lines are built)
• owning and operating real estate

Supply Chain and Logistics


Functionality for procurement, contract management and inventory handling.

Includes functionality for:


• procurement (support for procurement of services from contractors)
• contract management (keeping track of legal and other issues concerning written contracts for
material, goods or services)
• warehouse logistics (for example monitoring of inventory levels)
• materials management (managing materials in store and materials needed for construction
and maintenance)

Document Management
Functionality for managing documents.

Includes functionality for:


• storing, accessing, reading, distributing and changing documents

Substation and Network Inventory


Functionality for presenting the asset repository and electrical connections within the network.

Includes functionality for:


• presenting equipment characteristics of objects in the asset repository (for example the
condition of the equipment)
• descriptions of all kinds of electrical connections within the network (also including
functionality to show a more detailed view of substations within the network) as well as
information about the communications equipment installed in the network

Geographical Inventory
Functionality for displaying the network on a geographical map.

Includes functionality for:


• presenting the network on a geographical map along with topographic and topological
information

Trouble Call Management


Functionality for handling calls from customers about outages and notifying those affected by planned
outages.


Includes functionality for:


• processing outage calls from customers related to power quality issues (such as voltage dips)
and other issues related to flaws in the network
• supporting the notification of those affected by planned outages, and the necessary media
communications during an outage, as well as estimated time needed for the restoration of
power to affected customers
• outage history (i.e. records of all outages, which is often required by authorities)

Switch Action Scheduling/Work Scheduling


Functionality for coordinating disconnection of equipment with maintenance and inspection activities.

Includes functionality for:


• scheduling and coordinating when power is to be switched off and when work is to be performed
(also includes giving someone permission to work on deenergized cables and notifying
operations control when work is performed on energized cables)
• analyzing what the consequences of deenergizing a line would be in terms of number of
affected customers (enabling the work planner to schedule work on lines with as little impact as
possible on the customers, and to notify those affected about the outage)

Work Dispatch12
Functionality for maintaining communication between workers in the field and people at the office.

Includes functionality for:


• monitoring status of personnel (for example tracking location of workers and crew
movements)
• real-time communication with the system for field personnel and support for them to work offline
when there is no connection (for example updating the asset repository in the field)
• incorporation of weather forecasts and actual weather into the planning and dispatching of
work crews

Work Scheduling
Functionality for the day-to-day scheduling of various aspects of the work carried out.

Includes functionality for:


• creating detailed plans of the tasks that are part of a work order
• keeping track of work crews and their activities (for example their competency as well as
current and future work tasks, but also sending out work orders digitally to work crews)
• handling equipment necessary to perform requested work (such as tools and vehicles)
• handling material for construction work (for example coordinating material delivery with
supplier)
• managing permits for field work (work on some voltage levels, for example, requires that the
crew has adequate training, and some substations are not to be entered because of security
implications)

Vattenfall Hydro Power Sweden


The functionality is the same for Hydro Power Sweden as for Distribution Sweden except for the areas
below. Moreover, the function Trouble Call Management is not part of Hydro Power Sweden’s functional
reference model, nor is the functional area Work Dispatch.

Premises
Functionality for managing the premises that the organization owns.

12 The functional area Work Dispatch is performed mainly by Vattenfall Service, but has for the purpose of this thesis been placed under Distribution Sweden, in order to cover the entire functional reference model. The same is true for Work Scheduling.

Includes functionality for:


• managing all aspects of owning and operating real estate

Comment: As Hydro Power Sweden has no end customers, there is no need for address information.

Substation and Network Inventory


Functionality for presenting the asset repository.

Includes functionality for:


• presenting equipment characteristics of objects in the asset repository (for example the
condition of the equipment)

Comment: As Hydro Power Sweden has no electricity distribution network, there is no need to present
the electrical connections within the network.

Switch Action Scheduling/Work Scheduling


Functionality for coordinating disconnection of equipment with maintenance and inspection activities.

Includes functionality for:


• scheduling and coordinating when power is to be switched off and when work is to be performed
(also includes giving someone permission to work on deenergized cables and notifying
operations control when work is performed on energized cables)

Comment: As Hydro Power Sweden has no end customers, no customers are affected by the outages or
need to be notified about them.


Appendix 5 – Form for Business Value Contribution from Functionality

The following appendix contains the form used for evaluating the business value contribution from functionality.

Functional area: ______________________


Respondent: _________________________
Date: _______________________________
Organization: ________________________

Decision making

Differentiations in products/services

Efficiency

Flexibility

Flow of products/services

Change management

Information

Lock-in effect/switching costs

Integration and coordination

Communication

Control and follow up

Organizational learning and knowledge

Quality of products/services

Deliveries

Technology/tools

Inbound logistics

Cost reductions

New products/services

Organizational culture

Productivity

Competitor relations

Customer relations

Supplier relations

Third party relations

Strategy formulation and planning


Appendix 6 – Form for Quality Impact on Business Value Contribution

The following appendix contains the form used for evaluating the quality impact on business value contribution.

Functional area: ______________________


Respondent: _________________________
Date: _______________________________
Organization: ________________________

Impact / Rank

Usability

Usability is a measure of how easy a system is to use for a user, for
example how easy it is to learn to use the system, how easy it is to
remember how to use the system, whether the use of the system is effective
and efficient, and whether the system prevents user errors.

Comments:

Data Quality
Data quality is a measure of how accurate (correct data), complete (no
data missing), current (up-to-date data), and consistent (the same data for
the same entity throughout the system) data is.

Comments:

Information Security
Information security is a measure of how well protected a system (and the
information in it) is against unauthorized access, use and modification.
Information security concerns, amongst other things, the level of control
for access to a system (for example login with a password) and levels of
authority (for example an administrator level for the most critical
functions). It also includes encryption of the information in the system, as
well as logging (supervision) of what is done in the system and the
communication to and from the system.

Comments:

Interoperability
Interoperability is a measure of a system’s ability to interact with other
systems outside the system itself, for example that information can be
exchanged between systems. This also depends on the availability of
persons (both internally and externally) who can integrate the system with
other systems, and on the skills and experience of these persons.
Furthermore, the amount and quality of documentation supporting
integration of systems is an issue.

Comments:


Impact / Rank

Modifiability


Modifiability is a measure of how easy it is to change a system or how
much that can be changed, for example how easy it is to modify or add
functions in a system or to update a system. This also depends on the
availability of persons (both internally and externally) who can modify the
system, and on the skills and experience of these persons. Furthermore,
the amount and quality of documentation supporting modification of
the system is an issue.

Comments:

Performance
Performance is a measure of how fast a system runs, for example how
long the system takes to respond to an input and how long it takes to
perform operations. Performance is also a measure of the level of
resource requirements (for example in terms of hardware) for a system,
and scalability (i.e. the number of users the system can support and still
function properly).

Comments:

Availability and Reliability


Availability is a measure of the degree that a system is accessible when a
user needs to use it. Reliability is a measure of a system’s ability to perform its
required function under stated conditions for a specified period. In other
words, availability and reliability concern that the system works as it is
supposed to when a user needs it. Availability and reliability are also
related to how long it takes between failures and how long it takes until
the system is up-and-running again following a failure.

Comments:

Appendix 7 – Form for Credibility Evaluation
The following appendix contains the form used for credibility evaluation of respondents.

Functional area: ______________________


Respondent: _________________________
Date: _______________________________
Organization: ________________________

Source Proximity (Question and Assessment13)


How close to the respondent are the work tasks that use the different parts of the functional
reference model as tools and support?
Respondent works with the tasks him/herself
Respondent asked person working with the tasks
Respondent referred to person working with the tasks
Respondent referred to others

Age of Answer (Question)


How long has it been since the respondent performed work tasks that required the IT systems
support described in the functional reference model?
Days
Weeks
Months
Years

Question Domain & Answer Domain (Assessment)


To what extent does the answer domain match the question domain?
The same type of business
The same type of work tasks
Comparable type of business
Comparable types of work tasks

Years of Experience (Question)


How much experience does the respondent have in the question domain?
Less than 1 year
1 to 3 years
More than 3 years

Match of Area of Expertise (Question and Assessment)


To what extent does the actual respondent match the intended respondent profile?
Area of Knowledge (Business, Business/IT or IT)
Organizational level (Process performer, process manager, business unit or enterprise)
Relation to organization (User, implementer or supplier)

Respondent Certainty14 (Question)


How certain is the respondent that his/her assessment is correct?
Very certain
Fairly certain
Fairly uncertain
Uncertain

13 Question means that a question is asked directly to the respondent, and Assessment means that an
assessment is made silently by the interviewer (for example based on the respondent’s work position).
In both cases, the answer is then matched against, for example, the desired respondent profile.
14 Please note that there is a corresponding assessment made by the interviewer on how certain he/she thinks that the respondent is.



Appendix 8 – Contact List for Respondents


The following appendix lists the contact information for all respondents participating in the thesis.

Name | Business Unit | Phone | Mobile | Email (@vattenfall.com) | Involved in
Erik Biberg | Business Service | 08 - 739 69 05 | 070 - 539 69 05 | erik.biberg | Pilot Interview
Bernt Hansson | Distribution Sweden | 0520 - 885 23 | 070 - 558 85 23 | bernt.e.hansson | Prio
Christer Eriksson | Distribution Sweden | 013 - 37 64 43 | 070 - 650 77 41 | christer.eriksson | Interview
Claes Wange | Distribution Sweden | 013 - 37 64 33 | 070 - 602 71 78 | claes.wange | Prio + Interview
Einar Hellman | Distribution Sweden | 08 - 623 28 64 | 070 - 309 98 64 | einar.hellman | Interview
Elenor Båtsman | Distribution Sweden | 08 - 739 60 93 | 070 - 539 74 35 | elenor.batsman | Prio
Hans Andersson | Distribution Sweden | 013 - 37 64 33 | 070 - 593 93 82 | hans.g.andersson | Prio + Interview
Hans Broström | Distribution Sweden | 08 - 739 67 75 | 070 - 629 49 35 | hans.brostrom | Interview
Jan Karlsson | Distribution Sweden | 013 - 37 65 23 | 070 - 558 85 46 | jan.karlsson | Prio
Kjell Anderén | Distribution Sweden | 08 - 623 27 30 | 070 - 634 79 30 | kjell.anderen | Prio
Mats Brännström | Distribution Sweden | 0920 - 770 74 | 070 - 695 63 29 | mats.brannstrom | Prio
Mats Hallström | Distribution Sweden | 013 - 37 64 12 | 070 - 650 77 51 | mats.hallstrom | Interview
Olov Holmberg | Distribution Sweden | 08 - 623 29 20 | 070 - 241 05 07 | olov.holmberg | Prio
Pelle Larsson | Distribution Sweden | 0520 - 887 40 | 070 - 525 54 65 | pelle.larsson | Prio
Thomas Gustafsson | Distribution Sweden | 08 - 739 66 43 | 070 - 580 92 32 | thomas.gustafsson2 | Prio
Torleif Hiort | Distribution Sweden | 08 - 623 27 84 | 070 - 651 45 42 | torleif.hiort | Interview
Sivert Rutgersson | Service (south) | 0520 - 887 50 | 070 - 626 61 40 | sivert.rutgersson | Interview
Jan Gugala | Research & Development | 08 - 739 54 62 | 073 - 988 03 79 | jan.gugala | Pilot Interview
Kenth Eriksson | Hydro Power Sweden | 08 - 739 53 34 | 070 - 310 00 21 | kenth.eriksson | Interview
Michael Tiger | Hydro Power Sweden | 0920 - 772 46 | 070 - 514 78 79 | michael.tiger | Interview
Anders Björnström | Hydro Power Sweden | 0920 - 771 53 | 070 - 546 01 55 | anders.bjornstrom | Prio
Christer Ljunggren | Hydro Power Sweden | 0920 - 773 35 | 070 - 666 93 80 | christer.ljunggren | Prio
Claes-Olof Brandesten | Hydro Power Sweden | 08 - 739 70 95 | 070 - 553 13 75 | claesolof.brandesten | Interview
Fredrik Backlund | Hydro Power Sweden | 0920 - 772 13 | 070 - 697 72 13 | fredrik.backlund | Prio + Interview
Hans Lindström | Hydro Power Sweden | 026 - 837 81 | 070 - 523 70 23 | hans.lindstrom | Prio
Jan Lidström | Hydro Power Sweden | 090 - 15 14 11 | 070 - 576 32 83 | jan.lidstrom | Prio
Maria Lillieroth | Hydro Power Sweden | 0696 - 178 27 | 070 - 691 78 27 | maria.lillieroth | Prio + Interview
Nils Holmström | Hydro Power Sweden | 0920 - 771 96 | 070 - 353 70 55 | nils.holmstrom | Prio
Per-Olof Ferm | Hydro Power Sweden | 026 - 837 20 | 070 - 312 96 77 | perolof.ferm | Interview
Tommy Enmark | Hydro Power Sweden | 0918 - 311 13 | 070 - 688 39 67 | tommy.enmark | Prio + Interview

Prio = Business Value Prioritization


Interview = Business Value Contribution


Appendix 9 – List of Interview Data and Respondents


The following appendix lists all the interviews including details about the respondents and their
profiles.

Interviews for Business Value Contribution with Credibility


All respondents were chosen because of their knowledge of the evaluated functional area(s), and
because they had suitable profiles according to the respondent categorization framework.
The interviews are listed in chronological order. Please note that the value for Respondent Certainty
corresponds to the scale used in the credibility form, where 4 denotes “Very certain”, 3 means “Fairly
certain” and so forth. Also note that the values for Interviewer’s Assessment have been left out, for
confidentiality reasons.
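
To make the tabulated answers below easier to work with, the sketch that follows shows how a single interview row could be recorded, using the Respondent Certainty coding just described (4 denotes “Very certain”, 3 “Fairly certain”, and so forth) together with the Source Proximity and Age of Answer categories from the credibility form in Appendix 7. The data class and its field names are illustrative assumptions made here, not part of the evaluation method itself.

```python
from dataclasses import dataclass

# Ordinal coding used in the tables below; the 4-1 scale follows the credibility
# form in Appendix 7 (4 = "Very certain", 3 = "Fairly certain", and so forth).
CERTAINTY_LABELS = {
    4: "Very certain",
    3: "Fairly certain",
    2: "Fairly uncertain",
    1: "Uncertain",
}


@dataclass
class CredibilityRecord:
    """One functional-area answer from an interview (illustrative layout)."""
    functional_area: str       # e.g. "CON", "MAI"
    source_proximity: str      # e.g. "Works with tasks himself"
    age_of_answer: str         # "Days", "Weeks", "Months" or "Years"
    respondent_certainty: int  # 1-4, coded as above


# Example corresponding to the single row of Interview no. 1 below.
example = CredibilityRecord(
    functional_area="CON",
    source_proximity="Referred to person working with tasks",
    age_of_answer="Years",
    respondent_certainty=4,
)
print(example.functional_area, CERTAINTY_LABELS[example.respondent_certainty])
```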

Interview no. 1
Date: 2006-03-21
Respondent: E2
Interview Length: 1,5 hours
Interview Type: In person (Råcksta)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
CON | Referred to person working with tasks | Years | 4

Interview no. 2
Date: 2006-03-23
Respondent: E4
Interview Length: 1,5 hours
Interview Type: In person (Linköping)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
AIP | Referred to person working with tasks | Days | 3

Interview no. 3
Date: 2006-03-24
Respondent: E3
Interview Length: 2 hours
Interview Type: In person (Linköping)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
TCM | Works with tasks himself | Days | 4

Interview no. 4
Date: 2006-03-24
Respondent: E7
Interview Length: 1 hour
Interview Type: In person (Linköping)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
MAI | Works with tasks himself | Days | 4

Interview no. 5
Date: 2006-03-28
Respondent: E5
Interview Length: 1 hour
Interview Type: In person (Sollentuna)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
EINV | Works with tasks himself | Days | 3

Interview no. 6
Date: 2006-04-03
Respondent: E6
Interview Length: 2 hours
Interview Type: In person (Sollentuna)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
CON | Works with tasks himself | Days | 3
MAI | Works with tasks himself | Days | 3
FRD | Works with tasks himself | Days | 3
EINV | Works with tasks himself | Days | 3

Interview no. 7
Date: 2006-04-04
Respondent: E5
Interview Length: 1,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
MAI | Works with tasks himself | Days | 3
GINV | Referred to person working with tasks | Weeks | 3

Interview no. 8
Date: 2006-04-06
Respondent: E1
Interview Length: 2 hours
Interview Type: In person (Råcksta)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
AIP | Works with tasks himself | Days | 4
CON | Works with tasks himself | Days | 4
PRM | Works with tasks himself | Days | 4
DOC | Works with tasks himself | Days | 4
SCM | Referred to person working with tasks | Days | 2

Interview no. 9
Date: 2006-04-07
Respondent: E3
Interview Length: 2 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
SWA | Works with tasks himself | Days | 4
CALC | Works with tasks himself | Days | 4

Interview no. 10
Date: 2006-04-07
Respondent: V2
Interview Length: 1,5 hours
Interview Type: In person (Råcksta)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
AIP | Works with tasks himself | Days | 3
MAI | Referred to person working with tasks | Months | 3

Interview no. 11
Date: 2006-04-10
Respondent: V4
Interview Length: 1,5 hours
Interview Type: In person (Råcksta)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
SCM | Works with tasks himself | Weeks | 3
CON | Referred to person working with tasks | Months | 3
AIP | Referred to person working with tasks | Months | 3

Interview no. 12
Date: 2006-04-12
Respondent: E7
Interview Length: 2,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
AIP | Works with tasks himself | Months | 3
DOC | Works with tasks himself | Days | 4
EINV | Works with tasks himself | Days | 4
GINV | Works with tasks himself | Weeks | 4
PRM | Referred to person working with tasks | Months | 2

Interview no. 13
Date: 2006-04-19
Respondent: V3
Interview Length: 1,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
MAI | Works with tasks himself | Weeks | 3
DOC | Works with tasks himself | Weeks | 3
EINV | Works with tasks himself | Weeks | 3
FRD | Works with tasks himself | Months | 3

Interview no. 14
Date: 2006-04-21
Respondent: V1
Interview Length: 2 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
AIP | Works with tasks himself | Weeks | 3
EINV | Works with tasks himself | Weeks | 3
MAI | Referred to person working with tasks | Weeks | 3

Interview no. 15
Date: 2006-04-24
Respondent: E1
Interview Length: 1,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
GINV | Referred to person working with tasks | Months | 3
EINV | Referred to person working with tasks | Months | 3
MAI | Referred to person working with tasks | Years | 3
TCM | Referred to person working with tasks | Years | 3
SWA | Referred to person working with tasks | Years | 3

Interview no. 16
Date: 2006-04-26
Respondent: VS1
Interview Length: 2 hours
Interview Type: In person (Råcksta)
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
FRD | Referred to person working with tasks | Weeks | 3
DSP | Referred to person working with tasks | Weeks | 3
SCHD | Referred to person working with tasks | Weeks | 3

Interview no. 17
Date: 2006-05-02
Respondent: V6
Interview Length: 1,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
SWA | Referred to person working with tasks | Days | 4
DOC | Referred to person working with tasks | Days | 4

Interview no. 18
Date: 2006-05-02
Respondent: V7
Interview Length: 0,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
DOC | Referred to person working with tasks | Days | 4

Interview no. 19
Date: 2006-05-05
Respondent: V5
Interview Length: 1,5 hours
Interview Type: Phone
Functional Area | Source Proximity | Age of Answer | Respondent Certainty
MAI | Works with tasks himself | Days | 3
CON | Works with tasks himself | Days | 3

Respondent Profiles for Business Value Contribution


The following table presents the static respondent profiles, i.e. the credibility dimensions that do not
change with the evaluated functional area. It also lists the functional areas that respondents evaluated.

Respondent | Area of Knowledge | Organizational Level | Relation to Organization | Q&A Domain | Years of Experience | Covered Functional Areas
E1 | Business & IT | Business Unit | Employee | STB & SWT | > 3 years | AIP, CON, PRM, DOC, SCM, GINV, EINV, MAI, TCM, SWA
E2 | Business | Business Unit | Employee | STB & SWT | > 3 years | CON
E3 | Business & IT | Process Performer | Employee | STB & SWT | > 3 years | TCM, SWA, CALC
E4 | Business | Process Manager | Employee | STB & SWT | > 3 years | AIP
E5 | Business & IT | Process Performer | Employee | STB & SWT | > 3 years | EINV, MAI, GINV
E6 | Business | Process Performer | Employee | STB & SWT | > 3 years | CON, MAI, FRD, EINV
E7 | Business & IT | Process Manager | Employee | STB & SWT | > 3 years | MAI, AIP, DOC, EINV, GINV, PRM
VS1 | Business & IT | Business Unit | Employee | STB & SWT | > 3 years | FRD, DSP, SCHD
V1 | Business | Business Unit | Employee | STB & SWT | > 3 years | AIP, EINV, MAI
V2 | Business & IT | Process Manager | Employee | STB & SWT | > 3 years | AIP, MAI
V3 | Business & IT | Process Manager | Employee | STB & SWT | > 3 years | MAI, DOC, EINV, FRD
V4 | Business | Process Performer | Employee | STB & SWT | > 3 years | SCM, CON, AIP
V5 | Business | Process Manager | Employee | STB & SWT | > 3 years | MAI, CON
V6 | Business | Business Unit | Employee | STB & SWT | > 3 years | SWA, DOC
V7 | IT | Process Manager | Employee | STB & SWT | > 3 years | DOC

E = Distribution Sweden
V = Hydro Power Sweden
VS = Vattenfall Service
STB & SWT = Same type of business & same type of work tasks

Respondent Profiles for Business Value Prioritization

Respondent | Area of Knowledge | Organizational Level | Relation to Organization
E1 | Business & IT | Business Unit | Employee
E7 | Business | Process Manager | Employee
E8 | Business | Business Unit | Employee
E9 | Business | Business Unit | Employee
E10 | Business | Business Unit | Employee
E11 | Business | Business Unit | Employee
E12 | Business | Business Unit | Employee
E13 | Business | Business Unit | Employee
E14 | Business | Business Unit | Employee
E15 | Business | Business Unit | Employee
V1 | Business | Business Unit | Employee
V3 | Business & IT | Process Manager | Employee
V8 | Business | Business Unit | Employee
V9 | Business | Business Unit | Employee
V10 | Business | Business Unit | Employee
V11 | Business | Business Unit | Employee
V12 | Business | Business Unit | Employee
V13 | Business | Business Unit | Employee

E = Distribution Sweden
V = Hydro Power Sweden
