All content following this page was uploaded by Tom Kontogiannis on 30 June 2018.
1. INTRODUCTION
The early nineties have witnessed the introduction of advanced technologies in the control
rooms of complex systems. In the nuclear industry, for instance, third-generation
technologies made it possible for operators to supervise the process through compact
workstations with enhanced graphic and sound capabilities (e.g., “soft” controls, multi-
function buttons, and auditory feedback) and through computerised procedures and
intelligent aids. In a similar way, the emergence of “glass-cockpits” and flight management
systems has marked modern aircraft flightdecks. However, a number of studies on human
interaction with advanced technologies (e.g., Billings, 1991; Mosier et al., 1992; Woods
et al., 1994; Hopkin, 1995) have indicated that the postulated benefits of increased
reliability, maintainability, and quality of service have not yet completely materialised.
One of the main reasons for the deficiencies in the coupling of humans and new
technologies is that the operating environment associated with advanced systems is very
different from that of conventional control rooms. System designers have not anticipated
the changes in cognitive demands introduced by advanced technologies. As O’Hara & Hall
(1992) remarked “with the increased use of computer-based interfaces, cognitive issues are
emerging as more significant than the physical and ergonomic considerations that
dominated the design of conventional control rooms”. Developments in modern
technology are likely to affect the role of operators and their teams, the requirements for
knowledge and skills, and the way operators interact with the system. In addition, changes
in broader job factors, such as operator licensing, training, and team work, are likely to
take place.
These issues are now receiving wider recognition from many regulatory authorities
and research is in progress to review existing guidelines for control room design and the
introduction of advanced technology in general (e.g., Carter, 1992; O’Hara et al., 1995). In
view of these changes in cognitive and collaborative demands, cognitive ergonomics has
to play a crucial role in matching technological options and user demands. It is necessary
therefore to develop design guidelines that would address the interaction with new
technology in the context of new cognitive and collaborative demands.
This paper describes a framework that can be used to assess the cognitive demands
resulting from the application of information technology and, hence, serves as a basis for
design decisions concerning specific interface issues, task allocation, and level of
automation. A qualitative modelling layer is proposed based on a number of theoretical
postulates that concern the interactions and dependencies between various design features
and job factors as well as the coupling between cognition and context.
The designer's problem is illustrated by the situation where two (or more) options
are available to solve a specific problem, e.g., displaying information about the state of a
sub-system. Design criteria for option selection can be based on user requirements which,
however, can be defined at different levels of the man-machine interaction. At one level,
design criteria can include ease of interface use, detection of abnormal events, consistency
of displays and controls, and so on. At the level of operator tasks, requirements may
concern the content and format of information so that only relevant information is
displayed, reminders are provided for incomplete tasks, and relationships between process
variables are made explicit to operators. At yet another level, requirements can be made in
the context of training and team work, such as compatibility with operator strategies and
information needs, minimisation of training needs, and transparency of operator actions to
other team members. The definition of user requirements and the provision of matching
rules to technological options is an intricate design problem.
The development of design guidelines should serve to improve the quality and consistency
of designs by providing the designer with a clearly delineated set of alternatives together
with well-defined design criteria. Another reason that makes this ideal hard to attain
relates to problems of usability. Accumulated knowledge of human factors has produced a
large number of detailed design guidelines, but to be useful it must be easy to identify the
guidelines that are relevant for a given problem without having to search through them all.
This, paradoxically, mirrors the issues of functionality and usability that the guidelines
should help to resolve. Functionality refers to the number of functions a specific system is
able to perform. Usability refers to the functions that the user either is aware of or is able
to use efficiently. At present the functionality of many systems, in particular software
systems, is much higher than their usability. Good design guidelines and usability
engineering should help to reduce this discrepancy. Unfortunately, the design guidelines
themselves suffer from the same problem: high functionality and low usability. The
principles and criteria that are the basis for design should also serve as the basis for
evaluation¹ (Figure 1). Unfortunately, evaluations suffer from the same weaknesses as
design, that is, it is hard to find well-defined principles that can be used for evaluation.
The definitions of user requirements and matching rules have given rise to different
approaches to control room design. In the traditional, technology-driven approach, the
design was approached from a strictly engineering point of view while human factors
considerations were left for the final stage to overcome certain design problems. In other
words, the interaction between people and technology was seen as an additional aspect of
system design. This approach has been criticised for producing designs that were
expensive to modify, reducing user acceptability, and generating excessive training
problems (Algera et al., 1989; Bainbridge, 1991a). The following section discusses two
other approaches based on classical and cognitive ergonomics, respectively, which appear
to be more promising.
Incorporating human factors at the starting point of control room design and matching
design to user capabilities and limitations have been well-established principles of the
classical ergonomics approach. To understand the concepts and methods used in this
approach, typical of the period until the end of the 1970s, we need to examine the
fundamental principles of classical ergonomics that can be reviewed as follows:
Human capacity for processing information is decisive for the capacity of the
joint system. The emergence of ergonomics coincided with the development of the
modern digital computer, and the analogy between the computer and the human
mind appeared to offer some fundamental explanations of human functional
¹ It goes without saying that the evaluation must be a concurrent process during the design, but for the
sake of the discussion it is assumed that it takes place as a separate activity.
characteristics by referring to hypothetical information processing structures. This
therefore became the leading metaphor for several decades.
Generally speaking, the solution advocated by classical ergonomics was to carefully match
the interface to known human capacities and limitations. This was done by observing the
following principles:
The classical ergonomic approach to system design is epitomised by the so-called Fitts’
List (Fitts, 1951) which was developed to allocate functions in an air-traffic control
system. The principle is that a list or table is constructed of the strong and weak features of
humans and machines, which then is used as a basis for assigning functions and
responsibilities to the various system components. The problem with this approach is that
training and design decisions are made to increase the competence of human and machine
elements in isolation. The two elements are seen as working independently or in parallel
rather than jointly. In this sense, this approach fails to address how humans can build upon
information processing work and decisions made by machines. As Langlotz & Shortliffe
(1983) remarked “excellent decision making performance (on the part of machines) does
not guarantee user acceptance”. Operators need to cross-check the reliability of machine
strategies and be able to build upon them rather than repeat the same process independently.
Classical ergonomics has not addressed these issues, partly because technology had not
achieved an adequate level of sophistication until the early nineties.
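The list-based allocation principle can be sketched in a few lines of code (an illustrative sketch only: the capability entries are paraphrases in the spirit of such lists, not Fitts' (1951) original wording):

```python
# Sketch of function allocation by a Fitts'-style list: each function is
# assigned to whichever element - human or machine - the list rates as
# stronger for it. Entries are illustrative paraphrases, not the original.

FITTS_STYLE_LIST = {
    "detecting weak or ambiguous signals": "human",
    "improvising when events are novel":   "human",
    "inductive reasoning":                 "human",
    "repetitive, routine computation":     "machine",
    "rapid and precise responses":         "machine",
    "storing large volumes of data":       "machine",
}

def allocate(function):
    """Assign a function to the human or the machine according to the list."""
    return FITTS_STYLE_LIST.get(function, "undecided")

print(allocate("repetitive, routine computation"))    # machine
print(allocate("improvising when events are novel"))  # human
```

Notably, no cell in such a lookup describes joint performance: the allocation treats the two elements in isolation, which is exactly the limitation criticised here.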
In the 1980s ergonomics came under the influence of cognitive psychology and Cognitive
Systems Engineering to produce what is commonly referred to as cognitive ergonomics.
Acknowledging the significance of human-computer interaction in almost all situations of
interest, a definition might be that cognitive ergonomics is system design with the
characteristics of the joint cognitive system (the operator and the computer) as a
frame of reference (cf. Hollnagel & Woods, 1983). The joint cognitive systems paradigm
is illustrated in Figure 2.
With sophisticated automation and computerised support, operators may not always
know about the current goals, or not even be in control of the process. The goal may
nominally be given to the operator, but in many cases the operator can do nothing
but accept it. Determining the extent to which goals should be shared between the
operator and the automatic control system is a crucial problem for cognitive
ergonomics.
If operators are in control, they clearly need sufficient information to provide the
required degree of control. In cases where a smaller or larger part of the process is
assigned to the automatic control system, the operator will need enough information
to follow what is happening. Determining just how much information operators
need - both when things are normal and when the automatic control systems fail - is
another critical issue for cognitive ergonomics.
If the automatic system is in control, the operator may at times be isolated from the
operation of the system. The reason is usually that operators cannot react quickly
or precisely enough, or cannot process the necessary information fast enough
to make the correct choices. Determining when control should be with the operator
and when it should be with the automatic control system is a challenge for cognitive
ergonomics.
Classical ergonomics differed from Taylorism by emphasising that humans worked with
machines rather than as parts of them. Cognitive ergonomics went a step further by
emphasising that humans sometimes work through machines rather than with machines.
This means that the operator interacts with the process through the automatic control
system and through the interface. The operator’s understanding of the process therefore
becomes contingent upon how it is represented in the interface.
Cognitive ergonomics can also be seen as describing a change from closed loop to
open loop control. In classical ergonomics the operator was seen as part of a closed control
loop. The concern was therefore to provide information and controls that enabled him to
function effectively as part of the loop. In cognitive ergonomics many of the closed loop
functions are retained, but in addition the operator also has to function in an open loop. For
instance, when a diagnosis is made or a strategy is developed, the operators basically
engage in open loop control.² It is therefore necessary to support the open loop control by
appropriate design of information presentation, task structuring, and function allocation.
² This is, for instance, demonstrated by the fact that people may sometimes follow an incorrect
diagnosis in spite of conflicting information; feedback is thus neglected.
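The contrast between the two modes of control can be shown with a toy numerical sketch (illustrative only; the "process" here is a single variable driven towards a setpoint, not a model of any real plant):

```python
# Closed-loop control consults feedback at every step; open-loop control
# commits to a pre-computed plan and never measures the error.

def closed_loop(state, setpoint, gain=0.5, steps=20):
    """Measure the error and correct it at every step (feedback control)."""
    for _ in range(steps):
        error = setpoint - state   # feedback is consulted...
        state += gain * error      # ...and each action adapts to it
    return state

def open_loop(state, planned_actions):
    """Execute a fixed plan; errors accumulate uncorrected."""
    for action in planned_actions:
        state += action
    return state

print(round(closed_loop(0.0, 10.0), 2))   # 10.0 - converges on the setpoint
print(open_loop(0.0, [2.0, 2.0, 2.0]))    # 6.0 - right only if the plan was right
```

A wrong diagnosis corresponds to a wrong plan in open_loop: nothing inside the loop detects the discrepancy, which is why neglected feedback (footnote 2) matters.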
To achieve this objective we need to examine not only the tasks that operators
perform in running the plant but also the cues and relationships they are perceiving, the
knowledge they are using, and the strategies they are choosing. Behavioural task analysis
(e.g., operational sequence diagrams, timelines, link analysis) focuses on what the
operator does, and not on how or why. Key judgements about the situation and decision
making tend to be lost or de-emphasised in the details of task decomposition. Cognitive
ergonomics can make a significant contribution to the analysis of cognitive processes in
controlling complex systems. Various methods of cognitive task analysis (e.g., Hollnagel,
1991; Hoc & Samurcay, 1992; Redding & Seamster, 1994; Klein et al., 1997) have already
been developed to examine different types of operator knowledge and strategies used to
interpret situations or make decisions about how to choose a course of action.
Many cognitive task analysis methods use interviews to probe the cues and patterns
that operators are using; on other occasions, simulated scenarios are used in which
operators either think aloud while doing their jobs or are interviewed immediately
afterwards to investigate their strategies. Data from these sources can be combined to
produce a record of participant data acquisition, situation assessment, knowledge
activation, intentions, and tasks carried out in a particular scenario. The main objective of
cognitive task analysis methods should be to understand human behaviour from the point
of view of the operator rather than the point of view of the omniscient observer, in order to
reduce difficulties caused by hindsight bias. Woods (1993) cites two challenges to
validity in the development of cognitive task analysis methods, that is, (a) to what extent
does the method change the characteristics and context of the investigated tasks, and (b)
to what extent do the obtained data reflect the underlying cognitive processes, minimising
omissions of important aspects, intrusions of irrelevant features, or distortions of the actual
processes. Further developments in cognitive task analysis would have to cope with these
challenges in order to advance our understanding of cognitive processes in controlling
complex systems.
The practical aims of cognitive ergonomics can be summarised as the ability:
to predict the situations where problems may arise, particularly problems that
involve the operator’s understanding of the situation.
to describe the conditions that may either be the cause of the problems or have a
significant effect on how the situations develop.
to prescribe the means by which such situations can either be avoided or their
impact reduced.
In order to do this, it is necessary to apply concepts about human cognition and theories
about how the human mind works. Yet the development of such concepts and theories is a
task for cognitive psychology and Cognitive Systems Engineering rather than cognitive
ergonomics. In order to comply with this distinction, cognitive ergonomics would need a
well-developed theoretical foundation. Since this is not yet the case (Bainbridge, 1991b),
work in cognitive ergonomics cannot avoid addressing issues of theoretical development as
well as of application.
The introduction of advanced technology does not automatically and magically enhance
human performance and reliability. To ensure this, it is necessary to evaluate the proposed
design in terms of the potential negative as well as positive effects. Classical ergonomics
has over the years produced a substantial set of principles and methods to evaluate specific
aspects of control room design, although mainly confined to static evaluations. Cognitive
ergonomics enhances these principles and methods by providing a different perspective. In
this section, we present a set of assumptions based on cognitive ergonomics and Cognitive
Systems Engineering that constitute an amended basis for the design and evaluation of
new technologies.
The use of tools. In advanced control rooms, new tools are required as the whole
concept of the control room has changed. Technological developments (e.g.,
computerisation, process status overviews, and workstations) have consequences for
both the cognitive nature of work and the physical nature of operator activities. In
particular, the predominance of advanced information technology means that operators
become more dependent on specialised tools for information retrieval, control, and
communication. The ways in which operators shape their tools and the characteristics of
tools that affect individual strategies are crucial to our understanding of human
computer interaction (Woods et al., 1990).
understand (Billings, 1991). Tool development, of course, aims at reducing the
complexity and making the work simpler and more comprehensible, but if each tool is
considered by itself, an overall increase in complexity may easily go unnoticed. We
need therefore to understand the factors that increase domain complexity and the way
that complexity affects joint performance. One way of reducing complexity, for
instance, might be the representation of technical systems in various levels of
decomposition or abstraction (Rasmussen, 1986). Another way could be the provision
of simplified constructs and models that can be made available to operators through
“help” menus or training (Billings, 1991).
These amended assumptions have implications for control room design and evaluation of
new technologies. For one thing, we are more concerned with the design of the interaction
in context rather than what is usually referred to as interface. The concept of “interacting”
or “joint” cognitive systems, for instance, implies that new technology must be
accountable for its actions and predictable in its intentions so that problems are defined
and solved in conjunction with the human operator. If operators are not able to understand
the hypotheses tested by a machine advisor and re-direct its line of reasoning in cases of
misperception, they are likely to repeat the diagnostic and fault-compensation strategies by
themselves (Woods, 1986). This means that any opportunities to build upon work carried
out by a machine advisor are missed.
This section examines those aspects of control room design that relate to the user-process
interface, level of automation, and computer assistance. It also considers the implications
of various design options on training, staffing, and team communications. It should be
noted that the aim is not to generate exhaustive lists of design guidelines but rather to
provide a framework that may help designers apply the principles of cognitive ergonomics
as described in the previous section.
New technology tends to force serial access to data that previously could be viewed in
parallel and to fragment data across different windows and displays. Thus, knowing where
to look next in the data space available behind the current display window and how to
extract information across multiple windows is a fundamental cognitive activity. Design
guidelines should focus on how to reduce the burden of interacting with the display system
especially at times when operators can least afford new tasks. In this way, operator
attention is not diverted away from the evolution of a critical situation to the interface per
se. Difficult-to-use interfaces may increase task complexity and may lead operators to use
the interface in ways not intended by its design or may constrain available options - e.g.,
limit the display of data to a fixed spatial organisation rather than exploit the flexibility of
the interface (Woods & Sarter, 1992). This is a typical case where functionality may be
reduced to achieve usability (Hollnagel, 1994).
The provision of reminders and prompts for action therefore becomes an important principle in supporting
human memory in multi-task performance.
An implication of the concept of joint cognitive systems for interface design is that
the presentation and organisation of information should help operators formulate
alternative hypotheses about the causes of a situation, predict future states of the process
and select appropriate measures to control it (De Keyser & Woods, 1990). If support
for these functions is neglected in the design of the interaction, the result may be
“cognitive tunnel vision” (Reason, 1990). Making operators aware of interesting events
that may take place in other parts of the interface and emphasising “discrimination” cues
between potentially confusing scenarios are examples of ways to prevent cognitive
lock-up.
Increasing automation is a prominent feature of new technology to the extent that a range
of tasks, previously done by the human operator, have been taken over by machines - and
computers in particular. While many applications have been motivated by a tendency to
experiment with information technology and respond to the challenge of increasing
productivity or safety, few have been motivated by the need to increase the efficiency of
the joint system of human operators and machines.
The increasing concern with “mode errors” committed in highly automated systems
raises the issue of appropriate task feedback (Sarter & Woods, 1995). Mode errors occur
when the person believes that the system is in one state or mode when it is actually in
another (Norman, 1983). Thus, operators may initiate a course of action that is appropriate
in one mode but wrong in another one. Feedback about the current automation mode and
the transitions between modes is an important issue. However, as Norman (1990)
cautioned, the task of presenting feedback in an appropriate way is not an easy matter.
Feedback should be provided in a manner similar to what operators use when they discuss
issues among themselves in a joint problem solving activity, i.e., by means of task
structuring rather than information structuring (Hollnagel, 1996). The amount and type of
feedback should ideally match both the interactive style of operators and the nature of the
problem.
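The mode-error pattern can be made concrete with a minimal sketch (the mode names and the dial semantics below are hypothetical simplifications of the kind of vertical-guidance confusion Sarter & Woods describe, not any real system's logic):

```python
# A mode error in miniature: the same dial input means different things in
# different automation modes, so an operator whose belief about the mode
# is stale issues an action that is valid but wrong.

class Autopilot:
    def __init__(self):
        self.mode = "VERTICAL_SPEED"  # the system's actual mode

    def interpret(self, dial_value):
        # Identical input, mode-dependent meaning.
        if self.mode == "VERTICAL_SPEED":
            return ("descend", dial_value, "ft/min")
        return ("descend", dial_value, "degrees")  # "FLIGHT_PATH_ANGLE"

ap = Autopilot()
ap.mode = "FLIGHT_PATH_ANGLE"     # automation changed mode without salient feedback
assumed_mode = "VERTICAL_SPEED"   # the operator's belief has not been updated

action = ap.interpret(300)        # intended as a 300 ft/min descent
print(action)                     # ('descend', 300, 'degrees') - a mode error
print(assumed_mode != ap.mode)    # True: belief and actual state diverge
```

Salient feedback about the current mode, and about transitions between modes, is precisely what closes this gap between belief and system state.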
The introduction of advanced technology has made it possible for operators to follow
operating procedures on CRT screens, thereby minimising the efforts needed to retrieve
voluminous hard-copy procedures. Intelligent decision aids have also been introduced that
provide instructions sensitive to the context of the current process state, perform fault
diagnosis, and offer advice on coping with critical process states. The use of computerised
procedures and intelligent decision aids is intended to assist operators in choosing well-
tried strategies and in allocating tasks optimally between different team members. Problems
may, however, occur in cases where handling the interaction with these systems increases
workload, where the intentions of the procedures or decision aids are not made clear to
operators, where the data consulted by the system are not transparent to operators, and
where operators lose track of their progress in monitoring or executing procedural steps.
The concept of intentional and proactive operator behaviour has implications for
the coupling of human and machine decision styles. Amalberti (1992) has identified a
“gap” in man-machine coupling: support systems use formal logic while humans use
qualitative reasoning and heuristics. In the context of aviation, for instance, pilots may try
to increase their confidence in the automation of short-term activities in order to devote
more effort to long-term planning and to preserve their original flight plans. This may
create a potential “gap” in decision styles as operators anticipate future events whilst
machine advisors try to solve problems as they emerge. Pilots may need information and
advice on problems that have not yet emerged or that may just possibly occur in a given
flight scenario. The implication is that machine advisors should be able to anticipate
operator information needs and responses in order to keep pace with operator “planning
ahead” strategies. This is not an easy task but research in common mental models (e.g.,
Rouse et al., 1992; Salas et al., 1994) suggests a way forward to bridge the gap between
human and machine decision styles.
Finally, cognitive ergonomics also considers the organisational culture that can
affect how computerised procedures or decision aids are used. Although the use of
computer assistance is generally at the discretion of the operators, deviations from the
recommended strategy may be investigated at a later stage after the management of the
emergency. The policy of a particular organisation may raise certain expectations about the
use of such aids that are likely to affect how operators will trade off their criteria for
accepting or rejecting machine advice. Woods et al. (1994) discuss this authority-
responsibility bind in greater detail.
The introduction of advanced technology may change the cognitive demands of operating
teams. For instance, the operator must assume the role of a process manager or supervisor
in an effort to co-ordinate a number of “cognitive agents”, such as automated systems,
computerised procedures, decision aids, and intelligent interfaces with predictive
capabilities. Despite the lack of opportunities for practice, the operator may also be
required to retain competence in a range of tasks carried out by automation in order to
compensate for possible failures of the technology.
This situation has some clear implications for training. Firstly, operators need to
become familiar with using the new technology. Training is therefore required to help
operators master the new forms of interaction. A major issue is how to train operators to
overcome negative transfer effects that arise from well-established attitudes and strategies
developed in the context of conventional interfaces. Secondly, operators need to develop
managerial or supervisory skills to co-ordinate the performance of the machine agents.
Sheridan (1992) provided extensive discussions on the nature of support that must be
provided to supervisory controllers. One of the unresolved issues is how to provide
operators with real-time support of their performance. Machine advisors must be able to
anticipate decision requirements of operators, as if both operator and machine are
following a common script (Lockhart et al., 1993). Recent work in common mental
models (Rouse et al., 1992) suggests some useful approaches to the development of on-line
support systems.
Thirdly, operators will still be needed to assume manual control or carry out fault
diagnosis in cases where the technology fails or when unanticipated events occur.
Therefore, traditional manual control skills and fault diagnosis must be retained although
they may only be used infrequently. This is not an easy task, since complex cognitive skills
must be developed progressively as in the case of manual systems. Moreover, as Lockhart
et al. (1993) pointed out, it is necessary to move through each stage in the development
from novice to expert. Hopkin (1992) notes that there is a “large cognitive gap” between
operators who develop a solution themselves (i.e., as in manual systems) and those who
choose a solution from a set of computer-generated alternatives. Overall, operators are
required both to master new skills and to retain competence in their traditional process
control skills. It is unfortunate if system designers continue to regard operators of
advanced technology as manual operators who just happen to be in charge of more
complex equipment.
The introduction of advanced technology has important implications for the total amount
of communications required and the distribution of tasks to different personnel roles.
Computer-mediated communication tends to replace verbal communication. The obvious
advantage is more accurate transmission of messages, but this may also suppress or
eliminate important cues about the activities of other team members by removing the party
line effect. Tracking the process status may be difficult if it is possible for operators to
interact with the system without other operators knowing about it. This problem is
exacerbated when two experienced operators have different strategies for working with the
system; if normal communication is lacking, it may be difficult to anticipate what the
other operator is doing and when. The result of these changes is a loss of
awareness about the activities of other members.
The extent to which the user interface may facilitate or hinder shared cognition
across the crew is an area that needs to be addressed by the design guidelines (Salas et al.,
1994). A similar area concerns the use of computerised procedures that may make it
difficult for operators to keep track of what the other crew members are doing. Further
problems in using procedures in the context of crew dynamics relate to task delegation
among different crew members. Procedures usually make few assumptions about staffing
levels in the control room and may overload operators by specifying a number of
procedural tasks to be carried out in parallel.
Many traditional team roles - such as assistance with task overload, development
of trust and confidence in team work, judgements of individual strengths and weaknesses -
may become more difficult for operators to fulfil if computer assistance only helps with
individual tasks. There may also be restrictions to the role of supervisor since
computerised systems can partly disguise substandard performance; an operator may thus
become inadequate or incompetent without this becoming apparent to anyone else. This
may reduce opportunities for supervisors to make judgements regarding career
developments, promotions, and re-training needs.
5. GUIDELINES FOR SYSTEM LEVEL FUNCTIONS
In addition to the specific functions, guidelines need to address the system level functions,
that is, the implications of the coupling or interaction between specific design features.
The most appropriate approach is to use coupled (simulation) models of the system and the
operators, to enable a systematic investigation of the dynamics of the interaction. Although
such joint system simulations are being pursued in a number of areas (e.g., Corker &
Smith, 1993; Fujita et al., 1993; Sasou et al., 1993) they are still at an early stage of
development. A much simpler approach is to map the dependencies between the level of
automation, the “hard” task features (e.g., interface design and computer support), and the
“soft” task features (e.g., training, task distribution, and communication). There are two
different approaches to map the dependencies between various design features. In the most
prevalent technology-driven approach (as shown in Figure 3), the level of automation is
mainly determined by concerns of safety and efficiency, together with technological
innovations.
The level of automation has consequences for the “hard” and the “soft” task features. The
“soft” aspects of the control room are usually the ones that must provide the necessary
flexibility and adaptability to meet the constraints of the “hard” aspects. Typically, “hard”
task features are independent factors while “soft” task features are dependent factors. For
instance, it has been a common practice to consider how training could meet interface
problems but not how the design could accommodate operator strategies. The alternative is
to use a cognitive ergonomics approach which can reveal more subtle interactions. This
approach is outlined in Table 1 as a first-order matrix of relationships between the main
factors in interface design.
Table 1. Basic matrix of the links between the main factors in interface design.
The matrix in Table 1 constitutes the first layer in a multi-layered description of the links
between various design aspects that can be used as the basis for a thorough analysis and,
ultimately, the development of detailed design guidelines. Read by rows, Table 1 describes
how the factor named in the first cell of each row influences the factors of each column.
For each cell (except for the diagonal) the general influence or dependency can be
described. The content of Table 1 is a first step to assess the impact of a design, at least in
terms of a general direction.
It is possible to develop this notion further in order to provide a more detailed set
of questions to guide design evaluation. Initially, this can be done by creating a second
order matrix, as shown in Table 2. The matrix describes the influence that a factor of a row
exerts on the factors of each column. The second row of the matrix, for instance, shows the
effects of the level of automation (LoA) on each of the four other design aspects. In the
discussion of these interactions that follows, the cells of Table 2 are identified first by row
and then by column (e.g., cell [row:column]), which reflects the direction of influence.
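The cell-addressing scheme can be made concrete with a small sketch. The snippet below (Python, purely illustrative; the factor codes and cell texts are taken from Table 2, while the data structure and helper names are invented here) stores each [row:column] cell in a lookup table and reads the matrix either by row (what a factor influences) or by column (what a factor depends on):

```python
# Sketch of the second-order matrix (Table 2) as a lookup structure.
# Cells are keyed (row, column): the row factor influences the column
# factor. Factor codes follow the paper: A = level of automation,
# I = user-plant interface, S = computerised support, T = training and
# staffing, C = crew communication and task delegation. Only the two
# rows of Table 2 quoted in the text are filled in.

FACTORS = ("A", "I", "S", "T", "C")

influence = {
    ("A", "I"): "Yes. New demands to status indication & automation intervention.",
    ("A", "S"): "Yes. LoA impacts diagnosis and planning/task scheduling.",
    ("A", "T"): "Yes. Crew must have sufficient knowledge and skills to use and understand automation.",
    ("A", "C"): "Yes. Operator tasks are largely defined by the LoA.",
    ("I", "A"): "No. Interface is often used to increase acceptability of automation.",
    ("I", "S"): "Yes. Interface of computer support must be integrated with user-plant interface.",
    ("I", "T"): "Yes. Crew must have sufficient knowledge and skills to operate plant.",
    ("I", "C"): "Yes. Interface may change crew roles and functions.",
}

def effects_of(factor):
    """Read a row of the matrix: all factors the given factor influences."""
    return {col: txt for (row, col), txt in influence.items() if row == factor}

def depends_on(factor):
    """Read a column of the matrix: all factors that influence the given one."""
    return {row: txt for (row, col), txt in influence.items() if col == factor}
```

Reading by rows then answers "what does a change in automation affect?", while reading by columns answers "what must be reconsidered if this aspect changes?", which is the evaluative use of the matrix described above.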
Table 2. A second order matrix of the links between the main factors in interface design
Table 2 contains four main clusters, which are expansions of the main factors shown in
Table 1. Cluster #1 shows the interaction among the three main elements of advanced
technology, namely, level of automation [A], user-process interface [I], and computerised
support tools [S]. Automation is seen as having an impact both on the user-process
interface and on computerised tools (cells [A:I] and [A:S]). The user interface, for
instance, must reflect new demands on status indication (cf. Section 4.2 on how to increase
automation transparency and overcome mode errors) while computerised procedures must
take account of any changes in diagnosis, planning or task scheduling introduced by
automated functions. On the other hand, automation is independent of user interface and
computerised support since the latter design aspects are usually changed to make
automation more acceptable and easier to use (cells [I:A] and [S:A]).
Table 2 also describes two different principles that may be used to introduce new
technologies. In cluster #2 changes in training, staffing and crew dynamics are used to
meet design requirements (or even compensate for design deficiencies), corresponding to
the principle of “fitting the man to the machine”. The alternative, represented by cluster #4,
could be called “fitting the machine to the man”. Here the design features of automation, user-
process interface, and computerised support take into account existing operator strategies
and crew dynamics. The latter approach requires tailoring the technology to established
work practices or the needs of a specific organisation; following this approach is likely to
minimise the adverse aspects of new automation, but it is rarely used in practice since “off-
the-shelf” technologies are cheaper to apply.
With respect to cluster #4, various examples have been cited in Sections 4.3 and
4.4 on designing user interfaces and computerised tools to achieve compatibility with the
operators’ mental models of the system and their operating strategies (cells [T:I] and
[T:S]). The issue of potential “gaps” in man-machine systems due to different styles of
reasoning and decision making (Amalberti, 1992) has implications for designing
computerised tools that may anticipate operator performance (Lockhart et al., 1993).
Section 4.5 also raised the point that the design of user interfaces and procedures should
facilitate shared cognition among team members (cells [C:I] and [C:S]). In this respect,
communication aspects influence the design of user interface and procedural support.
Finally, cluster #3 represents the reciprocal interaction between, on the one hand,
training and staffing and, on the other, crew communications and task delegation. For
instance, communication links - as affected by the introduction of advanced technologies -
may influence judgements of individual strengths and weaknesses by supervisors or team
colleagues and hence recognition of training needs (cell [C:T]). Reduced possibilities for
observation in team work are likely to affect professional norms, self-esteem, and training
needs. On the other hand, changes in staffing due to automation may have consequences
for communication and workload (cell [T:C]), especially in tasks seemingly unrelated to
introduced automated functions (Lee & Morgan, 1994).
The contents of the cells in Table 2 describe the relations between the various
design aspects in a general way. In many cases it may be possible to develop or identify
more precise descriptions of the dependencies and represent these in terms of models or
rules. It is possible to envisage further layers of the matrix containing more precise
descriptions of the dependencies between cells. If the level of detail is increased, the
matrix layer will eventually correspond to a complete application model for design
decisions. Yet even on a high level of abstraction, the formalised representation of the
interactions between the main aspects serves as a useful tool for control room design. By
using this type of approach, the system designer is constantly reminded of the fact that the
design is incomplete if it does not include a consideration of the behaviour of the joint
system. To illustrate the practicality of this approach, a sample of questions has been
specified in the attached Appendix to guide the development of design guidelines at the
conceptual level. These questions can be translated into a set of design guidelines
on the basis of a more complete application model.
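One way to picture such a refinement is sketched below (Python; the structure is invented for illustration and the evaluation questions are hypothetical examples, not taken from the paper's Appendix). Each cell of the matrix is expanded into a deeper layer holding a summary of the dependency and a checklist of conceptual-level questions:

```python
# Illustrative expansion of one matrix cell into a more detailed layer.
# The cell key follows the paper's [row:column] convention; the summary
# paraphrases Table 2, but the questions are hypothetical examples of
# the kind of conceptual-level guidance envisaged, not the paper's own.

layers = {
    ("A", "I"): {
        "summary": "Automation creates new demands on status indication.",
        "questions": [
            "Does the interface show the currently active automation mode?",
            "Can operators anticipate the next mode transition?",
            "Is manual intervention supported at every level of automation?",
        ],
    },
}

def checklist(cell):
    """Return the conceptual-level questions for a [row:column] cell."""
    return layers.get(cell, {}).get("questions", [])
```

As more cells are filled in at this level of detail, the layered structure approaches the complete application model for design decisions described above.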
6. CONCLUSIONS
The changing nature of industrial processes has affected the role of the operator. Classical
ergonomics was based on the assumption that all significant detail of the task could be
anticipated and addressed by the design, hence that the operator worked in a closed loop. It
was therefore - in principle - possible to distribute the tasks between operators and
automatic control systems and to specify information presentation and control such that an
appropriate performance would result. Cognitive ergonomics acknowledges that the
complexity of the system may defeat the design principles, and that operators therefore
also engage in open-loop control. This can happen when automation fails or when
unanticipated situations occur. It can also happen because the demand for closed-loop
control is reduced and the nature of the work therefore induces a tendency to look ahead and
plan. Designing for open-loop control is significantly different from designing for closed-
loop control. In open-loop control the emphasis of the system design must shift from feedback
and manipulation to providing the operator with an understanding of the process and
facilitating the activation of appropriate knowledge.
The design framework advocated here is based on a structured approach that has its
origins in the disciplines of cognitive ergonomics, cognitive psychology, and Cognitive
Systems Engineering (Hollnagel & Woods, 1983). The designer is first provided with a
number of design principles or concepts, as described in Section 3, which permeate the
proposed framework. The actual design methodology is specified at a generic level in
terms of a number of qualitative layers which examine the dependencies of various design
aspects in the context of operator tasks. It is conceivable that more specific guidelines can
be produced when Table 1 and Table 2 are expanded to more detailed layers of analysis.
This approach is also useful in helping designers to structure existing guidelines into
functional groups that correspond to the various “hard” and “soft” task features and
thereby to improve the usability of the guidelines.
A final comment should be made about the treatment of “soft” task features in
system design. It has often been the case that training and team work have been used to
compensate for design deficiencies. In the approach advocated here, the solution is rather
to include a degree of “design adaptability” to accommodate the problem-solving
strategies of operators, their different levels of competence, and the team dynamics. This
does not mean that the capabilities of new technology should be constrained to match
existing skills and types of knowledge. It rather serves as a caution that changes in the
style of interaction with new technology and with team members are likely to impact upon
the wider organisational culture of training, supervision, team work, and professional
norms and standards. Striking a balance between adaptability and innovation remains a
challenging design issue that calls for further developments in cognitive ergonomics.
Acknowledgement.
This paper has its origin in work done in a project on “Assessment Of Changes In
Cognitive Demands On Operations Personnel”, performed for the Nuclear Installations
Inspectorate, UK. The contribution of Dr. Craig Reiersen (NII) is gratefully acknowledged.
7. REFERENCES
Hollnagel, E. (1996). Decision support and task nets. In S. Robertson (Ed.) Contemporary
Ergonomics 1996 (Proceedings of the Annual Conference of the Ergonomics Society, April
10-12, Leicester). London: Taylor & Francis.
Hollnagel, E. & Woods, D. D. (1983). Cognitive systems engineering: New wine in new
bottles. International Journal of Man Machine Studies, 18, 583-600.
Hopkin, V. D. (1992). Human Factors issues in air traffic control. Human Factors Bulletin,
35(6), 1-4.
Hopkin, V. D. (1995). Human factors in air traffic control. London: Taylor & Francis.
Klein, G., Kaempf, G. L., Wolf, S., Thordsen, M. & Miller, T. (1997). Applying decision
requirements to user-centered design. International Journal of Human-Computer Studies,
46, 1-15.
Langlotz, C. P. & Shortliffe, E. H. (1983). Adapting a consultation system to critique user
plans. International Journal of Man Machine Studies, 19, 479-496.
Lee, J. D. & Morgan, J. (1994). Identifying clumsy automation at the macro level:
Development of a tool to estimate ship staffing requirements. Proceedings of the Human
Factors and Ergonomics Society 38th Annual Meeting, October 24-28, Nashville,
Tennessee.
Lind, M. & Larsen, M. N. (1995). Planning support and the intentionality of dynamic
environments. In J. M. Hoc, P. C. Cacciabue & E. Hollnagel (Eds.), Expertise and
technology: Cognition and human-computer interaction. Hillsdale, NJ: Lawrence Erlbaum
Associates.
Lockhart, J. M., Strub, M. H., Hawley, J. K. & Tapia, L. A. (1993). Automation and
supervisory control: A perspective on human performance, training, and performance
aiding. Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting.
October 11-15, Seattle, Washington.
Meister, D. (1985). Behavioural analysis and measurement methods. NY: Wiley.
Mossier, K. L., Palmer, E. A., & Degani, A. (1992). Electronic checklists: Implications for
decision making. Proceedings of the Human Factors Society 36th Annual Meeting. October
12-16, Atlanta, Georgia.
Norman, D. A. (1983). Design rules based on analyses of human error. Communications of the
ACM, 26(4), 254-258.
Norman, D. A. (1988). The psychology of everyday things. NY: Basic Books.
Norman, D. A. (1990). The problem with automation: Inappropriate feedback and interaction,
not “over-automation”. Philosophical Transactions of the Royal Society of London, B 327,
585-593.
O’Hara, J. M., Brown, W. S., Stubler, W. F., Wachtel, J. A. & Persensky, J. J. (1995). Human-
system interface design review guideline (NUREG-0700). Washington, DC: Nuclear
Regulatory Commission.
O’Hara, J. M. & Hall, R. E. (1992). Advanced control rooms and crew performance issues:
Implications for human reliability. IEEE Transactions on Nuclear Science, 39(4), 919 -
923.
Rasmussen, J. (1986). Information processing and human machine interaction: An approach to
cognitive engineering. Amsterdam: North Holland.
Reason, J. (1990). Human error. Cambridge, UK: Cambridge University Press.
Redding, R. E. & Seamster, T. L. (1994). Cognitive task analysis in air traffic controller and
aviation crew training. In N. Johnston, N. McDonald & R. Fuller (Eds.), Aviation
psychology in practice. Aldershot: Avebury Technical.
Rouse, W. B., Cannon-Bowers, J. A. & Salas, E. (1992). The role of mental models in team
performance in complex systems. IEEE Transactions on Systems, Man, and Cybernetics,
22, 1296-1308.
Salas, E., Stout, R. J. & Cannon-Bowers, J. A. (1994). The role of shared mental models in
developing shared situational awareness. In Gibson, R. D., Garland, D. J. & Koonce, J. M.
(Eds.), Situational awareness in complex systems. Daytona Beach, FL: Embry-Riddle
Aeronautical University Press.
Sarter, N. B. & Woods, D. D. (1995). How in the world did we ever get into that mode? Mode
error and awareness in supervisory control. Human Factors, 37(1), 5-19.
Sasou, K., Yoshimura, S., Takano, K., Iwai, S. & Fujimoto, J. (1993). Conceptual design of
simulation model for team behaviour. In E. Hollnagel & M. Lind (Eds.), Designing for
simplicity. Proceedings of 4th European Conference on Cognitive Science Approaches to
Process Control, August 25-27, Hillerød, Denmark.
Sheridan, T. B. (1992). Telerobotics, automation, and human supervisory control. Cambridge,
MA: MIT Press.
Wiener, E. (1985). Beyond the sterile cockpit. Human Factors, 27(1), 75-90.
Woods, D. D. (1986). Paradigms for intelligent decision support. In E. Hollnagel, G. Mancini
& D. D. Woods (Eds.), Intelligent decision support in process environments. Berlin:
Springer-Verlag.
Woods, D. D. (1993). Process-tracing methods for the study of cognition outside of the
experimental psychology laboratory. In G. A. Klein, J. Orasanu, R. Calderwood & C. E.
Zsambok (Eds.), Decision making in action: Models and methods. NJ: Ablex Publishing.
Woods, D. D., Wise, J. A. & Hanes, L. F. (1982). Evaluation of Safety Parameter Display
Concepts. EPRI, NP-2239, Electric Power Research Institute, Palo Alto, California.
Woods, D. D., Roth, E. M. & Bennett, K. B. (1990). Explorations in joint human-machine
cognitive systems. In S. P. Robertson, W. Zachary & J. B. Black (Eds.), Cognition,
computing, and cooperation. New Jersey: Ablex Publishing.
Woods, D. D. & Sarter, N. B. (1992). Evaluating the impact of new technology on human-
machine cooperation. In J. A. Wise, V. D. Hopkin & P. Stager (Eds.), Verification and
validation of complex systems: Human factors issues. Berlin: Springer-Verlag.
Woods, D. D., Johannesen, L. J., Cook, R. I. & Sarter, N. B. (1994). Behind human error:
Cognitive systems, computers and hindsight. Wright-Patterson AFB, OH: Crew Systems
Ergonomics Information Analysis Center.
[Diagrams not preserved in extraction; recoverable labels: structured analysis; goals of
application; constraints, functional requirements; design guidelines (concepts, principles,
criteria); evaluation; technological options; automatic control system; displays & controls;
measurements; process; results or products; disturbance or noise; efficiency needs; “soft”
task features (task distribution, communication).]
Table 1. Basic matrix of the links between the main factors in interface design
Rows denote the influencing factor and columns the influenced factor: (A) level of
automation, (I) user-plant interface, (S) computerised support, (T) training/staffing, and
(C) crew communication. Cluster #1 covers the cells among (A), (I), and (S); cluster #2 the
cells from (A)/(I)/(S) to (T)/(C); cluster #4 the cells from (T)/(C) to (A)/(I)/(S); and
cluster #3 the cells among (T) and (C).

(A) Level of automation (LoA):
  [A:I] Yes. New demands to status indication & automation intervention.
  [A:S] Yes. LoA impacts diagnosis and planning / task scheduling.
  [A:T] Yes. Crew must have sufficient knowledge and skills to use and understand automation.
  [A:C] Yes. Operator tasks are largely defined by the LoA.

(I) User-plant interface:
  [I:A] No. Interface is often used to increase acceptability of automation.
  [I:S] Yes. Interface of computer support must be integrated with user-plant interface.
  [I:T] Yes. Crew must have sufficient knowledge and skills to operate plant.
  [I:C] Yes. Interface may change crew roles and functions.

Table 2. A second order matrix of the links between the main factors in interface design.
APPENDIX
A sample of questions to guide the development of design guidelines at the conceptual level