

General Aviation Automation Risk and the Research


2014 AAJ Summer Convention
Mike Danko, Danko Meredith

Anything that can be automatically done for you can be automatically done to you.
-Wyland's Law of Automation

Is the glass cockpit a safety feature? Or a hazard? How, exactly, does automation get the
pilot into trouble? This paper reviews five studies, dating from 1977 to 2013, that bear on those
questions. Each of the studies approaches the automation risk from a different angle. The
disciplines include statistics, experimental research, and human factors.

Though the studies touch upon how cockpit automation causes pilots to come to grief,
they do not draw conclusions as to why. Some of the studies, however, note that the logic of the
automation is fundamentally different from the logic the pilot would use to control the
aircraft. Thus, once the automation is engaged, the pilot cannot meaningfully participate in the
decision making process. Rather, he can only monitor the results that the process
generates. That makes it difficult for the pilot to detect errors until after the aircraft has deviated
from the desired course. Other studies note that, once the pilot has detected an error, he
frequently becomes preoccupied with fixing the automation rather than controlling the
airplane. Perhaps the fixation is due to the pilot's unwillingness to admit that he doesn't fully
understand the automation or, more accurately, the automation's logic. Perhaps, as a monitor
rather than a participant, he is uncomfortable with jumping back in and hand flying the aircraft
because he has, to some extent, lost situational awareness.

Some studies suggest the solution is to provide pilots with better training. At least one
author notes, however, that to understand and thereby promptly remedy the many errors that
the automation can produce, the pilot needs to understand the system logic to a degree that is
simply not practical.

oo0oo


Operational Use of Flight Path Management Systems: Final Report of the Performance-
based Operations Aviation Rulemaking Committee/Commercial Aviation Safety Team Flight
Deck Automation Working Group (Federal Aviation Administration September 2013).


This lengthy FAA study includes many findings. Though the study deals with
commercial aviation, much of it is applicable to General Aviation. Among the study's
conclusions:

Pilots sometimes rely too much on automated systems and may be reluctant to
intervene when things aren't working out as expected.
Mode confusion errors continue to occur despite industry efforts to eliminate them.
Programming errors continue to occur.
Data entry errors may cause significant flight path deviations leading to accidents.
Pilots sometimes lack sufficient or in-depth knowledge to efficiently and effectively
ensure the automation causes the aircraft to follow the desired flight path.
Current pilot training may not be adequate.

Airspace procedures and clearances are not always compatible with the aircraft's
automation.

The studys authors observe that pilots train in automated systems primarily by watching
things happen. When reverting to manual flight, they are more likely to react to what they
experience, rather than be proactive and stay ahead of the aircraft.

To reduce the chances of mode confusion errors, the study recommends:

improving training regarding autoflight mode awareness;
reducing the complexity of autoflight modes from the pilots perspective; and
improving the feedback to pilots concerning status changes such as autopilot mode
transitions, while ensuring that the design of the mode logic assists with the pilot's
intuitive interpretation of failures and reversions.
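To make the last recommendation concrete, here is a minimal sketch, invented for this paper rather than drawn from the FAA report, of what announcing every mode transition might look like. The class, mode names, and callback are all hypothetical:

```python
# Hypothetical sketch of the feedback recommendation: every autopilot mode
# transition -- commanded or automatic -- is announced to the pilot rather
# than happening silently.

class Autopilot:
    def __init__(self, announce):
        self.mode = "HDG"          # current lateral mode
        self.announce = announce   # callback that alerts the pilot

    def set_mode(self, new_mode, reason):
        old = self.mode
        self.mode = new_mode
        # Announce every transition and why it happened, even uncommanded ones.
        self.announce(f"MODE {old} -> {new_mode} ({reason})")

    def nav_signal_lost(self):
        # An automatic reversion that would otherwise be easy to miss.
        self.set_mode("ROL", "NAV signal lost; reverted to roll hold")


alerts = []
ap = Autopilot(alerts.append)
ap.set_mode("NAV", "pilot selected NAV")
ap.nav_signal_lost()
print(alerts[0])  # MODE HDG -> NAV (pilot selected NAV)
print(alerts[1])  # MODE NAV -> ROL (NAV signal lost; reverted to roll hold)
```

The point of the design is that the reversion carries its reason with it, so the pilot is told about the state change instead of having to infer it from aircraft behavior.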

Introduction of Glass Cockpit Avionics into Light Aircraft (NTSB Safety Study SS-10/01
(2010).)

This NTSB study was designed to test the hypothesis that the advent of glass cockpits in
GA aircraft would improve safety of their operation. The study included three separate analyses:

A statistical comparison of glass-equipped aircraft to identical aircraft that are
conventionally equipped.
A review of training resources available for glass cockpit aircraft.
A review of certain accidents involving glass-equipped aircraft to identify emerging
safety issues.



The study concludes:

Glass cockpit-equipped aircraft experienced proportionately fewer total accidents
than a comparable group of aircraft equipped with conventional round gauges.
Glass cockpit-equipped aircraft experienced a higher rate of accidents in IMC.
Glass cockpit-equipped aircraft had a significantly higher percentage of fatal
accidents.

Overall conclusion: study analyses did not show a significant improvement in safety for the
glass cockpit study group.

The study offers some possible explanations for the lack of improvement in safety:

Pilots are not provided all the information necessary to understand the equipment
installed in their aircraft.
Training is not keeping up with the advances in technology.
In many cases it is neither appropriate nor practical to train for all anticipated types
of glass cockpit avionics failures and malfunctions in the aircraft.


George King, General Aviation Training for Automation Surprise (International
Journal of Professional Aviation Training & Testing Research (2011).)

This study offers theory but little data.

Author's Thesis: Automation surprise cannot be eliminated from GA technologically
advanced aircraft because the digital logic implicit in the programming of highly sophisticated
flight and navigation systems cannot be adequately replicated by human operators.


There are two types of automation surprise in GA aircraft:

1. An unexpected or uncommanded system mode change, such as when the autopilot
switches from NAV mode to PIT mode.
2. Unexpected result from commanded change, such as when the autopilot fails to
capture the Glideslope.

In either case, the pilot is left confused with no immediate idea of what action should be
taken to correct the situation.
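The two surprise types can be sketched as a toy state machine. Everything below except the mode names NAV and PIT is invented for illustration; the transition rules are not from the study:

```python
# Toy autopilot model illustrating both types of automation surprise.

class ToyAutopilot:
    def __init__(self):
        self.mode = "NAV"
        self.glideslope_valid = False  # no usable glideslope signal

    def pitch_fault(self):
        # Type 1: uncommanded mode change -- a fault silently drops the
        # autopilot from NAV into basic pitch (PIT) mode.
        self.mode = "PIT"

    def arm_glideslope(self):
        # Type 2: a commanded change that quietly fails -- the glideslope
        # is never captured because the signal is invalid, and nothing
        # tells the pilot why.
        if self.glideslope_valid:
            self.mode = "GS"


ap = ToyAutopilot()
pilot_expects = "NAV"
ap.pitch_fault()                 # type 1: the pilot commanded nothing
print(ap.mode == pilot_expects)  # False: the mental model has diverged

ap2 = ToyAutopilot()
ap2.arm_glideslope()             # type 2: the command has no effect
print(ap2.mode)                  # NAV -- still not coupled to the glideslope
```

In both branches the system's actual state and the pilot's expectation diverge without any signal to the pilot, which is the essence of the surprise.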

The author suggests that the root of the problem is that conventional aircraft present the
pilot with raw data to process. TAA aircraft take the raw data, process it, and present the pilot
with the product. The pilot is left out of the loop of the processing, and in some cases, it is
simply impossible for the pilot to remain aware of the state changes occurring within the
system. No training, short of a study of the low-level software design of the system, would be
adequate for the user.

When conditions become abnormal . . . the pilot is suddenly in a position where the
sequence of events which led to the present condition is unknown. Even more serious is the
possibility that the condition in which the pilot finds the aircraft has been exacerbated by the
automation system itself as the system may have already applied several layers of error
correction to deal with the originating problems without informing the pilot of these actions.
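The idea of hidden layers of error correction can be illustrated with a small, invented example: a conventional display hands the pilot the raw deviation to interpret, while the automated system consumes the same data, silently applies successive corrections, and presents only the end product. The functions and numbers below are hypothetical:

```python
# Invented illustration of raw data vs. processed product. The automation
# applies layered corrections without reporting them, so the pilot sees
# only the outcome, never the intermediate state changes.

def conventional_display(course_deviation_deg):
    # Raw data: the pilot sees the deviation and decides what to do.
    return f"CDI deflection: {course_deviation_deg:+.1f} deg"

def automated_system(course_deviation_deg):
    hidden_corrections = []
    deviation = course_deviation_deg
    # The automation silently halves the deviation each correction cycle...
    while abs(deviation) > 0.5:
        deviation /= 2
        hidden_corrections.append(deviation)
    # ...and presents only the product.
    return "ON COURSE", hidden_corrections


print(conventional_display(4.0))      # CDI deflection: +4.0 deg
result, hidden = automated_system(4.0)
print(result)                         # ON COURSE
print(len(hidden))                    # 3 corrections the pilot never saw
```

When the automation later fails, the pilot inherits a situation shaped by those three unseen corrections, which is the scenario the quoted passage describes.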




Hugh Bergeron, General Aviation Single Pilot IFR Autopilot Study (Langley Research
Center (1981).)

General aviation pilots were recruited to fly instrument approaches on the NASA Langley
general aviation simulator. In addition to flying the approach, pilots were given math-based side
tasks (such as velocity-time-distance problems) to complete as time allowed. The pilots were
told not to allow the completion of the side tasks to interfere with flying the approach. The
pilots' performance was assessed at five levels of automation:

1. No automation
2. Wing leveler only
3. Heading select only
4. Heading select with lateral navigation coupler
5. Full automation. Heading select with lateral nav coupler, altitude hold with vertical
nav coupler and glideslope coupler

Workload Reduction: One purpose of automation is to decrease the workload required to have
the aircraft fly its desired path, and thus free the pilot to attend to other matters. The pilots'
performance on the side tasks did indeed improve when using a wing leveler, and again
when using the autopilot's heading select. Surprisingly, however, the pilots' side-task
performance showed no additional improvement when using more sophisticated features
such as lateral or vertical nav couplers. In other words, autopilot features beyond heading select
did not seem to decrease the primary workload associated with flying the aircraft.


One interpretation of this phenomenon is that beyond the [heading] mode the subject trades off
the workload associated with flying the control task for the workload required to monitor the
autopilot's control of the flight task. In short, engaging automation beyond heading mode may
be more trouble than it is worth, at least from a workload-reduction perspective.

Precision: Not surprisingly, the aircraft's ground track appeared smoother and more accurate
with automation engaged. However, once again, there was little incremental improvement in
flight path precision as automation increased beyond heading mode.

Problems associated with automation:

1. Pilot loses situational awareness. Subjects were more likely to lose track of where
they were in the approach as automation increased. It seemed that in monitoring the
autopilot [pilots] would associate instrument readings with autopilot functions rather
than to situational awareness. Therefore, if the autopilot functions were either set
incorrectly or interpreted incorrectly, the subject would frequently perform the wrong
task, thinking that everything was normal. Pilots using the automation tended to
monitor that the aircraft was doing as the autopilot had commanded, rather than what
was correct.
2. Pilot becomes engrossed. Once the aircraft is fully coupled, the pilot becomes
engaged in side tasks to the exclusion of all else.
3. Pilot falls into trance. Subjects reported that when flying fully coupled, they would
simply forget to perform the side tasks at all.


Other notes: (1) Pilots subjectively reported that fully automated approaches were easier
to fly than those in which only heading select was used, though the data suggest otherwise.
(2) The number of blunders increased with the level of automation. Most mistakes were
made during the fully automated approaches.


Dennis Beringer, Howard Harris Jr., Automation in General Aviation: Two Studies of
Pilot Responses to Autopilot Malfunctions. (FAA, Civil Aeromedical Institute, Office of
Aviation Medicine (1977).)


Both studies focused on pilots' reactions to autopilot malfunctions in a simulator
configured as a Piper Malibu. The failures were mechanical servo/runaway trim type failures
rather than more subtle mode failures. In the first study, some pilots took up to 47 seconds to
respond to the failure, and in the second over 100 seconds. Unfortunately, the studies did not
analyze why it took so long for the pilots to recognize and react to the failures. Regardless, these
studies are of limited usefulness because they deal with pilots' reactions to rather obvious
autopilot errors, such as runaway trim, rather than the type of more insidious errors that lead to
mode confusion.



René Amalberti, Automation in Aviation: A Human Factors Perspective (Aviation
Human Factors (Chapter 7) (1998).)

This chapter is a review of various published studies supporting the following
conclusions or observations:

Fiddling with the flight management system makes pilots lose their awareness of the
passing of time, and further, their awareness of the situation and of the flight path ...

Problem occurs most frequently on approach with runway changes.
Logic involved in reprogramming the automation when confronted with an ATC
change differs greatly from the pilot's own logic.
Problem is exacerbated when using autopilot modes that sequence automatically
without pilot input (i.e., mode confusion results when the autopilot's logic differs
significantly from the pilot's logic).
How a pilot suffering from mode confusion tends to react:

Pilot more likely to blame himself for not understanding the automation than to blame
the automation for questionable aircraft behavior. Pilot squanders time, fails to
intercede, and allows situation to worsen as he tries to understand the logic of the
automation.
Because the pilot does not understand the automation, he tends to accept a greater
deviation from the autopilot than he would a human copilot.
Once the pilot decides he needs to override the automation, he will override only the
automated procedure that is deviating from the goal, and he will invest excessive time
to save the rest of the automation.

Takeaway: Compared to an error the pilot himself induces without automation, a pilot
takes longer to recognize automation-related errors, tolerates them longer, takes longer to rectify
them, and generally handles them less decisively.
