
A Heads Up Display based on a DGPS and Real Time Accessible Geo-Spatial Database for Low Visibility Driving

Heon-Min Lim, Bryan Newstrom, Craig Shankwitz and Max Donath
ITS Institute and the Dept. of Mechanical Engineering
University of Minnesota
Minneapolis, MN 55455

ABSTRACT

Poor visibility conditions are a causal factor in many crashes each year. The inability to see road boundaries, obstacles, and other vehicles on the road is especially of concern during the winter. Snowplow drivers, in particular, are affected because they must often drive when visibility is poor, when roads are snow covered and lanes are hard to discern. There are serious safety and economic repercussions if snowplows cannot perform their function.

We are investigating how to effectively use the vehicle's sensed location in the lane (using high accuracy RTK DGPS) to provide feedback to the driver based on a conformal heads-up display (HUD). When visibility is poor, this latter technology provides the driver with lane boundary information, corrected for eye perspective, projected onto the windshield. A prototype HUD has been experimentally verified and demonstrated on a truck. We have also developed an in-vehicle geo-spatial database query processor that allows us to extract the needed data in real time. As the vehicle moves along the road, the vehicle's DGPS derived position is used to pull the data from this "digital map" and provide it to the HUD's graphics processor. The system allows the driver of the vehicle to see the 'computed' road boundaries projected directly over the 'actual' road boundaries even if they are obscured by snow, rain, or darkness.

We have documented and quantified the latency and accuracy errors associated with projecting geographic information onto the HUD. Image frame rates exceeding 10 Hz have been demonstrated. Typical latency for computation and display is less than 20 ms, and display projection error is equivalent to 0.5 meters at 120 meters (as viewed by the driver). This is equivalent to about a 0.25 degree sighting angle. Database access times are 20-40 ms with 5-10 ms UDP packet transfer times; a typical query using existing databases is 1-3 packets long.

INTRODUCTION

Driving a vehicle has become a daily part of life for a significant portion of the developed world, including trips to the workplace or school, access to medical care, shopping for food, hauling freight or transporting others. Whether for convenience or for economic justification, most of us spend time on the road every day, at the very least sitting in a vehicle driven by another party.

There have been efforts to develop systems to help drivers reach their destinations safely and efficiently ever since the appearance of the first automobile. Road signs, traffic signals, maps, steering wheels, automatic transmissions, cruise control and virtually everything in the vehicle and on the roadside can be considered a result of these efforts.

Recent developments in vehicle based Intelligent Transportation Systems (ITS) are too numerous to list. Vehicles already on the road today have incorporated technologies that decades back were considered only in the context of military systems, ranging from navigation systems that include speech recognition, to wireless communication and sensing, from collision avoidance and warning systems to advanced vehicle monitoring and control systems. High speed computer processors as embedded systems integrating new sensor and actuator technologies have become a vital tool both to assist the driver with the driving task and to protect the driver in the event of mishap.

Developments in the last five to ten years have led to the commercial availability of more accurate, higher bandwidth, real time DGPS and of digital maps. It is now possible to present more detail and higher fidelity
information to the driver than was ever possible in the past. How do we use and present this information to the driver in ways that help rather than distract? In this paper, we will describe and evaluate a conformal heads up display technology that takes advantage of new sensing and computational modalities and assists the driver with guiding a vehicle in poor visibility, whether due to snow, fog, rain or inadequate ambient lighting. We will focus on the issue of providing guidance to the driver to ensure that the vehicle stays in its lane rather than on the navigation issue of providing street by street routing instructions.

While the vehicle is in motion, real time information must be conveyed to the driver without distracting his/her mental attention from the roadway. How to effectively perform this task is one of the most important research topics in the development of in-vehicle information display systems. The physical characteristics of the devices that display the information and the choice of format and content of that information are all major concerns in maximizing driver information processing and minimizing driver workload. In this paper, we will describe a series of experiments that address the technical capabilities of the first generation prototype system.

A Heads Up Display (HUD), which presents visual information in the drivers' forward-looking visual field, is conceptually ideal for presenting navigation or guidance information to the driver [1, 2]. By presenting superimposed images in the drivers' visual field of view, a HUD can reduce the amount of time needed for a driver to process display information while maintaining attention on important events in the outside world [2, 3].

In forward-looking, field of view based systems (with characteristics not unlike feedback control systems with preview), visual information is the most important sensory input for the major driving tasks of lane keeping and collision avoidance [4, 5, 6, 7]. The effect of forward-looking information or preview in driving is well described by Modjtahedzadeh and Hess [8]. Results from studies investigating visual guidance in the field of aviation [9, 10, 11, 12, 13] are applicable to ground based driving as well. For example, Grunwald et al. [11] showed, in a simulated environment, that a perspective tunnel display could visually guide a pilot to maneuver an airplane. They also found that a predictive display symbol that provides preview information could enhance manual guidance. In their experiment, a prediction of 137.2 - 228.6m (450ft - 750ft, at 75mph), which is 4 - 7 seconds of prediction time, resulted in the best performance.

Rockwell et al. [6] found that automobile driving performance during the night was best when drivers could see at least 75ft to 100ft ahead (driving speed was 20mph-30mph). Even though there have been many systems designed in the past to assist the driver with lane keeping, most of them have resulted in distracting the driver from their primary task. Steinfeld and Green [14] found that a full windshield HUD could indeed improve driving performance. Drivers' response for perspective projection was 20% faster than for instrument panel type displays.

DESIGN GOAL OF THE HEADS UP DISPLAY

A conformal augmented display executed via a HUD was designed to assist drivers by presenting visual information in the forward-looking field of view. A perspective projection based image of the lane boundaries was selected in order to minimize the drivers' mental workload in recognizing the presented visual information. The driver sees through the projected screen (augmentation) which displays an image generated by the computer and superimposed on the real outdoor scenery (conformal mapping). Drivers must perceive the projected road boundaries as real and thus drive their vehicle within the driving lane.

SYSTEM CONFIGURATION

The HUD system is composed of the following hardware components: high accuracy DGPS, PC and PC104 computers, a video image projector, and a combiner.

A Novatel RT-20 DGPS was used as the primary position sensing system. In previous experiments [15, 16], we have found that its performance readily meets its specifications of 20cm accuracy while operating at a 5 Hz sampling rate.
Figure 1. Typical driving situation with the HUD combiner (the gray rectangular lens) in the field of
view. The HUD is projecting the accentuated lines (displayed as yellow superimposed on the lane
boundaries in the original) in the central portion of the lens with 0.5m tall virtual posts drawn on the
right boundary.

There are two computers used for the generation of the HUD: a geo-spatial map database manager and a graphic image generator. The processor (QNX, PC-104) used for map database management was also used for collecting other sensory information from the vehicle. The second processor (Windows98, Dell Inspiron 3000 notebook) was used only for generating the screen projection. Query results from the geo-spatial database are delivered through a local Ethernet network. Details about the communications between the database managing processor and the graphics processor are explained in the geo-spatial database section.

An image projector and combiner system manufactured by Delco was used for prototype development. This HUD was originally designed and sold for displaying character based information in patrol cars. Its NTSC video-displaying feature was used for the HUD development described in this paper. Video output from the graphics processor was fed into a scan converter which converts VGA input into the needed NTSC video signal. The unit projects the image onto a combiner, a partially-reflective but essentially transparent, optically correct plastic lens. The light reaching the eyes is a combination of the light passing through the lens and light reflected from the projector. The driver actually sees two images superimposed together. The image passing through the combiner comes from the actual frontal field of view while the reflected image is generated by the graphics processor. The combiner's optical characteristics allow it to generate a virtual screen projected to float 30 - 40ft ahead of the unit. This feature, which results in a virtual focus in front of the vehicle, ensures that the driver's eyes do not have to focus back and forth between the real image and the virtual image, thus reducing eye strain.

The HUD software was developed using Microsoft Visual C++ 6.0 under the Windows 98 operating system, together with the QNX real-time operating system software development tools.

The position of the vehicle and the information stored in the map database is expressed in terms of a global coordinate frame located in the North American Datum 1983
(NAD83) Minnesota South State Plane coordinates in units of meters. The raw DGPS string, which contains latitudinal and longitudinal angle data coming from the DGPS, was converted from degrees into global coordinates. The vehicle coordinate frame was defined as a frame moving with the GPS antenna mounted on top of the vehicle. A local coordinate frame was attached to the driver's eye, see Figure 2. From his/her view point, straight forward was defined as positive y, an axis to the right was assigned as positive x, and straight up was positive z. The extracted map data was converted into a local coordinate frame that moves with the vehicle.

Figure 2. Coordinate Systems (the eye coordinate system at the driver's eye, the vehicle coordinate system (VX, VY, VZ) at the GPS antenna, and the global coordinate system with origin OG; road location data points are denoted Pk (Xk, Yk, Zk))

EXPERIMENTAL SETUP

A Navistar International truck was used as the test bed for the experiments. No yaw rate gyro or magnetometer was used for estimating the vehicle's heading angle. The vehicle's heading angle was estimated only from the trajectory of recent DGPS position values. A simple difference method that calculates the angle of the vector from a past position of the vehicle to the current position was used to estimate vehicle heading. By increasing the look-back distance, noise in the heading estimate was attenuated. The effect of the look-back distance on the heading angle estimate will be discussed in the following experimental result section.

Live video images were captured while a driver was driving the test truck on a road. Position data coming from DGPS was also simultaneously stored. A Canon Optura digital video camcorder was used to record the projected HUD screens during actual driving situations. The camcorder was mounted at the driver's right eye position using custom mounting brackets. The optical image stabilizer in the camcorder was enabled while taking video images. The digitally stored images were transferred to a PC, then processed and analyzed. To synchronize the beginning of the video image stream, a special mark was put on the video screen by the HUD software when recording was started.

Sampling for error analysis was done at two second intervals along the centerline. A special grid mark was drawn to synchronize these two second intervals as shown in Figure 3. In Figure 3, yellow lines (which may not be apparent in a gray scale image reproduction) are computer generated. The three segments along the centerline are reference marks to measure mismatch error. Each horizontal line segment is 0.5m long, and the gap between two line segments is also 0.5m. The height of the vertical mark is also 0.5m.

Errors associated with the projected lane boundaries are computed by comparing the distance the projected lines are displaced from the actual lane boundary. The lateral displacement of the projected lane boundary from the actual lane boundary at any of the three marks is computed knowing that the length of the reference mark is 0.5m. This lateral displacement is then normalized by dividing by the distance to the camera, which yields a value for the normalized visual sighting angle.

Figure 3. Reference marks for error analysis (blown up from central portion of Figure 1)

The error was measured at three different 'look-ahead' distances: 60m (196.8ft), 90m (295.3ft), and 120m (393.7ft) as measured from the driver's eye. The topmost horizontal grid mark, i.e. the furthest one in Figure 3, is 120m ahead.
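The look-back heading estimator described above can be sketched as follows; this is a minimal illustration, and the function and variable names are ours rather than taken from the authors' software:

```python
import math

def estimate_heading(track, look_back):
    """Estimate vehicle heading from a trajectory of DGPS fixes.

    track: list of (x, y) positions in a global frame, newest last.
    look_back: distance (m) behind the current fix used to form the
    difference vector; larger values smooth noise but add lag.
    Returns the heading in radians from the +x axis, or None if the
    stored trajectory is shorter than the look-back distance.
    """
    cx, cy = track[-1]
    # Walk backwards until a fix at least look_back metres away is found.
    for px, py in reversed(track[:-1]):
        if math.hypot(cx - px, cy - py) >= look_back:
            return math.atan2(cy - py, cx - px)
    return None

# A straight north-east run sampled at 5 Hz (0.2 s between fixes):
path = [(0.1 * i, 0.1 * i) for i in range(100)]
print(estimate_heading(path, 0.5))  # pi/4, i.e. 45 degrees in radians
```

Walking back along the trajectory until the displacement exceeds the look-back distance is exactly the trade-off discussed in the results section: a longer baseline attenuates DGPS noise but adds a longer time constant during transients.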
The objective of the conformal HUD is to construct and project road characteristics onto the display screen (or combiner) so that they exactly match the real road characteristics. To quantify how well this is done, the "visual sight angle" (vsa) was used to describe the mismatch error. The visual sight angle is defined by the ratio of the actual lateral error associated with the lane projection and the distance to the eye point, as shown below:

    vsa = (lateral error at distance x) / (distance x).

The visual sight angle normalizes error along the depth of the visual field and captures the error that is perceived by the driver. All our experimental results are described in terms of this visual sighting angle based error.

All experiments were performed on the Mn/Road low volume, two lane test road. Mn/Road is a facility maintained by the Minnesota Department of Transportation (Mn/DOT) to test road construction materials and methods. The road is isolated from public traffic, allowing a driver to avoid interactions with other vehicles. The experiments were performed on the straight sections of the road.

To measure the effect that the heading angle estimation error has on the accuracy of the projection of lane boundaries onto the HUD, a series of experiments were performed. In these experiments, the test truck was driven as shown in Figure 4. After full acceleration, the vehicle was driven to the left of the centerline, and then driven back into the normal driving lane.

EXPERIMENTAL RESULTS

The effect of look-back distance for estimating vehicle heading angle was measured at 10mph (Figure 4). In the graph, errors for three different points (60m, 90m, and 120m ahead) are drawn. Data was computed at 2 second intervals. At the beginning of the data (0 to approximately 10 seconds), the level of vsa may be attributed to vibration due to vehicle acceleration.

The broken interval between 40 to 45 seconds is when the lane boundaries are not displayed on the combiner. This is due to the size (4"x6") of the combiner. The lane boundaries fall outside the field of view. (This has been corrected with the subsequent development of a new, larger combiner and brighter projector.) It can be seen that the errors at all three distances match quite well.

Figure 4. Error in terms of vsa for various look-ahead distances for heading angle estimate based on look-back distance 0.5m, at 10 mph

Figure 5 displays the mismatch error or vsa at the 60 m ahead distance for look-back distances of 0.5m and 1.0m while the vehicle is moving at 10mph for the same maneuver shown in Figure 4.

Figure 5. Effect on error or vsa of look-back distances 0.5m and 1.0m at 10mph

Longer look-back distances result in a smoother vsa trajectory, but also in a longer time constant. To the driver, this results in a more stable image, albeit at a price of greater error when transients occur. Given a trade-off, the more stable image is desired because it is less likely to annoy or lead to driver motion sickness.
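The visual sight angle defined above is straightforward to compute. The sketch below (our own naming) uses the arctangent of the ratio, which for angles this small is indistinguishable from the ratio itself:

```python
import math

def visual_sight_angle_deg(lateral_error_m, distance_m):
    """Mismatch between projected and actual lane boundary, expressed
    as the angle subtended at the driver's eye, in degrees."""
    return math.degrees(math.atan2(lateral_error_m, distance_m))

# A 0.5 m displacement seen 120 m ahead:
print(round(visual_sight_angle_deg(0.5, 120.0), 2))  # 0.24
```

The 0.24 degree result is consistent with the approximately 0.25 degree average error (0.5m at 120m) reported for the non-weaving driving segments.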
Note again that in all the experiments the vehicle was intentionally driven from one lane to the next and back again during each test run. This was done to examine the limitations of the system and was used to design a subsequent second generation HUD. As a result of the side to side motion, the data following a lane change for all the error graphs was dominated by transient effects. The length of the test track limited the maximum test speeds to 30mph. When the vehicle traveled faster (i.e. at 30mph), the effects for varying look-back distances do not change appreciably, as is shown in Figure 6. The transient effects (still below 1 degree maximum error or vsa) disappear in the last few seconds for all the experiments. The results for those driving segments which were not affected by intentional side to side weaving motion indicated that the system could indeed achieve an average error of approximately 0.25 degrees (equivalent to 0.5m at 120m).

Figure 6. Effect on error or vsa of look-back distances 0.5m, 1.0m, 1.5m at 30mph (measured at 60m ahead)

The mismatch errors or vsa measured at the different look-ahead distances (i.e., 60m, 90m, and 120m) can be found in Figure 7. The important errors to analyze are those during the no transient condition (i.e. during steady state driving from 18 to 26 seconds and 38 seconds on). The system was calibrated at the 60m distance; thus we would expect that the largest error would occur at the furthest point, i.e. at 120m. Although we expected that there would be an error offset between the three look-ahead distances, we assumed that they would be constant for all speeds. However, we found that error offsets were larger (by about 0.1 degree) at higher speeds than at slower speeds. This appears to be the result of the vehicle motion from the time that the GPS position is acquired until the display image is rendered (i.e. the time delay or latency).

Errors also arise because of mounting errors associated with the projector. As is seen in Figure 1, the road centerline did not exactly match the centerline drawn in the HUD image. The closer reference mark was shifted to the right side but the farther mark was shifted to the left side. As vehicle speed increases, this mounting error resulted in the shifted patterns of errors at different locations.

Figure 7. Effect on error or vsa at different forward-looking distances at 30 mph. Estimate based on look-back distance 1.0 m

Figure 8 depicts the total time measured for the computation associated with coordinate transformation from DGPS signals in the global coordinate system to the perspective projection in the eye coordinate system, including clipping the results outside the field of view, and includes the HUD screen drawing time for a typical driving situation. Time to screen refresh is measured from the moment that the DGPS data is received. This screen refresh time (which includes the computation time above) was less than 16ms in most cases. The segment from 100 to 300 sampling points is for the case when the vehicle is moving in a straight line. Centerline and both side line segments were drawn in the field of view. Drawing was limited to 1000 ft ahead in the heading direction of the vehicle.

The reduction in times occurred when there was little or nothing to draw, i.e. everything was out of the visual field of the combiner. During the segment from 50 to 100 sample points, the vehicle moved to the left and then to the right within the driving lane. The vehicle changed lanes completely during the segment between 350 to 450 points.
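The transformation chain timed in Figure 8 ends with a perspective projection into the eye frame plus far clipping. A minimal sketch follows, assuming a simple pinhole model with an arbitrary focal length; the function name and the sample points are our assumptions, not the authors' implementation:

```python
def project_to_screen(points_eye, focal, max_ahead=304.8):
    """Perspective-project map points, given in the eye frame, onto the
    HUD image plane.

    points_eye: (x, y, z) with +y straight ahead, +x to the right and
    +z up, matching the eye coordinate frame of Figure 2.
    focal: distance from the eye to the virtual image plane (same units).
    max_ahead: far-clipping distance; drawing in the paper was limited
    to 1000 ft (304.8 m) ahead of the vehicle.
    Returns (u, v) screen coordinates for points that survive clipping.
    """
    screen = []
    for x, y, z in points_eye:
        if 0.0 < y <= max_ahead:          # clip behind-eye and far points
            screen.append((focal * x / y, focal * z / y))
    return screen

# A lane-boundary point 60 m ahead and one 400 m ahead; only the
# first survives the far clip.
pts = [(1.8, 60.0, -1.2), (1.8, 400.0, -1.2)]
print(project_to_screen(pts, focal=1.0))  # approximately [(0.03, -0.02)]
```

Dividing the lateral and vertical offsets by the forward distance is also why a fixed lateral error shrinks on screen with look-ahead distance, which is what the visual sight angle metric captures.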
Figure 8. Time Required for Screen Refresh

GEO-SPATIAL DATABASE

The image of the landscape projected onto the HUD is based on data retrieved from and stored in a geo-spatial database. The database contains all the relevant road and geographic information needed by the HUD and other systems onboard the snowplow. The geographic data is stored, arranged and searched by its spatial attributes, hence geo-spatial. The central database provides consistent geo-spatial data to all vehicle systems, and allows for easy updates and expansions.

The HUD displays to the driver geo-spatial information including the current lane boundaries. The lane boundaries that are drawn can correspond to the lane striping on the road, the shoulder, or they can represent the lanes where they would be if they existed (for example, on a gravel road). Within the database, the lane boundaries are stored as lines and curves. To query the database a polygon is defined, and the query processor searches the database to locate objects that intersect or lie within that polygon. After the entire database is searched, the results are sent back to the task which sent the query request. The HUD must accommodate a moving vehicle; therefore, the HUD must continually query the database for new data that has entered its field of view. The HUD requires all road information in the area covered by its field of view and so defines the polygon used to query the database. Figure 9 shows the HUD's field of view and the corresponding query polygon overlaying a section of a database.

Figure 9. HUD's Field of View and Query Polygon

Since the HUD must continually query the database, there are strict limits on query response time and communication latency. Experiments have been designed to measure these values.

GEO-SPATIAL DATABASE EXPERIMENTS

System Architecture. The Geo-spatial Database Management System (GDMS) processes queries and maintains the database. The GDMS runs on a PC104 single board computer, based on a 333MHz AMD K6 processor, running the QNX/Neutrino real time operating system. As stated before, the HUD is driven by a Windows based PC. For the HUD to update its image, the result of a query is sent over a local Ethernet from the GDMS on the PC104 to the Windows computer driving the HUD. To control communications between the GDMS and the HUD, a client process running under the QNX/Neutrino operating system updates the polygon and queries the database for the HUD. This client process then sends the query results across the Ethernet, using UDP protocols (described later), to the HUD processor. Two experiments were designed to measure the GDMS performance and Ethernet communications. Experiment 1 measures the query response time of the GDMS and Experiment 2 measures the latency of the Ethernet communication.
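The polygon query can be illustrated with a deliberately simplified sketch: here the query region is an axis-aligned rectangle and an object matches if any of its vertices falls inside, whereas the GDMS described above tests arbitrary polygons and true line/curve intersection. All names and sample data below are ours:

```python
def query(objects, poly_min, poly_max):
    """Return the names of objects with at least one vertex inside an
    axis-aligned query rectangle (a simplification of the GDMS query).

    objects: dict mapping name -> list of (x, y) vertices in global
    coordinates; poly_min/poly_max: rectangle corners (x, y).
    """
    (x0, y0), (x1, y1) = poly_min, poly_max
    hits = []
    for name, verts in objects.items():
        if any(x0 <= x <= x1 and y0 <= y <= y1 for x, y in verts):
            hits.append(name)
    return hits

# A toy database: a multi-point centerline, a point object, a guard rail.
db = {
    "centerline": [(0, 0), (0, 50), (0, 100)],
    "stop_sign": [(3, 40)],
    "guard_rail": [(5, 200), (5, 260)],
}
# Query polygon covering the field of view of a vehicle near the origin:
print(query(db, (-10, 0), (10, 120)))  # ['centerline', 'stop_sign']
```

As the vehicle advances, the HUD repeatedly re-issues this query with an updated polygon so that objects entering the field of view are fetched before they must be drawn.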
EXPERIMENT 1 - QUERY RESPONSE

To determine the response time of the query processor and GDMS, a sample database was compiled and a series of queries were performed and timed. The sample database was compiled from photogrammetry data received from the Minnesota Department of Transportation. It covers all four lanes of Highway 101 for an 11.6 km segment between Rogers and Elk River, Minnesota. The sample database contains 3720 separate objects ranging from single point objects including signs and stop lights to multiple point objects including guard rails and lane boundaries. Figure 10 shows 400 meters taken from the sample database, representing a typical section.

To stress the GDMS beyond normal limits, the sample database was intentionally left 'raw'; no smoothing or compression was used to remove extraneous information. The experiment was to execute a series of queries on the sample database and measure and record the time from when the query was submitted to when the results were returned. The size of the query result was also recorded. A query was performed every 10 meters along the length of the sample database. Figure 11 shows the query response time versus the number of objects found during the query.

As was expected, it can be seen that as the density of objects increases, the query takes longer and returns more objects. Figure 12 shows the query response time versus the size, in bytes, of the query results. This correlates well with Figure 11, since more objects found would lead to more bytes of data within the results. Knowing the size of the query results, in bytes, relates to the second experiment since the query results have to be sent through the local Ethernet between the GDMS and HUD computers.

Table 1 summarizes the results of this experiment. The results of this experiment show that on average it takes 0.0327 sec. per query, or 0.73 milliseconds per object, to query the database.

EXPERIMENT 2 - ETHERNET LATENCY

A third computer was used to measure the latency of the local Ethernet between the PC104 computer and the Windows PC. The third computer consisted of another PC104 with the same specifications and performance as the PC104 computer running the GDMS and used the QNX/Neutrino real time operating system. To simplify communications, the User Datagram Protocol (UDP) was used to send packets of data over the local Ethernet between the GDMS and the HUD. The overhead involved with Transmission Control Protocol (TCP) was not needed in such a small local network. The maximum packet size used for UDP is 2048 bytes, so large queries must be cut into packets and sent in multiple pieces.

Figure 10. Typical View of Sample Database Showing Intersection and Bridge Area
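The packetization described in the last paragraph, splitting a query result into 2048-byte UDP payloads, can be sketched as follows (our naming):

```python
MAX_UDP_PAYLOAD = 2048  # packet size used in the experiments described here

def packetize(payload: bytes, max_size: int = MAX_UDP_PAYLOAD):
    """Split a serialized query result into UDP-sized chunks, as the
    GDMS must do for results larger than a single datagram."""
    return [payload[i:i + max_size] for i in range(0, len(payload), max_size)]

# The worst-case result size recorded in Experiment 1 was 35004 bytes:
packets = packetize(bytes(35004))
print(len(packets), len(packets[-1]))  # 18 packets, final partial one of 188 bytes
```

Even this worst case on the raw sample database is far above the 1 to 3 packets per query that the existing, cleaned-up databases typically require.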
Figure 11. Query Time versus Query Size measured in Number of Objects Retrieved

Figure 12. Query Time versus Query Results Size in Bytes

Table 1 Query Timing Results

                     Query Time   Query Size            Query Size
                     (Seconds)    (Number of Objects)   (Bytes)
Average              0.0327       44.8                  15751.8
Standard Deviation   0.0134       25.4                  6236.3
Minimum              0.0164       13                    4068
Maximum              0.0835       146                   35004

To time the transfer of a single packet, the GDMS computer and the Windows PC running the HUD would transmit a signal through a serial port to the timing computer. The GDMS computer would send a signal to the timing computer before it sent a packet and the Windows based PC would send a signal back when it received a packet. Using the real time clock, the timing computer would store the time of each signal and calculate a time difference. Table 2 lists the results of the experiment.

Table 2 Ethernet Timing Results

                     Transfer Time for One Packet (seconds)
Average              0.00834
Standard Deviation   0.00656
Minimum              0.00181
Maximum              0.0313

Using existing databases, rather than the sample database used in Experiment 1, the HUD query results typically span 1 to 3 UDP packets. The maximum time for query processing and communications to the HUD processor was shown to be 0.177 (0.0835 + 3 x 0.0313) seconds for large query results.

CONCLUSIONS AND FURTHER RESEARCH

A conformal HUD to assist drivers by presenting augmented visual information was developed and successfully tested under real driving conditions. We also demonstrated that it was possible to drive the HUD in real time from a geo-spatial database.

Error is unavoidable in every system. In this system, image projection errors arise from DGPS positioning error, error in the digital map data, drawing latency, system vibration, and from the calibration errors of the system. In order to reduce the errors due to transient effects which appeared during lane changes, we plan to integrate a yaw rate gyro, to improve the heading angle estimate, with a faster and more accurate DGPS receiver. Just how much error can be tolerated by a driver remains an open research question.

A larger and brighter video image projection system has now been installed. Information about obstacles on the road has recently also been incorporated into the HUD screen as a means to assist drivers with collision avoidance. This work is at an early stage, and will be reported on in the future.

Finally, more study is needed to determine how much information should be presented to the driver so that the driver can drive comfortably and efficiently in low visibility situations. Clearly, we need to also determine the field of view necessary for safe vehicle guidance.
ACKNOWLEDGEMENTS

We would like to thank the Minnesota Department of Transportation (Mn/DOT) for use of the low volume test track at the Mn/ROAD research facility and for assistance provided by their personnel. This project was partially supported by Mn/DOT, the ITS Institute and the Center for Transportation Studies at the University of Minnesota, and the U.S. Department of Transportation.

REFERENCES

1. Weihrauch, M., Meloeny, C.C., and Geosch, T.C., "The First Head Up Display Introduced by General Motors," SAE #890288
2. Sojouner, R.J., and Antin, J.F., "The Effects of a Simulated Head-Up Display Speedometer on Perceptual Task Performance," Human Factors, 1990, 32(3), pp. 329-339
3. Steinfeld, A. and Green, P., "Driver Response to Navigation Information on Full-Windshield Head-Up Displays," Int. Journal of Vehicle Design, Vol. 19, No. 2, 1998
4. Gibson, J.J., College, S., and Crooks, L.E., "A Theoretical Field-Analysis of Automobile-Driving," Journal of Psychology, Vol. LI, No. 3, 1938, pp. 453-471
5. Hills, B.L., "Vision, visibility, and perception in driving," Perception, Vol. 9, pp. 183-216
6. Rockwell, T.H., Ernst, Ronald, R.L., and Rulon, M.J., "Visual Requirements in Night Driving," National Cooperative Highway Research Program Report 99, Highway Research Board, 1970
7. Senders, J.W., "The Attentional Demand of Automobile Driving," Report No. 1482, Bolt, Beranek and Newman, Inc., 1967
8. Modjtahedzadeh, A., and Hess, R.A., "A Model of Driver Steering Control Behavior for Use in Assessing Vehicle Handling Qualities," Journal of Dynamic Systems, Measurement, and Control, Vol. 115, 1993, pp. 456-464
9. Grunwald, A.J., Merhav, S.J., "Vehicular Control by Visual Field Cues-Analytical Model and Experimental Validation," IEEE Trans. on Systems, Man, and Cybernetics, Vol. SMC-6, No. 12, 1976, pp. 835-845
10. Lowe, J.R. and Ornelas, J.R., "Applications of Head-Up Displays in Commercial Transport Aircraft," Journal of Guidance, Control, and Dynamics, Vol. 6, 1983, pp. 77-83
11. Grunwald, A.J., Robertson, J.B., and Hatfield, J.J., "Experimental Evaluation of a Perspective Tunnel Display for Three-Dimensional Helicopter Approaches," Journal of Guidance, Control, and Dynamics, Vol. 4, 1981, pp. 623-631
12. Hess, R.A. and Gorder, P.J., "Design and Evaluation of a Cockpit Display for Hovering Flight," J. of Guidance, Vol. 13, No. 3, 1990, pp. 450-457
13. Negrin, M., Grunwald, A.J., and Rosen, A., "Superimposed Perspective Visual Cues for Helicopter Hovering Above a Moving Ship Deck," Journal of Guidance, Control, and Dynamics, Vol. 14, 1991, pp. 652-660
14. Steinfeld, A. and Green, P., "Driver Response to Navigation Information on Full-Windshield Head-Up Displays," Int. Journal of Vehicle Design, Vol. 19, No. 2, 1998
15. Bajikar, S., Gorjestani, A., Simpkins, P., and Donath, M., "Evaluation of In-Vehicle GPS-Based Lane Position Sensing for Preventing Road Departure," Proceedings of the IEEE Conference on Intelligent Transportation Systems, ITSC '97, Boston, MA, 1997
16. Morellas, V., Morris, T., Alexander, L., and Donath, M., "Preview Based Control of a Tractor Trailer Using DGPS for Preventing Road Departure Accidents," Proceedings of the IEEE Conference on Intelligent Transportation Systems, ITSC '97, Boston, MA, 1997