
AR-Bowling: Immersive and Realistic Game Play in Real Environments Using Augmented Reality


Carsten Matysczok
Heinz Nixdorf Institute
Fuerstenallee 11
33102 Paderborn, Germany
+49 5251 606226
carsten.matysczok@hni.upb.de
Rafael Radkowski
Heinz Nixdorf Institute
Fuerstenallee 11
33102 Paderborn, Germany
+49 5251 606228
rafael.radkowski@hni.upb.de
Jan Berssenbruegge
Heinz Nixdorf Institute
Fuerstenallee 11
33102 Paderborn, Germany
+49 5251 606233
jan.berssenbruegge@hni.upb.de


ABSTRACT
The game and entertainment industry plays an enormous role in the development and widespread adoption of new technologies. It is a major technology driver in the development of powerful graphics hardware, innovative interaction devices, and efficient game engines.
Current game developments show a trend towards involving the player with his whole body; the time of sitting in darkened rooms in front of a computer monitor is over. To this end, special hardware and software components have been developed and new user interfaces have been designed, allowing an unprecedented game play.
The next consequential step is to play games everywhere (independent of time and place), to involve the player completely in the game (a high level of immersion), and to integrate the game seamlessly into reality (blurring the edges between reality and virtuality). For this, new and innovative technologies must be used. One of these technologies is Augmented Reality.
In this paper, we describe the use of Augmented Reality to enable immersive and realistic game play in real environments. As the game, we chose bowling. To support the bowling game with AR-technology, a dedicated concept has been developed. The level of game realism is enhanced by an integrated real-time kinematic multi-body simulation. A first prototypical realization, which is used for user tests, confirms the previously identified potential of AR-technology for game entertainment.
Keywords
Augmented Reality, games, immersion, kinematics simulation,
user interface.
1. INTRODUCTION
With the next generation of video game consoles now firmly established worldwide and the stunning rate at which gaming technology has moved on, it is hard to imagine what the next big thing will be. Holographic graphics? Virtual Reality body suits? In-brain gaming?

All these are possibilities, but new approaches from Japan and America point in a completely different direction. Taking a step back from the lure of sitting in darkened rooms for hours, a more family-friendly way to play games has been found. A combination of low-cost technology and simple, addictive gaming formats is attracting a huge range of consumers who have so far proved elusive to mass-market video game companies. At the same time, the development of new, innovative, and intuitive user interfaces has begun [1].



Figure 1. Interaction with virtual game characters using hand
gestures and body movements.
Approaches like Eye Toy from Sony (see figure 1) or ConnectTV's new input devices using low-cost chips from Xavix Technology (see figure 2) address this fact.


Permission to make digital or hard copies of all or part of this work for
personal or classroom use is granted without fee provided that copies are
not made or distributed for profit or commercial advantage and that
copies bear this notice and the full citation on the first page. To copy
otherwise, or republish, to post on servers or to redistribute to lists,
requires prior specific permission and/or a fee.
ACE 2004, June 3-5, 2004, Singapore.
(c) 2004 ACM 1-58113-882-2/04/0006 $5.00



Figure 2. Intuitive interaction device in the form of a snowboard using chips from Xavix Technology.

All these approaches use a completely new kind of user interface, which enables the user to play computer games in an unprecedentedly attractive way: the player is involved in the game with all his senses, interacting with the game with his whole body, just as if the game were real [2, 3, 4].

The next step is to play games everywhere, not just in front of a computer monitor or a TV. Quite the contrary: the game should be integrated seamlessly into reality, so that game characters and objects appear to be part of our real life. For this, new user interfaces that are innovative, self-explanatory, and intuitive to use must be developed using state-of-the-art computer technologies. These technologies should allow the user to enter a mental state, called flow, as easily and quickly as possible.

1.1 Flow and Interaction Spaces
Csikszentmihalyi describes a mental state called flow, which includes the kind of state a player may enter when totally absorbed by the game [5]. The flow experience can be described as an optimal experience, characterized by a sense of playfulness, a feeling of being in control, concentration and highly focused attention, enjoyment of the activity for its own sake, a distorted sense of time, and a match between the challenge at hand and one's skills [6, 7]. In figure 3, flow is characterized as a situation where both skills and challenges are high.

Flow is a suitable concept to describe the immersive game experience. The user interface of a computer game, consisting of a screen, a keyboard, and a mouse, is mostly a graphical user interface (GUI). An interaction space is a place where action and perception coincide [8]. The technical and conceptual components of the GUI can be described with the MVC-model (see figure 4).
[Figure omitted: a two-by-two diagram with the axes "skills" and "challenges" (each from low to high); the four quadrants are labeled apathy, boredom, anxiety, and flow.]

Figure 3. Challenges and skills determining flow.

[Figure omitted: the MVC-model, in which control and view components connect the model in the digital world to input and output in the physical world.]

Figure 4. Interaction model of graphical user interfaces [9].

In this model, the presentation (view) and control components are separated. Consequently, the space where user actions are executed (e.g., the arrow keys on the keyboard) and the space where the results of these actions are perceived (e.g., the screen that displays the computer game graphics) are not integrated.

Research on computer game play substantiates that the visual attention of gamers is almost exclusively focused on the computer screen, whereas the game control devices seem to be ignored completely [10]. Thus, all actions of the user have become fully automated and are performed subconsciously.

To overcome this disadvantage, new user interfaces that are innovative, self-explanatory, and intuitive to use must be developed using state-of-the-art computer technologies. One of these technologies is Augmented Reality. We chose AR as the user interface for three reasons: the user can see other players, it is safer to play, and it certainly provides an unprecedented game play.
1.2 Augmented Reality
Augmented Reality (AR) is a new form of man-machine interface, closely related to VR [11]. In contrast to Virtual Reality, where the user is totally immersed in a computer-generated world, Augmented Reality joins the computer-generated world and reality [12]. In Augmented Reality the computer provides additional information that enhances or augments the real world (see figure 5).



Figure 5. AR-scene: a virtual robot is placed into a real
manufacturing system.

The user can interact with the real world in a natural way, with the AR-system providing information and assistance. AR thus enhances reality by superimposing information rather than completely replacing it. The information can be inserted in a context-dependent way, i.e., derived appropriately from the visualized object.

2. CONCEPT FOR AN IMMERSIVE AND REALISTIC GAME PLAY IN REAL ENVIRONMENTS
For a realistic game play, all characters and objects of the game should appear real and lifelike. This includes their behavior as well as their graphical presentation. Therefore, a concept for realistic game play in real environments should meet the following requirements (see table 1):

Table 1. Requirements.

Graphics:
- Detailed 3D object models (complexity, texture quality)
- High rendering quality of the virtual objects
- Seamless integration of the virtual objects into reality

Behavior:
- Detailed modeling of the physical analogous models
- Physically correct representation and movement of virtual objects
- Dependencies and interactions of virtual objects among each other

Rendering places high demands on image quality and on the modeling of the 3D-models. This becomes even more important when using Augmented Reality: for immersive and realistic game play, it should be hard to determine which objects are real and which are not. This demands the use of up-to-date rendering software, as primarily found in the game industry.

For a realistic movement of the virtual objects, a kinematic simulation must be used. Based on a detailed modeling of the physical analogous models, their dependencies, and their interactions, a correct kinematic simulation can be performed.

Starting from this, the complete system consists of the following
five components: database, event handling, simulation, image
generation and user interface (see figure 6).

[Figure omitted: block diagram of the five components: database (3D-models, physical models); event handling (head movement, hand movement and gestures); simulation (kinematics, animation); image generation (3D-rendering, image composition); user interface (input devices, GUI).]

Figure 6. Architecture of the system and its components.

All data needed within the application is managed in the database module. Here, the 3D-models and the physical models (for the kinematic calculation) of the whole scene are stored. The event handling module analyzes the position and orientation of the user's head and hand as well as the user's hand and finger gestures (grabbing, releasing, etc.). Based on these events, the physically correct simulation of each object in the scene is calculated by the simulation module in real time. The image generation module performs the 3D-rendering as well as the subsequent composition of the real and virtual images. The user interface contains the input devices, which receive the user input and send it to the event handling, and the graphical user interface, which shows the game menu and the current score.
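
To make this division of responsibilities concrete, the following C++ sketch outlines possible interfaces for the five components. All names and signatures are our own illustrative assumptions, not code from the actual system.

#include <vector>

// Schematic interfaces for the five components of figure 6; all names
// and signatures are illustrative assumptions.
struct Pose { double position[3]; double orientation[3]; };

struct GameEvent {
    enum Kind { HeadMoved, HandMoved, Grab, Release } kind;
    Pose pose;
};

struct Database {            // stores the 3D-models and the physical models
    void loadScene() {}
};

struct EventHandling {       // analyzes head/hand movement and gestures
    std::vector<GameEvent> poll() { return {}; }
};

struct Simulation {          // real-time kinematics / multi-body dynamics
    void step(const std::vector<GameEvent>&, double /*dt*/) {}
};

struct ImageGeneration {     // 3D-rendering and real/virtual composition
    void renderAndCompose(const Database&) {}
};

struct UserInterface {       // input devices plus GUI (menu, score)
    void showScore(int /*score*/) {}
};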

3. PROTOTYPICAL REALIZATION
3.1 Hardware and Software
The developed AR-application runs on a Pentium IV 1.5 GHz equipped with a GeForce Quadro 4 980 XGL graphics card. As the output device, the video see-through HMD VH-2002 from Canon is used to obtain a stereoscopic image of the AR-scene (see figure 7).


Figure 7. System structure of the prototype.

The final goal is to implement a completely mobile system. Therefore, the tracking of the user's head position is done with an optical tracking method. For this, the MultiMarker Tracking software from HIT Lab is used, which is based on the ARToolKit software [13, 14]. The system supports the development of table-top AR-environments using markers for tracking. We distribute the markers across the whole room to cover a wide tracking area. The marker sizes range from 4 cm to 20 cm, so that the user can be tracked at both close and far distances.

The user can interact with the virtual objects in the AR-scene via hand gestures. For the current prototype, we use Pinch Gloves from Virtual Technologies to track the user's hand gestures. Sensors in each fingertip detect contacts between the fingers of each hand; this information is used for the gesture detection. The position of the hand is tracked electromagnetically by the Fastrak system from Polhemus. For the final, completely mobile system, this electromagnetic tracking will be replaced by optical tracking.
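
The following sketch illustrates how a grab/release gesture can be derived from such fingertip contacts. The bitmask encoding and the readContacts() driver call are hypothetical, not the actual Pinch Glove API.

#include <cstdint>

// Hypothetical grab/release detection from fingertip contacts. Each bit
// of the mask encodes whether the corresponding fingertip currently
// touches the thumb; readContacts() stands in for the glove driver.
enum class Gesture { None, Grab, Release };

std::uint8_t readContacts();   // hypothetical driver call, one bit per finger

Gesture detectGesture(std::uint8_t previous, std::uint8_t current)
{
    const std::uint8_t pinch = 0x01;             // index fingertip against thumb
    const bool wasClosed = previous & pinch;
    const bool isClosed  = current  & pinch;
    if (!wasClosed &&  isClosed) return Gesture::Grab;    // contact closed: grab the ball
    if ( wasClosed && !isClosed) return Gesture::Release; // contact opened: release the ball
    return Gesture::None;
}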
3.2 Working Principle
The application's working principle can be divided roughly into four major steps: calculation of the viewing direction, analysis of the user interaction, physical simulation, and rendering.

[Figure omitted: data-flow diagram. The video stream from the camera is searched for markers; the transformation matrices of the found markers yield the position and orientation of the user. The requested tracking information and hand gestures yield the position and orientation of the interaction device and the positions of the fingers, from which the spin and speed of the bowling ball are calculated. The physical simulation produces the positions and orientations of the pins and the bowling ball, which are rendered as 3D objects and output together with the video stream.]

Figure 8. Function principle.
In the first step, two video images (right and left eye) are captured. Using these images and the visible markers, the position and orientation of the user can be calculated. Subsequently, the current user interaction is analyzed. The physical simulation then calculates the movement of each virtual object in the scene. In the last step, the objects are rendered for the right and left eye. The complete process is shown in figure 8.
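
The following C++ sketch summarizes this per-frame pipeline. All functions are placeholder stubs standing in for the respective modules, not real API calls; the loop only documents the order of the steps.

// Placeholder stubs for the four major steps described above.
static bool gameRunning()            { return false; } // stub: quit immediately
static void captureStereoImages()    {}  // step 1: grab left/right video images
static void computeViewingPose()     {}  // step 1: marker-based head pose (3.2.1)
static void analyzeUserInteraction() {}  // step 2: hand pose and gestures (3.2.2)
static void stepPhysicsSimulation()  {}  // step 3: rigid-body update (3.2.3)
static void renderAndCompose()       {}  // step 4: render, compose with video (3.2.4)

int main()
{
    while (gameRunning()) {
        captureStereoImages();
        computeViewingPose();
        analyzeUserInteraction();
        stepPhysicsSimulation();
        renderAndCompose();
    }
}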
3.2.1 Calculation of Viewing Direction
For the detection of the markers, the computation of the user's viewing position, and the video handling, the ARToolKit, version 3.15, is used. In this context, the ARToolKit provides various functions to compute the following transformation matrix:

\[
\begin{pmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{pmatrix}
=
\begin{pmatrix}
V_{11} & V_{12} & V_{13} & W_x \\
V_{21} & V_{22} & V_{23} & W_y \\
V_{31} & V_{32} & V_{33} & W_z \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{pmatrix},
\qquad P_c = T_{cw} \, P_w
\]

It defines the user's viewing position relative to the defined origin of the markers. This matrix is only used to determine the user's viewing direction onto the markers.
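
A minimal sketch of this computation with the classic ARToolKit C API is shown below. The pattern ID, marker width, and binarization threshold are illustrative assumptions, not values from the paper.

#include <AR/ar.h>
#include <AR/video.h>

// Camera-relative pose of one marker, recomputed every video frame.
static int    patt_id;                      // loaded beforehand via arLoadPatt()
static double patt_width     = 80.0;        // assumed marker edge length in mm
static double patt_center[2] = {0.0, 0.0};  // pattern origin at the marker center
static double patt_trans[3][4];             // [V | W]: the upper 3x4 part of T_cw

void updateViewingPose()
{
    ARUint8      *image = arVideoGetImage();   // grab the current camera image
    ARMarkerInfo *marker_info;
    int           marker_num, best = -1;

    if (image == NULL) return;

    // Detect all marker squares in the image (threshold 100 is an assumption).
    if (arDetectMarker(image, 100, &marker_info, &marker_num) < 0) return;

    // Keep the detection of our pattern with the highest confidence value.
    for (int i = 0; i < marker_num; i++)
        if (marker_info[i].id == patt_id &&
            (best < 0 || marker_info[i].cf > marker_info[best].cf))
            best = i;

    // Compute the transformation from marker to camera coordinates (T_cw).
    if (best >= 0)
        arGetTransMat(&marker_info[best], patt_center, patt_width, patt_trans);
}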

3.2.2 Analysis of User Interaction
For the interaction analysis, the hand position of the user must first be detected. For this, the Fastrak tracking system from Polhemus is used; its electromagnetic sensor is mounted on a data glove worn by the user. The Fastrak system measures six degrees of freedom (X, Y, Z, azimuth, elevation, roll). For the communication with the computer, a standard RS-232 interface is used; the data is read in ASCII format at a baud rate of 57600. This reading process was moved into a separate thread, because reading the data in the main program caused a significant loss of performance, making interactive work with the application impossible. In our current solution, the data is read continuously by a separate thread; when needed, the current data records (e.g., position and orientation) are transferred to the main program via shared memory. This causes no considerable loss of performance.
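
A minimal sketch of this decoupling could look as follows. The Pose6D layout and the readFastrakRecord() helper are assumptions, not the vendor API; the stub merely stands in for the blocking serial read and parsing.

#include <atomic>
#include <mutex>
#include <thread>

struct Pose6D {
    double x, y, z;                   // position
    double azimuth, elevation, roll;  // orientation in degrees
};

std::mutex        poseMutex;
Pose6D            latestPose{};       // "shared memory" between the threads
std::atomic<bool> running{true};

// Stub standing in for the blocking RS-232 read (57600 baud, ASCII)
// and the parsing of one Fastrak record.
Pose6D readFastrakRecord() { return Pose6D{}; }

// Runs continuously; the lock is only held for a single assignment,
// so the main loop is never stalled by slow serial I/O.
void trackerThread()
{
    while (running) {
        Pose6D p = readFastrakRecord();
        std::lock_guard<std::mutex> lock(poseMutex);
        latestPose = p;
    }
}

// Called from the main program whenever the current pose is needed.
Pose6D currentPose()
{
    std::lock_guard<std::mutex> lock(poseMutex);
    return latestPose;
}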

The speed of the hand movement is used to calculate a force vector, which is applied to the physical bodies described in the following section. To obtain the speed and the direction of the movement, the difference between two points sampled during the hand movement is calculated. The force results from this difference multiplied by a constant gain.

The force applied to the object is calculated by:

\[
\begin{pmatrix} F_x \\ F_y \end{pmatrix}
= a
\begin{pmatrix} X_{t_1} - X_{t_0} \\ Y_{t_1} - Y_{t_0} \end{pmatrix}
\]

X and Y are points in space, and F_x and F_y are the forces in the directions x and y. The sample times t_0 and t_1 are a constant 0.4 s apart. The gain a is a constant with the unit kg; its value was determined empirically, testing values between 5 kg and 10 kg.

The force in the upward direction Z is calculated by:

\[
F_z = b \left( Z_{t_1} - Z_{t_0} \right), \qquad b < a
\]

We use this second, lower gain to make the interaction easier. Since the user does not feel the weight of the bowling ball, a swinging hand movement would otherwise throw the ball up in the Z direction; normally, the weight of the ball prevents this movement. To obtain a more realistic interaction, we therefore simulate the weight of the ball with the lower gain value b.
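
A compact sketch of both force calculations is given below. The concrete gain values are assumptions within the empirically tested range; only the 0.4 s sampling interval and the relation b < a are fixed by the description above.

struct Vec3 { double x, y, z; };

const double kDt = 0.4;  // sampling interval between t0 and t1, in seconds
const double a   = 7.5;  // X/Y gain in kg (tested between 5 kg and 10 kg)
const double b   = 2.0;  // lower Z gain, assumed value with b < a

// p0 and p1 are the hand positions sampled at t0 and t1.
Vec3 throwForce(const Vec3& p0, const Vec3& p1)
{
    Vec3 f;
    f.x = a * (p1.x - p0.x);  // F_x = a (X_t1 - X_t0)
    f.y = a * (p1.y - p0.y);  // F_y = a (Y_t1 - Y_t0)
    f.z = b * (p1.z - p0.z);  // F_z = b (Z_t1 - Z_t0), damped upward component
    return f;                 // applied to the ball's rigid body
}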

3.2.3 Physical Simulation
To calculate realistic physical behavior, we use the physics engine Vortex from CMLabs [15]. Vortex is a dynamics engine that applies Newtonian physics to calculate the physical behavior of rigid bodies. To use Vortex, a simulation model of the bodies and the environment must be built. This model consists of primitive objects like cylinders, planes, or spheres representing the virtual models of the scene. To get realistic behavior, it is important to build a valid dynamic model: if an improper physical representation is chosen, the result will be mathematically correct, but the user will see atypical behavior of the virtual objects.



Figure 9. Rigid body model.

The rigid body model of the bowling lane is shown in figure 9. Every object is modeled as a single body. The bowling pins are modeled as simple cylinders with a diameter of 5.74 cm and a height of 38.1 cm; for the pins, we tested multiple shapes to get the best result. The bowling ball is modeled as a sphere with a diameter of 21.8 cm. The bowling lane is modeled as a simple plane with a length of 18.2 m. Thus, all dimensions follow the real dimensions. Another important property is the friction between the objects: without friction, a moving object would never stop. For this, Vortex provides the Coulomb friction model.
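
The following self-contained sketch enumerates such a rigid-body description with the stated dimensions. It is a plain data model, not the actual Vortex API, and the pin spacing of 30.48 cm is an assumed value.

#include <cstdio>
#include <vector>

// Schematic scene description: shape, two dimensions, and a 2D position.
struct Body { const char* shape; double dimA, dimB, x, y; };

int main()
{
    std::vector<Body> scene;
    scene.push_back({"plane",  18.2,  0.0, 0.0, 9.1});  // lane, 18.2 m long
    scene.push_back({"sphere", 0.218, 0.0, 0.0, 0.0});  // ball, d = 21.8 cm

    // Ten pins as cylinders (d = 5.74 cm, h = 38.1 cm) in the standard
    // triangular rack at the far end of the lane.
    const double s = 0.3048;  // assumed pin spacing in m
    for (int row = 0; row < 4; ++row)
        for (int i = 0; i <= row; ++i) {
            double x = (i - row / 2.0) * s;           // lateral offset
            double y = 18.2 - (3 - row) * s * 0.866;  // depth on the lane
            scene.push_back({"cylinder", 0.0574, 0.381, x, y});
        }

    for (const Body& b : scene)
        std::printf("%-8s dimA=%.4f dimB=%.4f at (%+.3f, %.3f)\n",
                    b.shape, b.dimA, b.dimB, b.x, b.y);
}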

3.2.4 Rendering
To achieve a better rendering quality, we use the game engine Alchemy from Intrinsic [16]. Intrinsic Alchemy is a comprehensive development and run-time environment that offers game programmers peak performance on each hardware device combined with remarkable flexibility and ease of use. Additionally, this engine allows us to render even high-polygon scenes in real time, using special features like real-time shadows or blaze effects to obtain a higher level of realism.
4. GAME PLAY
To play AR-bowling, the user puts on the HMD and a pinch glove and places one or two markers on the floor or on a wall. After starting the application, the user interacts with it using hand gestures and movements (see figure 10).



Figure 10. Initial state of the game.

The user grabs the bowling ball with his fingers and throws it towards the virtual pins. According to his hand and finger movement, the bowling ball receives speed and spin (see figure 11).



Figure 11. Ball approaches the pins: speed and spin are
considered by the simulation model.
The pins as well as the bowling ball behave in a physically correct way: slightly touched pins begin to tumble, and falling pins can knock down neighbouring pins (see figure 12).



Figure 12. Multi-body system simulation provides physically correct object movements.
The user has two chances per frame to clear all pins (strike or spare). According to the bowling rules, further throw types are gutterball, split, spare, and miss. The current score is displayed on the user's scorecard. After ten frames, the final score is calculated and the game ends.
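
For illustration, standard ten-pin scoring over ten frames can be computed as follows. This is a generic sketch of the bowling rules, not the paper's implementation; the rolls list is assumed to include the bonus throws of the tenth frame.

#include <cstddef>
#include <vector>

// Ten-pin scoring: a strike scores 10 plus the next two rolls, a spare
// 10 plus the next roll. `rolls` lists the pins knocked down per throw.
int bowlingScore(const std::vector<int>& rolls)
{
    int score = 0;
    std::size_t i = 0;
    for (int frame = 0; frame < 10; ++frame) {
        if (rolls[i] == 10) {                        // strike
            score += 10 + rolls[i + 1] + rolls[i + 2];
            i += 1;
        } else if (rolls[i] + rolls[i + 1] == 10) {  // spare
            score += 10 + rolls[i + 2];
            i += 2;
        } else {                                     // open frame
            score += rolls[i] + rolls[i + 1];
            i += 2;
        }
    }
    return score;
}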
5. USABILITY AND PERFORMANCE TESTS
The feedback from the evaluation of the first game prototype confirms the demand for new user-friendly and easy-to-use interfaces. In several trials, the test persons needed only 1-2 minutes to understand the functions of the AR-game and how to play it.

When using a marker-based tracking method, an adequately lit environment must be ensured. Mirroring or dazzling surfaces should be avoided to prevent a loss of tracking. This limits the usability.

6. SUMMARY AND OUTLOOK
Recent advances in game development show a trend towards involving the gamer with his whole body. To play games everywhere, to involve the player completely in the game, and to integrate the game seamlessly into reality, new and innovative technologies like Augmented Reality should be used.
Augmented Reality combines the virtual and the real world, so that input and output spaces are no longer separated. It allows the user to enter the mental state of flow quickly and easily. This potential of AR-technology was used to develop a new concept for games that provides immersive and realistic game play in real environments. A higher level of realism is achieved by simulating the real behavior of the game objects using a dynamic multi-body system simulation. First user tests confirmed these assumptions.
Further improvements of the AR-game concern the mobility and usability of the proposed system. In this context, wearable or mobile computers will be used. In terms of user interaction devices, we will integrate the wireless Data Glove 5-W from Fifth Dimension Technologies. As the HMD, we will use the ARvision-3D Goggles from Trivisio Prototyping GmbH, a mobile, small, and lightweight video see-through HMD. For the completely mobile system, the tracking will be done purely optically.
7. REFERENCES
[1] Cherny, L.; Clanton, C.; Ostrom, E.: Entertainment is a Human Factor. In: CHI 97 Workshop on Game Design and HCI, 1997.
[2] Eggen, B.; Feijs, L.; Peters, P.: Linking physical and virtual interaction spaces. In: Proceedings of the second international conference on Entertainment computing, 2003.
[3] Dourish, P.: Where the action is: The foundations of
embodied interaction, MIT Press, 2001.
[4] Graham, N.; Watts, L.; Calvary, G.; Coutaz, J.;Dubois, E.;
Nigay, L.: A dimension space for the design of interactive
systems within their physical environments. In: Proceedings
of the conference on designing interactive systems:
processes, practices, methods, and techniques, 2000.
[5] Csikszentmihalyi, M.: Flow: The Psychology of Optimal Experience, 1991.
[6] Turkle, S.: The second self: Computers and the human spirit,
1984.
[7] Holt, R.; Mitterer, J.: Examining video game immersion as a
flow state. In: Proceedings of the 108th annual psychological
assoc, Washington DC, 2000.
[8] Fjeld, M.; Lauche, K.; Dierssen, S.; Bichsel, M.; Rauterberg,
M.: BUILD-IT: A Brick-based integral Solution Supporting
Multidisciplinary Design Tasks. In: Designing Effective and
Usable Multimedia Systems (IFIP 13.2), Boston, 1998.
[9] Ullmer, B.; Ishii, H.: Emerging frameworks for tangible user
interfaces. In: IBM Systems Journal, Vol 39, Nos 3&4, 2000.
[10] Feijs, L.; de Graaf, M.: Support robots for playing games:
The role of player - actor relationships. In: Faulkner X. et al.
(Eds.), People and computers XVI, 2002.
[11] Behringer, R., Klinker, G.; Mizell, D.: Augmented Reality -
Placing Artificial Objects in Real Scenes. In: Proceedings of
the IWAR 1998, San Francisco, California, 1998.
[12] Milgram, P.; Takemura, H.; Utsumi, A.; Kishino, F.:
Augmented Reality: A Class of Displays on the Reality-
Virtuality Continuum. In: Proceedings of SPIE Conference
on Telemanipulator and Telepresence Technologies SPIE.
Boston, MA, 1994.
[13] Kawashima, T.; Imamoto, K.; Kato, H.; Tachibana, K.; Billinghurst, M.: Magic Paddle: A Tangible Augmented Reality Interface for Object Manipulation. In: Proceedings of the Second International Symposium on Mixed Reality (ISMR 2001), Yokohama, Japan, 2001.
[14] Billinghurst, M. and Kato, H.: Collaborative Mixed Reality.
In: Proceedings of the International Symposium on Mixed
Reality (ISMR '99). Mixed Reality-Merging Real and Virtual
Worlds, 1999.
[15] http://www.cm-labs.com/
[16] http://www.intrinsic.com/
