
A Haptic Interface Using MATLAB/Simulink



Magnus G. Eriksson and Jan Wikander

Mechatronics Lab, Department of Machine Design
The Royal Institute of Technology
Stockholm, Sweden
magnuse@md.kth.se, jan@md.kth.se



Abstract The concept of a new haptic system based on a low level interface for
MATLAB/Simulink is presented. The interface enables users to connect
dismantled, re-constructed or self-developed haptic devices, without any
commercial drivers or APIs, for verification and further development. A virtual
environment is easily created and connected to the Simulink models, which
represent the haptic algorithms and real time communication. The paradigm shift
from the commonly used C++ programming language to model based programming in
MATLAB/Simulink will attract new users to haptics. Users from other scientific
areas, e.g. mechatronics, can bring new knowledge into the haptic field and
solve control-engineering issues that will give users even more realistic
haptic feedback in future applications.


Keywords Haptic system, real time programming, MATLAB/Simulink, virtual
reality, force algorithm, control engineering


1. Introduction

The work presented in this paper is done in the context of developing haptic and
visual simulation of surgical procedures [1]. Haptics is the sense of touching
something and receiving tactile and kinesthetic force feedback. A haptic device is
connected to a virtual world to enable touching of 3D modeled objects. If there is
a collision in the virtual world between a tool and an object, the device provides
force feedback to the user.
The conventional way of creating such haptic systems is to use a commercial haptic
device, including drivers and Application Programming Interfaces (APIs), in
conjunction with available haptic libraries (HLibs). The devices cannot be used
without the licensed drivers and APIs, which limits the freedom of developing haptic
systems and applications.
The HLibs used today put high demands on the user's skills in the C++ programming
language: the HLibs are built in C++, and self-developed haptic applications must be
implemented in this language.
There are several HLibs on the haptic market. The CHAI libraries [2], the
eTouch API [3] and the H3DAPI [4] are three C++ based open source alternatives.
The main benefit of using an open source library is access to low level details,
such as adding an arbitrary haptic device or controlling your own haptic
force effects. OpenHaptics [5] and the Reachin API [6] are examples of commercially
available HLibs.
Most haptic devices are constructed with 3-6 Degrees Of Freedom (DOF)
input sensor signals and 3-6 DOF output actuator signals. Some commonly used
devices are the Sensable PHANTOM series [7], the Novint Falcon [8], the
Delta/Omega devices from Force Dimension [9] and the Freedom6S device from
MPB Technologies [10]. Each of them comes with its own commercially
licensed drivers.

In this paper we present a new haptic system setup, where a Sensable PHANTOM
Omni haptic device is dismantled and directly connected at low level to
MATLAB/Simulink [11]. Simulink enables model based programming instead of
the C++ programming language. This paradigm shift in haptics will admit new
users and extend the haptic field into new scientific areas, where users can be
oriented more towards mathematics and control system engineering than programming.
This will enable further development of the haptic field, since there are still many
control-engineering issues to be solved to give realistic haptic feedback to the user.
The area of using MATLAB for haptic applications is quite young and unexplored;
hence, it is difficult to find published information about other such systems in
the literature. However, Handshake VR Inc. [12] has developed a commercial
MATLAB/Simulink interface for the Sensable Omni haptic device. In that case the
OpenHaptics API drivers still need to be used: pre-defined haptic Simulink
blocks, connected directly to the OpenHaptics API, enable a higher level of
programming for the application. Our system differs in that the low level
connection to the haptic device allows the use of self-developed control
algorithms and haptic feedback without any drivers or APIs. All the
kinematics, transformations, collision detection and force feedback are modeled by
the user through real time communication with the sensors and actuators of the
device. Thus, implementation and verification of self-modified or self-constructed
haptic devices is possible in this system. Systems based on haptic drivers and APIs
do not have this flexibility and easy compatibility.
This system also provides an extra input for education and a greater understanding
of haptics for beginners. When conventional graphics-based haptic development
APIs and environments are used, subjects often have problems separating visualization
and haptics: they create a 3D object and add a haptic surface to it. In this system all
the haptic algorithm information is built up in Simulink, separately from the graphics.
The 3D rendering is an extra shell added at the end for visualization of the haptic
collision and the movements of the haptic device. Building the system this way
increases awareness of the separation between haptic and graphic rendering.
The requirements of a haptic system fulfilled in this work are a minimum
update frequency of the haptic loop of 1000 Hz and real time graphic rendering of
the visualized 3D objects at 30 Hz [13].

The paper is organized as follows. Section 2 describes the components used for
creating the haptic system in MATLAB. Section 3 gives an overview of haptic
modeling and implementation. In section 4, test results and verification of a specific
application are presented. Finally, section 5 gives some conclusions and a glance at
possible future work.


2. System Components

In this section the components used for running a haptic application in MATLAB
are presented. Each component is needed for the development of a complete
system. The main components are: the development platform, the real-time control
tool, the graphic rendering interface and the haptic device.

2.1 MATLAB / Simulink / Real Time Workshop

The mathematics-based programming language MATLAB is used as the base for
development of the haptic interface. MATLAB is an easy to use program for
development, analysis and visualization of algorithms and numerical computations.
Simulink is a block library that runs on MATLAB. It provides a model based
programming language for simulation and analysis. Programming takes place in a
graphical environment where the algorithms can be simulated and tested on relevant
data. Simulink is a useful tool for avoiding experimental setups and obtaining fast
simulated results.
The Real Time Workshop [14] is a plug-in to Simulink, which builds code from
the blocks of a model. The code can then be used for real time applications in
conjunction with the algorithms created by blocks in Simulink.
In this research, all the developed algorithms are implemented in Simulink and
compiled with the Real Time Workshop.

2.2 dSPACE

dSPACE [15] is a real-time tool for control prototyping and verification of
mechatronic systems. The compiled code from the Real Time Workshop is
downloaded to the dSPACE platform, where it runs in real time on the
dSPACE CPU. The basic function of dSPACE is to read sensor signals from some
external device, process the signals with the downloaded algorithms, and send
the relevant signals back to the actuators of the device.
Here the encoder and potentiometer signals from the haptic device are read
and PWM-signals are sent to the motors.

2.3 Virtual Reality Toolbox

The Virtual Reality (VR) Toolbox [16] for MATLAB is used for 3D rendering of
simulated objects. It can be used either as a Simulink block or in low level
MATLAB programming.
The virtual environment is built with the VRML-editor V-Realm Builder [17].
Each geometrical object is defined as a node in the VRML scene graph. Each node
contains fields, which can be reached from MATLAB. For example, a sphere is a node
that contains the field translation; a signal can be sent to this field from
MATLAB/Simulink to perform graphic rendering of the real time translation of the
sphere, which is how it is done in this project. The VR Toolbox is usually used to
demonstrate simulated signals from Simulink as 3D objects, but in our case real time
signals are used instead of simulated ones. Therefore dSPACE MLIB-functions are
used to bring real time signals from the dSPACE platform into the MATLAB workspace.
A script in the MATLAB workspace transfers the signals via the VR Toolbox to the
virtual environment. The Blaxxun VRML-viewer [18] enables 3D visualization of the
rendered objects. The user can explore the virtual environment with several functions
of the VRML-viewer, such as zoom, rotation and translation.

2.4 Sensable PHANTOM Omni Haptic Device

The Sensable PHANTOM Omni haptic device is used to provide haptic feedback to
the user when manipulating virtual objects. Normally the OpenHaptics drivers are
needed for the device to work with haptic applications; the drivers are not used
in this project, just the hardware. The haptic device is dismantled and the sensors
and actuators are connected at low level to dSPACE. There are six input signals
(3 encoders + 3 potentiometers) and three output signals (3 dc motors). Sensor and
actuator signals are connected to the dSPACE system, which also controls timing
in the system.
The device can be seen as an inverted robot arm, where the algorithm reads the
position and calculates a force feedback to the user if a collision occurs in the
virtual environment. The mechanical construction consists of a three-link robot
arm, three dc motors that tension wires between the joints, and six sensors used to
calculate the position and orientation of the end effector (x, y, z, yaw, pitch and
roll). There are also two buttons on the device and a pre-defined rest position that
enables calibration. Figure 1 shows the dismantled PHANTOM Omni haptic
device.



Figure 1. The dismantled PHANTOM Omni haptic device.
2.5 Drivers

Since the haptic device is dismantled, new drivers were created for easy
implementation. Motor drivers are used to convert the PWM-signals from dSPACE
into the relevant motor voltages when a collision occurs. To enable the whole
working range of the motors an 18 V supply is used, which gives a feeling of high
stiffness in the haptic feedback.
Other drivers were created for indication of the buttons and the calibration
position. These drivers are switches that give 5 V/0 V when a button is pushed/not
pushed. The high or low values give 1/0 to the slave bit in channels on the dSPACE
platform.
3. System Overview

The system overview is presented in figure 2. The MATLAB/Simulink haptic
interface is briefly described as follows. A PHANTOM Omni haptic device is
dismantled and the sensors and actuators are connected at low level to dSPACE.
The signals are transferred from dSPACE to MATLAB/Simulink (running on a PC)
by MLIB-functions. A virtual reality scene is built in the MATLAB VR Toolbox.
The procedure of the algorithm is as follows (inertia is assumed to be
negligible):
- Read the encoders and use direct kinematics to compute the position of the end effector.
- Check for collision.
- If no collision: no signals to the motors.
- If collision: calculate a force and transform it to motor torques. Send PWM-signals
  from dSPACE to the motors based on the torques. A PI-controller is used to control
  the motor current.
- Graphically render the tool object and a virtual scene for interaction.





Figure 2. System overview of the MATLAB/Simulink haptic interface.


The following subsections describe the system's functionality in detail.


[Figure 2 labels: on the PC side (MATLAB, 30 Hz) the VRML viewer, VR Toolbox and
MATLAB script receive the end effector position via MLIB and set the sphere
translation in the virtual environment; the Simulink models are compiled with the
Real Time Workshop and the code is downloaded to the dSPACE CPU (1000 Hz), which
sends 3 PWM-signals through the motor drivers as 3 motor voltages and reads back
3 encoder signals and 3 potentiometer signals from the haptic device.]
3.1 Kinematic Model

A direct kinematic model has been developed for the PHANTOM Omni haptic
device. The six sensor signals (3 encoders + 3 potentiometers) can be read from
the dSPACE CPU, and the position and orientation of the end effector (x, y, z, yaw,
pitch and roll) can be calculated. For the collision detection and haptic algorithm,
however, the position of the end effector is the only parameter of interest.
Therefore only the three encoder signals are used in the kinematic model to
calculate the position. For the graphic rendering all six sensors can be useful, to
enable depiction of the end effector's rotation.
The well known Denavit-Hartenberg [19] convention is adopted to define each link
and frame of the open chain manipulator. The sensor signals and the lengths of the
links are the necessary joint and rigid body variables for an unambiguous solution.
See figure 3 for a sketch of the links, joints and variables of the PHANTOM Omni.



Figure 3. A sketch of the links, joints and variables for the PHANTOM Omni.


All the joints are revolute; hence $\theta_1$, $\theta_2$ and $\theta_3$ are the
controlling variables in the direct kinematic model. The base frame is $O_0$ and the
end effector frame is $O_e$. The established Denavit-Hartenberg link parameters are
specified in table 1.

Link (Li)   Frame (Oi)   ai    αi     di    θi
L0          O0           -     -      -     -
L1          O1           0     π/2    0     θ1
L2          O2           a2    0      0     θ2
L3          O3           a3    0      0     θ3

Table 1. The established Denavit-Hartenberg link parameters for the PHANTOM Omni.

On the basis of the parameters in table 1, the homogeneous transformation matrix
from one frame to the next can be described as follows, where $A_i^{i-1}(\theta_i)$
is the matrix, $\theta_i$ is the variable and $i = 1..n$ is the frame index.

$$A_i^{i-1}(\theta_i) = \begin{pmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{pmatrix} \qquad \text{Eq. 1}$$

For the PHANTOM Omni haptic device n = 3, which gives the following transformation
matrices between the frames.

$$A_1^0(\theta_1) = \begin{pmatrix}
\cos\theta_1 & 0 & \sin\theta_1 & 0 \\
\sin\theta_1 & 0 & -\cos\theta_1 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix} \qquad \text{Eq. 2}$$

$$A_2^1(\theta_2) = \begin{pmatrix}
\cos\theta_2 & -\sin\theta_2 & 0 & a_2\cos\theta_2 \\
\sin\theta_2 & \cos\theta_2 & 0 & a_2\sin\theta_2 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix} \qquad \text{Eq. 3}$$

$$A_3^2(\theta_3) = \begin{pmatrix}
\cos\theta_3 & -\sin\theta_3 & 0 & a_3\cos\theta_3 \\
\sin\theta_3 & \cos\theta_3 & 0 & a_3\sin\theta_3 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix} \qquad \text{Eq. 4}$$

Based on these matrices the homogeneous transformation matrix $T_e^0$ is computed,
which yields the position and orientation of the end effector with respect to the
base frame 0 (frame 3 is equal to the frame of the end effector).

$$T_e^0 = A_1^0\,A_2^1\,A_3^2 = \begin{pmatrix}
c_1(c_2 c_3 - s_2 s_3) & -c_1(s_2 c_3 + c_2 s_3) & s_1 & a_2 c_1 c_2 + a_3 c_1(c_2 c_3 - s_2 s_3) \\
s_1(c_2 c_3 - s_2 s_3) & -s_1(s_2 c_3 + c_2 s_3) & -c_1 & a_2 s_1 c_2 + a_3 s_1(c_2 c_3 - s_2 s_3) \\
s_2 c_3 + c_2 s_3 & c_2 c_3 - s_2 s_3 & 0 & a_2 s_2 + a_3(s_2 c_3 + c_2 s_3) \\
0 & 0 & 0 & 1
\end{pmatrix} \qquad \text{Eq. 5}$$

Here $c_1$ denotes $\cos\theta_1$, $s_2$ denotes $\sin\theta_2$ and so on. In the
derived transformation matrix, the first three rows of the last column give the
global position of the end effector.

$$pos_{end\_effector} = \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix}
a_2 c_1 c_2 + a_3 c_1 c_2 c_3 - a_3 c_1 s_2 s_3 \\
a_2 s_1 c_2 + a_3 s_1 c_2 c_3 - a_3 s_1 s_2 s_3 \\
a_2 s_2 + a_3 s_2 c_3 + a_3 c_2 s_3
\end{pmatrix} \qquad \text{Eq. 6}$$

The expression for the end effector's position is built up in Simulink by connecting
blocks and signals to create an algorithm for the kinematics.
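The same kinematics can be sketched outside Simulink as well. The following Python function (a minimal sketch; the link lengths a2 and a3 used in the test below are placeholders, not the Omni's actual dimensions) evaluates Eq. 6:

```python
import math

def end_effector_position(theta1, theta2, theta3, a2, a3):
    """Direct kinematics of the 3-DOF arm (Eq. 6): joint angles [rad]
    and link lengths a2, a3 [m] give the global xyz position [m]."""
    c1, s1 = math.cos(theta1), math.sin(theta1)
    c2, s2 = math.cos(theta2), math.sin(theta2)
    c23 = math.cos(theta2 + theta3)   # cos(th2)cos(th3) - sin(th2)sin(th3)
    s23 = math.sin(theta2 + theta3)   # sin(th2)cos(th3) + cos(th2)sin(th3)
    r = a2 * c2 + a3 * c23            # radial reach in the arm's vertical plane
    return (c1 * r, s1 * r, a2 * s2 + a3 * s23)
```

With all joint angles zero the arm is stretched out along the x-axis, so the position reduces to (a2 + a3, 0, 0), which is a quick sanity check of the model.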

3.2 Collision Detection Algorithm

The pre-defined workspace of the virtual environment gives virtual boundaries that
limit the free space movements. In our case, the workspace is created as a bounding
box, shown in figure 4.























The position of the end effector is calculated by the kinematic algorithm and the
locations of the virtual walls are pre-defined. By checking the xyz-coordinates of
the end effector against the boundaries, collision detection is performed in real
time on the dSPACE CPU. The collision detection algorithm is built with blocks in
Simulink, as presented in figure 5.
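The min/max check performed by the Simulink blocks can be sketched in plain code as follows (the boundary values in the test are arbitrary illustrations, not the values used on the dSPACE platform):

```python
def check_collision(pos, box_min, box_max):
    """Axis-aligned bounding-box check of the end effector position.
    Returns (collided, penetration) where penetration holds the per-axis
    signed depth beyond a wall, 0.0 while inside the free workspace."""
    penetration = []
    for p, lo, hi in zip(pos, box_min, box_max):
        if p < lo:
            penetration.append(p - lo)   # negative: beyond the min wall
        elif p > hi:
            penetration.append(p - hi)   # positive: beyond the max wall
        else:
            penetration.append(0.0)      # inside the free workspace
    return any(d != 0.0 for d in penetration), tuple(penetration)
```

The penetration depth is exactly the proxy-probe distance d used by the force algorithm in the next subsection, one value per violated wall.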



Figure 4. Workspace in the virtual environment is pre-defined as a bounding box.


Figure 5. The collision detection algorithm for xyz-coordinates created in Simulink.

3.3 Haptic Feedback

If the collision detection algorithm finds a collision between the sphere, which
depicts the position and movements of the end effector, and the virtual walls,
haptic feedback is sent to the user. The force feedback gives the user a sense of
touching the virtual walls, and the sphere can be dragged along the boundaries.
The basic idea of the haptic algorithm is based on the well known proxy-probe
method. The probe position is equal to the global position of the end effector,
whether there is a collision or not. The proxy position is the position on the
boundary where the collision occurs. The force algorithm is a modified
spring-damper model, $F = (k\,d + c\,\dot{d})\,e$, where the spring constant k and
the damper constant c are arbitrarily chosen, d is the distance between the proxy
and the probe, and e is the normalized gradient of the collided surface. See
figure 6.
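The force law is a one-line computation per axis; a small Python sketch (the k and c values in the test are arbitrary illustrations, as the text notes they are arbitrarily chosen):

```python
def haptic_force(k, c, d, d_dot, e):
    """Modified spring-damper force F = (k*d + c*d_dot) * e, where d is the
    proxy-probe distance [m], d_dot its rate of change [m/s] and e the
    normalized gradient of the collided surface (a 3-vector)."""
    magnitude = k * d + c * d_dot
    return tuple(magnitude * ei for ei in e)
```

For example, with k = 1000 N/m, c = 5 Ns/m, a 2 mm penetration moving at 0.1 m/s into a wall whose gradient is the z-axis, the force is directed purely along z.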



Figure 6. The haptic algorithm when collision occurs in the virtual environment.
The motivation for using the spring-damper model as a relevant haptic algorithm is
as follows. Assume that the probe and the proxy are two dynamical masses that move
relative to each other, connected by a spring and a damper. The derivation and
motivation are based on the approximations presented in figure 7.



Figure 7. Approximations used for derivation of using the spring-damper model as a haptic algorithm.


Based on the approximations from figure 7, the following derivation shows that the
spring-damper model gives a relevant haptic feedback for the probe-proxy case.

Newton's second law gives:

$$\sum F_x = m\ddot{x} \qquad \text{Eq. 7}$$

$$F_{external} - kx - c\dot{x} + m_{probe}\,g = m_{probe}\,\ddot{x} \qquad \text{Eq. 8}$$

The probe is assumed to be weightless by the motivation above, so $m_{probe} = 0$:

$$F_{external} - kx - c\dot{x} = 0 \qquad \text{Eq. 9}$$

$$F_{external} = kx + c\dot{x} \qquad \text{Q.E.D.}$$




[Figure 7 annotations: the proxy is approximated by a wall that does not move; the
probe can be assumed weightless, since with no external force (no collision) it
rests in equilibrium at the proxy, probe = proxy at the wall or in the free
workspace; when a collision is detected, an external force based on the
spring-damper model displaces the probe a distance x from the proxy (wall); the
free body diagram of the probe in equilibrium contains the spring force kx, the
damper force cẋ, gravity m_probe·g and F_external.]
3.4 Motor Torques

The force that occurs when a collision is detected must be transformed into
corresponding torques for the three motors. Inertia, and hence dynamic effects, is
assumed to be negligible; therefore the motor torques do not depend on velocity and
acceleration. The mechanical construction of the PHANTOM Omni is such that two of
the motors rotate around the global x-axis and one motor rotates around the global
z-axis. See figure 8.



Figure 8. Relevant parameters for the torque algorithm.


The torque algorithm is described in relation to the information given in figure 8.
The three known positions from the origin, $p_1$, $p_2$ and $p_{end}$, give the
corresponding vector directions of the links $L_1$, $L_2$ and $L_3$. The three
dimensional force vector is determined by the haptic algorithm and located at the
position of the end effector. The algorithm gives the torques $T_1$, $T_2$ and
$T_3$ in three dimensions for each motor, but $T_{1z}$, $T_{2x}$ and $T_{3x}$ are
the only components used.

$$L_1 = p_1 - 0 \qquad \text{Eq. 10}$$
$$L_2 = p_2 - p_1 \qquad \text{Eq. 11}$$
$$L_3 = p_{end} - p_2 \qquad \text{Eq. 12}$$
$$T_1 = (L_1 + L_2 + L_3) \times F = (\ldots)\,i + (\ldots)\,j + (\ldots)\,k \;\rightarrow\; T_{1z} \qquad \text{Eq. 13}$$
$$T_2 = (L_2 + L_3) \times F = (\ldots)\,i + (\ldots)\,j + (\ldots)\,k \;\rightarrow\; T_{2x} \qquad \text{Eq. 14}$$
$$T_3 = L_3 \times F = (\ldots)\,i + (\ldots)\,j + (\ldots)\,k \;\rightarrow\; T_{3x} \qquad \text{Eq. 15}$$
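Eq. 10-15 amount to cross products of the lever arms with the collision force. A Python sketch (the joint positions in the test are arbitrary illustrative values, not measurements from the device):

```python
def cross(u, v):
    """Cross product of two 3-vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def motor_torques(p1, p2, p_end, F):
    """Eq. 10-15: torques about the motor axes from joint positions and the
    haptic force F at the end effector. Only T1z, T2x and T3x are returned,
    matching the axes the three motors actually rotate around."""
    L1 = p1                                           # Eq. 10: L1 = p1 - 0
    L2 = tuple(b - a for a, b in zip(p1, p2))         # Eq. 11
    L3 = tuple(b - a for a, b in zip(p2, p_end))      # Eq. 12
    T1 = cross(tuple(x + y + z for x, y, z in zip(L1, L2, L3)), F)  # Eq. 13
    T2 = cross(tuple(y + z for y, z in zip(L2, L3)), F)             # Eq. 14
    T3 = cross(L3, F)                                               # Eq. 15
    return T1[2], T2[0], T3[0]   # T1z, T2x, T3x
```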



3.5 PWM-signals

A PWM-signal must be sent from the dSPACE CPU to control each motor. The above
mentioned motor torques are converted to currents based on the gear ratio and the
motor type. The currents (one for each motor) are normalized to PWM-signals (0-1).
The PWM-signals are sent to the motors that tension the wires of the haptic device,
giving the user a feeling of force feedback when a collision is detected in the
virtual environment.
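The torque-to-PWM chain can be sketched as follows. The gear ratio, motor torque constant and maximum current below are hypothetical placeholder values, not the Omni's actual parameters:

```python
def torque_to_pwm(torque, gear_ratio=10.0, kt=0.02, i_max=1.0):
    """Convert a torque about a motor axis [Nm] to a normalized PWM duty
    cycle (0-1): torque/gear_ratio is the torque at the motor shaft,
    dividing by the torque constant kt [Nm/A] gives the current, which is
    normalized by i_max and clamped to the valid duty-cycle range."""
    current = torque / (gear_ratio * kt)
    return min(abs(current) / i_max, 1.0)
```

The sign of the torque would separately select the motor's direction of rotation; this sketch only produces the duty-cycle magnitude.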

3.6 PI-control of the Current

An error is established by comparing the measured actual motor currents with the
calculated motor currents. A PI-controller is implemented to reduce the errors of
the algorithm that calculates the motor currents. The integral part of the
controller removes the static error by summing the current difference every time a
PWM-signal is sent to the motors; hence, after a while the errors are reduced. The
P-factor has been tested and verified for a specific constant value. The control
signal for the current is sent back to the motor. See figure 9.



Figure 9. Simulink model of the PI-controller for one motor.


The real current in the motor is measured by using an operational amplifier
(Opamp). A resistor is mounted in series with the motor, and the Opamp measures
the voltage drop over the resistor when the motor is activated. The voltage drop
is taken into the dSPACE platform, and the motor current is obtained by dividing
the voltage drop by the resistor value.
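A discrete-time sketch of the PI loop from figure 9 (the gains and sample time in the test are illustrative, not the tuned values used in the project):

```python
class CurrentPI:
    """PI controller for one motor current:
    u = P * I_error + ki * sum(I_error) * dt."""
    def __init__(self, p_gain, i_gain, dt):
        self.p_gain, self.i_gain, self.dt = p_gain, i_gain, dt
        self.integral = 0.0

    def update(self, i_cal, i_real):
        """i_cal: calculated (desired) current [A], i_real: measured
        current [A], e.g. U_drop / R from the Opamp circuit."""
        error = i_cal - i_real            # I_error in the nomenclature
        self.integral += error * self.dt  # integration removes the static error
        return self.p_gain * error + self.i_gain * self.integral
```

Called once per PWM update, the integral term accumulates the current difference each cycle, so a persistent error steadily grows the control signal until it vanishes.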

3.7 Graphic Rendering

To make the haptic feedback more understandable, a 3D virtual environment is built
and visualized. The collision detection is based on max/min boundaries along the
xyz-axes, which are easily rendered as a virtual cube with walls at the given
boundary values. A small sphere is rendered to illustrate the movements of the end
effector of the haptic device. The sphere follows the movements of the end effector
in real time at 30 Hz, which enables visualization of the collisions with the
virtual walls. There are no deformations of the collided walls; hence only the
translation of the sphere needs to be updated in the graphics loop, since the other
objects are static. The virtual environment presented in figure 10 is rendered
using the MATLAB VR Toolbox.




Figure 10. The virtual environment including the walls and the sphere.



The work presented in sub-sections 3.1-3.6 is implemented in Simulink and, using
the Real Time Workshop, compiled and downloaded to the dSPACE processor. All the
collision detection and haptic feedback is performed on the dSPACE platform in
real time at 20 kHz. To extend the solution with 3D visualization, a virtual
environment is created and connected to the system through the MATLAB VR Toolbox,
which runs on the PC. The position of the end effector is transferred in real time
with MLIB functions from the dSPACE CPU to the PC for visualization. The
translation of the small sphere in the virtual environment directly follows the
position of the end effector in real time.
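The two update rates can be combined by decimation: the fast haptic loop runs every tick, while the graphics update fires only every Nth tick. A hedged Python sketch of this scheduling idea, using the 1000 Hz minimum haptic rate stated earlier (the callback placeholders stand in for the actual dSPACE and VR Toolbox calls):

```python
def run_loops(n_haptic_ticks, haptic_rate=1000, graphics_rate=30,
              haptic_step=lambda: None, graphics_step=lambda: None):
    """Run the fast haptic loop and trigger the slower graphics update
    roughly every haptic_rate/graphics_rate ticks. Returns the number of
    graphics updates performed."""
    decimation = max(1, round(haptic_rate / graphics_rate))  # ~33 for 1000/30
    graphics_updates = 0
    for tick in range(n_haptic_ticks):
        haptic_step()                 # kinematics, collision check, force output
        if tick % decimation == 0:
            graphics_step()           # transfer position via MLIB, move sphere
            graphics_updates += 1
    return graphics_updates
```

In the real system the two loops run on different processors (dSPACE CPU and PC) rather than in one thread, but the rate relationship is the same.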


4. Test Results and Verification

In this paper some early test results and verification of the haptic algorithm are
presented. The test is based on the application described above, with the small
sphere colliding with the walls of the virtual environment.

4.1 Product Specification

A Pentium D 2.8 GHz desktop PC with 2.0 GB RAM was used for this application.
The graphics card is an Intel 82945G Express. A dismantled Sensable PHANTOM
Omni is used as the haptic device. No drivers or API components are required,
since the low-level sensors and dc motors are connected directly. The dSPACE CPU
platform is used for real time control and routing of signals. MATLAB 7.2 and
Simulink 6.4 were used with the Real Time Workshop. The MATLAB Virtual Reality
Toolbox 4.0 was used for graphic rendering.


4.2 The Application

A 3D rendered small sphere follows the movements of the end effector, and
collision detection is performed against pre-defined virtual walls. The virtual
environment is created in the V-Realm Builder VRML-editor. For every new
application, a MATLAB m-file containing all variables must be updated before
compiling the Simulink model and downloading it to the dSPACE platform.

4.3 Test Procedure and Results

The developed haptic platform described above has been tested and verified for the
basic application of the sphere colliding with virtual walls along the global
xyz-axes. A user holds the haptic device and manipulates the virtual objects.
Haptic feedback is sent to the device when the operator moves the sphere so that
it collides with the walls, which gives the user a sense of kinesthetic and
tactile feedback. Test data has been logged from a specific test, where the user
drags the sphere against one wall and the relevant data are saved. The logged
data are the position of the probe relative to the wall, the calculated force,
the torques and the PWM-signals for the three motors. See figures 11-14. The
globally defined boundary conditions of the walls and the pre-defined spring and
damper constants of the force algorithm were also used in the analysis of the
system. The test procedure was carried out for one specific case of pre-defined
parameters and logged data. The stiffness constant in the haptic algorithm was
set to a high value to give the user a sense of collision between two rigid
materials. As mentioned above, the virtual scene was 3D rendered for visual
feedback.



Figure 11. Position of the sphere relative to
the wall.

Figure 13. The modeled torques for the three
motors.


Figure 12. The magnitude of the calculated
force.

Figure 14. The PWM-signals for the three
motors.
15
The result in figure 11 indicates that the position of the sphere follows the
position of the wall very well, but follows the user's movements into the material
when a higher force is applied. The surface is modeled to be quite stiff;
therefore the probe does not penetrate deeply into the material, as it would for a
surface modeled with a lower spring constant. From figure 12 it can be seen that
the magnitude of the calculated force directly follows the penetration distance
into the surface; this is as expected. The force is transformed into corresponding
torques for the three motors. Figure 13 shows that the modeled torques (T1z, T2x
and T3x) vary as the user drags and pushes the sphere along the surface of the
wall. T1z is zero because there is no collision in the y-direction. Any other
conclusions are hard to draw. The PWM-signal sent from dSPACE to each motor
directly follows the torque, as illustrated in figure 14. It is hereby verified
that the force algorithm and the MATLAB/Simulink haptic system work properly for
this application.

There have also been blind tests with subjects who had never tried haptics before,
and they reported a truly realistic perception of touching the virtual objects.


5. Conclusion and Future Work

In this work a haptic interface for MATLAB/Simulink has been developed. A
PHANTOM Omni haptic device is dismantled and the sensors and actuators are
connected at low level to a dSPACE platform for real time communication. The
haptic algorithm, including kinematics, collision detection, force calculation,
transformation to motor torques and a PI controller, is modelled in Simulink. The
developed algorithms, built up of Simulink blocks, are compiled with the Real Time
Workshop. A virtual reality scene is built in the MATLAB VR Toolbox for real
time 3D visualization.

There are some major benefits of using this system:
- MATLAB/Simulink enables model based programming instead of the C++
  programming language.
- The low level connection to the haptic device allows self-developed
  haptic algorithms without any drivers or APIs.
- Easy implementation and verification of self-modified or self-constructed
  haptic devices is possible.
- All the haptic algorithm information is built up in Simulink, separately from
  the graphics, which increases awareness of the separation between haptic and
  graphic rendering.

The developed haptic platform has been tested and verified for the basic
application of a 3D rendered small sphere that follows the movements of the end
effector, with collision detection performed against pre-defined virtual walls.
The good test results verified that the force algorithm and the MATLAB/Simulink
haptic system work properly for this application.

Possible future work could include some of the following suggestions:

- Creating other applications, which will require more advanced collision
  detection algorithms for arbitrarily modeled 3D objects.
- Development of physically based force algorithms, beyond the commonly used
  spring-damper model.
- In relation to previous work [1], implementing haptic milling applications in
  the MATLAB/Simulink haptic interface, enabling more ways to manipulate the
  virtual objects than just touching them, e.g. cutting, milling and shape
  deformation of the objects.
- Analysis of the impact when two stiff materials collide. Implement already
  developed force feedback algorithms for this case [20]. Investigate the
  parameters of the haptic control algorithm that are important for avoiding
  stability problems, and analyze the influence of the haptic rate on this
  problem.
- Blind tests where subjects touch both real objects and virtual objects, for
  modeling of realistic stiffness in the haptic feedback.
- In this paper the dynamic impact in the motor torque model is assumed to be
  negligible; in future work a dynamics-based model will be implemented and this
  assumption will be investigated.
- Development of a 6-DOF haptic device, which can give the user haptic feedback
  of both forces and torques, and its implementation in this system for tests and
  verification. This is important for further progress of the haptic research
  field.
- Further development of the MATLAB/Simulink haptic system so that it can be
  used in robotics education and control engineering laboratory exercises with a
  focus on haptics.



Acknowledgements

This research is part of the development of a Haptic Milling Surgery Simulator.
Within this project, exchange student Mustafa Umut Akan from Sabanci University
in Turkey has made a great contribution.


Nomenclature

α_i         Angle between z_{i-1} and z_i [rad]
θ_i         Angle between x_{i-1} and x_i [rad]
a_i         Distance along x_i from O_i [m]
A_i^{i-1}   Homogeneous transformation matrix from one frame to another [4x4]
c_i         cos(θ_i)
c           Damper constant [Ns/m]
d           Distance between the proxy and the probe [m]
ḋ           Change of the distance between the proxy and the probe [m/s]
d_i         Distance along z_{i-1} from O_{i-1} [m]
e           Normalized gradient to the collided surface
F           Haptic force [N]
F_external  External applied force [N]
g           Gravity [m/s^2]
i           Frame/link number
I_cal       Calculated motor current [A]
I_error     Motor current difference [A]
I_real      Real motor current [A]
k_i         Integration constant
k           Spring constant [N/m]
L_i         Link i
L_i (vector) Vector direction and length of link i
m_probe     Mass of the probe [kg]
m_proxy     Mass of the proxy [kg]
O_i         Frame i
O           Global origin
p_i         Joint positions relative to O
P           Proportional constant
R           Resistance [ohm]
s_i         sin(θ_i)
T_e^0       Homogeneous transformation matrix from the base frame to the end
            effector frame [4x4]
T_i         Motor torque [Nm]
U_drop      Voltage drop [V]
x           Spring length [m]
ẋ           Change of spring length [m/s]
ẍ           Acceleration in the x-direction [m/s^2]
x, y, z     Global coordinate system
x_i, y_i, z_i  Local coordinate system at O_i


References

[1] Eriksson M.G., Haptic and Visual Simulation of a Material Cutting Process,
Licentiate Thesis, KTH Stockholm Sweden; 2006.
[2] CHAI 3D, http://www.chai3d.org.
[3] eTouch API, http://www.ethouch3d.org.
[4] H3DAPI, http://www.h3dapi.org.
[5] OpenHaptics, http://www.sensable.com/products-openhaptics-toolkit.htm.
[6] Reachin API, http://www.reachin.se.
[7] Sensable Inc. PHANTOM, http://www.sensable.com/products-haptic-devices.htm
[8] Novint Falcon, http://home.novint.com/products/novint_falcon.php.
[9] Force Dimension, http://www.forcedimension.com/fd/avs/home.
[10] MPB Technologies, http://www.mpb-technologies.ca/mpbt/haptics.
[11] MATLAB/Simulink, http://www.mathworks.com/.
[12] Handshake VR Inc., http://www.handshakevr.com/.
[13] Mark W.R., Randolph S.C., Finch M., Verth J., Adding force feedback to graphics
systems: issues and solutions, 23rd Conference on Computer Graphics; 1996, pp. 447-52.
[14] Real Time Workshop, http://www.mathworks.com/products/rtw/.
[15] dSPACE, http://www.dspaceinc.com/ww/en/inc/home.cfm.
[16] The Virtual Reality Toolbox, http://www.mathworks.com/products/virtualreality/.
[17] V-Realm Builder, http://www.ligos.com/.
[18] Blaxxun VRML-viewer, http://www.blaxxun.com/.
[19] Sciavicco L., Siciliano B., Modeling and control of robot manipulators, 2nd edition,
Springer; 1999, pp. 39-77.
[20] Flemmer H., Control design and performance analysis of force reflective teleoperators
a passivity based approach, Doctoral thesis, KTH Stockholm Sweden; 2004.
