I confirm that the project dissertation I am submitting is entirely my own work and that any
material used from other sources has been clearly identified and properly acknowledged
and referenced. In submitting this final version of my report to the JISC anti-plagiarism
software resource, I confirm that my work does not contravene the university regulations
on plagiarism as described in the Student Handbook. In so doing I also acknowledge that I
may be held to account for any particular instances of uncited work detected by the JISC
anti-plagiarism software, or as may be found by the project examiner or project organiser. I
also understand that if an allegation of plagiarism is upheld via an Academic Misconduct
Hearing, then I may forfeit any credit for this module or a more severe penalty may be
agreed.
Date: 04 / 05 / 2008
The University of Surrey UAV Vision and Control System Richard Lane
Abstract
This report details the development of a UAV Vision and Control System, as a contribution
towards the second year of the Special Engineering Project. The eventual aim of the project was to
have a UAV hovering over a set of fiducials, on top of the Special Engineering Project’s UGV. The
vision side of the project progressed well throughout the year, with software completed to a
point where it can track fiducials and calculate orientation. The control side of the project
suffered greatly from incompatibility issues; however, a working control interface prototype has
been developed.
Acknowledgements
The author would like to thank the following people for their help and support through the
duration of the project. Dr. Richard Bowden for selecting me to work on such an exciting project,
and for the help and guidance given to me throughout the project. Adam Lindsay for his swift
development of the USB to analogue voltage microcontroller. Liam Ellis for the various times he
has answered my queries, and for lending me several pieces of equipment. Shiv Tailor for taking
some excellent photos for the SEP website. Lastly, Ben Abbott, Adam Lindsay, Affan Shaukat, and
Dr. Philip Jackson for their valuable ideas and input into project meetings.
Table of Contents
1 Introduction
  1.1 The Special Engineering Project
  1.2 Project Objectives
  1.3 Specifications
  1.4 Breakdown of Report
2 Background and Literature
  2.1 Unmanned Aerial Vehicles
    2.1.1 Example UAV System: “md 4-200”
  2.2 Existing Vision and Control Systems
  2.3 Remote Controlled Aircraft
  2.4 Quadrocopters
  2.5 OpenCV
    2.5.1 CvCam
    2.5.2 HighGUI
  2.6 Network API
    2.6.1 Bcastmanager
3 Vision and Control System
  3.1 System Overview
  3.2 Source of Hardware and Software
  3.3 Modifications to the Original Plan
4 Hardware
  4.1 UAV
    4.1.1 X-UFO Aerial Platform
    4.1.2 Wireless Camera
  4.2 UGV
    4.2.1 Target Fiducials
    4.2.2 Superbright LEDs
  4.3 External Laptop
    4.3.1 Laptop Specifications
    4.3.2 Resource Usage
  4.4 Control Interface
    4.4.1 Investigated Control Methods
    4.4.2 USB-PPM Interface Method 1
    4.4.3 USB-PPM Interface Method 2
    4.4.4 USB-Voltage Interface Method
5 Software
  5.1 Software Overview
  5.2 Fiducial Detection
List of Figures
List of Tables
List of Appendices
Glossary of Terms
API Application Programming Interface
CCTV Closed-Circuit Television
CVSSP Centre for Vision, Speech and Signal Processing (at The University of Surrey)
DAC Digital-to-Analogue Converter
DFT Discrete Fourier Transform
IC Integrated Circuit
IP Internet Protocol
KLT Kanade-Lucas-Tomasi feature tracker
PC Personal Computer
PCB Printed Circuit Board
PPM Pulse Position Modulation
R/C Radio Control
RS-232 Recommended Standard 232 (a serial data standard)
SEP Special Engineering Project
TCP Transmission Control Protocol
UAV Unmanned Aerial Vehicle
UGV Unmanned Ground Vehicle
USB Universal Serial Bus
1 Introduction
The original four team members, Ahmed Aichi, Edward Cornish, Peter Helland, and Martin
Nicholson, succeeded in producing a robotic platform capable of navigating itself to a certain
extent. Some parts of the project were deemed either less successful than hoped, or not useful to
the continuation of the project, and have therefore not been used again this year.
This year the project has been continued with four new team members, responsible for four more
individual projects to be integrated into SEP.
- The author is responsible for producing a vision and control system capable of hovering an
airborne platform (referred to as the UAV in the rest of this report), above the original
robotic platform (referred to as the UGV).
- Ben Abbott is responsible for the continuation and improvement of the high-level control and
onboard vision system for the UGV, as well as project management aspects such as the budget
and chairing meetings.
- Adam Lindsay is responsible for producing a generic wired and wireless sensor interface to be
polled by the decision system, including sensors useful to SEP such as odometers and
gyroscopes.
- Affan Shaukat is responsible for producing a sound source localization system capable of
detecting significant noises and relaying a direction to the decision system.
The UGV was previously designed to navigate around the CVSSP department. It was decided to
push the project in the direction of security applications, which led to the above four projects. The
vision system and generic sensor interface are to give feedback to the decision system about the
location and movement of the UGV, and audio sensing is to be used to detect interesting events for
the UGV to navigate to. The UAV is intended to provide a hovering platform, or disembodied
head, for the UGV, on which a camera could in future be mounted to take pictures of people found
in the CVSSP corridors.
Secondary Objectives:
1. Enhance the vision system so that it is possible to stabilise a UAV above the UGV without the
fixed fiducials, by tracking the UGV itself.
2. Enhance the vision system so that it is possible to stabilise a UAV above any object, by training
the system to recognise and track a specific object.
The secondary objectives are where the project could progress to, if the primary objectives are
achieved earlier than anticipated.
1.3 Specifications
1. The aerial platform must be capable of carrying a wireless camera.
2. The vision system must work at the frame rate of the camera (25-30Hz).
3. The vision system must be robust enough to keep tracking under stable conditions, and
achieve tracking for 50% of the time under unstable conditions.
4. The control system must provide full software control of the aerial platform.
5. The aerial platform must be at least as stable under automated flight as during manual flight.
Figure 1: Picture of example UAV system “md 4-200” [1]
The "md 4-200", developed by the German company "microdrones GmbH", is described as a
vertical take-off and landing, autonomous unmanned micro vehicle. It is capable of manual flight
using a traditional-style R/C transmitter, as well as automated waypoint navigation and automated
software-controlled flight.
Specification
Important specification points include [1]:
- The integration of accelerometer, gyroscope, magnetometer, air pressure, humidity and
temperature sensors to allow reliable, stable flight.
- GPS for autonomous waypoint navigation or position hold.
- Up to 20 minutes of flight time.
- 200g payload limit.
A USB R/C interface is also available: the "PPM9_USART", also developed by
"microdrones GmbH", allows the emulation of a standard R/C transmitter via software [2].
2.2 Existing Vision and Control Systems
Tournier, Valenti, How, and Feron succeeded in creating a vision-based estimation system using a
single camera mounted facing downwards on a UAV, with a tracking target in the field of view of
the camera. The tracking target used is described as “a novel target that incorporates the use of
moiré patterns” [6]. The target pattern consists of five red patches and two orthogonal moiré
patterns, as shown in Figure 2.
[7]
Figure 2: Example Vision Target using Moiré Patterns
The red dots shown in Figure 2 are used to calculate the yaw and distance of the camera relative
to the target, as well as to locate the moiré patterns. Moiré patterns in this case are created using
two sets of gratings, one a set distance on top of the other. The result is “a near constant
apparent wavelength on the image plane” [8], independent of the distance of the camera from the
target. Relative position, pitch and roll are calculated using DFT calculations on the information
provided by the moiré patterns. More detailed information on the position estimation can also be
found in the thesis entitled “Six Degree of Freedom Estimation Using Monocular Vision and Moiré
Patterns” by Tournier [9].
Although the method applied by Tournier, Valenti, How, and Feron for orientation calculations
was successful and accurate, it is far beyond the complexity which could be achieved within the
author’s project time scale. However, a similar method of calculating yaw and distance has been
employed in the author’s project, using five red fiducials. The same five fiducials are then also used
to provide relatively simple, and therefore less accurate, position, pitch and roll values.
2.3 Remote Controlled Aircraft
Miniature helicopters
These are meant for indoor flying so are fairly small, but most have limited control and also weigh
less than a wireless camera and battery, which makes most of them useless to the project. The
example shown in Figure 3a is the “Heli-Q”. Its size is shown in relation to a cigarette.
Mid-sized helicopters
These are generally used outdoors; they are fairly large and should be able to hold the weight of a
wireless camera and battery. A large downside is the size of the main rotors, which are typically
over 600mm in diameter and therefore not very safe for flying through tight corridors. Mid-sized
helicopters are also mostly for experienced R/C flyers and are notoriously hard to hover. The
example shown in Figure 3b is the “Walkera Dragonfly”, which is a standard mid-sized helicopter.
Quadrocopters
Although quadrocopters are about 600mm wide like the mid-sized helicopters, they have four
much smaller rotors, making them a good size and safer for use in corridors. They also appear
powerful enough to hold the wireless camera and battery, and are naturally stable due to onboard
gyro stabilisation. A small downside is that relatively few ready-to-use quadrocopters are
commercially available. The example shown in Figure 3c is the “X3D-BL UFO”, which is a high-end
quadrocopter.
Blimps
These would be by far the easiest option to control, but they are bulky and are made to be very
light, so it is possible a blimp would not be able to support any extra weight. The example shown
in Figure 3d is the “Tri-Turbofan” blimp.
Figure 3: Example R/C aircraft: a) Miniature helicopter “Heli-Q” [10]; b) Mid-sized helicopter
“Walkera Dragonfly” [11]; c) Quadrocopter “X3D-BL UFO” [12]; d) Blimp “Tri-Turbofan” [13]
2.4 Quadrocopters
Quadrocopters are a form of helicopter with four fixed pitch rotor blades. Two pairs of rotors
spin in opposite directions to create counter torque as shown in Figure 4. Directional motion is
achieved by independently varying the speed of the four rotors; in the descriptions below the
motors are numbered as shown in Figure 4.
- Height is controlled by varying the speed of all four rotors. Increasing the speed of rotors 1-4
increases height.
- Yaw is controlled by speeding up two opposite rotors and slowing the other two. Speeding up
1&3 and slowing 2&4 would result in turning right.
- Pitch is controlled by increasing and decreasing the speed of two opposite rotors. Assuming 1
is the forward rotor, speeding up 3 and slowing 1 will result in forward motion.
- Roll is controlled by increasing and decreasing the speed of two opposite rotors. Assuming 1
is the forward rotor, speeding up 2 and slowing 4 will result in rolling left.
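The four control mappings above can be sketched as a simple motor-mixing function. This is a hypothetical illustration using the rotor numbering described in the text (placing rotor 2 on the left and rotor 4 on the right is an assumption consistent with the roll example), not the X-UFO's actual onboard mixer:

```python
def mix(throttle, yaw, pitch, roll):
    """Rotor speeds 1-4 from normalised commands.

    Positive yaw turns right (speed up 1 & 3, slow 2 & 4); positive pitch
    moves forward (speed up 3, slow 1); positive roll rolls left (speed up
    2, slow 4), matching the descriptions in the text.
    """
    m1 = throttle + yaw - pitch   # front rotor
    m2 = throttle - yaw + roll    # left rotor (assumed position)
    m3 = throttle + yaw + pitch   # rear rotor
    m4 = throttle - yaw - roll    # right rotor (assumed position)
    return m1, m2, m3, m4
```

Note that every command is a differential change about the shared throttle term, which is why the four bullet points each describe speeding some rotors up while slowing their opposites down.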
Figure 4: Quadrocopter rotor configuration [14]
2.5 OpenCV
“OpenCV (Open Source Computer Vision) is a library of programming functions mainly aimed at real
time computer vision. Example applications of the OpenCV library are Human-Computer Interaction
(HCI); Object Identification, Segmentation and Recognition; Face Recognition; Gesture Recognition;
Motion Tracking, Ego Motion, Motion Understanding; Structure From Motion (SFM); and Mobile
Robotics.” [15]
OpenCV was originally developed by Intel. Several parts of OpenCV are used within the author’s
project: CvCam and HighGUI, which are described below, as well as common mathematical
functions, drawing functions, and data structures.
2.5.1 CvCam
“CvCam is a universal cross-platform module for processing video stream from digital video cameras.
It is implemented as a dynamic link library (DLL) for Windows and as a shared object library (so) for
linux systems and provides a simple and convenient Application Programming Interface (API) for
reading and controlling video stream, processing its frames and rendering the results. CvCam is
distributed as a part of Intel’s OpenCV project under the same license and uses some functionality of
the Open Source Computer Vision Library.” [16]
CvCam is used as the method for initialising the wireless camera, streaming video and then
accessing individual video frames for processing.
2.5.2 HighGUI
“While OpenCV is intended and designed for being used in production level applications, HighGUI is
just an addendum for quick software prototypes and experimentation setups. The general idea
behind its design is to have a small set of directly useable functions to interface your computer vision
code with the environment.” [17]
HighGUI is used to create a window for displaying video frames from CvCam; it is also used for
some very basic user event handling.
2.6 Network API
2.6.1 Bcastmanager
Bcastmanager is an additional part of the Network API which allows the use of broadcast names
when establishing a connection instead of specific IP addresses. This helps when using the
Network API to communicate between two pieces of software on different PCs, as the IP address
allocated to a PC on one day may be different on another day, which makes it impractical to use
IP addresses directly. Bcastmanager works by running on both target PCs and broadcasting each
PC’s name and IP address across the network. Each PC picks up the other’s name and IP address
and passes this information on to the Network API, so the Network API on each PC knows the
current IP address of the other PC.
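The name-broadcast idea can be illustrated with a minimal UDP sketch. The message format, port number, and function names here are assumptions for illustration only; the real Bcastmanager protocol is not documented in this report:

```python
import socket

def make_announcement(name, ip):
    """Encode a PC's broadcast name and current IP as a datagram payload."""
    return f"{name}:{ip}".encode()

def parse_announcement(data):
    """Recover (name, ip) from a received announcement."""
    name, ip = data.decode().split(":", 1)
    return name, ip

def broadcast(name, ip, port=9050):
    """Send the announcement to every host on the local network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(make_announcement(name, ip), ("255.255.255.255", port))
    sock.close()
```

A listener on the other PC would bind a UDP socket to the same port, receive the datagram, and hand the parsed name/IP pair to the Network API.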
4 Hardware
4.1 UAV
4.1.1 X-UFO Aerial Platform
The X-UFO is an R/C (remote controlled) quadrocopter made by a company called Silverlit (see
Appendix C for specification and picture). It has an onboard mechanical gyro and control circuit,
which automatically handles speed control for the four motors and compensates for most
unwanted pitch and roll. Having four automatically regulated propellers makes the X-UFO
naturally stable, because of this hovering requires minimal effort compared to traditional
helicopters; the user need only control the height and compensate for drifting. The X-UFO is also
capable of lifting the extra weight of a wireless camera and 9V battery, but care needs to be taken
when mounting extra weight to maintain the centre of gravity.
The X-UFO is supplied by a 12V 350mAh battery, which causes a problem when developing:
eight hours of charging results in less than five minutes of flying/development time.
To solve this problem a 12V 5A power supply was purchased and modified with a 10 metre long
light weight cable. An extra socket for the power supply was also added to the X-UFO (shown in
Figure 7), giving the flexibility to use either the power supply or original battery.
4.1.2 Wireless Camera
Mounting
A wireless camera is to be attached to the underside of the X-UFO to provide visual feedback to
the decision systems by referencing a set of fiducials on top of the UGV. The camera itself is small
and light, and can easily be mounted to the X-UFO’s foam body; however, the 9V battery required
to power it is relatively bulky. This may cause some future issues with the centre of balance if not
mounted correctly. Tests have shown that stability is not an issue if the battery is held within the
X-UFO’s battery cradle, however weight is an issue and the X-UFO struggles to gain much height
with the camera and battery mounted (shown in Figure 8). See Appendix E for pictures and
technical specifications of the wireless camera.
Interference
The camera transmits on a set frequency channel at around 2.4GHz, which could cause two issues.
Firstly, it could interfere with local wireless computer networking (which also operates at 2.4GHz).
Tests have already shown that if a wireless network tries to use the same channel, the network will
be completely blocked; however, on different channels both systems can be used together.
Secondly the RF link for the wireless sensor system being developed by Adam Lindsay also uses
channels at around 2.4GHz. This should be less of a problem as a feature of the wireless sensor
system link is to be able to channel hop and automatically pick a noise free channel. During testing
no interference has been detected between the wireless camera transmitter and wireless sensor
system.
WinDVR Setup
InterVideo WinDVR is the software which comes with the wireless camera receiver. Once
installed, the following options need to be set before use with the Vision and Control System
software. The setting under Setup → Option → Device → Standard should be changed to
“PAL_D”. The options under Setup → Option → Display → Video should be set as shown in Figure 9.
4.2 UGV
4.2.1 Target Fiducials
The target fiducials to be used for the vision tracking system consist of five ultra-bright diffused red
LEDs. Each of the five small LED-and-resistor stripboards can be bolted on top of the existing
mount points for the UGV’s ultrasound sensors, or to mount points on this year’s extended
platform. The LEDs are set out with one in each corner and one at the front centre, as
shown in Figure 10. The picture shown in Figure 10 is a cardboard template with the LED
stripboards attached to it. The template, which is to the same scale as the UGV mount points, is
used to hold the LEDs in the correct orientation for testing and demonstration purposes.
Figure 10: Cardboard fiducial template (420mm × 280mm)
It was decided that the best power supply for the LEDs was one of the UGV laptop’s USB ports,
as this would save weight and cost by not requiring extra batteries. Each small LED board (shown
in Figure 11) plugs separately into a main supply board (shown in Figure 12), to facilitate easy and
neat attachment to the UGV. The supply board can then be hidden underneath the UGV’s laptop
platform.
4.2.2 Superbright LEDs
To provide the best visibility of the LEDs on video, large superbright 8mm LEDs were purchased,
as tests using standard-brightness 5mm LEDs showed that the LEDs could not be easily
distinguished at distances of over a metre. A test board (shown in Figure 13) containing four
superbright LEDs was built to find how many LEDs would be required for each fiducial. Tests
showed that just one superbright LED is easily visible using the wireless camera at distances of
over 3 metres.
4.3 External Laptop
4.3.1 Laptop Specifications
The laptop used for development and testing of the project code was an Apple MacBook Pro
running Windows XP Pro. The image processing functions developed for the project can be very
processor-intensive, depending on the operating conditions. For future reference, in case of code
re-use on a different laptop, some relevant specifications of the laptop used are listed below;
these were taken from the Mac OS System Profiler. The resource usage data described in
section 4.3.2 should also be taken into account.
4.3.2 Resource Usage
Below, some average CPU usage figures for the Vision and Control System software are given,
with increasing amounts of the Vision and Control software running on the laptop described in
section 4.3.1.
4.4 Control Interface
4.4.1 Investigated Control Methods
During the course of the project three control interface methods were investigated. These are
referred to as USB-PPM Interface Method 1, USB-PPM Interface Method 2, and the USB-Voltage
Interface Method, and are described in detail in the following sections. The USB-Voltage Interface
Method was finally chosen for the control interface, as it was the only method to prove reliable.
4.4.2 USB-PPM Interface Method 1
Method
The original plan was to attempt to use the USB-PPM box to control the X-UFO via the original
transmitter’s buddy port. Assuming they were compatible this would allow a very tidy control
method for the X-UFO from software.
Initial Results
On receipt, the X-UFO and USB-PPM box proved to be incompatible. It was found that
the X-UFO transmitter does not use a standard PPM signal for its buddy control. Instead it uses a
very different pulse system, which is then converted to a PPM signal just before transmission. This
result had been anticipated, as the X-UFO transmitter is intended as a toy rather than a
professional R/C transmitter. The planned solution was to try using a professional R/C transmitter
instead, see 4.4.3.
4.4.3 USB-PPM Interface Method 2
Method
This secondary plan was to attempt to use the USB-PPM box to control the X-UFO via a third-party
R/C transmitter’s buddy port. This, however, relies on both the USB-PPM box being
compatible with the transmitter and the transmitter being compatible with the X-UFO. Again, if
compatible, this would allow a very tidy control method for the X-UFO from software.
Solution one was to try and use the Laser 6 transmitter (which has a 35MHz frequency) to control
the X-UFO by changing the X-UFO’s receiver crystal to 35MHz. This failed presumably because
the rest of the X-UFO’s receiver circuitry is only compatible with 26MHz channels. Two new
aerials of correct length for receiving 35MHz R/C were also made and tested with the X-UFO
with no success. The calculations used for length of the aerial required are shown below.
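The calculation itself has not survived in this copy of the report. A plausible reconstruction, assuming a quarter-wave receiving aerial at the 35MHz carrier frequency, is:

```latex
\lambda = \frac{c}{f} = \frac{3 \times 10^{8}\,\mathrm{m/s}}{35 \times 10^{6}\,\mathrm{Hz}} \approx 8.57\,\mathrm{m},
\qquad
L_{\lambda/4} = \frac{\lambda}{4} \approx 2.14\,\mathrm{m}
```

Whether the original used a quarter-wave or some other fraction of the wavelength is not recoverable from this copy.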
Solution two was to try and use the Hitec R/C transmitter to control the X-UFO (which has a
26MHz frequency) by changing the transmitter’s crystal to 26MHz. This also failed presumably
because the rest of the transmitter’s circuitry is only compatible with 35MHz channels. However it
was noticed that the X-UFO did respond correctly to the transmitter when the transmitter’s
aerial was rested on the X-UFO’s mains power cable. The only possible explanation for this is that
the RF signal from the transmitter was propagating up the power cable (this could be seen using an
oscilloscope) and was being picked up by the receiver. This could be made to work 50% of the
time by securing the power cable alongside the transmitter’s aerial (as shown in Figure 14). This
method could have been used as a temporary solution had it been more reliable; obviously an
unreliable control interface is extremely undesirable with aerial vehicles. It did however prove the
important concept that the X-UFO could be controlled using the signal from the USB-PPM box.
Figure 14: Unreliable X-UFO Control Method using Hitec Laser 6 Transmitter
4.4.4 USB-Voltage Interface Method
Method
After little success with the previously investigated USB-PPM interface methods, it was decided to
take a different approach. The original X-UFO transmitter was dismantled to find that each channel is
essentially driven by a voltage from a potentiometer (attached to the control sticks). This means
that the transmitter could be modified to be driven by a software controlled voltage source. It was
decided that Adam Lindsay would strip down the current code for the wireless sensor system
microcontrollers, so that they would take a servo value via RS-232 and output the servo value as a
voltage from the microcontrollers’ DACs. Four channels are required, which can be achieved
using two microcontrollers with two DACs each. The original prototype for this system is shown
in Figure 15. The voltages from the microcontrollers can then be amplified and used to drive the
original X-UFO transmitter. Several modifications are required to the X-UFO transmitter, which
are described in later sections along with the intermediate circuits required.
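The software side of this arrangement might be sketched as follows. The two-byte channel/value framing, the 8-bit DAC range, and the port name are assumptions for illustration, not the actual protocol used by the SEP microcontrollers:

```python
def encode_channel(channel, value):
    """Pack a channel number (0-3) and a servo value (0-255) into two bytes."""
    if not (0 <= channel <= 3 and 0 <= value <= 255):
        raise ValueError("channel must be 0-3 and value 0-255")
    return bytes([channel, value])

def dac_voltage(value, vref=5.0):
    """Voltage an 8-bit DAC would output for a given servo value."""
    return vref * value / 255

# With a real serial port this might be driven as (pyserial assumed):
#   import serial
#   port = serial.Serial("COM3", 9600)
#   port.write(encode_channel(0, 128))  # channel 0 to roughly mid-stick
```

The microcontroller end would decode each pair, route the value to the matching DAC, and the resulting voltage would then be amplified to drive the transmitter's potentiometer inputs.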
Holes were drilled in the front of the transmitter for the switches, as well as a hole for the five-pin
voltage input socket. These modifications are shown in Figure 18. Care should be taken when the
X-UFO is powered on: if the switches are in the software-control position but no input is present,
the X-UFO will behave unpredictably.
Results
This method provides a working interface between software and the X-UFO aerial platform, albeit
not as simple as the original plan. Servo values can be sent via software to the modified X-UFO
transmitter and the input for each channel can be selected using the switches on the front of the
transmitter.
5 Software
5.1 Software Overview
Process
For each frame of video loaded by CvCam a callback function is called; most of the Vision and
Control System code sits within this callback function. Video is rendered at 30Hz, which means
that 30Hz is also the frequency of the system code. The overall working of the system is described
below as pseudocode, and each part is described in detail in later sections. The full software code
can be found in Appendix C.
callback loop
    if Network API communications are connected
        if height message received
            set acknowledgement bit
            set new target height
    process odd frame
        choose 5 reference points
        get decision commands from reference points
        if commands are valid
            adapt values for grouping thresholds
            send control values
        else
            set error bits
    show control visualisation
    if message bits set
        send message to UGV decision system
loop
Video acquisition is achieved by using CvCam, part of the OpenCV library. During initialisation a
callback function is passed to CvCam; the callback function is then called by CvCam for every
video frame taken from the wireless camera. Most of the project code is then run from the
callback function, which has access to the most current image data structure every time it is called.
Video is then rendered to a window using HighGUI, another part of the OpenCV library.
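The callback arrangement can be sketched as follows. This is an illustrative sketch using a generic frame source in place of CvCam (which has long since been removed from OpenCV), not the project's actual code:

```python
def run_callback_loop(frame_source, callback):
    """Invoke `callback` once per frame, mirroring CvCam's callback model."""
    processed = 0
    for frame in frame_source:
        if frame is None:          # end of stream
            break
        callback(frame)            # all per-frame vision/control work
        processed += 1
    return processed

# With a live camera and the modern OpenCV API this might be driven as:
#   import cv2
#   cap = cv2.VideoCapture(0)
#   while True:
#       ok, frame = cap.read()
#       if not ok:
#           break
#       per_frame_callback(frame)
```

Keeping all per-frame work inside one callback, as the report describes, means the processing rate is pinned to the camera's frame rate, which is why the system runs at 30Hz.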
Fiducial detection starts by scanning through pixels in a video frame and grouping them together if
they meet the required criteria. These grouped pixels, or possible LEDs, are sorted based on the
likelihood of them actually being an LED. Likelihood is calculated using shape, size and orientation
as the criteria; the five possible LEDs with the highest likelihood are then assumed to be the five
fiducials.
Process
Pixels are examined one at a time: from left to right on each line, and each line from top to bottom
of the frame. Each pixel is tested against one of two criteria, bright white or strong red, as these
are the best criteria available to describe the pixels making up an LED in the image data structure.
If a pixel matches either criterion it is grouped into an existing or new possible LED object; the
pixel grouping algorithm is described in a later section. For easy visualisation of relevant pixels,
bright white pixels are changed to pure white, strong red pixels are changed to pure red, and all
other pixels are changed to black within the processed field.
Criteria
Each pixel is stored in the image data structure as three colour components: red, green and blue.
For a pixel to be classed as bright white, the sum of the red, green and blue values must be
greater than the adaptive white threshold. For a pixel to be classed as strong red, the green and
blue values are deducted from the red value; the result must then be greater than the adaptive red
threshold.
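As a sketch, the two pixel tests reduce to simple predicates. The threshold values below are illustrative placeholders only; in the real system both thresholds adapt at runtime.

```cpp
#include <cassert>

// Illustrative fixed thresholds; the real system adapts these at runtime.
const int kWhiteThreshold = 600; // R + G + B must exceed this for "bright white"
const int kRedThreshold   = 100; // R - G - B must exceed this for "strong red"

// A pixel is bright white if its colour components sum above the threshold.
bool isBrightWhite(int r, int g, int b) {
    return (r + g + b) > kWhiteThreshold;
}

// A pixel is strong red if red minus the other components exceeds the threshold.
bool isStrongRed(int r, int g, int b) {
    return (r - g - b) > kRedThreshold;
}
```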
Process
Once a pixel is determined to be relevant it needs to be grouped; the result is a list of possible
LEDs, each containing an averaged [x,y] position in the frame, the number of pixels, and a
likelihood factor which is used later on. A relevant pixel’s [x,y] position is compared to the position of each
possible LED. The pixel is added to the closest possible LED, as long as the distance is below the
adaptive distance threshold. The possible LED position is averaged with the pixel position and the
pixel count is increased. If the pixel is not close enough to any current possible LED a new
possible LED is created.
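The grouping step above can be sketched as follows. The structure and function names are illustrative, not the real SEP classes:

```cpp
#include <cmath>
#include <vector>

// Sketch of a possible LED group (illustrative names, not the SEP classes).
struct PossibleLED {
    double x, y;       // averaged position
    int pixels;        // number of grouped pixels
    double likelihood; // modified later by the sorting criteria
};

// Add a relevant pixel to the closest possible LED within the distance
// threshold, averaging the position; otherwise start a new possible LED.
void groupPixel(std::vector<PossibleLED>& leds, double px, double py,
                double distThreshold) {
    int best = -1;
    double bestDist = distThreshold;
    for (std::size_t i = 0; i < leds.size(); ++i) {
        double d = std::hypot(leds[i].x - px, leds[i].y - py);
        if (d < bestDist) { bestDist = d; best = static_cast<int>(i); }
    }
    if (best >= 0) {
        PossibleLED& led = leds[best];
        led.x = (led.x * led.pixels + px) / (led.pixels + 1); // running average
        led.y = (led.y * led.pixels + py) / (led.pixels + 1);
        led.pixels += 1;
    } else {
        leds.push_back(PossibleLED{px, py, 1, 1.0});
    }
}
```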
Process
Once a list of possible LEDs is obtained, the list must be sorted to decide which of the possible
LEDs are most likely to be the target fiducials. This is done by assigning each possible LED a
likelihood of being one of the five fiducials. This likelihood value is then modified using four
criteria: Group Pixel Threshold, Shape Analysis, Size Histogram, and Relative Orientation. For
each criterion a likelihood is calculated for each possible LED and multiplied with the possible
LED’s current likelihood. The five possible LEDs with the highest likelihood are then taken to be
the five fiducials.
Shape Analysis
The shape of the target fiducials in a frame will generally be circular, which means possible LEDs
found not to be circular can be given a lower likelihood. The shape analysis starts at the centre
point of each possible LED and works outwards one pixel at a time, checking that each pixel is
not black (a non-useful pixel). This is done in eight directions, four perpendicular and four
diagonal; the distances between the centre and the edge of the shape are recorded. The
differences between distances are then summed, perpendicular first, then diagonal, and then both.
These sums are used to produce a likelihood that the possible LED is circular.
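A minimal sketch of the circularity measure, assuming the metric is the summed pairwise differences of the eight recorded ray lengths (a perfectly circular blob gives a sum near zero):

```cpp
#include <cstdlib>
#include <vector>

// Sum of absolute pairwise differences between the eight centre-to-edge
// distances; smaller sums indicate a more circular possible LED.
// (Assumed metric - the real code sums perpendicular, diagonal and combined
// differences separately.)
int raySpread(const std::vector<int>& distances) {
    int sum = 0;
    for (std::size_t i = 0; i < distances.size(); ++i)
        for (std::size_t j = i + 1; j < distances.size(); ++j)
            sum += std::abs(distances[i] - distances[j]);
    return sum;
}
```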
Size Histogram
Each of the target fiducials in a frame will generally be represented by a similar number of pixels.
Therefore, if a histogram is taken and five possible LEDs fall in the same histogram group, there is
a strong likelihood that they are the target fiducials. It is also very likely that the five possible
LEDs will fall in two adjacent groups. For this reason the histogram likelihood process is repeated
twice with different histogram widths for improved accuracy. Once each histogram has been
produced, likelihoods for each possible LED are calculated from the number of items in the
histogram group to which the possible LED belongs.
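One histogram pass can be sketched as below. The scaling of the bin population to a 0-1 likelihood, saturating at five (the number of fiducials), is an assumption for illustration:

```cpp
#include <map>
#include <vector>

// Sketch of one histogram pass: likelihood grows with the population of the
// LED's own bin, saturating once five LEDs share a bin (assumed scaling).
std::vector<double> histogramLikelihoods(const std::vector<int>& pixelCounts,
                                         int binWidth) {
    // Count how many possible LEDs fall in each histogram bin.
    std::map<int, int> bins;
    for (int p : pixelCounts) bins[p / binWidth]++;
    std::vector<double> out;
    for (int p : pixelCounts) {
        double l = bins[p / binWidth] / 5.0;
        out.push_back(l > 1.0 ? 1.0 : l);
    }
    return out;
}
```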
Relative Orientation
Possible LEDs which form right angles or straight lines with each other are more likely to be the
target fiducials because of the target fiducial layout. To start the orientation likelihood process the
angle between two possible LEDs relative to another possible LED is calculated for every
combination of possible LEDs which do not already have a likelihood of 0.0. If a combination is
found which is within a set threshold of 90° or 180°, this combination is stored in a list for
later reference. The list contains details of the reference possible LED and the product of all three
possible LEDs’ likelihoods. Once the list is complete, the maximum product of likelihoods for each
reference possible LED is used as the likelihood that the reference possible LED is a fiducial. This
results in a large likelihood for combinations where all three possible LEDs already have a high
likelihood. Combinations which happen by chance will not have all three possible LEDs with high
likelihood and will therefore have a small product of likelihoods.
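The angle test in this process can be sketched as:

```cpp
#include <cmath>

// Angle (in degrees) formed at reference point R by points A and B.
double angleAt(double rx, double ry, double ax, double ay,
               double bx, double by) {
    const double PI = 3.14159265358979323846;
    double a1 = std::atan2(ay - ry, ax - rx);
    double a2 = std::atan2(by - ry, bx - rx);
    double deg = std::fabs(a1 - a2) * 180.0 / PI;
    return deg > 180.0 ? 360.0 - deg : deg;
}

// A combination is kept if the angle is close to a right angle or a
// straight line, within the given tolerance.
bool keepCombination(double deg, double tol) {
    return std::fabs(deg - 90.0) < tol || std::fabs(deg - 180.0) < tol;
}
```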
Results
Results for the possible LED sorting are shown in several screenshots in Figure 20. The
screenshots show the progressive improvement of sorting, as each likelihood algorithm is added.
Possible LEDs are highlighted by green rings, where the size of the ring represents the likelihood
that the possible LED is actually a target fiducial. Part e) of Figure 20 shows the final result, with
the five largest rings around the five fiducials, indicating that all five fiducials would be correctly
picked out from the original possible LEDs.
Figure 20: Screenshots showing the effect of likelihood LED sorting algorithms
a) Example exaggerated noisy frame created to test the likelihood method of sorting possible
LEDs. The frame contains shapes which could be mistaken for target fiducials, some of which are
fairly common in strongly lit locations.
The results of the pixel size histogram (using the example frame shown in Figure 20) are shown in
Figure 21. For the two histograms performed there are two clear groups that are most likely to
contain possible LEDs which are target fiducials.
Figure 21: Histogram plots showing distribution of the amount of pixels in possible LEDs
Process
Once five significant fiducials have been discovered in a frame, the camera’s orientation relative to
the fiducials can be calculated. Figure 22 shows the fiducials as they should be numbered; the
steps listed below describe how the fiducials are given the correct number and how orientation is
calculated.
Figure 22: Fiducial numbering layout, showing the forward direction of the camera and the
average of the five points
Angle Calculations
The following calculation is used to find the angle of one point relative to a reference point.
When used in the orientation calculations, the reference point is the average of the five fiducial
locations. Resulting angles are relative to the camera forward direction shown in Figure 22; this
direction is taken as 0°.
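A sketch of this angle calculation, assuming the usual image convention that y grows downwards, so the camera forward direction is negative y:

```cpp
#include <cmath>

// Angle of a point relative to a reference point (the average of the five
// fiducials), measured clockwise from the camera forward direction (0°).
// Assumes image coordinates with y increasing downwards.
double angleFromForward(double refX, double refY, double px, double py) {
    const double PI = 3.14159265358979323846;
    double deg = std::atan2(px - refX, refY - py) * 180.0 / PI;
    return deg < 0.0 ? deg + 360.0 : deg;
}
```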
Results
Figure 23 shows a screenshot of orientation data being displayed on the same noisy test frame
used to demonstrate likelihoods. Four lines should connect the fiducials in a rectangular shape;
this shows the fiducials are being numbered correctly. These lines can be red or green: red is
normal, and green visualises the direction of pitch and roll. Blue lines show the forward directions
of the camera and the fiducials. The green line shown between points 3-4 in this example means
that the UAV would be rolling to the left; the green line shown between points 1-4 means that the
UAV would be pitching forwards.
Process
A protocol has been implemented for communications between the UAV decision system and the
UGV decision system using Network API and Bcastmanager, which were developed by Ahmed
Aichi during the first year of SEP. Bcastmanager is used to broadcast an address string across
the network, which removes the need to know the target program’s IP address. Bcastmanager
then finds the IP address of the target program, which is also broadcasting an address string, and
establishes a connection for Network API. Once connected, the UAV decision system checks for
a received message before each video frame is processed. After each video frame is processed,
any messages ready to be sent out to the UGV decision system are transmitted, for example to
acknowledge a received message and/or report an error.
Purpose
For the purpose of testing UAV to UGV communications, an emulation program was written
which pretends to be the UGV decision system (the code is included in Appendix E). Once
running, it establishes a connection with the UAV decision system using Network API and
Bcastmanager, in the same way that the UGV decision system would. It then picks up error
messages sent from the UAV decision system and displays their meaning in a console window.
This provides an excellent way of checking when error messages are being sent and that they are
being sent correctly. The UGV emulator also sends a height request every few seconds to the
UAV decision system. The height sent increments by 0.1 metres with each message until an
“invalid command” message is received back from the UAV decision system; it then decrements
with each message until another “invalid command” message is received. This is used to test the
UAV decision system’s response to height requests which could be out of the allowed range, and
to check that it accepts a valid height as its new target height.
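The emulator’s sweep reduces to a simple step that reverses direction whenever the UAV system would reject the height. Here validHeight() is a stand-in with an assumed range; the real check lives in the UAV decision system:

```cpp
// Stand-in for the UAV decision system's range check (range is assumed).
bool validHeight(double h) { return h >= 0.5 && h <= 3.0; }

// Advance the requested height by one step, reversing the sweep direction
// when the next request would draw an "invalid command" reply.
double nextHeight(double h, double& step) {
    double next = h + step;
    if (!validHeight(next)) {
        step = -step;   // reverse on rejection
        next = h + step;
    }
    return next;
}
```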
There are several types of error which could occur during use of the Vision and Control System;
the anticipated errors and how they are handled are described below.
information or the fiducials available until tracking is restored. This would be part of the
automated control implementation which is incomplete.
Progress
Due to the number of issues encountered in getting the hardware control interface working, as
described in previous sections, the control integration of the project software remains largely
incomplete. The actual software-hardware communications are in place for both types of control
method: the USB-PPM box or the USB-Voltage microcontroller. Servo values can be sent to
either piece of hardware; however, the calculation of the servo values is incomplete.
5.4.2 Visualisation
Process
A small visualisation box is drawn in the corner of the video window. It gives a visual
representation of the control servo values being sent out of the interface. Four lines represent the
four control sticks on the R/C transmitter. The lengths of the lines represent the servo values.
Figure 24 shows a screenshot of a test frame containing the control visualisation.
Figure 24: Control visualisation (lines labelled Pitch, Roll, Yaw and Throttle)
Purpose
A small Remote Control Test program was written to provide a simplified method of testing the
control interface between software and R/C transmitter, without running the full vision software.
It is compatible with both interface methods, using the USB-PPM Box or the USB-Voltage
microcontroller; the method used is set via a compile option. The program allows use of the
keyboard to change the servo values sent out by the selected interface. With each key press the
appropriate servo value is incremented/decremented by a set step size. Instructions are shown on
screen in the form of a transmitter (shown in Figure 25); pressing space resets the servo values to
default, and pressing esc closes the program.
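The per-key update reduces to a clamped step; the range and step size below are assumed values for illustration, not taken from the SEP code:

```cpp
// Increment/decrement a servo value by one step, clamped to its range.
// The 0-255 range and the step size are assumptions for this sketch.
int stepServo(int value, int step, int minV = 0, int maxV = 255) {
    int v = value + step;
    if (v < minV) return minV;
    if (v > maxV) return maxV;
    return v;
}
```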
6.1 Hardware
Live tests were conducted to assess the mobility of the X-UFO as a UAV platform. The first test
used the mains supply umbilical with the camera and 9V battery mounted on the X-UFO.
Upward acceleration was found to be fairly sluggish, gaining height at a rate of about 0.5
metres/sec. As a result of the extra weight, the rotors have to spin faster to keep the UAV in
flight; turning speed is therefore also reduced because the maximum difference between rotor
speeds is reduced. Overall, however, the mobility of the UAV is still acceptable: as shown in
Figure 26, the UAV was capable of taking off and manoeuvring around the test area from about
0.5 metres up to the ceiling at 3.0 metres. It is also worth noting that with the extra weight
centrally distributed there were no visible negative effects on platform stability.
Figure 26: Screenshots from video showing UAV mobility with camera mounted
The second test used the mains supply umbilical with the camera mounted on the X-UFO, but
without the 9V camera battery. Overall mobility was greatly improved, with upward speeds of
around 1.0 metres/sec. Therefore, if greater mobility is required in the future, it is recommended
that the wireless camera also be powered from the same supply as the X-UFO.
The third test used the X-UFO battery to supply the X-UFO, with the camera and 9V battery
also mounted. The extra weight from the camera and 9V battery almost prevents the X-UFO
from taking off. Height can barely be gained and, due to the short flight time of the X-UFO
battery, flight was only achieved for 30 seconds before the battery could not keep up with the
demand of full power to the motors. This test was repeated without the 9V camera battery and
results similar to the first test were achieved. However, flight time was limited to about 4 minutes
before it was impossible for the UAV to gain height. Therefore, if a battery-powered UAV is
required, several steps could be taken.
- A reduction of weight on the UAV, such as the camera using the same battery as the X-UFO,
and/or the removal of safety precautions such as the foam padding surrounding the rotors.
- An upgrade to the X-UFO battery. Batteries with higher capacity are readily available in
professional R/C stores.
- An upgrade to the UAV platform. There are several higher rated quadrocopters available
commercially. This option was considered at the start of the project however due to their
price they were not suitable.
Tests were conducted using the USB-Voltage interface method (described in section 4.4.4) and
the Remote Control test software (described in section 5.4.3). These tests proved that the
control interface works, with no noticeable lag between pressing a key on the laptop and the
X-UFO reacting appropriately. As the Remote Control software provides servo values in 10
steps, it is an impractical control method for actual flight. However, whilst holding the top of the
X-UFO to prevent collisions, it was clear that each channel functions correctly whilst varying the
values from minimum to maximum.
6.2 Software
Live tests were conducted to assess the fiducial tracking algorithm. The screenshots in Figure 27
show successful fiducial detection, with the likelihood of possible LEDs being the fiducials
represented as the size of the green circle marking them. The normal light condition screenshot
shows some possible LEDs which have been discarded; they are marked as tiny green dots due to
the extremely low likelihood of them being the fiducials. The normal light condition screenshot
shown in Figure 28 is also a good example of fiducial detection working correctly. Small white
reflections off a radiator can be seen; these would have been picked up as possible LEDs,
however they have been successfully discarded.
Despite coping with noise extremely well compared to previous detection algorithms, detection
problems still occur in some situations. Where the LEDs are very close to reflective surfaces, the
reflection on the surface can merge with the light from the LED. This results in a larger mass of
light in roughly the same position as the LED, which will be discarded by the fiducial detection as
the shape will not be circular or the same size as the other fiducials. In cases where there happens
to be noisy light in the frame which is the same shape and size as a fiducial, the algorithm can also
fail. If the noise also happens to be at right angles to the other fiducials, this can lead to the noise
being picked over an actual fiducial.
Figure 27: Screenshots showing Fiducial Detection in Low and Normal Light Conditions
Live tests were also conducted to assess the orientation calculations. Due to the relatively simple
nature of the orientation algorithm, as long as fiducial detection is successful, the orientation
calculations are successful 100% of the time. Screenshots showing successful orientation
calculations are shown in Figure 28.
Figure 28: Screenshots showing Orientation Calculations in Low and Normal Light Conditions
7 Additional Work
A Gantt chart was produced at the start of the project (shown in Appendix A) which included a
breakdown of the expected tasks. The time allocated to some of the tasks proved overly
optimistic and did not take into account busy times of the year or the four-week gap between
semesters. There were also some unanticipated tasks that took up several weeks. At the time of
writing the Interim report the Gantt chart was revised; it has since been updated again with the
time taken for each task and the addition of extra tasks (shown in Appendix B).
In the Gantt charts provided in Appendix A and Appendix B, orange squares show the planned
time scales for each task, and green lines show the actual time taken to complete tasks. If a task
has a green line but no orange square, it is an unexpected task added into the Gantt chart after it
was first produced. In the revised Gantt chart in Appendix B, lighter orange squares show the
time plan carried over from the first Gantt chart for comparison.
7.1.2 Costing
The total budget for the SEP project was £1000; a breakdown of hardware costs for the UAV
Vision and Control project is shown in Table 1. The original planned costs for the author’s project
totalled £299.48, but with the need for a mains supply for the X-UFO the last two items in the
table were ordered, raising the total to £321.45. The total may be more than a quarter of the SEP
budget, but the other three projects obtained a large amount of hardware for free.
A list of the part numbers and sources of hardware ordered from the SEP budget is shown in
Table 2. For future reference, flyonthewall.uk.com was a particularly unreliable source: it took
over a month to receive the wireless camera ordered from them, which hampered progress on the project
whilst waiting for the camera (a major component in the project). Tom’s RC is based in the USA,
but the SC-8000 was received within a week. Most of the small components were ordered from
Rapid through lab services as it is a preferred source; however, parts may be found cheaper
elsewhere.
8 Conclusions
Over the course of the project the most progress was made on the vision fiducial tracking system.
It can successfully pick out five fiducials from a video frame in a range of lighting conditions and
can handle noise to a certain extent. Therefore objective 1, “to produce a robust vision system
capable of tracking a set of fixed fiducials”, can be considered achieved. It could be argued that
the robustness could be improved further: there will always be some situations where it would be
extremely difficult to pick fiducials out of a frame using just the current methods employed in the
project. Some improvements considered are described in section 9.3.
The orientation of the UAV relative to the UGV is calculated correctly. The exact location
calculations have not been implemented due to the delay in getting the control interface working;
without an interface there was no need for the location information. Therefore objective 2, “to
use information provided by the vision system to establish the camera’s location relative to the
fiducials”, can only be considered partially achieved.
The control interface, despite taking a large amount of the project time to produce, is in a
working prototype state. It would have been extremely beneficial to the rest of the project
objectives if the originally planned interface had worked as it should first time. Despite being a
prototype, objective 3, “to produce a control and decision system to interface between the vision
system and UAV”, can be considered achieved. A generic black-box final product would have
been the ideal solution; however, the final solution only works with the modified X-UFO
transmitter. If a more professional platform were to be purchased, the originally planned interface
is also in place.
Unfortunately objective 4, “to stabilise a UAV above a set of fiducials fixed to the UGV, by using a
vision system for the control feedback”, has not been achieved. Due to the lack of a working
control interface for almost the entire project duration, it has not been possible to start work on
the control feedback system. None of the secondary objectives were attempted due to the time
constraints on the project.
The project management aspects of the project have had varied success. Regular weekly meetings
helped keep the SEP team motivated and focused. The author’s efforts to plan task timings using a
Gantt chart failed several weeks into the project: too many small tasks were overlooked, and the
whole plan was affected drastically by the constant delays in producing the control interface.
9 Discussions
Achieving a working control interface took too much time during the project. If I could change
anything, it would be the early decisions about the control interface. Two better possibilities
would have been to buy a more expensive aerial platform with a professional R/C transmitter not
intended as a toy, or to have decided to modify the internals of the X-UFO transmitter at an
earlier stage of the project.
Failing to achieve automated flight is a great disappointment to me. This would have been the most
exciting part of the project to get working. However with the amount of problems encountered
with the control interface, it became clear late on in the project that this would not be
implemented.
Towards the end of the project it began to feel like I was working on two projects. If the original
interface plan had worked first time I feel I would have been able to achieve automated flight in
some form and a more reliable tracking system. In hindsight it would have made more sense to
split my project between two students.
There are still extra reliability methods I did not have time to implement, such as temporal
checking of possible LEDs during fiducial detection: for example, assigning a higher likelihood to
possible LEDs which are close to the locations of fiducials in the previous frame.
There are two other methods of tracking I considered during the project to be included alongside
the current fiducial detection. The first is using edge detection to pick out the edges on the UGV.
After finding the edges, corners can be found, and the positions of these corners could be
compared to the positions of possible LEDs during fiducial detection to improve reliability.
Secondly, I considered implementing KLT tracking, again to improve reliability as well as to
provide additional positional information. The remnants of testing these ideas can be found within
my project code; however, there was not enough time to experiment further.
A method of coping with the merging of noisy light and fiducial light could be to detect circular
edges using available edge detection algorithms. This could be used to further improve fiducial
detection by being able to at least recognise half of the circular fiducial shape.
I spent some time trying to find ready-made code for camera pose estimation. There is most
likely something available; however, this was put to one side until the control interface was
working. Implementing an established pose estimation algorithm would produce more accurate
values to feed to the control feedback system.
A large amount of work is needed on the control feedback system, which is nearly non-existent
due to the lack of time remaining after producing a working control interface so late in the
project. It would also be good to see a finished black-box version of the control interface with a
surface-mounted PCB.
References
1. microdrones GmbH: MD 4-200 System Features
http://www.microdrones.com/md4-200.html, 2008-05-01
5. “Estimation and Control of a Quadrotor Vehicle Using Monocular Vision and Moiré Patterns”
Glenn P. Tournier, Mario Valenti, Jonathan P. How, and Eric Feron,
http://acl.mit.edu/papers/GNC06_TournierHow.pdf, 2006-08
6. Abstract (page 1)
7. Figure 3 (page 3)
8. Moiré Target Overview (page 2)
9. “Six Degree of Freedom Estimation Using Monocular Vision and Moiré Patterns”
Glenn P. Tournier, http://vertol.mit.edu/papers/tournier_thesis.pdf, 2006-06
21. Tom’s RC
http://www.tti-us.com/rc/sc8000tech.htm, 2007
22. flyonthewall.uk.com
http://www.flyonthewall.uk.com, 2007
Appendix A - Gantt chart
Appendix B - Revised Gantt chart
Appendix C - Project code
//--------------------------------------------------------------------------//
// aerial_platform.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for aerial_platform project
//--------------------------------------------------------------------------//
#ifndef __AERIAL_PLATFORM_H__
#define __AERIAL_PLATFORM_H__
// Compile options
//#define CONNECT_NETAPI
//#define USE_UGV_EMULATION
#define BCAST_LOCATION "C:/SEP_code/netAPI/"
#define EMULATOR_EXE "C:/SEP_code/ugv_emulation/release/ugv_emulation.exe"
//#define SEP_REPORT_ERRORS
//#define PAUSE_ON_ENTER
#define CTRLMETHOD_USB2VOLTAGE
//#define CTRLMETHOD_SC8000
#define CONTROL_CAMERA 1
#define CONTROL_CAMERA_WIDTH 720
#define CONTROL_CAMERA_HEIGHT 576
#define CONTROL_CAMERA_FPS 30.0
#define PROCESS_LEDS
//#define PROCESS_EDGES
//#define PROCESS_KLT
#define CALL_OK 1
#define CALL_ERROR 0
typedef enum {
STATE_CLOSE,
STATE_ERROR,
STATE_INIT,
STATE_PAUSED,
STATE_RESUME,
STATE_START,
STATE_STOPPED
} state_machine;
#endif //__AERIAL_PLATFORM_H__
aerial_platform.cpp
//--------------------------------------------------------------------------//
// aerial_platform.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains main() and procesing_callback() for control of the aerial
// platform
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "aerial_platform.h"
#include "sep_uavcomms.h"
#include "sep_cvcam.h"
#include "sep_commands.h"
#include "sep_uavcontrol.h"
#include "sep_orientation.h"
#include "cvcam.h"
#include "highgui.h"
#ifdef PROCESS_LEDS
#include "sep_leds.h"
#endif //ifdef PROCESS_LEDS
#ifdef PROCESS_EDGES
#include "sep_edges.h"
#endif //ifdef PROCESS_EDGES
#ifdef PROCESS_KLT
#include "sep_klttracker.h"
#endif //ifdef PROCESS_KLT
{
#endif // ifdef CONNECT_NETAPI
bool gotcommands(false);
static sepLEDs callbackProcess;
static sepOrientation callbackOrientation;
static sepUAVControl callbackUAVControl;
// error status bits
static char callbackErrorBits(0);
#ifdef CONNECT_NETAPI
// check for target height message
static double tempheight;
if (callbackComms.commsCheck(tempheight))
{
callbackErrorBits |= ERRORBIT_ACKNOWLEDGE;
callbackOrientation.SetTargetHeight(tempheight, callbackErrorBits);
}
#endif // ifdef CONNECT_NETAPI
callbackProcess.Get5RefPoints(refpoints);
// calc orientation and get commands
gotcommands = callbackOrientation.GetCommands(image, refpoints,
controldirections);
// if got valid commands proceed and send commands to UAV
if (gotcommands)
{
// adapt values for grouping and distance thresholds
callbackProcess.AdaptValues(callbackOrientation.GetMinDistance());
// send commands to uav
// callbackUAVControl.SendCommands(controldirections);
}
else
{
// check orientation class for reportable errors
callbackOrientation.GetErrors(callbackErrorBits);
}
// show control visualisation in video window
callbackUAVControl.RefreshDisplay(image);
#ifdef CONNECT_NETAPI
// transmit status if needed
if (callbackErrorBits)
{
callbackComms.commsSend(callbackErrorBits);
callbackErrorBits = 0;
}
}
#endif //ifdef CONNECT_NETAPI
#endif //ifdef PROCESS_LEDS
#ifdef PROCESS_EDGES
static sepEdges callbackProcess;
callbackProcess.ProcessFrame(image);
#endif //ifdef PROCESS_EDGES
#ifdef PROCESS_KLT
static sepKLTTracker callbackProcess;
callbackProcess.ProcessFrame(image);
#endif //ifdef PROCESS_KLT
// stopped state
case STATE_STOPPED:
command_line_in(current_state, next_state);
break;
// start state
case STATE_START:
ControlCamera.camStart();
#ifdef PAUSE_ON_ENTER
next_state = STATE_PAUSED;
#else //ndef PAUSE_ON_ENTER
next_state = STATE_CLOSE;
#endif //PAUSE_ON_ENTER
break;
// paused state
case STATE_PAUSED:
command_line_in(current_state, next_state);
break;
// resume state
case STATE_RESUME:
ControlCamera.camResume();
next_state = STATE_PAUSED;
break;
// close state
case STATE_CLOSE:
#ifdef PAUSE_ON_ENTER
ControlCamera.camKill();
cvDestroyWindow("control camera");
#else //ndef PAUSE_ON_ENTER
cvDestroyWindow("control camera");
#endif //PAUSE_ON_ENTER
break;
// error state
case STATE_ERROR:
command_line_in(current_state, next_state);
break;
// default
default:
#ifdef SEP_REPORT_ERRORS
cerr << "state machine entered unknown state" << endl;
#endif // SEP_REPORT_ERRORS
next_state = STATE_ERROR;
}
current_state = next_state;
}
#else //PROCESS_FRAME_TEST
#endif //PROCESS_FRAME_TEST
return 0;
}
sep_commands.h
//--------------------------------------------------------------------------//
// sep_commands.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for aerial_platform project
//--------------------------------------------------------------------------//
#ifndef __SEP_COMMANDS_H__
#define __SEP_COMMANDS_H__
#include "aerial_platform.h"
#endif //__SEP_COMMANDS_H__
sep_commands.cpp
//--------------------------------------------------------------------------//
// sep_commands.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_commands.h"
// in paused state
case STATE_PAUSED:
cout << "input command (r = resume, c = close)...";
cin >> command_in;
switch (command_in)
{
case 'c':
next_state = STATE_CLOSE;
break;
case 'r':
next_state = STATE_RESUME;
break;
default:
#if SEP_REPORT_ERRORS
cerr << "invalid input" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
break;
// in error state
case STATE_ERROR:
cout << "input command (c = close)...";
cin >> command_in;
switch (command_in)
{
case 'c':
next_state = STATE_CLOSE;
break;
default:
#if SEP_REPORT_ERRORS
cerr << "invalid input" << endl;
#endif // SEP_REPORT_ERRORS
return CALL_ERROR;
}
break;
}
return CALL_OK;
}
sep_cvcam.h
//--------------------------------------------------------------------------//
// sep_cvcam.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepCVCAM class
//--------------------------------------------------------------------------//
#ifndef __SEP_CVCAM_H__
#define __SEP_CVCAM_H__
#include "cvcam.h"
#include "highgui.h"
class sepCVCAM
{
public:
sepCVCAM(int number, VidFormat format, void* callback);
~sepCVCAM();
int camInit(HWND window);
int camStart();
int camPause();
int camResume();
int camKill();
private:
sepCVCAM(){}
int cam_number;
VidFormat cam_format;
HWND cam_window;
void* cam_callback;
};
#endif //__SEP_CVCAM_H__
sep_cvcam.cpp
//--------------------------------------------------------------------------//
// sep_cvcam.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepCVCAM class for using cvcam for video input
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_cvcam.h"
#include "aerial_platform.h"
sepCVCAM::~sepCVCAM()
{
}
int sepCVCAM::camStart()
{
// start cvcam
cvcamStart();
#ifdef PAUSE_ON_ENTER
cout << "cvcam started, press enter to pause" << endl;
cvWaitKey(0);
cvcamPause();
cout << "cvcam paused" << endl;
#else //ndef PAUSE_ON_ENTER
cout << "cvcam started" << endl;
cvWaitKey(0);
cvcamStop();
cvcamExit();
cout << "cvcam killed" << endl;
#endif //PAUSE_ON_ENTER
return CALL_OK;
}
int sepCVCAM::camPause()
{
// pause cvcam
cvcamPause();
cout << "cvcam paused" << endl;
return CALL_OK;
}
int sepCVCAM::camResume()
{
// resume cvcam and wait for keys
cvcamResume();
cout << "cvcam resumed, press enter to pause" << endl;
cvWaitKey(0);
cvcamPause();
cout << "cvcam paused" << endl;
return CALL_OK;
}
int sepCVCAM::camKill()
{
// stop and close cvcam
cvcamStop();
cvcamExit();
cout << "cvcam killed" << endl;
return CALL_OK;
}
sep_edges.h
//--------------------------------------------------------------------------//
// sep_edges.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepEdges class
//--------------------------------------------------------------------------//
#ifndef __SEP_EDGES_H__
#define __SEP_EDGES_H__
#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"
class sepEdges
{
public:
sepEdges();
~sepEdges();
void ProcessFrame(IplImage* frameimage);
private:
CvSize imagesize;
CvMemStorage* storage;
IplImage* originalimage;
IplImage* bwcopyimage;
};
#endif //__SEP_EDGES_H__
sep_edges.cpp
//--------------------------------------------------------------------------//
// sep_edges.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepEdges class for detecting and tracking edges of UGV
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_edges.h"
// edges constructor
sepEdges::sepEdges()
{
imagesize = cvSize(CONTROL_CAMERA_WIDTH,CONTROL_CAMERA_HEIGHT);
bwcopyimage = cvCreateImage(imagesize,IPL_DEPTH_8U,1);
storage = cvCreateMemStorage(0);
}
// edges destructor
sepEdges::~sepEdges()
{
cvReleaseImage(&bwcopyimage);
cvReleaseMemStorage(&storage);
}
// processes image
void sepEdges::ProcessFrame(IplImage *frameimage)
{
originalimage = frameimage;
// convert to gray scale
cvCvtColor(originalimage, bwcopyimage, CV_BGR2GRAY);
// use cvcanny
CvSeq* lines = 0;
int linecount;
cvCanny(bwcopyimage, bwcopyimage, 50, 200, 3);
// use houghlines
lines = cvHoughLines2(bwcopyimage, storage, CV_HOUGH_PROBABILISTIC,
1, CV_PI/180, 80, 30, 10);
for(linecount = 0; linecount < lines->total; linecount++)
{
CvPoint* line = (CvPoint*)cvGetSeqElem(lines, linecount);
cvLine(originalimage, line[0], line[1], CV_RGB(0,0,255), 2, 8);
}
}
sep_klttracker.h
//--------------------------------------------------------------------------//
// sep_klttracker.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepKLTTracker class
//--------------------------------------------------------------------------//
#ifndef __SEP_KLTTRACKER_H__
#define __SEP_KLTTRACKER_H__
#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"
class sepKLTTracker
{
public:
sepKLTTracker();
~sepKLTTracker();
void ProcessFrame(IplImage* frameimage);
private:
void ProcessFeature(void);
CvSize imagesize;
IplImage* originalimage;
IplImage* bworiginalimage;
IplImage* bwfeatureimage;
IplImage* eigimage;
IplImage* tempimage;
IplImage* pyramid1;
IplImage* pyramid2;
CvPoint2D32f featurefeatures[FEATURES_TO_TRACK];
int numfeatures;
CvPoint2D32f framefeatures[FEATURES_TO_TRACK];
CvSize opticalflowwindow;
CvTermCriteria opticalflowterminationcriteria;
char opticalflowfoundfeature[FEATURES_TO_TRACK];
float opticalflowfeatureerror[FEATURES_TO_TRACK];
};
#endif //__SEP_KLTTRACKER_H__
sep_klttracker.cpp
//--------------------------------------------------------------------------//
// sep_klttracker.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepKLTTracker class for KLT feature tracking
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_klttracker.h"
// opticalflow constructor
sepKLTTracker::sepKLTTracker():
imagesize(cvSize(CONTROL_CAMERA_WIDTH, CONTROL_CAMERA_HEIGHT)),
opticalflowwindow(cvSize(2,2)),
opticalflowterminationcriteria(
cvTermCriteria(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS, 20, 0.3)),
numfeatures(FEATURES_TO_TRACK)
{
// create required images
bworiginalimage = cvCreateImage(imagesize, IPL_DEPTH_8U, 1);
eigimage = cvCreateImage(imagesize, IPL_DEPTH_32F, 1);
tempimage = cvCreateImage(imagesize, IPL_DEPTH_32F, 1);
pyramid1 = cvCreateImage(imagesize, IPL_DEPTH_8U, 1);
pyramid2 = cvCreateImage(imagesize, IPL_DEPTH_8U, 1);
ProcessFeature();
}
// opticalflow destructor
sepKLTTracker::~sepKLTTracker()
{
cvReleaseImage(&bwfeatureimage);
cvReleaseImage(&bworiginalimage);
cvReleaseImage(&eigimage);
cvReleaseImage(&tempimage);
cvReleaseImage(&pyramid1);
cvReleaseImage(&pyramid2);
}
// processes image
int linethickness(1);
CvScalar linecolor(CV_RGB(255,0,0));
sep_leds.h
//--------------------------------------------------------------------------//
// sep_leds.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepLEDs class
//--------------------------------------------------------------------------//
#ifndef __SEP_LEDS_H__
#define __SEP_LEDS_H__
#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"
#include <map>
//#define DEBUG_SHOW_HISTOGRAM_INFO
//#define DEBUG_SHOW_ORIENTATION_INFO
//#define DEBUG_SHOW_LIKELIHOOD_INFO
//#define DEBUG_SHOW_THRESHOLD_VALUES
#define SHOW_USEFUL_ONLY
#define LIKELIHOOD_CIRCLE_WEIGHT 0.05
#define SIZE_HISTOGRAM_STEPS_START 25
#define SIZE_HISTOGRAM_STEP_JUMP 10
#define HISTOGRAM_REPETITIONS 2
#define ORIENTATION_ANGLE_VARIATION 20
#define PERCENT_OF_RED_PEAK 0.3 // 30%
#define PERCENT_OF_WHITE_PEAK 0.05 // 5%
#define DEFAULT_LED_REDTHRESHOLD 50
#define DEFAULT_LED_WHITETHRESHOLD 700
#define DEFAULT_LED_DISTTHRESHOLD 20
#define DEFAULT_LED_GROUPTHRESHOLD 20
#define LED_NUM_USED 5 // must be 5 or less
#define FIELD_ODD 0
#define FIELD_EVEN 1
#define R_OFFSET 2
#define G_OFFSET 1
#define B_OFFSET 0
class sepLEDs
{
public:
sepLEDs();
~sepLEDs();
void ProcessField(IplImage* frame_image, int fieldin);
void ClearField(int fieldin);
void Get5RefPoints(CvPoint* ref_points);
void AdaptValues(int min_distance);
private:
void ProcessFrame_ScanPixels(void);
double ProcessFrame_PixelDist(int x_compare, int y_compare, int led_compare);
void ProcessFrame_PickLeds(void);
double TestCircle(CvPoint TestPoint);
void SizeHistogram(void);
void TestOrientation(void);
double PointAngle(CvPoint Origin, CvPoint Point1, CvPoint Point2);
int PixelValue(const char& charin);
bool QuestionPixel_Useful(char& r, char& g, char& b);
bool QuestionPixel_StrongRed(int& r, int& g, int& b);
bool QuestionPixel_BrightWhite(int& r, int& g, int& b);
char* AccessPixel(CvPoint pnt, IplImage* img);
int processtype;
IplImage* ImageData;
int field;
int peak_red;
int peak_white;
int adaptled_redthreshold;
int adaptled_whitethreshold;
int adaptled_distthreshold;
int adaptled_groupthreshold;
LedDetails PrevLEDsFound[LED_NUM_USED];
LedDetails LEDsFound[LED_NUM_USED];
map<int,LedDetails> PossibleLeds;
};
#endif //__SEP_LEDS_H__
sep_leds.cpp
//--------------------------------------------------------------------------//
// sep_leds.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepLEDs class for processing frames from the
// cvcam callback
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_leds.h"
#include "sep_utils.h"
using namespace sep_utils;
// processing constructor
sepLEDs::sepLEDs()
: adaptled_redthreshold(DEFAULT_LED_REDTHRESHOLD),
adaptled_whitethreshold(DEFAULT_LED_WHITETHRESHOLD),
adaptled_distthreshold(DEFAULT_LED_DISTTHRESHOLD),
adaptled_groupthreshold(DEFAULT_LED_GROUPTHRESHOLD)
{
}
// processing destructor
sepLEDs::~sepLEDs(void)
{
}
void sepLEDs::ProcessFrame_ScanPixels(void)
{
int linepos, ledcount(0), closest[2], distance;
CvPoint testpoint;
bool pixeladded;
PossibleLeds.clear();
// scan each line
for (testpoint.y = field; testpoint.y < ImageData->height-1+field;
testpoint.y += 2)
{
testpoint.x = 0;
// scan each pixel in this line
for (linepos = (ImageData->widthStep)*testpoint.y;
linepos < (ImageData->widthStep)*(testpoint.y+1);
linepos += ImageData->nChannels)
{
if (QuestionPixel_Useful(AccessPixel(testpoint, ImageData)[R_OFFSET],
AccessPixel(testpoint, ImageData)[G_OFFSET],
AccessPixel(testpoint, ImageData)[B_OFFSET]))
{
// check for close points already found
closest[1] = adaptled_distthreshold + 1;
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
distance = ProcessFrame_PixelDist(testpoint.x,
testpoint.y, ledcount);
if ((distance <= adaptled_distthreshold) &&
(distance < closest[1]))
{
closest[0] = ledcount;
closest[1] = distance;
}
}
// if close enough to one of the possibleleds
if (closest[1] != adaptled_distthreshold + 1)
{
PossibleLeds[closest[0]].Position.x += testpoint.x;
PossibleLeds[closest[0]].Position.y += testpoint.y;
PossibleLeds[closest[0]].pixelcount += 1;
}
// if no points close enough add new
else
{
// first line below automatically creates a
// new entry in PossibleLeds
PossibleLeds[PossibleLeds.size()].Position.x =
testpoint.x;
// size has now therefore changed hence -1
PossibleLeds[PossibleLeds.size()-1].Position.y =
testpoint.y;
PossibleLeds[PossibleLeds.size()-1].pixelcount = 1;
PossibleLeds[PossibleLeds.size()-1].likelihood = 0.0;
}
}
testpoint.x++;
}
}
}
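ProcessFrame_ScanPixels above assigns each qualifying pixel to the nearest existing candidate group within the distance threshold, or seeds a new group otherwise. A minimal standalone sketch of that grouping rule follows (names illustrative; the listing's ProcessFrame_PixelDist is not reproduced in this excerpt, so Euclidean distance to the group's running average position is assumed):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative sketch only: group positions accumulate raw pixel
// coordinates and are averaged later, as in ProcessFrame_PickLeds.
struct Group { int x, y, pixelcount; };

void addPixel(std::vector<Group>& groups, int px, int py, int distthreshold)
{
    int best = -1;
    double bestdist = distthreshold + 1;
    for (std::size_t g = 0; g < groups.size(); g++)
    {
        // compare against the group's running average position (assumption)
        double gx = (double)groups[g].x / groups[g].pixelcount;
        double gy = (double)groups[g].y / groups[g].pixelcount;
        double dist = std::sqrt((gx - px)*(gx - px) + (gy - py)*(gy - py));
        if (dist <= distthreshold && dist < bestdist)
        {
            best = (int)g;
            bestdist = dist;
        }
    }
    if (best >= 0)
    {
        // close enough: fold the pixel into the existing group
        groups[best].x += px;
        groups[best].y += py;
        groups[best].pixelcount += 1;
    }
    else
    {
        // no group close enough: seed a new one
        Group fresh = { px, py, 1 };
        groups.push_back(fresh);
    }
}
```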
{
#ifdef SHOW_USEFUL_ONLY
r = 0;
g = 0;
b = 0;
#endif
return false;
}
}
void sepLEDs::ProcessFrame_PickLeds(void)
{
int ledcount, pickedcount, leastlikely;
double lowestlikelihood;
// save previously picked LEDs
for (ledcount = 0; ledcount < LED_NUM_USED; ledcount++)
PrevLEDsFound[ledcount] = LEDsFound[ledcount];
// check some simple likelihood statistics for possible LEDs
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// average x and y location values
PossibleLeds[ledcount].Position.x /= PossibleLeds[ledcount].pixelcount;
PossibleLeds[ledcount].Position.y /= PossibleLeds[ledcount].pixelcount;
// check pixel count is over group threshold
if (PossibleLeds[ledcount].pixelcount > adaptled_groupthreshold)
PossibleLeds[ledcount].likelihood = 1.0;
// test to see if shape is circular
PossibleLeds[ledcount].likelihood *=
TestCircle(PossibleLeds[ledcount].Position);
}
// perform histogram on sizes
SizeHistogram();
// check orientation of points
TestOrientation();
#ifdef DEBUG_SHOW_LIKELIHOOD_INFO
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
cout << "[" << PossibleLeds[ledcount].Position.x << "," <<
PossibleLeds[ledcount].Position.y
<< "] pixels:" << PossibleLeds[ledcount].pixelcount << " lhood:"
<< PossibleLeds[ledcount].likelihood << endl;
// mark possible points with size according to stats
if ((int)(PossibleLeds[ledcount].likelihood*30.0) < 1)
PossibleLeds[ledcount].likelihood = 0.05;
cvCircle(ImageData, PossibleLeds[ledcount].Position,
(int)(PossibleLeds[ledcount].likelihood*30.0), cvScalar(0,255,0,0),
2);
}
#endif //def DEBUG_SHOW_LIKELIHOOD_INFO
int smallest;
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y -= 2.0;
test.x -= 2.0;
}
// scan up and right to find circle border
for (test = TestPoint; (test.y >= 0) && (test.x < ImageData->width); uright+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y -= 2.0;
test.x += 2.0;
}
// scan down and right to find circle border
for (test = TestPoint; (test.y < ImageData->height) &&
(test.x < ImageData->width); dright+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y += 2.0;
test.x += 2.0;
}
// scan down and left to find circle border
for (test = TestPoint; (test.y < ImageData->height) && (test.x >= 0); dleft+=2.0)
{
pixel = AccessPixel(test, ImageData);
if (PixelValue(pixel[R_OFFSET]) == 0) break;
test.y += 2.0;
test.x -= 2.0;
}
// calc true diag values
uright = DiagLength(uright,uright);
uleft = DiagLength(uleft,uleft);
dright = DiagLength(dright,dright);
dleft = DiagLength(dleft,dleft);
// calc average
avg = (up + down + left + right + uleft + uright + dleft + dright) / 8.0;
// check differences in up/right/left/down
temp = ModDifference(up, right);
temp += ModDifference(right, down);
temp += ModDifference(down, left);
temp += ModDifference(left, up);
temp += ModDifference(down, up);
temp += ModDifference(left, right);
temp /= avg;
likelyhoodcalc -= LIKELIHOOD_CIRCLE_WEIGHT*temp;
// check differences in uleft/uright/dleft/dright
temp = ModDifference(uright, uleft);
temp += ModDifference(dright, uright);
temp += ModDifference(dleft, dright);
temp += ModDifference(uleft, dleft);
temp += ModDifference(uleft, dright);
temp += ModDifference(uright, dleft);
temp /= avg;
likelyhoodcalc -= LIKELIHOOD_CIRCLE_WEIGHT*temp;
// compare up/right/left/down to diags
temp = ModDifference(uright, up);
temp += ModDifference(dright, right);
temp += ModDifference(dleft, down);
temp += ModDifference(uleft, left);
temp += ModDifference(dright, up);
temp += ModDifference(dleft, right);
temp += ModDifference(uleft, down);
temp += ModDifference(uright, left);
temp /= avg;
likelyhoodcalc -= 2*LIKELIHOOD_CIRCLE_WEIGHT*temp;
void sepLEDs::SizeHistogram(void)
{
int ledcount, value, maxvalue, prevvalue, hist,
stepsize(SIZE_HISTOGRAM_STEPS_START);
map<int,int> histogram;
// perform two histograms
for (hist = 0; hist < HISTOGRAM_REPETITIONS; hist++)
{
maxvalue = 0;
prevvalue = 0;
histogram.clear();
// sort through possible leds and create histogram
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// ignore irrelevant possibleleds
if (PossibleLeds[ledcount].likelihood != 0)
{
// find the group value for current pixel
value = PossibleLeds[ledcount].pixelcount -
(PossibleLeds[ledcount].pixelcount % stepsize);
if (value > maxvalue) maxvalue = value;
// check it's not 0, ignoring 0 group
if (value)
{
// add to histogram
if (histogram.find(value) != histogram.end())
{
histogram[value]++;
}
else
{
histogram[value] = 1;
}
}
}
}
// scan through histogram
for (value = 0; value <= maxvalue; value += stepsize)
{
// check histogram entry exists
if (histogram.find(value) != histogram.end())
{
#ifdef DEBUG_SHOW_HISTOGRAM_INFO
cout << value << ": " << histogram[value] << endl;
#endif //def DEBUG_SHOW_HISTOGRAM_INFO
// cap histogram value at num of leds
if (histogram[value] > LED_NUM_USED)
histogram[value] = LED_NUM_USED;
}
}
// sort through possible leds and create histogram
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
// find the group value for current pixel
value = PossibleLeds[ledcount].pixelcount -
(PossibleLeds[ledcount].pixelcount % stepsize);
// check histogram entry exists
if (histogram.find(value) != histogram.end())
{
PossibleLeds[ledcount].likelihood *=
(double)histogram[value] / (double)LED_NUM_USED;
}
}
stepsize += SIZE_HISTOGRAM_STEP_JUMP;
}
}
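The bucketing at the heart of SizeHistogram snaps each pixel count down to a multiple of stepsize, discards the zero bucket, and caps bucket frequencies at LED_NUM_USED. A standalone sketch of that logic (names here are illustrative):

```cpp
#include <cassert>
#include <map>

// snap a pixel count down to its histogram bucket
int bucketOf(int pixelcount, int stepsize)
{
    return pixelcount - (pixelcount % stepsize);
}

std::map<int,int> buildHistogram(const int* counts, int n,
                                 int stepsize, int numleds)
{
    std::map<int,int> histogram;
    for (int i = 0; i < n; i++)
    {
        int value = bucketOf(counts[i], stepsize);
        if (value)               // ignore the 0 group
            histogram[value]++;  // operator[] starts missing keys at 0
    }
    // cap each bucket at the number of LEDs being searched for
    for (std::map<int,int>::iterator it = histogram.begin();
         it != histogram.end(); ++it)
        if (it->second > numleds)
            it->second = numleds;
    return histogram;
}
```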
void sepLEDs::TestOrientation(void)
{
int ledcount, led1count, led2count, testcount(0), prevorigin(-1), tempcount;
double tempangle, currentmax(0.0);
map<int, LedOrientation> orientations;
// sort through possible leds and calc orientations
// reference possible LEDs
for (ledcount = 0; ledcount < PossibleLeds.size(); ledcount++)
{
if (PossibleLeds[ledcount].likelihood > 0.0)
{
// possible LEDs 1
for (led1count = 0; led1count < PossibleLeds.size(); led1count++)
{
if ((PossibleLeds[led1count].likelihood > 0.0) &&
(ledcount != led1count))
{
// possible LEDs 2
for (led2count = led1count;
led2count < PossibleLeds.size(); led2count++)
{
if ((PossibleLeds[led2count].likelihood > 0.0) &&
(ledcount != led1count)
&& (ledcount != led2count))
{
// calculate angle between two points
// wrt reference point
tempangle = PointAngle(
PossibleLeds[ledcount].Position,
PossibleLeds[led1count].Position,
PossibleLeds[led2count].Position);
if (((tempangle >= 180 -
ORIENTATION_ANGLE_VARIATION) &&
(tempangle <= 180 +
ORIENTATION_ANGLE_VARIATION)) ||
((tempangle >= 90 -
ORIENTATION_ANGLE_VARIATION) &&
(tempangle <= 90 +
ORIENTATION_ANGLE_VARIATION)))
{
#ifdef DEBUG_SHOW_ORIENTATION_INFO
cvLine(ImageData,PossibleLeds[ledcount].Position,
PossibleLeds[led1count].Position,cvScalar(255,0,0),1);
cvLine(ImageData,PossibleLeds[ledcount].Position,
PossibleLeds[led2count].Position,cvScalar(255,0,0),1);
#endif //def DEBUG_SHOW_ORIENTATION_INFO
// keep calculations for later
orientations[testcount].LED1 =
led1count;
orientations[testcount].LED2 =
led2count;
orientations[testcount].LEDorigin =
ledcount;
orientations[testcount].angle =
tempangle;
orientations[testcount].likelihood
= 5 *
PossibleLeds[ledcount].likelihood *
PossibleLeds[led1count].likelihood
* PossibleLeds[led2count].likelihood;
testcount++;
}
}
}
}
}
}
}
// sort through orientations
for (testcount = 0; testcount < orientations.size(); testcount++)
{
if (prevorigin == orientations[testcount].LEDorigin)
{
if (orientations[testcount].likelihood > currentmax)
{
currentmax = orientations[testcount].likelihood;
tempcount = testcount;
}
}
else if (prevorigin < 0)
{
prevorigin = orientations[testcount].LEDorigin;
currentmax = orientations[testcount].likelihood;
tempcount = testcount;
}
else
{
#ifdef DEBUG_SHOW_ORIENTATION_INFO
cvLine(ImageData,
PossibleLeds[orientations[tempcount].LEDorigin].Position,
PossibleLeds[orientations[tempcount].LED1].Position,
cvScalar(255,0,0),1);
cvLine(ImageData,
PossibleLeds[orientations[tempcount].LEDorigin].Position,
PossibleLeds[orientations[tempcount].LED2].Position,
cvScalar(255,0,0),1);
cout << "[" << PossibleLeds[prevorigin].Position.x << "," <<
PossibleLeds[prevorigin].Position.y << "] angle:"
<< orientations[testcount].angle
<< " prev:" << PossibleLeds[prevorigin].likelihood << " new:"
<< (PossibleLeds[prevorigin].likelihood * currentmax) << endl;
#endif //def DEBUG_SHOW_ORIENTATION_INFO
PossibleLeds[prevorigin].likelihood =
PossibleLeds[prevorigin].likelihood * currentmax;
currentmax = orientations[testcount].likelihood;
prevorigin = orientations[testcount].LEDorigin;
}
}
}
doubleang[pointcount] = 270.0;
else {
doubleang[pointcount] = atan(doubley/doublex);
doubleang[pointcount] = (doubleang[pointcount]*180)/PI;
if ((doubley < 0) && (doublex > 0))
doubleang[pointcount] = 0.0 - doubleang[pointcount];
else if ((doubley > 0) && (doublex > 0))
doubleang[pointcount] = 360.0 - doubleang[pointcount];
else if ((doubley > 0) && (doublex < 0))
doubleang[pointcount] = 180.0 - doubleang[pointcount];
else if ((doubley < 0) && (doublex < 0))
doubleang[pointcount] = 180.0 - doubleang[pointcount]; }
}
doubleangle = ModDifference(doubleang[0],doubleang[1]);
if (doubleangle > 180)
doubleangle = 360 - doubleangle;
return doubleangle;
}
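The final fold above reduces the difference of the two computed angles into the 0 to 180 degree range. In isolation (ModDifference is not shown in this excerpt and is assumed to be an absolute difference):

```cpp
#include <cassert>
#include <cmath>

// difference between two bearings in degrees, folded into 0..180
double angleBetween(double a, double b)
{
    double diff = std::fabs(a - b);   // ModDifference in the listing, assumed
    if (diff > 180.0)
        diff = 360.0 - diff;          // fold reflex angles back into 0..180
    return diff;
}
```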
sep_orientation.h
//--------------------------------------------------------------------------//
// sep_orientation.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepOrientation class
//--------------------------------------------------------------------------//
#ifndef __SEP_ORIENTATION_H__
#define __SEP_ORIENTATION_H__
#include "cv.h"
//#define DEBUG_SHOW_PITCH_DATA
class sepOrientation
{
public:
sepOrientation();
~sepOrientation();
void SetTargetHeight(const double& target, char& errors);
bool GetCommands(IplImage* frame_image, CvPoint* ref_points_in,
double* directions_return);
int GetMinDistance(void);
void GetErrors(char& errors);
private:
int CheckRefData(void);
int CheckFrameGlitch(void);
void FindCentre(void);
void SortPoints(void);
void FindDirection(void);
void FindPitch(void);
void FindHeight(void);
void EstimateCentre(void);
void ReturnDirections(double* directions);
void FindMinDistance(void);
int PointDistance(CvPoint point1, CvPoint point2);
double PointAngle(CvPoint refpoint, CvPoint point1);
IplImage* ImageData;
CvPoint* RefData;
CvPoint CentrePoint;
int next_saved_point;
CvPoint PreviousPoints[NUM_PREVIOUS_POINTS];
CvPoint ActualCentrePoint[NUM_AVG_POINTS];
double ortn_direction;
double ortn_frontbackpitch; // (+)front (-)back
double ortn_leftrightpitch; // (+)left (-)right
double ortn_height;
double target_height;
int sortedpoints[NUM_REF_POINTS];
int min_distance;
int error_glitchcount;
int error_partialcount;
int error_completecount;
};
#endif //__SEP_ORIENTATION_H__
sep_orientation.cpp
//--------------------------------------------------------------------------//
// sep_orientation.cpp
#include "stdafx.h"
#include "math.h"
#include "aerial_platform.h"
#include "sep_orientation.h"
#include "sep_utils.h"
using namespace sep_utils;
sepOrientation::sepOrientation()
: next_saved_point(-NUM_PREVIOUS_POINTS), target_height(1.0)
{
}
sepOrientation::~sepOrientation()
{
}
int sepOrientation::GetMinDistance(void)
{
return min_distance;
}
else
directions[3] = 0.0 - ortn_direction;
// set forward/backward
// set left/right
{
CentrePoint.x += RefData[pointcount].x;
CentrePoint.y += RefData[pointcount].y;
}
CentrePoint.x /= NUM_REF_POINTS;
CentrePoint.y /= NUM_REF_POINTS;
cvCircle(ImageData, CentrePoint, 5, cvScalar(255,0,0,0), 2);
}
void sepOrientation::SortPoints(void)
{
DistanceSort Distances[NUM_REF_POINTS*NUM_REF_POINTS], Least[2], Most[4];
int pointcount, pointcount2, sortcount(0);
double refangle, tempangle;
// find average
CvPoint AvgCentrePoint = {0,0};
int pointcount;
for (pointcount = 0; pointcount < NUM_AVG_POINTS; pointcount++)
{
AvgCentrePoint.x += ActualCentrePoint[pointcount].x;
AvgCentrePoint.y += ActualCentrePoint[pointcount].y;
}
AvgCentrePoint.x /= NUM_AVG_POINTS;
AvgCentrePoint.y /= NUM_AVG_POINTS;
// draw circle
cvCircle(ImageData, AvgCentrePoint, 5, cvScalar(255,0,0,0), 2);
}
{
doubleang = 270.0;
}
else
{
doubleang = atan(doubley/doublex);
doubleang = (doubleang*180)/PI;
if ((doubley < 0) && (doublex > 0)) doubleang = 0.0 - doubleang;
else if ((doubley > 0) && (doublex > 0)) doubleang = 360.0 - doubleang;
else if ((doubley > 0) && (doublex < 0)) doubleang = 180.0 - doubleang;
else if ((doubley < 0) && (doublex < 0)) doubleang = 180.0 - doubleang;
}
return doubleang;
}
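The quadrant correction above extends atan(), which only spans -90 to +90 degrees, to a full 0 to 360 degree bearing in image coordinates (y grows downward). A self-contained sketch; the x == 0 branches are split across pages in the listing, so their handling here is assumed:

```cpp
#include <cassert>
#include <cmath>

// bearing of an (x, y) offset in degrees, image coordinates
double pointAngleDegrees(double doublex, double doubley)
{
    const double PI = 3.14159265358979323846;
    if (doublex == 0.0)
        return (doubley < 0.0) ? 90.0 : 270.0;  // assumed vertical cases
    double doubleang = std::atan(doubley / doublex) * 180.0 / PI;
    // quadrant corrections, as in the listing
    if ((doubley < 0.0) && (doublex > 0.0)) doubleang = 0.0 - doubleang;
    else if ((doubley > 0.0) && (doublex > 0.0)) doubleang = 360.0 - doubleang;
    else if ((doubley > 0.0) && (doublex < 0.0)) doubleang = 180.0 - doubleang;
    else if ((doubley < 0.0) && (doublex < 0.0)) doubleang = 180.0 - doubleang;
    return doubleang;
}
```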
void sepOrientation::FindMinDistance(void)
{
int pointcount, pointcount2, tempdistance, usefulpoints(0);
min_distance = CONTROL_CAMERA_WIDTH*CONTROL_CAMERA_HEIGHT;
// work out min distance between points
for (pointcount = 0; pointcount < NUM_REF_POINTS; pointcount++)
{
for (pointcount2 = pointcount+1; pointcount2 < NUM_REF_POINTS;
pointcount2++)
{
if ((RefData[pointcount].x >= 0) && (RefData[pointcount].y >= 0) &&
(RefData[pointcount2].x >= 0) && (RefData[pointcount2].y >= 0))
{
tempdistance = PointDistance(RefData[pointcount],
RefData[pointcount2]);
if (tempdistance < min_distance) min_distance = tempdistance;
}
}
if ((RefData[pointcount].x >= 0) && (RefData[pointcount].y >= 0))
usefulpoints++;
}
if (usefulpoints == 0) min_distance = -1;
if (usefulpoints == 1) min_distance = 0;
}
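FindMinDistance above skips points flagged missing by negative coordinates and reserves -1 and 0 as sentinels for zero or one visible point. A standalone sketch; PointDistance is not shown in this excerpt, so Euclidean distance is assumed, and maxstart stands in for CONTROL_CAMERA_WIDTH*CONTROL_CAMERA_HEIGHT:

```cpp
#include <cassert>
#include <cmath>

struct Pt { int x, y; };

// smallest pairwise distance among visible points;
// -1 if none visible, 0 if only one visible
int findMinDistance(const Pt* pts, int n, int maxstart)
{
    int mindist = maxstart, useful = 0;
    for (int i = 0; i < n; i++)
    {
        for (int j = i + 1; j < n; j++)
        {
            // negative coordinates mark a missing fiducial
            if (pts[i].x >= 0 && pts[i].y >= 0 &&
                pts[j].x >= 0 && pts[j].y >= 0)
            {
                int dx = pts[i].x - pts[j].x, dy = pts[i].y - pts[j].y;
                int d = (int)std::sqrt((double)(dx*dx + dy*dy));
                if (d < mindist) mindist = d;
            }
        }
        if (pts[i].x >= 0 && pts[i].y >= 0) useful++;
    }
    if (useful == 0) return -1;
    if (useful == 1) return 0;
    return mindist;
}
```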
sep_uavcomms.h
//--------------------------------------------------------------------------//
// sep_uavcomms.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepUAVComms class
//--------------------------------------------------------------------------//
#ifndef __SEP_UAVCOMMS_H__
#define __SEP_UAVCOMMS_H__
#include "TCPLink.h"
#include "aerial_platform.h"
//#define DEBUG_SHOW_COMMS_TRANSLATION
#ifdef USE_UGV_EMULATION
#define LOCAL_PORT 2001
#define REMOTE_PORT 2000
#define CHANNEL_TO_UAV "to_uav"
#define CHANNEL_TO_UGV "to_ugv"
#define LOCAL_ADDRESS "uav_machine"
#define REMOTE_ADDRESS "ugv_machine"
class sepUAVComms
{
public:
sepUAVComms();
~sepUAVComms(void);
bool commsConnected(void);
void commsSend(char statusbits);
bool commsCheck(double& passheight);
private:
bool commsInit(void);
bool startEmulator(void);
bool connected;
bool inittried;
bool initfailed;
int local_port;
int remote_port;
string channel_to_uav;
string channel_to_ugv;
string local_address;
string remote_address;
TCPLink* connection_to_ugv;
TCPLink* connection_from_ugv;
};
#endif //__SEP_UAVCOMMS_H__
sep_uavcomms.cpp
//--------------------------------------------------------------------------//
// sep_uavcomms.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepUAVComms class for communicating between UAV decision
// system and UGV decision systems, using netAPI
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_uavcomms.h"
#include "util.h"
sepUAVComms::sepUAVComms(void)
: connected(false), inittried(false), initfailed(false)
{
}
sepUAVComms::~sepUAVComms()
{
// if connections exist delete them
if (connection_to_ugv)
delete connection_to_ugv;
if (connection_from_ugv)
delete connection_from_ugv;
}
bool sepUAVComms::commsInit(void)
{
inittried = true;
#ifdef USE_UGV_EMULATION
// use values for ugv emulation
local_port = LOCAL_PORT;
remote_port = REMOTE_PORT;
channel_to_uav = CHANNEL_TO_UAV;
channel_to_ugv = CHANNEL_TO_UGV;
local_address = LOCAL_ADDRESS;
remote_address = REMOTE_ADDRESS;
// initiate TCPLink
TCPLink::Load(local_port, local_address, _on, BCAST_LOCATION);
connection_to_ugv = new TCPLink(channel_to_ugv, remote_port, remote_address);
connection_from_ugv = new TCPLink(channel_to_uav, remote_port, remote_address);
if (connection_to_ugv && connection_from_ugv)
{
cout << "netAPI initialised, attempting to connect..." << endl;
#ifdef USE_UGV_EMULATION
#ifdef EMULATOR_EXE
return startEmulator();
#else
return true;
#endif //def EMULATOR_EXE
#else
return true;
#endif //def USE_UGV_EMULATION
}
else
{
initfailed = true;
cerr << "Error: unable to initialise netAPI" << endl;
return false;
}
}
bool sepUAVComms::startEmulator(void)
{
// run emulator automatically
if (util::ExecuteProcess(EMULATOR_EXE, "", 0, true))
{
cerr << "Error: unable to execute " << EMULATOR_EXE << endl;
return false;
}
else
{
cout << "UGV emulator started" << endl;
return true;
}
}
bool sepUAVComms::commsConnected(void)
{
if (connected)
return true;
else if (initfailed)
return false;
else if (!inittried)
{
commsInit();
return false;
}
else if ((*connection_to_ugv).isConnected() &&
(*connection_from_ugv).isConnected())
{
connected = true;
TCPLink::AllConnected();
cout << "connections established" << endl;
return true;
}
else
return false;
}
sep_uavcontrol.h
//--------------------------------------------------------------------------//
// sep_uavcontrol.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for sepUAVControl class
//--------------------------------------------------------------------------//
#ifndef __SEP_UAVCONTROL_H__
#define __SEP_UAVCONTROL_H__
#include "cv.h"
#include "highgui.h"
#include "aerial_platform.h"
#ifdef CTRLMETHOD_SC8000
// set values for SC-8000
#define SC8_COMM_PORT 1
#define SC8_SERVO_MASK (unsigned char)15
#define SC8_DIO_MASK (char)'0'
// servo values for use with Hitec Laser 6 RC transmitter
#define AXIS_THROTTLE 2
#define AXIS_YAW 3
#define AXIS_PITCH 0
#define AXIS_ROLL 1
#define THROTTLE_OFF 8000
#define THROTTLE_MAX 22000
#define THROTTLE_STEP 1400
#define YAW_LEFTMAX 8000
#define YAW_MID 15000
#define YAW_RIGHTMAX 22000
#define YAW_STEP 1000
#define PITCH_BACKWARDMAX 8000
#define PITCH_MID 15000
#define PITCH_FORWARDMAX 22000
#define PITCH_STEP 1000
#define ROLL_LEFTMAX 8000
#define ROLL_MID 15000
#define ROLL_RIGHTMAX 22000
#define ROLL_STEP 1000
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
#include "Serial.h"
// servo values for use with Original X-UFO transmitter
// DAC range min=0x000, max=0xFFF
#define BIT_THROTTLE 0x20
#define BIT_YAW 0x10
#define BIT_PITCH 0x40
#define BIT_ROLL 0x80
#define AXIS_THROTTLE 1
#define AXIS_YAW 0
#define AXIS_PITCH 2
#define AXIS_ROLL 3
#define SERVO_STEPS 10
#define THROTTLE_OFF 0
#define THROTTLE_MAX 2500
#define THROTTLE_STEP (THROTTLE_MAX-THROTTLE_OFF)/SERVO_STEPS
#define YAW_LEFTMAX 0
#define YAW_MID (YAW_RIGHTMAX-YAW_LEFTMAX)/2
#define YAW_RIGHTMAX 2500
#define YAW_STEP (YAW_RIGHTMAX-YAW_LEFTMAX)/SERVO_STEPS
#define PITCH_BACKWARDMAX 0
#define PITCH_MID (PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/2
#define PITCH_FORWARDMAX 2500
#define PITCH_STEP (PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/SERVO_STEPS
#define ROLL_LEFTMAX 0
#define ROLL_MID (ROLL_RIGHTMAX-ROLL_LEFTMAX)/2
#define ROLL_RIGHTMAX 2500
#define ROLL_STEP (ROLL_RIGHTMAX-ROLL_LEFTMAX)/SERVO_STEPS
#endif //def CTRLMETHOD_USB2VOLTAGE
class sepUAVControl
{
public:
sepUAVControl(void);
~sepUAVControl();
void SendCommands(double* directions_in);
void RefreshDisplay(IplImage* MainImg);
inline bool InitOK(void){return initok;}
void controlInit(void);
private:
bool InterpretDirections(double* directions_in);
void ConvertServoValues(void);
void SendServoValues(void);
IplImage* ControlImage;
double control_throttle;
double control_yaw;
double control_pitch;
double control_roll;
bool initok;
#ifdef CTRLMETHOD_SC8000
unsigned short servo_values[4];
int boardnumber;
int commnumber;
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
CSerial serial;
int servo_values[4];
#endif //def CTRLMETHOD_USB2VOLTAGE
};
#endif //__SEP_UAVCONTROL_H__
sep_uavcontrol.cpp
//--------------------------------------------------------------------------//
// sep_uavcontrol.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the sepUAVControl class for sending control signals
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "sep_uavcontrol.h"
#ifdef CTRLMETHOD_SC8000
#define _USE_DLL // for TTiSC8VC
#define __WIN32__
#include "TTiSC8VC.h"
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
// send serial data out
int send_serial(CSerial& serial, int* valuesin, int axis, int axisbit)
{
char transmit[2];
long error;
// ready values
transmit[1] = valuesin[axis] & 0xFF; // lower byte
transmit[0] = (valuesin[axis] & 0xF00)/256; // upper byte
transmit[0] += axisbit; // channel number
//transmit header
error = serial.Write("iu");
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
else
{
//transmit data
error = serial.Write(transmit, 2);
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
}
if (error != ERROR_SUCCESS)
return 0;
else
return 1;
}
#endif //def CTRLMETHOD_USB2VOLTAGE
sepUAVControl::sepUAVControl(void)
: initok(false)
{
#ifdef CTRLMETHOD_SC8000
boardnumber = 0;
commnumber = SC8_COMM_PORT;
#endif //def CTRLMETHOD_SC8000
controlInit();
}
sepUAVControl::~sepUAVControl()
{
#ifdef CTRLMETHOD_SC8000
//SC8_CleanUP(boardnumber);
// NOTE: ^ should be using this but gives unresolved external
// symbol error during compile for some reason
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
// close comm port
serial.Close();
#endif //def CTRLMETHOD_USB2VOLTAGE
}
void sepUAVControl::SendCommands(double* directions_in)
{
if (initok)
{
// interpret the incoming direction commands
if (InterpretDirections(directions_in))
{
// convert control values to servo values
ConvertServoValues();
// send servo values
SendServoValues();
}
}
}
void sepUAVControl::controlInit(void)
{
#ifdef CTRLMETHOD_SC8000
boardnumber = SC8_Initialize(boardnumber, commnumber);
if (!boardnumber)
{
#if SEP_REPORT_ERRORS
cerr << "error initiating sc8" << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
cout << "sc8 initiated" << endl;
initok = true;
}
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
// initiate serial control
long error;
// attempt to open the serial port
//error = serial.Open(_T(COMM_STRING),0,0,false);
error = serial.Open((COMM_STRING),0,0,false);
if (error != ERROR_SUCCESS)
{
#if SEP_REPORT_ERRORS
cerr << "Unable to open comm port " << COMM_STRING << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
// setup the serial port (9600, N81); handshaking is configured below
error = serial.Setup(CSerial::EBaud9600,CSerial::EData8,
CSerial::EParNone,CSerial::EStop1);
if (error != ERROR_SUCCESS)
{
#if SEP_REPORT_ERRORS
cerr << "Unable to set comm port setting" << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
// setup handshaking
error = serial.SetupHandshaking(CSerial::EHandshakeHardware);
if (error != ERROR_SUCCESS)
{
#if SEP_REPORT_ERRORS
cerr << "Unable to set comm port handshaking" << endl;
#endif // SEP_REPORT_ERRORS
initok = false;
}
else
{
cout << "serial comms initiated" << endl;
initok = true;
}
}
}
#endif //def CTRLMETHOD_USB2VOLTAGE
}
void sepUAVControl::ConvertServoValues(void)
{
// calc servo values from control values
servo_values[AXIS_THROTTLE] = THROTTLE_OFF + control_throttle * THROTTLE_STEP;
servo_values[AXIS_YAW] = YAW_MID + control_yaw * YAW_STEP;
servo_values[AXIS_PITCH] = PITCH_MID + control_pitch * PITCH_STEP;
servo_values[AXIS_ROLL] = ROLL_MID + control_roll * ROLL_STEP;
}
void sepUAVControl::SendServoValues(void)
{
#ifdef CTRLMETHOD_SC8000
SC8_SendPositions(0, SC8_SERVO_MASK, SC8_DIO_MASK, servo_values);
// NOTE: ^ the 0 should be boardnumber, but boardnumber == 1
// and SC8_SendPositions only seems to work with 0
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
send_serial(serial, servo_values, AXIS_THROTTLE, BIT_THROTTLE);
send_serial(serial, servo_values, AXIS_YAW, BIT_YAW);
send_serial(serial, servo_values, AXIS_PITCH, BIT_PITCH);
send_serial(serial, servo_values, AXIS_ROLL, BIT_ROLL);
#endif //def CTRLMETHOD_USB2VOLTAGE
}
sep_utils.h
//--------------------------------------------------------------------------//
// sep_utils.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains some useful utility functions
//--------------------------------------------------------------------------//
#ifndef __SEP_UTILS_H__
#define __SEP_UTILS_H__
namespace sep_utils
{
#define PI 3.14159265
} // namespace sep_utils
#endif //__SEP_UTILS_H__
stdafx.h
//--------------------------------------------------------------------------//
// stdafx.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.h for aerial_platform project
//--------------------------------------------------------------------------//
#pragma once
// cimg includes
//#include "CImg.h"
//using namespace cimg_library;
stdafx.cpp
//--------------------------------------------------------------------------//
// stdafx.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.cpp for aerial_platform project
//--------------------------------------------------------------------------//
#include "stdafx.h"
//--------------------------------------------------------------------------//
// remote_control.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for remote_control project
//--------------------------------------------------------------------------//
#ifndef __REMOTE_CONTROL_H__
#define __REMOTE_CONTROL_H__
#ifdef CTRLMETHOD_SC8000
// set values for SC-8000
#define SC8_COMM_PORT 1
#define SC8_SERVO_MASK (unsigned char)15
#define SC8_DIO_MASK (char)'0'
// servo values for use with Hitec Laser 6 RC transmitter
#define AXIS_THROTTLE 2
#define AXIS_YAW 3
#define AXIS_PITCH 0
#define AXIS_ROLL 1
#define THROTTLE_OFF 8000
#define THROTTLE_MAX 22000
#define THROTTLE_STEP 1400
#define YAW_LEFTMAX 8000
#define YAW_MID 15000
#define YAW_RIGHTMAX 22000
#define YAW_STEP 1000
#define PITCH_BACKWARDMAX 8000
#define PITCH_MID 15000
#define PITCH_FORWARDMAX 22000
#define PITCH_STEP 1000
#define ROLL_LEFTMAX 8000
#define ROLL_MID 15000
#define ROLL_RIGHTMAX 22000
#define ROLL_STEP 1000
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
// set serial interface values
#define COMM_STRING "\\\\.\\COM13"
// servo values for use with Original X-UFO transmitter
// DAC range min=0x000, max=0xFFF
#define BIT_THROTTLE 0x20
#define BIT_YAW 0x10
#define BIT_PITCH 0x40
#define BIT_ROLL 0x80
#define AXIS_THROTTLE 1
#define AXIS_YAW 0
#define AXIS_PITCH 2
#define AXIS_ROLL 3
#define SERVO_STEPS 10
#define THROTTLE_OFF 0
#define THROTTLE_MAX 2500
#define THROTTLE_STEP (THROTTLE_MAX-THROTTLE_OFF)/SERVO_STEPS
#define YAW_LEFTMAX 0
#define YAW_MID (YAW_RIGHTMAX-YAW_LEFTMAX)/2
#define YAW_RIGHTMAX 2500
#define YAW_STEP (YAW_RIGHTMAX-YAW_LEFTMAX)/SERVO_STEPS
#define PITCH_BACKWARDMAX 0
#define PITCH_MID (PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/2
#define PITCH_FORWARDMAX 2500
#define PITCH_STEP (PITCH_FORWARDMAX-PITCH_BACKWARDMAX)/SERVO_STEPS
#define ROLL_LEFTMAX 0
#define ROLL_MID (ROLL_RIGHTMAX-ROLL_LEFTMAX)/2
#define ROLL_RIGHTMAX 2500
#define ROLL_STEP (ROLL_RIGHTMAX-ROLL_LEFTMAX)/SERVO_STEPS
#endif //def CTRLMETHOD_USB2VOLTAGE
#endif //__REMOTE_CONTROL_H__
remote_control.cpp
//--------------------------------------------------------------------------//
// remote_control.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains main() for test remote control app for the aerial platform
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include <iostream>
using namespace std;
#include "remote_control.h"
#include "remote_keys.h"
#ifdef CTRLMETHOD_SC8000
#define _USE_DLL // for TTiSC8VC
#define __WIN32__
#include "TTiSC8VC.h"
#endif //def CTRLMETHOD_SC8000
#ifdef CTRLMETHOD_USB2VOLTAGE
#include "Serial.h"
// send serial data out
int send_serial(CSerial& serial, int* valuesin, int axis, int axisbit)
{
char transmit[2];
long error;
// ready values
transmit[1] = valuesin[axis] & 0xFF; // lower byte
transmit[0] = (valuesin[axis] & 0xF00)/256; // upper byte
transmit[0] += axisbit; // channel number
//transmit header
error = serial.Write("iu");
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
else
{
//transmit data
error = serial.Write(transmit, 2);
if (error != ERROR_SUCCESS)
cerr << "Unable to send data" << endl;
}
if (error != ERROR_SUCCESS)
return 0;
else
return 1;
}
#endif //def CTRLMETHOD_USB2VOLTAGE
void print_instructions(void)
{
// draw nice ascii art transmitter with control instructions
cout << endl;
cout << '\t' << " |" << endl;
cout << '\t' << " X-UFO |" << endl;
cout << '\t' << " /|\\" << endl;
cout << '\t' << "/---------------------\\" << endl;
cout << '\t' << "| " << THROTTLE_UP << " "
<< PITCH_FORWARD << " |" << endl;
cout << '\t' << "| | | |" << endl;
cout << '\t' << "| " << YAW_LEFT << "-- --" << YAW_RIGHT
<< " " << ROLL_LEFT << "-- --" << ROLL_RIGHT
<< " |" << endl;
cout << '\t' << "| | | |" << endl;
cout << '\t' << "| " << THROTTLE_DOWN << " "
<< PITCH_BACKWARD << " |" << endl;
cout << '\t' << "\\---------------------/" << endl;
cout << endl;
}
keysIn.GetCommands();
system("cls");
}
#ifdef CTRLMETHOD_USB2VOLTAGE
// initiate serial control
CSerial serial;
long error;
// attempt to open the serial port
error = serial.Open(_T(COMM_STRING),0,0,false);
if (error != ERROR_SUCCESS)
cerr << "Unable to open comm port " << COMM_STRING << endl;
else
{
// setup the serial port (9600, N81); handshaking is configured below
error = serial.Setup(CSerial::EBaud9600,CSerial::EData8,
CSerial::EParNone,CSerial::EStop1);
if (error != ERROR_SUCCESS)
cerr << "Unable to set comm port setting" << endl;
else
{
// setup handshaking
error = serial.SetupHandshaking(CSerial::EHandshakeHardware);
if (error != ERROR_SUCCESS)
cerr << "Unable to set comm port handshaking" << endl;
else
{
// initiate command values and remoteKeys class
int throttle(0), yaw(0), pitch(0), roll(0), axisvalues[4];
remoteKeys keysIn(throttle, yaw, pitch, roll);
remote_keys.h
//--------------------------------------------------------------------------//
// remote_keys.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for remoteKeys class
//--------------------------------------------------------------------------//
#ifndef __REMOTE_KEYS_H__
#define __REMOTE_KEYS_H__
class remoteKeys
{
public:
remoteKeys(int& throttleref, int& yawref, int& pitchref, int& rollref);
~remoteKeys();
void GetCommands(void);
private:
char keyin;
int& throttle;
int& yaw;
int& pitch;
int& roll;
};
#endif //__REMOTE_KEYS_H__
remote_keys.cpp
//--------------------------------------------------------------------------//
// remote_keys.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the remoteKeys class for inputting keys and returning
// remote command values
//--------------------------------------------------------------------------//
#include "remote_keys.h"
#include <iostream>
using namespace std;
#include <conio.h>
//#include <windows.h>
// constructor
remoteKeys::remoteKeys(int& throttleref, int& yawref,
int& pitchref, int& rollref)
: throttle(throttleref), yaw(yawref), pitch(pitchref), roll(rollref)
{
}
// destructor
remoteKeys::~remoteKeys()
{
}
// get one key from keyboard and adjust servo values accordingly
void remoteKeys::GetCommands(void)
{
// get key from console without echo
keyin = _getch();
// check for escape key
if (keyin == ESC_KEY) throttle = THROTTLE_ESC;
// check for space/reset key
else if (keyin == SPACE_KEY){
throttle = 0;
yaw = 0;
pitch = 0;
roll = 0; }
// adjust throttle
else if (keyin == THROTTLE_UP) throttle++;
else if (keyin == THROTTLE_DOWN) throttle--;
// adjust yaw
else if (keyin == YAW_LEFT) yaw--;
else if (keyin == YAW_RIGHT) yaw++;
// adjust pitch
else if (keyin == PITCH_FORWARD) pitch++;
else if (keyin == PITCH_BACKWARD) pitch--;
// adjust roll
else if (keyin == ROLL_LEFT) roll--;
else if (keyin == ROLL_RIGHT) roll++;
// check values for limits
if (throttle == -1) throttle = 0;
else if (throttle > 10) throttle = 10;
stdafx.h
//--------------------------------------------------------------------------//
// stdafx.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.h for remote_control project
//--------------------------------------------------------------------------//
#pragma once
//#define CTRLMETHOD_SC8000
#define CTRLMETHOD_USB2VOLTAGE
#ifdef CTRLMETHOD_USB2VOLTAGE
#include <tchar.h>
#include <crtdbg.h>
#include <windows.h>
#include <conio.h>
#endif //def CTRLMETHOD_USB2VOLTAGE
stdafx.cpp
//--------------------------------------------------------------------------//
// stdafx.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.cpp for remote_control project
//--------------------------------------------------------------------------//
#include "stdafx.h"
//--------------------------------------------------------------------------//
// ugv_emulation.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains various bits for ugv_emulation project
//--------------------------------------------------------------------------//
#ifndef __UGV_EMULATION_H__
#define __UGV_EMULATION_H__
//#define EMULATION_DEBUG
#define POLL_FREQ 20 //ms
#define SEND_INTERVAL 200 //polls per send
typedef enum {
STATE_CHECK,
STATE_CLOSE,
STATE_ERROR,
STATE_HEIGHT,
STATE_INIT,
STATE_WAIT
} state_machine;
#endif //__UGV_EMULATION_H__
ugv_emulation.cpp
//--------------------------------------------------------------------------//
// ugv_emulation.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains main() for emulation of ugv comms
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "ugv_emulation.h"
#include "netapi_interface.h"
switch (current_state)
{
// check for message state
case STATE_CHECK:
#ifdef EMULATION_DEBUG
cout << "Poll: check for message" << endl;
#endif // EMULATION_DEBUG
if (UGVcomms.commsCheck())
{
next_state = STATE_WAIT;
}
else
{
cerr << "Error: could not check for messages"
<< endl;
next_state = STATE_ERROR;
}
break;
// close state
case STATE_CLOSE:
break;
// error state
case STATE_ERROR:
cout << "A FATAL ERROR HAS OCCURRED, " <<
"PLEASE CLOSE UGV EMULATION" << endl;
Sleep(INFINITE);
break;
// init state
case STATE_INIT:
if (UGVcomms.commsInit())
{
next_state = STATE_WAIT;
}
else
{
cerr << "Error: could not initiate netAPI" << endl;
next_state = STATE_ERROR;
}
break;
// wait state
case STATE_WAIT:
if (UGVcomms.commsConnected())
{
if (pollcount < SEND_INTERVAL)
{
pollcount++;
next_state = STATE_CHECK;
}
else
{
pollcount = 0;
next_state = STATE_HEIGHT;
}
}
// sleep to prevent CPU overuse
Sleep(POLL_FREQ);
break;
// default
default:
cerr << "Error: state machine entered unknown state"
<< endl;
next_state = STATE_ERROR;
}
current_state = next_state;
}
return 0;
}
netapi_interface.h
//--------------------------------------------------------------------------//
// netapi_interface.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains prototype for netAPI_UGV class
//--------------------------------------------------------------------------//
#ifndef __NETAPI_INTERFACE_H__
#define __NETAPI_INTERFACE_H__
#include "TCPLink.h"
#define USE_DEFAULTS
#define BCAST_LOCATION "C:/SEP_code/netAPI/"
#define LOCAL_PORT 2000
#define REMOTE_PORT 2001
#define CHANNEL_TO_UAV "to_uav"
#define CHANNEL_TO_UGV "to_ugv"
#define LOCAL_ADDRESS "ugv_machine"
#define REMOTE_ADDRESS "uav_machine"
class netAPI_UGV
{
public:
netAPI_UGV(void);
~netAPI_UGV();
bool commsInit(void);
bool commsConnected(void);
bool commsCheck(void);
bool commsSend(void);
private:
double nextHeight(void);
double last_height;
bool height_error;
bool connected;
int local_port;
int remote_port;
string channel_to_uav;
string channel_to_ugv;
string local_address;
string remote_address;
TCPLink* connection_to_uav;
TCPLink* connection_from_uav;
};
#endif //__NETAPI_INTERFACE_H__
netapi_interface.cpp
//--------------------------------------------------------------------------//
// netapi_interface.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// contains the netAPI_UGV class for interfacing with netAPI
//--------------------------------------------------------------------------//
#include "stdafx.h"
#include "netapi_interface.h"
#include "ugv_emulation.h"
// constructor
netAPI_UGV::netAPI_UGV(void)
: connected(false), connection_to_uav(NULL), connection_from_uav(NULL)
{
}
// destructor
netAPI_UGV::~netAPI_UGV()
{
// delete TCPLink objects
if (connection_to_uav)
delete connection_to_uav;
if (connection_from_uav)
delete connection_from_uav;
}
{
last_height -= HEIGHT_JUMP;
}
return last_height;
}
stdafx.h
//--------------------------------------------------------------------------//
// stdafx.h
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.h for ugv_emulation project
//--------------------------------------------------------------------------//
#pragma once
stdafx.cpp
//--------------------------------------------------------------------------//
// stdafx.cpp
// - developed for University of Surrey - Special Engineering Project
// - by Richard Lane 2007-08
//--------------------------------------------------------------------------//
// stdafx.cpp for ugv_emulation project
//--------------------------------------------------------------------------//
#include "stdafx.h"