
International Journal of Remote Sensing Applications, Volume 4, Issue 3, September 2014
doi: 10.14355/ijrsa.2014.0403.01
www.ijrsa.org

A Low-cost Unmanned Aerial System for Remote Sensing of Forested Landscapes
M.G. Wing*1, J. Burnett1, S. Johnson2, A.E. Akay3, J. Sessions1
1Oregon State University, U.S.
2VDOS Global, LLC, U.S.
3Kahramanmaraş Sütçü İmam University, Turkey
*Michael.Wing@oregonstate.edu; Jonathan.Burnett@oregonstate.edu; Seth.Johnson@vdosglobal.com; akay@ksu.edu.tr; John.Sessions@oregonstate.edu

Abstract
We developed a low-cost unmanned aerial system (UAS) for
less than $1400 that is capable of capturing high-resolution
imagery of landscapes. We accomplished this feat by making
use of advances in open-source technology that are
propelling small UAS applications for remote sensing.
We describe the low-cost UAS that we developed, the
process for assembling the aircraft, and the various
components that are necessary for flight. We also describe
software and piloting options that can support automated
flight. In addition, we provide examples of high-resolution
imagery that were captured of forest stands, buildings, and
other features during a flight over a university campus in
Turkey.
We hope the descriptions and examples we provide will
encourage researchers to develop and advance their own
UAS for remote sensing applications. We anticipate that
future years will bring more technological advancements
that reduce the weight and size of high-resolution sensors
and autopilot systems while increasing the reliability of
UAS. It is also likely that the future will see small UAS as the
primary means of collecting remotely sensed data for a
variety of scientific, engineering, and resource management
applications.
Keywords
UAS Technology; Affordable; Imagery

Introduction
Unmanned aerial systems (UASs) include aircraft that do not have a pilot on board. UAS aircraft platforms can be controlled either through remote communication or through an autopilot. The autopilot can include a pre-planned flight path or other guidance, such as automatically reacting to other features that are within a certain radius. UAS technology is in a state of rapid advancement. Current UAS technology allows operators to mount color and infrared digital cameras to take high-resolution imagery of forested and other landscapes. The imagery is immediately available following a flight, and subsequent flights can be rapidly accommodated.
We describe the development and application of a
low-cost (<$1400) UAS for remote sensing of
landscapes. This UAS platform requires some effort in
building and other preparations, but presents a
potentially significant instrument for foresters and
others that can benefit from high-resolution imagery
related to land management or other activities.
We flew our platform over the Kahramanmaraş Sütçü İmam University (KSU) campus, located in the eastern Mediterranean city of Kahramanmaraş, Turkey. While
the Federal Aviation Administration (FAA) tightly
controls UAS flights in the U.S., many other countries
do not restrict flights. Our objectives were to capture
high resolution digital color imagery of the campus in
general, and of tree stands within the campus. We
describe UAS applications, the construction and
componentry of our low-cost UAS, and the imagery
that we captured of the KSU campus and surrounding
lands.
Background
Remote sensing and GIS technologies have proven
beneficial to natural resource managers for decades
(Wing and Bettinger 2003). Remote sensing vehicles
can include satellite, airborne, and ground-based
platforms. One of the great advantages of remote sensing is the potential to cover large areas quickly, but this sometimes involves costs that are prohibitive to some organizations (USFS 2004).
In the case of remote sensing across broad landscapes,
one of the prohibitive costs has been the expense
involved in supporting aerial platforms that enable

high spatial resolution imagery. Manned aerial platforms (airplanes or helicopters) that support high
resolution imagery require, at a minimum, one pilot
and an aircraft capable of supporting significant
payloads. Manned flights always carry pilot risk, which constrains how close to the ground platforms can fly and, in turn, limits the potential resolution of imagery. In addition, because most manned aircraft must fly at significant heights, cloud cover and weather patterns can prevent sensors from viewing ground cover (Hunt et al. 2008). The Federal
Aviation Administration (FAA) governs airspace
within the U.S. and requires that manned aircraft
operate at least 305 m (1000 ft) above the highest
ground obstacle in populated areas and 153 m (500 ft)
elsewhere (Federal Aviation Regulations 2010).
The potential applications of UAS for wildfire fighting
and tracking are significant yet still largely untapped
(Ambrosia et al. 2011, Wing et al. in press). Aircraft
such as rotary-wing platforms (helicopters) with
extended flight times (2-3 hours) could enhance
remote sensing of burning landscapes. Fire direction
and speed can change rapidly with results that place
human lives and structures in peril. A sustained
observation of wildfire can help direct firefighting
resources where they are most needed. Communities
can be warned and evacuated should fire spread
warrant. Firefighting crews can also be prompted for
movement to avoid danger.
Natural resource research has been the subject of
previous UAS applications for remote sensing but
many of these studies have involved platforms that
were well beyond $5,000 in costs (Wing et al. 2013).
Despite relatively high costs, researchers have found
success with the relatively fine resolution imagery and
flight flexibility that are typically associated with UAS.
Getzin et al. (2012) used high-resolution (~7 cm, 2.8 in)
imagery to associate disturbance patterns with forest
understory plant diversity. Watts et al. (2010) explored
UAS applications for wildlife surveys with imagery as
fine as 1 cm (0.4 in). This allowed identification of
alligators within swamp land. Vermeulen et al. (2013)
found that elephants were clearly discernible in UAS-collected imagery but that other small- and medium-sized mammals were not. UAS have also been used for
measuring and monitoring rangeland. Rango et al.
(2009) used hyperspatial UAS imagery to measure
vegetative cover and structure within a rangeland
ecosystem.
One potential application where UAS could make strong contributions is in the remote sensing of vegetation disease outbreak and spread. Vegetation
moisture content can be measured by sensors that
operate within the infrared spectrum. The moisture
content of leaves and needles is often reduced by
disease and infestation. Many UAS are able to fly at
slower speeds and lower elevations than typical
manned aircraft. Thus a UAS outfitted with high
resolution infrared sensors should be able to more
readily detect moisture stress.
Supporting law enforcement activities is another
potential application area for UAS. Wing and Tynon
(2006, 2008) described the spatial patterns and spread of what was once considered urban-centric crime on public recreation lands in the U.S. Recreational lands, such as the National Forests, National Parks, and other areas, are experiencing increases in criminal activities. These activities include physical altercations, vandalism (including arson), and drug production. One of the
reasons for this increase is that many public land law
enforcement officers are responsible for very large
geographic areas, and not uncommonly, lands that are
remote or difficult to access quickly (Wing and Tynon
2006). Large areas make it extremely difficult for
officers to be available when needed. In addition, even
in cases where an officer is able to arrive at a location
where criminal activity has been reported, there have
been deadly consequences as many public lands
officers work solo and not in proximity to rapid
assistance. Previous research has considered very
small UASs which are capable of highly detailed
ground observations (Helble and Cameron 2007).
Small UASs with extended flying range could strongly
enhance law enforcement officer ability to safely
observe large areas and make more informed
decisions about response. In particular, UASs that are
powered by electrical batteries could observe target
areas much more discreetly than other vehicles due to
their significantly reduced sound signature compared
to gas power. This advantage can be significant in
identifying and reacting to illegal operations.
Wildlife habitat research has been the focus of several
natural resource and UAS application studies (Koski et
al. 2009, Watts et al. 2010). UAS that are coupled with
an autopilot have the ability to fly a pre-determined
path, and to repeat any flight pattern (Laliberte et al.
2010). This direct flight planning and repeatability can strengthen the statistical validity of habitat sampling results by avoiding convenience sampling.
UAS technology can also be beneficial when high spatial resolution is necessary because the requisite flight altitudes result in dangerous conditions for
manned aircraft. The U.S. Fish and Wildlife Service
(USFWS) with the cooperation of the United States
Geological Survey (USGS) conducted a sandhill crane
population inventory in 2011 (Hutt 2011, Owen 2011).
Sensors were required that could support individual
bird species delineation in low light conditions. The
platform also had to operate quietly enough so as not
to alarm birds into flight. A ~1.8 kg (4 lb) RQ-11A Raven containing a thermal camera was hand-launched and flew a gridded sampling pattern above a nesting sandhill crane flock at an altitude of ~61 m (200 ft).
The resulting imagery was used to count individual
birds. The count generated from the imagery only
differed by less than 5% from that collected by trained
ground crew (2,567 compared to 2,692). The flight had
notable successes including an accurate inventory
count, not flushing the nest population, and the ability
to gather high resolution imagery in low-light
conditions. The only significant issue reported by the
authors was a five month approval process by the
FAA in order to fly. The FAA also denied the original
request to fly during night-time conditions.
Researchers have also applied UAS for fish inventory
applications. Kudo et al. (2012) flew an unmanned
helicopter over the Moheji River in Japan to estimate
chum salmon populations. The helicopter had a 5 cc (0.3 cubic inch) gas-powered engine equipped with an anti-vibration device and a Canon SLR camera. The helicopter was flown at approximately 30 m (100 ft) above the river. Salmon silhouettes were extracted
from the imagery. Statistical evidence indicated a
positive correlation between the extracted silhouette
number and the standardized number of salmon
seine-netted between the weir and estuary.
The authors noted several limitations of their study.
One was that a remote pilot was needed with a
ground control station on the riverbank in order to
control the helicopter. This limited the length of the
river that could be surveyed. Another was that the
helicopter could not be kept at a consistent height
above the river. Due to this, staves had to be placed
along the river banks in order to scale and adjust
imagery to a consistent dimension. Both of these issues
could be addressed with an autopilot that would
automatically guide the helicopter to river locations,
and maintain a consistent altitude. The authors also
pointed out that only a shallow river, free from turbidity and significant vegetative cover, could be surveyed effectively with this approach.

Search and rescue (SAR) operations are another application where UAS technology can be advantageous. In particular, weather conditions such as extreme cold, or difficult landscapes, can endanger searchers. UAS have the potential to reduce risk to searchers and to increase the efficiency and extent of searches, but little published research has been
done in this area. One notable study involved the
development of a flying wing aircraft for autonomous
SAR (Goodrich et al. 2008). The flying wing weighed
approximately 2 kg (4.4 lbs) and was equipped with a
gimbaled electro optical camera. The wing was
programmed to perform a spiral search pattern that
was refined through search algorithms designed to
detect humans. Trials demonstrated that the UAS was
successful at detecting still and mobile humans in a
mountainous forest setting. Based on field trials, it was
concluded that an ideal search altitude for the UAS
was approximately 60-107 m (200-350 ft) above the
ground. The study highlighted some of the potential
concerns and strengths that UAS can bring to SAR
operations. One concern is that there needs to be a pre-planned method for providing information that is
generated from the UAS to any field-based personnel
that are attempting to locate a SAR objective.
However, strengths include being able to operate a
UAS closer to the ground than a manned aircraft thus
also increasing ground resolution and the probability
of object detection. This strength is also enhanced
when object detection can be accomplished
autonomously.
Methods
We used a Zephyr II aircraft platform manufactured
by Ritewing. The Zephyr II is constructed of injection
molded, flexible E-por foam, spans 140 cm (55 inches)
between wingtips, and weighs approximately 1 kg (2.2
lbs) empty. The Zephyr II sold for $325 and arrived
unassembled (Figure 1). The platform required
approximately 36 hours of preparation before the
initial flight and required that several components be
purchased and installed (Table 1).
The initial preparation involved covering the entire
platform with flexible and transparent 3 mm laminate.
A hot iron was used to mold the laminate to the
aircraft. Although laminate is not necessary, it makes
the platform more rigid, increasing flight stability. We
then laminated a set of elevons (ailerons + elevator).
The elevons function as control surfaces on the trailing
part of the wing and are constructed of balsa wood.
The elevons control the attitude (yaw, pitch, and roll)
of the aircraft when it is powered. The elevons were attached to the end of the wings using strapping tape.
The tape attachment must be done in such a way that the elevons can move up and down with +/- 60 degrees of deflection without resistance. Servos are small geared actuators that rotate an output shaft; they are the mechanical means by which the elevons are moved. The RC receiver or autopilot produces a signal that instructs each servo to activate, specifying the speed and degree of rotation. This translates to independent elevon deflection in the up or down direction.
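To illustrate how a single pitch or roll command becomes two independent elevon movements, the sketch below shows a generic flying-wing mixing scheme. It is not the ArduPilot mixer; only the +/- 60 degree deflection limit comes from the text, and the normalized command range is an illustrative assumption.

```python
# Generic flying-wing elevon mixer: a pitch command and a roll command are
# combined into left and right elevon deflections. Illustrative sketch only,
# not the ArduPilot mixer; the +/- 60 degree limit follows the deflection
# range described in the text.

def mix_elevons(pitch_cmd, roll_cmd, max_deflection=60.0):
    """pitch_cmd and roll_cmd are normalized commands in [-1, 1];
    returns (left, right) elevon deflections in degrees."""
    left = (pitch_cmd + roll_cmd) * max_deflection
    right = (pitch_cmd - roll_cmd) * max_deflection
    clamp = lambda d: max(-max_deflection, min(max_deflection, d))
    return clamp(left), clamp(right)

# Example: a gentle pitch-up with a slight right roll deflects the elevons
# asymmetrically (approximately 30 and 6 degrees).
print(mix_elevons(pitch_cmd=0.3, roll_cmd=0.2))
```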

FIGURE 1. FROM TOP OF FIGURE: UNASSEMBLED ZEPHYR II PLATFORM, DIGITAL CAMERA, AUTOPILOT, RADIO CONTROLLER, ELEVONS, AND A 31 CM RULER FOR SCALE.

We installed multiple electrical and mechanical components into the platform. Although a cut-out had already been made in the bottom bay of the platform, we had to expand this bay, and we also created several other small cut-outs to house other components.
TABLE 1. COMPONENTS AND COST FOR UAS CONFIGURATION.

Component               Cost
2.4 GHz Tx/Rx           $360
4500 mAh 11.1 V LiPo    $30
Airspeed Sensor         $25
ArduPilot APM 2.5       $160
Canon S100              $300
RiteWing Zephyr II      $325
TTC Radio               $86
uBlox GPS Module        $76
Voltage Regulator       $15
Total                   $1,377
We chose to power the aircraft through an 11.1 volt, 4500 mAh lithium-polymer (LiPo) battery that sells for approximately $30. This battery provided approximately 25 minutes of flight while powering the other electrical components in our platform. We selected a 1200 KV electric motor that featured a hinged propeller. The hinges allow the propeller blades to be folded during transport. Hinges also allow the prop to fold during landing, preventing blade damage. The motor was fixed to an aluminum mount and glued to the rear of the aircraft using an epoxy.
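As a rough check on the endurance figure above, the following sketch relates battery capacity and average current draw to flight time. The 80% usable-capacity margin and the ~9 A average draw are assumptions chosen only to illustrate the arithmetic; they are not values measured in this study.

```python
# Rough endurance estimate from battery capacity and average current draw.
# The 80% usable-capacity margin (to protect the LiPo) and the ~9 A average
# draw are illustrative assumptions, not values reported in the study.

def flight_time_minutes(capacity_mah, avg_current_a, usable_fraction=0.8):
    usable_ah = capacity_mah / 1000.0 * usable_fraction
    return usable_ah / avg_current_a * 60.0

# A 4500 mAh pack at ~9 A average draw gives roughly 24 minutes,
# consistent with the ~25 minute flights described above.
print(round(flight_time_minutes(4500, 9.0), 1))
```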
The heart of the platform is the autopilot (Figure 1). We selected the ArduPilot APM 2.5 due to its low cost ($160) and strong functionality. The autopilot is an electrical motherboard that communicates movements and conditions to the components that control flight parameters. Movements can either be signaled to the autopilot by radio controller (Figure 1) or be pre-programmed and loaded digitally onto the autopilot. Both conditions can also occur during a flight. For instance, a pilot might choose to use manual flight during a launch, which would mean that the motor RPM and elevons would be controlled by manipulations of the radio controller levers. Once airborne, auto mode could be enabled and the aircraft would automatically fly to pre-programmed locations. During flight, a pilot can also choose between two flight assistance modes. The first is called fly-by-wire and will prevent a pilot from rolling or flipping the aircraft. Fly-by-wire allows for manual command and direction of the autopilot; the autopilot retains flight control and continues to observe pre-set flight regime limitations. The second is called stabilize mode. While stabilize mode may not prevent a roll or flip, the aircraft will return to level flight if the pilot releases the flight controls. After all designated locations had been reached, a manual or assisted (FBW-A mode) landing could be attempted using the radio controller levers. It is also possible to program a landing to occur automatically, without pilot interaction.
The autopilot we used can be programmed using the open-source ArduPilot Mission Planner software that is freely available (APM Multiplatform Autopilot 2013). The software will automatically create a flight plan from a line or polygon drawn on an interface that accesses Google Maps (Figure 2). If continuous imagery of an area is desired, a user can specify the spacing between flight paths and the software will create a flight plan that can be loaded onto the autopilot. This capability is significant in that it can support the acquisition of stereo imagery, which can be used to create digital products such as digital terrain models (DTMs), landscape measurements, and point clouds. Point clouds provide visual images that appear as if one is viewing a landscape in a 3-dimensional perspective.
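The sketch below illustrates the spacing logic behind such a flight plan: a serpentine ("lawnmower") pattern of waypoints over a rectangular block in local coordinates. Actual missions for the APM are generated and loaded through Mission Planner; the block dimensions, line spacing, and altitude here are illustrative assumptions.

```python
# Minimal "lawnmower" survey pattern over a rectangular block, expressed in
# local metres east/north of a launch point. Real flight plans for the APM
# are built in Mission Planner; this only illustrates the flight-line
# spacing logic described in the text. All dimensions are illustrative.

def survey_waypoints(width_m, height_m, line_spacing_m, altitude_m):
    waypoints = []
    n_lines = int(width_m // line_spacing_m) + 1
    for i in range(n_lines):
        x = i * line_spacing_m
        ys = (0.0, height_m) if i % 2 == 0 else (height_m, 0.0)
        for y in ys:  # alternate direction so the path is continuous
            waypoints.append((x, y, altitude_m))
    return waypoints

# 300 m x 200 m block, 50 m between flight lines, flown at 120 m above ground.
for wp in survey_waypoints(300, 200, 50, 120):
    print(wp)
```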

FIGURE 2. ARDUPILOT MISSION PLANNER SOFTWARE.

The autopilot contains multiple input and output ports. A primary input port is the radio controller receiver. This allows the communication of signals from the controller to the autopilot. The communication frequency is 2.4 GHz. Our radio controller is able to communicate with the autopilot at distances up to 1600 m (1 mile).
Our primary sensor is a Canon Powershot S100 digital
camera that sells for approximately $300. The camera
captures color imagery at 12.1 megapixel resolution.
This translates into a ground resolution of ~5 cm (2 inches) at a height of 122 m (400 feet), which is the recommended maximum altitude for "model airplane" enthusiast flights in the U.S. (FAA 1981).
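The relationship between flying height and ground resolution can be checked with a simple ground sample distance calculation, sketched below. The sensor width, image width, and focal length are nominal Canon S100 values assumed for illustration; the result is consistent with the ~5 cm figure quoted above.

```python
# Ground sample distance (GSD) from camera geometry. The sensor width,
# image width, and focal length below are nominal Canon S100 values used
# for illustration only.

def ground_sample_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    pixel_pitch_mm = sensor_width_mm / image_width_px
    return altitude_m * pixel_pitch_mm / focal_length_mm  # metres per pixel

# 122 m (400 ft) flying height, 5.2 mm focal length (widest zoom setting),
# 7.44 mm sensor width, 4000-pixel image width.
gsd = ground_sample_distance(122, 5.2, 7.44, 4000)
print(f"{gsd * 100:.1f} cm per pixel")  # roughly 4-5 cm
```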
Our camera is powered by the aircraft battery through the use of a voltage regulator. We currently program the camera to begin taking photos at regular intervals (5 seconds). The Canon Hack Development Kit (CHDK), a programming resource for Canon cameras, provides detail on how to do this (CHDK Forum 2013). We have also configured our radio controller to transmit a signal at the push of a button that directs when photos are taken. It is also possible to specify geographic locations at which photos should be captured. We have also modified a Canon S100 so that it captures near-infrared imagery. We use this camera to sense greenness in vegetation, and have used this infrared capability to calculate spectral indices such as the normalized difference vegetation index (NDVI) (Gamon et al. 1995).
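For readers unfamiliar with the index, the sketch below applies the standard NDVI formula to a pair of co-registered red and near-infrared arrays. Extracting and calibrating those bands from the modified S100 imagery is outside the scope of the sketch, and the toy values are assumptions.

```python
import numpy as np

# Standard NDVI = (NIR - Red) / (NIR + Red), applied per pixel to
# co-registered bands. How the red and NIR bands are extracted from the
# modified S100 imagery is not shown; the arrays are assumed to be aligned
# and radiometrically comparable.

def ndvi(nir, red, eps=1e-6):
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy example: bright NIR over vegetation yields NDVI near +0.6.
nir_band = np.array([[0.40, 0.45], [0.50, 0.42]])
red_band = np.array([[0.10, 0.12], [0.08, 0.30]])
print(ndvi(nir_band, red_band))
```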
The aircraft is also equipped with a $25 airspeed sensor (Figure 3). Airspeed, in contrast to ground speed, indicates how much lift is being generated by the aircraft and is used by the autopilot to make speed adjustments that ensure stable flight. A $76 GPS receiver is attached to the autopilot and calculates the x, y, and z position of the aircraft. Location and elevation coordinates are transmitted to a transceiver radio that can be plugged into a laptop. We use a typical Windows laptop as our ground station. During flight, we are able to view the position of the aircraft with Google-provided imagery in the background. The positional coordinates and speed of the aircraft are also provided via a wireless telemetry link with the aircraft. The wireless telemetry link is served by an $86 915 MHz telemetry, tracking, and commanding radio (Figure 3). This provides the command link by which autopilot parameters can be altered, and airspeed, location, and other flight parameters can be recorded.

FIGURE 3. ZEPHYR II WITH COMPONENTS INSTALLED.

The surface of the Zephyr that faces the ground during flight is very plain in appearance (Figure 4). Some operators prefer to paint this surface with a bright color, such as orange, so that the aircraft is more visible during flight operations.

Results
The flight over the KSU campus was accomplished in approximately 20 minutes. We launched the aircraft platform using manual flight controls by throwing the aircraft into the air (Figure 5). We conducted the entire flight with manual navigation.

FIGURE 5. HAND LAUNCH OF ZEPHYR II.

Figure 6 shows a platform-captured image of KSU's courtyard and campus. The sidewalks and other structures are clearly discernible and demonstrate the high detail of the color imagery. Boundary measurements and structural foundation measurements would be supported by this image. In addition, one of the young forest stands within the campus is visible.

FIGURE 6. A PORTION OF THE CAMPUS AND COURTYARD AT KAHRAMANMARAŞ SÜTÇÜ İMAM UNIVERSITY.

Several images of the young forest stands were captured during the flight. One image shows a nadir perspective of a young stand (Figure 7). This image would support estimates of inventory information, such as the number of trees and their associated crown width. In addition, a definitive planting pattern was evident in the nadir image. The enhanced resolution would also support the derivation of tree heights due to the presence of shadows. With the collection of stereo images, tree heights could be calculated with greater accuracy and precision. This information would be invaluable for harvest and reforestation planning.
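A simple shadow-based height estimate of the kind implied above is sketched here; it assumes level ground and a known sun elevation angle, and the shadow length and sun angle used are illustrative values only.

```python
import math

# Tree height from shadow length and sun elevation angle, the simple
# trigonometric relation implied by the text: on level ground,
# height = shadow_length * tan(sun_elevation). Values are illustrative.

def tree_height_from_shadow(shadow_length_m, sun_elevation_deg):
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

# A 6.0 m shadow with the sun 40 degrees above the horizon suggests a
# tree roughly 5 m tall.
print(round(tree_height_from_shadow(6.0, 40.0), 1))
```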
FIGURE 7. LARGE AREA FOREST STAND AT KAHRAMANMARAŞ SÜTÇÜ İMAM UNIVERSITY.

A portion of the primary campus is shown in Figure 8. This image was captured off-nadir, which allows the heights of features such as trees and buildings to be measured. Individual trees and automobiles can be clearly discerned.

FIGURE 8. PRIMARY CAMPUS AREA AT KAHRAMANMARAŞ SÜTÇÜ İMAM UNIVERSITY.
Discussion
We demonstrated the application of a low-cost (<$1400) UAS that is capable of collecting georeferenced, high-resolution color imagery of structures and forests. This capability transcends the typical process of collecting aerial imagery, which includes the cost of a pilot and aircraft, the cost of image collection, and the prolonged time that is usually required before the imagery becomes available. This process unfolds again should subsequent imagery be desired.

Our collected images demonstrate that high-resolution imagery of structures and forests can be captured quickly. The images are almost immediately accessible and can support a variety of landscape measurements.

UAS technology is still in its infancy, but the available technology is already valuable for working foresters and land managers. In particular, the ability to target flight areas and to schedule subsequent flights quickly is very advantageous for change detection.
Change detection involves observing the growth patterns of trees, both in crown width and height, as well as basic inventory information such as the number of living trees at various times after planting. With the application of near-infrared imagery, information about the health of trees, based on the moisture content in foliage, could also be derived. This information could provide guidance in determining whether tree stress is environmental or pest related. In addition, detection of tree stress could be used to direct mineral amendments or other management responses.
One of the most in-demand sources of spatial data today is that produced by light detection and ranging (LiDAR) sensors. A smaller-format LiDAR sensor has been developed that weighs less than 2 kg (4.4 lbs) and can be mounted on a UAS. LiDAR provides highly detailed topographic information and also returns infrared reflectance values. A three-dimensional rendering of a landscape can be created through a LiDAR point cloud and can be used to measure the dimensions of landscape features. Researchers have recently begun to use color 3D photogrammetry to create 3D point clouds of similar density to those created using LiDAR (Ruther et al. 2012). Unlike LiDAR, however, 3D photogrammetric products are limited by the availability of passive light and cannot effectively detect subcanopy structure that is not reflecting light. Despite these limitations, these color imagery point clouds still provide sufficient detail to support reliable feature measurements and the creation of canopy height models. In addition, color imagery point clouds can be created at much lower costs than LiDAR point clouds.
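The canopy height model idea mentioned above reduces to differencing two gridded surfaces, as sketched below. The toy elevation grids are assumptions; generating the surface and terrain models from photogrammetric or LiDAR point clouds is not shown.

```python
import numpy as np

# A canopy height model (CHM) is the difference between a digital surface
# model (DSM, top of canopy) and a digital terrain model (DTM, bare earth),
# both gridded from a point cloud. The 3x3 arrays below are toy elevations
# in metres.

def canopy_height_model(dsm, dtm):
    chm = dsm - dtm
    return np.clip(chm, 0.0, None)  # negative values are treated as ground

dsm = np.array([[110.0, 118.5, 121.0],
                [109.5, 117.0, 122.5],
                [108.0, 109.0, 119.5]])
dtm = np.array([[108.5, 108.0, 107.5],
                [108.0, 107.5, 107.0],
                [107.5, 107.0, 106.5]])
print(canopy_height_model(dsm, dtm))
```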
Although UAS technology is still in a state of rapid development, foresters and land managers can significantly benefit from current offerings at a cost that falls well within the budgets of most organizations. It can only be anticipated that UAS technology capabilities will, in the near future, yield increased value for those that need imagery of the resources they manage.

ACKNOWLEDGMENT

We thankfully acknowledge the OSU Faculty Internationalization Grant program and the U.S. National Institute for Food and Agriculture for supporting this project.

REFERENCES

APM Multiplatform Autopilot. 2013. Available at: http://ardupilot.com/downloads/. Last accessed July 30, 2013.

CHDK Forum. 2013. Available at: http://chdk.setepontos.com/index.php. Last accessed July 30, 2013.

Federal Aviation Regulations. 2010. Code of Federal Regulations, Title 14. Minimum safe altitudes: General.

Gamon, J. A., C. B. Field, M. L. Goulden, K. L. Griffin, A. E. Hartley, G. Joel, J. Peñuelas, and R. Valentini. 1995. Relationships between NDVI, canopy structure, and photosynthesis in three Californian vegetation types. Ecological Applications: 28-41.

Getzin, S., K. Wiegand, and I. Schöning. 2012. Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles. Methods in Ecology and Evolution 3:397-404.

Helble, H., and S. Cameron. 2007. OATS: Oxford aerial tracking system. Robotics and Autonomous Systems 55(9):661-666.

Hunt, E. R., W. D. Hively, C. S. Daughtry, G. W. McCarty, S. J. Fujikawa, T. L. Ng, and D. W. Yoel. 2008. Remote sensing of crop leaf area index using unmanned airborne vehicles. In ASPRS Pecora 17 Conference Proceedings. Bethesda, MD: American Society for Photogrammetry and Remote Sensing.

Hutt, M. 2011. USGS Takes to the Sky. Ejournal, September/October 2011, pp. 54-56. Available online at: rmgsc.cr.usgs.gov/UAS/pdf/sandhillcranes/ejournal_SeptOct_2011_birdsandUAVs.pdf; last accessed Dec 29, 2012.

Laliberte, A. S., J. E. Herrick, A. Rango, and C. Winters. 2010. Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring. Photogrammetric Engineering and Remote Sensing 76(6):661-672.

Owen, P. A. 2011. When the Ravens met the Sandhill Cranes: USGS and USFWS team turns to unmanned aircraft to count wildlife. Unmanned Systems, June:20-22.

Rango, A., A. Laliberte, J. E. Herrick, C. Winters, K. Havstad, C. Steele, and D. Browning. 2009. Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and management. Journal of Applied Remote Sensing 3(1):033542.

Ruther, H., J. Smit, and D. Kamamba. 2012. A comparison of close-range photogrammetry to terrestrial laser scanning for heritage documentation. South African Journal of Geomatics 1(2).

USDA Forest Service (USFS). 2004. Aerial Detection Overview Surveys Futuring Committee Report. Fort Collins, Colorado. FHTET-04-07. 48 p.

Vermeulen, C., P. Lejeune, J. Lisein, P. Sawadogo, and P. Bouché. 2013. Unmanned aerial survey of elephants. PLoS ONE 8(2):e54700.

Watts, A. C., J. H. Perry, S. E. Smith, M. A. Burgess, B. E. Wilkinson, Z. Szantoi, P. G. Ifju, and H. F. Percival. 2010. Small unmanned aircraft systems for low-altitude aerial surveys. The Journal of Wildlife Management 74(7):1614-1619.

Wing, M. G., and J. F. Tynon. 2006. Crime mapping and spatial analysis in National Forests. Journal of Forestry 104(6):293-298.

Wing, M. G., and J. F. Tynon. 2008. Revisiting the spatial analysis of crime in National Forests. Journal of Forestry 106(2):91-99.

Wing, M. G., and P. Bettinger. 2003. GIS: An updated primer on a powerful management tool. Journal of Forestry 101(4):4-8.

Wing, M. G., J. Burnett, and J. Sessions. In press. Remote sensing and unmanned aerial system technology for monitoring and quantifying forest fire impacts. International Journal of Remote Sensing.

Wing, M. G., J. Burnett, J. Sessions, J. Brungardt, V. Cordell, D. Dobler, and D. Wilson. 2013. Eyes in the sky: Remote sensing technology development using small unmanned aircraft systems. Journal of Forestry 111(5):341-347.