
Drone-based reconstruction for 3D geospatial data processing

Jason D. Renwick, Levente J. Klein, Hendrik F. Hamann
IBM TJ Watson Research Center, Yorktown Heights, NY 10598
jdrenwic@us.ibm.com, kleinl@us.ibm.com, hendrikh@us.ibm.com

Abstract—Urban air quality affects the health and well-being of more than half of the world's population. Measurement, modeling, and real-time action on pollution sources can alleviate their impact. A preliminary study of the drone-based image acquisition and processing steps required for 3D reconstruction of potential pollution sources is presented. The 3D surface terrain models combined with sensor data are inputs into air pollution models for visualization, understanding, and potential mitigation of methane plumes. Scaling these technologies across large areas requires the integration of big geospatial data with modeling, machine learning, and image processing.

Keywords-Drones; Unmanned Air Vehicle; Internet of Things; Smart Cities; Open Source; IoT; UAV; Air Quality Monitoring.

I. INTRODUCTION

Cities are a multi-system operating environment with sensors continuously acquiring data from traffic, water distribution networks, electric grids, and air quality stations. Such data is used to improve the quality of life for citizens, increase operational efficiencies, and reduce waste. Recently, the quantity of data from mobile devices, social media, and crowdsourcing has increased exponentially, providing snippets of information about the surrounding environment. Many data sets are hosted in proprietary databases where access and data formats are controlled. There is a clear need for data fusion of heterogeneous data sets, data curation, and advanced analytics so that all data can be easily accessed across multiple computational platforms and be queryable for any location and any time interval.

The Internet of Things (IoT) can be envisioned as an interoperable collection of cyber-physical devices comprising electronics, sensors/actuators, software, and network connectivity, enabling such devices to dynamically connect and exchange data over the Internet. Traditionally, IoT was associated with sensor networks which could be expanded over large areas by scaling the number of sensing nodes. Alternatively, satellites offer large-area observation of the Earth's surface that can track spatial and temporal changes considered important for many applications [1]. However, neither sensor networks nor satellite observations can fulfill the demand for simultaneous large-area coverage and high temporal resolution observations.

Drones are low-cost sensor platforms which can be flown easily using either manual control or preprogrammed routes. Drones have multiple advantages, such as being easily deployable, allowing reprogramming of flight paths during operation, and having sufficient computational power to carry out calculations during data acquisition. Drone usage has been extended to geospatial mapping, as an alternative to satellite observations, for quick observation and inspection of small areas. This can be especially important in situations like disaster management, when rapid data acquisition can save human lives and protect property.

In recent times, natural gas (methane) exploration sites have been developed in close proximity to urban areas. In the Dallas/Fort Worth area the gas exploration sites are interwoven with homes. Lately, pollution surveys have established that these sites may have contributed to increased greenhouse emissions and potentially impact human health [2]. We describe in this paper a potential framework (Figure 1) for air pollution assessment based on the integration of drone imagery, data fusion, and physical modeling.

II. DRONE TECHNOLOGY

Here, we explore the applicability of drones for air quality monitoring, especially for situations where gas emission sites are clustered. High concentrations of methane gas emitted from a single well pad may be harmless, but the close proximity of multiple emission sources can create hazardous situations. Drones are ideal for inspecting gas plumes due to their low operating cost, flexibility in flying at different altitudes, and the possibility of scheduling a flight at almost any moment. The drone-acquired high resolution imagery can be processed using open source tools and machine learning techniques. The confluence of easy drone operations and image analytics makes drone-based implementations a very attractive solution to address these challenges in the oil and gas industry.

While drones are most commonly associated with high quality image processing, they can be fitted with application-specific sensors like Light Detection And Ranging (LiDAR), spectral cameras, or acoustic or chemical sensors. The onboard sensors can complement the Global Positioning System (GPS), compass, and barometric pressure sensors which are already part of the drone's navigation system. Drone-attached sensors can acquire mobile measurements at any location in 3D space, and the flight path can be adjusted in real time to sample the spatial and temporal variations of gas plume dispersion.

Figure 1. Framework for drone imagery processing, analytics and integration with PAIRS. (Diagram: drone images, sensor data, and video, together with satellite, weather, topography, and census data, feed a data processing stage of orthorectification, stitching, 3D reconstruction, and georeferencing built on GDAL, Python, and OpenCV, with data fusion and curation in PAIRS, followed by analytics: machine learning, segmentation, video analytics, and object recognition.)

Drones may also act as a mobile information transmission link between remote sensing points having no Internet connectivity and data processing cloud platforms. The drone can download sensor information from locations with no Internet connectivity, move the data to a location where Internet access exists, and upload the data to a cloud platform. Furthermore, if the acquired data results in an actionable command, drones can take the actuation command from the cloud platform, transport it to the relevant sensing point, and transmit the information to the control system that can execute the actuation. Such commands can, for example, be turning off a gas valve or shutting down well pad operation. Visual observation by drones can ensure that the command was carried out successfully. In such scenarios, drones enable remote operation of sites where direct communication with control systems may not exist.

Apart from data collection, drones can be used as edge devices for real time processing using attached computers like a Raspberry Pi [3]. On-board data processing can enable independent decision making, the precursor for cognitive processes and autonomy. Proactive decisions based on image processing and modeling can enable machine learning and actuation. Until drones evolve to a state where they can take independent decisions, multiple issues related to automatic flight control and obstacle and collision avoidance have to be addressed. In this paper, we present the first steps in image acquisition and processing, e.g. stitching, orthorectification, georeferencing, and object recognition, and discuss the necessary steps to integrate 3D reconstruction of gas well pad sites with Computational Fluid Dynamics models.
III. HARDWARE AND SOFTWARE

The drone used for this setup was the DJI Phantom 3 Standard, which has a built-in camera, GPS module, and sensor array. This drone weighs 1.22 kg and can carry an estimated maximum payload of 1.5 kg. DJI provides an Android/iOS app for in-flight viewing through the built-in camera, image capture configuration, and drone flight configuration. For the flights outlined in this paper, the image acquisition mode was set to a 5-second interval. The exposure settings were set manually to keep uniform exposure between the images.

A. Flight Limitations

The 4480 mAh battery operated nominally at 15.2 V and allowed for a maximum flight time of 20 minutes under normal conditions. The maximum speed the drone can fly at under no-wind conditions is 16 m/s, limiting safe operation of the drone to horizontal wind conditions of 16 m/s or less. Legally, drones can fly at up to 100 mph; current state-of-the-art drone technology allows drones to fly at 80 mph while still recording high resolution imagery. The legal altitude limit for unmanned air vehicles (UAVs) is 400 feet. The remote controller used with the drone allowed for a maximum signal distance of 1000 m with direct line of sight. For image stabilization during flight, the camera was attached to a gimbal that allowed for stabilization between -90 and +30 degrees.

B. Camera

The drone is outfitted with a camera with a 1/2.3-inch 12 megapixel sensor. The images acquired were 4000x3000 pixels. The permanently attached rectilinear prime lens has a 20 mm focal length (35 mm equivalent). The rectilinear construction of the lens produced a 94-degree field of view with little barrel distortion. During the stitching process, the images were undistorted using a calculated camera calibration matrix in OpenCV [4]. For small areas of stitching, the distortion was considered negligible and the calibration step was omitted.
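These optics determine the ground footprint of each image and hence the overlap between consecutive shots. The short sketch below is our own back-of-the-envelope estimate (assuming the 94-degree figure is the diagonal field of view) for the roughly 115 m altitude used in Flight 1 later in the paper:

import math

# Ground footprint of a nadir-pointing camera; an illustrative estimate,
# assuming the quoted 94 degrees is the diagonal field of view.
fov_deg = 94.0          # lens field of view (assumed diagonal)
altitude_m = 115.0      # approximate Flight 1 altitude
aspect = (4000, 3000)   # image size in pixels (4:3)

# Diagonal ground coverage: d = 2 * h * tan(FOV / 2)
diag_m = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))

# Split the diagonal into width and height using the 4:3 aspect ratio.
ratio = math.hypot(*aspect)
width_m = diag_m * aspect[0] / ratio    # ~197 m across track
height_m = diag_m * aspect[1] / ratio   # ~148 m along track

print(f"footprint {width_m:.0f} x {height_m:.0f} m")
print(f"ground sampling distance ~{100 * width_m / aspect[0]:.1f} cm/px")

With images captured about 30 m apart, consecutive frames overlap by well over 50%, comfortably enough for feature-based stitching.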

C. Georeferencing

The processes of orthorectification and georeferencing were combined into a single transform with the use of the Geospatial Data Abstraction Library (GDAL) [5]. The open-source GDAL package supplied for Python allows for geospatial processing of images. To warp the mosaic into an orthorectified image with accurate georeferencing, a polynomial transform was applied. The georeferencing was also tested with linear, perspective, Helmert, and projective transforms; these tests produced subpar results compared to the polynomial transform. To apply the polynomial transform, at least 10 Ground Control Points (GCPs) had to be matched between the stitched mosaic and known GPS coordinates. For the experimental flights, there were no physical GCPs planted in the area covered by the drone. To acquire the GCPs needed for the transform, key image features were compared between the mosaic and a known orthorectified satellite image. The transform would yield fairly good results (Figure 2) once at least 15 GCPs were acquired. Since the process involved applying a finite transform, the resulting orthomosaic was limited by two major factors: i) the accuracy of the compared satellite imagery and ii) the quantity of GCPs acquired. For visualization and quick, simple comparison, Quantum Geographic Information System (QGIS) was used to assist with the orthorectification and georeferencing [6]. The georeferencer plugin in QGIS also allowed for GDAL script generation, so results could be reproduced in a Python shell without the use of external software. The use of the generated Python script could restrict the entire image processing framework to a single Python shell.

Figure 2. Georeferenced stitched orthomosaic (middle left) overlaid on a satellite map (right) showing accuracy of 1 m in georeferencing
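A minimal sketch of this GCP-driven workflow using the GDAL Python bindings is shown below. The file names and GCP values are placeholders; in our case the GCPs came from matching mosaic features against satellite imagery rather than surveyed markers.

from osgeo import gdal

# Each GCP ties a pixel/line position in the mosaic to map coordinates
# (WGS84 longitude/latitude here). The coordinates below are placeholders;
# at least 10 points are required, and 15+ gave acceptable results.
gcps = [
    gdal.GCP(-97.354, 32.752, 0, 102.5, 87.0),   # x (lon), y (lat), z, pixel, line
    gdal.GCP(-97.351, 32.752, 0, 3890.0, 95.5),
    gdal.GCP(-97.354, 32.749, 0, 110.0, 2930.0),
    # ... more GCPs in practice
]

# Attach the GCPs to the stitched mosaic, then warp with a second-order
# polynomial transform into an orthorectified, georeferenced GeoTIFF.
src = gdal.Open("mosaic.tif")
tagged = gdal.Translate("/vsimem/mosaic_gcps.tif", src,
                        outputSRS="EPSG:4326", GCPs=gcps)
gdal.Warp("mosaic_ortho.tif", tagged,
          dstSRS="EPSG:4326", polynomialOrder=2, resampleAlg="bilinear")

The same steps can be exported from the QGIS georeferencer as a GDAL command line and replayed in a Python shell.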
D. Methane Leak Detection

Traditional methane leak detection has been done with a thermal/infrared camera. These cameras have been retrofitted to drones to suit additional applications such as power grid inspection [7]. The drone hardware described previously can be combined with current infrared imaging technology to provide a detailed thermal survey of a large area. Thermal imaging can be combined with the RGB georeferenced orthomosaic to produce an accurate multi-spectral high resolution geospatial layer. The resulting images can guide more precise inspection, detection, or compliance enforcement of chemical leaks [8]. The spectral imagery obtained from the infrared camera can be supplemented by chemical sensors on the drone to corroborate the potential leaks detected.
E. Software

Open source tools, Python along with OpenCV 2.4.13, were primarily used in producing the stitched mosaic of aerial images. The image stitching algorithm (Figure 3) used a combination of native OpenCV functions. It is a simplified version of the invariant-feature panoramic stitching algorithm [9]. All of the images are matched to a reference image. After the keypoints are detected and matched, the second image is warped towards the reference image using a perspective transform. After every iteration, the reference image is updated and the remaining images are matched to the new reference. This is repeated as long as sufficient matches are found between an image in the pool and the reference image for that iteration. For small scale applications, the Python wrapper was the most convenient to use, with strong online support. For larger applications, the OpenCV framework would be better used with C++, as it allows for easier access to parallel Graphics Processing Unit (GPU) processing and memory allocation.
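A condensed sketch of one iteration of this loop, following the stages in Figure 3 (undistort, detect, match, estimate homography, warp), is shown below. Modern cv2 function names and ORB features are used for illustration; the camera matrix and distortion coefficients are assumed to come from a prior calibration, and the canvas sizing is deliberately crude.

import cv2
import numpy as np

def grow_mosaic(reference, image, camera_matrix, dist_coeffs, min_matches=40):
    """Warp `image` onto `reference`; returns the grown mosaic or None."""
    image = cv2.undistort(image, camera_matrix, dist_coeffs)    # [CameraCalibrate]

    orb = cv2.ORB_create(4000)                                  # [FeatureDetect]
    kp_ref, des_ref = orb.detectAndCompute(reference, None)
    kp_img, des_img = orb.detectAndCompute(image, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # [Matcher]
    matches = matcher.match(des_img, des_ref)
    if len(matches) < min_matches:
        return None   # not enough overlap; leave the image in the pool

    src = np.float32([kp_img[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)        # [FindHomography]

    # Warp the new image into the reference frame on an enlarged canvas,
    # then paste the reference on top; the result is the new reference.
    h, w = reference.shape[:2]
    mosaic = cv2.warpPerspective(image, H, (2 * w, 2 * h))      # [Warp/Combine]
    mosaic[:h, :w][reference > 0] = reference[reference > 0]
    return mosaic

In the full pipeline this function is applied repeatedly, with images drawn from the pool and the returned mosaic used as the reference for the next iteration.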

Figure 3. Algorithm for image stitching (pipeline: remove distortion [CameraCalibrate], resize image [Resize], find key points [FeatureDetect], match features [Matcher], obtain homography [FindHomography], warp/combine images)

Figure 4. Examples of 3D reconstruction with 2D drone images

IV. EXPERIMENTAL FLIGHTS AND DATA

A. Flight 1

The drone was flown along the outside of IBM's TJ Watson Research Center to acquire aerial imagery at a height of approximately 115 m. The image capture was done manually and images were captured approximately 30 m apart. While in flight, the camera was aimed perpendicular to the Earth's surface, i.e. at -90 degrees from the drone. These images were used in the construction of the stitched orthomosaic.

B. Flight 2

The drone was flown at varying heights between 3-15 m around an object or structure to capture terrestrial imagery. While in flight, the camera was always pointed at the object and images were acquired at angles between -90 and 0 degrees to the object. These images were used for the 3D reconstruction (Figure 4) [10].

V. DATA PROCESSING

Drones can acquire images at a 10 cm or better spatial resolution, thus generating a few tens of gigabytes of data during a single flight. During drone flight, apart from image acquisition, the on-board sensor data can also be acquired: longitude, latitude, altitude, velocity, ground speed, accelerometer, gyroscope, barometer, quaternion, roll, pitch, yaw, and compass readings. These point data offer contextual information for the imagery; e.g., the flight path can be used to determine the best future flight route. Data management is a critical aspect in applications where multiple data sets are acquired across the same geographical region.
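One convenient way to organize such point data for location- and time-based lookup is sketched below; the field names and the quantized spatial key are illustrative choices of ours, not a format used by the drone or by PAIRS.

from dataclasses import dataclass

@dataclass
class TelemetrySample:
    """One on-board sensor reading, logged alongside an image capture."""
    timestamp: float        # UNIX seconds; links the sample to an image
    longitude: float        # degrees
    latitude: float         # degrees
    altitude: float         # metres above takeoff
    ground_speed: float     # m/s
    roll: float             # degrees
    pitch: float            # degrees
    yaw: float              # compass heading, degrees

    def geo_key(self, cell_deg=1e-4):
        """Quantized (lon, lat) cell, a crude key for spatial indexing."""
        return (round(self.longitude / cell_deg),
                round(self.latitude / cell_deg))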
To host the drone imagery, we are using a big data geospatial platform called Physical Analytics Integrated Repository and Services (PAIRS). It provides access to more than 20 spatially and temporally aligned data layers [1]. Presently, PAIRS hosts more than 500 TByte of geospatial data, with more than 5 TByte of new data added weekly. All data layers in PAIRS are curated and indexed. Indexing each data point allows searching every geospatial data layer and extracting only information of interest, e.g. "give me all houses that are within 30 m of a street and have vegetation in close proximity while the terrain is on a slope of 5%". Traditionally, such calculations require massive amounts of data download and processing, but in our case the PAIRS database handles all these cross-layer queries.

Additionally, PAIRS is a computational platform based on highly parallelized processing implemented on Hadoop and Spark, for leveraging built-in machine learning algorithms. To train machine learning algorithms, access to historical data sets is required. PAIRS hosts such geospatially referenced data layers to train object detection and feature recognition. We integrated drone imagery into PAIRS by creating contiguous large area images. Once included in PAIRS, the drone images are aligned with all existing data layers, such as satellite, weather, and topography layers. PAIRS also serves as the computation platform for drone image classification and object recognition using Python, GDAL, and OpenCV. Currently, image segmentation and classification are implemented to identify and delineate features in the drone imagery, like vegetation, buildings, cars, and/or road networks. The drone-extracted information can be validated against PAIRS-hosted data like road networks, population census, or building layouts [1].
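To make the cross-layer query above concrete, the sketch below expresses the same houses-near-streets filter as raster algebra over co-registered grids. It illustrates the kind of computation PAIRS performs server-side; it is not the PAIRS API, and the 1 m cell size and 10 m vegetation-proximity threshold are our assumptions.

import numpy as np
from scipy import ndimage

def query_houses(houses, streets, vegetation, slope_pct, cell_m=1.0):
    """Mask of house pixels within 30 m of a street, near vegetation,
    and on terrain with at least a 5% slope. All inputs are co-registered
    2D arrays; `houses`, `streets`, `vegetation` are boolean masks."""
    # Distance (metres) from every cell to the nearest street cell.
    dist_to_street = ndimage.distance_transform_edt(~streets) * cell_m
    # "Close proximity" to vegetation, taken here as within ~10 m.
    near_vegetation = ndimage.binary_dilation(
        vegetation, iterations=int(10 / cell_m))
    return (houses & (dist_to_street <= 30.0)
            & near_vegetation & (slope_pct >= 5.0))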

Figure 5. Overlay of multiple georeferenced data layers in PAIRS

VI. AIR POLLUTION MONITORING

We propose drone imagery as a quick and precise way to improve air pollution modeling in urban areas. There are more than 200 well pads with methane emission potential positioned in and around the city perimeter of Fort Worth, TX. Since prevalent winds in the area can move methane leak plumes towards schools, homes for the elderly, and hospitals, any modeling needs to consider the geospatial component of the detection. Surface terrain models extracted from drone imagery help in modeling wind turbulence around buildings and infrastructure; these are necessary inputs for pollution models like QUIC, AERMOD, and CALPUFF [11], [12], [13]. The 3D reconstruction of a city neighborhood or well pads can capture the current ground conditions. Furthermore, wind and methane concentration measurements from IoT sensor networks deployed around the area can be used as input into dispersion models to constrain them and ensure accuracy in methane level modeling. Computational Fluid Dynamics combined with accurate 3D reconstruction of the region of interest and IoT sensor network measurements can provide precise estimates of city level pollution.
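The dispersion models cited above are far more sophisticated, but a textbook Gaussian plume, sketched below, shows where these inputs enter: a source strength for the well pad, wind speed from the IoT sensors, and dispersion coefficients that terrain and stability conditions determine. The numbers in the example are placeholders, not measurements.

import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (kg/m^3) with ground
    reflection. Q: emission rate (kg/s); u: wind speed (m/s); y, z:
    crosswind and vertical receptor coordinates (m); H: effective source
    height (m); sigma_y, sigma_z: dispersion coefficients (m), which in
    practice grow with downwind distance and atmospheric stability."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Placeholder example: a 0.1 kg/s leak at 2 m height, 3 m/s wind, receptor
# on the plume centerline at ground level; sigma values stand in for a
# neutral stability class at ~200 m downwind.
c = gaussian_plume(Q=0.1, u=3.0, y=0.0, z=0.0, H=2.0, sigma_y=20.0, sigma_z=12.0)
print(f"~{c * 1e6:.0f} mg/m^3")

The 3D surface models and CFD runs described above refine exactly the quantities this simple model treats as constants: the effective wind field and the dispersion around buildings.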
VII. DISCUSSION

Some of the steps involved in using drones as low-cost data acquisition tools were discussed; however, there are limitations that restrict the versatility of such a system. One obvious limitation is the physical constraints of a UAV, particularly flight constraints. Current state-of-the-art drone technology allows for a maximum of 20-30 minutes of flight time on a full battery. The limited battery lifetime requires careful planning of the drone flight. In manual or semi-automatic flights, excessive data is acquired, since during flight it is difficult to estimate whether consecutive images have sufficient overlap for stitching or whether images are suitable for 3D reconstruction. Also, the autonomous capabilities of drones are not standard across all available UAVs. This implies that drones in flight may have different decision making processes when faced with an obstacle. This lack of predictability may restrict the ability to have multiple concurrent drone data acquisition networks. Work related to drone autonomy involves enhancing drones with sensors and onboard processing capabilities [14], [15]. In the future, drones may be capable of automatically determining the best route, number of snapshots, and viewing angle to acquire images for specific applications. Such cognitive decision making capabilities are currently being investigated.

Additionally, a uniform standard regarding the operation of drones is needed. Recently, the FAA released regulations outlining the restrictions related to UAV operation, but they do not describe, in detail, the standard for drone control in low altitude airspace. One can argue that the rules for operating a drone in a city should differ from those for operating a drone in the countryside. Solutions or implementations related to this limitation are presently being explored with traffic management systems for UAV operation [16].

VIII. CONCLUSION

In this paper, we presented the steps involved in processing aerial images captured by drones through open source software. Additionally, the large volume of data collected was shown to be handled and curated by IBM's PAIRS platform. These steps are necessary towards integrating drones into air pollution monitoring frameworks and models. Furthering this work will involve outfitting a drone with additional sensors and reconstructing methane plumes.

ACKNOWLEDGMENT

The authors would like to thank Fernando Marianno, Conrad Albrecht, Xiaoyan Shao, Michael Schappert, and other researchers at IBM's Thomas J Watson center for their assistance in processing and understanding geospatial data, guidance through the progression of the drone technology work, and intellectually engaging discussions; Norma Sosa for facilitating and supporting the Geospatial UAV project; Nicholas Fuller for orchestrating the UWI-IBM internship; and Shamin Renwick for her guidance and support in culminating this paper.

Part of this report was prepared as an account of work sponsored by an agency of the United States government. Neither the United States government nor any agency thereof, nor any of their employees, makes any warranty, expressed or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States government or any agency thereof.

REFERENCES

[1] L. J. Klein, F. J. Marianno, C. M. Albrecht, M. Freitag, S. Lu, N. Hinds, X. Shao, S. B. Rodriguez, and H. F. Hamann, "PAIRS: A scalable geo-spatial data analytics platform," in 2015 IEEE International Conference on Big Data (IEEE BigData 2015). IEEE, 2015, pp. 1290–1298.

[2] D. R. Lyon, D. Zavala-Araiza, R. A. Alvarez, R. Harriss, V. Palacios, X. Lan, R. Talbot, T. Lavoie, P. Shepson, T. I. Yacovitch et al., "Constructing a spatially resolved methane emission inventory for the Barnett Shale region," Environmental Science & Technology, vol. 49, no. 13, pp. 8147–8157, 2015.

[3] W. Zhou, D. Nair, O. Gunawan, T. van Kessel, and H. F. Hamann, "A testing platform for on-drone computation," in 2015 33rd IEEE International Conference on Computer Design (ICCD). IEEE, 2015, pp. 732–735.

[4] (n.d.) Welcome to OpenCV documentation. OpenCV (Open Source Computer Vision). Accessed Jul. 30, 2016. [Online]. Available: http://docs.opencv.org/2.4/index.html.

[5] (n.d.) GDAL - Geospatial Data Abstraction Library. Open Source Geospatial Foundation (OSGeo). Accessed Jul. 30, 2016. [Online]. Available: http://www.gdal.org.

[6] (n.d.) Discover QGIS. QGIS Development Team. Accessed Jul. 30, 2016. [Online]. Available: http://www.qgis.org/en/site/about/features.html.

[7] J. Zhang, L. Liu, B. Wang, X. Chen, Q. Wang, and T. Zheng, "High speed automatic power line detection and tracking for a UAV-based inspection," in 2012 International Conference on Industrial Control and Electronics Engineering (ICICEE), Aug. 2012, pp. 266–269.

[8] A. P. Pacsi, N. S. Alhajeri, D. Zavala-Araiza, M. D. Webster, and D. T. Allen, "Regional air quality impacts of increased natural gas production and use in Texas," Environmental Science & Technology, vol. 47, no. 7, pp. 3521–3527, 2013.

[9] M. Brown and D. G. Lowe, "Automatic panoramic image stitching using invariant features," International Journal of Computer Vision, vol. 74, no. 1, pp. 59–73, 2007.

[10] (n.d.) Pix4Dmapper Discovery. Pix4D. Accessed Jul. 30, 2016. [Online]. Available: https://pix4d.com/product/pix4dmapper-discovery.

[11] Quick Urban & Industrial Complex (QUIC) Dispersion Modeling System. (n.d.). Los Alamos National Laboratory. Accessed Jul. 29, 2016. [Online]. Available: http://www.lanl.gov/projects/quic.

[12] A. J. Cimorelli, S. G. Perry, A. Venkatram, J. C. Weil, R. J. Paine, and W. D. Peters, "AERMOD: Description of model formulation," Aerosp. Corp., Los Angeles, CA, Tech. Rep. EPA-454/R-03-004, Sep. 2004. [Online]. Available: http://www.epa.gov/scram001/7thconf/aermod/aermod_mfd.pdf.

[13] Official CALPUFF modeling system. (n.d.). Exponent: Engineering and Scientific Consulting. Accessed Jul. 29, 2016. [Online]. Available: http://www.src.com.

[14] A. Bürkle, F. Segor, and M. Kollmann, "Towards autonomous micro UAV swarms," Journal of Intelligent & Robotic Systems, vol. 61, no. 1, pp. 339–353, 2011. [Online]. Available: http://dx.doi.org/10.1007/s10846-010-9492-x.

[15] G. Vásárhelyi, C. Virágh, G. Somorjai, N. Tarcai, T. Szörényi, T. Nepusz, and T. Vicsek, "Outdoor flocking and formation flight with autonomous aerial robots," in 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2014, pp. 3866–3873.

[16] T. Prevot, J. Rios, P. Kopardekar, J. E. Robinson III, M. Johnson, and J. Jung, "UAS traffic management (UTM) concept of operations to safely enable low altitude flight operations," in 16th AIAA Aviation Technology, Integration, and Operations Conference. AIAA, 2016.

