Submitted in Fulfillment of the Requirements of the Lecture EL8002 Topik Lanjut di Teknik Elektro (Advanced Topics in Electrical Engineering) - SEM II 2006/2007
Table of Contents
Introduction
Research Description
    Problem Definition
    Research Objective
    Research Methodology
    Latest Progress
Agricultural Remote Sensing Basics
    Remote Sensing . . . How You Can Use It On Your Farm
    The Electromagnetic Spectrum
    Electromagnetic Energy and Plants
    How Does Remote Sensing Work?
    Remote Sensing: The Complete Process
Near Infra Red in Simple Remote Sensing for Agriculture
    What is Near Infrared?
    What does NIR tell us?
    The Digital Advantage
    Testing for IR Sensitivity
    NIR Images
The Proposed System
    The Autopilot System
        Control Algorithm
        Sensor Selection
        Autopilot Hardware Design and Prototyping
    Hardware in the Loop Simulation
        HIL Simulator Development
        HIL Simulator Utilization
    The Flight Planner Software
Preliminary Test and Concluding Remark
References
Introduction
Agriculture is one of the main sources of income in Indonesia, and most Indonesian citizens work in the agricultural sector. Despite this importance, good agricultural practices are still lacking in Indonesia.

One of the emerging practices in agriculture is "Precision Agriculture", which refers to the use of an information and technology based system for the field management of crops. Such a system helps the farmer make the right decisions. This approach basically means applying the right amount of treatment at the right time and the right location within a field: that is the precision part. Farmers want to know the right amounts of water, chemicals, pesticides, and herbicides to use, as well as precisely where and when to apply them. By using the tools of precision agriculture, farmers can specifically target areas of need within their fields and apply just the right amounts of chemicals where and when they are needed, saving both time and money and minimizing their impact on the environment.

Irrigation is both difficult and expensive, and gets even more difficult when the terrain is graded. Farmers tend to over irrigate, spending more time and money than necessary. Often farmers look at weather variables and schedule irrigation based on that information; with better information, they could use scientific models and equations to compute more precisely how much water their crop is using or how much more is needed. All of this requires an accurate map of the field.

Much of the ability to implement precision agriculture is based on information technologies, in particular global positioning and navigation, and geospatial / remote sensing mapping and analysis.
An optimum remote sensing system for precision agriculture would provide data as often as twice per week for irrigation scheduling and once every two weeks for general crop damage detection. The spatial resolution of the data should be as high as 2 to 5 square meters per pixel, with positional accuracy within 2 meters. Additionally, the data must be available to the farmer within 24 hours of acquisition. Turnaround time is more important to farmers than data accuracy: they would gladly accept remote sensing measurements that are as poor as 75 percent accurate if they were assured of getting them within 24 hours of acquisition. Unfortunately, no Earth orbiting satellite can currently meet all of a precision farmer's requirements.

This is where the autonomous Unmanned Aerial Vehicle (UAV) plays its role. This research proposes a relatively low cost autonomous UAV that enables farmers to make just-in-time aerial photo mosaics of their crops, produced at relatively low cost and within the 24 hour acquisition constraint. The UAV will be equipped with a payload management system specifically developed for rapid aerial mapping. Since turnaround time is the key factor, accuracy is not the main focus (the aerial maps are not orthorectified). The system will also include software to post process the aerial photos into a single mosaic aerial photo map.
Research Description
Problem Definition
1. Aerial photos of crop fields are needed to enable farmers to make the right decisions about the kind and amount of treatment at the right time and the right location within a field.
2. Turnaround time of the aerial photo is more important to farmers than data accuracy; the farmers usually need the information within 24 hours of acquisition.
3. A system that gives farmers a fast aerial photo turnaround time for their crop fields is therefore needed.
4. Cost is an important matter; this includes low initial investment and low operational and maintenance costs.
5. Ease of operation is also important, considering the available quality of human resources.
Research Objective
1. Design and implement a UAV platform small enough to be operated from a typical crop field in Indonesia without the need for a special airstrip. The proposed launching method is by hand, so the UAV platform should be capable of short take off and landing (STOL).
2. Design and implement a low cost autonomous autopilot navigation system that automatically navigates the UAV over the crop field to produce the mosaic aerial photo.
3. Design and implement simple flight planning software that generates waypoints covering the crop field optimally.
4. Design and implement a payload management system onboard the UAV that enables automatic timing of the digital camera shutter release for aerial photo taking.
5. Design and implement post processing software that automates the mosaicking of the aerial photos, so that the turnaround time requirement is fulfilled.
6. For future development: development of payloads for precision agriculture other than a digital camera (such as biochemical sensors, environmental sensors, weather sensors, etc.)
Research Methodology
Development of the Autonomous Unmanned Aerial Vehicle

1. Airframe : the aerial platform that will be instrumented with the autonomous autopilot system and carry the mission specific payload (automatic flight planner for low altitude photography and a digital camera).
2. Attitude, Heading and Position Reference System (AHPRS) : the main reference input for the autonomous autopilot system. The AHPRS outputs Euler angles (roll, pitch and yaw), true north absolute heading, and position (latitude, longitude and altitude).
3. Autopilot System : the main controller of the airframe. It consists of two main parts: a low level control system that governs the pose / attitude of the aircraft based on the objective trajectories, and a waypoint sequencer that determines which location the airframe should go to (latitude, longitude and altitude), and thus determines the trajectory inputs to the control system.
4. Digital Camera Payload Management System and Automatic Flight Planner : the Flight Planner component first generates waypoints (latitude, longitude and altitude) that optimally cover the area of interest to be photographed. Inputs to this system are the boundary of the area (latitude and longitude), the scale of the desired aerial photo, and the horizontal and vertical overlap of each photo segment; the system then automatically determines the altitude and the sequencing of the digital camera shutter release. The Digital Camera Payload Management component simply commands the shutter release sequence and logs the exact time and position of each shutter release (usually known as metadata; this information is needed for post processing and automatic rapid mosaicking).
5. Ground Station Software : this component enables the operator to plan and monitor the mission execution of the Autonomous Unmanned Aerial Vehicle, as well as reconfigure the mission during execution. Monitoring is done in real time because a high speed, long range data modem is used to transmit and receive mission parameters between the UAV and the ground station.
6. Final Integration : these steps are taken when all supporting components of the UAV are completely developed.

Development of the Post Processing Software

Automatic Photo Mosaicking : this component automatically combines the aerial photographs covering small areas, along with their metadata (longitude, latitude and altitude of the digital camera), into a single large aerial photograph that covers a larger area.

Aerial Photo based Agriculture Information System : this is the tool the farmers will use to analyze the aerial photograph and support decision making about the crop field. Remote Sensing and Geographic Information System concepts are involved in this system, along with Precision Agriculture good practices.
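The metadata-driven placement at the heart of the mosaicking step can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual implementation: it uses a flat-earth approximation (acceptable over a single field), assumes a known ground sample distance, and the function name `photo_origin_px` is hypothetical.

```python
import math

def photo_origin_px(lat, lon, lat0, lon0, gsd_m):
    """Pixel offset of one photo inside the mosaic, from its GPS metadata.

    (lat0, lon0): geographic origin of the mosaic (its top-left corner).
    gsd_m: ground sample distance of the mosaic in metres per pixel.
    Uses a flat-earth approximation, adequate over a single crop field.
    """
    m_per_deg_lat = 111320.0                                   # approximate
    m_per_deg_lon = 111320.0 * math.cos(math.radians(lat0))
    x_px = (lon - lon0) * m_per_deg_lon / gsd_m
    y_px = (lat0 - lat) * m_per_deg_lat / gsd_m                # image y grows southward
    return round(x_px), round(y_px)
```

Each photo is then pasted into the mosaic canvas at the offset returned for its shutter-release position.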
Overall System Testing : this step is conducted after the Unmanned Aerial System and the Post Processing software are completely developed. The objective is to provide feedback to the overall research and development effort and to publicize the system to its potential users: the farmers.
Latest Progress
Not all of these tasks have been completed as of this reporting phase. The airframe will be developed after the completion of the other systems; for now a simple remote controlled trainer 60 airframe is used for development purposes. The AHPRS is still under development (covered in a separate research report); in the meantime the attitude reference for the autopilot uses thermopile sensors. The autopilot system was developed in previous research and is used as the basis. The Digital Camera Payload Management System and the Automatic Flight Planner have been completely developed and tested, as has the Ground Station Software.
The distance from the peak of one wave to the peak of the next wave is the wavelength. The range of energy from sunlight is called the electromagnetic spectrum. The wavelengths used in most agricultural remote sensing applications cover only a small region of this spectrum. Wavelengths are measured in micrometers (µm) or nanometers (nm); one µm is about 0.00003937 inch, and 1 µm equals 1,000 nm. The visible region of the electromagnetic spectrum extends from about 400 nm to about 700 nm. The green color associated with plant vigor has a wavelength that centers near 500 nm.
Wavelengths longer than those in the visible region and up to about 25 µm are in the infrared region. The infrared region nearest the visible region is the near infrared (NIR) region. Both the visible and infrared regions are used in agricultural remote sensing.
Plant leaves absorb sunlight mainly in the blue and red wavelengths, and the green color is reflected. Sunlight that is not reflected or absorbed is transmitted through the leaves to the ground. Interactions between reflected, absorbed, and transmitted energy can be detected by remote sensing. The differences in leaf colors, textures, shapes, or even how the leaves are attached to plants determine how much energy is reflected, absorbed, or transmitted. The relationship between reflected, absorbed, and transmitted energy is used to determine the spectral signatures of individual plants; spectral signatures are unique to plant species. Remote sensing identifies stressed areas in fields by first establishing the spectral signatures of healthy plants, because the spectral signatures of stressed plants appear altered from those of healthy plants. The following figure compares the spectral signatures of healthy and stressed sugarbeets.
Stressed sugarbeets have a higher reflectance value in the visible region of the spectrum, from 400-700 nm. This pattern is reversed in the nonvisible range from about 750-1200 nm, and the visible pattern repeats in the higher reflectance range from about 1300-2400 nm. Reflectance values at various wavelengths can therefore be interpreted to assess crop health. The comparison of reflectance values at different wavelengths, called a vegetative index, is commonly used to determine plant vigor. The most common vegetative index is the normalized difference vegetative index (NDVI), which compares the reflectance values of the red and NIR regions of the electromagnetic spectrum. The NDVI value of each area on an image helps identify areas of varying levels of plant vigor within fields.
There are several factors to consider when choosing a remote sensing system for a particular application, including spatial resolution, spectral resolution, radiometric resolution, and temporal resolution.

Spatial resolution refers to the size of the smallest object that can be detected in an image. The basic unit in an image is called a pixel. One-meter spatial resolution means each pixel in the image represents an area of one square meter; the smaller the area represented by one pixel, the higher the resolution of the image.

Spectral resolution refers to the number of bands and the wavelength width of each band, where a band is a narrow portion of the electromagnetic spectrum. Shorter wavelength widths can be distinguished in higher spectral resolution images. Multi-spectral imagery measures several wavelength bands, such as visible green or NIR; the Landsat, Quickbird and Spot satellites use multi-spectral sensors. Hyperspectral imagery measures energy in narrower and more numerous bands than multi-spectral imagery. The narrow bands of hyperspectral imagery are more sensitive to variations in energy wavelengths and therefore have a greater potential to detect crop stress than multi-spectral imagery. Multi-spectral and hyperspectral imagery are used together to provide a more complete picture of crop conditions.

Radiometric resolution refers to the sensitivity of a remote sensor to variations in reflectance levels. The higher the radiometric resolution of a remote sensor, the more sensitive it is to small differences in reflectance values, and the more precise the picture it can provide of a specific portion of the electromagnetic spectrum.

Temporal resolution refers to how often a remote sensing platform can provide coverage of an area. Geo-stationary satellites can provide continuous sensing, while normal orbiting satellites can only provide data each time they pass over an area.
Remote sensing from cameras mounted on airplanes is often used for applications requiring more frequent sensing, although cloud cover can interfere with scheduled acquisitions. Remote sensors located in fields or attached to agricultural equipment can provide the most frequent temporal resolution.
By measuring the amount of reflected NIR and Red wavelengths, it is possible to determine the vegetative health of the plants in an image. There are a number of different methods for quantifying the relationship between NIR / Red reflectance and plant health; these methods are called vegetative indices.
Vegetative Indices
The following is a short list of some common indices:
This index is simply the ratio of NIR to Red reflectance (NIR / Red). It has a typical range of around 1 for bare soil and greater than 20 for thick vegetation.
The NDVI, computed as (NIR - Red) / (NIR + Red), is the most commonly used vegetative index in remote sensing. Because the difference is divided by the sum of the reflectances, the output is normalized, which overcomes problems with varying light levels at the time the measurements were taken. Typical values lie between 0 for no vegetation and 1 for dense vegetation. Negative values are essentially only possible over water bodies, since 99.9% of land features reflect more NIR than Red.
The SAVI, computed as ((NIR - Red) / (NIR + Red + L)) * (1 + L), is almost identical to the NDVI, but it adds a soil adjustment factor L that takes soil brightness into account. The value of L varies between 0 and 1; a factor of 0.5 is a common approximation when the correct value is not known. These vegetation indices can be very useful: many studies have shown a strong correlation between an index and a crop's overall health, and recent studies have even used reflectance measurements to determine nitrogen content.
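The three indices above translate directly into code. A minimal sketch (the reflectance values in the comments are illustrative, not measured data):

```python
def rvi(nir, red):
    """Ratio Vegetation Index: the simple NIR / Red ratio."""
    return nir / red

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index; L = 0.5 when soil brightness is unknown."""
    return (nir - red) / (nir + red + L) * (1.0 + L)

# Illustrative reflectances: dense vegetation reflects much more NIR than Red,
# so ndvi(0.50, 0.08) is close to 1, while bare soil ndvi(0.25, 0.20) is near 0.
```

Applied per pixel over a NIR / Red image pair, these functions produce the vigor maps discussed earlier.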
In order to use a digital camera for NIR work, a special external filter is needed to block all visible light and allow only IR to pass through. These filters, known as IR pass filters, have been used for years by film-based IR photographers and are available at many camera stores. Since digital cameras use no film and all image information is recorded at the CCD, they provide many advantages over film-based IR photography:
o Instant images
o Proper exposures, since the camera uses its CCD to measure light and set the shutter speed and aperture
o Affordability: no more expensive IR film
NIR Images
The NIR images that you receive from a digital camera are monochrome, meaning they appear like black and white photos. It is possible to make the image look like an 'Ektachrome' color IR photo by combining a standard RGB photo with the IR image in Adobe Photoshop (see the tutorial for Photoshop). Taking both images can easily be done with a tripod and a single camera where one image is taken with the filter and the other without. One method we are testing is using two digital cameras in tandem, attached to a helium filled blimp. These cameras are then operated from the ground by a serial connection and images are stored on flash memory cards. By using consumer digital cameras to obtain IR images, a farmer could detect crop stress early and receive images instantly at a fraction of the cost of other methods.
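The RGB + NIR combination described above amounts to a channel remap: the classic color-infrared look places NIR in the red channel, red in green, and green in blue. A toy sketch on nested lists (a real implementation would operate on image arrays; the function name `cir_composite` is hypothetical):

```python
def cir_composite(nir, rgb):
    """Build a color-infrared composite: R <- NIR, G <- red, B <- green.

    nir: 2-D list of NIR intensities (from the filtered camera).
    rgb: 2-D list of (r, g, b) tuples (from the unfiltered camera).
    Both images are assumed to be aligned and the same size.
    """
    return [
        [(nir[y][x], rgb[y][x][0], rgb[y][x][1]) for x in range(len(nir[0]))]
        for y in range(len(nir))
    ]
```

Healthy vegetation, which reflects strongly in NIR, then appears bright red in the composite, which is why this rendering is popular for crop inspection.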
Sensor Selection
Based on the Control Algorithm Development step, there are several measurements needed by the PID control scenarios: position measurements and attitude measurements. Position measurements are speed, latitude, longitude, altitude and heading; attitude measurements are the roll, pitch and yaw angles.
For acquiring position measurements, a GPS receiver is used. The uBlox TIM-LA was chosen because it is relatively low cost and can provide position information (speed, latitude, longitude, altitude and heading) four times per second (4 Hz).

For measuring the roll and pitch angles, the best solution would be an Attitude and Heading Reference System (AHRS). An AHRS consists of inertial sensors (gyroscopes and accelerometers) and a magnetic field sensor (magnetometer); a strapdown inertial navigation mechanization and a proprietary fusion algorithm are usually used to combine the sensor readings into reliable attitude information. However, commercially available AHRS units are beyond this research budget.

A simple, low cost alternative is a pair of thermopile sensors that sense the horizon. The idea comes from Co-Pilot™, an auxiliary device used to train beginner aeromodellers. The basic principles are: a thermopile sensor can sense the temperature of a remote object, and the sky typically has a lower temperature than the ground. By installing thermopile sensors on the roll and pitch axes (4 thermopile sensors in total), during level flight all sensors see approximately the same amount of sky and ground, so their outputs are approximately equal. During a pitch up (nose up) attitude, the forward facing thermopile sees more sky than ground, while the backward facing thermopile sees more ground than sky, so the forward facing sensor senses a cooler temperature than the backward facing one. From the difference between the two sensors, the pitch angle can be calculated. The same principle applies to the roll axis.
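The differential principle just described can be sketched as follows. This is an illustrative approximation, not the autopilot's actual code: the linear gain `k`, the normalization, and the sign conventions are assumptions that would be calibrated against a known attitude in practice.

```python
def thermopile_angle(t_toward, t_away, k=45.0):
    """Estimate a tilt angle from one opposed thermopile pair.

    t_toward / t_away: remote temperatures seen by the two sensors on
    one axis; k is an assumed linear gain (degrees) mapping the
    normalized differential to an angle.
    """
    total = t_toward + t_away
    if total == 0:
        return 0.0
    return k * (t_toward - t_away) / total

def attitude(front, back, left, right):
    # Nose up: the forward sensor sees more (cooler) sky, so back > front.
    pitch = thermopile_angle(back, front)
    # Right wing down: the right sensor sees more (warmer) ground.
    roll = thermopile_angle(right, left)
    return pitch, roll
```

In level flight all four readings match and both angles are zero; any imbalance on an axis produces a proportional angle estimate.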
The thermopile sensor used in this prototype is the Melexis MLX90247. For sensing heading (yaw), a rate gyroscope is used; the absolute heading offset for yaw rate integration is taken from the GPS heading. The yaw rate gyro used in this prototype is the Analog Devices ADXRS401.
In the case of UAV autopilot system development, a HIL simulator can be developed to simulate the flight characteristics of the airframe, including the sensor outputs and the control input signals. The UAV autopilot hardware can then be connected to the HIL simulator to see how the overall system works as a closed loop. Here the PID gain parameters, as well as other system parameters, can be tuned while watching their effect on the airframe in the HIL simulator.
X-Plane breaks the aircraft down into many small areas and computes the forces on each one of these areas, ensuring the entire plane is affected in some way by external forces. This approach produces figures far more accurate than those achieved by taking averages over an entire airfoil, for example. It also yields precise physical properties that can be computed very quickly during flight, ultimately resulting in a much more realistic flight model. The accuracy of X-Plane's flight model has been approved by the FAA for full motion simulators used to train commercial airline pilots.

X-Plane's functionality can be customized using plug-ins. A plug-in is executable code that runs inside X-Plane, extending what X-Plane does. Plug-ins are modular, allowing developers to extend the simulator without having its source code; they can extend the flight simulator's capabilities or gain access to the simulator's data. For HIL simulation purposes we need a plug-in that:
o reads attitude and position data to simulate sensor measurements
o writes control surface deflection values to simulate servo commands
o communicates with the autopilot hardware through some kind of hardware interface (to deliver sensor measurements and accept servo commands)
The HIL simulator consists of 3 main parts : sensors, process, and actuators.
The Sensors part simulates the sensor output data on the airframe; this data is processed by the UAV autopilot hardware as input. The sensor outputs that the HIL simulator must produce are position data (speed, altitude, latitude and longitude) and attitude data (roll, pitch and yaw). This is accomplished by reading data from the simulator.

The Actuators part simulates how the UAV autopilot hardware changes the control surfaces of the airframe (aileron, elevator and rudder) and the throttle position. In a real world application this is done by controlling the hobby servos installed on the corresponding
control surface or engine throttle. In the HIL simulator this is done by writing data to X-Plane that affects the control surfaces of the simulated airframe.

The Process part simulates how the airframe reacts to the input given by the UAV autopilot hardware, so this part is basically where the system dynamic model (transfer function) belongs. Generally this is the most complex part of a HIL simulator, but fortunately it is already provided by X-Plane (through its blade element approach). The HIL simulator plug-in communicates with the UAV autopilot hardware through RS232 serial communication.
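One iteration of the resulting closed loop can be sketched as follows. The dictionaries stand in for the frames exchanged over the RS232 link; the field names, gains, and PID structure are illustrative assumptions, not the actual autopilot firmware.

```python
class PID:
    """Textbook PID controller, stepped at a fixed or measured dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def hil_step(sensors, roll_pid, pitch_pid, targets, dt):
    """One HIL iteration: simulated sensor frame in, servo command frame out."""
    aileron = roll_pid.step(targets["roll"] - sensors["roll"], dt)
    elevator = pitch_pid.step(targets["pitch"] - sensors["pitch"], dt)
    clip = lambda x: max(-1.0, min(1.0, x))    # normalized servo range
    return {"aileron": clip(aileron), "elevator": clip(elevator)}
```

In the real setup the sensor frame is read from the X-Plane plug-in over serial, and the returned command frame is written back to deflect the simulated control surfaces.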
There was one finding when testing the UAV autopilot reliability: the power supply regulator was not stable, which could be seen from the overall system performance in the HIL simulator. At its worst, this failure results in an airframe crash. Since the test was conducted in the HIL simulator, no financial loss occurred; this is one example of how a HIL simulator can prevent airframe crashes in real world field trials.

The HIL simulator also enables PID gain tuning on a trial and error basis. Analytical PID gain tuning is much more difficult, since it requires a mathematical model of the plant (the airframe transfer function). It was considered easier for this writer to tune the PID gains by trial and error, since an airframe crash is not a problem in the HIL simulator.

A shutter release mechanism was added to the autopilot system, and the HIL simulator was also utilized to test it. Basically it is a proximity detector on latitude if the aerial photo trajectory runs north-south, or on longitude if the trajectory runs east-west. The HIL simulator enables the automatic shutter release mechanism to be rigorously tested before real world tests.
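The proximity-detector logic can be sketched as follows. The coordinate handling, tolerance, and the `fired` bookkeeping are illustrative assumptions for testing the idea, not the onboard implementation.

```python
def shutter_due(current, targets, tolerance, fired):
    """Return the index of a photo line to fire at, or None.

    current: current latitude (north-south legs) or longitude (east-west legs).
    targets: ordered photo-line coordinates along the same axis.
    tolerance: proximity window in the same units as the coordinates.
    fired: set of indices already photographed, updated in place so
           each line triggers the shutter at most once.
    """
    for i, t in enumerate(targets):
        if i not in fired and abs(current - t) <= tolerance:
            fired.add(i)
            return i
    return None
```

In the HIL setup this check runs every control cycle against the simulated position, so the release timing can be verified without flying.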
and this software will output the following:
- trajectory / waypoints for the autopilot system
- shutter release interval
- other information, such as:
  o AreaHoriz
  o AreaVert
  o AreaPhotoHoriz
  o AreaPhotoVert
  o HorizPhotoCount
  o VertPhotoCount
  o TotalPhotoCount
  o VertPhotoDistance
  o HorizPhotoDistance
  o PhotoAltitude
  o WaypointCount
The waypoints generated by this software then have to be uploaded to the autopilot. Here is a screenshot of this software:
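A sketch of the waypoint generation behind the outputs listed above, assuming a rectangular field expressed in local metre coordinates. The lawnmower pattern and the parameter names are plausible reconstructions for illustration, not the actual Flight Planner code.

```python
import math

def plan_waypoints(area_w, area_h, photo_w, photo_h, overlap=0.3):
    """Lawnmower waypoints over a rectangular field in local metres.

    photo_w, photo_h: ground footprint of one photo at the chosen altitude.
    overlap: fractional side/forward overlap between adjacent photos.
    Returns (waypoints, shutter_spacing_m).
    """
    step_x = photo_w * (1.0 - overlap)             # spacing between flight lines
    shutter_spacing = photo_h * (1.0 - overlap)    # spacing between shutter releases
    n_lines = max(1, math.ceil((area_w - photo_w) / step_x) + 1)
    waypoints = []
    for i in range(n_lines):
        x = min(photo_w / 2.0 + i * step_x, area_w - photo_w / 2.0)
        start, end = (0.0, area_h) if i % 2 == 0 else (area_h, 0.0)
        waypoints.append((x, start))               # alternate direction each line
        waypoints.append((x, end))
    return waypoints, shutter_spacing
```

The shutter spacing returned here corresponds to the shutter release interval in the output list, once divided by the planned ground speed.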
References
1. Rönnbäck, Sven, "Development of a INS/GPS Navigation Loop for an UAV", Institutionen för Systemteknik, Avdelningen för Robotik och Automation, Luleå Tekniska Universitet, 2000
2. Sanvido, Marco, "Hardware-in-the-loop Simulation Framework", Automatic Control Laboratory, ETH Zürich
3. Arya, Hemendra, "Hardware-In-Loop Simulator for Mini Aerial Vehicle", Centre for Aerospace Systems Design and Engineering, Department of Aerospace Engineering, IIT Bombay, India
4. Gomez, Martin, "Hardware-in-the-Loop Simulation", Embedded Systems Design, 2001, URL http://www.embedded.com/showArticle.jhtml?articleID=15201692
5. Desbiens, André and Manai, Myriam, "Identification of a UAV and Design of a Hardware-in-the-Loop System for Nonlinear Control Purposes", Université Laval, Quebec City, Canada
6. Taylor, B., Bil, C., and Watkins, S., "Horizon Sensing Attitude Stabilisation: A VMC Autopilot", 18th International UAV Systems Conference, Bristol, UK, 2003
7. URL http://www.x-plane.com
8. Adiprawita, Widyawardana, "Development of Simple Autonomous Fixed Wing Unmanned Aerial Vehicle Controller Hardware" (draft), School of Electrical Engineering and Informatics, Bandung Institute of Technology, 2006
9. Li, Yong; Dempster, Andrew; Li, Binghao; Wang, Jinling; Rizos, Chris, "A Low-Cost Attitude Heading Reference System by Combination of GPS and Magnetometers and MEMS Inertial Sensors for Mobile Applications", University of New South Wales, Sydney, NSW 2052, Australia
10. Persa, Stelian and Jonker, Pieter, "Multi-sensor Robot Navigation System", Pattern Recognition Group, Technical University Delft, Lorentzweg 1, Delft, 2628 CJ, The Netherlands
11. Vasconcelos, J.F.; Calvario, J.; Oliveira, P.; Silvestre, C., "GPS Aided IMU for Unmanned Air Vehicles", Instituto Superior Técnico, Institute for Systems and Robotics, Lisboa, Portugal, 2004
12. Caruso, Michael J., "Applications of Magnetic Sensors for Low Cost Compass Systems", Honeywell, SSEC