
LIDAR (Light Detection and Ranging) is an active remote sensing technique that provides highly accurate data about terrain topography, vegetation, buildings and other surface features. The principles behind LIDAR predate the discovery of the laser itself: the first attempt to measure air density in the upper atmosphere dates from 1930, and the acronym LIDAR was introduced in 1953 by Middleton and Spilhaus. The invention of the laser in 1960 (built at the Hughes Aircraft Company) opened the way to modern LIDAR systems, which have continued to evolve ever since.

Characteristics of LIDAR technology


LIDAR relies on three basic subsystems: a laser scanner for precise distance measurement, the Global Positioning System (GPS), and an Inertial Measurement Unit (IMU) that records the platform's orientation (Fig. 1). All three require powerful computers with substantial storage and processing capacity. The laser scanner records the time differences between the pulses emitted from the survey aircraft and their reflections from the topographic surface. The GPS component consists of a receiver on board the aircraft, which continuously records its position, and a differential GPS base station on the ground used to correct positional errors, so that the best possible flight trajectory is recovered. The IMU consists of a set of gyroscopes and accelerometers that continuously measure the aircraft's attitude and acceleration.
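The ranging step described above reduces to a time-of-flight calculation: the pulse travels to the surface and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal illustration of the principle (not part of any actual LIDAR processing chain):

```python
# Range from laser pulse time-of-flight: the pulse travels to the
# target and back, so the one-way distance is c * dt / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_time_of_flight(dt_seconds: float) -> float:
    """Return the sensor-to-target distance in metres."""
    return C * dt_seconds / 2.0

# A pulse returning after about 6.67 microseconds corresponds to ~1 km.
print(round(range_from_time_of_flight(6.67e-6)))  # -> 1000
```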

Figure 1. The LIDAR system

To collect data about terrain topography, LIDAR systems operate at wavelengths between 1040 and 1060 nm (near infrared). For bathymetric data, the laser is centered around 530 nm (the blue-green part of the spectrum, where the laser pulses are able to penetrate water). LIDAR also avoids orthorectification problems, because every point is individually georeferenced. LIDAR data sets are distributed either in the LAS format or in ASCII format.

Handling LIDAR data sets


LIDAR data sets (and not only these) consist of hundreds of millions of points, each carrying x, y, z information, which makes them very large to handle. When processing such a volume of data, the transfer between disk and internal memory (I/O, input/output) degrades the performance of the machine significantly and can even cause it to stall. An I/O-efficient algorithm that minimizes the number of external disk accesses therefore yields substantial performance gains.

Many applications offer interpolation modules, but all of them struggle with very large data sets (on the order of hundreds of millions of points). The research of Pankaj K. Agarwal, Lars Arge and Andrew Danner highlighted the following: none of the interpolation methods in ArcGIS 9.1 (Kriging, IDW (Inverse Distance Weighting), Spline and Topo-to-Raster, the latter based on Hutchinson's ANUDEM method) managed to process more than 25 million points when building a DEM at a 6 m spatial resolution; Topo-to-Raster came closest, at roughly 21 million points. GRASS, using the s.surf.rst module, likewise failed to interpolate more than 25 million points. The only application that handled more than 25 million points (about 50 million) was QTModeler 4 from Applied Imagery. The three researchers mentioned above implemented a much more I/O-efficient algorithm for building a DEM from LIDAR data (a set S of N points), based on quad-tree segmentation, which can process a far larger number of points.
The algorithm has three phases: a fragmentation phase, in which the segmentation is computed from the data set S; a neighbor-finding phase, in which, for each segment of the decomposition, the points inside the segment and the relevant neighboring segments are computed; and an interpolation phase, in which the surface is interpolated and the value of every grid cell in the segment is computed. The algorithm was implemented in C++. As input it takes a data set S, a value for the grid cell size and a parameter Kmax that bounds the number of points per segment, and it computes the interpolated grid surface using the segmentation algorithm together with spline interpolation. With this algorithm, a data set of 500 million points (about 20 GB) was interpolated into a grid with a 6 m spatial resolution in 53 hours. The algorithm thus scales to very large inputs and is well suited to interpolating LIDAR data sets.
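The fragmentation phase can be illustrated with a toy quad-tree: a square cell is split recursively into four quadrants until no leaf holds more than Kmax points. This is only an in-memory sketch of the idea, not the authors' I/O-efficient external-memory implementation:

```python
# Toy quad-tree segmentation: split a square cell recursively until each
# leaf holds at most k_max points. Points must lie in [x0, x0+size) on
# both axes; more than k_max coincident points would recurse forever.
def quadtree_leaves(points, x0, y0, size, k_max):
    if len(points) <= k_max:
        return [(x0, y0, size, points)]  # leaf: (origin x, origin y, side, points)
    half = size / 2.0
    leaves = []
    for dx in (0, 1):
        for dy in (0, 1):
            cx, cy = x0 + dx * half, y0 + dy * half
            inside = [p for p in points
                      if cx <= p[0] < cx + half and cy <= p[1] < cy + half]
            leaves.extend(quadtree_leaves(inside, cx, cy, half, k_max))
    return leaves

# Example: five points in the unit square, at most two points per leaf.
pts = [(0.1, 0.1), (0.2, 0.2), (0.8, 0.8), (0.9, 0.1), (0.3, 0.7)]
leaves = quadtree_leaves(pts, 0.0, 0.0, 1.0, k_max=2)
```

In the real algorithm each leaf would then be interpolated independently (with its relevant neighbors), which is what keeps the working set small enough to avoid thrashing the disk.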

Useful applications for processing LIDAR data


- ALDPAT, useful for the analysis and classification of LIDAR data. Free.
- HHViewer, which lets users visualize, analyze and edit 2D and 3D data sets. Commercial.
- LIDAR Analyst, an ArcGIS extension that automatically extracts and visualizes in 3D terrain topography, buildings, trees and forested areas from LIDAR data sets. Commercial.
- LViz, written by Jeffrey Conner, a researcher at the University of Arizona, designed specifically for interpolating and visualizing LIDAR data in 3D. Free.
- MARS, designed for analyzing, processing and handling large data sets. Commercial.
- Quick Terrain Modeler, developed at Johns Hopkins, able to process and visualize in 3D large data sets (around 200 million points). Commercial.
- Terrasolid, intended for processing large data sets obtained by laser scanning. Commercial.

Bibliografie
Claus Weitkamp, Lidar: Range-Resolved Optical Remote Sensing of the Atmosphere
Pankaj K. Agarwal, Lars Arge and Andrew Danner, From Point Cloud to Grid DEM: A Scalable Approach
GEON, Lidar Remote Sensing Overview
LIDAR Technology

Remote sensing
From Wikipedia, the free encyclopedia

Synthetic aperture radar image of Death Valley colored using polarimetry.

Remote sensing is the acquisition of information about an object or phenomenon, without making physical contact with the object. In modern usage, the term generally refers to the use of aerial sensor technologies to detect and classify objects on Earth (both on the surface, and in the atmosphere and oceans) by means of propagated signals (e.g. electromagnetic radiation emitted from aircraft or satellites).[1][2]


Overview

This video shows how Landsat was used to identify areas of conservation in the Democratic Republic of the Congo, and how it was used to help map an area called MLW in the north.

There are two main types of remote sensing: passive remote sensing and active remote sensing.[3] Passive sensors detect natural radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas, whereupon a sensor detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing, where the time delay between emission and return is measured, establishing the location, height, speed and direction of an object.

Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.

Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which, in conjunction with larger scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other natural long- and short-term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation, and national security and overhead, ground-based and stand-off collection on border areas.[4]

From satellite, aircraft, spacecraft, buoy, ship, and helicopter images, data is created to analyze and compare things like vegetation rates, erosion, pollution, forestry, weather, and land use. These things can be mapped, imaged, tracked and observed. The process of remote sensing is also helpful for city planning, archaeological investigations, military observation and geomorphological surveying.

Data acquisition techniques


Multispectral collection and analysis rest on the fact that the examined areas or objects reflect or emit radiation that stands out from that of the surrounding areas.
Applications of remote sensing data

Conventional radar is mostly associated with aerial traffic control, early warning, and certain large-scale meteorological data. Doppler radar is used by local law enforcement for monitoring speed limits and in enhanced meteorological collection such as wind speed and direction within weather systems. Other types of active collection include plasmas in the ionosphere. Interferometric synthetic aperture radar is used to produce precise digital elevation models of large-scale terrain (see RADARSAT, TerraSAR-X, Magellan).

Laser and radar altimeters on satellites have provided a wide range of data. By measuring the bulges of water caused by gravity, they map features on the seafloor to a resolution of a mile or so. By measuring the height and wavelength of ocean waves, the altimeters measure wind speeds and direction, and surface ocean currents and directions.

Light detection and ranging (LIDAR) is well known from examples such as weapon ranging and laser-illuminated homing of projectiles. LIDAR is used to detect and measure the concentration of various chemicals in the atmosphere, while airborne LIDAR can be used to measure the heights of objects and features on the ground more accurately than radar technology. Vegetation remote sensing is a principal application of LIDAR.

Radiometers and photometers are the most common instruments in use, collecting reflected and emitted radiation in a wide range of frequencies. The most common are visible and infrared sensors, followed by microwave, gamma ray and, rarely, ultraviolet. They may also be used to detect the emission spectra of various chemicals, providing data on chemical concentrations in the atmosphere.

Stereographic pairs of aerial photographs have often been used to make topographic maps by imagery and terrain analysts in trafficability and highway departments for potential routes.

Simultaneous multispectral platforms such as Landsat have been in use since the 1970s. These thematic mappers take images in multiple wavelengths of electromagnetic radiation (multispectral) and are usually found on Earth observation satellites, including (for example) the Landsat program or the IKONOS satellite. Maps of land cover and land use from thematic mapping can be used to prospect for minerals, detect or monitor land usage and deforestation, and examine the health of indigenous plants and crops, including entire farming regions or forests.

Hyperspectral imaging produces an image where each pixel has full spectral information, imaging narrow spectral bands over a contiguous spectral range. Hyperspectral imagers are used in various applications including mineralogy, biology, defence, and environmental measurements.

Within the scope of combating desertification, remote sensing makes it possible to follow up and monitor risk areas in the long term, to determine desertification factors, to support decision-makers in defining relevant measures of environmental management, and to assess their impacts.[5]

Geodetic

Overhead geodetic collection was first used in aerial submarine detection and gravitational data used in military maps. This data revealed minute perturbations in the Earth's gravitational field (geodesy) that may be used to determine changes in the mass distribution of the Earth, which in turn may be used for geological studies.

Acoustic and near-acoustic

Sonar: passive sonar, listening for the sound made by another object (a vessel, a whale etc.); active sonar, emitting pulses of sounds and listening for echoes, used for detecting, ranging and measurements of underwater objects and terrain. Seismograms taken at different locations can locate and measure earthquakes (after they occur) by comparing the relative intensity and precise timing.

To coordinate a series of large-scale observations, most sensing systems depend on the following: platform location, the time, and the rotation and orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. The rotation and orientation are often provided to within a degree or two by electronic compasses. Compasses can measure not just azimuth (i.e. degrees to magnetic north) but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopically aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks.

Resolution impacts collection and is best explained with the following relationship: less resolution means less detail and larger coverage; more resolution means more detail and less coverage. Skilled management of collection results in cost-effective collection and avoids situations such as the use of multiple high-resolution data sets, which tend to clog transmission and storage infrastructure.

Data processing


See also: Inverse problem

Generally speaking, remote sensing works on the principle of the inverse problem. While the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation), which may be related to the object of interest through the use of a data-derived computer model. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emission may then be related to the temperature in that region via various thermodynamic relations. The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal resolutions.
Spatial resolution
The size of a pixel that is recorded in a raster image; typically pixels may correspond to square areas ranging in side length from 1 to 1,000 metres (3.3 to 3,300 ft).

Spectral resolution
The wavelength width of the different frequency bands recorded; usually, this is related to the number of frequency bands recorded by the platform. Current Landsat collection is that of seven bands, including several in the infrared spectrum, ranging from a spectral resolution of 0.07 to 2.1 µm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 µm, with a spectral resolution of 0.10 to 0.11 µm per band.

Radiometric resolution
The number of different intensities of radiation the sensor is able to distinguish. Typically, this ranges from 8 to 14 bits, corresponding to 256 to 16,384 intensity levels or "shades" of color in each band. It also depends on the instrument noise.

Temporal resolution
The frequency of flyovers by the satellite or plane, relevant only in time-series studies or those requiring an averaged or mosaic image, as in deforestation monitoring. This was first used by the intelligence community, where repeated coverage revealed changes in infrastructure, the deployment of units or the modification/introduction of equipment. Cloud cover over a given area or object makes it necessary to repeat the collection of said location.
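The radiometric resolution figures quoted above follow directly from the bit depth: an n-bit sensor distinguishes 2**n intensity levels, so 8 bits give 256 levels and 14 bits give 16,384. A one-line check:

```python
# Radiometric resolution: an n-bit sensor distinguishes 2**n intensity levels.
def intensity_levels(bits: int) -> int:
    return 2 ** bits

print(intensity_levels(8))   # -> 256
print(intensity_levels(14))  # -> 16384
```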

In order to create sensor-based maps, most remote sensing systems expect to extrapolate sensor data in relation to a reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is the platen against which the film is pressed, which can cause severe errors when photographs are used to measure ground distances. The step in which this problem is resolved is called georeferencing, and involves computer-aided matching of points in the image (typically 30 or more points per image), which is extrapolated with the use of an established benchmark, "warping" the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced. In addition, images may need to be radiometrically and atmospherically corrected.

Radiometric correction
Gives a scale to the pixel values, e.g. the monochromatic scale of 0 to 255 will be converted to actual radiance values.

Topographic correction
In rugged mountains, the effective illumination of each pixel varies considerably with the terrain. A pixel on a shady slope receives weak illumination and has a low radiance value, while a pixel on a sunny slope receives strong illumination and has a high radiance value. The same object can therefore have very different radiance values on shady and sunny slopes, and different objects may have similar radiance values. These spectral distortions seriously affect the accuracy of information extraction from remote sensing images in mountainous areas and are the main obstacle to their further application there. The purpose of topographic correction is to eliminate this effect and recover the true reflectivity or radiance of objects as they would appear under horizontal conditions. It is a prerequisite of quantitative remote sensing applications.

Atmospheric correction
Eliminates atmospheric haze by rescaling each frequency band so that its minimum value (usually realised in water bodies) corresponds to a pixel value of 0. The digitizing of data also makes it possible to manipulate the data by changing gray-scale values.
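The haze-removal rescaling just described is essentially dark-object subtraction: shift each band so that its minimum value (often found over deep water) maps to 0. A minimal sketch, with plain Python lists standing in for a real raster band:

```python
# Dark-object subtraction for one spectral band: shift all pixel values
# so the band minimum (the "dark object") becomes 0.
def dark_object_subtract(band):
    """band: 2-D list of pixel values for one spectral band."""
    darkest = min(min(row) for row in band)
    return [[v - darkest for v in row] for v_row in [None] for row in band]

band = [[12, 40], [15, 90]]
print(dark_object_subtract(band))  # -> [[0, 28], [3, 78]]
```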

Interpretation is the critical process of making sense of the data. The first application was aerial photographic collection, which used the following process: spatial measurement through the use of a light table in both conventional single and stereographic coverage; added skills such as the use of photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis is the recently developed automated computer-aided application which is in increasing use. Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS) imagery into meaningful image-objects, and assessing their characteristics through spatial, spectral and temporal scale.

Old data from remote sensing is often valuable because it may provide the only long-term data for a large extent of geography. At the same time, the data is often complex to interpret and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is as computer-generated machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized halftone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.
Data processing levels

To facilitate the discussion of data processing in practice, several processing levels were first defined in 1986 by NASA as part of its Earth Observing System[6] and steadily adopted since then, both internally at NASA (e. g.,[7]) and elsewhere (e. g.,[8]); these definitions are:

Level 0: Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed.

Level 1a: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 is fully recoverable from Level 1a data).

Level 1b: Level 1a data that have been processed to sensor units (e.g., radar backscatter cross section, brightness temperature, etc.); not all instruments have Level 1b data; Level 0 data is not recoverable from Level 1b data.

Level 2: Derived geophysical variables (e.g., ocean wave height, soil moisture, ice concentration) at the same resolution and location as Level 1 source data.

Level 3: Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e.g., missing points interpolated, complete regions mosaicked together from multiple orbits, etc.).

Level 4: Model output or results from analyses of lower-level data (i.e., variables that were not measured by the instruments but instead are derived from these measurements).

A Level 1 data record is the most fundamental (i. e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower level data sets and thus can be dealt with without incurring a great deal of data handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.

History

The TR-1 reconnaissance/surveillance aircraft.

The 2001 Mars Odyssey used spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.

The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858. Messenger pigeons, kites, rockets and unmanned balloons were also used for early images. With the exception of balloons, these first, individual images were not particularly useful for map making or for scientific purposes.[citation needed]

Systematic aerial photography was developed for military surveillance and reconnaissance purposes beginning in World War I and reaching a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66 and the F-4C, or specifically designed collection platforms such as the U-2/TR-1, SR-71, A-5 and the OV-1 series, in both overhead and stand-off collection. A more recent development is that of increasingly smaller sensor pods, such as those used by law enforcement and the military, in both manned and unmanned platforms. The advantage of this approach is that it requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler and synthetic aperture radar.[citation needed]

The development of artificial satellites in the latter half of the 20th century allowed remote sensing to progress to a global scale as of the end of the Cold War. Instrumentation aboard various Earth observing and weather satellites such as Landsat, the Nimbus and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments: synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, to name just a few examples.[citation needed]

Recent developments include, beginning in the 1960s and 1970s, the development of image processing of satellite imagery. Several research groups in Silicon Valley, including NASA Ames Research Center, GTE and ESL Inc., developed Fourier transform techniques leading to the first notable enhancement of imagery data.[citation needed]

LIDAR

LIDAR (Light Detection And Ranging, also LADAR) is an optical remote sensing technology that can measure the distance to, or other properties of, a target by illuminating the target with light, often using pulses from a laser. LIDAR technology has applications in geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing and atmospheric physics,[1] as well as in airborne laser swath mapping (ALSM), laser altimetry and LIDAR contour mapping. The acronym LADAR (Laser Detection and Ranging) is often used in military contexts. The term "laser radar" is sometimes used, even though LIDAR does not employ microwaves or radio waves and therefore is not radar in the strict sense of the word.


General description


LIDAR uses ultraviolet, visible, or near infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds and even single molecules.[1] A narrow laser beam can be used to map physical features with very high resolution.

LIDAR has been used extensively for atmospheric research and meteorology. Downward-looking LIDAR instruments fitted to aircraft and satellites are used for surveying and mapping; a recent example is the NASA Experimental Advanced Research Lidar.[2] In addition, LIDAR has been identified by NASA as a key technology for enabling autonomous precision safe landing of future robotic and crewed lunar landing vehicles.[3]

Wavelengths in a range from about 10 micrometers to the UV (ca. 250 nm) are used to suit the target. Typically light is reflected via backscattering. Different types of scattering are used for different LIDAR applications; most common are Rayleigh scattering, Mie scattering and Raman scattering, as well as fluorescence. Based on the kind of backscattering used, the LIDAR is accordingly called a Rayleigh LiDAR, Mie LiDAR, Raman LiDAR, Na/Fe/K fluorescence LIDAR and so on.[1] Suitable combinations of wavelengths can allow remote mapping of atmospheric contents by looking for wavelength-dependent changes in the intensity of the returned signal.
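One reason wavelength choice matters so much is that Rayleigh backscatter intensity scales roughly as 1/wavelength^4. A small illustration of that proportionality alone, ignoring instrument response, Mie scattering and absorption (the 532 nm reference is just a convenient choice):

```python
# Relative Rayleigh scattering intensity scales as 1 / wavelength**4.
# Values are normalized to a 532 nm (frequency-doubled YAG) reference.
def rayleigh_relative(wavelength_nm: float, reference_nm: float = 532.0) -> float:
    return (reference_nm / wavelength_nm) ** 4

# 355 nm UV is Rayleigh-scattered ~5x more strongly than 532 nm,
# while 1064 nm IR is scattered only ~6% as strongly.
print(round(rayleigh_relative(355.0), 1))
print(round(rayleigh_relative(1064.0), 3))
```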

Design

A basic LIDAR system involves a laser range finder reflected by a rotating mirror (top). The laser is scanned around the scene being digitised, in one or two dimensions (middle), gathering distance measurements at specified angle intervals (bottom).

In general there are two kinds of LIDAR detection schemes: "incoherent" or direct energy detection (which is principally an amplitude measurement) and coherent detection (which is best for Doppler or phase-sensitive measurements). Coherent systems generally use optical heterodyne detection, which, being more sensitive than direct detection, allows them to operate at much lower power, but at the expense of more complex transceiver requirements.

In both coherent and incoherent LIDAR, there are two types of pulse models: micropulse lidar systems and high-energy systems. Micropulse systems have developed as a result of the ever-increasing amount of computer power available combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one microjoule, and are often "eye-safe," meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring many atmospheric parameters: the height, layering and densities of clouds, cloud particle properties (extinction coefficient, backscatter coefficient, depolarization), temperature, pressure, wind, humidity, and trace gas concentration (ozone, methane, nitrous oxide, etc.).[1]

There are several major components to a LIDAR system:
1. Laser: 600–1000 nm lasers are most common for non-scientific applications. They are inexpensive, but since they can be focused and easily absorbed by the eye, the maximum power is limited by the need to make them eye-safe; eye-safety is a requirement for most applications. A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this wavelength is not focused by the eye, but the detector technology is less advanced, so these wavelengths are generally used at longer ranges and lower accuracies. They are also used for military applications, as 1550 nm is not visible in night vision goggles, unlike the shorter 1000 nm infrared laser. Airborne topographic mapping lidars generally use 1064 nm diode-pumped YAG lasers, while bathymetric systems generally use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than does 1064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the LIDAR receiver detectors and electronics have sufficient bandwidth.[1]

2. Scanner and optics: How fast images can be developed is also affected by the speed at which they are scanned. There are several options for scanning the azimuth and elevation, including dual oscillating plane mirrors, a combination with a polygon mirror, and a dual-axis scanner (see Laser scanning). Optic choices affect the angular resolution and the range that can be detected. A hole mirror or a beam splitter are options to collect a return signal.

3. Photodetector and receiver electronics: Two main photodetector technologies are used in lidars: solid-state photodetectors, such as silicon avalanche photodiodes, and photomultipliers. The sensitivity of the receiver is another parameter that has to be balanced in a LIDAR design.

4. Position and navigation systems: LIDAR sensors that are mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an Inertial Measurement Unit (IMU).
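The ranging principle behind all of these systems is time of flight: a pulse travels to the target and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (not any vendor's API; the example timing is illustrative):

```python
# Time-of-flight ranging: one-way distance = c * round-trip time / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_time(round_trip_seconds: float) -> float:
    """Distance in metres to the surface that reflected the pulse."""
    return C * round_trip_seconds / 2.0

# A return arriving ~6.67 microseconds after emission corresponds to a
# target roughly 1 km away.
print(round(range_from_time(6.67e-6)))  # ~1000 m
```

The same relation explains why pulse length limits target resolution: two surfaces closer together than half the pulse's spatial extent produce overlapping returns.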

3D imaging can be achieved using both scanning and non-scanning systems. "3D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera. Imaging LIDAR can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays, typically built on single chips using CMOS and hybrid CMOS/CCD fabrication techniques. In these devices each pixel performs some local processing, such as demodulation or gating at high speed, down-converting the signals to video rate so that the array may be read like a camera. Using this technique many thousands of pixels/channels may be acquired simultaneously.[4] High-resolution 3D LIDAR cameras use homodyne detection with an electronic CCD or CMOS shutter.[5] A coherent imaging LIDAR uses synthetic array heterodyne detection to enable a staring single-element receiver to act as though it were an imaging array.[6]

Applications

This LIDAR-equipped mobile robot uses its LIDAR to construct a map and avoid obstacles.

In addition to the applications listed above, there is a wide variety of applications of LIDAR, as often mentioned in National LIDAR Dataset programs.

Agriculture

Agricultural Research Service scientists have developed a way to incorporate LIDAR with yield rates on agricultural fields. This technology will help farmers improve their yields by directing their resources toward the high-yield sections of their land.

LIDAR also can be used to help farmers determine to which areas of their fields to apply costly fertilizer. LIDAR can create a topographical map of the fields and reveal the slopes and sun exposure of the farm land. Researchers at the Agricultural Research Service blended this topographical information with the farm land's yield results from previous years. From this information, researchers categorized the farm land into high-, medium-, or low-yield zones.[7] This technology is valuable to farmers because it indicates which areas should receive the expensive fertilizers to achieve the highest crop yield.
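The zoning step described above can be sketched as a simple tertile classification of historical yields. This is a hypothetical illustration, not the ARS method; the thresholds (upper/lower thirds) are assumptions:

```python
# Hypothetical yield-zoning sketch: cells in the top third of historical
# yields are "high", the bottom third "low", the rest "medium".
from statistics import quantiles

def yield_zones(yields):
    """Label each field cell's yield as low/medium/high by tertile."""
    q1, q2 = quantiles(yields, n=3)  # tertile cut points
    def zone(y):
        if y >= q2:
            return "high"
        if y >= q1:
            return "medium"
        return "low"
    return [zone(y) for y in yields]

# Yields in tonnes/ha for six illustrative grid cells:
print(yield_zones([2.1, 3.4, 5.0, 4.8, 2.9, 6.2]))
```

In practice the zones would be computed per grid cell of a yield map derived from the LIDAR DEM and harvest data, not from a flat list.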
Archaeology

LIDAR has many applications in the field of archaeology, including aiding in the planning of field campaigns, mapping features beneath forest canopy,[8] and providing an overview of broad, continuous features that may be indistinguishable on the ground. LIDAR can also provide archaeologists with the ability to create high-resolution digital elevation models (DEMs) of archaeological sites that can reveal micro-topography that is otherwise hidden by vegetation. LiDAR-derived products can be easily integrated into a Geographic Information System (GIS) for analysis and interpretation. For example, at Fort Beausejour - Fort Cumberland National Historic Site, Canada, previously undiscovered archaeological features below forest canopy have been mapped that are related to the siege of the Fort in 1755. Features that could not be distinguished on the ground or through aerial photography were identified by overlaying hillshades of the DEM created with artificial illumination from various angles. With LIDAR, the ability to produce high-resolution datasets quickly and relatively cheaply can be an advantage. Beyond efficiency, its ability to penetrate forest canopy has led to the discovery of features that were not distinguishable through traditional geo-spatial methods and are difficult to reach through field surveys, as in work at Caracol by Arlen Chase and his wife Diane Zaino Chase.[9] The intensity of the returned signal can be used to detect features buried under flat vegetated surfaces such as fields, especially when mapping using the infrared spectrum. The presence of these features affects plant growth and thus the amount of infrared light reflected back.[10]
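The hillshading technique mentioned above can be sketched with the standard slope/aspect formulation: shade the DEM from one azimuth and altitude, then repeat from other angles to reveal differently oriented features. The grid and angles below are illustrative, not from the Fort Beausejour study:

```python
# Sketch of DEM hillshading: simulate illumination of the terrain from a
# chosen sun azimuth/altitude using slope and aspect derived from the grid.
import numpy as np

def hillshade(dem, azimuth_deg=315.0, altitude_deg=45.0, cellsize=1.0):
    """Return shaded relief in [0, 1] for a 2D elevation array."""
    az = np.radians(360.0 - azimuth_deg + 90.0)  # map to math convention
    alt = np.radians(altitude_deg)
    dy, dx = np.gradient(dem, cellsize)          # elevation derivatives
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

dem = np.outer(np.arange(5), np.ones(5))  # a uniform ramp as a toy DEM
print(hillshade(dem).shape)  # (5, 5)
```

Overlaying several such shadings, each computed with a different `azimuth_deg`, is what makes features visible regardless of their orientation.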

Biology and conservation

LIDAR has also found many applications in forestry. Canopy heights, biomass measurements, and leaf area can all be studied using airborne LIDAR systems. Similarly, LIDAR is used by many industries, including the energy and railroad industries and the Department of Transportation, as a faster way of surveying. Topographic maps can also be generated readily from LIDAR, including for recreational use such as in the production of orienteering maps.[1] In addition, the Save-the-Redwoods League is undertaking a project to map the tall redwoods on California's northern coast. LIDAR allows research scientists not only to measure the height of previously unmapped trees but also to determine the biodiversity of the redwood forest. Stephen Sillett, who is working with the League on the North Coast LIDAR project, claims this technology will be useful in directing future efforts to preserve and protect ancient redwood trees.[11][full citation needed]
Geology and soil science

High-resolution digital elevation maps generated by airborne and stationary LIDAR have led to significant advances in geomorphology (the branch of geoscience concerned with the origin and evolution of Earth's surface topography). LIDAR's abilities to detect subtle topographic features such as river terraces and river channel banks, to measure the land-surface elevation beneath the vegetation canopy, to better resolve spatial derivatives of elevation, and to detect elevation changes between repeat surveys have enabled many novel studies of the physical and chemical processes that shape landscapes.[citation needed] In geophysics and tectonics, a combination of aircraft-based LIDAR and GPS has evolved into an important tool for detecting faults and for measuring uplift. The output of the two technologies can produce extremely accurate elevation models for terrain, models that can even measure ground elevation through trees. This combination was used most famously to find the location of the Seattle Fault in Washington, USA.[12] This combination also measured uplift at Mt. St. Helens by using data from before and after the 2004 uplift.[13] Airborne LIDAR systems monitor glaciers and have the ability to detect subtle amounts of growth or decline. A satellite-based system, NASA's ICESat, includes a LIDAR sub-system for this purpose. NASA's Airborne Topographic Mapper[14] is also used extensively to monitor glaciers and perform coastal change analysis. The combination is also used by soil scientists while creating a soil survey. The detailed terrain modeling allows soil scientists to see slope changes and landform breaks which indicate patterns in soil spatial relationships.
Meteorology and atmospheric environment

The first LIDAR systems were used for studies of atmospheric composition, structure, clouds, and aerosols. Initially based on ruby lasers, LIDAR for meteorological applications was constructed shortly after the invention of the laser and represents one of the first applications of laser technology. Differential Absorption LIDAR (DIAL) is used for range-resolved measurements of a particular gas in the atmosphere, such as ozone, carbon dioxide, or water vapor. The LIDAR transmits two wavelengths: an "on-line" wavelength that is absorbed by the gas of interest and an "off-line" wavelength that is not absorbed. The differential absorption between the two wavelengths is a measure of the concentration of the gas as a function of range. DIAL LIDARs are essentially dual-wavelength backscatter LIDARs.[citation needed]

Doppler LIDAR and Rayleigh Doppler LIDAR are used to measure temperature and/or wind speed along the beam by measuring the frequency of the backscattered light. The Doppler broadening of gases in motion allows the determination of properties via the resulting frequency shift.[15][16] Scanning LIDARs, such as NASA's HARLIE LIDAR, have been used to measure atmospheric wind velocity in a large three-dimensional cone.[17] ESA's wind mission ADM-Aeolus will be equipped with a Doppler LIDAR system in order to provide global measurements of vertical wind profiles.[18] A Doppler LIDAR system was used in the 2008 Summer Olympics to measure wind fields during the yacht competition.[19] Doppler LIDAR systems are also now beginning to be successfully applied in the renewable energy sector to acquire wind speed, turbulence, wind veer and wind shear data. Both pulsed and continuous-wave systems are being used: pulsed systems use signal timing to obtain vertical distance resolution, whereas continuous-wave systems rely on detector focusing. Synthetic array LIDAR allows imaging LIDAR without the need for an array detector. It can be used for imaging Doppler velocimetry, ultra-fast frame rate (MHz) imaging, and speckle reduction in coherent LIDAR.[6] An extensive LIDAR bibliography for atmospheric and hydrospheric applications is given by Grant.[20]
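The DIAL retrieval can be made concrete with Beer-Lambert attenuation: the gas density in a range cell follows from how much faster the on-line return decays with range than the off-line return. A minimal sketch; the cross-section and power values below are illustrative assumptions, not instrument data:

```python
# DIAL retrieval sketch: with two-way Beer-Lambert attenuation,
#   ln[(P_on(r1)/P_on(r2)) * (P_off(r2)/P_off(r1))] = 2 * dsigma * N * dr,
# so the mean number density N in the cell [r1, r2] follows directly.
import math

def dial_density(p_on_near, p_on_far, p_off_near, p_off_far,
                 delta_sigma, delta_r):
    """Mean gas number density (molecules/m^3) in a cell of width delta_r.

    delta_sigma: absorption cross-section difference (m^2) between the
    on-line and off-line wavelengths.
    """
    ratio = (p_on_near / p_on_far) * (p_off_far / p_off_near)
    return math.log(ratio) / (2.0 * delta_sigma * delta_r)

# Illustrative numbers: equal off-line powers, a 2% on-line extinction
# over a 100 m cell, and a cross-section difference of 1e-22 m^2.
print(dial_density(1.0, math.exp(-0.02), 1.0, 1.0, 1e-22, 100.0))
```

Real retrievals must also handle the range-squared and backscatter terms, which the ratio of on-line to off-line returns cancels only approximately.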
Law enforcement

See also: LIDAR speed gun

LIDAR speed guns are used by the police to measure the speed of vehicles for speed limit enforcement purposes.[citation needed]
Military

Few military applications are known to be in place, and those are classified, but a considerable amount of research is underway in the use of LIDAR for imaging. Higher resolution systems collect enough detail to identify targets, such as tanks. Here the name LADAR is more common. Examples of military applications of LIDAR include the Airborne Laser Mine Detection System (ALMDS) for counter-mine warfare by Areté Associates.[21]

A NATO report (RTO-TR-SET-098) evaluated the potential technologies to do stand-off detection for the discrimination of biological warfare agents. The potential technologies evaluated were Long-Wave Infrared (LWIR), Differential Scattering (DISC), and Ultraviolet Laser-Induced Fluorescence (UV-LIF). The report concluded: "Based upon the results of the LIDAR systems tested and discussed above, the Task Group recommends that the best option for the near-term (2008–2010) application of stand-off detection systems is UV-LIF."[22] However, in the long term, other techniques such as stand-off Raman spectroscopy may prove to be useful for identification of biological warfare agents.

Short-range compact spectrometric lidar based on Laser-Induced Fluorescence (LIF) would address the presence of bio-threats in aerosol form over critical indoor, semi-enclosed and outdoor venues like stadiums, subways, and airports. This near-real-time capability would enable rapid detection of a bioaerosol release and allow for timely implementation of measures to protect occupants and minimize the extent of contamination.[23]

The Long-Range Biological Standoff Detection System (LR-BSDS) was developed for the US Army to provide the earliest possible standoff warning of a biological attack. It is an airborne system carried by a helicopter to detect man-made aerosol clouds containing biological and chemical agents at long range. The LR-BSDS, with a detection range of 30 km or more, was fielded in June 1997.[24]

Five LIDAR units produced by the German company Sick AG were used for short-range detection on Stanley, the autonomous car that won the 2005 DARPA Grand Challenge. A robotic Boeing AH-6 performed a fully autonomous flight in June 2010, including avoiding obstacles using LIDAR.[25][26]
Physics and astronomy

A worldwide network of observatories uses lidars to measure the distance to reflectors placed on the moon, allowing the moon's position to be measured with millimetre precision and tests of general relativity to be done. MOLA, the Mars Orbiting Laser Altimeter, used a LIDAR instrument in a Mars-orbiting satellite (the NASA Mars Global Surveyor) to produce a spectacularly precise global topographic survey of the red planet. In September 2008, NASA's Phoenix Lander used LIDAR to detect snow in the atmosphere of Mars.[27]

In atmospheric physics, LIDAR is used as a remote detection instrument to measure densities of certain constituents of the middle and upper atmosphere, such as potassium, sodium, or molecular nitrogen and oxygen. These measurements can be used to calculate temperatures. LIDAR can also be used to measure wind speed and to provide information about the vertical distribution of aerosol particles.[citation needed] At the JET nuclear fusion research facility, in the UK near Abingdon, Oxfordshire, LIDAR Thomson scattering is used to determine electron density and temperature profiles of the plasma.[28]
Robotics

LIDAR technology is being used in robotics for the perception of the environment as well as object classification.[29] The ability of LIDAR technology to provide three-dimensional elevation maps of the terrain, high-precision distance to the ground, and approach velocity can enable safe landing of robotic and manned vehicles with a high degree of precision.[30] Refer to the Military section above for further examples.

Surveying

This TomTom mapping van is fitted with five LIDARs on its roof rack.

Airborne LIDAR sensors are used by companies in the remote sensing field to create Digital Terrain Models (DTMs) and Digital Elevation Models (DEMs). This is quite a common practice for larger areas, as a plane can take in a 1 km wide swath in one flyover. Greater vertical accuracy, below 50 mm, can be achieved with a lower flyover and a slimmer 200 m swath, even in forest, where the sensor returns the height of the canopy as well as the ground elevation. A reference point is needed to tie the data in to the WGS (World Geodetic System).[citation needed]
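The trade-off between swath width and accuracy also shows up in point density: pulses per second spread over the area swept per second (swath width times ground speed). A back-of-the-envelope sketch; the pulse rate and speed below are illustrative assumptions:

```python
# Average point density for an airborne survey: pulses emitted per
# second divided by the ground area swept per second.
def point_density(pulse_rate_hz, ground_speed_mps, swath_width_m):
    """Average returns per square metre (counting one return per pulse)."""
    return pulse_rate_hz / (ground_speed_mps * swath_width_m)

# A 100 kHz laser at 60 m/s: narrowing the swath from 1 km to 200 m
# raises the density five-fold.
print(point_density(100_000, 60.0, 1000.0))  # ~1.7 pts/m^2
print(point_density(100_000, 60.0, 200.0))   # ~8.3 pts/m^2
```

The narrower swath also means a shorter range per shot, which is part of why the lower, slimmer flyover yields better vertical accuracy.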
Transportation

LIDAR has been used in Adaptive Cruise Control (ACC) systems for automobiles. Systems such as those by Siemens and Hella use a lidar device mounted on the front of the vehicle, such as the bumper, to monitor the distance between the vehicle and any vehicle in front of it.[31] In the event the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed preset by the driver. Refer to the Military section above for further examples.
Wind farm optimization

Lidar can be used to increase the energy output from wind farms by accurately measuring wind speeds and wind turbulence.[32] An experimental[33] lidar is mounted on a wind turbine rotor to measure oncoming horizontal winds, and proactively adjust blades to protect components and increase power.[34]
Solar photovoltaic deployment optimization

LiDAR can also be used to help planners and developers optimize solar photovoltaic systems at the city level by determining appropriate rooftops[35] and by quantifying shading losses.[36]

Other uses

The video for the song "House of Cards" by Radiohead was believed to be the first use of real-time 3D laser scanning to record a music video. The range data in the video is not completely from a LIDAR, as structured light scanning is also used.[37][38]
Alternative technologies

Recent development of Structure From Motion (SFM) technologies allows the delivery of 3D images and maps based on data extracted from visible and IR photography. The elevation or 3D data are extracted using multiple parallel passes over the mapped area, yielding both a visible-light image and 3D structure from the same sensor, which is often a specially chosen and calibrated digital camera.

See also


3D Flash LIDAR
Atomic line filter
CLidar
Laser rangefinder
libLAS, a BSD-licensed C++ library for reading/writing ASPRS LAS LiDAR data
LIDAR detector
List of laser articles
National LIDAR Dataset - USA
Optech, a company focusing on lidars
Optical time domain reflectometer
Satellite laser ranging
Sonar
Time-domain reflectometry
TopoFlight

References
1. Cracknell, Arthur P.; Hayes, Ladson (2007) [1991]. Introduction to Remote Sensing (2nd ed.). London: Taylor and Francis. ISBN 0-8493-9255-1. OCLC 70765252.
2. 'Experimental Advanced Research Lidar', NASA.org. Retrieved 8 August 2007.
3. Amzajerdian, Farzin; Pierrottet, Diego F.; Petway, Larry B.; Hines, Glenn D.; Roback, Vincent E. "Lidar Systems for Precision Navigation and Safe Landing on Planetary Bodies". Langley Research Center, NASA. http://ntrs.nasa.gov/search.jsp?R=20110012163. Retrieved May 24, 2011.
4. Medina, Antonio. Three Dimensional Camera and Rangefinder. January 1992. United States Patent 5081530.
5. Medina, A.; Gay, F.; Pozo, F. "Compact laser radar and three-dimensional camera". J. Opt. Soc. Am. A 23 (2006), pp. 800-805. http://www.opticsinfobase.org/josaa/abstract.cfm?URI=josaa-234800.
6. Strauss, C. E. M. "Synthetic-array heterodyne detection: a single-element detector acts as an array". Opt. Lett. 19, 1609-1611 (1994).
7. "ARS Study Helps Farmers Make Best Use of Fertilizers". USDA Agricultural Research Service. June 9, 2010. http://www.ars.usda.gov/is/pr/2010/100609.htm.
8. EID; crater beneath canopy.
9. John Noble Wilford (2010-05-10). "Mapping Ancient Civilization, in a Matter of Days". New York Times. http://www.nytimes.com/2010/05/11/science/11maya.html?pagewanted=all. Retrieved 2010-05-11.
10. The Light Fantastic: Using airborne lidar in archaeological survey. English Heritage. 2010. pp. 4-5. http://www.english-heritage.org.uk/publications/lightfantastic/.
11. Councillor Quarterly, Summer 2007, Volume 6, Issue 3.
12. Tom Paulson. 'LIDAR shows where earthquake risks are highest', Seattle Post (Wednesday, April 18, 2001).
13. 'Mount Saint Helens LIDAR Data', Washington State Geospatial Data Archive (September 13, 2006). Retrieved 8 August 2007.
14. 'Airborne Topographic Mapper', NASA.gov. Retrieved 8 August 2007.
15. http://superlidar.colorado.edu/Classes/Lidar2011/LidarLecture14.pdf
16. Li, T. et al. (2011). "Middle atmosphere temperature trend and solar cycle revealed by long-term Rayleigh lidar observations". J. Geophys. Res. 116.
17. Thomas D. Wilkerson, Geary K. Schwemmer, and Bruce M. Gentry. "LIDAR Profiling of Aerosols, Clouds, and Winds by Doppler and Non-Doppler Methods". NASA International H2O Project (2002).
18. 'Earth Explorers: ADM-Aeolus', ESA.org (European Space Agency, 6 June 2007). Retrieved 8 August 2007.
19. 'Doppler lidar gives Olympic sailors the edge', Optics.org (3 July 2008). Retrieved 8 July 2008.
20. Grant, W. B. "Lidar for atmospheric and hydrospheric studies", in Tunable Laser Applications, 1st ed., Duarte, F. J. (Ed.). Marcel Dekker, New York, 1995, Chapter 7.
21. http://www.arete.com/index.php?view=stil_mcm
22. NATO Laser Based Stand-Off Detection of Biological Agents. http://www.rta.nato.int/pubs/rdp.asp?RDP=RTO-TR-SET-098
23. http://www.ino.ca/en-CA/Achievements/Description/project-p/short-range-bioaerosol-threat-detection.html
24. http://articles.janes.com/articles/Janes-Nuclear,-Biological-and-Chemical-Defence/LR-BSDS--Long-Range-Biological-Standoff-Detection-System-United-States.html
25. Spice, Byron. "Researchers Help Develop Full-Size Autonomous Helicopter". Carnegie Mellon, 6 July 2010. Retrieved 19 July 2010.
26. Koski, Olivia. "In a First, Full-Sized Robo-Copter Flies With No Human Help". Wired, 14 July 2010. Retrieved 19 July 2010.
27. NASA. 'NASA Mars Lander Sees Falling Snow, Soil Data Suggest Liquid Past'. NASA.gov (29 September 2008). Retrieved 9 November 2008.
28. CW Gowers. 'Focus On: Lidar-Thomson Scattering Diagnostic on JET'. JET.EFDA.org (undated). Retrieved 8 August 2007. Archived September 18, 2007 at the Wayback Machine.
29. IfTAS
30. Amzajerdian, Farzin; Pierrottet, Diego F.; Petway, Larry B.; Hines, Glenn D.; Roback, Vincent E. "Lidar Systems for Precision Navigation and Safe Landing on Planetary Bodies". Langley Research Center, NTRS. http://ntrs.nasa.gov/search.jsp?R=20110012163. Retrieved May 24, 2011.
31. Bumper-mounted lasers.
32. Mikkelsen, Torben et al. (October 2007). "12MW Horns Rev Experiment". Risoe. http://130.226.56.153/rispubl/reports/ris-r-1506.pdf. Retrieved 2010-04-25.
33. "Smarting from the wind". The Economist. 2010-03-04. http://www.economist.com/science-technology/technologyquarterly/displaystory.cfm?story_id=15582251. Retrieved 2010-04-25.
34. Mikkelsen, Torben; Hansen, Kasper Hjorth; et al. "Lidar wind speed measurements from a rotating spinner". Danish Research Database & Danish Technical University, 20 April 2010. Retrieved 25 April 2010.
35. Ha T. Nguyen, Joshua M. Pearce, Rob Harrap, and Gerald Barber. "The Application of LiDAR to Assessment of Rooftop Solar Photovoltaic Deployment Potential on a Municipal District Unit". Sensors, 12, pp. 4534-4558 (2012).
36. Nguyen, Ha T.; Pearce, Joshua M. (2012). "Incorporating shading losses in solar photovoltaic potential assessment at the municipal scale". Solar Energy 86 (5): 1245-1260. DOI:10.1016/j.solener.2012.01.017. http://hal.archives-ouvertes.fr/hal-00685775.
37. Nick Parish (2008-07-13). "From OK Computer to Roll computer: Radiohead and director James Frost make a video without cameras". Creativity. http://creativity-online.com/?action=news:article&newsId=129514&sectionId=behind_the_work.
38. http://www.velodyne.com/lidar/lidar.aspx. Retrieved 2 May 2011.

External links

Wikimedia Commons has media related to: LIDAR

The USGS Center for LIDAR Information Coordination and Knowledge (CLICK) - a website intended to "facilitate data access, user coordination and education of lidar remote sensing for scientific needs."
How Lidar Works

Producing Terrain Elevations with LIDAR

Barbara McKay Archibald

LIDAR is an acronym for Light Detection and Ranging (National Oceanic and Atmospheric Administration [NOAA], 2007a). Other names it is known by are Laser Altimetry, Airborne Laser Terrain Mapping, Airborne Laser Swath Mapping and LADAR (Harding, 2000, Satale and Kulkarni, n.d., Wikipedia, 2007). It is a remote sensing system used by a variety of disciplines such as geography, natural resource management, engineering, atmospheric science, and archaeology for both research and more practical applications (Department of Army U.S. Corps of Engineers [USACE], 2002, National Aeronautics and Space Administration [NASA], 1998, Humme, Lindenbergh and Sueur, 2007, MOSAIC, 2001, Wikipedia, 2007).

Remote sensing is a technique for measuring, observing or monitoring a phenomenon from a distance (NASA, 1998). Most systems function by measuring the electromagnetic energy reflected or emitted by objects. Different objects reflect or emit electromagnetic energy differently. This means that an object observed under one remote sensing technique may be unremarkable, but under another technique the object may stand out. Using more complex multispectral remote sensing permits researchers to identify and work with phenomena using the magnitudes of energy measured at more than one wavelength (the object's spectral response pattern). These systems distinguish objects still further (DiBiase, 1999-2007). Familiar remote sensing systems include eyesight, cameras, and telescopes (NASA, 1998).

Remote sensing is beneficial in many ways. First and foremost, it is a comparatively inexpensive and fast method of data collection. It permits data collection in areas where access is difficult or conditions may be toxic or dangerous. Finally, it allows data collection with little or no impact on an environment or the subject of study (NASA, 1998).

There are two varieties of remote sensing: passive and active. A passive system measures energy which is emitted from an object or bounced off its surface; the origin of the energy is external to the remote sensing system. An active system, in contrast, emits the energy which is later reflected off a surface and recorded by a receiver. LIDAR is an active remote sensing system (DiBiase, 1999-2007, NASA, 1998).

Some variation exists between LIDAR systems in order to serve the needs of the diverse applications the systems are used to address. At a simplified level, LIDAR can be used from the earth's surface, an airplane, a satellite or another space vehicle. The system consists of a light source and a receiver. A laser emits pulses of light energy in the ultraviolet, visible or infrared spectrum at a rate of a few per second to thousands per second. Different wavelengths are used depending on the type of measurement being made. The pulses bounce off a surface and are reflected back to the receiver, a telescope, where they are collected and measured (MOSAIC, 2001, NASA, 1998, NASA, 2000, Sica, 1999). The product from these measurements can be used to measure temperature, density and pressure of gasses and aerosols, velocity, or elevation (NASA, 1998).

NASA identifies three varieties of LIDAR: range finder LIDAR, differential absorption LIDAR (DIAL) and Doppler LIDAR. Range finder LIDAR uses short wavelengths. The system measures the time between the emission of a laser pulse and the time its return is recorded (NASA, 1998). Using the speed of light, it converts this measurement into the distance between the receiver and the surface from which the pulse bounced (MOSAIC, 2001). Given the short wavelength, range finder LIDARs are useful for measuring small objects as well as determining their size and shape (NASA, 1998).

DIAL is used to measure temperature, density and pressure of gasses and aerosols in the atmosphere. The system uses two lasers. The wavelength of the first equals the target gas absorption peak, and the second lies between the absorption features of the gas. A ratio is calculated. The ratio indicates the absorption characteristics, which are related to temperature, density and pressure of the gas. NASA's LITE mission uses DIAL to measure cloud and aerosol structures (NASA, 1998).

Doppler LIDAR measures a subject's velocity using the change in wavelength caused by surface motion. This change is called Doppler shift. The system mixes a returning wavelength with another wavelength to produce a radio-frequency beat signal. The beat signal's frequency gives the change between outgoing and incoming signals. NASA's SPARCLE uses Doppler LIDAR to measure wind velocity (NASA, 1998).
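The relation between the frequency shift and speed can be sketched directly: for backscatter from a surface moving at radial speed v, the Doppler shift is 2v divided by the wavelength. The numbers below are illustrative, not SPARCLE values:

```python
# Doppler velocimetry sketch: invert f_d = 2 v / wavelength to recover
# the radial speed from the measured beat frequency.
def radial_speed(doppler_shift_hz, wavelength_m):
    """Radial speed (m/s) of the scatterer along the beam."""
    return doppler_shift_hz * wavelength_m / 2.0

# A beat frequency of 12.66 MHz at a 1.58 micrometre wavelength
# corresponds to a radial wind of about 10 m/s.
print(round(radial_speed(12.66e6, 1.58e-6), 2))
```

Note the sign of the shift distinguishes motion toward the sensor (blue shift) from motion away (red shift).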

The variety of LIDAR of interest here is range finding LIDAR, which is used to calculate elevation. High accuracy is achievable: vertical distance is within 15 centimeters of true position and horizontal distance is within 1/1000th of the flight height. There is a further subdivision into topographic LIDAR and bathymetric LIDAR. The first measures the height of land surfaces while the second measures distances beneath water of up to 50 meters. These systems are mounted on airplanes or helicopters. Four primary components make up these systems: a LIDAR sensor, at least two global positioning system (GPS) units, an inertial measurement unit (IMU) and an operator and pilot display (USACE, 2002).

Figure 1. LIDAR System (USACE, 2002)

The LIDAR system is composed of one or two eye-safe lasers and a telescope (Harding, 2000, MOSAIC, 2001). Topographic LIDAR emits an infrared-wavelength laser, while bathymetric LIDAR uses two lasers: an infrared-wavelength laser and a blue-green-wavelength laser (MOSAIC, 2001). The lasers are mounted on the underside of the aircraft. Pulses of light energy are emitted from the laser and reflected toward the earth by rotating mirrors set at a given angle. The rotating mirrors distribute the light energy across a large region beneath the platform (Harding, 2000, NOAA, 2007a). Several factors influence the distance between the height points, such as plane altitude and speed, as well as mirror angle (Satale and Kulkarni, n.d.). [See NOAA's website, http://www.csc.noaa.gov/products/sccoasts/html/animate.htm, for an image of LIDAR data points as they are collected (2007b).]

The wavelengths bounce off surfaces. Infrared wavelengths of topographic LIDAR reflect off of the ground, vegetation, man-made artifacts and other features. Infrared wavelengths from bathymetric LIDAR reflect off of the surface of the water, and the blue-green wavelengths pass through the water to the bottom of the body of water, from which they reflect (MOSAIC, 2001). The energy returns along the same optical path to the air platform and is received by the telescope. The telescope focuses the returned energy on a discriminator and a time interval meter. Here time is calculated (Harding, 2000, USACE, 2002, NOAA, 2007a). Using the speed of light, time is used to calculate the distance between the receiver and the surface from which the wavelength bounced. In the case of bathymetric LIDAR, the difference between the two calculated distances is the water depth (MOSAIC, 2001). These coordinates are stored in digital format (USACE, 2002).
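The bathymetric depth calculation can be sketched from the two round-trip times: the infrared return marks the water surface and the blue-green return the bottom, with the light travelling more slowly in water. This is a nadir-pointing simplification of the description above (real systems also correct for beam refraction at the surface); the times are illustrative:

```python
# Bathymetric depth sketch: depth from the extra round-trip time of the
# blue-green (bottom) return relative to the infrared (surface) return,
# using the reduced speed of light in water.
C = 299_792_458.0  # speed of light in vacuum, m/s
N_WATER = 1.33     # approximate refractive index of water

def water_depth(t_surface_s, t_bottom_s):
    """Depth in metres from surface and bottom round-trip times."""
    extra = t_bottom_s - t_surface_s      # time spent inside the water
    return (C / N_WATER) * extra / 2.0

# An extra 88.7 ns round trip corresponds to roughly 10 m of water.
print(round(water_depth(3.0e-6, 3.0887e-6), 1))
```

The factor of two again accounts for the round trip, as in the topographic case.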

Newer LIDAR systems are able to collect, calculate and store multiple returns. Since the diameter of a beam can be several meters across, a portion of the beam may hit an object at a high elevation. This portion of the beam reflects back to the detector while the remainder of the beam continues its downward path. This may happen several times until the last portion of the beam reflects from the lowest topographical point. Each return is measured and recorded by these LIDAR systems. This advancement is significant because it allows researchers to develop digital elevation maps (DEMs) void of vegetation elevations. Using proprietary software, LIDAR operators are able to remove vegetation elevations and create bare-earth DEMs (Harding, 2000, MOSAIC, 2001).
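A crude version of that vegetation removal can be sketched: keep only each pulse's last return, then take the lowest elevation per grid cell. This is a hypothetical illustration; the proprietary ground-classification software mentioned above is far more sophisticated:

```python
# Naive bare-earth sketch: last returns only, lowest elevation per cell.
from collections import defaultdict

def bare_earth_grid(points, cell=1.0):
    """points: (x, y, z, return_num, num_returns) tuples.

    Returns {(col, row): lowest last-return elevation} per grid cell.
    """
    cells = defaultdict(lambda: float("inf"))
    for x, y, z, rn, nr in points:
        if rn == nr:  # keep only the last return of each pulse
            key = (int(x // cell), int(y // cell))
            cells[key] = min(cells[key], z)
    return dict(cells)

pts = [(0.2, 0.3, 18.0, 1, 2),  # canopy hit, first of two returns
       (0.2, 0.3, 2.1, 2, 2),   # same pulse reaching the ground
       (0.7, 0.4, 2.0, 1, 1)]   # single ground return
print(bare_earth_grid(pts))  # {(0, 0): 2.0}
```

The return-number and number-of-returns fields used here correspond to per-point attributes stored in standard LIDAR exchange formats such as LAS.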

Two GPS units are needed to work with a LIDAR system. One unit is a Kinematic OTF GPS in the aircraft, and a second unit of equal ability is on the ground at a ground station. (A third unit may be used on the ground as a backup system.) All GPS units must be able to collect L1/L2 carrier phase data at 1 Hz. A Kinematic OTF GPS measures the location of a traveling platform without static initialization (USACE, 2002). To do so, a Kinematic OTF GPS unit requires a lock on a minimum of 4 GPS satellites, which allows it to fix its position to within 10 centimeters (MOSAIC, 2001, USACE, 2002).

The IMU is a unit that measures the pitch, roll and heading of the aircraft. This is used to understand the direction in which the laser is pointed. The IMU data are combined with the GPS data and the laser distance data to establish a three-dimensional x, y, z geographic coordinate (USACE, 2002).
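The combination of the three data streams can be sketched as direct georeferencing: rotate the laser vector from the sensor frame into the mapping frame using the IMU attitude, then add the GPS position. Lever-arm offsets and datum subtleties are omitted, and all values are illustrative:

```python
# Direct-georeferencing sketch: ground point = GPS position + attitude
# rotation applied to the (scan-angle-deflected) range vector.
import numpy as np

def rotation(roll, pitch, heading):
    """Sensor-to-mapping-frame rotation from IMU angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    ch, sh = np.cos(heading), np.sin(heading)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ch, -sh, 0], [sh, ch, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def ground_point(gps_xyz, roll, pitch, heading, rng, scan_angle):
    """x, y, z of the laser footprint for one range measurement."""
    beam = np.array([np.sin(scan_angle), 0.0, -np.cos(scan_angle)]) * rng
    return np.asarray(gps_xyz) + rotation(roll, pitch, heading) @ beam

# Level flight, nadir shot, 1000 m range: the footprint lies 1000 m
# directly below the platform.
print(ground_point([0.0, 0.0, 1000.0], 0, 0, 0, 1000.0, 0.0))
```

This is the geometric core of the "geodetic algorithm" described later in the processing discussion; production software additionally applies boresight calibration and datum transformations.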

Operator and pilot displays are important to the LIDAR system. The operator display enables the operator to monitor the various components of the system for proper function. The pilot display helps the pilot follow the flight plan (USACE, 2002).

In addition to the above four components, imaging equipment may be added to the platform. These images may be superimposed on or used in conjunction with the LIDAR data for further analysis (USACE, 2002).

Prior to initiating a LIDAR data collection mission, it is critical that the flight be thoroughly planned. First, it is important to know the parameters and conditions of the flight area; this helps in planning the flight and in the collection of good data. Ground controls must be established using one geodetic network. A base station, calibration controls and project area controls are important. The first must be located within 30 km of the project area. The ground and air equipment are simultaneously initialized to lessen the impact of ambiguity. The latter controls are used to calibrate the site and establish and test system accuracy. As mentioned above, L1/L2 carrier phase GPS must be used, and geodetic techniques must be used to post-process the data (USACE, 2002).

Quality data and system calibration are fundamental. The flight plan must include airport bidirectional and cross flight lines as well as project cross flight lines (USACE, 2002). These are used to check the system for accuracy, make adjustments and discover systematic error. In order to check data and find and remedy bias, geodetic project ground control points are established at the calibration site and across the project area. Proper GPS setup at the ground control station is needed. This involves the use of a tripod and tribrach, proper calibration and plumbing of equipment, establishing that all GPS unit receivers are in sync, and ensuring that data collection begins prior to the arrival of the aircraft (USACE, 2002). Finally, an overlapping flight path contributes to quality data (USACE, 2002).

Data collection then begins. Throughout collection, the equipment and data are monitored to ensure the laser is operational and the IMU and GPS units are working properly (USACE, 2002).

When data collection is completed, data are compiled from the ground and air equipment, and calculations begin. GPS data are imported into processing software to establish the platform's kinematic solution trajectory (USACE, 2002). These data are merged with the IMU data to produce position and orientation data, which are combined with the distance data and manipulated with a geodetic algorithm, resulting in three-dimensional x, y, z coordinates (USACE, 2002).

What do these coordinates mean with respect to the earth's surface? Assuming the procedures briefly discussed above were followed to ensure accurate measurements, elevation data should be within 15 centimeters of true position and horizontal data should be within 1/1000 of the platform flight height (USACE, 2002). Positions are measured with GPS units, which use the World Geodetic System 1984 (WGS84) datum (National Research Council of the National Academies, 2007, 57-59). The WGS84 datum relates to the WGS84 ellipsoid. Ellipsoids are approximations of geoids, and geoids are models of the earth (DiBiase, 1999-2007). Therefore, the coordinate points produced by the GPS are geographic coordinates and ellipsoid heights. An ellipsoid height is a distance measured from the surface of an ellipsoid. In order to express the vertical position in terms of elevation, a geoid model (e.g. GEOID03) must be applied. This converts ellipsoid height to elevation (National Research Council of the National Academies, 2007, 72).
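The ellipsoid-to-elevation conversion described above is a single subtraction once the geoid model has supplied the local separation value. A minimal sketch, with the function name and sample numbers illustrative rather than from the source:

```python
def orthometric_height(ellipsoid_height_m, geoid_undulation_m):
    """Convert a GPS ellipsoid height h to an elevation H via H = h - N.

    N is the geoid-to-ellipsoid separation taken from a geoid model such as
    GEOID03 at the point of interest (negative where the geoid lies below
    the ellipsoid, as over most of the continental U.S.).
    """
    return ellipsoid_height_m - geoid_undulation_m

# A GPS height of 100.0 m where the geoid sits 30 m below the ellipsoid
# (N = -30.0) corresponds to an elevation of 130.0 m.
print(orthometric_height(100.0, -30.0))  # 130.0
```

In practice N is interpolated from the geoid model grid at each point's latitude and longitude; the arithmetic itself is this simple.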

GPS coordinates do not use the same geodetic system as the benchmark system in the U. S. The U. S. benchmark system uses the North American Datum 1983 (NAD83) as a horizontal control datum and the North American Vertical Datum 1988 (NAVD88) as a vertical control datum. NAD83 is based on the international standard ellipsoid, Geodetic Reference System 1980 (GRS80). NAVD88 is based on sea level measurements. In many cases the similarity between GRS80 and WGS84 is enough that data can be used as they are. In other cases data may need to be transformed (National Research Council of the National Academies, 2007, 57-59).

For many uses of LIDAR data it is important to classify the data. This means the landscape must be qualified. Is it an urban or rural landscape? What buildings or vegetation are present? Is the landscape barren? The mix of objects that make up the environment and the density of coverage are important in developing products such as bare-earth DEMs (USACE, 2002).

Many products can be produced from LIDAR data, which in raw form is simply three-dimensional spatial data. These points can be used in combination with GIS software to develop raster gridded DEMs (see figure 2); contour plots can be constructed from these data; and surface models and even animated clips can be produced, sometimes in conjunction with aerial imagery (MOSAIC, 2001).

Copyright Puget Sound LIDAR Consortium. All rights reserved. Used here for educational purposes only.

figure 2 A Perspective view of LIDAR DEM (Puget Sound LIDAR Consortium, 2005).

As gridded raster data, USGS DEMs and DEMs constructed from LIDAR appear similar, but they are not. They differ in how they are made, the coordinate systems used, resolution, orientation, and accuracy.

Of the two products, USGS DEMs are more standardized. There are three general scales: small, medium, and large, ranging from 7.5 minutes to 1 degree in coverage. The DEMs are made by measuring points along a series of north-south profiles which make up a grid. The profiles are projected using UTM coordinates in the proper UTM zone, and all profiles parallel the central meridian. Resolution varies between scales. The 7.5 minute quadrangle is at a 10 meter resolution, while the 1 degree resolution is 3 arc seconds (1/1200th of a degree), which equals 66 meters by 93 meters at 45 degrees north. Finally, the preferred vertical accuracy for the 7.5 minute quadrangle is 7 meters but may be as large as 15 meters (DiBiase, 1999-2007).

In contrast, although many of the methods for acquiring LIDAR data are standardized and some criteria are the same across LIDAR data collection projects, the products vary based on the character of the projects. First, these coordinates are measured over the area of interest rather than on paper. The flight line, and hence the orientation of the DEM, is dictated by the geometry of the project area. Scale and resolution are dictated by the project. Resolution is generally finer; the distance between points may be as small as 2 meters, but may be larger. LIDAR DEM vertical accuracy is 15 centimeters, while horizontal accuracy is 1/1000th of the flight height. Most interesting about LIDAR DEMs is that vegetation points can be removed to produce bare-earth DEMs (Harding, 2000; MOSAIC, 2001; USACE, 2002).

Copyright Puget Sound LIDAR Consortium. All rights reserved. Used here for educational purposes only.

figure 3 A contour map, USGS DEM and LIDAR DEM of the same 1 square mile on Bainbridge Island, Washington (Puget Sound LIDAR Consortium, 2005).

USGS DEMs and LIDAR DEMs clearly differ, and both have their advantages. Goals often dictate which DEM will work best for a project. LIDAR DEMs have better accuracy and can achieve higher resolution (see figure 3). Once a project reaches a certain size, LIDAR DEMs are less expensive than other DEMs, and many of the measurements and calculations for LIDAR DEMs are automated.

Bare-earth capabilities are useful as well (Harding, 2000; MOSAIC, 2001; USACE, 2002). On the other hand, USGS DEM resolution and scale may serve a project well, and many of these DEMs already exist (DiBiase, 1999-2007); they do not need to be planned, measured, and plotted. As always, it is the project that determines the kind of DEM needed.

Following this discussion of LIDAR systems, it is only fair to provide some examples of LIDAR system use. The first example is a coastal height acquisition project in South Carolina (NOAA, 2007c). The second is an application of bare-earth techniques in support of archaeological research (Humme, Lindenbergh, and Sueur, 2007).

Along the South Carolina coastline, NOAA has an ongoing project using LIDAR remote sensing to map the state's beaches and monitor topographic shifts. These data help federal, state, and local coastal resource managers assess erosion rates, setbacks, and the effectiveness of coastal mitigation projects such as jetty construction and beach nourishment (sand replacement) (NOAA, 2007c, d).

The data are periodically collected using a topographic LIDAR system mounted on a de Havilland Twin Otter aircraft. Four transects are flown over the beach parallel to the coastline: two transects are centered over the water-land transition and two are centered over the sand-development transition. In order to measure the greatest number of points, flights are operated at low tide. The plane flies 60 meters per second at an altitude of 300 meters. The system emits 2,000 to 5,000 pulses per second. These parameters create a swath roughly 300 meters wide (NOAA, 2007a).
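The quoted flight parameters imply an average point density and scan angle that can be checked with a little arithmetic. The sketch below assumes uniform coverage, which real scan patterns only approximate:

```python
import math

# Nominal survey parameters from the NOAA beach-mapping flights described above.
ALTITUDE_M = 300.0      # flying height
SPEED_MS = 60.0         # ground speed
PULSE_RATE_HZ = 5000.0  # upper end of the quoted 2,000-5,000 pulses per second
SWATH_M = 300.0         # quoted swath width

# Area swept per second and the resulting average point density.
area_per_s = SPEED_MS * SWATH_M        # square meters covered each second
density = PULSE_RATE_HZ / area_per_s   # points per square meter
spacing = 1.0 / math.sqrt(density)     # average point spacing in meters

# The scan half-angle implied by a 300 m swath from 300 m altitude.
half_angle_deg = math.degrees(math.atan((SWATH_M / 2) / ALTITUDE_M))

print(f"{density:.2f} pts/m^2, ~{spacing:.1f} m spacing, "
      f"half angle {half_angle_deg:.1f} deg")
```

At the upper pulse rate this works out to roughly one point every couple of meters, consistent with the beach-profile products described next.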

Produced from these data are false color images, images mixing LIDAR coordinates with orthophotos and with vector-based maps, as well as beach topographic profile plots. False color images effectively communicate topography, while the combined image/orthophoto and image/vector-based maps help researchers identify features illustrated by the false color images and discriminate unlike features of like elevation, such as trees and rooftops (NOAA, 2007a).

figure 4 Example of false color image, Kiawah Island, South Carolina (NOAA 2007).

These data and techniques are particularly useful in areas of low relief such as this study area. They effectively communicate topographic shifts, identify environmental problems, and help professionals plan mitigation strategies (NOAA, 2007a).

Another example of LIDAR data use is a study by Humme, Lindenbergh, and Sueur. They use previously collected LIDAR points from a Dutch nationwide data set, the "Actueel Hoogtebestand Nederland (AHN)". It has a ground-point resolution in forested areas of at least 16 meters, and the data set has an accuracy with a standard deviation of 15 centimeters. Their aim was to identify Celtic fields dating to 800 B.C. to 0 in forested regions. These sites are identified as areas of approximately 40 meters square enclosed by low earthen walls. Over time, it is speculated, forests overgrew some of these sites, hiding the fields from easy discovery by other survey and remote sensing methods (Humme, Lindenbergh, and Sueur, 2007).

The authors investigate coordinate points in an area near Doorwerth in the eastern Netherlands. They build upon an earlier study of this kind by Sueur in the same region, in which Sueur applied an illumination technique from a software visualization program to ground points in the data set to expose the micro-relief of Celtic fields. Humme, Lindenbergh, and Sueur seek to improve upon the earlier study. The authors apply a filter based on the geostatistical kriging method to remove large-scale topography from the data and better expose the micro-relief. Relief was reduced from 70 meters to 50 centimeters. By removing large topography and leaving behind small topography, roads, footpaths, and earthen walls are better revealed. In addition, the method brings out previously unknown fields adjacent to the fields exposed in the earlier study (Humme, Lindenbergh, and Sueur, 2007).
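The idea of removing the large-scale trend so micro-relief stands out can be loosely imitated with a much simpler moving-average high-pass filter. In the sketch below, the synthetic DEM, window size, and function names are all illustrative, and the moving average is a crude stand-in for the authors' kriging-based trend surface, not their actual method:

```python
def local_mean(grid, i, j, r):
    """Mean elevation in a (2r+1)-cell window, clipped at the grid edges."""
    rows, cols = len(grid), len(grid[0])
    vals = [grid[a][b]
            for a in range(max(0, i - r), min(rows, i + r + 1))
            for b in range(max(0, j - r), min(cols, j + r + 1))]
    return sum(vals) / len(vals)

def micro_relief(grid, r=2):
    """Subtract the smoothed large-scale surface, leaving the micro-relief."""
    return [[grid[i][j] - local_mean(grid, i, j, r)
             for j in range(len(grid[0]))] for i in range(len(grid))]

# Synthetic DEM: a regional slope of 0.1 m per cell with a 0.5 m earthen
# wall along row 5 -- the kind of feature the Celtic-field study targets.
dem = [[0.1 * i + (0.5 if i == 5 else 0.0) for j in range(10)]
       for i in range(10)]
residual = micro_relief(dem)

# The wall stands out in the residual even though the regional slope
# spans a full meter across the grid.
print(max(residual[5]))
```

Kriging does the same job far better because it models spatial correlation explicitly and interpolates the trend from the irregular ground points; the high-pass intuition, however, is the same.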

There are an ever growing number of projects one can undertake with LIDAR data. LIDAR data are used in creating flood risk maps, conducting oil surveys, surveying and planning pipeline corridors, studying forests, and much more (Crow, Benham, Devereux, and Amable, 2007; Humme, Lindenbergh, and Sueur, 2007; MOSAIC, 2001). The potential for this quick, accurate, and inexpensive mapping tool is vast. One only needs to apply this remote sensing technique with attention to detail to acquire data useful for one's questions.

Sources

Crow, P., Benham, S., Devereux, B. J., and Amable, G. S. (2007) Woodland vegetation and its implications for archaeological survey using LiDAR [Electronic version]. Forestry, 80(3), 241-252. Retrieved December 12, 2007, from http://forestry.oxfordjournals.org.ezaccess.libraries.psu.edu/cgi/content/full/80/3/241

Department of the Army, U.S. Army Corps of Engineers (2002) Engineering and design: Photogrammetric mapping, Chapter 11, Airborne LIDAR topographic surveying, pp. 11-1 to 11-12. Publication number EM 1110-1-1000. Retrieved December 16, 2007, from ftp://ftp-fc.sc.egov.usda.gov/NCGC/products/elevation/coe-lidar-specs.pdf

DiBiase, D. (1999-2007) The nature of geographic data, Lesson 2 Part IV Section B, Summary; Lesson 7 Part II Section A; and Lesson 8, Part I Section A and Part IV. The Pennsylvania State University World Campus Certificate Program in GIS. Retrieved November 29, 2007.

Harding, D. J. (2000) Principles of airborne laser altimeter terrain mapping. Retrieved December 15, 2007, from http://pugetsoundlidar.ess.washington.edu/laser_altimetry_in_brief.pdf

Humme, A., Lindenbergh, R., and Sueur, C. (2007) Revealing Celtic fields from lidar data using kriging based filtering. Retrieved December 12, 2007, from http://www.isprs.org/commission5/proceedings06/paper/1241_Dresden06.pdf

MOSAIC Mapping Systems Inc. (2001) A white paper on LIDAR mapping. Retrieved December 12, 2007, from ftp://ftp-fc.sc.egov.usda.gov/NCGC/products/elevation/lidar-applications-whitepaper.pdf

National Research Council of the National Academies (2007) Elevation data for floodplain mapping. The National Academies Press, sections 4.1.1 (pp. 57-59) and 4.3.1 (p. 72). http://books.nap.edu/openbook.php?record_id=11829&page=57, http://books.nap.edu/openbook.php?record_id=11829&page=58, http://books.nap.edu/openbook.php?record_id=11829&page=59, http://books.nap.edu/openbook.php?record_id=11829&page=72

National Aeronautics and Space Administration (1998) Fact sheets: Remote sensing and lasers: FS-1998-03-35-LaRC. Retrieved December 16, 2007, from http://www.nasa.gov/centers/langley/news/factsheets/RemoteSensing.html

National Oceanic and Atmospheric Administration (2007a) About LIDAR. Retrieved December 12, 2007, from http://www.csc.noaa.gov/products/sccoasts/html/tutlid.htm

National Oceanic and Atmospheric Administration (2007b) Collecting LIDAR data. Retrieved December 21, 2007, from http://www.csc.noaa.gov/products/sccoasts/html/animate.htm

National Oceanic and Atmospheric Administration (2007c) Project overview. Retrieved December 20, 2007, from http://www.csc.noaa.gov/products/sccoasts/html/pdecript.htm

National Oceanic and Atmospheric Administration (2007d) LIDAR: Surveying South Carolina's beaches. Retrieved December 20, 2007, from http://www.csc.noaa.gov/products/sccoasts/html/ocrm_app.htm

Puget Sound LIDAR Consortium (2005) Example 1. South end Rockaway Beach/NE corner Blakely Harbor, SE Bainbridge Island, Washington. Retrieved December 21, 2007, from http://pugetsoundlidar.ess.washington.edu/example1.htm

Puget Sound LIDAR Consortium (2005) Example 2. One-square-mile area on Bainbridge Island. Retrieved December 21, 2007, from http://pugetsoundlidar.ess.washington.edu/example2.htm

Sica, R. J. (1999) Exploring the atmosphere with lidar. University of Western Ontario, Canada. Retrieved December 16, 2007, from http://pcl.physics.uwo.ca/science/lidarintro/

Satale, D. M. and Kulkarni, M. N. (n.d.) LiDAR in Mapping. Retrieved December 15, 2007, from http://www.gisdevelopment.net/technology/gis/mi03129pf.htm

Wikipedia (2007) Lidar. Retrieved December 12, 2007, from http://en.wikipedia.org/wiki/LIDAR

Light Detection and Ranging (LIDAR)


Contents


1 Introduction & History
  1.1 What is the technology and how is it used?
  1.2 Why is it interesting or important?
  1.3 How was it discovered or developed and by whom?
2 Analysis
  2.1 Physics of the technology in detail
3 Conclusion
  3.1 What is the future of this technology?
  3.2 How is this technology viewed by the public and how does it differ from the scientific community?
4 Works Cited

Introduction & History


What is the technology and how is it used?
LIDAR, also known as light detection and ranging, was originally developed to help map out and survey archaeological sites and areas where traditional surveying would be too expensive or time consuming. LIDAR uses pulses of light instead of radio frequencies to scan over the target area. These pulses are reflected back to sensors which process the data and convert it to reflect how the surface of the target area is shaped and how elevations vary, from mountains to man-made structures. LIDAR is being used on airplanes and helicopters to survey areas such as Stonehenge. By using LIDAR, archaeologists have been able to find new areas of Stonehenge that had been overlooked or undetected by traditional methods (Irish, 1998). It uses a variety of techniques in order to provide elevation measurements within six inches of the actual height. Some of these techniques include global positioning systems, very precise navigation systems, laser range finders, and high-speed computers to process the data being collected. Once the survey is complete, the data collected can be uploaded to computers back in the lab, where digital elevation models (DEMs) can be immediately accessed to start the analysis process of the survey (Shamayleh, 2003).

Image Courtesy of NASA

Airborne LIDAR is also used to detect depths of different bodies of water, provided they are clear enough for lasers to penetrate. A device called a transceiver, which transmits and receives the laser signals, is mounted on the bottom of the airborne vehicle. When the laser hits the water, some of the energy bounces off the surface of the water, and the other portion makes its way to the bottom of the body of water and then bounces back. The portion that bounces off the surface is called the surface return, and the portion that bounces off the bottom is called the bottom return. How long it takes for these different returns to make their way back to the transceiver determines how deep the body of water is. Factors such as how much of the laser's energy is absorbed by the water, the scattering of the laser in the water, and the refraction of the beam of light limit the depth of water that LIDAR can detect. The maximum detectable water depth is proportional to what is called the diffuse attenuation coefficient, given as the variable K and measured in units of inverse length. LIDAR systems are able to produce data as long as the product of the water depth and K is less than 4 during the day and 5 at night (Irish, 1998). In 1994, a LIDAR system known as SHOALS, or scanning hydrographic operational airborne LIDAR system, was developed by the US Army Corps of Engineers (USACE) to collect data in locations such as the Atlantic and Pacific Oceans, the Gulf of Mexico, the Great Lakes, and areas around the Caribbean islands. The data collected by SHOALS can be used to determine the conditions of coastlines and navigation channels in order to determine how these areas have changed over a given time period (Irish, 1998).

There is also a form of LIDAR that can be used on the ground as opposed to in the air. This form is mounted on a tripod and sends out millions of laser pulses, from which the data can be converted to a three-dimensional image of the area. Highly advanced LIDAR systems can even determine how reflective the material being viewed is, or even the colors of the target. These land LIDAR systems, along with the aerial LIDAR systems, can be used to create a Digital Terrain Model (DTM), which can then be used to study natural phenomena like landslides and earthquakes (Baldo, 2009).
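The surface-return/bottom-return timing in the bathymetric case above reduces to two-way travel-time arithmetic. A minimal sketch, in which the 1.33 refractive index, the nadir geometry, and the example numbers are illustrative assumptions:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s
N_WATER = 1.33     # approximate refractive index of sea water

def water_depth(surface_t, bottom_t):
    """Depth from the time gap between surface and bottom returns (nadir case).

    Light slows to c/n in water and travels down and back, hence the /2.
    """
    dt = bottom_t - surface_t
    return dt * (C / N_WATER) / 2.0

def detectable(depth_m, k_per_m, night=False):
    """Rough feasibility check: usable while K * depth < 4 by day, 5 at night."""
    return k_per_m * depth_m < (5.0 if night else 4.0)

# A surface-to-bottom gap of about 88.7 nanoseconds corresponds to
# roughly 10 m of water.
d = water_depth(0.0, 88.7e-9)
print(round(d, 2), detectable(d, k_per_m=0.2))
```

A real bathymetric processor must also correct for the off-nadir scan angle and the refraction of the beam at the air-water interface, which this vertical-incidence sketch ignores.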

Why is it interesting or important?


LIDAR is interesting because it uses light instead of radio frequencies to detect the distance and speed of stationary or moving objects. Since light has a shorter wavelength than its radio counterpart, it is more sensitive to aerosol and cloud particles. It is important to archaeologists because they use it in airplanes to scan areas of the earth for research. Airborne laser scanning (ALS), "a new technique to produce highly accurate measurements of the topography of the Earth's surface" (Doneus, 2006), is one such form of LIDAR. The device responsible for scanning the environment is usually mounted below an airplane or helicopter (Doneus, 2006). An example of this device being used on an aircraft is shown below. To measure the difference in height between different geological features, the laser scanner emits short infrared pulses in different directions across the measurement site (Doneus, 2006). A photodiode then records the backscattered echo and uses it to determine the time the signal took to travel to and from the ground (Doneus, 2006). The equation used to figure this out is explained later, in the analysis section of this article. ALS works well for archaeology because archaeologists need detailed maps of archaeological sites, and ALS is a technique that can fulfill that need. It also has a high probability of being used for detection of archaeological sites in densely covered areas of the world (Doneus, 2006). LIDAR is also important to major local and federal government programs like NASA, the U.S. Geological Survey, and the U.S. Army Corps of Engineers. These government organizations use LIDAR to survey topography and the atmosphere. One instance of LIDAR use came when NASA, the U.S. Geological Survey, and the U.S. Army Corps of Engineers were exploring the use of airborne laser mapping systems to map the changes that Hurricane Katrina made to the entire coastline (Military & Aerospace Electronics, 2006).
They used data from the laser mapping systems taken before and after the hurricane to compare the patterns of coastal change caused by erosion, and also to assess the magnitudes of destruction to buildings and infrastructure (Military & Aerospace Electronics, 2006). Two different systems were used during the evaluation: NASA's Experimental Airborne Research Lidar (EAARL) and the Army Corps of Engineers' Compact Hydrographic Airborne Rapid Total Survey (CHARTS) (Military & Aerospace Electronics, 2006). The USACE routinely evaluates its coastal navigation structures as part of its maintenance management program (US Army Corps of Engineers, 1996). Since SHOALS collects high-density bathymetry and topography remotely, it can completely map these difficult-to-survey structures. The SHOALS data of a rubble-mound structure identify and quantify any breaches or loss of elevation along a structure's length, displacement and settling of groups of armor stone, and side-slope defects such as steepening. Furthermore, the SHOALS bathymetry identifies the location and dimensions of any scour features. In addition to the bathymetry and topography, the recorded video image of the structure, collected during the mission flight, may aid in further evaluating the above-water condition of the structure. For example, core stone exposure through gaps in the armor stone and large armor stone fractures may be identified. These features' locations can also be approximated from the video.

LIDAR technology utilizes the Global Positioning System (GPS), precision inertial navigation systems, laser range finders, and high-speed computing for data collection. LIDAR systems on airborne platforms (e.g., an airplane or helicopter) usually measure the distance between an object the laser beam hits and the airborne platform carrying the system. Airborne laser mapping instruments are active sensor systems, as opposed to passive imagery such as cameras. With LIDAR, it is possible to obtain elevation information on large tracts in relatively short time; elevation data obtained with LIDAR can be accurate to within 6 inches. A LIDAR system uses the speed of light to determine distance by measuring the time it takes for a light pulse to reflect back from a target to a detector. A laser emitter can send about 5,000 pulses per second, but due to the high speed of light, a detector can sense the reflected pulse before the next one is sent. Following a data collection flight, the data tapes are transferred to a ground-based computer, where a display of recorded data is immediately available. LIDAR systems produce data that can be used in digital elevation models (DEMs). The high density of elevation points makes it possible to create high-resolution DEMs. LIDAR has been effectively used in several applications, including highway location and design and highway safety.
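The time-of-flight ranging and the 5,000 pulses-per-second figure above can be sanity-checked in a few lines (the 300 m altitude is an illustrative value, not from the source):

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_time(round_trip_s):
    """Distance to the target: light covers the path twice, so halve c * t."""
    return C * round_trip_s / 2.0

# From 300 m altitude a pulse returns in about 2 microseconds --
# far sooner than the 200 microseconds between pulses at 5,000 Hz,
# which is why the detector sees each echo before the next pulse leaves.
round_trip = 2 * 300.0 / C
print(range_from_time(round_trip))  # ~300.0
print(round_trip < 1.0 / 5000.0)    # True
```

The same arithmetic shows that centimeter-level ranging demands sub-nanosecond timing: 15 cm of range corresponds to only about one nanosecond of round-trip time.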

Courtesy of Simon

How was it discovered or developed and by whom?


LIDAR is a fairly new technology: concepts for using it to measure depths of water emerged in the early 1960s, although the first studies and prototypes did not come out until the mid 1960s. The Scripps Institution of Oceanography had a big part in the development of LIDAR and in demonstrating how well it would work in performing surveys, along with the Syracuse University Research Center, the US Navy, the Canadian government, and the Australian government, all of whom contributed to the first LIDAR system (Irish, 1998).

The development of the second generation of LIDAR started in 1975 by the National Aeronautics and Space Administration. The second generation was designed to be mounted on an aerial vehicle and was called the Airborne Oceanographic LIDAR, or AOL. AOL was used to determine how effective LIDAR was in measuring the depths of bodies of water and surveying the areas around them. After tests in 1977, LIDAR proved to be a very effective way to conduct surveys. Other countries, such as Australia, Canada, and the Soviet Union, also tested the second generation LIDAR system and reported similar positive results. The first operational LIDAR system, the Larsen 500, was developed in the mid 1980s by two organizations, Optech and the Canadian Center for Remote Sensing, for Canada, and became the first system to conduct surveys in the Northwest Territories. Sweden developed the first LIDAR system for a helicopter, called FLASH, in 1989, along with the US and Australia. Today's generation of LIDAR systems, LADS (Laser Airborne Depth Sounder), was developed for the Royal Australian Navy in an effort to survey and map the coral reefs off the coast of Australia, making it easy to locate new routes that ships could cruise through. Optech also developed similar helicopter-type LIDAR systems for USACE, the Swedish National Maritime Administration, and the Swedish Navy. In 1994, the USACE used this system to develop SHOALS and began using it to conduct surveys and depth analysis. Sweden was developing its own system, Hawk Eye, at about the same time as SHOALS was being developed. The only difference between the two systems is the algorithm used to analyze the data: the Swedes used a program that Saab developed, whereas SHOALS used a program that the National Oceanographic and Atmospheric Administration (NOAA) developed. The Swedes were able to use their system not only to analyze the coastline but also to detect submarines (Irish, 1998).

Analysis
Physics of the technology in detail
LIDAR can best be explained by describing the individual parts of the LIDAR equation, which makes LIDAR possible. As stated in the introduction to this article, LIDAR stands for Light Detection and Ranging. In this section the different components of the LIDAR equation, which consist of K, G(R), β(R), and T(R), are explained. Each component of the LIDAR equation has its own equation. LIDAR is used by many agencies and professions, such as law enforcement and geologists, but most users do not know the details behind how LIDAR works. This section explains the different parts of the LIDAR equation and how they come together to form one single equation, which is displayed at the end of this section.

LIDAR measurement principle
The formula shown below is used to find the distance R of the scattering volume from the time delay of the signal. All of this is done by sending a pulsed laser beam into the atmosphere. The beam hits air molecules in the atmosphere and is scattered in all directions. The small fraction of the light that returns to the source is collected by telescopes and sent into a detection unit, and the formula then gives the distance from which the scattering came. From the time delay, the distance R of the scattering volume can be determined and is expressed as (Wandinger, 2005):
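The range formula itself was an image in the original; reconstructed from the surrounding description, and consistent with the cited Wandinger (2005) chapter, it is:

```latex
R = \frac{c\,t}{2}
```

where $t$ is the measured round-trip time of the pulse and $c$ the speed of light; the factor 2 accounts for the light traveling out and back.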

The LIDAR equation
This is the equation used to measure the power of the backscattered light that is received by the telescopes on the ground (Wandinger, 2005).

The equation P(R,λ) displayed above relies on four parts, which are listed and described below (Wandinger, 2005).

K - system constant
According to Wandinger, "P0 is the average power of a single laser pulse and τ the temporal pulse length; the pulse energy is derived by E0 = P0τ. ΔR = (cτ)/2 is the effective pulse length. A is the area of the primary reception optics and η the overall system efficiency (optical transmission from emitter to receiver and detection efficiency)." The equation below shows how K is an experimentally controlled parameter.
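The system-constant equation was also lost with the image; from the symbols defined in the paragraph above (Wandinger, 2005), it reads:

```latex
K = P_0\,\frac{c\tau}{2}\,A\,\eta
```

Every factor here is fixed by the instrument design, which is why $K$ can be controlled experimentally.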

G(R) - range-dependent measurement geometry
Wandinger states that "the geometry factor contains the overlap function of the laser beam with the receiver field of view, called O(R), and an R^-2 dependence. The latter is obtained from the fact that the area of reception is part of a sphere's surface with radius R, centered in the scattering volume." The equation below relates G(R) to the main equation that measures the power of the backscattered LIDAR beam.
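Reconstructed from that description (Wandinger, 2005), the geometry factor is:

```latex
G(R) = \frac{O(R)}{R^{2}}
```

The $1/R^{2}$ term alone explains why lidar signals fall off so steeply with range, even in a perfectly clear atmosphere.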

β(R) - backscatter coefficient
The backscatter coefficient is the parameter that determines the strength of the signal received by the telescopes on the ground; as the name suggests, it describes how much light is scattered in the backward direction (Wandinger, 2005). "Given that Nj is the concentration of scattering particles of kind j and dσj,sca(π,λ)/dΩ is the differential backscatter cross section of the particles, the backscatter coefficient results in the sum over all kinds of scatterers" (Wandinger, 2005). The equation below is used to find the backscatter coefficient of a returning scattered LIDAR signal from the atmosphere.
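Reconstructed from the quoted definition (Wandinger, 2005), the backscatter coefficient is:

```latex
\beta(R,\lambda) = \sum_{j} N_{j}(R)\,
  \frac{\mathrm{d}\sigma_{j,\mathrm{sca}}(\pi,\lambda)}{\mathrm{d}\Omega}
```

where the differential cross section is evaluated at the scattering angle $\pi$, i.e. directly backward toward the receiver.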

Displayed below is another equation that writes the backscatter coefficient β(R,λ) as a sum (Wandinger, 2005).
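In Wandinger's chapter the backscatter coefficient is also split into its two atmospheric contributions, which is presumably the sum the lost image showed:

```latex
\beta(R,\lambda) = \beta_{\mathrm{mol}}(R,\lambda) + \beta_{\mathrm{aer}}(R,\lambda)
```

with the molecular part dominated by air molecules and the aerosol part by particles such as dust, pollen, and droplets.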

T(R) - transmission term
"That part of the light which gets lost on the way from the LIDAR to the scattering volume is expressed by the transmission term in the equation below" (Wandinger, 2005).

This term comes from a specific form of the Lambert-Beer-Bouguer law for LIDAR (Wandinger, 2005). "The integral describes the way from the LIDAR to the location of backscattering at R. The factor 2 represents the forward and backward travel of the light. The extinction coefficient α(R,λ) is computed similarly to the backscatter coefficient, as a product of concentration and extinction cross section σj,ext for every type of scatterer" (Wandinger, 2005). The equation below shows how T(R) is derived from the specific form of the Lambert-Beer-Bouguer law for LIDAR (Wandinger, 2005).
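Reconstructed from that description (Wandinger, 2005), the transmission term and the extinction coefficient it integrates are:

```latex
T(R,\lambda) = \exp\!\left[-2\int_{0}^{R}\alpha(r,\lambda)\,\mathrm{d}r\right],
\qquad
\alpha(R,\lambda) = \sum_{j} N_{j}(R)\,\sigma_{j,\mathrm{ext}}(\lambda)
```

The factor 2 in the exponent is the two-way path, matching the quoted explanation.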

Hopefully, from what you have seen and read in the analysis part of this article, you will have a better understanding of what the main LIDAR equation is comprised of and why these terms are included in the equation. If you still do not know how to piece the equation together, it is displayed below in full.
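Assembling the four components described above into the full lidar equation (reconstructed, consistent with Wandinger, 2005):

```latex
P(R,\lambda) = K\,G(R)\,\beta(R,\lambda)\,T(R,\lambda)
             = P_0\,\frac{c\tau}{2}\,A\,\eta\;\frac{O(R)}{R^{2}}\;
               \beta(R,\lambda)\,
               \exp\!\left[-2\int_{0}^{R}\alpha(r,\lambda)\,\mathrm{d}r\right]
```

Only $\beta$ and $\alpha$ carry atmospheric information; $K$ and $G(R)$ are instrument properties, which is what makes inversion of the measured power $P(R,\lambda)$ possible.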

Conclusion
What is the future of this technology?
The continued advances in LIDAR technology with respect to resolution, cost, size, and data processing promise to continue the evolution of traditional applications along with the emergence of entirely new ones. The use of LIDAR to gather data about the environment in the field of oceanography is now being expanded to include research on the dynamics and composition of the atmosphere. In particular, the troposphere contains valuable information about long-range weather patterns as well as environmental pollution. With the advent of increased resolution in LIDAR systems, it is now believed that four-dimensional input data can be obtained. This new dimension in the data will make it possible to know the density of air throughout the troposphere (Hays 3).

Imaage is Courtesy of NASA In response to this breakthrough the National Research Council (NRC) has requested a demonstration in the 2011-2012 time frame (Hays 3). In addition to this research on the atmospheric environment, LIDAR is also being harnessed to better understand global climate change. By using a LIDAR sensor operating in the near infrared, scientists hope to be able to track the movement of greenhouse gases in the troposphere where their movements can be most easily tracked (Sonnenfroh 1). By better understanding the sources and sinks of the carbon cycle, scientists will be better informed to make recommendations in the fight against global warming. Along with incredible advances made in environmental study, LIDAR is also revolutionizing the field of robotics. No where is this more dramatically demonstrated then in the Defense Advanced Research Projects Agencys (DARPA) Grand Challenge. The events purpose was to have autonomous vehicles navigate through the Mojave Desert while avoiding obstacles and obeying traffic laws. All of the final contestants used Velodyne's HDL-64E sensor to some degree to create a 3D map of the terrain and obstacles (Velodyne 1). Over the course of the contest (2003-2006) the LIDAR system substantially increased resolution by replacing the fixed electronics and rotating mirrors of traditional systems with a system of sixty four independent lasers revolving at nine hundred rpm that were able to collect roughly a million measurements per second (Velodyne 1). Just as extraordinary is the incredible reduction in size and weight of the system, as can be seen in the

images below. This feat promises to make LIDAR ubiquitous, since it can now be hosted on a much wider variety of platforms. These platforms may include everything from collision-avoidance sensors on standard automobiles to inexpensive installations on boats to support wetland conservation in developing nations.

Courtesy of the Velodyne Corporation
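The throughput figures quoted for the HDL-64E can be sanity-checked with a little arithmetic. A minimal sketch, using only the numbers given in the text (64 lasers, 900 rpm, roughly one million measurements per second); the derived per-laser and per-revolution rates are our own back-of-the-envelope values, not manufacturer specifications:

```python
# Sanity check of the quoted HDL-64E throughput figures.
# Assumed inputs (from the text): 64 lasers, 900 rpm, ~1e6 measurements/s.
NUM_LASERS = 64
RPM = 900
TOTAL_RATE = 1_000_000  # measurements per second (approximate)

rev_per_sec = RPM / 60                     # 900 rpm -> 15 revolutions/s
rate_per_laser = TOTAL_RATE / NUM_LASERS   # ~15,625 pulses/s per laser
points_per_rev = TOTAL_RATE / rev_per_sec  # points collected per 360° sweep

print(rev_per_sec, rate_per_laser, round(points_per_rev))
```

At roughly 66,000 points per full rotation, the quoted figures are mutually consistent with a dense 3D sweep many times per second, which is what made real-time obstacle mapping feasible for the Grand Challenge vehicles.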

How is this technology viewed by the public and how does it differ from the scientific community?
LIDAR is not as well known as the related technology RADAR. The general public is aware of only a few of its uses (police LIDAR speed enforcement, for example), but the number of applications is ever increasing. Needless to say, it is better known in the scientific community because of the many ways it can be used, including in meteorology, physics, and military contexts. How it works is well understood only within the advanced scientific community, owing to the complexities of the LIDAR measurement principle and the LIDAR equation described above; as a result, LIDAR is mostly used for scientific applications.
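At its core, the ranging part of the measurement principle mentioned above reduces to simple time-of-flight arithmetic: the sensor emits a laser pulse and measures the delay until the echo returns. A minimal sketch (the speed of light is real; the example delay value is purely illustrative):

```python
# Basic LIDAR ranging: distance from the round-trip delay of a pulse.
C = 299_792_458  # speed of light in vacuum, m/s

def range_from_delay(delta_t_s: float) -> float:
    """Distance to target given the round-trip pulse delay in seconds."""
    # Divide by 2 because the pulse travels out to the target and back.
    return C * delta_t_s / 2

# An echo arriving ~6.67 microseconds after emission puts the target
# roughly 1 km away (illustrative value, not from the text).
print(range_from_delay(6.67e-6))
```

The "LIDAR equation" proper additionally models the returned signal power (transmitted energy, atmospheric transmission, target backscatter, receiver aperture); the sketch above covers only the geometric ranging step.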

Works Cited
"NASA airborne laser mapping systems studies Katrina damage." Military & Aerospace Electronics 17.1 (Jan. 2006): 6-6. Academic Search Premier. EBSCO. 30 Mar. 2009 <http://www.lib.ncsu.edu/cgi-bin/proxy.pl? server=http://search.ebscohost.com/login.aspx? direct=true&db=aph&AN=19964915&site=ehost-live&scope=site>. Baldo, Marco . "LIDAR monitoring of mass wasting processes: The Radicofani landslide, Province of Siena, Central Italy ." Geomorphology 10515 Apr 2008 193-

201. 2 Apr 2009 <http://www.sciencedirect.com/science? _ob=ArticleURL&_udi=B6V93-4TKPV7S2&_user=290868&_coverDate=04%2F15%2F2009&_alid=895446987&_rdoc=7&_f mt=high&_orig=search&_cdi=5887&_sort=d&_st=4&_docanchor=&_ct=1088&_acct =C000015398&_version=1&_urlVersion=0&_userid=290868&md5=bea22e79d929 b8d5f7c0757acff82cc3>. Doneus, Michael. "Full-waveform airborne laser scanning as a tool for archaeological reconnaissance." From Space To Place 2006 99-105. 29 Mar 2009 <http://w3.riegl.com/uploads/tx_pxpriegldownloads/Doneus_ROME_01.pdf>. Hays, Paul. "Space-based Doppler Winds LIDAR: A Vital National Need." Michigan Aerospace Corporation. Accessed 5/6/09. http://space.hsv.usra.edu/LWG/Splash %20Papers/Hays.pdf Irish, J.L. "Coastal engineering applications of high-resolution lidar bathymetry." Coastal Engineering 35(1998) 47-71. 2 Apr 2009 <http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VCX-3V4J12N4&_user=290868&_coverDate=10%2F31%2F1998&_rdoc=1&_fmt=full&_orig=sear ch&_cdi=5966&_sort=d&_docanchor=&view=c&_acct=C000015398&_version=1&_ urlVersion=0&_userid=290868&md5=d10c11e7cf36931c59d2d919c4f11dc6#bb1>. NASA.Earth Observatory,.Pictures. Accessed 5/6/09. http://earthobservatory.nasa.gov/Features/Lidar/lidar3.php Shamayleh,, Huda. "Proceedings of the 2003 Mid-Continent Transportation Research Symposium,." Iowa State. 05 Aug 2003. Department of Civil Engineering University of Nebraska-Lincoln. 2 Apr 2009 <http://www.ctre.iastate.edu/pubs/midcon2003/ShamaylehLIDAR.pdf>. Simon. LIDAR_Aircraft_Pic.JPG/Picture.Accessed 5/6/09. http://www.sanctuarysimon.org/monterey/sections/other/whats_new_lidar.php Sonnenfroh, D. M.(2005) "Satellite Lidar for Global Warming Gas Measurement" American Geophysical Union. Accessed 5/6/09. http://adsabs.harvard.edu/abs/2005AGUFM.A21D0907S Velodyne Corp."History and Vision". Accessed 5/6/09. 
http://www.velodyne.com/lidar/vision/default.aspx Wandinger, U.: Introduction to lidar, in Lidar - Range-resolved optical remote sensing of the atmosphere, C. Weitkamp (Ed.), Springer, New York (2005).