
Remote Sensing and GIS

Remote Sensing-I
Remote Sensing is the science and art of
obtaining information about an object,
area or phenomenon through the analysis
of data acquired by a device that is not in
contact with the object, area or
phenomenon under investigation.

The two basic processes involved are data acquisition and data analysis.
The elements of the data acquisition process are:
- energy sources
- propagation of energy through the atmosphere
- energy interactions with earth surface features
- retransmission of energy through the atmosphere
- airborne and/or spaceborne sensors
- generation of sensor data in pictorial and/or digital form

In short, we use sensors to record variations in the way earth surface features reflect and emit electromagnetic energy.
The data analysis process involves
examining the data using various viewing
and interpretation devices to analyse
pictorial data and/or a computer to
analyse digital sensor data.

Reference data about the resources being studied (such as soil maps, crop statistics, or field check data) are used when and where available to assist in the data analysis.
With the aid of reference data, the analyst extracts information about the type, extent, location and condition of the various resources over which the sensor data were collected.

This information is then compiled, generally in the form of hard copy maps and tables or as computer files that can be merged with other layers of information in a geographic information system (GIS). Finally, the information is presented to users, who apply it to their decision-making process.

Remote Sensing Terms


Electromagnetic radiation (EMR): The energy transported by the propagation of electromagnetic waves, such as sunlight. Detection and measurement of EMR is generally the basis of remote sensing.
Reflectance: A measure of the ability of a surface to reflect energy. Mathematically, the ratio of the radiant energy reflected by an object to that incident upon it is called reflectance.

Brightness values or digital values: Remote sensing systems record the amount of reflected or emitted energy from the earth's surface. These data are stored as brightness values or digital values. The greater the brightness of the scene, the higher the digital value. Hence, digital values stored on magnetic media are often referred to as brightness values.

Spectral band: A range of wavelengths in the electromagnetic spectrum; a series of intensity values measured and stored on magnetic media within a selected wavelength interval. It is generally denoted as Δλ.
Panchromatic: A single band sensitive to a wide spectral range, such as all visible light.

Multispectral: Sensing in two or more spectral bands.


Spectral signature: Quantitative
measurement of the properties of an
object at one or several wavelengths.
Spatial resolution: The smallest object on
the ground that can be seen on the image.
Multitemporal: Two or more images that
have been recorded over the same area
but at different times/dates.

Sensor: A device that receives electromagnetic radiation, converts it into a signal and presents it in a form suitable for obtaining information on an object.
Pixel: A picture element having both spatial and spectral aspects.
Image: The presentation of a scene by optical, electro-optical, opto-mechanical or electronic means.

False Colour Composite (FCC): A colour image produced by red, green and blue wavebands, so that the colours produced by the Earth's surface do not correspond to normal visual experience. The most commonly seen FCCs display the very near infrared as red, red as green and green as blue.

Computer Compatible Tapes (CCT): Magnetic tapes containing digital values.
Class: A surface characteristic type, such as forest or water, that is of interest to the investigator.
Thematic map: A map designed to illustrate a particular class or theme.
Ground truth: The data gathered from field visits or existing records and used to assist in interpreting imagery.

Image analysis: The process of analysing the data contained in an image and extracting the useful information.
Infrared: The part of the EMR spectrum between 0.7 and 20 μm wavelengths. This region is further subdivided into near infrared (0.7-1.3 μm), middle infrared (1.3-3.0 μm) and far infrared (7.0-15.0 μm).

Land use: The way in which the surface of the land is used, e.g., agriculture, urban use.
Land cover: The surface cover of the earth, e.g., water, forest.

Visible light is only one of many forms of electromagnetic energy. Radio waves, ultraviolet rays and X-rays are other familiar forms. All this energy is inherently similar and radiates in accordance with basic wave theory.
In remote sensing, it is most common to categorize electromagnetic waves by their wavelength location within the electromagnetic spectrum.
The most prevalent unit used to measure wavelength along the spectrum is the micrometer (μm). A micrometer equals 1×10^-6 m.

Ultraviolet (UV) energy adjoins the blue end of the visible portion of the spectrum. Adjoining the red end of the visible region are three different categories of infrared (IR) waves: near IR (from 0.7 to 1.3 μm), mid IR (from 1.3 to 3 μm), and thermal IR (beyond 3 μm). At much longer wavelengths (1 mm to 1 m) is the microwave portion of the spectrum.
Most common sensing systems operate in one or several of the visible, IR, or microwave portions of the spectrum.

Although many characteristics of electromagnetic radiation are most easily described by wave theory, another theory offers useful insights into how electromagnetic energy interacts with matter. This theory, the particle theory, suggests that electromagnetic radiation is composed of many discrete units called photons or quanta. The energy of a quantum is given as

Q = hν

Where
Q = energy of a quantum, joules (J)
h = Planck's constant, 6.626×10^-34 J s
ν = frequency

We can relate the wave and quantum models of electromagnetic radiation behaviour by substituting c = νλ into the equation above, giving

Q = hc/λ

Thus we can see that the energy of a quantum is inversely proportional to its wavelength.
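To make the inverse relationship concrete, here is a small illustrative Python sketch (not part of the original text) that evaluates Q = hc/λ at a visible and a thermal IR wavelength:

```python
# Photon energy Q = h*c/lambda: the energy of a quantum falls as wavelength grows.
h = 6.626e-34  # Planck's constant, J s
c = 2.998e8    # speed of light, m/s

def photon_energy_joules(wavelength_um):
    """Energy of one quantum (J) at the given wavelength in micrometres."""
    return h * c / (wavelength_um * 1e-6)

print(photon_energy_joules(0.45))  # blue light, ~4.4e-19 J
print(photon_energy_joules(10.0))  # thermal IR, ~2.0e-20 J (roughly 20x less)
```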

The longer the wavelength involved, the lower its energy content. This has important implications in remote sensing from the standpoint that naturally emitted long wavelength radiation, such as microwave emission from terrain features, is more difficult to sense than radiation of shorter wavelengths, such as emitted thermal IR energy. Because of the low energy content of long wavelengths, sensors operating there must view large areas of the earth at any given time in order to obtain a detectable energy signal.

The sun is the most obvious source of electromagnetic radiation for remote sensing.
All matter at temperatures above absolute zero (0 K, or -273 °C) continuously emits electromagnetic radiation. Thus, terrestrial objects are also sources of radiation, though it is of considerably different magnitude and spectral composition than that of the sun.

How much energy any object radiates is, among other things, a function of the surface temperature of the object. This property is expressed by the Stefan-Boltzmann law, which states that

M = σT^4

Where
M = total radiant exitance from the surface of a material, watts per square metre (W m^-2)
σ = Stefan-Boltzmann constant, 5.6697×10^-8 W m^-2 K^-4
T = absolute temperature (K) of the emitting material
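As a quick illustration of how steeply exitance grows with temperature, the following sketch (illustrative, not part of the text) evaluates M = σT^4:

```python
# Total radiant exitance of a blackbody: M = sigma * T**4 (W/m^2).
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_kelvin):
    """Total energy emitted per unit area (W/m^2) by a blackbody at temp_kelvin."""
    return SIGMA * temp_kelvin ** 4

# Doubling the temperature multiplies the emitted energy by 2**4 = 16.
print(radiant_exitance(300))  # ~459 W/m^2 (earth ambient temperature)
print(radiant_exitance(600))  # ~7348 W/m^2, 16 times larger
```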

The particular units and the value of the constant are not critical for the student to remember, yet it is important to note that the total energy emitted from an object varies as T^4 and therefore increases very rapidly with increase in temperature.
It should be noted that this law is expressed for an energy source that behaves as a blackbody. A blackbody is a hypothetical, ideal radiator that totally absorbs and reemits all energy incident upon it.

Just as the total energy emitted by an object varies with temperature, the spectral distribution of the emitted energy also varies.
The dominant wavelength, or wavelength at which a blackbody radiation curve reaches a maximum, is related to its temperature by Wien's displacement law:

λm = A/T

Where
λm = wavelength of maximum spectral radiant exitance, μm
A = 2898 μm K
T = temperature, K
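Plugging in the two temperatures discussed in the surrounding text shows how directly the law locates the spectral peaks; in this illustrative sketch the 6000 K figure for the sun is a round approximation, not a value from the text:

```python
# Wien's displacement law: wavelength (um) of peak spectral exitance.
A = 2898.0  # um K

def peak_wavelength_um(temp_kelvin):
    return A / temp_kelvin

print(peak_wavelength_um(300))   # earth ambient ~9.66 um (thermal IR)
print(peak_wavelength_um(6000))  # sun, approximated as 6000 K: ~0.48 um (visible)
```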

For a blackbody, the wavelength at which the maximum spectral radiant exitance occurs varies inversely with the blackbody's absolute temperature. We observe this phenomenon when a metal body such as a piece of iron is heated: as the object becomes progressively hotter, it begins to glow and its color changes successively to shorter wavelengths, from dull red, to orange, to yellow, and eventually to white.

The earth's ambient temperature (that is, the temperature of surface materials such as soil, water and vegetation) is about 300 K (27 °C). From Wien's displacement law, this means the maximum spectral radiant exitance from earth features occurs at a wavelength of about 9.7 μm. Because this radiation correlates with terrestrial heat, it is termed thermal infrared energy. This energy can neither be seen nor photographed, but it can be sensed with such thermal devices as radiometers and scanners.

By comparison, the sun has a much higher energy peak that occurs at about 0.5 μm. Our eyes and photographic film are sensitive to energy of this magnitude and wavelength. Thus, when the sun is present, we can observe earth features by virtue of reflected solar energy. Once again, the longer wavelength energy emitted by ambient earth features can be observed only with a nonphotographic sensing system.

Certain sensors, such as radar systems, supply their own source of energy to illuminate features of interest. These systems are termed active systems, in contrast to passive systems that sense naturally available energy. A very common example of an active system is a camera utilizing a flash. The same camera used in sunlight becomes a passive sensor.

Planck's law
Planck's law describes blackbody radiation in terms of discrete quanta and defines the spectral exitance of a blackbody (Henderson 1970):

Mλ = C1 / (λ^5 [exp(C2/(λT)) - 1])

Where
C1 = 3.742×10^-16 W m^2
C2 = 1.4388×10^-2 m K
λ = wavelength in metres
T = temperature in kelvin
Mλ = spectral exitance per unit wavelength
The longer the wavelength, the lower is the energy content.
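A numerical sketch of this formula follows; it converts micrometres to metres so the SI constants above apply directly, and it checks that the 300 K curve peaks near the 9.7 μm value quoted earlier. The sample wavelengths are arbitrary illustration choices:

```python
# Spectral exitance of a blackbody via Planck's law.
import math

C1 = 3.742e-16   # first radiation constant, W m^2
C2 = 1.4388e-2   # second radiation constant, m K

def spectral_exitance(wavelength_um, temp_kelvin):
    """Spectral radiant exitance in W m^-2 per micrometre of wavelength."""
    lam = wavelength_um * 1e-6  # micrometres -> metres
    per_metre = C1 / (lam ** 5 * (math.exp(C2 / (lam * temp_kelvin)) - 1.0))
    return per_metre * 1e-6     # per metre -> per micrometre

# For T = 300 K the maximum lies near 9.7 um, consistent with Wien's law.
samples = {w: spectral_exitance(w, 300) for w in (5.0, 9.7, 20.0)}
print(max(samples, key=samples.get))  # 9.7
```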

Remote Sensing-II

Energy Interactions in the Atmosphere
Irrespective of its source, all radiation detected by remote sensors passes through some distance, or path length, of atmosphere. The net effect of the atmosphere varies with these differences in path length and also varies with the magnitude of the energy signal being sensed, the atmospheric conditions present, and the wavelengths involved.

We merely wish to introduce the notion that the atmosphere can have a profound effect on, among other things, the intensity and spectral composition of radiation available to any sensing system. These effects are caused principally through the mechanisms of atmospheric scattering and absorption.

Scattering
Atmospheric scattering is the unpredictable diffusion of radiation by particles in the atmosphere.
Rayleigh scatter is common when radiation interacts with atmospheric molecules and other tiny particles that are much smaller in diameter than the wavelength of the interacting radiation. The effect of Rayleigh scatter is inversely proportional to the fourth power of wavelength. Hence there is a much stronger tendency for short wavelengths to be scattered by this mechanism than long wavelengths.
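The λ^-4 dependence is easy to quantify; in this illustrative sketch the 0.7 μm reference is an arbitrary stand-in for red light:

```python
# Rayleigh scattering strength varies as 1/lambda**4.
def rayleigh_relative(wavelength_um, reference_um=0.7):
    """Scattering at wavelength_um relative to a red reference wavelength."""
    return (reference_um / wavelength_um) ** 4

print(rayleigh_relative(0.4))   # blue: scattered ~9x more strongly than red
print(rayleigh_relative(0.55))  # green: ~2.6x more strongly than red
```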

A blue sky is a manifestation of Rayleigh scatter. In the absence of scatter, the sky would appear black. As sunlight interacts with the earth's atmosphere, it scatters the shorter (blue) wavelengths more dominantly than the other visible wavelengths. Consequently we see a blue sky. At sunrise and sunset, however, the sun's rays travel through a longer atmospheric path length than during midday. With the longer path, the scatter (and absorption) of short wavelengths is so complete that we see only the less scattered, longer wavelengths of orange and red.

Rayleigh scatter is one of the primary causes of haze in imagery. Visually, haze diminishes the crispness, or contrast, of an image.

Mie scatter exists when atmospheric particle diameters essentially equal the wavelength of the energy being sensed. Water vapour and dust are major causes of Mie scatter. This type of scatter tends to influence longer wavelengths compared to Rayleigh scatter.

Nonselective scatter is a more bothersome phenomenon, which comes about when the diameters of the particles causing scatter are much larger than the wavelengths of the energy being sensed. Water droplets, for example, cause such scatter. They commonly have a diameter in the range 5 to 100 μm and scatter all visible and near- to mid-IR wavelengths about equally. Consequently this scattering is nonselective with respect to wavelength. In visible wavelengths, equal quantities of blue, green, and red light are scattered; hence fog and clouds appear white.

Absorption
In contrast to scatter, atmospheric absorption results in the effective loss of energy to atmospheric constituents. This normally involves absorption of energy at a given wavelength. The most efficient absorbers of solar radiation in this regard are water vapour, carbon dioxide, and ozone. Because these gases tend to absorb electromagnetic energy in specific wavelength bands, they strongly influence where we look spectrally with any given remote sensing system. The wavelength ranges in which the atmosphere is particularly transmissive of energy are referred to as atmospheric windows.

The figure shows the interrelationship between energy sources and atmospheric absorption characteristics. Figure (a) shows the spectral distribution of the energy emitted by the sun and by earth features. These two curves represent the most common sources of energy used in remote sensing.

In figure (b), spectral regions in which the atmosphere blocks energy are shaded. Remote sensing data acquisition is limited to the nonblocked spectral regions, called atmospheric windows. Note in figure (c) that the spectral sensitivity range of the eye (the visible range) coincides both with an atmospheric window and with the peak level of energy from the sun.

Emitted heat energy from the earth, shown by a small curve in (a), is sensed through the windows at 3 to 5 μm and 8 to 14 μm using such devices as thermal scanners. Multispectral scanners sense simultaneously through multiple, narrow wavelength ranges that can be located at various points in the visible through the thermal spectral region. Radar and passive microwave systems operate through a window in the region 1 mm to 1 m.

The important point to note from the figure is the interaction and the interdependence between the primary sources of electromagnetic energy, the atmospheric windows through which source energy may be transmitted to and from earth surface features, and the spectral sensitivity of the sensors available to detect and record the energy.

One cannot select the sensor to be used in any given remote sensing task arbitrarily. One must instead consider
(1) the spectral sensitivity of the sensors available,
(2) the presence or absence of atmospheric windows in the spectral range(s) in which one wishes to sense, and
(3) the source, magnitude, and spectral composition of the energy available in these ranges.

Energy Interactions with Earth Surface Features
When electromagnetic energy is incident on any given earth surface feature, three fundamental energy interactions with the feature are possible. Various fractions of the energy incident on the element are reflected, absorbed, and/or transmitted. Applying the principle of conservation of energy, we can state the interrelationship between these three energy interactions as

EI(λ) = ER(λ) + EA(λ) + ET(λ)

Where EI denotes the incident energy, ER denotes the reflected energy, EA denotes the absorbed energy, and ET denotes the transmitted energy, with all energy components being a function of wavelength λ.
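A minimal sketch of this energy bookkeeping, using hypothetical values chosen purely for illustration:

```python
# Conservation of energy: incident = reflected + absorbed + transmitted (per wavelength).
def reflectance(e_reflected, e_absorbed, e_transmitted):
    """Fraction of the incident energy that is reflected."""
    e_incident = e_reflected + e_absorbed + e_transmitted
    return e_reflected / e_incident

# Hypothetical leaf at a near-IR wavelength: strong reflection and transmission.
print(reflectance(e_reflected=45.0, e_absorbed=5.0, e_transmitted=50.0))  # 0.45
```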

Two points concerning this relationship should be noted.
First, the proportions of energy reflected, absorbed, and transmitted will vary for different earth features, depending on their material type and condition. These differences permit us to distinguish different features on an image.

Second, the proportions of reflected, absorbed and transmitted energy will vary at different wavelengths. Thus, two features may be distinguishable in one spectral range and be very different in another wavelength band. Within the visible portion of the spectrum, these spectral variations result in the visual effect called color.

For example, we call objects blue when they reflect highly in the blue portion of the spectrum, green when they reflect highly in the green portion of the spectrum, and so on. The eye thus uses spectral variations in reflectance to perceive color.

Specular reflectors are flat surfaces that manifest mirror-like reflections, where the angle of reflection equals the angle of incidence.
Diffuse (or Lambertian) reflectors are rough surfaces that reflect uniformly in all directions. Most earth surfaces are neither perfectly specular nor perfectly diffuse reflectors. Their characteristics lie somewhere between the two extremes.

The category that characterizes any given surface is dictated by the surface's roughness in comparison to the wavelength of the energy incident upon it. For example, in the relatively long wavelength radio range, rocky terrain can appear smooth to incident energy. In comparison, in the visible portion of the spectrum, even a material such as fine sand appears rough. In short, when the wavelength of incident energy is much smaller than the surface height variations or the particle sizes that make up a surface, the surface is diffuse.

Diffuse reflections contain spectral information on the color of the reflecting surface, whereas specular reflections do not. Hence, in remote sensing, we are most often interested in measuring the diffuse reflectance properties of terrain features.

Experience has shown that many earth surface features of interest can be identified, mapped, and studied on the basis of their spectral characteristics. Experience has also shown that some features of interest cannot be spectrally separated.
To utilize remote sensing data effectively, one must know and understand the spectral characteristics of the particular features under investigation in any given application. Likewise, one must know what factors influence these characteristics.

Parameters of a Sensor
In remote sensing, a sensor is a device used to record the electromagnetic radiation reflected from objects on the earth's surface. The sensor converts this radiation into electrical signals.
In the design of an optimal remote sensor, an ideal set of system parameters is an essential requisite. In a multispectral sensor system, these parameters have strong interrelationships with one another.

Resolution
Spatial resolution
The minimum detectable area on the ground by a detector placed on a sensor is called the spatial resolution. The spatial resolution is understood as the projection of a detector element (which defines the instantaneous field of view of a detector, i.e., IFOV) onto the ground. The swath is a feature related to the spatial resolution and IFOV of a sensor, and can be explained as the width of the strip of terrain recorded by the sensor. In simple words, the spatial resolution can be defined as the smallest object on the ground identifiable on the image.
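As a rough illustration of the IFOV-to-ground projection, assuming nadir viewing and the small-angle approximation (assumptions added here, not stated in the text), the ground spot size is approximately the platform altitude times the IFOV:

```python
# Ground-projected IFOV: spot size = altitude * IFOV (nadir view, small angles).
def ground_spot_metres(altitude_km, ifov_microradians):
    # Illustrative values only; no particular sensor is implied.
    return altitude_km * 1e3 * ifov_microradians * 1e-6

print(ground_spot_metres(altitude_km=800, ifov_microradians=86))  # ~68.8 m
```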

Spectral resolution
The smallest amount of spectral change that can be detected by a sensor is called the spectral resolution. The finer the spectral channels, the better the spectral resolution.

Radiometric resolution
The number of grey levels defines the radiometric resolution. The more grey levels, the better the radiometric resolution. For example, in IRS LISS-II the radiometric resolution is 128 grey levels (0-127).
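The count of grey levels follows directly from the number of quantization bits, as this small sketch shows (the 7-bit case matches the 128 levels quoted above):

```python
# Radiometric resolution: an n-bit detector records 2**n grey levels.
def grey_levels(bits):
    return 2 ** bits

print(grey_levels(7))  # 128 levels (0-127), as for IRS LISS-II
print(grey_levels(8))  # 256 levels (0-255)
```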

Temporal resolution
The temporal resolution is characterized by the smallest period of repetitive coverage. For example, SPOT has a revisit time of four days, i.e., one particular area can be revisited every four days.

Sensors
In remote sensing, the acquisition of data is dependent upon the sensor system used. Various remote sensing platforms are equipped with different sensor systems. A sensor is a device that receives electromagnetic radiation, converts it into a signal and presents it in a form suitable for obtaining information about land or earth resources, for use by an information gathering system.

Sensors can be grouped either on the basis of energy source or on the basis of the wavebands employed. Based on energy source, sensors are classified as follows.
Active sensor: An active sensor operates by emitting its own energy, which is needed to detect the various phenomena (e.g. RADAR, a camera with a flash gun).

Passive sensor: The operation of a passive sensor is dependent on existing sources of energy, like the sun (e.g. photographic systems, multispectral scanners).

Important sensor systems are described as follows:
(i) Photographic cameras:
The photographic system, a conventional camera with black and white photography, is the oldest and probably, so far, the most widely used sensor for recording information about ground objects. Photographic cameras have been successfully used from aircraft, balloons, and from manned and unmanned spacecraft.

In this system, the information is limited to size and shape, as the films used are sensitive only to the visible region of the spectrum. The response of black and white film is about 0.4 μm to 0.7 μm. For infrared imagery, films with response extending up to 0.9 μm are available. These cameras thus have a limited spectral response, extending at most up to 0.9 μm. The middle IR and thermal IR regions, which are of great interest, cannot be covered with photographic cameras.

(ii) Return beam vidicon (RBV):
This is very similar to a television camera. In such a system, the ground image is formed by a fixed camera lens on a photosensitive semi-transparent sheet. The image is created on this surface as electrical charge or potential. The television camera was the first electronic system to take images of the earth from space. With the launching of TIROS-1 in 1960, vidicon cameras were widely used in a number of meteorological satellites for routinely viewing the earth for weather studies.

The best example of high resolution TV cameras operated in space for resource survey was the RBV used in the LANDSAT series. On LANDSAT 1 and 2, three RBV cameras were used, each corresponding to a different waveband: 0.475-0.585 μm (green), 0.580-0.690 μm (red) and 0.690-0.830 μm (near infrared).

Multispectral imagery was produced in LANDSAT by using separate camera tubes for each band and selecting the spectral band with appropriate filters. The basic disadvantage of such a sensor system is the difficulty of registration across all the bands. Limitations of such systems include limited spectral response, low resolution, poor radiometric accuracy and geometric distortions.

(iii) Thermal system:
Sensors which operate in the infrared and part of the microwave region are called thermal sensors. These are based on the principle of the Stefan-Boltzmann law of radiation, and utilize the scanning method for recording electromagnetic energy. Thermal images are generally found to have large distortions.

(iv) Optical-mechanical scanners:
Such scanners have a combination of beam splitters and filters for spectral band selection. The imaging system has the advantage that any set of desired spectral bands can be selected with appropriate filter and detector combinations. The most widely used sensor in this category is the MSS on board the LANDSAT series. MSS has four spectral bands, covering the 0.5 to 1.1 μm region.

MSS consists of a telescope with a 23 cm aperture. The cross-track scan is achieved by an oscillating mirror. The IFOV is defined by the ends of optical fibres arranged in a 6×4 matrix at the focal plane of the telescope. The light conducted through the fibres is detected by photomultipliers or photodiodes with appropriate filters. The detector outputs are suitably amplified, multiplexed and either transmitted or recorded on a high density tape.

(v) RADAR and microwave sensors:
The acquisition of data in the microwave region has been possible since the 1950s, but its application to natural resources is considerably less developed as compared to visible and IR image interpretation. Active microwave systems (also called RADAR systems) map terrain features by transmitting a series of microwave pulses and recording the strength and timing of the echoes reflected from objects in the system's field of view.

Microwave sensors have a distinct advantage because they are largely unaffected by atmospheric conditions and are thus able to penetrate smoke, clouds, haze and snow. Under this sensor system, the plan position indicator (PPI), side looking airborne RADAR (SLAR) and synthetic aperture RADAR (SAR) can be grouped.

(vi) Advanced remote sensors:
Linear Imaging Self-Scanning sensors are advanced imaging systems. Unlike mechanical scanning systems, these sensors use an array of solid state devices. The array may be made of photodiodes, phototransistors or Charge Coupled Devices (CCDs).

The motion of the spacecraft produces successive scanlines, thereby giving a two-dimensional picture. The resolution mainly depends on the number of photodetectors available in a linear array and the required swath. The SPOT and IRS series carry such solid state sensor systems, which are also known as push-broom scanners.
The IRS-1C, known as a next generation and most advanced satellite, carries an improved sensor system. Besides carrying a sophisticated LISS-III camera, it has a panchromatic camera (PAN) and a Wide Field Sensor (WiFS).

Earth Observation Missions
Starting with IRS-1A in 1988, ISRO has launched many operational remote sensing satellites. Today, India has one of the largest constellations of remote sensing satellites in operation. Currently, eleven operational satellites are in orbit: RESOURCESAT-1 and 2, CARTOSAT-1, 2, 2A, 2B, RISAT-1 and 2, OCEANSAT-2, Megha-Tropiques and SARAL. A variety of instruments have been flown onboard these satellites to provide the necessary data in diversified spatial, spectral and temporal resolutions to cater to different user requirements in the country and for global usage. The data from these satellites are used for several applications covering agriculture, water resources, urban planning, rural development, mineral prospecting, environment, forestry, ocean resources and disaster management.

Mission | Date of Launch | Launch Vehicle | Payloads | Data Availability

OPERATIONAL
SARAL | Feb 25, 2013 | PSLV-C20 | Ka-band Altimeter (ALTIKA), ARGOS Data Collection System, Solid State C-band Transponder (SCBT) | Since Mar 13, 2013
RISAT-1 | Apr 26, 2012 | PSLV-C19 | SAR | Since Jul 1, 2012
Megha-Tropiques | Oct 12, 2011 | PSLV-C18 | MADRAS, SAPHIR, ScaRaB and ROSA | -
RESOURCESAT-2 | Apr 20, 2011 | PSLV-C16 | LISS-III, LISS-IV Mx, AWiFS | Since May 8, 2011
Oceansat-2 | Sep 23, 2009 | PSLV-C14 | OCM, SCAT | OCM since Jan 1, 2010; SCAT from Jan 1, 2010 to Jan 30, 2014
RISAT-2 | Apr 20, 2009 | PSLV-C12 | SAR | Since Apr 22, 2009
IMS-1 | Apr 28, 2008 | PSLV-C9 | IMS-1 Mx, HySI | Apr 30, 2008 to Sep 20, 2012
CARTOSAT-2A | Apr 28, 2008 | PSLV-C9 | PAN | -
CARTOSAT-2 | Jan 10, 2007 | PSLV-C7 | PAN | Since Apr 14, 2007
CARTOSAT-1 | May 5, 2005 | PSLV-C6 | PAN | Since May 8, 2005
Resourcesat-1 (IRS-P6) | Oct 17, 2003 | PSLV-C5 | LISS-III, LISS-IV Mx, AWiFS | Since Dec 7, 2003
Technology Experiment Satellite (TES) | Oct 22, 2001 | PSLV-C3 | PAN | Nov 1, 2001 to Dec 12, 2011

DECOMMISSIONED
Oceansat-1 (IRS-P4) | May 26, 1999 | PSLV-C2 | OCM, MSMR | Jul 1, 1999 to Aug 5, 2010
IRS-1D | Sep 29, 1997 | PSLV-C1 | PAN, LISS-III, WiFS | Jan 1, 1998 to Dec 31, 2009
IRS-P3 | Mar 21, 1996 | PSLV-D3 | WiFS, MOS | Apr 1, 1996 to Jan 25, ...

IRS-P6/Resourcesat-1 and Resourcesat-2
The IRS (Indian Remote Sensing) satellites form a large family of Earth observation satellites operated by the Indian space agency.

IRS-P6/Resourcesat-1 and Resourcesat-2 ensure continuity of the medium and high resolution data supply provided by the twin satellites IRS-1C and IRS-1D. These two, launched in 1995 and 1997 respectively, completed their missions after more than 10 years of service. Like their predecessors, the Resourcesat satellites carry a LISS-III sensor as well as a wide field AWiFS sensor, but the high resolution (5.8 m) LISS-IV sensor replaces the panchromatic sensor. The high resolution data are useful for applications such as urban planning and mapping, while the medium resolution is used for vegetation discrimination, land mapping, and natural resources management.

altitude: 816-818 km
inclination: 98.6 degrees
orbit: sun-synchronous polar
orbit period: 101 minutes
revisit time (LISS-IV and AWiFS): 5 days
swath width: 23.9 km / 70.3 km (LISS-IV); 140 km (LISS-III); 740 km (AWiFS)
satellites:
IRS-P6/Resourcesat-1 (17/10/2003, operational)
Resourcesat-2 (20/04/2011, operational)

PAN image: Amsterdam Airport (1998 ANTRIX, SI, Euromap Neustrelitz)
LISS image: San Francisco Bay (courtesy of National Point of Contact, www.npoc.nl)

LISS-III sensor
The LISS-III (Linear Imaging Self Scanning) sensor is an optical sensor working in four spectral bands (green, red, near infrared and shortwave infrared). It covers a 141 km-wide swath with a resolution of 23 metres in all spectral bands.

Band | Spectral band | Resolution
Green | 0.52 - 0.59 μm | 23 x 23 m
Red | 0.62 - 0.68 μm | 23 x 23 m
Near IR | 0.77 - 0.86 μm | 23 x 23 m
SWIR | 1.55 - 1.70 μm | 23 x 23 m

LISS-IV sensor
LISS-IV can work either in panchromatic or in
multispectral mode with the same bands as
LISS-III (except SWIR). However, the resolution
is much better (5.8 m). For Resourcesat-1, the
swath width varies from 23.9 km in multispectral
mode to 70.3 km in panchromatic mode. For
Resourcesat-2, the multispectral swath is
enhanced to 70 km. The linear array sensor can
be steered up to 26 degrees across-track,
enabling stereoscopic imaging.

Mode | Spectral band | Resolution
Panchromatic | 0.50 - 0.75 μm | 5.8 x 5.8 m

Mode | Band | Spectral band | Resolution
Multispectral | Green | 0.52 - 0.59 μm | 5.8 x 5.8 m
Multispectral | Red | 0.62 - 0.68 μm | 5.8 x 5.8 m
Multispectral | Near IR | 0.77 - 0.86 μm | 5.8 x 5.8 m

AWiFS sensor
AWIFS (Advanced Wide Field Sensor) is
an optical sensor with intermediate spatial
resolution.

WiFS image: Greece (1998 ANTRIX, SI, Euromap Neustrelitz)

Band | Spectral band | Resolution
Green | 0.52 - 0.59 μm | 56 x 56 m
Red | 0.62 - 0.68 μm | 56 x 56 m
Near IR | 0.77 - 0.86 μm | 56 x 56 m
SWIR | 1.55 - 1.70 μm | 56 x 56 m

Fundamentals of Airphoto Interpretation (Visual Interpretation)
Aerial photographs contain a detailed record of features on the ground at the time of exposure. A photo interpreter systematically examines the photos and, frequently, other supporting materials such as maps and reports of field observations. Based on this study, an interpretation is made as to the physical nature of objects and phenomena appearing in the photographs.

Interpretations may take place at a number of levels of complexity, from the simple recognition of objects on the earth's surface to the derivation of detailed information regarding the complex interactions among earth surface and subsurface features.

Success in photo interpretation varies with the training and experience of the interpreter, the nature of the objects or phenomena being interpreted, and the quality of the photographs being utilized. The most capable photo interpreters have keen powers of observation coupled with imagination and a great deal of patience. It is important that the interpreter have a thorough understanding of the phenomenon being studied as well as knowledge of the geographic region under study.

Elements of Airphoto Interpretation
Most applications consider the following basic characteristics:
Shape
Size
Pattern
Tone (or hue)
Texture
Shadows
Site
Association

Shape:
Shape refers to the general form,
configuration or outline of individual objects.
In the case of stereoscopic photographs, the
object height also defines its shape.
The shape of some objects is so distinctive
that their images may be identified solely from
this criterion.

Size:
The size of objects on photographs must be considered in the context of the photo scale.

Pattern:
Pattern relates to the spatial arrangement of objects. The repetition of certain general forms or relationships is a characteristic of many objects, both natural and constructed, and gives objects a pattern that aids the photo interpreter in recognizing them.

Tone(or hue):
Tone (or hue) refers to the relative brightness
or color of objects on photographs.

Texture:
Texture is the frequency of tonal change on the
photographic image.
Texture is produced by an aggregation of unit features
that may be too small to be discerned individually on
the photographs such as tree leaves and leaf
shadows.
It determines the overall visual smoothness or
coarseness of image features.
As the scale of the photograph is reduced, the texture
of any given object or area becomes progressively
finer and ultimately disappears.

Shadows:
Shadows are important to interpretations in
two opposing respects:
(1) The shape or outline of a shadow affords an
impression of the profile view of objects (which
aids interpretation) and
(2) Objects within shadows reflect little light and
are difficult to discern on photographs (which
hinders interpretation)

For example, the shadows cast by various tree species or cultural features (bridges, silos, towers, etc.) can definitely aid in their identification on airphotos. Shadows resulting from even subtle variations in terrain elevations, especially in the case of low sun angle photographs, can aid in assessing natural topographic variations that may be diagnostic of various geologic landforms.

Site:
Site refers to topographic location and is a
particularly important aid in the identification
of vegetation types.
For example, certain tree species would be
expected to occur on well-drained upland
sites, whereas other tree species would be
expected to occur on poorly drained lowland
sites.

Association:
Association refers to the occurrence of certain features in relation to others. For example, a Ferris wheel might be difficult to identify if standing in a field near a barn, but would be easy to identify in an area recognized as an amusement park.

In a sense, the airphoto interpretation process is like the work of a detective trying to put all the pieces of evidence together to solve a mystery. The interpreter uses the process of convergence of evidence to successively increase the accuracy and detail of the interpretation.

Principles of Landform Identification and Evaluation
Various terrain characteristics are important to soil scientists, geologists, geographers, civil engineers, urban and regional planners, landscape architects, real estate developers, and others who wish to evaluate the suitability of the terrain for various land uses. Because terrain conditions strongly influence the capability of the land to support various species of vegetation, an understanding of airphoto interpretation for terrain evaluation is also important for botanists, foresters, wildlife ecologists, and others concerned with vegetation mapping and evaluation.

The principal terrain characteristics that can be estimated by means of airphoto interpretation are bedrock type, landform, soil texture, site drainage conditions, susceptibility to flooding, and depth of unconsolidated materials over bedrock. The slope of the land surface can be estimated by airphoto interpretation and measured by photogrammetric methods.

Land Use Suitability Evaluation
Terrain information can be used to evaluate the suitability of land areas for a variety of land uses. Emphasis here is on suitability for developmental purposes, principally urban and suburban land uses.

The topographic characteristics of an area are one of the most important determinants of the suitability of an area for development. For subdivision development, slopes in the 2 to 6 percent range are steep enough to provide for good surface drainage and interesting siting, and yet flat enough so that no significant site development problems will be encountered, provided the soil is well drained.

Some drainage problems may be encountered in the 0 to 2 percent range, but these can be readily overcome unless there is a large expanse of absolutely flat land with insufficient internal drainage. The site plan in the 6 to 12 percent range may be more interesting than in the 2 to 6 percent range but will be more costly to develop.

Slopes over 12 percent present problems in street development and also pose serious problems when septic tanks are used for domestic sewage disposal. Severe limitations to subdivision development occur on slopes over 20 percent. For industrial park and commercial sites, slopes of not more than 5 percent are preferred.

The soil texture and drainage conditions also affect land use suitability. Well drained, coarse-textured soils present few limitations to development. Poorly drained or very poorly drained, fine-textured soils can present severe limitations. Shallow groundwater tables and poor soil drainage conditions cause problems in septic tank installation and operation, in basement and foundation excavation, and in keeping basements water-free after construction.

In general, depths to the water table of at least 2 m are preferred. Depths of 1 to 2 m may be satisfactory where public sewage disposal is provided and buildings are constructed without basements.

Shallow depths to bedrock cause problems in septic tank installation and maintenance, in utility line construction, in basement and foundation excavation, and in street location and construction, especially when present in combination with steep slopes.

Depths to bedrock of 1 to 2 m are generally unsatisfactory, but the development of these areas may be feasible in some cases. These sites are generally unsatisfactory where septic tank sewage disposal is to be provided. Additional excavation costs are involved where basements and public sewage disposal facilities are to be constructed.

A depth to bedrock of less than 1 m presents serious limitations to development and is an unsatisfactory condition in almost all cases of land development.

Slope stability problems occur with certain soil-slope conditions. Numerous areas of incipient landslide failure have been detected by airphoto interpretation.

It must be recognised that certain land areas may be worthy of preservation in their natural state because of outstanding topographic or geologic characteristics, or because rare or endangered plant or animal species occupy those areas.

Elements of Airphoto Interpretation for Landform Identification and Evaluation
Airphoto interpretation for landform identification and evaluation is based on a systematic observation and evaluation of key elements that are studied stereoscopically. These are:
Topography
Drainage pattern and texture
Erosion
Photo tone
Vegetation and land use

Topography
Each landform and bedrock type described
has its own characteristic topographic form
including a typical size and shape.
There is often a distinct topographic change
at the boundary between two different
landforms.
The specific amount of vertical exaggeration
observed in any given stereopair is a function
of the geometric conditions under which the
photographs are viewed and taken.

Drainage pattern and texture
The drainage pattern and texture seen on aerial photographs are indicators of landform and bedrock type and also suggest soil characteristics and site drainage conditions. Six of the most common drainage patterns are illustrated in the figure.
The dendritic drainage pattern is a well-integrated pattern formed by a main stream with its tributaries branching and rebranching freely in all directions, and occurs on relatively homogeneous materials such as horizontally bedded sedimentary rock and granite.

The rectangular drainage pattern is basically a dendritic pattern modified by structural bedrock control such that the tributaries meet at right angles, and is typical of flat-lying massive sandstone formations with a well-developed joint system.
The trellis drainage pattern consists of streams having one dominant direction, with subsidiary directions of drainage at right angles, and occurs in areas of folded sedimentary rocks.

The radial drainage pattern is formed by streams that radiate outward from a central area, as is typical of volcanoes and domes.
The centripetal drainage pattern is the reverse of the radial drainage pattern (drainage is directed towards a central point) and occurs in areas of limestone sinkholes, glacial kettle holes, volcanic craters, and other depressions.

The deranged drainage pattern is a disordered pattern of aimlessly directed short streams, ponds, and wetland areas, typical of ablation glacial till areas.
The described drainage patterns are all destructional drainage patterns resulting from the erosion of the land surface; they should not be confused with constructional drainage features that are remnants of the mode of origin of landforms, such as alluvial fans and glacial outwash plains. Coupled with drainage pattern is drainage texture.

The figure shows coarse-textured and fine-textured drainage patterns. Coarse-textured patterns develop where the soils and rocks have good internal drainage with little surface runoff. Fine-textured patterns develop where the soils and rocks have poor internal drainage and high surface runoff. Fine-textured drainage patterns develop on soft, easily eroded rocks, such as shale, whereas coarse-textured patterns develop on hard, massive rocks, such as granite.

Erosion
Gullies are the smallest drainage features that can be
seen on aerial photographs and may be as small as a
meter wide and a hundred meters long.
Gullies result from the erosion of unconsolidated
material by runoff and develop where rainfall cannot
adequately percolate into the ground, but instead
collects and flows across the surface in small rivulets.
These initial rivulets enlarge and take on a particular
shape characteristic of the material in which they are
formed.

As illustrated in the figure, short gullies with V-shaped cross sections tend to develop in sand and gravel; gullies with U-shaped cross sections tend to develop in silty soils; and long gullies with gently rounded cross sections tend to develop in silty clay soils.

Photo tone
The term photo tone refers to the brightness at any
point on a panchromatic photograph.
The absolute value of the photo tone depends not
only on certain terrain characteristics but also on
photographic factors such as film-filter combination,
exposure and photographic processing.
Photo tone also depends on meteorological and
climatological factors such as atmospheric haze, sun
angle, and cloud shadows. Because of the effect of
these non-terrain related factors, photo interpretation
for terrain evaluation must rely on an analysis of
relative tone values, rather than absolute tone values.

Relative tone values are important because they often form distinct photographic patterns that may be of great significance in airphoto interpretation.

Vegetation and land use
Differences in natural or cultivated vegetation often indicate differences in terrain conditions.

The Airphoto Interpretation Process
Through an analysis of the elements of photo interpretation (topography, drainage pattern and texture, erosion, photo tone, vegetation and land use), the photo interpreter can identify different terrain conditions and can determine the boundaries between them. Initially, photo interpreters will need to consider carefully each of the above elements, individually and in combination, in order to estimate terrain conditions.

After some experience, these elements are often applied subconsciously as the interpreter develops the facility to recognize certain recurring airphoto patterns almost instantaneously. In complex areas, the interpreter should not make snap decisions about terrain conditions but should carefully consider the topography, drainage pattern and texture, erosion, photo tone, and vegetation and land use characteristics exhibited on the aerial photographs.

Atmospheric Influences on Spectral Response Patterns
In addition to being influenced by temporal and spatial effects, spectral response patterns are influenced by the atmosphere. The energy recorded by a sensor is always modified to some extent by the atmosphere between the sensor and the ground.

The figure provides an initial frame of reference for understanding the nature of atmospheric effects. Shown in this figure is the typical situation encountered when a sensor records reflected solar energy.

The atmosphere affects the brightness, or radiance, recorded over any given point on the ground in two almost contradictory ways. First, it attenuates (reduces) the energy illuminating a ground object (and being reflected from the object). Second, the atmosphere acts as a reflector itself, adding a scattered, extraneous path radiance to the signal detected by the sensor.

It should be noted that all of the above factors depend on wavelength. Also, as shown in the figure, the irradiance (E) stems from two sources:
(1) directly reflected sunlight and
(2) diffuse skylight, which is sunlight that has been previously scattered by the atmosphere.

The relative dominance of sunlight versus skylight in any given image is strongly dependent on weather conditions (e.g. sunny vs hazy vs cloudy). Likewise, irradiance varies with seasonal changes in solar elevation angle and the changing distance between the earth and sun.

Introduction to Digital Data Analysis
Digital image processing involves the manipulation and interpretation of digital images with the aid of a computer. This form of remote sensing actually began in the 1960s with a limited number of researchers analysing airborne multispectral scanner data and digitized aerial photographs.

It was not until the launch of Landsat-1 in 1972 that digital image data became widely available for land remote sensing applications. At that time, not only was the theory and practice of digital image processing in its infancy, but the cost of digital computers was very high and their computational efficiency was very low by modern standards.

Today, access to low cost, efficient computer hardware and software is commonplace, and the sources of digital image data are many and varied. The sources of digital data range from commercial earth resource satellite systems, to meteorological satellites, to airborne scanner data, to airborne digital camera data, to image data generated by scanning microdensitometers and other high resolution digitizing systems.

Digital image processing is an extremely broad subject, and it often involves procedures which can be mathematically complex. The central idea behind digital image processing is quite simple. The digital image is fed into a computer one pixel at a time. The computer is programmed to insert these data into an equation, or series of equations, and then store the results of the computation for each pixel. These results form a new digital image that may be displayed or recorded in pictorial format or may itself be further manipulated by additional programs.
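A toy Python version of this pixel-by-pixel scheme follows; the linear gain/offset equation is a hypothetical example, not a prescribed algorithm:

```python
# Feed each pixel through an equation and store the result as a new image.
def apply_pixelwise(image, gain=1.2, offset=-10):
    """new_value = gain * value + offset, clipped to the 0-255 range."""
    return [[max(0, min(255, int(gain * value + offset))) for value in row]
            for row in image]

tiny_image = [[10, 120], [200, 255]]
print(apply_pixelwise(tiny_image))  # [[2, 134], [230, 255]]
```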

The possible forms of digital image manipulation are literally infinite. Digital data analysis may be categorized into one (or more) of the following five broad types of computer-assisted operations.

1. Image rectification and restoration:
These operations aim to correct distorted or degraded image data to create a more faithful representation of the original scene. This involves the initial processing of raw image data to correct for geometric distortions, to calibrate the data radiometrically, and to eliminate noise present in the data. The nature of any particular image restoration process is highly dependent upon the characteristics of the sensor used to acquire the image data.

Image rectification and restoration procedures are often termed pre-processing operations because they normally precede further manipulation and analysis of the image data to extract specific information.

2. Image enhancement:
These procedures are applied to image data in order to more effectively display or record the data for subsequent visual interpretation. Normally, image enhancement involves techniques for increasing the visual distinctions between features in a scene. The objective is to create new images from the original image data in order to increase the amount of information that can be visually interpreted from the data.

The enhanced images can be displayed interactively on a monitor or they can be recorded in a hard copy format, either in black and white or in color. There are no simple rules for producing the single best image for a particular application. Often, several enhancements made from the same raw image are necessary.

3. Image classification:
The objective is to replace visual analysis of the image data with quantitative techniques for automating the identification of features in a scene, i.e., to classify the image into classes. This normally involves the analysis of multispectral image data and the application of statistically based decision rules for determining the land cover identity of each pixel in an image.

When the decision rules in digital image classification are based solely on the spectral radiances observed in the data, we refer to the classification process as spectral pattern recognition. The decision rules may instead be based on the geometric shapes, sizes and patterns present in the image data; these procedures fall into the domain of spatial pattern recognition. In either case, the intent of the classification process is to categorize all pixels into one of several land cover classes, or themes.
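As one simple instance of a spectrally based decision rule, the sketch below assigns each pixel to the class whose mean spectral vector is nearest (a minimum-distance rule). The two-band class means are hypothetical; a real classifier would estimate them from training data:

```python
# Minimum-distance-to-means classification of a two-band pixel.
import math

CLASS_MEANS = {"water": (15, 8), "vegetation": (40, 90), "soil": (80, 60)}

def classify(pixel):
    """Return the land cover class whose mean is closest to the pixel vector."""
    return min(CLASS_MEANS, key=lambda cls: math.dist(pixel, CLASS_MEANS[cls]))

print(classify((20, 12)))  # water
print(classify((45, 85)))  # vegetation
```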

The categorized data may be used to produce thematic maps of the land cover present in an image, and/or to produce summary statistics on the areas covered by each land cover type.
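For instance, summary statistics follow from counting the pixels assigned to each class and scaling by the ground area of one pixel; the 23 m pixel size below is borrowed from LISS-III purely for illustration:

```python
# Area per land cover class from a classified image.
from collections import Counter

def class_areas_hectares(classified_rows, pixel_size_m=23.0):
    counts = Counter(label for row in classified_rows for label in row)
    pixel_ha = pixel_size_m ** 2 / 10_000.0  # one pixel's area in hectares
    return {cls: n * pixel_ha for cls, n in counts.items()}

demo = [["water", "forest"], ["forest", "forest"]]
print(class_areas_hectares(demo))  # water ~0.053 ha, forest ~0.159 ha
```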

4. Data merging and GIS integration:
These procedures are used to combine image data for a given area with other data sets for the same area. These other data sets might simply consist of image data generated on other dates by the same sensor or by other remote sensing systems. Frequently, the intent of data merging is to combine remotely sensed data with other sources of information in the context of a GIS. For example, image data are often combined with soil, topographic, ownership, zoning and assessment information.

5. Biophysical modelling:
The objective of biophysical modelling is to relate quantitatively the digital data recorded by a remote sensing system to biophysical features and phenomena measured on the ground. For example, remotely sensed data might be used to estimate such varied parameters as crop yield, pollution concentration, or water depth. Likewise, remotely sensed data are often used in concert with GIS techniques to facilitate environmental modelling.

The intent of these operations is to simulate the functioning of environmental systems in a spatially explicit manner and to predict their behaviour under altered ("what if") conditions, such as global climate change.
