Lecture notes

Digital Image Processing

Jorma Kekalainen

Contents

Introduction to images and image processing


Electromagnetic radiation and image formation
Image acquisition and sensing
Use of Matlab
Images and Matlab
Image display
Point operations and histograms
Spatial filtering
Noises and noise cleaning
Differences and edges
Fourier transforms and frequency filtering
Color models and processing

Literature and sources


An Introduction to Digital Image Processing
with Matlab; Notes for Image Processing by
Alasdair McAndrew
Digital Image Processing by Gonzalez & Woods
Image processing course material on the Internet
Matlab online manuals

Digital Image Processing


Introduction to Images and Image Processing

Lecture weeks 1 and 2


Image and picture


For our purposes, an image is a single picture
which represents something.
It may be a picture of a person, of people or
animals, or of an outdoor scene, or a
microphotograph of an electronic component,
or the result of medical imaging.
Even if the picture is not immediately
recognizable, it will not be just a random blur.
What is image processing?


Image processing involves changing the nature of an image in order to either
1) improve its pictorial information for human interpretation, or
2) render it more suitable for autonomous machine perception.

These are two separate but equally important goals of image processing.
A procedure which satisfies the first condition - a procedure which makes an image look better - may be the very worst procedure for satisfying the second condition.
Humans like their images to be sharp, clear and detailed;
machines prefer their images to be simple and uncluttered.


Improving pictorial information for human interpretation
Enhancing the edges of an image to make it
appear sharper.
Sharpening edges is a vital component of printing.
In order for an image to appear at its best on the
printed page some sharpening is usually
performed.

Removing noise from an image.


Noise being random errors in the image.

Removing motion blur from an image.
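
As a small illustration of the first item above (edge sharpening), here is a minimal Matlab sketch using unsharp masking; it assumes the Image Processing Toolbox is available and uses the placeholder file name 'scene.png':

I = im2double(imread('scene.png'));                       % read image, scale to [0,1]
blur = imfilter(I, fspecial('average', 5), 'replicate');  % low-pass (blurred) version
sharp = I + (I - blur);                                   % unsharp masking: add back the lost detail
imshow([I sharp])                                         % original and sharpened side by side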


Image sharpening (figure): original image and enhanced original.


Processed images (figure)

Image sharpening (figure)

Thumb print (figure)

Removing motion blur (figure): original, blurry image and enhanced image.


Removing motion blur from an image (figure): blurred image and restored image.

Image corrupted by noise (figure): X-ray image of a circuit board corrupted by salt-and-pepper noise, and noise reduction with a 3x3 median filter.
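
The figure above can be reproduced in outline with a sketch like the following, assuming the Image Processing Toolbox; 'board.png' stands for any grayscale test image:

I = imread('board.png');                 % grayscale image
J = imnoise(I, 'salt & pepper', 0.05);   % corrupt about 5% of the pixels
K = medfilt2(J, [3 3]);                  % 3x3 median filter removes the impulses
imshow([I J K])                          % original, noisy and cleaned versions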



Image scrambling: Same information?!


Rendering images more suitable for autonomous machine perception
Examples may include:
Obtaining the edges of an image.
This may be necessary for the measurement of objects in an
image.
Once we have the edges we can measure their spread, and the
area contained within them.
We can also use edge detection algorithms as a first step in edge
enhancement.

Removing details from an image.


For measurement or counting purposes, we may not be
interested in all the detail in an image.
For example, when a machine inspects items on an assembly line,
the only matters of interest may be shape, size, or color.
For such cases, we might want to simplify the image.

Images and edges


Automated visual inspection of manufactured goods etc. (figure)


Imaging: the creation of images
Examples of different sources:
Electromagnetic radiation
Transmitted gamma-rays
Transmitted and reconstructed gamma-rays, Single
Photon Emission Tomography (SPECT)
Emitted and reconstructed gamma-rays, Positron
Emission Tomography (PET)
Transmitted and reconstructed X-rays, Computed
Tomography (CT)


Imaging: the creation of images
Examples of different sources:
Electromagnetic radiation
Excitation rays from ultraviolet light, fluorescence
microscopy
Reflected rays from visible light
Reflected infrared rays
Emitted radio waves + magnetic fields and
reconstruction => magnetic resonance imaging (MRI)

Ultrasound
Transmitted and reflected ultrasound

Image
Suppose we take an image, e.g. a photo.
Suppose the photo is black and white (that is, lots of
shades of gray), so no color.
We may consider this image as being a two
dimensional function f(x,y), where the function values
give the brightness of the image at any given point
(x,y).
In other words, there is a function f that depends on
the spatial coordinates x and y.
We may assume that in such an image brightness
values can be any real numbers in the range 0 (black)
to 1 (white).

Digital image
A digital image differs from a photo in that the x, y and f(x,y)
values are all discrete.
Usually they take on only integer values, with e.g. x and y
ranging from 1 to 256 each, and the brightness values also
ranging from 0 (black) to 255 (white).
A digital image can be considered as a large array of discrete
dots, each of which has a brightness associated with it.
These dots are picture elements called pixels.
Surroundings of a given pixel form a neighborhood.
A neighborhood can be characterized by its shape in the same
way as a matrix.
E.g. we can speak of a 5*5 neighborhood, or of a 7*9 neighborhood.
Note: Except in very special circumstances, neighborhoods have odd numbers of rows and columns, because this ensures that the current pixel is in the centre of the neighborhood.
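
In Matlab a grayscale digital image is simply a matrix, so a neighborhood is just a sub-matrix; a small illustrative sketch (the values below are random, not taken from any real image):

A = uint8(randi([0 255], 8, 10));   % an 8x10 "image" of random gray values
A(4,6)                              % brightness of the pixel in row 4, column 6
nbhd = A(3:5, 4:8)                  % a 3x5 neighborhood centred on pixel (4,6)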


Example of a digital image (figure): an original image of 500x340 pixels with a zoomed detail showing the individual pixels (pixel = picture element).

Digital image: Formal definition


A digital image, I, is a mapping from a 2D grid of uniformly spaced discrete points, {p = (r,c)}, into a set of positive integer values, {I(p)}, or a set of vector values, e.g., {[R G B]^T(p)}.
At each column location in each row of I there is a value.
The pair ( p, I( p) ) is called a pixel (for picture element).
p = (r,c) is the pixel location indexed by row, r, and column, c.
I( p) = I(r,c) is the value of the pixel at location p.
If I( p) is a single number then I is monochrome.
If I( p) is a vector (ordered list of numbers) then I has multiple
bands (e.g., a color image).
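
The notation above translates directly into Matlab indexing; a short sketch (the file names are placeholders):

G = imread('mono.png');        % monochrome image: G(r,c) is a single number
v = G(120, 35);                % value I(p) of the pixel at row 120, column 35
C = imread('color.png');       % RGB image: C(r,c,:) is a three-element vector
rgb = squeeze(C(120, 35, :))   % the [R G B]' value at the same location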


Pixel [p, I(p)] (figure): pixel location p = (r, c), indexed by row r and column c, and pixel value I(p) = I(r, c).

Pixel [p, I(p)] in a color image (figure): location p = (r, c) = (row #, col #) and value I(p) = [red; green; blue].


Example: Pixels and a neighborhood (figure): the current pixel and its 3*5 neighborhood.
Note: If a neighborhood has an even number of rows or columns (or both), it may be necessary to specify which pixel in the neighborhood is the current pixel.

Digital image processing
We shall be concerned with digital image processing, which involves using a computer to change the nature of a digital image.
(Figure: DFT of the small circle.)


Some applications
Image processing has an enormous range of applications in almost
every area of science and technology e.g.,
Industry
Automatic inspection of items on a production line

Agriculture
Satellite/aerial views of land, for example to determine how much
land is being used for different purposes, or to investigate the
suitability of different regions for different crops, or inspection of fruit
and vegetables - distinguishing good and fresh produce from old.

Medicine
Inspection and interpretation of images obtained from X-rays, MRI
(magnetic resonance imaging) or CT scans.

Law enforcement:
Fingerprint analysis,
Sharpening or de-blurring of speed-camera images,
All kinds of surveillance camera applications

Aspects of image processing


It is convenient to subdivide different image processing
algorithms into broad subclasses.
There are different algorithms for different tasks and
problems, and often we would like to distinguish the nature
of the task at hand.
Image enhancement refers to processing an image so that the
result is more suitable for a particular application.
Image restoration may be considered as reversing the damage
done to an image by a known cause.
Image analysis or segmentation involves subdividing an image
into principal parts, or isolating certain aspects of an image.

These classes are not disjoint.
A given algorithm may be used both for image enhancement and for image restoration.

Example: Image enhancement


Image enhancement refers to processing an image so that
the result is more suitable for a particular application:

sharpening or de-blurring an out-of-focus image,


highlighting edges,
improving image contrast, or brightening an image,
removing noise

Image enhancement techniques bring out the detail in an


image that is obscured or highlight certain features of
interest in an image.
Image enhancement techniques, such as contrast
adjustment and filtering, typically return a modified version
of the original image.
These techniques are frequently used as a preprocessing step to
improve the results of image analysis.
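
A small sketch of typical enhancement operations of this kind, assuming the Image Processing Toolbox; 'dark.png' is a placeholder for a low-contrast grayscale image:

I = imread('dark.png');
J = imadjust(I);        % stretch the contrast to the full intensity range
K = histeq(I);          % alternative: histogram equalization
imshow([I J K])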

Example: Image restoration


Image restoration may be considered as
reversing the damage done to an image by a
known cause:
removing blur caused by linear camera motion,
removal of optical distortions,
removing periodic interference.
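
A minimal restoration sketch for linear motion blur, assuming the Image Processing Toolbox; the blur length, angle and noise-to-signal ratio below are arbitrary illustrative values:

I   = im2double(imread('scene.png'));
psf = fspecial('motion', 21, 11);            % point-spread function of the assumed motion
B   = imfilter(I, psf, 'conv', 'circular');  % simulate the blurred image
R   = deconvwnr(B, psf, 0.001);              % Wiener deconvolution reverses the known blur
imshow([B R])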


Example: Image analysis or segmentation
Image analysis is the process of extracting meaningful
information from images such as finding shapes, counting
objects, identifying colors, or measuring object properties.
Image segmentation is the process of partitioning an image
into parts or regions
finding lines, circles, or particular shapes in an image,
in an aerial photograph: identifying cars, trees, buildings, or
roads.

This division into parts is often based on the characteristics


of the pixels in the image.
E.g., one way to find regions in an image is to look for abrupt
discontinuities in pixel values, which typically indicate edges.
These edges can define regions.

Another method is to divide the image into regions based


on color values.
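
The two approaches just mentioned (edges and value-based regions) can be sketched as follows, assuming the Image Processing Toolbox and a placeholder grayscale test image:

I  = imread('aerial.png');
E  = edge(I, 'canny');     % find abrupt discontinuities in pixel values (edges)
t  = graythresh(I);        % Otsu's threshold computed from the histogram
BW = imbinarize(I, t);     % divide the image into two regions by value (im2bw in older releases)
imshow([E BW])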

Types of digital images


We shall consider four basic types of images:
Binary
Grayscale
True color, or RGB
Indexed


Digital image (figure): a grid of squares, each of which contains a single color; each square is called a pixel (for picture element).
Color images have 3 values per pixel; monochrome images have 1 value per pixel.

Binary image
Each pixel is just black or white.
Since there are only two possible values for
each pixel, we only need one bit per pixel.
Such images can therefore be very efficient in
terms of storage.
Images for which a binary representation may
be suitable include text (printed or
handwriting), fingerprints, blueprints,
architectural plans etc.
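
A short sketch of producing a binary image in Matlab, assuming the Image Processing Toolbox; 'text.png' is a placeholder for e.g. a scanned text page:

I  = imread('text.png');       % grayscale original
BW = imbinarize(I);            % threshold to black/white (im2bw in older releases)
imshow(BW)
whos BW                        % stored as a logical array, one 0/1 value per pixel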

Example: Binary image (figure)
In this image we have only the two colors: black for the edges and white for the background, or the other way round.

Grayscale
Each pixel is a shade of gray, normally from 0 (black) to
255 (white).
This range means that each pixel can be represented by
eight bits i.e. exactly one byte.
This is a very natural range for image file handling.
Other grayscale ranges are used, but generally they are
a power of 2.
Such images arise in medicine (X-rays), images of
printed works, and indeed 256 different gray levels is
sufficient for the recognition of most natural objects.
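
A small sketch of inspecting such an 8-bit grayscale image in Matlab (the file name is a placeholder):

I = imread('xray.png');
class(I)                 % typically uint8, i.e. one byte per pixel
[min(I(:)) max(I(:))]    % values lie between 0 (black) and 255 (white)
imshow(I)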

Example: Grayscale image (figure); the displayed block of pixel values is C(150:170,210:235).

Color images
Are constructed from
three intensity maps.
Each intensity map is
projected through a color
filter (e.g., red, green, or
blue, or cyan, magenta, or
yellow) to create a
monochrome image.
The intensity maps are
overlaid to create a color
image.
Each pixel in a color image
is a three element vector.

Primary and secondary colors of light (additive color mixing)
Here, secondary colors are mixtures of two primary colors:
yellow = red + green
cyan = green + blue
magenta = red + blue
Additive mixing is used in CRT, LCD and plasma displays.


Color images
Formation of a vector from corresponding pixel
values in three RGB component images


True color or RGB


Here each pixel has a particular color; that color being
described by the amount of red, green and blue in it.
If each of these components has a range 0 to 255, this
gives a total of 256^3 = 16,777,216 different possible colors in
the image.
This is enough colors for any image.
Since the total number of bits required for each pixel is 24
such images are also called 24-bit color images.
Such an image may be considered as consisting of a stack of
three matrices representing the red, green and blue values
for each pixel.
This means that for every pixel there correspond three
values.
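
A sketch of this three-matrix view in Matlab (the file name is a placeholder for any RGB image):

C = imread('flowers.png');
size(C)                                     % e.g. 340 500 3: three values per pixel
R = C(:,:,1); G = C(:,:,2); B = C(:,:,3);   % the red, green and blue matrices
imshow(C), figure, imshow(R)                % full image, and the red component shown as grayscale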

Example: True color image and its red, green and blue components (figure).


Example: Numerical values of some discrete pixels (figure).

Indexed image
Most color images only have a small subset of the
more than sixteen million possible colors.
For convenience of storage and file handling, the image
has an associated color map, or color palette, which is
simply a list of all the colors used in that image.
Each pixel has a value which does not give its color (as
for an RGB image), but an index to the color in the
map.
It is convenient if an image has 256 colors or less, for
then the index values will only require one byte each to
store.
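
A sketch of handling an indexed image in Matlab (the file name is a placeholder for an indexed-format file such as a GIF):

[X, map] = imread('indexed.gif');   % X holds the indices, map the color palette
size(map)                           % e.g. 256 x 3: one RGB row per palette entry
imshow(X, map)                      % display the indices through the color map
rgb = ind2rgb(X, map);              % convert to a true color image if needed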

Example: Indexed image (figure): the index matrix and its color map.
In this image the indices, rather than being the gray values of the pixels, are simply indices into the color map.
Without the color map, the image would be very dark and colorless.

Example: Indexed image
In this image the indices are simply indices into the color map.
Color map:
0.1451    0.1176    0.1412
0.7451    0.2627    0.2314
0.7412    0.5451    0.6392
0.2392    0.2588    0.4000
0.8706    0.6941    0.1490
0.8706    0.8431    0.8196
Without the color map, this image would be nearly black, because here the maximum index value is 5.


Example
A small index sample from the previous image:
>> ind(117:123,347:352)
ans =
     2     1     2     1     1     4
     1     4     4     1     1     1
     2     1     4     4     1     4
     5     5     2     2     1     1
     5     5     5     5     5     1
     5     5     5     5     5     5
     5     5     5     5     5     5
Color map:
0.1451    0.1176    0.1412
0.7451    0.2627    0.2314
0.7412    0.5451    0.6392
0.2392    0.2588    0.4000
0.8706    0.6941    0.1490
0.8706    0.8431    0.8196

Image file sizes


Image files tend to be large.
For example, suppose we consider a 512*512
binary image.
The number of bits used in this image
(assuming no compression, and neglecting any
header information) is
512*512*1 bit = 262 144 bits = 32 768 bytes = 32.768 kB ≈ 0.033 MB
Note: Here we use the convention that a kilobyte is one thousand bytes, and a megabyte is one million bytes.


Image file sizes


A grayscale image of the same size requires:
512*512*1 byte = 262 144 bytes ≈ 262.14 kB ≈ 0.262 MB
In color images each pixel is associated with 3 bytes of color information.
A 512*512 image thus requires
512*512*3 bytes = 786 432 bytes ≈ 786.43 kB ≈ 0.786 MB
Note: Many images are of course much larger than this. E.g. satellite images may be of the order of ten thousand pixels in each direction.
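
The storage figures above can be checked with a couple of lines of Matlab arithmetic:

n = 512*512;                 % number of pixels
bytes_binary    = n*1/8      % 1 bit per pixel   -> 32 768 bytes
bytes_grayscale = n*1        % 1 byte per pixel  -> 262 144 bytes
bytes_rgb       = n*3        % 3 bytes per pixel -> 786 432 bytes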

Example: A picture is worth one thousand words
Let's test the well-known proverb "A picture is worth one thousand words."
Assuming a word to contain 10 ASCII characters (on average), and that each character requires 8 bits of storage, then 1000 words contain
1000*10*8 = 80 000 bits of information
This is roughly equivalent to the information in a
283*283 binary image,
100*100 grayscale image, or
58*58 RGB color image.

Image perception
Much of image processing is concerned with
making an image appear better to human eyes.
We should therefore be aware of the limitations
of the human visual system.
Image perception consists of two basic steps:
capturing the image with the eye,
recognizing and interpreting the image with the visual
cortex in the brain.

The combination and immense variability of these steps influences the ways in which we perceive the world around us.

The eye works rather much like a camera (figure): reflected light from the object passes through the lens and forms an upside-down image on the retina, which is the sensor; the figure also marks the focal length.
But the brain performs some more processing that gives rise to visual phenomena.

Color vision
The human eye has cones which are sensitive to
different wavelength bands


Absorption of light by the cones in the human eye (figure).


Example: Visual phenomenon


Observed intensities vary with the background.
A single block of gray will appear darker if
placed on a white background than if it were
placed on a black background.
That is, we don't perceive gray scales as they
are, but rather as they differ from their
surroundings.

Example: Visual phenomenon
A gray square on different backgrounds (figure).
Notice how much darker the square appears when it is surrounded by a light gray. However, the three central squares have exactly the same intensity.


Example: Visual phenomenon (figure): a gray scale image, its actual intensity profile, and the perceived intensity.

Intensity resolution
We can only resolve 2^6 = 64 or at most 2^7 = 128 intensity levels on an ordinary computer screen.
Based on hardware considerations, grayscale images are usually stored with 8 bits per pixel, i.e. 2^8 = 256 intensity levels.
Some images are stored with more than 8 bits per pixel. E.g., CT images are stored with 12 bits per pixel, i.e. 2^12 = 4096 intensity levels.
During image processing, pixels can preferably be stored with more than 8 bits or as floating point numbers.
Color images are usually stored with 3x8 bits per pixel: 2^8 red intensity levels, 2^8 green intensity levels and 2^8 blue intensity levels, giving 2^24 > 16 million different colors.
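
A small sketch of re-quantizing an 8-bit image to fewer intensity levels, to see the effect of reduced intensity resolution (the file name is a placeholder):

I   = imread('xray.png');                % uint8, 256 levels
I64 = uint8(floor(double(I)/4)*4);       % keep only 64 levels (6 bits)
I4  = uint8(floor(double(I)/64)*64);     % keep only 4 levels (2 bits)
imshow([I I64 I4])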

Spatial resolution
Spatial resolution is a measure of the smallest discernible detail in the image.
Two common measures:
lp/mm (line pairs per mm), e.g. 5 discernible line pairs per mm = 5 lp/mm
dpi (dots per inch), e.g. 6 dots per inch = 6 dpi
Typical values: newspaper 75 dpi, magazine 133 dpi, glossy brochures 175 dpi, the Gonzalez & Woods textbook 2400 dpi.
(Figure: a newspaper image.)

Image sampling and quantization (figure): a continuous image, the intensity along a line A-B from the image, the signal after sampling, and after quantization.


Digital image acquisition (figure): typically done with a camera.

Digital image sensor (figure): a continuous image projected onto the sensor array, and the result after sampling and quantization. For a color image, there are 3 different types of sensor elements: red, green and blue.


Grayscale color map (256 colors) (figure): the pixel value f(x,y) is mapped by a linear transformation and sent to a D/A-converter and further to the screen. A D/A-converter converts a digital value to an analog value, an electrical voltage.

True color map (figure): the red, green and blue components go to D/A-converters and further to the red, green and blue channels of the screen, giving over 16 million colors.


Example: Different colormaps

>> x=magic(6)
x =
    35     1     6    26    19    24
     3    32     7    21    23    25
    31     9     2    22    27    20
     8    28    33    17    10    15
    30     5    34    12    14    16
     4    36    29    13    18    11
>> imagesc(x)
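
By default imagesc shows the matrix through the current colormap; other maps can be tried with commands such as the following (an illustrative continuation, not from the original slide):

>> colormap(gray)     % grayscale colormap
>> colormap(jet)      % a pseudocolor map
>> colorbar           % show how values are mapped to colors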


Electromagnetic spectrum and colors (figure)


Electromagnetic spectrum and colors
Note that the visible spectrum is a rather narrow portion of the EM spectrum.


Digital Image Processing


Electromagnetic Radiation
and Image Formation

Lecture weeks 1 and 2


EM radiation
Electromagnetic radiation is energy which propagates through space as electromagnetic waves.
The waves consist of transverse electric and magnetic fields that oscillate with a temporal frequency ν (hertz) and a spatial wavelength λ (metres).


Frequency and wavelength


The relation between frequency ν and wavelength λ is
c = λν
where c is the speed of light and depends on the medium, c ≤ c0
c0 = speed of light in vacuum ≈ 3*10^8 m/s
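As an illustrative example (values chosen here, not from the slides): green light with wavelength λ = 550 nm has frequency ν = c0/λ ≈ (3*10^8 m/s)/(550*10^-9 m) ≈ 5.5*10^14 Hz.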


Particles and energy (figures)


Spectrum
In practice, light consists of
photons with a range of energies, or
waves with a range of frequencies

This mix of frequencies/wavelengths/energies is


called the spectrum of the light.
The spectrum gives the total amount of energy
for each frequency/wavelength/energy.
Monochromatic light consists of only one
frequency/wavelength
Can be produced by special light sources, e.g., lasers

Spectrum (figure)


Color spectrum
In 1666, Newton discovered that sunlight (white light) passing through a glass prism splits up into a color spectrum of wavelengths in the interval 400-700 nm.


Color wavelength



Classification of EM spectrum


Polarization
The electromagnetic field has a direction
Perpendicular to the direction of motion

The polarization of the light is defined as the


direction of the electric field.
Natural light is a mix of waves with polarization in all possible directions: it is unpolarized light.
Special light sources or filters can produce
polarized light of well-defined polarization.

Polarization
Plane polarization
The electric field varies only in a single plane


Polarization
Circular/elliptical polarization
The electric field vector rotates
Can be constructed as the sum of two plane polarized waves with a 90° phase shift.
Conversely, plane polarized light can be decomposed as a sum of two circularly polarized waves that rotate in opposite directions.

Coherence
The phase of the light waves can either be
random: incoherent light (natural light)
in a systematic relation: coherent light

Coherent light is related to monochromatic


light sources
Compare a red LED and a red laser
Both produce light within a narrow range
The LED light is incoherent
The laser light is coherent

Radiation energy
Light radiation has energy
Each photon has a particular energy related to its frequency (E = hν, where h is Planck's constant)
The number of photons of a particular frequency gives the amount of energy for this frequency
Described by the spectrum
Unit: joule (J = Ws = watt-second)
Is usually not measured directly.
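As an illustrative example (values chosen here, not from the slides): a photon of green light with ν ≈ 5.5*10^14 Hz carries E = hν ≈ 6.63*10^-34 Js * 5.5*10^14 Hz ≈ 3.6*10^-19 J.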


Radiation power
The power of the radiation, i.e., the energy
per unit time, is the radiant flux
Since the energy depends on the frequency, so
does the radiant flux
Unit: Watt (W=J/s= Joule per second)
Is usually not measured directly.

Note: Radiant flux is one of the radiometric quantities. Radiometry is a set of techniques for measuring electromagnetic radiation, including visible light. Radiometric techniques in optics characterize the distribution of the radiation power in space.

Radiant flux density
The radiant flux per unit area is the flux density.
Since the flux depends on the frequency, so does the flux density.
Unit: watt per square meter
Can be measured directly, as the energy propagates through an area during a specific time interval.
Irradiance is the radiant flux received by a surface per unit area.
Exitance or emittance (old term): flux density emitted from a surface.
Note: Irradiance is often called "intensity" in branches of physics other than radiometry, but in radiometry this usage leads to confusion with radiant intensity.


Radiant intensity
For point sources, or distant sources of small
extent, the flux density can also be measured per
unit solid angle

The radiant intensity is the radiant flux per unit solid angle.
Unit: W/sr = watt per steradian

Basic principle
Preservation of energy: a constant light source must produce the same amount of energy through a solid angle regardless of distance to the source.
The radiant intensity is constant.
The radiant flux density decreases with the square of the distance to the source.


Radiometric chain


Interaction between light and matter
Most types of light-matter interactions can be represented by
n = the material's refractive index
α = the material's absorption coefficient
Both parameters depend on the wavelength λ.
More complex interactions include polarization effects or non-linear effects.


Light incident upon a surface
When light meets a surface:
Some part of it is transmitted through the new medium, possibly with another speed and direction.
Some part of it is absorbed by the new medium; usually the light energy is transformed to heat.
Some part of it is reflected.
All these effects are different for different wavelengths!

Basic principle
Based on preservation of energy, the incident energy is split into transmitted, absorbed and reflected parts:
E0 = E1 + E2 + E3 (figure)


Refraction
The light that is transmitted into the new
medium is refracted due to the change in light
speed
Snell's law of refraction:
Snell's law states that the ratio of the sines of the angles of incidence and refraction is equivalent to the ratio of phase velocities in the two media, or equivalently to the reciprocal of the ratio of the indices of refraction:
sin(θ1)/sin(θ2) = v1/v2 = n2/n1

Absorption
Absorption implies attenuation of transmitted
or reflected light
Materials get their colors as a result of different amounts of absorption for different wavelengths.
E.g., a red object attenuates wavelengths in the red band less than in other bands.


Absorption
The absorption of light in matter depends on the length that the light travels through the material:
a = e^(-αx)
a = attenuation of the light (0 ≤ a ≤ 1)
α = the material's absorption coefficient
x = length that the light travels in the material
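As an illustrative example (values chosen here, not from the slides): with α = 0.5 mm^-1 and a path length x = 2 mm, the attenuation is a = e^(-0.5*2) = e^-1 ≈ 0.37, i.e. about 37% of the light remains.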

Absorption spectrum
The spectrum of the reflected/transmitted light is given by
s2(λ) = a(λ) s1(λ)
s1 = incident spectrum
s2 = reflected/transmitted spectrum
a = absorption spectrum (0 ≤ a(λ) ≤ 1)

Reflection
Reflection is highly dependent on the surface type (figure).
A real surface is often a mix between the two cases.

Emission
Almost independently of its interaction with incident light, any object, even one that is not considered a light source, emits electromagnetic radiation, primarily in the IR band, based on its temperature.


Scattering
All media (except vacuum) scatter light, e.g., air, water, glass etc.
We can think of the medium as consisting of small particles which, with some probability, reflect the light in any possible direction.
Different probability for different directions.
Weak effect and roughly proportional to λ^-4.
In general, the probability depends also on the distribution of particle sizes.

Scattering
Scattering means that the
light ray does not travel
along a straight line
through the medium
There is a probability that a
certain photon exits the
medium in another
direction than it entered.

Examples:
The sky is blue because of scattering of sunlight
A strong laser beam
becomes visible in air
