What is an image?
• An image is a 2D function I(x,y), where x and y
are spatial coordinates and the value of I at any
pair of coordinates (x,y) is called the intensity or
gray level.
Image Processing:
Input: Image → Output: Image
Image Analysis/Understanding:
Input: Image → Output: Measurements
Computer Vision:
Input: Image → Output: High-level description
Aspects of image processing
Processing of gray levels
s = c(r + ε)^γ, where ε is the offset
• Gamma correction:
- It is a process used to correct power-law response phenomena, such as the nonlinear intensity response of CRT displays
Power-Law Transformations
γ=c=1: identity
Gamma correction in CRT
Application of Power-Law
Transformations
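A minimal sketch of the power-law transformation s = c·r^γ, assuming a grayscale image normalized to [0, 1] (the function name and parameter defaults are illustrative, not from the slides):

```python
import numpy as np

def gamma_correct(image, c=1.0, gamma=1.0):
    """Power-law (gamma) transformation s = c * r**gamma.

    `image` is assumed to be a float array scaled to [0, 1];
    gamma < 1 brightens dark regions, gamma > 1 darkens them.
    """
    return c * np.power(image, gamma)

r = np.array([0.0, 0.25, 0.5, 1.0])
s = gamma_correct(r, gamma=2.2)  # a CRT-style correction exponent
```

With γ = c = 1 the transformation is the identity, matching the slide.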
Piecewise-Linear Transformation
• Contrast Stretching
- To increase the dynamic range of the gray levels
in the image being processed
Contrast Stretching
• The locations of (r1, s1) and (r2, s2) control the shape of the transformation function
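As a sketch, the piecewise-linear transformation through (r1, s1) and (r2, s2) can be implemented with linear interpolation over the control points (assuming an 8-bit [0, 255] range; the function name is illustrative):

```python
import numpy as np

def contrast_stretch(img, r1, s1, r2, s2):
    """Piecewise-linear transformation through (r1, s1) and (r2, s2).

    Maps [0, r1] -> [0, s1], [r1, r2] -> [s1, s2], [r2, 255] -> [s2, 255]
    linearly, stretching the dynamic range between r1 and r2.
    """
    return np.interp(img, [0, r1, r2, 255], [0, s1, s2, 255])

img = np.array([0, 100, 125, 150, 255], dtype=float)
stretched = contrast_stretch(img, r1=100, s1=50, r2=150, s2=200)
```

Choosing s1 < r1 and s2 > r2 steepens the middle segment, which is what increases the dynamic range of the mid-tones.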
Gray-Level Slicing
– One approach is to display a high value for all gray levels in the range of interest and a low value for all other gray levels, producing a binary image
– The second approach is to brighten the desired
range of gray levels but to preserve the
background and gray-level tonalities in the
image
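Both slicing approaches can be sketched in a few lines (function names and the 8-bit output values are illustrative assumptions):

```python
import numpy as np

def slice_binary(img, lo, hi, high=255, low=0):
    # Approach 1: high value inside [lo, hi], low value elsewhere -> binary image
    return np.where((img >= lo) & (img <= hi), high, low)

def slice_preserve(img, lo, hi, high=255):
    # Approach 2: brighten [lo, hi] but preserve the background tonalities
    return np.where((img >= lo) & (img <= hi), high, img)
```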
Image enhancement in spatial
domain
Bit-Plane Slicing
• To highlight the contribution made to the total image appearance by specific bits in the image
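For an 8-bit image, extracting a single bit-plane is a shift and a mask (a minimal sketch; the function name is illustrative):

```python
import numpy as np

def bit_plane(img, k):
    """Extract bit-plane k (0 = least significant) of an 8-bit image."""
    return (img >> k) & 1

img = np.array([0, 1, 128, 255], dtype=np.uint8)
msb = bit_plane(img, 7)  # plane 7 carries most of the visual structure
```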
Normalized Histogram:
It gives an estimate of the probability of occurrence of gray level l:
p(l) = n_l / n
where n_l is the number of pixels with gray level l and n is the total number of pixels in the image
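The definition p(l) = n_l / n maps directly to a bin count divided by the pixel count (a sketch assuming an 8-bit image; the function name is illustrative):

```python
import numpy as np

def normalized_histogram(img, levels=256):
    """p(l) = n_l / n for each gray level l."""
    counts = np.bincount(img.ravel(), minlength=levels)  # n_l per level
    return counts / img.size                             # divide by n

img = np.array([[0, 0], [1, 2]], dtype=np.uint8)
p = normalized_histogram(img)
```

Since the p(l) are relative frequencies, they always sum to 1.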
• Histogram transformation
Types of processing:
- Histogram equalization
- Histogram matching (specification)
- Local enhancement
- Use of Histogram statistics for image
enhancement
Histogram Equalization
• For gray levels that take on discrete values, we deal with probabilities: p(r_k) = n_k / n
• The goal is a flat (uniform) histogram
Histogram equalization
• To equalize the histogram, probabilities of
occurrences of the gray levels in the image are
taken into account.
- That is the normalized histogram is used to
spread the histogram throughout the range of
the gray level of the image.
• The transformation is
s_k = T(r_k) = Σ_{j=0}^{k} p_r(r_j) = Σ_{j=0}^{k} n_j / n
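The transformation s_k = Σ_{j≤k} n_j / n is a cumulative sum of the normalized histogram; in the usual discrete implementation (a sketch, assuming 8-bit images and the standard scaling of the CDF by L−1 = 255):

```python
import numpy as np

def equalize(img, levels=256):
    """Histogram equalization: s_k = (L - 1) * sum_{j<=k} n_j / n."""
    p = np.bincount(img.ravel(), minlength=levels) / img.size
    cdf = np.cumsum(p)                        # running sum of p_r(r_j)
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[img]                           # map each r_k to its s_k
```

Because the CDF is non-decreasing, the mapping preserves the ordering of gray levels while spreading them over the full range.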
Histogram matching/specification
• Histogram equalization does not allow interactive
image enhancement and generates only one result
that is an approximation to a uniform histogram
5. For each pixel in the original image, if the value of that pixel is
rk, map this value to its corresponding level sk, then map level
sk into final level zk
- In these mappings the pre-computed values from step (2) and
step (4) are used
z = G⁻¹(s) ⇒ z = G⁻¹[T(r)]
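The mapping z = G⁻¹[T(r)] can be sketched with a nearest-CDF lookup for the inverse (an assumption of this sketch: G⁻¹ is realized via `np.searchsorted` on the target CDF; function and parameter names are illustrative):

```python
import numpy as np

def match_histogram(img, target_hist, levels=256):
    """Map img so its histogram approximates target_hist (specification)."""
    # Step 2: s = T(r), the CDF of the input image
    T = np.cumsum(np.bincount(img.ravel(), minlength=levels)) / img.size
    # Step 4: G(z), the CDF of the specified histogram
    G = np.cumsum(target_hist) / np.sum(target_hist)
    # Step 5: z = G^{-1}[T(r)], realized as a nearest-CDF lookup table
    lut = np.searchsorted(G, T).clip(0, levels - 1).astype(np.uint8)
    return lut[img]
```

Matching an image against its own histogram should return the image unchanged, which makes a convenient sanity check.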
Histogram matching/specification
• The principal difficulty in applying the histogram
specification method to image enhancement lies in
being able to construct a meaningful histogram
So,
- Either a particular probability density function (such as a Gaussian density) is specified and then a histogram is formed by digitizing the given function,
- Or the desired histogram shape is specified directly and used as the target
• Arithmetic
- Subtraction/Addition
- Multiplication / Division (as multiplication of
reciprocal)
- May use multiplication to implement gray-level
masks
Logical operations
Image Subtraction
• Image subtraction of a mask from an image
is given as:
g(x, y) = f(x, y) − h(x, y)
where f(x, y) is the image and h(x, y) is the image mask
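A sketch of g(x, y) = f(x, y) − h(x, y) for 8-bit images (the clipping back to [0, 255] is an implementation assumption, needed because the raw difference can be negative):

```python
import numpy as np

def subtract(f, h):
    """g(x, y) = f(x, y) - h(x, y), clipped back to the 8-bit display range."""
    return np.clip(f.astype(int) - h.astype(int), 0, 255).astype(np.uint8)

f = np.array([[200, 50]], dtype=np.uint8)
h = np.array([[100, 100]], dtype=np.uint8)
g = subtract(f, h)  # negative differences clip to 0
```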
- Blurring / Smoothing
- Sharpening
- Edge Detection
Purposes:
- Reduction of ‘irrelevant’ details
- Noise reduction
- Reduction of ‘false contours’ (e.g. produced by
zooming)
Spatial filtering
• Sharpening
- Used to highlight fine detail in an image
- Or to enhance detail that has been blurred
• Edge Detection
Purposes:
- Pre-processing
- Sharpening
Spatial filtering
• A mask of fixed size, with a weight (coefficient) at each of its locations, is used to do the filtering
• The mask is moved from point to point on
the image
• At each point (x,y), the response of the
filter at that point is calculated using a
predefined relationship
Spatial filtering
• For a linear spatial filtering, the response is
given by a sum of products of the filter
coefficients and the corresponding image
pixels in the area spanned by the filter mask
R = w_1 z_1 + w_2 z_2 + ... + w_9 z_9
In general, for an m×n mask:
R = Σ_{i=1}^{mn} w_i z_i
Spatial Filtering
Image pixels (N8)
z1 z2 z3
z4 z5 z6
z7 z8 z9
Spatial filtering
• Linear filtering of an image of size M×N with a filter mask of size m×n is given as
g(x, y) = Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) f(x+s, y+t)
where a = (m−1)/2 and b = (n−1)/2
• For a weighted-average filter, the response is normalized by the sum of the mask coefficients:
g(x, y) = [Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) f(x+s, y+t)] / [Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t)]
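The double sum above can be sketched as a direct correlation loop (assumptions of this sketch: odd mask dimensions and zero padding at the borders; the function name is illustrative):

```python
import numpy as np

def linear_filter(f, w):
    """g(x,y) = sum_{s,t} w(s,t) f(x+s, y+t), with zero padding at the borders."""
    m, n = w.shape                    # mask size m x n (odd)
    a, b = m // 2, n // 2             # a = (m-1)/2, b = (n-1)/2
    fp = np.pad(f.astype(float), ((a, a), (b, b)))
    g = np.zeros(f.shape, dtype=float)
    for s in range(m):
        for t in range(n):
            # shift-and-accumulate: one mask coefficient per pass
            g += w[s, t] * fp[s:s + f.shape[0], t:t + f.shape[1]]
    return g
```

A mask with a single 1 at its centre reproduces the input, which is a quick way to check the index bookkeeping.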
Smoothing Filtering
• Averaging using different mask sizes, e.g. the 3×3 box mask
1/9 ×  1 1 1
       1 1 1
       1 1 1
and the 3×3 weighted-average mask
1/16 × 1 2 1
       2 4 2
       1 2 1
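Both smoothing masks can be written down directly; since each sums to 1, their response on a constant region leaves the gray level unchanged (a sketch; `response` is an illustrative helper computing R = Σ w_i z_i for one neighborhood):

```python
import numpy as np

# 3x3 box (simple average) mask: all coefficients 1, scaled by 1/9
box = np.ones((3, 3)) / 9.0

# 3x3 weighted-average mask: the centre pixel counts most, scaled by 1/16
weighted = np.array([[1, 2, 1],
                     [2, 4, 2],
                     [1, 2, 1]]) / 16.0

def response(mask, neighborhood):
    """R = sum_i w_i z_i over a single 3x3 neighborhood."""
    return float(np.sum(mask * neighborhood))
```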
Spatial Filtering
• Non-linear filters (e.g. the median filter) also use pixel neighbourhoods but do not explicitly use coefficients
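The median filter illustrates this: it ranks the neighborhood values instead of weighting them (a sketch; edge-replication padding and the function name are assumptions of this implementation):

```python
import numpy as np

def median_filter(f, size=3):
    """Non-linear smoothing: each pixel becomes the median of its neighborhood."""
    a = size // 2
    fp = np.pad(f, a, mode='edge')    # replicate edges instead of zero padding
    out = np.empty_like(f)
    for i in range(f.shape[0]):
        for j in range(f.shape[1]):
            out[i, j] = np.median(fp[i:i + size, j:j + size])
    return out
```

Because an isolated outlier never reaches the middle of the sorted neighborhood, salt-and-pepper noise is removed without blurring edges the way averaging does.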
• First derivative:
∂f/∂x = f(x+1) − f(x)
• Second derivative:
∂²f/∂x² = f(x+1) + f(x−1) − 2f(x)
Sharpening filter
• The first derivative enhances any sharp transition; the gradient magnitude is commonly approximated as |∂f/∂x| + |∂f/∂y|
Sharpening filter
• Partial second order derivative in X-direction:
∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y)
• So,
∇²f = [f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1)] − 4f(x, y)
Sharpening filter
• If the centre coefficient of the Laplacian mask is negative, the sharpened image is obtained as
g(x, y) = f(x, y) − ∇²f(x, y)
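The Laplacian and the sharpening step above translate directly into array shifts (a sketch; edge-replication padding and the function names are assumptions of this implementation):

```python
import numpy as np

def laplacian(f):
    """Discrete Laplacian: f(x+1,y) + f(x-1,y) + f(x,y+1) + f(x,y-1) - 4 f(x,y)."""
    fp = np.pad(f.astype(float), 1, mode='edge')
    return (fp[2:, 1:-1] + fp[:-2, 1:-1] +
            fp[1:-1, 2:] + fp[1:-1, :-2] - 4.0 * fp[1:-1, 1:-1])

def sharpen(f):
    """g(x,y) = f(x,y) - laplacian(f), for a mask with a negative centre coefficient."""
    return f - laplacian(f)
```

On a constant region the Laplacian is zero, so sharpening leaves flat areas untouched and only boosts intensity transitions.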