

CHAPTER 1
INTRODUCTION
1.1 Biometric system:
Among biometric indicators, fingerprints offer one of the highest levels of reliability and have
been used extensively by forensic experts in criminal investigations. Biometrics refers to identifying an
individual based on his or her physiological or behavioral characteristics, and to the capability to
reliably distinguish between an authorized person and an impostor.
Fingerprint-based identification is one of the most mature and proven techniques. A
fingerprint is the pattern of ridges and valleys on the surface of the finger. The uniqueness of a
fingerprint is determined by the overall pattern of ridges and valleys as well as by local
ridge anomalies (a ridge bifurcation or a ridge ending), called minutiae points. Although
fingerprints possess rich discriminatory information, designing a reliable automatic fingerprint
matching algorithm is very challenging.
1.2 Fingerprint pattern:
The fingerprint of an individual is unique and remains unchanged over a lifetime. A
fingerprint is formed from an impression of the pattern of ridges on a finger. A ridge is defined as
a single curved segment, and a valley is the region between two adjacent ridges.
The minutiae, which are the local discontinuities in the ridge flow pattern, provide the
features that are used for identification. Details such as the type, orientation, and location of
minutiae are taken into account when performing minutiae extraction.
Fingerprint analysis requires comparison of several features of the print pattern and of the
minutiae points. Print patterns represent aggregate characteristics of ridges, while minutiae points
represent unique features found within the patterns.
The major minutia features of fingerprint ridges are:
1. Ridge ending.
2. Bifurcation.
3. Short ridge (dot).


Minutiae and patterns are very important in the analysis of fingerprints, since no two
fingerprints (even from the same person) have been proven to be identical.
Ridge ending:
A ridge ending is a point at which a ridge terminates.
Bifurcation:
A bifurcation is a point at which a single ridge splits into two ridges.
Short ridges:
Short ridges (dots) are ridges that are significantly shorter than the average ridge length on
the fingerprint.


Fig 1.2.1 Ridge ending    Fig 1.2.2 Bifurcation    Fig 1.2.3 Dot

1.3 Fingerprint ridge patterns:
1. Arch:
An arch is a pattern where the ridges enter from one side of the finger, rise in the center
forming an arc, and exit from the other side of the finger.
2. Loop:
A loop is a pattern where the ridges enter from one side of the finger, form a curve, and
tend to exit from the same side they entered.
3. Whorl:
In a whorl, the fingerprint ridges form concentric circles around a central point of the finger.





Image Enhancement:
When a fingerprint is captured, it contains a lot of redundant information. Problems such
as scars, too-dry or too-moist fingers, or incorrect pressure must also be overcome to obtain an
acceptable image. The following image enhancement techniques are typically applied:

Normalization:
The purpose is to bring the pixel intensity values of the image into a familiar gray-
scale range (0 to 255 for 8-bit-per-pixel systems). For example, if the intensity range of an
image is 50 to 180 and the desired range is 0 to 255, normalization entails subtracting 50 from
each pixel intensity, making the range 0 to 130; each pixel intensity is then multiplied by
255/130, making the range 0 to 255.
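This linear stretch can be sketched as a small Python/NumPy function (illustrative only; the report's own code is MATLAB, and the function name here is our choice):

```python
import numpy as np

def normalize_range(img, lo=0.0, hi=255.0):
    """Linearly stretch pixel intensities to the range [lo, hi]:
    subtract the minimum, then scale by (hi - lo) / (max - min)."""
    img = img.astype(np.float64)
    mn, mx = img.min(), img.max()
    return (img - mn) * (hi - lo) / (mx - mn) + lo

# the example from the text: intensities in [50, 180] become [0, 255]
pixels = np.array([50.0, 100.0, 180.0])
print(normalize_range(pixels))  # 50 -> 0.0, 180 -> 255.0
```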

Low-pass Filtering:
The purpose is to smooth the image so that no point in the image differs from its
surroundings to a great extent. A low-pass filter averages out rapid changes in intensity. An
image leaving a low-pass filter appears blurred, but low-pass filtering suppresses the
noise and brings out the gradual changes in the intensity of the image.
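A minimal averaging low-pass filter can be sketched in Python as follows (illustrative; the kernel size and edge padding are our own assumptions, not details from the report):

```python
import numpy as np

def mean_filter(img, k=3):
    """k x k averaging (low-pass) filter with edge padding:
    each output pixel is the mean of its k x k neighbourhood."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode='edge')
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

# a single bright pixel (impulse noise) is attenuated by the averaging
img = np.zeros((5, 5)); img[2, 2] = 9.0
print(mean_filter(img)[2, 2])  # 1.0: the spike is spread over 9 pixels
```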

1.4 Existing System:
The most popular approach to fingerprint enhancement is based on a
directional Gabor filtering kernel in the spatial domain. The algorithm uses a properly oriented
Gabor kernel, which has frequency-selective and orientation-selective properties, to perform the
enhancement. It acquires the block frequency through STFT analysis and also estimates the local
ridge orientation. Because of the complex input contexts of a low-quality image, not all of the
corrupted regions of the fingerprint can be recovered clearly, as it is difficult to accurately
estimate some parameters of the filters through a simple analysis. Thus, the algorithm needs to
be improved to enhance the unrecoverable regions of low-quality images.
1.4.1 Disadvantages:
1. Enhancement performance on low-quality fingerprint images is not satisfactory.
2. The filtering concepts do not provide accurate results.
3. The convolution matrix output is very low.



1.5 Proposed system:
To overcome the shortcomings of the existing algorithms on fairly poor
fingerprint images with cracks and scars, dry skin, or poor ridge and valley contrast, a
two-stage scheme is proposed to enhance the low-quality fingerprint image in both the spatial domain and
the frequency domain, based on learning from the images. The spatial domain stage uses a ridge
compensation filter, and the frequency domain stage uses a bandpass filter as a radial filter and applies an
angular filter to the coherence image. Minutiae extraction is done by the block filter, and the minutiae
matching score is obtained using the minutiae matching technique.
1.5.1 Advantages:
1. Reduced computational complexity.
2. The minutiae matching score is comparatively studied with cancelable biometrics.


















CHAPTER 2
LITERATURE SURVEY

2.1 Introduction:
This section identifies the related concepts and explains the existing techniques and
methods and how they can be used in the project. An overall introduction to the
previous systems is given below.

2.2 Shlomo Greenberg, Mayer Aladjem, Daniel Kogan and Itshak Dimitrov, "Fingerprint Image
Enhancement Using Coherence Diffusion Filter and Gabor Filter", Information & Computational
Science 9:1 (2012), 153-160
This paper deals with fingerprint image enhancement using a coherence diffusion filter and a
Gabor filter. The fingerprint acquired by the scanner is usually of low quality and should be
enhanced before the feature extraction and matching processes for authentic and reliable user
identification. Enhancement of fingerprint images improves the ridge-valley structure and increases
the number of correct features, thereby improving the overall performance of the recognition
system.
The diffusion method:
It gives better results in regions of the image with high-curvature ridges, that is, in the region
surrounding the core point, but produces unsatisfactory results for cut and broken ridges: it cannot
enhance the cut and broken ridge areas in the uniform ridge-valley region of the image. A Gabor
filter alone, in contrast, gives distorted results for the high-curvature region of the image
surrounding the core point.
Gabor Filter method:
It enhances the fingerprint image in all areas of the ridge-valley pattern, including the core
point. It enhances the core region without any distortion, improves the clarity between ridges and
valleys, removes the gaps between broken and scratched ridges, and increases the number
of correct features required by matching algorithms for fingerprint
recognition.




Drawbacks:
The Poincare index method does not identify the core point reliably, because the initial
setup varies considerably across different views, such as the side view and top view.
It is used only for scratched and discontinuous ridges in the fingerprint.

2.3 Mayer Aladjem, Daniel Kogan, Itshak Dimitrov, Shlomo Greenberg, "Fingerprint Image
Enhancement Using Filtering Techniques", sholomog@ee.bgu.ac.el, 2008
In this paper, the fingerprint image enhancement scheme uses two methods. The first
one is carried out in three steps:
1. local histogram equalization
2. Wiener filtering
3. image binarization

Local histogram:
A local histogram equalization is used for contrast expansion. Contrast is expanded
for most of the image pixels, and the transformation improves the detectability of many image
features.
Wiener filtering is used for noise reduction. The binarization process is applied by adaptive
thresholding based on the local intensity mean. Thinning is then carried out through an
algorithm that provides good results on fingerprints. Finally, morphological
filtering is applied to eliminate artifacts in noisy regions and to fill some gaps in valid ridgelines.
The second method uses anisotropic filters. It uses both the local intensity orientation and
an anisotropic measure to control the shape of the filter.
Benefits:
It uses binarization and thinning on fingerprint images, which increases the
performance of minutiae extraction in the ridge structure.
When applying the anisotropic filter, only orientation information is required. The
enhancement algorithm replaces the Gabor filter with the anisotropic filter,
which eliminates the need to estimate local frequency information.




Drawbacks:
The anisotropic filter works faster on corrupted images, but it identifies only
orientation information.
The binarization and thinning processes are used only for minutiae extraction in the
fingerprint image.
2.4 Philippe Parra, "Fingerprint Minutiae Extraction and Identification Procedure",
IEEE Conf., 2009
The first technique is to simply average the gradient of each pixel inside the window. As
we are only interested in the orientation of the gradient and not its direction, we cannot directly
average the vectors. For example, the two unit vectors (1,0) and (-1,0) point in opposite
directions (0 and π) but have the same orientation 0. The direction is a number between 0 and 2π,
whereas the orientation lies between 0 and π.
The trick for averaging the gradient vectors is to double the angle of their polar
decomposition. In the example, (-1,0) becomes (1,0), and (1,0) is not changed. Then
those doubled vectors are averaged to give the average gradient: half its angle represents the
orientation of the gradient, and its norm represents the reliability of the angle. In the example, the
average gradient is (1,0): its orientation is 0 and its norm is 1, which is what we expect. In the
fingerprint image we are looking for the ridge orientation, so we simply need to add π/2 to the angle
to get the orthogonal orientation.
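The doubled-angle averaging described above can be sketched in Python (illustrative; the function name and the use of the normalized norm as a reliability measure follow our reading of the text):

```python
import numpy as np

def average_orientation(gx, gy):
    """Average gradient orientations by doubling the polar angle, so that
    opposite directions (theta and theta + pi) reinforce instead of cancel."""
    theta = np.arctan2(gy, gx)
    mag = np.hypot(gx, gy)
    # sum the doubled-angle vectors
    vx = np.sum(mag * np.cos(2 * theta))
    vy = np.sum(mag * np.sin(2 * theta))
    orient = 0.5 * np.arctan2(vy, vx)            # mean orientation, half the angle
    reliability = np.hypot(vx, vy) / np.sum(mag)  # 1 = perfectly aligned
    return orient, reliability

# (1,0) and (-1,0): opposite directions, same orientation 0
o, r = average_orientation(np.array([1.0, -1.0]), np.array([0.0, 0.0]))
# orientation ~ 0, reliability ~ 1, as the text expects
```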
Conclusion:
This paper deals with minutiae matching points and minutiae extraction, which can be
applied at the enrollment of a fingerprint image by the user. The system follows two steps; the first step
uses a fingerprint enhancement technique and minutiae filtering.
The fingerprint enhancement technique uses a Gabor filter, while our proposed system uses an
oriented directional Gabor filter for the orientation of low-quality images.







2.5 A. K. Jain, S. Prabhakar, L. Hong and S. Pankanti, "Filterbank-Based Fingerprint Matching",
IEEE Trans., vol. 9, no. 5, May 2000

In this paper, the four main steps of the feature extraction algorithm are:
1) determine a reference point and a region of interest for the fingerprint image;
2) tessellate the region of interest around the reference point;
3) filter the region of interest in eight different directions
using a bank of Gabor filters (eight directions are required to completely capture the local ridge
characteristics in a fingerprint, while only four directions are required to capture the global
configuration);
4) compute the average absolute deviation from the mean (AAD) of gray values in
individual sectors of the filtered images to define the feature vector, or FingerCode.
The reference point of a fingerprint is defined as the point of maximum curvature of
concave ridges, and an orientation field is defined for the fingerprint image. Local ridge orientation is
defined per block or pixel: the image is divided into a set of w x w non-overlapping blocks, and
a single orientation is defined for each block. This process does not provide reliable metrics.
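Step 4 (the AAD feature) can be sketched in Python as follows (illustrative; the sector labelling here is a toy stand-in for the circular tessellation the paper uses):

```python
import numpy as np

def aad_features(filtered, sector_labels, n_sectors):
    """Average absolute deviation from the mean of gray values per sector,
    the per-sector statistic used to build a FingerCode-style feature vector."""
    feats = []
    for s in range(n_sectors):
        vals = filtered[sector_labels == s]
        feats.append(np.abs(vals - vals.mean()).mean())
    return np.array(feats)

# toy 4x4 "filtered image" split into two sectors
img = np.array([[0., 2., 5., 5.],
                [0., 2., 5., 5.],
                [0., 2., 5., 5.],
                [0., 2., 5., 5.]])
labels = np.array([[0, 0, 1, 1]] * 4)
print(aad_features(img, labels, 2))  # sector 0 varies (AAD 1), sector 1 is flat (AAD 0)
```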

Conclusion:
The feature extraction algorithm uses a Gabor filter, which eliminates noise, but pixel-based
processing does not provide an efficient ridge pattern in the fingerprint image. The reference point may be
located, but the recovered pattern is not determined efficiently.










CHAPTER 3
SYSTEM ARCHITECTURE
3.1 PROPOSED ARCHITECTURE:


















Fig 3.1 Proposed architecture


[Fig 3.1 shows the pipeline: a low-quality fingerprint image enters Stage 1 (local normalisation,
local orientation estimation, ridge compensation filter), then Stage 2 (the coherence image is
divided into subimages using the FFT, a frequency bandpass filter combining an angular filter and
a radial filter is applied, and the enhanced image is reconstructed using the IFFT), followed by
minutiae matching (binarization, block filtering, minutiae extraction, and minutiae matching
against samples to produce a score).]


The proposed system achieves privacy in a biometric system and works with cancelable
biometrics. The enhancement process consists of two stages. The first stage eliminates the
redundancy in the fingerprint; it uses a ridge compensation filter to recover the ridge structure. The
second stage uses the frequency-domain filters, angular and radial, to enhance the image.
The computational processing speed is increased by dividing the image into segments, each
of which goes through the filtering process. Minutiae extraction uses orientation matrices, and the
matching score is revealed while protecting privacy.

3.2 Level-0 DFD


Fig 3.2 Level-0 DFD
The level-0 data flow diagram uses a normalization process that identifies ridge structures
such as core and delta points. It performs segmentation on the image so that it contains only the
minutiae points. The normalized image is segmented and masked. Block processing
is used in the normalization.









3.3 Level-1 DFD




Fig 3.3 Level-1 DFD
The level-1 data flow diagram uses the normalized image, which is segmented based on
block processing, and applies the gradient method to perform the convolution matrix operation.
Orientation estimation uses pixelwise processing on the normalized image and an orientation
smoothing method with a Gaussian window to correct the estimation.




3.4 Level 2 DFD:



Fig 3.4 Level-2 DFD
The level-2 data flow diagram shows that the ridge compensation filter takes its input from the
estimated image, which contains the orientation values that determine the ridge
pattern contrast and position values.
This filter uses the height and width as parameters; the compensation filter uses the
weighted constant values to control the parameters, and it uses an affine transform to mask the image
with the minutiae points. The level-2 data flow diagram takes the first-stage input and the low-quality
input and performs the frequency bandpass filtering operations, which include the radial filter and
angular filter functionalities.
The radial filter applies the FFT to the subimages to obtain their values. The angular filter is
centered on the orientation image, and its bandwidth is inversely proportional to the coherence
image used. The reconstructed second-stage enhanced image is obtained by adding the angular and
radial filter outputs. The second-stage enhanced image is matched with templates that
reside in the public database; binarization and thinning are applied to the images, and
minutiae extraction is defined using the orientation and smoothing methods applied to the images.
The minutiae matching score is obtained by applying the minutiae matching technique to the
enhanced and template images.
Flow diagram:
A data flow diagram is a graphical representation of the flow of data through an
information system, modeling its process aspects. Often it is a preliminary step used to create
an overview of the system, which can later be elaborated. DFDs can also be used for the
visualization of data processing (structured design). There are different notations for drawing data
flow diagrams, defining different visual representations for processes, data stores, data flows, and
external entities.
External entity:


External entities are objects outside the system with which the system communicates.
External entities are sources and destinations of the system's inputs and outputs.
Data flow:


Data flows are pipelines through which packets of information flow. The arrows are labeled with
the name of the data that moves through them.
Process:




A process transforms incoming data flows into outgoing data flows. A process, or
transform, takes one type of data as input and outputs a different type; it is a
conversion of data from one form to another.
Decision box:
It represents the decision making process.


Associations:

Associations between actors and use cases are indicated in use case diagrams by solid
lines. An association exists whenever an actor is involved with an interaction described by a use
case. The arrowheads imply control flow and should not be confused with data flow.












3.5 Activity Diagram:

Fig 3.5 Activity diagram
[Fig 3.5 shows the activity flow: low-quality fingerprint → local normalization → local
orientation estimation → spatial ridge compensation filter → first-stage enhancement →
parameter estimation → frequency bandpass filter → second-stage enhancement → minutiae
extraction and minutiae matching against the public database → score.]


Activity diagram:
Activity diagrams are graphical representations of workflows of stepwise activities and
actions with support for choice, iteration and concurrency. In the Unified Modeling Language,
activity diagrams can be used to describe the business and operational step-by-step workflows of
components in a system. An activity diagram shows the overall flow of control.
Initial node:

The black circle represents the start (initial state) of the workflow.
Final node:

An encircled black circle represents the end (final state).















CHAPTER 4
FILTERS USED
4.1 Ridge Compensation Filter:
This system uses a local ridge compensation filter with a rectangular window aligned to the
local orientation. The pixel coordinates are expressed in new axes via an affine transform to match
the local orientation.
4.2 Frequency Bandpass Filter:
4.2.1 Radial Filter:
This scheme uses an exponential bandpass filter as the radial filter. It provides sharp and fast
attenuation in the fingerprint images and can clearly reduce the noise.
We use the local ridge frequency as the center frequency in the enhanced image. Assuming
the center frequency equals the local ridge frequency, the local ridge frequency can be
determined in the frequency domain by the FFT.

4.2.2 Angular Filter:
The angular bandwidth of the filter is taken as a piecewise linear function of the distance
from the singular points, such as the core and delta. The angular filter uses a raised cosine filter
and uses the coherence image to identify the bandwidth.
We rely on the flow orientation and the angular coherence measure, which is more robust
than singular point detection. A higher coherence value means that the orientation of the
center block is more similar to those of its neighbours. In a fingerprint, the coherence value is expected
to be low if the central point is close to a singularity.

4.3 Block Filter :
The binarized image is thinned using the block filter to reduce the thickness of all ridge
lines to a single-pixel width so that minutiae points can be extracted effectively.
Thinning does not change the location or orientation of minutiae points compared to the
original fingerprint, which ensures accurate estimation of the minutiae points.


Thinning preserves the outermost pixels by placing white pixels at the boundary of the
image; as a result, the first five and last five rows and the first five and last five columns are
assigned a value of one. Dilation and erosion are used to thin the ridges.



























CHAPTER 5
MODULES

5.1 Module List :
There are four modules. They are:
1. Local Normalisation
2. Local Orientation Estimation
2.1 Ridge Compensation Filter
3. Second Stage Enhancement
3.1 Coherence Image
3.2 Frequency Bandpass Filtering
3.3 Reconstruct Enhanced Image
4. Minutiae Matching
4.1 Binarization
4.2 Block Filtering
4.3 Minutiae Extraction
4.4 Minutiae Matching Score













5.2 Module Description:
5.2.1 Local Normalisation:
This step takes a low-quality fingerprint image as input and is used to reduce the local
variations and standardize the intensity distributions, so that the local
orientation can be estimated consistently. The pixelwise operation does not change the clarity of the ridge and furrow
structures but reduces the variations in gray-level values along ridges and furrows, which
facilitates the subsequent processing steps.

Img(i, j) is the gray-level value of the fingerprint image at pixel (i, j), norimg(i, j) is the
normalized value at pixel (i, j), and coeff is the amplification multiple of the normalized image.
M is the mean of the subimage and V is its variance. M0 and V0 are the desired
mean and variance values, respectively. In the experiments, we used M0 = 128 and V0 = 128 ×
128.
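Assuming the step follows the standard mean/variance normalization formulation (our assumption; the report gives the symbols but not the equation), a Python sketch is:

```python
import numpy as np

def normalize_mv(img, M0=128.0, V0=128.0 * 128.0):
    """Pixelwise normalization to a desired mean M0 and variance V0:
    pixels above the image mean M are mapped to M0 + sqrt(V0*(I-M)^2/V),
    pixels below it to M0 - sqrt(V0*(I-M)^2/V), where V is the variance."""
    img = img.astype(np.float64)
    M, V = img.mean(), img.var()
    dev = np.sqrt(V0 * (img - M) ** 2 / V)
    return np.where(img > M, M0 + dev, M0 - dev)

# the output mean becomes M0 and the output variance becomes V0
img = np.array([[100., 110.], [120., 130.]])
out = normalize_mv(img)
print(out.mean())  # 128.0
```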
5.2.2 Local Orientation Estimation:
This step determines the dominant direction of the ridges in different parts of the
fingerprint image. This is a critical processing step, and errors occurring at this stage propagate
to the frequency filter. We used the gradient method for orientation estimation and an orientation
smoothing method with a Gaussian window to correct the estimation. For a number of
non-overlapping blocks of size W × W, a single orientation is assigned, corresponding to
the most probable or dominant orientation of the block.

For each pixel in a block, a simple gradient operator, such as the Sobel
mask, is applied to obtain the horizontal gradient value Gx(u, v) and the vertical gradient value Gy(u, v).
The block horizontal and vertical gradients, i.e., Gxx and Gxy, are obtained by adding up
all the pixel gradients of the corresponding direction.
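The gradient-based block orientation estimate can be sketched in Python (illustrative; `np.gradient` stands in for the Sobel mask, and adding π/2 converts the gradient orientation to the ridge orientation):

```python
import numpy as np

def block_orientation(block):
    """Dominant ridge orientation of one W x W block from pixel gradients,
    using the least-squares (doubled-angle) estimate over the block."""
    gy, gx = np.gradient(block.astype(np.float64))
    gxx = np.sum(gx * gx)
    gyy = np.sum(gy * gy)
    gxy = np.sum(gx * gy)
    # orientation of maximal gradient change, rotated 90 degrees to follow the ridges
    return 0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2

# vertical ridges: intensity varies only along x, so the ridge orientation is pi/2
x = np.arange(16)
block = np.sin(2 * np.pi * x / 8)[None, :].repeat(16, axis=0)
print(block_orientation(block))  # pi/2
```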
5.2.3 Ridge Compensation Filter:
The compensation filter uses weighted constant values to control the contrast parameters.
When the contrast parameters of the filter are controlled, the pixels along the ridge are
enhanced, while the blurring effect is restrained and repressed.





5.2.4 Second Stage Enhancement:
5.2.4.1 Coherence image:
The coherence indicates the relationship between the orientation of the central block and
those of its neighbors in the orientation map. The coherence is related to the dispersion measure
of circular data.
The coherence is high when the orientation of the central pixel (x, y) of a window is
similar to that of each of its neighbours (xi, yi), where W is the window size. The obtained coherence
image is used to determine the bandwidth of the angular filter.
5.2.4.2 Frequency Bandpass Filtering:
The smoothed filtered image is divided into overlapping subimages. For each subimage:
1. The FFT is applied to the subimage, and the DC component is removed using the block-filtered image.
2. The angular filter Fa is applied, centered on the local orientation image and with
a bandwidth inversely proportional to the coherence image.
5.2.4.3 Reconstructed Enhanced Image:
This step uses the IFFT technique: the divided subimages are merged to produce the second-
stage enhanced output.
5.2.5 Minutiae Matching:
5.2.5.1 Binarization:
The pre-processing of FRMSM uses binarization to convert the gray-scale image into a binary
image by fixing a threshold value. The pixel values above and below the threshold are set to
1 and 0, respectively.
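A Python sketch of this fixed-threshold binarization (illustrative; the threshold value here is arbitrary):

```python
import numpy as np

def binarize(gray, thresh=128):
    """Fixed-threshold binarization: pixels above the threshold become 1,
    pixels at or below it become 0."""
    return (gray > thresh).astype(np.uint8)

img = np.array([[10, 200], [128, 255]])
print(binarize(img))  # [[0 1] [0 1]]
```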
Block Filter:
The binarized image is thinned using the block filter to reduce the thickness of all ridge
lines to a single-pixel width so that minutiae points can be extracted effectively. Thinning does not change the
location or orientation of minutiae points compared to the original fingerprint, which ensures
accurate estimation of minutiae points. Thinning preserves the outermost pixels by placing white
pixels at the boundary of the image; as a result, the first five and last five rows and the first five and last five
columns are assigned a value of one. Dilation and erosion are used to thin the ridges.



Minutiae Extraction:
The minutiae locations and the minutiae angles are derived after minutiae extraction.
Terminations that lie on the outer boundaries are not considered minutiae points, and the Crossing
Number is used to locate the minutiae points in the fingerprint image. The Crossing Number is defined as
half the sum of the differences between intensity values of circularly adjacent pixels around the candidate pixel.
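The Crossing Number computation can be sketched in Python (illustrative; we assume the standard definition with absolute differences over the eight circularly adjacent neighbours of the centre pixel):

```python
import numpy as np

def crossing_number(window):
    """Crossing Number of the centre pixel of a 3x3 binary (thinned) window:
    half the sum of absolute differences between circularly adjacent neighbours.
    CN == 1 indicates a ridge ending, CN == 3 a bifurcation."""
    p = [window[0, 0], window[0, 1], window[0, 2], window[1, 2],
         window[2, 2], window[2, 1], window[2, 0], window[1, 0]]  # clockwise ring
    return sum(abs(int(p[i]) - int(p[(i + 1) % 8])) for i in range(8)) // 2

ending = np.array([[0, 0, 0], [0, 1, 1], [0, 0, 0]])  # ridge ends at the centre
bifurc = np.array([[1, 0, 1], [0, 1, 0], [0, 0, 1]])  # three branches meet
print(crossing_number(ending), crossing_number(bifurc))  # 1 3
```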



Fig 5.2.5.1 Minutiae matching process

Minutiae Matching:
Minutiae matching is used to compare the input fingerprint data with the template data.
For an efficient matching process, the extracted data is stored in matrix format. Template and
input minutiae are selected as reference points for their respective data sets.









CHAPTER 6
SOFTWARE & HARDWARE SPECIFICATION

6.1 Hardware Requirements:
Processor : Pentium III
RAM : 512 MB and above
Hard Disk : 40 GB and above
Compact Disk : 650 MB
Input Device : Standard keyboard and mouse
Output Device : VGA and high-resolution monitor

6.2 Software Requirements:
Windows 7
Matlab 7.6.0 (R2008)














CHAPTER 7
CONCLUSION

The proposed scheme provides a correct minutiae matching score in cancelable
biometrics. The scheme provides efficient results for the ridge-valley structure and increases speed
compared with other enhancement techniques. It uses the block processing method on the
segmented image, which is efficiently handled by the filter and produces effective results. Our
scheme protects privacy over public databases, and the minutiae extraction results are obtained
accurately. It helps to achieve the best results in fingerprint enhancement, and the
frequency bandpass filter helps to achieve the highest level of steep attenuation in the images. It
identifies the original fingerprint based on the fingerprint verification scheme and gives the
minutiae matching score that identifies similar data stored in the database. In
criminology, such methods are useful to identify persons involved in crimes.












CHAPTER 8
FUTURE ENHANCEMENT

This technique may face difficulties when implemented on real-world problems, because
the database size is large, images from a different database may not match the current
smaller database, and the minutiae matching techniques need to be improved for low-quality
fingerprint results. The above technique is used only in cancelable biometrics, protecting privacy
over the public database. Other filters could be used to achieve the highest level of minutiae extraction
at core and delta points.















APPENDIX
A.1 coding:
clc;
close all;
clear all;
warning off
%% READ LOW QUALITY IMAGE
[filename, pathname] = uigetfile({'*.bmp;*.jpg;*.tif;*.png;*.gif','All Image Files'; ...
'*.*','All Files'}, 'mytitle');
a = strcat(pathname, '/', filename);
I=imread(a);
figure;imshow(I); title('INPUT LOW QUALITY IMAGE')
%% 1st STAGE ENHANCEMENT
% (i) LOCAL NORMALIZATION
blksze = 16;
thresh = 0.1;
[normim, mask] = ridgesegment(I, blksze, thresh);
figure;imshow(normim);title('LOCAL NORMALIZATION IMAGE')
figure;imshow(mask);title('mask image')
function [normim, mask, maskind] = ridgesegment(im, blksze, thresh)
im = normalise(im,0,1);
fun = inline('std(x(:))*ones(size(x))');
stddevim = blkproc(im, [blksze blksze], fun);
mask = stddevim > thresh;
maskind = find(mask);
im = im - mean(im(maskind));
normim = im/std(im(maskind));
function n = normalise(im, reqmean, reqvar)
if ~(nargin == 1 | nargin == 3)
error('No of arguments must be 1 or 3');
end
if nargin == 1 % normalise the image to the range 0 - 1
if ndims(im) == 3 % colour image: normalise the value component in HSV space
hsv = rgb2hsv(im);
v = hsv(:,:,3);
v = v - min(v(:));
v = v/max(v(:));
hsv(:,:,3) = v;
n = hsv2rgb(hsv);
else
if ~isa(im,'double'), im = double(im); end
n = im - min(im(:));
n = n/max(n(:));
end
else % normalise to the desired mean and variance
if ndims(im) == 3
error('cannot normalise colour image to desired mean and variance');
end
if ~isa(im,'double'), im = double(im); end
im = im - mean(im(:)); % zero mean, unit standard deviation
im = im/std(im(:));
n = reqmean + im*sqrt(reqvar); % rescale to the requested statistics
end
Adaptive threshold:
function [o] = adaptiveThres(a,W,noShow);
[w,h] = size(a);
o = zeros(w,h);
for i=1:W:w
for j=1:W:h
mean_thres = 0;
if i+W-1 <= w & j+W-1 <= h
mean_thres = mean2(a(i:i+W-1,j:j+W-1));
mean_thres = 0.8*mean_thres;
o(i:i+W-1,j:j+W-1) = a(i:i+W-1,j:j+W-1) < mean_thres;
end;
end;
end;
if nargin == 2
imagesc(o);
colormap(gray);
end;
clc;
close all;


clear all;
run_function;
warning off;
chos=0;
possibility=6;
messaggio=['A two-stage scheme to enhance the low-quality fingerprint image in both the ' ...
'spatial domain and the frequency domain based on the learning from the images'];
while chos~=possibility,
chos=menu('FINGERPRINT ENHANCEMENT','INPUT IMAGE','LOCAL NORMALIZATION',...
'LOCAL ORIENTATION ESTIMATION','FIRST STAGE ENHANCEMENT',...
'SECOND STAGE ENHANCEMENT','EXIT');
if chos==1;
clc;
[filename, pathname]=uigetfile({'*.bmp;*.jpg;*.tif;*.png;*.gif','All Image Files';...
'*.*','All Files' },'mytitle');
a=strcat(pathname ,'/', filename);
I=imread(a);
figure(1);imshow(I); title('INPUT LOW QUALITY IMAGE')
end
if chos==2,
clc;
blksze = 16;
thresh = 0.1;
[normim, mask] = ridgesegment(I, blksze, thresh);
figure(2);imshow(normim);title('LOCAL NORMALIZATION IMAGE')
end
if chos==3,
clc;
[orientim, reliability] = ridgeorient(normim, 1, 3, 3);
figure(3); plotridgeorient(orientim, 15, I, 2)
end


if chos==4,
clc;
blksze = 36;
[rfreq, medfreq] = ridgefreq(normim, mask, orientim, blksze, 5, 5, 15);
freq = medfreq.*mask;
stage1 = ridge_compensation_filter(normim, orientim, freq, 0.5, 0.5, 1);
stage1 = sqrt(abs(stage1)).*sign(stage1);
mx =max(max(stage1));
mn =min(min(stage1));
stage_enh_I =uint8((stage1-mn)/(mx-mn)*254+1);
figure(4);imshow(stage_enh_I);title('FIRST STAGE ENHANCEMENT')
end
if chos==5,
clc;
BLKSZ=12;
output = Frequency_bandpass_filtering(stage_enh_I, BLKSZ);
imwrite(output,'output.jpg','jpg')
figure(5);imshow(output);title('SECOND STAGE ENHANCEMENT')
end
if chos==6,
clc;
close all;
end
end
FigWin = figure('Position',[50 -50 650 500],...
'Name','Fingerprint Recognition',...
'NumberTitle','off',...
'Color',[ 0.827450980392157 0.815686274509804 0.776470588235294 ]);
AxesHandle1 = axes('Position',[0.2 0.15 0.35 0.7],...
'Box','on');
AxesHandle2 = axes('Position',[0.6 0.15 0.35 0.7],...

33

'Box','on');
BackColor = get(gcf,'Color');
FrameBox = uicontrol(FigWin,...
'Units','normalized', ...
'Style','frame',...
'BackgroundColor',[ 0.741176470588235 0.725490196078431 0.658823529411765 ],...
'ForegroundColor',[ 0.741176470588235 0.725490196078431 0.658823529411765 ],...
'Position',[0 0 0.15 1]);
%create static text.
Text2 = uicontrol(FigWin,...
'Style','text',...
'Units','normalized', ...
'Position',[0 0.95 1 0.05],...
'FontSize',15,...
'BackgroundColor',[ 0.741176470588235 0.725490196078431 0.658823529411765 ],...
'HorizontalAlignment','right', ...
'String','Fingerprint Recognition ');
Text2 = uicontrol(FigWin,...
'Style','text',...
'Units','normalized', ...
'Position',[0 0 1 0.05],...
'FontSize',15,...
'BackgroundColor',[ 0.741176470588235 0.725490196078431 0.658823529411765 ],...
'HorizontalAlignment','right', ...
'String','Fingerprint Verification ');
w=16;
textLoad='Load Fingerprint Image';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,320,80,20],...
'String','Load',...


'Callback',...
['image1=loadimage;'...
'subplot(AxesHandle1);'...
'imagesc(image1);'...
'title(textLoad);'...
'colormap(gray);']);
text_filterArea='Orientation Flow Estimate';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,240,80,20],...
'String','Direction',...
'Callback',...
['subplot(AxesHandle2);[o1Bound,o1Area]=direction(image1,16);title(text_filterArea);']);
text_ROI='Region Of Interest(ROI)';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,220,80,20],...
'String','ROI Area',...
'Callback',...
['subplot(AxesHandle2);[o2,o1Bound,o1Area]=drawROI(image1,o1Bound,o1Area);title(text_ROI);']);
text_eq='Enhancement by histogram Equalization';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,300,80,20],...
'String','his-Equalization',...
'Callback',...
['subplot(AxesHandle2);image1=histeq(uint8(image1));imagesc(image1);title(text_eq);']);
text21='Adaptive Binarization after FFT';
h=uicontrol(FigWin,...
'Style','pushbutton',...


'Position',[0,260,80,20],...
'String','Binarization',...
'Callback',...
[ ... % 'W=inputdlg(text);W=str2num(char(W));'
'subplot(AxesHandle1);'...
'image1=adaptiveThres(double(image1),32);title(text21);']);
text='Please input the FFT factor(0~1)';
text_fft='Enhancement by FFT';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,280,80,20],...
'String','fft',...
'Callback',...
['W=0.5;'...
'subplot(AxesHandle1);image1=fftenhance(image1,W);imagesc(image1);title(text_fft);']);
text31='Thinned-ridge map';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,200,80,20],...
'String','Thining',...
'Callback',...
['subplot(AxesHandle2);o1=im2double(bwmorph(o2,''thin'',Inf));imagesc(o1,[0,1]);title(text31);']);
text41='Remove H breaks';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,180,80,20],...
'String','remove H breaks',...
'Callback',...
['subplot(AxesHandle2);o1=im2double(bwmorph(o1,''clean''));o1=im2double(bwmorph(o1,''hbreak''));imagesc(o1,[0,1]);title(text41);']);
textn1='remove spike';


h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,160,80,20],...
'String','Removing spike',...
'Callback',...
['subplot(AxesHandle2);o1=im2double(bwmorph(o1,''spur''));imagesc(o1,[0,1]);title(textn1);']);
% locate minutiae and show them:
text51='Minutia';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,140,80,20],...
'String','Extract',...
'Callback',...
['[end_list1,branch_list1,ridgeMap1,edgeWidth]=mark_minutia(o1,o1Bound,o1Area,w);'...
'subplot(AxesHandle2);show_minutia(o1,end_list1,branch_list1);title(text51);']);
% process for removing spurious minutiae:
text61='Remove spurious minutia';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,120,80,20],...
'String','Real Minutiae',...
'Callback',...
['[pathMap1,real_end1,real_branch1]=remove_spurious_Minutia(o1,end_list1,branch_list1,o1Area,ridgeMap1,edgeWidth);'...
'subplot(AxesHandle1);show_minutia(o1,real_end1,real_branch1);title(text61);']);
textSaveName='file name';
h=uicontrol(FigWin,...
'Style','pushbutton',...
'Position',[0,100,80,20],...
'String','save',...
'Callback',...


['W=inputdlg(textSaveName);W=char(W);'...
'save(W,''real_end1'',''pathMap1'',''-ASCII'');']);
h=uicontrol('Style','pushbutton',...
'String','Match',...
'Position',[0,80,80,20],...
'Callback',...
['finger1=fingerTemplateRead;'...
'finger2=fingerTemplateRead;'...
'percent_match=match_end(finger1,finger2,10);']);
Ridge compensation filter:
function newim = ridge_compensation_filter(im, orient, freq, kx, ky, showfilter)
if nargin == 5
showfilter = 0;
end
angleInc = 3;
im = double(im);
[rows, cols] = size(im);
newim = zeros(rows,cols);
[validr,validc] = find(freq > 0);
ind = sub2ind([rows,cols], validr, validc);
freq(ind) = round(freq(ind)*100)/100;
unfreq = unique(freq(ind));
freqindex = ones(100,1);
for k = 1:length(unfreq)
freqindex(round(unfreq(k)*100)) = k;
end
filter = cell(length(unfreq),180/angleInc);
sze = zeros(length(unfreq),1);
for k = 1:length(unfreq)
sigmax = 1/unfreq(k)*kx;
sigmay = 1/unfreq(k)*ky;


sze(k) = round(3*max(sigmax,sigmay));
[x,y] = meshgrid(-sze(k):sze(k));
reffilter = exp(-(x.^2/sigmax^2 + y.^2/sigmay^2)/2) ...
*cos(2*pi*unfreq(k)*x);
for o = 1:180/angleInc
filter{k,o} = imrotate(reffilter,-(o*angleInc+90),'bilinear','crop');
end
end
maxsze = sze(1);
finalind = find(validr>maxsze & validr<rows-maxsze & ...
validc>maxsze & validc<cols-maxsze);
maxorientindex = round(180/angleInc);
orientindex = round(orient/pi*180/angleInc);
i = find(orientindex < 1); orientindex(i) = orientindex(i)+maxorientindex;
i = find(orientindex > maxorientindex);
orientindex(i) = orientindex(i)-maxorientindex;
for k = 1:length(finalind)
r = validr(finalind(k));
c = validc(finalind(k));
filterindex = freqindex(round(freq(r,c)*100));
s = sze(filterindex);
newim(r,c) = sum(sum(im(r-s:r+s, c-s:c+s).*filter{filterindex,orientindex(r,c)}));
end
Ridge orientation:
function [orientim, reliability] = ...
ridgeorient(im, gradientsigma, blocksigma, orientsmoothsigma)
[rows,cols] = size(im);
sze = fix(6*gradientsigma); if ~mod(sze,2); sze = sze+1; end
f = fspecial('gaussian', sze, gradientsigma);
[fx,fy] = gradient(f);



Gx = filter2(fx, im);
Gy = filter2(fy, im);
Gxx = Gx.^2;
Gxy = Gx.*Gy;
Gyy = Gy.^2;
sze = fix(6*blocksigma); if ~mod(sze,2); sze = sze+1; end
f = fspecial('gaussian', sze, blocksigma);
Gxx = filter2(f, Gxx);
Gxy = 2*filter2(f, Gxy);
Gyy = filter2(f, Gyy);
denom = sqrt(Gxy.^2 + (Gxx - Gyy).^2) + eps;
sin2theta = Gxy./denom;
cos2theta = (Gxx-Gyy)./denom;
sze = fix(6*orientsmoothsigma); if ~mod(sze,2); sze = sze+1; end
f = fspecial('gaussian', sze, orientsmoothsigma);
cos2theta = filter2(f, cos2theta);
sin2theta = filter2(f, sin2theta);
orientim = pi/2 + atan2(sin2theta,cos2theta)/2;
Imin = (Gyy+Gxx)/2 - (Gxx-Gyy).*cos2theta/2 - Gxy.*sin2theta/2;
Imax = Gyy+Gxx - Imin;
reliability = 1 - Imin./(Imax+.001);
reliability = reliability.*(denom>.001);
Ridge Frequency:
function [freq, medianfreq] = ridgefreq(im, mask, orient, blksze, windsze, ...
minWaveLength, maxWaveLength)
[rows, cols] = size(im);
freq = zeros(size(im));
for r = 1:blksze:rows-blksze
for c = 1:blksze:cols-blksze
blkim = im(r:r+blksze-1, c:c+blksze-1);
blkor = orient(r:r+blksze-1, c:c+blksze-1);


freq(r:r+blksze-1,c:c+blksze-1) = ...
freqest(blkim, blkor, windsze, minWaveLength, maxWaveLength);
end
end
% mask out frequencies calculated for non-ridge regions
freq = freq.*mask;
% Find median freqency over all the valid regions of the image.
medianfreq = median(freq(find(freq>0)));
Frequency band-pass filtering:
function enhimg = Frequency_bandpass_filtering(img, BLKSZ)
global NFFT;
if BLKSZ > 0
NFFT = 32; %size of FFT
OVRLP = 2; %size of overlap
ALPHA = 0.5; %root filtering
RMIN = 4;%%3; %min allowable ridge spacing
RMAX = 40; %maximum allowable ridge spacing
ESTRETCH = 20; %for contrast enhancement
ETHRESH = 19; %threshold for the energy
else
NFFT = 32; %size of FFT
BLKSZ = 12; %size of the block
OVRLP = 6; %size of overlap
ALPHA = 0.5; %root filtering
RMIN = 3; %min allowable ridge spacing
RMAX = 18; %maximum allowable ridge spacing
ESTRETCH = 20; %for contrast enhancement
ETHRESH = 6; %threshold for the energy
end

[nHt,nWt] = size(img);


img = double(img); %convert to DOUBLE
nBlkHt = floor((nHt-2*OVRLP)/BLKSZ);
nBlkWt = floor((nWt-2*OVRLP)/BLKSZ);
fftSrc = zeros(nBlkHt*nBlkWt,NFFT*NFFT); %stores FFT
nWndSz = BLKSZ+2*OVRLP; %size of analysis window.
oimg = zeros(nBlkHt,nBlkWt);
fimg = zeros(nBlkHt,nBlkWt);
bwimg = zeros(nBlkHt,nBlkWt);
eimg = zeros(nBlkHt,nBlkWt);
enhimg = zeros(nHt,nWt);
[x,y] = meshgrid(0:nWndSz-1,0:nWndSz-1);
dMult = (-1).^(x+y); %used to center the FFT
[x,y] = meshgrid(-NFFT/2:NFFT/2-1,-NFFT/2:NFFT/2-1);
r = sqrt(x.^2+y.^2)+eps;
th = atan2(y,x);
th(th<0) = th(th<0)+pi;
w = raised_cosine_window(BLKSZ,OVRLP); %spectral window
% load filters
load angular_filters_pi_4; %now angf_pi_4 has filter coefficients
angf_pi_4 = angf;
load angular_filters_pi_2; %now angf_pi_2 has filter coefficients
angf_pi_2 = angf;
% bandpass filter
FLOW = NFFT/RMAX;
FHIGH = NFFT/RMIN;
dRLow = 1./(1+(r/FHIGH).^4);
dRHigh = 1./(1+(FLOW./r).^4);
dBPass = dRLow.*dRHigh;
% FFT analysis
for i = 0:nBlkHt-1
nRow = i*BLKSZ+OVRLP+1;


for j = 0:nBlkWt-1
nCol = j*BLKSZ+OVRLP+1;
%extract local block
blk = img(nRow-OVRLP:nRow+BLKSZ+OVRLP-1, nCol-OVRLP:nCol+BLKSZ+OVRLP-1);
%remove dc
dAvg = sum(sum(blk))/(nWndSz*nWndSz);
blk = blk-dAvg;
blk = blk.*w;
%do pre filtering
blkfft = fft2(blk.*dMult,NFFT,NFFT);
blkfft = blkfft.*dBPass;
dEnergy = abs(blkfft).^2;
blkfft = blkfft.*sqrt(dEnergy);
fftSrc(nBlkWt*i+j+1,:) = transpose(blkfft(:));
dEnergy = abs(blkfft).^2;

dTotal = sum(sum(dEnergy))/(NFFT*NFFT);
fimg(i+1,j+1) = NFFT/(compute_mean_frequency(dEnergy,r)+eps); %ridge separation
oimg(i+1,j+1) = compute_mean_angle(dEnergy,th); %ridge angle
eimg(i+1,j+1) = log(dTotal+eps); %used for segmentation
end;%for j
end;%for i
% precomputations
[x,y] = meshgrid(-NFFT/2:NFFT/2-1,-NFFT/2:NFFT/2-1);
dMult = (-1).^(x+y); %used to cent
% process the resulting maps
for i = 1:3
oimg = smoothen_orientation_image(oimg); %smoothen orientation image
end;
fimg = smoothen_frequency_image(fimg,RMIN,RMAX,5); %diffuse frequency image


cimg = compute_coherence(oimg); %coherence image for bandwidth
bwimg = get_angular_bw_image(cimg); %QUANTIZED bandwidth image
% FFT reconstruction
for i = 0:nBlkHt-1
for j = 0:nBlkWt-1
nRow = i*BLKSZ+OVRLP+1;
nCol = j*BLKSZ+OVRLP+1;
% apply the filters
blkfft = reshape(transpose(fftSrc(nBlkWt*i+j+1,:)),NFFT,NFFT);
% reconstruction
af = get_angular_filter(oimg(i+1,j+1),bwimg(i+1,j+1),angf_pi_4,angf_pi_2);
blkfft = blkfft.*(af);
blk = real(ifft2(blkfft).*dMult);
enhimg(nRow:nRow+BLKSZ-1, nCol:nCol+BLKSZ-1) = blk(OVRLP+1:OVRLP+BLKSZ, OVRLP+1:OVRLP+BLKSZ);
end;%for j
end;%for i
%end block processing
%contrast enhancement
enhimg =sqrt(abs(enhimg)).*sign(enhimg);
mx =max(max(enhimg));
mn =min(min(enhimg));
enhimg =uint8((enhimg-mn)/(mx-mn)*254+1);
%clean up the image
emsk = imresize(eimg,[nHt,nWt]);
enhimg(emsk<ETHRESH) = 128;
%end function fft_enhance_cu
%raised_cosine

function y = raised_cosine(nBlkSz,nOvrlp)
nWndSz = (nBlkSz+2*nOvrlp);


x = abs(-nWndSz/2:nWndSz/2-1);
y = 0.5*(cos(pi*(x-nBlkSz/2)/nOvrlp)+1);
y(abs(x)<nBlkSz/2)=1;
%end function raised_cosine
%raised_cosine_window
function w = raised_cosine_window(blksz,ovrlp)
y = raised_cosine(blksz,ovrlp);
w = y(:)*y(:)';
%end function raised_cosine_window
%get_angular_filter
function r = get_angular_filter(t0,bw,angf_pi_4,angf_pi_2)
global NFFT;
TSTEPS = size(angf_pi_4,2);
DELTAT = pi/TSTEPS;
%get the closest filter
i = floor((t0+DELTAT/2)/DELTAT);
i = mod(i,TSTEPS)+1;
if(bw == pi/4)
r = reshape(angf_pi_4(:,i),NFFT,NFFT)';
elseif(bw == pi/2)
r = reshape(angf_pi_2(:,i),NFFT,NFFT)';
else
r = ones(NFFT,NFFT);
end;
%end function get_angular_filter
%get_angular_bw_image
%-----------------------------------------------------------
function bwimg = get_angular_bw_image(c)
bwimg = zeros(size(c));
bwimg(:,:) = pi/2; %med bw
bwimg(c<=0.7) = pi; %high bw


bwimg(c>=0.9) = pi/4; %low bw
%end function get_angular_bw
%get_angular_bw_image
function mth = compute_mean_angle(dEnergy,th)
global NFFT;
sth = sin(2*th);
cth = cos(2*th);
num = sum(sum(dEnergy.*sth));
den = sum(sum(dEnergy.*cth));
mth = 0.5*atan2(num,den);
if(mth <0)
mth = mth+pi;
end;
%end function compute_mean_angle
%-----------------------------------------------------------
%get_angular_bw_image

%-----------------------------------------------------------
function mr = compute_mean_frequency(dEnergy,r)
global NFFT;
num = sum(sum(dEnergy.*r));
den = sum(sum(dEnergy));
mr = num/(den+eps);
%end function compute_mean_frequency

%----------------------------------------------------------
function angular_filter_bank(BW,fname)
close all;
%---------------
%parameters
%---------------


FFTN = 32;%64;%32;
TSTEPS = 12; %15 degrees interval
DELTAT = pi/TSTEPS;
%---------------
%precompute
%---------------
[x,y] = meshgrid(-FFTN/2:FFTN/2-1,-FFTN/2:FFTN/2-1);
r = sqrt(x.^2+y.^2);
th = atan2(y,x);
th(th<0)= th(th<0)+2*pi; %unsigned
filter = [];

%-------------------------
%generate the filters
%-------------------------
for t0 = 0:DELTAT:(TSTEPS-1)*DELTAT
t1 = t0+pi;
%-----------------
%first lobe
%-----------------
d = angular_distance(th,t0);
msk = 1+cos(d*pi/BW);
msk(d>BW) = 0;
rmsk = msk;
%second lobe
d = angular_distance(th,t1);
msk = 1+cos(d*pi/BW);
msk(d>BW) = 0;
rmsk = (rmsk+msk);

imagesc(rmsk);pause;


rmsk = transpose(rmsk);
filter = [filter,rmsk(:)];
end;
%-----------------
%save the filters
%-----------------
angf = filter;
eval(sprintf('save %s angf',fname));
%write to a C file
%---------------------
fp = fopen(sprintf('%s.h',fname),'w');
fprintf(fp,'{\n');
for i = 1:size(filter,2)
i
k = 1;
fprintf(fp,'{');
for j = 1:size(filter,1)
fprintf(fp,'%f,',filter(j,i));
if (k == 32), k = 0; fprintf(fp,'\n'); end;
k = k+1;
end;
fprintf(fp,'},\n');
end;
fprintf(fp,'};\n');
fclose(fp);
%end function angular_filter_bank
%angular_distance
function d = angular_distance(th,t0)
d = abs(th-t0);
d = min(d,2*pi-d);
%end function angular_distance


Show Minutiae:
function show_minutia(image,end_list,branch_list);
colormap(gray);imagesc(image);
hold on;
if ~isempty(end_list)
plot(end_list(:,2),end_list(:,1),'*r');
if size(end_list,2) == 3
hold on
[u,v] = pol2cart(end_list(:,3),10);
quiver(end_list(:,2),end_list(:,1),u,v,0,'g');
end;
end;

if ~isempty(branch_list)
hold on
plot(branch_list(:,2),branch_list(:,1),'+y');
end;
%----------------------------------------------------------
%
%----------------------------------------------------------
function radial_filter_bank
close all;
FFTN = 32;%32;
RSTEPS = 16;
DELTAR = (FFTN/2)/RSTEPS;

[x,y] = meshgrid(-FFTN/2:FFTN/2-1,-FFTN/2:FFTN/2-1);
r = sqrt(x.^2+y.^2);
filter = [];
pause;
for r0 = 0:DELTAR:(RSTEPS-1)*DELTAR


msk = 0.5*(1+cos(pi*(r-r0)/DELTAR));
msk(abs(r-r0)>DELTAR) =0;
filter = [filter,msk(:)];
imagesc(msk);
pause;
end;
radf = filter;
save radial_filters radf; %save to .mat file
%----------------------
%write to a file
%----------------------
fp = fopen('radial_filter.h','w');
for i = 1:size(filter,2)
k = 1;
fprintf(fp,'{');
for j = 1:size(filter,1)
fprintf(fp,'%f,',filter(j,i));
if (k == 32), k = 0; fprintf(fp,'\n'); end;
k = k+1;
end
fprintf(fp,'}\n');
end;
fclose(fp);
%end function radial_filter_bank
Smoothen:
function nfimg = smoothen_frequency_image(fimg,RLOW,RHIGH,diff_cycles)
valid_nbrs = 3;
[ht,wt] = size(fimg);
nfimg = fimg;
N = 1;



%---------------------------------
%perform diffusion
%---------------------------------
h = fspecial('gaussian',2*N+1);
cycles = 0;
invalid_cnt = sum(sum(fimg<RLOW | fimg>RHIGH));
while((invalid_cnt>0 &cycles < diff_cycles) | cycles < diff_cycles)
%---------------
%pad the image
%---------------
fimg = [flipud(fimg(1:N,:));fimg;flipud(fimg(ht-N+1:ht,:))]; %pad the rows
fimg = [fliplr(fimg(:,1:N)),fimg,fliplr(fimg(:,wt-N+1:wt))]; %pad the cols
%---------------
%perform diffusion
%---------------
for i=N+1:ht+N
for j = N+1:wt+N
blk = fimg(i-N:i+N,j-N:j+N);
msk = (blk>=RLOW & blk<=RHIGH);
if(sum(sum(msk))>=valid_nbrs)
blk =blk.*msk;
nfimg(i-N,j-N)=sum(sum(blk.*h))/sum(sum(h.*msk));
else
nfimg(i-N,j-N)=-1;
end;
end;
end;
%--------------
%prepare for next iteration
%---------------
fimg = nfimg;


invalid_cnt = sum(sum(fimg<RLOW | fimg>RHIGH))
cycles = cycles+1
end;
cycles;
%end function smoothen_orientation_image
function noimg = smoothen_orientation_image(oimg)
%---------------------------
%smoothen the image
%---------------------------
gx = cos(2*oimg);
gy = sin(2*oimg);
msk = fspecial('gaussian',5);
gfx = imfilter(gx,msk,'symmetric','same');
gfy = imfilter(gy,msk,'symmetric','same');
noimg = atan2(gfy,gfx);
noimg(noimg<0) = noimg(noimg<0)+2*pi;
noimg = 0.5*noimg;
%end function smoothen_orientation_image
function [pathMap, final_end,final_branch]
=remove_spurious_Minutia(in,end_list,branch_list,inArea,ridgeOrderMap,edgeWidth)
[w,h] = size(in);
final_end = [];
final_branch =[];
direct = [];
pathMap = [];
end_list(:,3) = 0;
branch_list(:,3) = 1;
minutiaeList = [end_list;branch_list];
finalList = minutiaeList;
[numberOfMinutia,dummy] = size(minutiaeList);
suspectMinList = [];


for i= 1:numberOfMinutia-1
for j = i+1:numberOfMinutia
d =( (minutiaeList(i,1) - minutiaeList(j,1))^2 + (minutiaeList(i,2)-minutiaeList(j,2))^2)^0.5;
if d < edgeWidth
suspectMinList =[suspectMinList;[i,j]];
end;
end;
end;
[totalSuspectMin,dummy] = size(suspectMinList);
%totalSuspectMin
for k = 1:totalSuspectMin
typesum = minutiaeList(suspectMinList(k,1),3) + minutiaeList(suspectMinList(k,2),3);
if typesum == 1
% branch - end pair
if ridgeOrderMap(minutiaeList(suspectMinList(k,1),1),minutiaeList(suspectMinList(k,1),2) ) ==
ridgeOrderMap(minutiaeList(suspectMinList(k,2),1),minutiaeList(suspectMinList(k,2),2) )
finalList(suspectMinList(k,1),1:2) = [-1,-1];
finalList(suspectMinList(k,2),1:2) = [-1,-1];
end;
elseif typesum == 2
% branch - branch pair
if ridgeOrderMap(minutiaeList(suspectMinList(k,1),1),minutiaeList(suspectMinList(k,1),2) ) ==
ridgeOrderMap(minutiaeList(suspectMinList(k,2),1),minutiaeList(suspectMinList(k,2),2) )
finalList(suspectMinList(k,1),1:2) = [-1,-1];
finalList(suspectMinList(k,2),1:2) = [-1,-1];
end;
elseif typesum == 0
% end - end pair
a = minutiaeList(suspectMinList(k,1),1:3);
b = minutiaeList(suspectMinList(k,2),1:3);
if ridgeOrderMap(a(1),a(2)) ~= ridgeOrderMap(b(1),b(2))


[thetaA,pathA,dd,mm] = getLocalTheta(in,a,edgeWidth);
[thetaB,pathB,dd,mm] = getLocalTheta(in,b,edgeWidth);
%the connected line between the two points
thetaC = atan2( (pathA(1,1)-pathB(1,1)), (pathA(1,2) - pathB(1,2)) );
angleAB = abs(thetaA-thetaB);
angleAC = abs(thetaA-thetaC);
if ( (or(angleAB < pi/3, abs(angleAB -pi)<pi/3 )) & (or(angleAC < pi/3, abs(angleAC - pi) <
pi/3)) )
finalList(suspectMinList(k,1),1:2) = [-1,-1];
finalList(suspectMinList(k,2),1:2) = [-1,-1];
end;
%remove short ridge later
elseif ridgeOrderMap(a(1),a(2)) == ridgeOrderMap(b(1),b(2))
finalList(suspectMinList(k,1),1:2) = [-1,-1];
finalList(suspectMinList(k,2),1:2) = [-1,-1];
end;
end;
end;
for k =1:numberOfMinutia
if finalList(k,1:2) ~= [-1,-1]
if finalList(k,3) == 0
[thetak,pathk,dd,mm] = getLocalTheta(in,finalList(k,:),edgeWidth);
if size(pathk,1) >= edgeWidth
final_end=[final_end;[finalList(k,1:2),thetak]];
[id,dummy] = size(final_end);
pathk(:,3) = id;
pathMap = [pathMap;pathk];
end;
else
final_branch=[final_branch;finalList(k,1:2)];
[thetak,path1,path2,path3] = getLocalTheta(in,finalList(k,:),edgeWidth);


if size(path1,1)>=edgeWidth & size(path2,1)>=edgeWidth & size(path3,1)>=edgeWidth
final_end=[final_end;[path1(1,1:2),thetak(1)]];
[id,dummy] = size(final_end);
path1(:,3) = id;
pathMap = [pathMap;path1];
final_end=[final_end;[path2(1,1:2),thetak(2)]];
path2(:,3) = id+1;
pathMap = [pathMap;path2];
final_end=[final_end;[path3(1,1:2),thetak(3)]];
path3(:,3) = id+2;
pathMap = [pathMap;path3];
end;
end;
end;
end;
percent_match = [];
fname=[];
for i=101:110
for j=1:3
tname = sprintf('d:\\419\\image\\%d_%d.tif',i,j);
fname = [fname;tname];
end;
end;
for i=1:3:12
for j=i+3:3:12
t=cputime;
fname1 = fname(i,:);
fname2 = fname(j,:);
template1=load(char(fname1));
template2=load(char(fname2));
num = match_end(template1,template2,10,0);


deltaT=cputime-t
i
j
tmp = [i,j,deltaT,num];
percent_match = [percent_match;tmp];
end;
end;
fname = sprintf('d:\\419\\image\\interclassTest.dat');
save(fname,'percent_match','-ASCII');
%percent_match
percent_match = [];
fname=[];
for i=101:110
for j=1:3
tname = sprintf('d:\\419\\image\\%d_%d.tif',i,j);
fname = [fname;tname];
end;
end;
for i=1:12
for j=i+1:3*ceil(i/3)
t=cputime;
fname1 = fname(i,:);
fname2 = fname(j,:);
template1=load(char(fname1));
template2=load(char(fname2));
num = match_end(template1,template2,10,0);
deltaT=cputime-t
i
j
tmp = [i,j,deltaT,num];
percent_match = [percent_match;tmp];


end;
end;
fname = sprintf('d:\\419\\image\\innerclassTest.dat');
save(fname,'percent_match','-ASCII');
%percent_match
t = cputime;
for i=101:104
for j=1:3
i
j
fname = sprintf('d:\\419\\image\\%d_%d.tif',i,j);
o1 = imread(fname);
o1 =255 - double(o1);
o1=histeq(uint8(o1));
o1=fftenhance(o1,0.45);
o1=adaptiveThres(double(o1),32,0);
[o1Bound,o1Area]=direction(o1,16,0);
[o1,o1Bound,o1Area]=drawROI(o1,o1Bound,o1Area,0);
o1=im2double(bwmorph(o1,'thin',Inf));
o1=im2double(bwmorph(o1,'clean'));
o1=im2double(bwmorph(o1,'hbreak'));
o1=im2double(bwmorph(o1,'spur'));
[end_list1,branch_list1,ridgeMap1,edgeWidth]=mark_minutia(o1,o1Bound,o1Area,16);
[pathMap1,real_end1,real_branch1]=remove_spurious_Minutia(o1,end_list1,branch_list1,o1Are
a,ridgeMap1,edgeWidth);
fname2=sprintf('%s.dat',fname);
save(fname2,'real_end1','pathMap1','-ASCII');
end;
end;
t2=cputime;



[imagefile1, pathname] = uigetfile('*.bmp;*.BMP;*.tif;*.TIF;*.jpg','Open a Fingerprint Image');
if imagefile1 ~= 0
cd(pathname);
image1=readimage(char(imagefile1));
image1=255-double(image1);
end;
A.2 Screen shots:

A.2.1 Select the input file:




Fig A.2.1 Select the input file

This screen shot shows the selection of the low quality fingerprint image stored in a
file; the input image is chosen by the user.









A.2.2 Input low quality fingerprint image


Fig A.2.2 Input low quality fingerprint image

The low quality fingerprint image above, obtained through cancelable biometrics, contains
only the minutiae points and an irregular ridge structure; it may also contain core and delta
points.











A.2.3 Normalized image:


Fig A.2.3 Normalized image

The screen shot above is the output of the first processing stage, converted to a grey
scale pattern; normalization adjusts each pixel using the mean and variance of the input image.
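A minimal sketch of this mean-and-variance normalization is shown below; the target values `M0` and `V0` and the function name are illustrative, and `ridgesegment` in the listing performs its own variant:

```matlab
function n = normalize_image(im, M0, V0)
% Map an image to a desired mean M0 and variance V0.
im = double(im);
M = mean(im(:));                   % global mean of the input
V = var(im(:));                    % global variance of the input
d = sqrt(V0 * (im - M).^2 / V);    % scaled deviation from the mean
n = M0 + d;                        % pixels above the mean
n(im < M) = M0 - d(im < M);        % pixels below the mean are reflected
end
```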











A.2.4 Mask image:


Fig A.2.4 Mask image

The ridge-valley structure is obtained and a mask marks the ridge region of the input
image; the width and height of the ridge patterns are also computed.










A.2.5 Local Orientation Estimation


Fig A.2.5 Local Orientation Estimation
Local orientation estimation identifies the ridge direction from the x- and y-axis
gradients and finds the variation in the ridges. Mask values of 0 and 1 mark false and true image
regions respectively. Any estimation errors introduced at this stage propagate through to the end
of the pipeline.
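The gradient-based estimate behind `ridgeorient` can be sketched as follows; the smoothing sigma is an assumed value, and `imgaussfilt` stands in for the Gaussian filtering done explicitly in the listing:

```matlab
function theta = block_orientation(im, sigma)
% Least-squares ridge orientation from smoothed gradient products.
[gx, gy] = gradient(double(im));
gxx = imgaussfilt(gx.^2, sigma);
gyy = imgaussfilt(gy.^2, sigma);
gxy = imgaussfilt(gx.*gy, sigma);
% Double-angle formula: ridge direction is perpendicular to the gradient.
theta = pi/2 + atan2(2*gxy, gxx - gyy)/2;
end
```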












A.2.6 First stage enhancement:



Fig A.2.6 First stage enhancement:
First stage enhancement uses the ridge compensation filter to bring out the minutiae
points. Because the ridge structure in its output is still not clear, a second enhancement stage is applied.










A.2.7 Second stage enhancement:



Fig A.2.7 Second Stage enhancement
Second stage enhancement computes a coherence image and applies band-pass
filters to it, combining an angular filter and a radial filter in the
frequency domain.
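The radial band-pass applied in this stage is the product of Butterworth-style low and high cutoffs, built exactly as in `Frequency_bandpass_filtering`:

```matlab
NFFT = 32; RMIN = 3; RMAX = 18;              % values from the listing
[x, y] = meshgrid(-NFFT/2:NFFT/2-1);
r = sqrt(x.^2 + y.^2) + eps;                 % radial frequency coordinate
FLOW = NFFT/RMAX; FHIGH = NFFT/RMIN;         % cutoffs from ridge spacing limits
dBPass = (1./(1+(r/FHIGH).^4)) .* (1./(1+(FLOW./r).^4));  % band-pass mask
```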










A.2.8 Output of the enhancement:



Fig A.2.8 Output of the enhancement
The output is stored in the MATLAB workspace and passed on to the fingerprint
recognition scheme, removing flaws that would otherwise slow down access.









A.2.9 Fingerprint verification:


Fig A.2.9 Fingerprint verification
Fingerprint verification loads the image into the working area, performs an FFT on the
enhanced fingerprint image, and computes the histogram of the loaded fingerprint.












A.2.10 Applying FFT in the fingerprint:



Fig A.2.10 Applying FFT in the fingerprint
Adaptive binarization uses an adaptive threshold to determine the regional values of the
loaded image, and the region of interest (ROI) step selects the optimum region of the image for
further processing.
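The `adaptiveThres` routine itself is not listed here; a minimal block-wise adaptive threshold in the same spirit (the block size and the dark-ridge convention are assumptions) could look like:

```matlab
function bw = adaptive_binarize(im, blksze)
% Threshold each blksze-by-blksze block against its own mean.
im = double(im);
fun = @(b) b.data < mean(b.data(:));    % ridges darker than block mean -> 1
bw = blockproc(im, [blksze blksze], fun);
end
```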










A.2.11 Thinning


Fig A.2.11 Thinning
Thinning reduces the ridges to single-pixel width; this supports minutiae extraction
and helps identify the ridge direction in the fingerprint image.
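This step is the `bwmorph` thinning call from the listing; for example (the input file name is illustrative):

```matlab
I = imread('fingerprint.bmp');     % assumed input file
bw = I < 128;                      % dark ridges -> logical 1
thin = bwmorph(bw, 'thin', Inf);   % iterate until ridges are one pixel wide
imshow(thin);
```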












A.2.12 Removing h breaks:



Fig A.2.12 Removing H breaks
An H break occurs where two ridge directions meet at a single point, creating
discrepancies in the ridge-valley structure. Eliminating H breaks increases matching
performance.
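In the listing this is the `'clean'` plus `'hbreak'` chain applied to the thinned map `o1`:

```matlab
o1 = bwmorph(o1, 'clean');    % drop isolated foreground pixels first
o1 = bwmorph(o1, 'hbreak');   % remove H-connected pixels (H breaks)
```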











A.2.13 Removing spikes:


Fig A.2.13 Removing spikes
Spikes are irrelevant values assigned in the masked image. Eliminating these indefinite
values from the minutiae points avoids two different outcomes for the same fingerprint
image.
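Spike removal is the `'spur'` operation from the listing; a single call removes one-pixel spurs:

```matlab
o1 = bwmorph(o1, 'spur');     % remove end-point (spur) pixels once
% bwmorph(o1, 'spur', 3) would trim spurs up to three pixels long
```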










A.2.14 Finding minutia


Fig A.2.14 Finding minutia

Minutiae extraction applies the convolution filter only within the ROI, skipping the
remaining surface area to reduce computational cost and increase the speed of the computation.
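A common neighbour-count heuristic for locating minutiae on a thinned ridge map is sketched below; this is a generic illustration, not necessarily the exact rule used by `mark_minutia`:

```matlab
nb = conv2(double(thin), ones(3), 'same') - double(thin);  % 8-neighbour count
ends     = thin & (nb == 1);   % ridge endings: exactly one neighbour
branches = thin & (nb >= 3);   % bifurcations: three or more neighbours
```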









A.2.15 Removing spurious minutiae:

Fig A.2.15 Removing spurious minutiae
A.2.16 Matching Score:

Fig A.2.16 Matching Score











REFERENCES

1. S. Greenberg, M. Aladjem, D. Kogan and I. Dimitrov, "Fingerprint image enhancement
using coherence diffusion filter and Gabor filter," Information & Computational Science,
vol. 9, no. 1, pp. 153-160, 2012.

2. A. K. Jain, S. Prabhakar, L. Hong and S. Pankanti, "Filterbank-based fingerprint matching,"
IEEE Trans. Image Process., vol. 9, no. 5, May 2000.

3. J. C. Yang, D. S. Park and R. Hitchcock, "Effective enhancement of low quality
fingerprints with local ridge compensation," IEICE Electron. Express, vol. 5, no. 23,
pp. 1002-1009, 2008.

4. H. Fronthaler, K. Kollreider and J. Bigun, "Local features for enhancement and
minutiae extraction in fingerprints," IEEE Trans. Image Process., vol. 17, no. 3, pp. 354-363,
Mar. 2008.

5. M. D. Marsico, M. Nappi and G. Tortora, "NABS: Novel approaches for biometric
systems," IEEE Trans. Syst., Man, Cybern. C, Appl. Rev., vol. 41, no. 4, pp. 481-493,
Jul. 2011.