
Automated Diagnosis of Glaucoma Using Empirical Wavelet Transform and Correntropy Features Extracted from Fundus Images

Shishir Maheshwari, Ram Bilas Pachori, and U. Rajendra Acharya

Abstract—Glaucoma is an ocular disorder caused by increased fluid pressure in the optic nerve. It damages the optic nerve and subsequently causes loss of vision. The available scanning methods are Heidelberg Retinal Tomography (HRT), Scanning Laser Polarimetry (SLP) and Optical Coherence Tomography (OCT). These methods are expensive and require experienced clinicians to use them. So, there is a need to diagnose glaucoma accurately at low cost. Hence, in this paper, we present a new methodology for automated diagnosis of glaucoma from digital fundus images based on the Empirical Wavelet Transform (EWT). The EWT is used to decompose the image, and correntropy features are obtained from the decomposed EWT components. These extracted features are ranked based on the t-value feature selection algorithm. The ranked features are then used for the classification of normal and glaucoma images using the Least Squares Support Vector Machine (LS-SVM) classifier. The LS-SVM is employed for classification with Radial Basis Function (RBF), Morlet wavelet and Mexican-hat wavelet kernels. The classification accuracy of the proposed method is 98.33% and 96.67% using three-fold and ten-fold cross validation, respectively.

Index Terms—Glaucoma, empirical wavelet transform, correntropy, feature selection, least squares support vector machine classifier.

Shishir Maheshwari and Ram Bilas Pachori are with the Discipline of Electrical Engineering, Indian Institute of Technology Indore, Indore-453446, India (e-mail: phd1501102003@iiti.ac.in, pachori@iiti.ac.in). U. Rajendra Acharya is with the Department of Electronics and Communication Engineering, Ngee Ann Polytechnic, Singapore-599489 (e-mail: aru@np.edu.sg).

I. INTRODUCTION

Glaucoma is one of the leading causes of vision loss. It is caused by increased fluid pressure and improper drainage of fluid in the eye [1]. It is estimated that in 2013, 64.3 million people aged 40 to 80 years worldwide suffered from glaucoma. This figure is expected to reach 76 million by 2020 and 111.8 million by 2040 [2]. The prevalence of glaucoma is 2.5% for people of all ages and 4.8% for those above 75 years of age [3].

Diagnosis of glaucoma is mainly based on Intra Ocular Pressure (IOP), the medical history of the patient's family [4], and changes in optic disc structure [5]. A glaucoma suspect will have an IOP of more than 21 mmHg [6]. Other methods of monitoring glaucoma involve Optical Nerve Hypoplasia Stereo Photographs (ONHSPs) and advanced imaging technologies such as Optical Coherence Tomography (OCT), Scanning Laser Polarimetry (SLP), and Confocal Scanning Laser Ophthalmoscopy (CSLO), which generate reference images to study the eye and its internal structure [5]. These methods are expensive and require skilled supervision. It is suggested that combining various imaging methods will significantly improve the accuracy of glaucoma identification.

Glaucoma is characterized by changes in the structure of the nerve fibers and in optic disc parameters such as diameter, volume, and area [5], [7]. Structural changes occur due to obstruction of the discharge of aqueous humor, which in turn increases the IOP. This injures the optic nerve fibers and prevents the transmission of information from the eye to the brain [8]. Ophthalmologists examine distinct regions to identify the disease during eye inspection. Different methods have been employed to determine representative features such as irregularity of blood vessels [9].

Fundus images are used for the diagnosis of glaucoma [10], [11] and diabetic retinopathy [12]. Damage to the optic nerve fibre is detected using morphological features of fundus images [13]. Morphological features such as the cup-to-disc ratio, the ratio of the area of blood vessels in the inferior-superior side to the nasal-temporal side, and the ratio of the distance between the optic disc center and the optic nerve head to the diameter of the optic disc are used to detect glaucoma [10]. In morphological methods, choosing structural elements is difficult, which may not yield high classification accuracy [1], [10]. Image segmentation based techniques have also been used for glaucoma detection [10], [14]. These segmentation methods have shortcomings such as localization, thresholding or demarcation, which may lead to unacceptable results and unavoidable errors in glaucoma diagnosis [1].

In order to overcome these difficulties, automated methods are preferred for glaucoma diagnosis. Selection of robust features is necessary to develop a robust system. Recent studies have shown that texture features [1], [15], [16] are very effective for glaucoma image detection. Higher Order Spectra (HOS) coupled with texture features have been used to improve the classification accuracy [15]. In [16], Discrete Wavelet Transform (DWT) energies are used as features for glaucoma detection. HOS bispectrum features and wavelet energy features are used for glaucoma diagnosis in [1]. Unlike the DWT, which has a set of fixed basis functions that are signal independent, the Empirical Wavelet Transform (EWT) is a signal dependent decomposition technique whose working principle depends upon the frequency spectrum of the signal. In this work, we propose a novel method for the classification of glaucoma images based on EWT and correntropy features. The EWT decomposes the image into various frequency bands. Correntropy [17] is extracted from the decomposed EWT components. The features are normalized and ranked on the basis of a significance criterion. A Least Squares Support Vector Machine (LS-SVM) classifier with various kernels, namely the Radial Basis Function (RBF) and wavelet kernels such as the Morlet and Mexican-hat, is used for the classification. Three-fold and ten-fold cross validation strategies are used to develop the automated glaucoma diagnosis system.
The rest of the paper is organised as follows: the database collection is discussed in Section II. In Section III, the proposed methodology and the subsequent feature processing steps are discussed. The classifier and its parameters are addressed in Section IV. The results of the proposed method are presented in Section V, the discussion of the results is given in Section VI, and the paper concludes in Section VII.

Fig. 1: Standard fundus images (a) normal, (b) glaucoma.

Fig. 2: Block diagram of the proposed method (normal/glaucoma images → R, G, B, and gray image extraction → two-dimensional empirical wavelet transform → sub-band images → correntropy based feature extraction → feature ranking and normalization → least squares support vector machine classifier → normal/glaucoma decision).

II. DATA USED

We have used two databases, private and public, in this study. The proposed method has been applied on both databases. A brief description of the databases is as follows.
1) Private database: This database contains 30 normal and 30 open angle glaucoma images. These images were obtained from Kasturba Medical College, Manipal, India. The image quality and usability have been certified by the doctors of the ophthalmology department. The images are stored in 24-bit JPEG format with a resolution of 560×720 pixels. Figs. 1(a) and 1(b) show sample normal and glaucoma digital fundus images, respectively.
2) Public database: It consists of 255 normal and 250 glaucoma images. This database is obtained from the Medical Image Analysis Group (MIAG) and is publicly available online at http://medimrg.webs.ull.es/. The images are stored in 24-bit JPEG file format at various resolutions.

III. PROPOSED METHODOLOGY

We have used the two-dimensional (2D) EWT with the Littlewood-Paley empirical wavelet [18], [19]. The EWT decomposes the signal into components of different frequency bands. The normal and glaucoma image samples are decomposed into a number of components. The block diagram of the proposed methodology is shown in Fig. 2.

The R, G, B components and the gray scale version of the fundus images are subjected to the 2D EWT. The R, G, B components of an image contain significant details in the form of variations of gray pixel intensities. The information present in the R, G, B components is used to extract features for automated diagnosis of glaucoma.

A. Empirical Wavelet Transform

In this paper, the EWT method is used for signal decomposition [18]. In signal analysis, the EWT is a signal dependent method and does not use pre-defined basis functions as in the Fourier and wavelet transforms. The EWT is an adaptive method of signal decomposition based on the information content of the signal. In the EWT, the Fourier spectrum in the range 0 to π is segmented into M parts. Each segment limit is denoted by ω_m, with the starting limit ω_0 = 0 and the ending limit ω_M = π. The transition phase T_m, centered around ω_m, has a width of 2τ_m, where τ_m = λω_m for 0 < λ < 1.

Bandpass filters defined on each contiguous segment define the empirical wavelets. Littlewood-Paley and Meyer's wavelets are used as the bandpass filters, and the empirical scaling function ξ_m(W) and the empirical wavelets ς_m(W) can be described as [18]:

\xi_m(W) =
\begin{cases}
1 & \text{if } |W| \leq (1-\lambda)\omega_m \\
\cos\left[\frac{\pi}{2} z(\lambda, \omega_m)\right] & \text{if } (1-\lambda)\omega_m \leq |W| \leq (1+\lambda)\omega_m \\
0 & \text{otherwise}
\end{cases}
(1)

and
\varsigma_m(W) =
\begin{cases}
1 & \text{if } (1+\lambda)\omega_m \leq |W| \leq (1-\lambda)\omega_{m+1} \\
\cos\left[\frac{\pi}{2} z(\lambda, \omega_{m+1})\right] & \text{if } (1-\lambda)\omega_{m+1} \leq |W| \leq (1+\lambda)\omega_{m+1} \\
\sin\left[\frac{\pi}{2} z(\lambda, \omega_m)\right] & \text{if } (1-\lambda)\omega_m \leq |W| \leq (1+\lambda)\omega_m \\
0 & \text{otherwise}
\end{cases}
(2)

where z(λ, ω_m) and z(λ, ω_{m+1}) can be expressed as [18]:

z(\lambda, \omega_m) = z\left(\frac{1}{2\lambda\omega_m}\left(|W| - (1-\lambda)\omega_m\right)\right)
(3)

z(\lambda, \omega_{m+1}) = z\left(\frac{1}{2\lambda\omega_{m+1}}\left(|W| - (1-\lambda)\omega_{m+1}\right)\right)
(4)

The function z(·) satisfies the following criteria [18]:

z(z) =
\begin{cases}
0 & \text{if } z \leq 0 \\
1 & \text{if } z \geq 1
\end{cases}
\quad \text{with} \quad z(z) + z(1-z) = 1 \;\; \forall z \in [0, 1]
(5)

The EWT approach for 2D signals such as images [19] is described as follows. Let x denote the image; the algorithm consists of the following steps [19]:
1) Compute the 1D Fourier transform of each row r of x, X(r, Ω), and of each column c of x, X(Ω, c), and calculate the mean row and column spectrum magnitudes as follows:

X_R = \frac{1}{N_{Rw}} \sum_{r=0}^{N_{Rw}} X(r, \Omega)

and

X_C = \frac{1}{N_{Cl}} \sum_{c=0}^{N_{Cl}} X(\Omega, c)

where the numbers of rows and columns are denoted by N_{Rw} and N_{Cl}, respectively.
2) Perform boundary detection on X_R and X_C and build the corresponding filter banks \{\xi_1^R, \{\varsigma_m^R\}_{m=1}^{N_R}\} and \{\xi_1^C, \{\varsigma_m^C\}_{m=1}^{N_C}\}, respectively. N_R and N_C are the numbers of mean row and column sub-bands, respectively.
3) Filter x along the rows with \{\xi_1^R, \{\varsigma_m^R\}_{m=1}^{N_R}\}, which provides (N_R + 1) output images.
4) Filter the (N_R + 1) output images along the columns with \{\xi_1^C, \{\varsigma_m^C\}_{m=1}^{N_C}\}; this provides (N_R + 1)(N_C + 1) sub-band images.

In the proposed method, the decomposed 2D EWT components have been computed using the 2D empirical Littlewood-Paley wavelet transform [19]. Figs. 3(a), 3(b), and 3(c) show a sample image, its mean row spectrum, and the row spectrum filter bank, respectively. Figs. 4 and 5 show the EWT components of the normal and glaucoma images of Fig. 1 from the private database.

Fig. 3: (a) sample image, (b) mean row spectrum, and (c) row spectrum filter bank of Fig. 1(a) (magnitude versus normalized frequency).
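To make the separable decomposition of steps 1)-4) concrete, the following is a minimal NumPy sketch. It is illustrative only: the boundary detector is a simple hypothetical local-minima rule, and ideal (boxcar) bandpass masks are used instead of the Littlewood-Paley filters of (1)-(2), so this is an approximation of the transform used here rather than the authors' implementation; all function and parameter names are ours.

```python
import numpy as np
from scipy.signal import argrelextrema

def detect_boundaries(mean_spectrum, n_bands):
    """Hypothetical boundary detector: place segment limits at the deepest
    local minima of the mean magnitude spectrum (a simplification of the
    boundary detection used in the EWT literature)."""
    half = mean_spectrum[:len(mean_spectrum) // 2]            # keep 0..pi only
    minima = argrelextrema(half, np.less)[0]
    if len(minima) < n_bands - 1:                             # fall back to uniform limits
        return np.linspace(0, len(half), n_bands + 1).astype(int)[1:-1]
    deepest = minima[np.argsort(half[minima])[:n_bands - 1]]
    return np.sort(deepest)

def filter_1d(signal, boundaries):
    """Split a 1D signal into sub-bands using ideal (boxcar) masks defined by
    the detected boundaries (a stand-in for the Littlewood-Paley filters)."""
    n = len(signal)
    spec = np.fft.rfft(signal)
    edges = [0, *boundaries, len(spec)]
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = np.zeros_like(spec)
        mask[lo:hi] = spec[lo:hi]
        bands.append(np.fft.irfft(mask, n=n))
    return bands

def ewt2d(image, n_row_bands=2, n_col_bands=2):
    """Separable 2D EWT-like decomposition: rows first, then columns.
    The lowpass band is counted among the n_row_bands / n_col_bands sub-bands."""
    img = np.asarray(image, dtype=float)
    # Step 1: mean row and column spectrum magnitudes
    row_spec = np.mean(np.abs(np.fft.fft(img, axis=1)), axis=0)
    col_spec = np.mean(np.abs(np.fft.fft(img, axis=0)), axis=1)
    # Step 2: boundary detection (the filter banks are implicit in filter_1d)
    wr = detect_boundaries(row_spec, n_row_bands)
    wc = detect_boundaries(col_spec, n_col_bands)
    # Step 3: filter along the rows
    rows = np.array([filter_1d(r, wr) for r in img])          # shape (H, NR, W)
    row_bands = [rows[:, k, :] for k in range(rows.shape[1])]
    # Step 4: filter each row sub-image along the columns
    sub_bands = []
    for rb in row_bands:
        cols = np.array([filter_1d(c, wc) for c in rb.T])     # shape (W, NC, H)
        for k in range(cols.shape[1]):
            sub_bands.append(cols[:, k, :].T)                 # back to (H, W)
    return sub_bands
```

For instance, ewt2d(green_channel, 2, 2) would return four sub-band images, roughly ordered from low to high frequency, which is the kind of decomposition from which the features described below are computed.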
Fig. 4: (a), (b), (c) and (d) are the red, green, blue channels and grayscale image of Fig. 1(a), respectively; (e), (i), (m), (q) are EWT components of (a); (f), (j), (n), (r) are EWT components of (b); (g), (k), (o), (s) are EWT components of (c); (h), (l), (p), (t) are EWT components of (d).

The decomposed EWT components are arranged in order from low frequency components to high frequency components. A pseudo colour representation is used to improve the visual difference between the EWT components obtained from the R, G, B, and gray channels in Figs. 4 and 5.

B. Feature extraction

Nonlinearity tests play an important role in system analysis and modeling because of the complexity involved in analysis and modeling [17]. Correntropy has been used as the feature in the proposed methodology. We have extracted correntropy from the decomposed components of the 2D EWT. Correntropy is discussed as follows.

Correntropy: Correntropy is a non-linear kernel based measure of similarity which preserves both statistical and temporal information [17], [20], [21]. It measures correlation in the nonlinear domain over multiple delayed samples of the signal. The correntropy based nonlinear features can be used to measure the distribution of texture in the decomposed image components. It has been applied to the diagnosis of Coronary Artery Disease (CAD) [22]. The correntropy CE(G) for lag G can be expressed as [17], [21], [22]:

CE(G) = \left(\frac{1}{X-G+1}\right)^2 \sum_{x_1, x_2 = G}^{X} k\big(t[x_1, x_2] - t[x_1 - G, x_2 - G]\big)
(6)

where t[x_1, x_2] is the 2D signal, X is the number of rows and columns, and the Gaussian kernel function k(t[x_1, x_2] - t[x_1 - G, x_2 - G]) can be expressed as:

k\big(t[x_1, x_2] - t[x_1 - G, x_2 - G]\big) = \frac{1}{2\pi\varepsilon} \exp\left(-\frac{\big(t[x_1, x_2] - t[x_1 - G, x_2 - G]\big)^2}{2\varepsilon^2}\right)
(7)

where the Gaussian kernel width is controlled by ε. The number of correntropy features depends on the value of G. In this study, we have considered G = 3 and extracted three correntropy features from each decomposed EWT component; in the computation of correntropy using (6), the factor \left(\frac{1}{X-G+1}\right)^2 is considered as one.
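A minimal NumPy sketch of (6)-(7) is given below. Reading the double index x_1, x_2 as running over all pixels whose diagonally lagged neighbour exists is our interpretation of the notation, the normalization factor is set to one as stated in the text, and the lag values G = 1, 2, 3 for the three features are an assumption consistent with G = 3 above.

```python
import numpy as np

def correntropy_2d(t, G, eps=1.0):
    """Correntropy CE(G) of a 2D sub-band t for lag G, following (6)-(7).
    The leading (1/(X-G+1))^2 factor is taken as one, as in the text."""
    t = np.asarray(t, dtype=float)
    diff = t[G:, G:] - t[:-G, :-G]                  # t[x1, x2] - t[x1-G, x2-G]
    kern = np.exp(-(diff ** 2) / (2.0 * eps ** 2)) / (2.0 * np.pi * eps)
    return kern.sum()

def correntropy_features(sub_band, max_lag=3, eps=1.0):
    """Three correntropy features per sub-band (assumed lags G = 1, 2, 3)."""
    return [correntropy_2d(sub_band, G, eps) for G in range(1, max_lag + 1)]
```

Applied to the four EWT sub-bands of one channel, such a routine would yield the twelve features per channel used in the feature selection step described next.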
Fig. 5: (a), (b), (c) and (d) are the red, green, blue channels and grayscale image of Fig. 1(b), respectively; (e), (i), (m), (q) are EWT components of (a); (f), (j), (n), (r) are EWT components of (b); (g), (k), (o), (s) are EWT components of (c); (h), (l), (p), (t) are EWT components of (d).

C. Feature Selection

Feature selection plays an important role in performance evaluation. Highly discriminatory features are selected using the Student's t-test algorithm [23]–[25] for further analysis. The t-test discriminates two classes on the basis of the population mean and assumes a normal distribution of the feature sets corresponding to the different classes. Features are ranked based on the t value; features with a higher t value are more discriminatory and are chosen for the performance evaluation process. We have obtained four EWT components from each image and computed three correntropy features from each decomposed component. Hence, twelve correntropy features are computed from each image. All these features are further processed using the Student's t-test: the features are arranged according to decreasing t values and six features are selected on the basis of the arranged t values. Tables I-IV depict the distribution of the six significant normalized features and the t values for the Red (R), Green (G) and Blue (B) channels and the grayscale images, respectively. CE_{xy} denotes the y-th correntropy feature of the x-th decomposed EWT component. These features are then chosen for further analysis.

D. Feature Standardization

In this process, the features are standardized to zero mean and unit standard deviation. The process is known as z-score normalization [26]. The normalization is done by subtracting the mean from the data and dividing the result by the standard deviation. Let d̄ and d̃ be the mean and standard deviation of the data d, respectively; then the normalized data d̂ is expressed as follows [15]:

\hat{d} = \frac{d - \bar{d}}{\tilde{d}}
(8)
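As an illustration of the feature selection and standardization of Sections III-C and III-D, the following is a small sketch using SciPy and NumPy; the helper names and the specific call to scipy.stats.ttest_ind are our choices, not taken from the paper.

```python
import numpy as np
from scipy import stats

def rank_by_t_value(normal_feats, glaucoma_feats, n_keep=6):
    """Rank features (columns) by |t| from a two-sample t-test and keep
    the n_keep most discriminative ones."""
    t_vals, _ = stats.ttest_ind(normal_feats, glaucoma_feats, axis=0)
    order = np.argsort(-np.abs(t_vals))        # decreasing |t|
    return order[:n_keep], t_vals

def zscore(features):
    """z-score normalization: zero mean and unit standard deviation per feature."""
    return (features - features.mean(axis=0)) / features.std(axis=0)
```

With the twelve correntropy features per channel arranged as rows (images) by columns (features), rank_by_t_value would return the indices of the six retained features, and zscore would standardize them before they are fed to the classifier.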
TABLE I: Distribution of correntropy features for the R-channel of colour images of the private database.
Features    Normal              Glaucoma            t value
CE13        −0.514 ± 1.153      0.514 ± 0.399       4.611
CE12        −0.412 ± 1.232      0.411 ± 0.408       3.475
CE42        0.318 ± 1.006       −0.318 ± 0.901      2.578
CE32        0.180 ± 1.047       −0.180 ± 0.933      1.407
CE43        −0.079 ± 1.380      0.079 ± 0.342       0.610
CE33        −0.051 ± 1.358      0.051 ± 0.429       0.390

TABLE II: Distribution of correntropy features for the G-channel of colour images of the private database.
Features    Normal              Glaucoma            t value
CE13        −0.425 ± 1.255      0.425 ± 0.292       3.611
CE12        −0.365 ± 1.269      0.365 ± 0.386       3.013
CE43        −0.179 ± 1.387      0.179 ± 0.197       1.405
CE33        −0.153 ± 1.398      0.153 ± 0.173       1.187
CE42        0.134 ± 1.238       −0.134 ± 0.682      1.039
CE32        −0.035 ± 1.345      0.035 ± 0.472       0.270

TABLE III: Distribution of correntropy features for the B-channel of colour images of the private database.
Features    Normal              Glaucoma            t value
CE42        0.389 ± 0.980       −0.389 ± 0.872      3.251
CE13        −0.341 ± 1.322      0.341 ± 0.214       2.790
CE22        0.339 ± 0.933       −0.339 ± 0.962      2.779
CE23        0.338 ± 1.055       −0.338 ± 0.827      2.765
CE12        −0.278 ± 1.331      0.278 ± 0.320       2.225
CE32        0.168 ± 1.139       −0.168 ± 0.823      1.313

TABLE IV: Distribution of correntropy features for the grayscale images of the private database.
Features    Normal              Glaucoma            t value
CE13        −0.464 ± 1.234      0.464 ± 0.255       4.039
CE12        −0.407 ± 1.261      0.407 ± 0.316       3.431
CE42        0.279 ± 1.073       −0.279 ± 0.849      2.230
CE23        0.212 ± 1.066       −0.212 ± 0.896      1.669
CE33        −0.196 ± 1.378      0.196 ± 0.237       1.536
CE22        0.142 ± 1.104       −0.142 ± 0.879      1.102

IV. LS-SVM CLASSIFIER

The ranked features are classified using the Least Squares Support Vector Machine (LS-SVM) [27], [28] classifier with kernels such as the Radial Basis Function (RBF) [29], Morlet wavelet and Mexican-hat wavelet [30], [31]. LS-SVM is a supervised machine learning algorithm used to discriminate two or more classes using linear or non-linear hyperplanes. It has been used in different biomedical applications such as diabetes diagnosis using heart rate signals [25], cardiac disease detection using heart sound signals [32], CAD [22], seizure detection using the Electroencephalogram (EEG) [31], [33]–[36], and glaucoma diagnosis using fundus images [1], [15], [16], [37].

Let there be N data points \{p_n, q_n\}_{n=1}^{N}, where p_n ∈ R^m is the n-th input data point and q_n ∈ R is the class label corresponding to the n-th data point. For the classification of two classes using LS-SVM, the discrimination function can be written as [27]:

\kappa(x) = \mathrm{sign}\left[\Omega^{T} z(p) + b\right]
(9)

where Ω is a weight vector of dimension x, b is a bias, and the function z(p) maps p into the x-dimensional space.

The LS-SVM classifier decision function can be determined as [27], [38]:

\kappa(x) = \mathrm{sign}\left[\sum_{n=1}^{N} a_n q_n K(p, p_n) + b\right]
(10)

where K(p, p_n) is the kernel function and a_n is the Lagrange multiplier.

We have used RBF, Morlet wavelet and Mexican-hat wavelet kernels for the classification with LS-SVM. The mathematical expressions for these three kernel functions are given as:

• The RBF kernel function is expressed as [29], [39]:

K_{RBF}(p, p_n) = \exp\left(\frac{-\|p - p_n\|^2}{2\sigma_1^2}\right)
(11)

• The expression for the Morlet wavelet kernel function is given as [30], [31]:

K_{Morlet}(p, p_n) = \prod_{j=1}^{N} \cos\left(\omega_0 \frac{p^j - p_n^j}{a}\right) \exp\left(\frac{-\|p^j - p_n^j\|^2}{2\sigma_2^2}\right)
(12)

• The Mexican-hat kernel function is expressed as [30], [31]:

K_{Mexican\text{-}hat}(p, p_n) = \prod_{j=1}^{N} \left(1 - \frac{(p^j - p_n^j)^2}{a^2}\right) \exp\left(\frac{-\|p^j - p_n^j\|^2}{2\sigma_3^2}\right)
(13)

where p_n^j represents the j-th element of the n-th training sample, a represents the scaling parameter of the wavelet, and σ_1, σ_2, σ_3 are the kernel parameters of the RBF, Morlet and Mexican-hat wavelet kernels, respectively.

V. RESULTS

We have extracted 12 correntropy features from each of the R, G, and B channels and the grayscale fundus image. Significant features for the private database are shown in Tables I-IV with their corresponding t values. The t-test discriminates whether the mean values of the classes are significantly different.

The classifier kernel parameters σ_1, σ_2 and σ_3 are varied from 0.1 to 1 with a step size of 0.1, and ω_0 is set to 0.3 for the Morlet wavelet kernel by a trial and error method. We have chosen the values of σ_1, σ_2 and σ_3 which yielded the maximum accuracy. The plot of accuracy versus the σ_1, σ_2 and σ_3 kernel parameters is shown in Fig. 6 for the three-fold and ten-fold cross validation [41] strategies for the private database.

Tables V and VI show the accuracy (Acc) [42], sensitivity (Sn) [42] and specificity (Sp) [42] of classification for different channels, numbers of features and kernel functions using three-fold and ten-fold cross validation, respectively, for the private database. We have obtained a maximum classification accuracy of 98.33% using the RBF and Morlet wavelet kernels for three-fold cross validation. We have also obtained a maximum classification accuracy of 96.67% using the RBF, Morlet and Mexican-hat wavelet kernels with ten-fold cross validation.
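For concreteness, the kernels of (11)-(13) and the decision rule of (10) can be written as below. This is a hedged sketch with our own function names; solving the LS-SVM linear system for the Lagrange multipliers a_n and the bias b (the training step) is not reproduced here.

```python
import numpy as np

def rbf_kernel(p, pn, sigma1):
    """RBF kernel, Eq. (11)."""
    return np.exp(-np.sum((p - pn) ** 2) / (2.0 * sigma1 ** 2))

def morlet_kernel(p, pn, sigma2, w0=0.3, a=1.0):
    """Morlet wavelet kernel, Eq. (12): product over the feature dimensions."""
    d = p - pn
    return np.prod(np.cos(w0 * d / a) * np.exp(-(d ** 2) / (2.0 * sigma2 ** 2)))

def mexican_hat_kernel(p, pn, sigma3, a=1.0):
    """Mexican-hat wavelet kernel, Eq. (13)."""
    d = p - pn
    return np.prod((1.0 - d ** 2 / a ** 2) * np.exp(-(d ** 2) / (2.0 * sigma3 ** 2)))

def lssvm_decision(p, train_p, train_q, alphas, b, kernel, **kw):
    """Decision function of Eq. (10); alphas and b come from LS-SVM training."""
    s = sum(a * q * kernel(p, pn, **kw) for a, q, pn in zip(alphas, train_q, train_p))
    return np.sign(s + b)
```

In an evaluation such as the one reported here, these kernels would be computed for each pair of feature vectors, the kernel parameter swept from 0.1 to 1 (with ω_0 = 0.3 for the Morlet kernel), and the accuracy estimated with three-fold and ten-fold cross validation.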
Fig. 6: Plot of accuracy (%) versus kernel parameter for the RBF, Morlet and Mexican-hat kernels, shown for the R-channel, G-channel, B-channel and grayscale images: (a) three-fold and (b) ten-fold cross validation for the private database.
TABLE V: Results of classification for different channels, number of features and kernel functions using three-fold validation
for private database.
Channel/Image Number of features Kernel (parameter) Acc (%) Sn (%) Sp (%)
R 6 RBF (σ1 = 0.9) 91.67 86.67 96.67
R 6 Morlet (σ2 = 1) 90.00 83.33 96.67
R 2 Mexican-hat (σ3 = 0.3) 90.00 90.00 90.00
G 4 RBF (σ1 = 0.3) 98.33 100 96.67
G 4 Morlet (σ2 = 0.6) 98.33 96.67 100
G 4 Mexican-hat (σ3 = 0.4) 95.00 96.67 93.33
B 4 RBF (σ1 = 0.5) 90.00 83.33 96.67
B 3 Morlet (σ2 = 0.5) 88.33 83.33 93.33
B 2 Mexican-hat (σ3 = 0.6) 88.33 83.33 93.33
grayscale 6 RBF (σ1 = 0.5) 96.67 93.33 100
grayscale 6 Morlet (σ2 = 0.9) 93.33 90.00 96.67
grayscale 5 Mexican-hat (σ3 = 0.5) 93.33 93.33 93.33

TABLE VI: Results of classification for different channels, number of features and kernel functions using ten-fold validation
for private database.
Channel/Image Number of features Kernel (parameter) Acc (%) Sn (%) Sp (%)
R 5 RBF (σ1 = 0.9) 90.00 83.33 96.67
R 5 Morlet (σ2 = 1) 88.33 83.33 93.33
R 2 Mexican-hat (σ3 = 1) 86.67 93.33 80.00
G 4 RBF (σ1 = 0.3) 96.67 100 93.33
G 4 Morlet (σ2 = 0.5) 96.67 96.67 96.67
G 4 Mexican-hat (σ3 = 0.4) 95.00 96.67 93.33
B 3 RBF (σ1 = 0.4) 88.33 83.33 93.33
B 3 Morlet (σ2 = 0.7) 88.33 83.33 93.33
B 3 Mexican-hat (σ3 = 1) 90.00 90.00 90.00
grayscale 5 RBF (σ1 = 0.6) 93.33 90.00 96.67
grayscale 6 Morlet (σ2 = 1) 91.67 86.67 96.67
grayscale 2 Mexican-hat (σ3 = 0.3) 90.00 96.67 83.33

TABLE VII: Results of classification for different channels, number of features and kernel functions using three-fold
validation for public database.
Channel/Image Number of features Kernel (parameter) Acc (%) Sn (%) Sp (%)
R 12 RBF (σ1 = 0.3) 80.01 75.05 83.92
R 12 Morlet (σ2 = 1) 79.78 74.49 83.92
R 1 Mexican-hat (σ3 = 0.3) 75.17 62.05 85.49
G 12 RBF (σ1 = 0.5) 80.45 75.03 84.71
G 11 Morlet (σ2 = 1) 80.44 74.98 84.71
G 2 Mexican-hat (σ3 = 0.7) 75.39 60.03 87.45
B 12 RBF (σ1 = 0.3) 81.09 78.50 83.14
B 11 Morlet (σ2 = 0.8) 81.32 78.99 83.14
B 3 Mexican-hat (σ3 = 0.7) 75.61 57.00 90.20
grayscale 12 RBF (σ1 = 0.4) 81.10 77.01 84.31
grayscale 12 Morlet (σ2 = 1) 80.44 79.01 81.57
grayscale 3 Mexican-hat (σ3 = 0.9) 75.17 63.52 84.31

TABLE VIII: Results of classification for different channels, number of features and kernel functions using ten-fold
validation for public database.
Channel/Image Number of features Kernel (parameter) Acc (%) Sn (%) Sp (%)
R 12 RBF (σ1 = 0.4) 79.32 71.00 85.80
R 12 Morlet (σ2 = 1) 78.70 74.00 82.45
R 2 Mexican-hat (σ3 = 0.6) 74.97 64.50 83.22
G 12 RBF (σ1 = 0.3) 80.26 74.00 85.18
G 12 Morlet (σ2 = 1) 80.44 74.00 85.51
G 2 Mexican-hat (σ3 = 0.2) 75.62 65.50 83.54
B 12 RBF (σ1 = 0.3) 80.44 77.50 82.82
B 11 Morlet (σ2 = 0.9) 80.66 78.00 82.78
B 3 Mexican-hat (σ3 = 1) 74.96 58.00 88.23
grayscale 12 RBF (σ1 = 0.4) 80.63 76.00 84.25
grayscale 12 Morlet (σ2 = 0.9) 80.41 78.00 82.25
grayscale 2 Mexican-hat (σ3 = 0.9) 74.93 64.00 83.45
TABLE IX: Comparison with previous techniques for automated glaucoma detection.
Authors Method used Number of features Classifier Acc (%) Sn (%) Sp (%)
Kolar et al. [40] Power spectral features 2 SVM 74 NR∗ NR
Nayak et al. [10] Fundus disc parameters 3 ANN NR 100 80
Bock et al. [3] PCA, FFT and spline 90 SVM 80 NR NR
Acharya et al. [15] HOS bispectrum 12 RF 91 NR NR
Dua et al. [16] Wavelet energy 14 SMO 93.33 NR NR
Mookiah et al. [1] HOS and wavelet 35 SVM 95 93.33 96.67
Noronha et al. [37] HOS cumulants 13 SVM 92.65 100 92
Acharya et al. [24] Gabor transform 32 SVM 93.10 89.75 96.20
Our method 2D EWT and correntropy 6 SVM 98.33 100 96.67
∗ NR=Not Reported

Figs. 6(a) and 6(b) show the plots of accuracy versus the σ1, σ2 and σ3 kernel parameters of the various kernel functions for the different channels with three-fold and ten-fold cross validation, respectively, for the private database. The kernel parameter values giving the highest accuracy are marked with an ellipse in Fig. 6. We have also evaluated the performance of the proposed system for three resolutions, 250×250, 500×500, and 650×650 (private dataset). The obtained classification accuracies are 96.67%, 98.33%, and 96.67%, respectively, using three-fold cross validation.

We have evaluated the performance of this technique on the public database. The results are tabulated in Tables VII and VIII. We have obtained maximum classification accuracies of 81.32% and 80.66% using three-fold and ten-fold cross validation, respectively.

VI. DISCUSSION

In our proposed methodology, EWT based correntropy features are extracted from the decomposed components. Correntropy is a non-linear kernelized similarity measure. This feature effectively captured the subtle variations in the pixels and yielded high classification accuracy. In Table IX, we compare the performance of our results with previous methodologies.

Glaucoma is characterized by progressive damage of the Retinal Nerve Fibre (RNF). Fractal and power spectral features have been used to analyse the RNF; a Support Vector Machine (SVM) classifier yielded a classification accuracy of 74% [40]. Various morphological features have been obtained from fundus images and fed to an Artificial Neural Network (ANN); this method reported a sensitivity and specificity of 100% and 80%, respectively [10].

Raw intensities, fast Fourier transform (FFT) coefficients, and B-spline coefficients have been computed from preprocessed fundus images, with Principal Component Analysis (PCA) applied on the extracted features to reduce the dimensionality of the feature vectors; this technique obtained a classification accuracy of 80% using an SVM classifier [3]. Texture and Higher Order Spectra (HOS) bispectrum features have been computed for the classification of normal and glaucoma images [15]; the authors reported an accuracy of 91.7% using a Random Forest (RF) classifier.

Wavelet based features have been extracted using different wavelet families; the features are ranked and the selected features are subjected to various classifiers. The authors achieved an accuracy of 93.33% using SVM and Sequential Minimal Optimization (SMO) classifiers [16]. HOS based entropy and wavelet based features have been used for the classification of glaucoma images using an SVM classifier [1]; the authors reported an accuracy of 95% and also proposed a glaucoma risk index for faster diagnosis of glaucoma. Acharya et al. [24] applied the Gabor transformation on fundus images and extracted features from the Gabor coefficients; PCA was applied to reduce the feature dimensionality, and they obtained a classification accuracy of 93.10% using an SVM classifier for glaucoma diagnosis.

Noronha et al. [37] extracted HOS cumulants after applying the Radon transform (RT) on fundus images. The features were then subjected to linear discriminant analysis (LDA) in order to reduce the feature dimensionality. The reduced feature set was then fed to SVM and Naive Bayesian (NB) classifiers for automated diagnosis. They reported an average classification accuracy of 92.65%.

In this work, we have used two databases. We have obtained the highest classification accuracies of 98.33% and 96.67% using three-fold and ten-fold cross validation, respectively, for the private database using the empirical wavelet based correntropy features. Using the public database, we have obtained maximum classification accuracies of 81.32% and 80.66% using three-fold and ten-fold cross validation, respectively.

The advantages of the proposed method are as follows:
(i) We obtained a sensitivity of 100%, indicating that there are no false negatives. This will reduce the workload of clinicians by more than 50%, as they need to focus their attention only on the normal class.
(ii) We report high performance using both three-fold and ten-fold cross validation. Hence, the proposed method is repeatable.
(iii) A small number of features is used to obtain the highest classification accuracy. Hence, the developed system is simple and fast.
(iv) We validated the proposed system using both private and public databases.
(v) The performance of the system does not depend on the resolution of the image.

In future, the authors will explore the possibility of testing the performance of this technique on a larger database and detecting glaucoma at an early stage.
VII. CONCLUSION

In this paper, we have developed an automated glaucoma diagnosis system. The EWT based correntropy features are extracted from fundus images. Features with high t values are used for classification. We have used different kernels for classification and found that the RBF and Morlet wavelet kernels yielded the highest accuracies. It can be concluded that the empirical wavelet based correntropy features are useful for glaucoma diagnosis. Proper selection of the kernel functions and their parameters can improve the classification accuracy. We have also observed that the green channel of the colour image yielded the highest accuracy as compared to the other channels. The proposed methodology needs to be tested on a larger database and can also be extended to diagnose glaucoma at an early stage. In the proposed method, the correntropy features are computed based on the texture of the decomposed components of different frequency bands. The same idea can be extended to the diagnosis of other diseases such as diabetic retinopathy, fatty liver disease, thyroid cancer and ovarian cancer.

REFERENCES

[1] M. R. K. Mookiah, U. R. Acharya, C. M. Lim, A. Petznick, and J. S. Suri, "Data mining technique for automated diagnosis of glaucoma using higher order spectra and wavelet energy features," Knowledge-Based Systems, vol. 33, pp. 73-82, 2012.
[2] Y. C. Tham, X. Li, T. Y. Wong, H. A. Quigley, T. Aung, and C. Y. Cheng, "Global prevalence of glaucoma and projections of glaucoma burden through 2040: A systematic review and meta-analysis," Ophthalmology, vol. 121, no. 11, pp. 2081-2090, 2014.
[3] R. Bock, J. Meier, L. G. Nyúl, J. Hornegger, and G. Michelson, "Glaucoma risk index: Automated glaucoma detection from color fundus images," Medical Image Analysis, vol. 14, no. 3, pp. 471-481, 2010.
[4] S. Kavitha, N. Zebardast, K. Palaniswamy, R. Wojciechowski, E. S. Chan, D. S. Friedman, R. Venkatesh, and P. Y. Ramulu, "Family history is a strong risk factor for prevalent angle closure in a South Indian population," Ophthalmology, vol. 121, no. 11, pp. 2091-2097, 2014.
[5] M. J. Greaney, D. C. Hoffman, D. F. Garway-Heath, M. Nakla, A. L. Coleman, and J. Caprioli, "Comparison of optic nerve imaging methods to distinguish normal eyes from those with glaucoma," Investigative Ophthalmology and Visual Science, vol. 43, no. 1, pp. 140-145, 2002.
[6] S. Y. Shen, T. Y. Wong, P. J. Foster, J. L. Loo, M. Rosman, S. C. Loon, W. L. Wong, S. M. Saw, and T. Aung, "The prevalence and types of glaucoma in Malay people: The Singapore Malay eye study," Investigative Ophthalmology and Visual Science, vol. 49, no. 9, pp. 3846-3851, 2008.
[7] L. G. Nyúl, "Retinal image analysis for automated glaucoma risk evaluation," Proc. SPIE: Medical Imaging, Parallel Processing of Images, and Optimization Techniques, vol. 7497, pp. 1-9, 2009.
[8] R. A. Gafar and T. Morris, "Progress towards detection and characterisation of the optic disk in glaucoma and diabetic retinopathy," Informatics for Health and Social Care, vol. 32, no. 1, pp. 19-25, 2007.
[9] J. Staal, M. Abràmoff, M. Niemeijer, M. Viergever, and B. V. Ginneken, "Ridge based vessel segmentation in color images of the retina," IEEE Transactions on Medical Imaging, vol. 23, no. 4, pp. 501-509, 2004.
[10] J. Nayak, U. R. Acharya, P. S. Bhat, N. Shetty, and T. C. Lim, "Automated diagnosis of glaucoma using digital fundus images," Journal of Medical Systems, vol. 33, no. 5, pp. 337-346, 2009.
[11] W. L. Yun, U. R. Acharya, Y. Venkatesh, C. Chee, L. C. Min, and E. Ng, "Identification of different stages of diabetic retinopathy using retinal optical images," Information Sciences, vol. 178, no. 1, pp. 106-121, 2008.
[12] J. Nayak, P. S. Bhat, U. R. Acharya, C. Lim, and M. Kagathi, "Automated identification of diabetic retinopathy stages using digital fundus images," Journal of Medical Systems, vol. 32, no. 2, pp. 107-115, 2008.
[13] N. V. Swindale, G. Stjepanovic, A. Chin, and F. S. Mikelberg, "Automated analysis of normal and glaucomatous optic nerve head topography images," Investigative Ophthalmology and Visual Science, vol. 41, no. 7, pp. 1730-1742, 2000.
[14] W. Adler, T. Hothorn, and B. Lausen, "Simulation based analysis of automated classification of medical images," Methods of Information in Medicine, vol. 43, no. 2, pp. 150-155, 2004.
[15] U. R. Acharya, S. Dua, X. Du, S. V. Sree, and C. K. Chua, "Automated diagnosis of glaucoma using texture and higher order spectra features," IEEE Transactions on Information Technology in Biomedicine, vol. 15, no. 3, pp. 449-455, 2011.
[16] S. Dua, U. R. Acharya, P. Chowriappa, and S. V. Sree, "Wavelet based energy features for glaucomatous image classification," IEEE Transactions on Information Technology in Biomedicine, vol. 16, no. 1, pp. 80-87, 2012.
[17] A. Gunduz and J. C. Principe, "Correntropy as a novel measure for nonlinearity tests," Signal Processing, vol. 89, no. 1, pp. 14-23, 2009.
[18] J. Gilles, "Empirical wavelet transform," IEEE Transactions on Signal Processing, vol. 61, no. 16, pp. 3999-4010, 2013.
[19] J. Gilles, G. Tran, and S. Osher, "2D empirical transforms. Wavelets, ridgelets, and curvelets revisited," SIAM Journal on Imaging Sciences, vol. 7, no. 1, pp. 157-186, 2014.
[20] I. Santamaria, P. Pokharel, and J. Principe, "Generalized correlation function: Definition, properties, and application to blind equalization," IEEE Transactions on Signal Processing, vol. 54, no. 6, pp. 2187-2197, 2006.
[21] W. Liu, P. Pokharel, and J. Principe, "Correntropy: Properties and applications in non-Gaussian signal processing," IEEE Transactions on Signal Processing, vol. 55, no. 11, pp. 5286-5298, 2007.
[22] S. Patidar, R. B. Pachori, and U. R. Acharya, "Automated diagnosis of coronary artery disease using tunable-Q wavelet transform applied on heart rate signals," Knowledge-Based Systems, vol. 82, pp. 1-10, 2015.
[23] J. F. Box, "Guinness, Gosset, Fisher, and small samples," Statistical Science, vol. 2, no. 1, pp. 45-52, 1987.
[24] U. R. Acharya, E. Ng, L. W. J. Eugene, K. P. Noronha, L. C. Min, K. P. Nayak, and S. V. Bhandary, "Decision support system for the glaucoma using Gabor transformation," Biomedical Signal Processing and Control, vol. 15, pp. 18-26, 2015.
[25] U. R. Acharya, K. S. Vidya, D. N. Ghista, W. J. E. Lim, F. Molinari, and M. Sankaranarayanan, "Computer aided diagnosis of diabetic subjects by heart rate variability signals using discrete wavelet transform method," Knowledge-Based Systems, vol. 81, pp. 56-64, 2015.
[26] M. H. Dunham, Data Mining: Introductory and Advanced Topics. Upper Saddle River, NJ, USA: Prentice Hall PTR, 2002.
[27] J. A. K. Suykens and J. Vandewalle, "Least squares support vector machine classifiers," Neural Processing Letters, vol. 9, no. 3, pp. 293-300, 1999.
[28] J. Suykens, T. V. Gestel, J. D. Brabanter, B. D. Moor, and J. Vandewalle, Least Squares Support Vector Machines. Singapore: World Scientific, 2002, vol. 4.
[29] A. H. Khandoker, D. T. H. Lai, R. K. Begg, and M. Palaniswami, "Wavelet based feature extraction for support vector machines for screening balance impairments in the elderly," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 15, no. 4, pp. 587-597, 2007.
[30] M. Zavar, S. Rahati, M. R. Akbarzadeh-T, and H. Ghasemifard, "Evolutionary model selection in a wavelet-based support vector machine for automated seizure detection," Expert Systems with Applications, vol. 38, no. 9, pp. 10751-10758, 2011.
[31] V. Bajaj and R. B. Pachori, "Classification of seizure and nonseizure EEG signals using empirical mode decomposition," IEEE Transactions on Information Technology in Biomedicine, vol. 16, no. 6, pp. 1135-1142, 2012.
[32] S. Patidar, R. B. Pachori, and N. Garg, "Automatic diagnosis of septal defects based on tunable-Q wavelet transform of cardiac sound signals," Expert Systems with Applications, vol. 42, no. 7, pp. 3315-3326, 2015.
[33] R. Sharma, R. B. Pachori, and U. R. Acharya, "Application of entropy measures on intrinsic mode functions for the automated identification of focal electroencephalogram signals," Entropy, vol. 17, no. 2, pp. 669-691, 2015.
[34] R. Sharma and R. B. Pachori, "Classification of epileptic seizures in EEG signals based on phase space representation of intrinsic mode functions," Expert Systems with Applications, vol. 42, no. 3, pp. 1106-1117, 2015.
[35] R. B. Pachori, R. Sharma, and S. Patidar, "Classification of normal and epileptic seizure EEG signals based on empirical mode decomposition," in Complex System Modelling and Control Through Intelligent Soft Computations, ser. Studies in Fuzziness and Soft Computing, Q. Zhu and A. T. Azar, Eds. Springer International Publishing, 2015, vol. 319, pp. 367-388.
[36] R. B. Pachori and S. Patidar, "Epileptic seizure classification in EEG signals using second-order difference plot of intrinsic mode functions," Computer Methods and Programs in Biomedicine, vol. 113, no. 2, pp. 494-502, 2014.
[37] K. P. Noronha, U. R. Acharya, K. P. Nayak, R. J. Martis, and S. V. Bhandary, "Automated classification of glaucoma stages using higher order cumulant features," Biomedical Signal Processing and Control, vol. 10, pp. 174-183, 2014.
[38] S. Li, W. Zhou, Q. Yuan, S. Geng, and D. Cai, "Feature extraction and recognition of ictal EEG using EMD and SVM," Computers in Biology and Medicine, vol. 43, no. 7, pp. 807-816, 2013.
[39] L. Zhang, W. Zhou, and L. Jiao, "Wavelet support vector machine," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 34, no. 1, pp. 34-39, 2004.
[40] R. Kolar and J. Jan, "Detection of glaucomatous eye via color fundus images using fractal dimensions," Radio Engineering, vol. 17, no. 3, pp. 109-114, 2008.
[41] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection," in International Joint Conference on Artificial Intelligence, vol. 14, no. 2, 1995, pp. 1137-1145.
[42] A. T. Azar and S. A. El-Said, "Performance analysis of support vector machines classifiers in breast cancer mammography recognition," Neural Computing and Applications, vol. 24, no. 5, pp. 1163-1177, 2014.

Shishir Maheshwari received the BE degree in Electronics and Communication Engineering from Rajiv Gandhi Technological University, Bhopal, India in 2010, and the MTech degree in Signal and Image Processing from the National Institute of Technology, Rourkela, India in 2014. He is currently working towards the PhD degree in Electrical Engineering at the Indian Institute of Technology Indore, Indore, India. His research interests include the area of biomedical signal and image processing.

Ram Bilas Pachori (M'2013) received the BE degree with honors in Electronics and Communication Engineering from Rajiv Gandhi Technological University, Bhopal, India in 2001, and the MTech and PhD degrees in Electrical Engineering from the Indian Institute of Technology Kanpur, Kanpur, India in 2003 and 2008, respectively. He worked as a Postdoctoral Fellow at the Charles Delaunay Institute, University of Technology of Troyes, Troyes, France during 2007-08. He served as an Assistant Professor at the Communication Research Center, International Institute of Information Technology, Hyderabad, India during 2008-09. He served as an Assistant Professor at the Discipline of Electrical Engineering, School of Engineering, Indian Institute of Technology Indore, Indore, India during 2009-13, where he has been working as an Associate Professor since 2013. He worked as a Visiting Scholar at the Intelligent Systems Research Center, School of Computing and Intelligent Systems, University of Ulster, Northern Ireland, UK during December 2014. His research interests are in the areas of biomedical signal processing, non-stationary signal processing, speech signal processing, signal processing for communications, computer-aided medical diagnosis, and signal processing applications.

U. Rajendra Acharya (SM'2009), PhD, DEng, is a senior faculty member at Ngee Ann Polytechnic, Singapore. He is also (i) Adjunct Professor at the University of Malaya, Malaysia, (ii) Adjunct Faculty at the Singapore Institute of Technology-University of Glasgow, Singapore, and (iii) Associate Faculty at SIM University, Singapore. He received his PhD from the National Institute of Technology Karnataka (Surathkal, India) and DEng from Chiba University (Japan). He has published more than 300 papers in refereed international SCI-IF journals (260), international conference proceedings (42) and books (17), with more than 7000 citations in Google Scholar (h-index of 44). He has worked on various funded projects with grants worth more than 2 million SGD. He is on the editorial board of many journals and has served as Guest Editor for many journals. His major academic interests are in biomedical signal processing, bio-imaging, data mining, visualization and biophysics for better health care design, delivery and therapy. Please visit http://urajendraacharya.webs.com/ for more details.
