ABSTRACT

Information inequalities are very useful and play a fundamental role in the literature of Information Theory. Applications of information inequalities have been discussed by well-known authors such as Dragomir, Taneja, and many other researchers. In this research paper we consider some new functional information inequalities in the form of generalized information divergence measures. We also consider relations between Csiszar's f-divergence, the new f-divergence, and other well-known divergence measures using information inequalities. Numerical bounds of an information divergence measure are also studied.
KEYWORDS

Csiszar's f-divergence, new f-divergence measure, relative information of type s, J-divergence of type s, relative J-divergence of type s, etc.
1. INTRODUCTION
Let $\Gamma_n$, $n \ge 2$, be the set of all complete finite discrete probability distributions. Many information and divergence measures exist in the literature of Information Theory and Statistics. Csiszar [2], [3] introduced a generalized measure of information, the f-divergence, given by
$$I_f(P,Q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right), \qquad (1.1)$$
where $f$ is a convex function and $P, Q \in \Gamma_n$.
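As a quick illustration (not from the paper), the measure (1.1) can be evaluated numerically; the helper name below is ours, and the Kullback-Leibler divergence is recovered from the generating function $f(t) = t\log t$:

```python
# Minimal sketch (not from the paper): evaluating the Csiszar
# f-divergence (1.1) numerically. The helper name is ours.
import math

def csiszar_f_divergence(P, Q, f):
    """I_f(P,Q) = sum_i q_i * f(p_i / q_i)."""
    return sum(q * f(p / q) for p, q in zip(P, Q))

# f(t) = t*log(t) recovers the Kullback-Leibler divergence D(P,Q).
P = [0.2, 0.5, 0.3]
Q = [0.3, 0.4, 0.3]
print(csiszar_f_divergence(P, Q, lambda t: t * math.log(t)))
```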
The relative information of type s is given by
$$\Phi_s(P,Q) = \begin{cases} K_s(P,Q) = [s(s-1)]^{-1}\left[\displaystyle\sum_{i=1}^{n} p_i^{\,s} q_i^{\,1-s} - 1\right], & s \neq 0, 1,\\[6pt] D(Q,P) = \displaystyle\sum_{i=1}^{n} q_i \log\frac{q_i}{p_i}, & s = 0,\\[6pt] D(P,Q) = \displaystyle\sum_{i=1}^{n} p_i \log\frac{p_i}{q_i}, & s = 1, \end{cases} \qquad (1.2)$$
and the J-divergence of type s and the relative J-divergence of type s are given by
$$\zeta_s(P,Q) = \begin{cases} J_s(P,Q) = (s-1)^{-1}\left[\displaystyle\sum_{i=1}^{n} (p_i - q_i)\left(\frac{p_i}{q_i}\right)^{s-1}\right], & s \neq 1,\\[6pt] J(P,Q) = \displaystyle\sum_{i=1}^{n} (p_i - q_i)\log\frac{p_i}{q_i}, & s = 1, \end{cases} \qquad (1.3)$$
and
$$\Omega_s(P,Q) = \begin{cases} D_s(P,Q) = (s-1)^{-1}\left[\displaystyle\sum_{i=1}^{n} (p_i - q_i)\left(\frac{p_i+q_i}{2q_i}\right)^{s-1}\right], & s \neq 1,\\[6pt] J_R(P,Q) = \displaystyle\sum_{i=1}^{n} (p_i - q_i)\log\frac{p_i+q_i}{2q_i}, & s = 1. \end{cases} \qquad (1.4)$$
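A minimal sketch of the three type-s families (1.2)-(1.4), with the limiting cases at $s = 0$ and $s = 1$ handled explicitly; the function names phi_s, zeta_s, and omega_s are our own shorthand, not the paper's notation:

```python
# Sketch: the type-s families (1.2)-(1.4) with their limiting cases.
import math

def phi_s(P, Q, s):
    """Relative information of type s, eq. (1.2)."""
    if s == 0:    # D(Q,P)
        return sum(q * math.log(q / p) for p, q in zip(P, Q))
    if s == 1:    # D(P,Q)
        return sum(p * math.log(p / q) for p, q in zip(P, Q))
    return (sum(p**s * q**(1 - s) for p, q in zip(P, Q)) - 1) / (s * (s - 1))

def zeta_s(P, Q, s):
    """J-divergence of type s, eq. (1.3)."""
    if s == 1:    # J(P,Q)
        return sum((p - q) * math.log(p / q) for p, q in zip(P, Q))
    return sum((p - q) * (p / q)**(s - 1) for p, q in zip(P, Q)) / (s - 1)

def omega_s(P, Q, s):
    """Relative J-divergence of type s, eq. (1.4)."""
    if s == 1:    # J_R(P,Q)
        return sum((p - q) * math.log((p + q) / (2 * q)) for p, q in zip(P, Q))
    return sum((p - q) * ((p + q) / (2 * q))**(s - 1) for p, q in zip(P, Q)) / (s - 1)

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
print(phi_s(P, Q, 2), zeta_s(P, Q, 0.5), omega_s(P, Q, 1))
```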
2. NEW F-DIVERGENCE MEASURE

The new f-divergence measure introduced by Jain & Saraswat [7], [8] is given by
$$S_f(P,Q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i+q_i}{2q_i}\right), \qquad (2.1)$$
where $f: [0, \infty) \to \mathbb{R}$ is a convex function and $P, Q \in \Gamma_n$. The following propositions were presented by Jain & Saraswat in [7] and [8].

Proposition 2.1. Let $f: [0, \infty) \to \mathbb{R}$ be convex and $P, Q \in \Gamma_n$. Then
$$S_f(P,Q) \ge f(1), \qquad (2.2)$$
with equality iff
$$p_i = q_i, \quad i = 1, 2, \ldots, n. \qquad (2.3)$$
If $f$ is normalized, i.e.
$$f(1) = 0, \qquad (2.4)$$
then for any $P, Q \in \Gamma_n$, from (2.2) of Proposition 2.1 and (2.4), we have the inequality
$$S_f(P,Q) \ge 0, \qquad (2.5)$$
with equality iff
$$p_i = q_i, \quad i = 1, 2, \ldots, n. \qquad (2.6)$$
In particular, if $P$ and $Q$ are probability vectors, then Corollary 2.1.1 shows that for a strictly convex and normalized $f: [0, \infty) \to \mathbb{R}$,
$$S_f(P,Q) \ge 0 \quad \text{and} \quad S_f(P,Q) = 0 \ \text{iff} \ P = Q. \qquad (2.7)$$

Proposition 2.2. If $f_1$ and $f_2$ are two convex functions and $g = a f_1 + b f_2$, then $S_g(P,Q) = a\, S_{f_1}(P,Q) + b\, S_{f_2}(P,Q)$, where $P, Q \in \Gamma_n$.
We now give some examples of well-known information divergence measures which are obtained from the new f-divergence measure.

Chi-square divergence:- If $f(t) = t^2 - 1$, then the chi-square divergence is given by
$$S_f(P,Q) = \frac{1}{4}\left[\sum_{i=1}^{n} \frac{p_i^2}{q_i} - 1\right] = \frac{1}{4}\,\chi^2(P,Q). \qquad (2.8)$$
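The identity (2.8) is easy to confirm numerically; the following sketch (our own, with illustrative helper names and distributions) checks that $S_f$ with $f(t) = t^2 - 1$ agrees with one quarter of the chi-square divergence:

```python
# Sketch: checking identity (2.8) numerically. With f(t) = t^2 - 1 the
# new f-divergence S_f reduces to one quarter of the chi-square divergence.
def S_f(P, Q, f):
    """S_f(P,Q) = sum_i q_i * f((p_i + q_i) / (2*q_i)), eq. (2.1)."""
    return sum(q * f((p + q) / (2 * q)) for p, q in zip(P, Q))

def chi_square(P, Q):
    """chi^2(P,Q) = sum_i p_i^2/q_i - 1."""
    return sum(p * p / q for p, q in zip(P, Q)) - 1

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
lhs = S_f(P, Q, lambda t: t * t - 1)
rhs = chi_square(P, Q) / 4
assert abs(lhs - rhs) < 1e-12
```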
Relative Jensen-Shannon divergence:- If $f(t) = -\log t$, then the relative Jensen-Shannon divergence is given by
$$S_f(P,Q) = \sum_{i=1}^{n} q_i \log\frac{2q_i}{p_i+q_i} = F(Q,P). \qquad (2.9)$$
Relative arithmetic-geometric divergence measure:- If $f(t) = t \log t$, then the relative arithmetic-geometric divergence measure is given by
$$S_f(P,Q) = \sum_{i=1}^{n} \frac{p_i+q_i}{2}\,\log\frac{p_i+q_i}{2q_i} = G(Q,P). \qquad (2.10)$$

Triangular discrimination:- If $f(t) = \dfrac{(t-1)^2}{t}$, $t > 0$, then the triangular discrimination is given by
$$S_f(P,Q) = \sum_{i=1}^{n} \frac{(p_i-q_i)^2}{2(p_i+q_i)} = \frac{1}{2}\,\Delta(P,Q). \qquad (2.11)$$
Relative J-divergence measure:- If $f(t) = (t-1)\log t$, then the relative J-divergence measure is given by
$$S_f(P,Q) = \sum_{i=1}^{n} \frac{p_i-q_i}{2}\,\log\frac{p_i+q_i}{2q_i} = \frac{1}{2}\,J_R(P,Q). \qquad (2.12)$$

Hellinger discrimination:- If $f(t) = 1 - \sqrt{t}$, then
$$S_f(P,Q) = 1 - B\!\left(\frac{P+Q}{2},\, Q\right) = h\!\left(\frac{P+Q}{2},\, Q\right), \qquad (2.13)$$
where $B(P,Q) = \sum_{i=1}^{n}\sqrt{p_i q_i}$ is the Bhattacharyya coefficient [1] and $h(P,Q) = 1 - B(P,Q)$ is the Hellinger discrimination [6].
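The remaining identities (2.10)-(2.13) can be spot-checked the same way; the sketch below (ours, with hypothetical distributions P and Q) verifies each generating function against its closed form:

```python
# Sketch: numerical check of identities (2.10)-(2.13); S_f is as in the
# previous sketch, B is the Bhattacharyya coefficient.
import math

def S_f(P, Q, f):
    return sum(q * f((p + q) / (2 * q)) for p, q in zip(P, Q))

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]

# (2.10): f(t) = t*log(t) gives the relative AG divergence G(Q,P).
G_QP = sum((p + q) / 2 * math.log((p + q) / (2 * q)) for p, q in zip(P, Q))
assert abs(S_f(P, Q, lambda t: t * math.log(t)) - G_QP) < 1e-12

# (2.11): f(t) = (t-1)^2/t gives half the triangular discrimination.
half_delta = sum((p - q)**2 / (2 * (p + q)) for p, q in zip(P, Q))
assert abs(S_f(P, Q, lambda t: (t - 1)**2 / t) - half_delta) < 1e-12

# (2.12): f(t) = (t-1)*log(t) gives half the relative J-divergence J_R.
half_JR = sum((p - q) / 2 * math.log((p + q) / (2 * q)) for p, q in zip(P, Q))
assert abs(S_f(P, Q, lambda t: (t - 1) * math.log(t)) - half_JR) < 1e-12

# (2.13): f(t) = 1 - sqrt(t) gives 1 - B((P+Q)/2, Q) = h((P+Q)/2, Q).
B_mid = sum(math.sqrt(q * (p + q) / 2) for p, q in zip(P, Q))
assert abs(S_f(P, Q, lambda t: 1 - math.sqrt(t)) - (1 - B_mid)) < 1e-12
```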
3. INFORMATION INEQUALITIES
Theorem 3.1 below was presented by Taneja & Pranesh Kumar in [11], and Theorem 3.2 by Jain & Saraswat in [7].
Theorem 3.1. Let $f: I \subset \mathbb{R}_+ \to \mathbb{R}$ be a differentiable convex function with $f(1) = 0$, let $P, Q \in \Gamma_n$, and suppose
$$0 < r \le \frac{p_i}{q_i} \le R < \infty, \quad i = 1, 2, \ldots, n,$$
for some $r$ and $R$ with $0 < r \le 1 \le R < \infty$. Define
$$W\!I_f(P,Q) = \sum_{i=1}^{n} (p_i - q_i)\, f'\!\left(\frac{p_i}{q_i}\right). \qquad (3.1)$$
Then
$$0 \le I_f(P,Q) \le W\!I_f(P,Q) \le \frac{1}{4}\,(R-r)\left[f'(R) - f'(r)\right] \qquad (3.2)$$
and
$$0 \le I_f(P,Q) \le \frac{(R-1)f(r) + (1-r)f(R)}{R-r} \le \frac{1}{4}\,(R-r)\left[f'(R) - f'(r)\right]. \qquad (3.3)$$
Theorem 3.2. Let $f: (0, \infty) \to \mathbb{R}$ be a differentiable convex function and $P, Q \in \Gamma_n$. Then
$$0 \le S_f(P,Q) - f(1) \le \frac{1}{2}\sum_{i=1}^{n} (p_i - q_i)\, f'\!\left(\frac{p_i+q_i}{2q_i}\right), \qquad (3.4)$$
where
$$S_f(P,Q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i+q_i}{2q_i}\right)$$
and $f'$ is the derivative of $f$. If $f$ is strictly convex and $p_i, q_i > 0$ $(i = 1, 2, \ldots, n)$, then equality holds in (3.4) iff
$$\frac{p_1}{q_1} = \frac{p_2}{q_2} = \cdots = \frac{p_n}{q_n}.$$
Corollary 3.1. If the function $f$ is normalized, i.e. $f(1) = 0$, then we have the following inequality:
$$0 \le S_f(P,Q) \le \sum_{i=1}^{n} \frac{p_i-q_i}{2}\, f'\!\left(\frac{p_i+q_i}{2q_i}\right),$$
that is,
$$0 \le S_f(P,Q) \le ES_{f'}(P,Q), \qquad (3.5)$$
where
$$ES_{f'}(P,Q) = \sum_{i=1}^{n} \frac{p_i-q_i}{2}\, f'\!\left(\frac{p_i+q_i}{2q_i}\right).$$
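As a sanity check (our own sketch, not part of the source), Corollary 3.1 can be verified numerically for the normalized convex choice $f(t) = t\log t$:

```python
# Sketch: Corollary 3.1 for a normalized convex f. With f(t) = t*log(t),
# both S_f(P,Q) and the upper bound ES_f'(P,Q) are computed and the
# inequality 0 <= S_f <= ES_f' is verified.
import math

def S_f(P, Q, f):
    return sum(q * f((p + q) / (2 * q)) for p, q in zip(P, Q))

def ES_fprime(P, Q, fprime):
    """ES_f'(P,Q) = sum_i (p_i - q_i)/2 * f'((p_i + q_i)/(2*q_i))."""
    return sum((p - q) / 2 * fprime((p + q) / (2 * q)) for p, q in zip(P, Q))

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
f = lambda t: t * math.log(t)      # convex, f(1) = 0
fp = lambda t: math.log(t) + 1     # derivative of f

s, es = S_f(P, Q, f), ES_fprime(P, Q, fp)
assert 0 <= s <= es
```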
4. NEW FUNCTIONAL INFORMATION INEQUALITIES

Theorem 4.1. Let $f$, $P$, $Q$, $r$, and $R$ satisfy the conditions of Theorem 3.1. Then
$$S_f(P,Q) \le ES_{f'}(P,Q) \le I_f(P,Q) \le W\!I_f(P,Q) \le \frac{1}{4}\,(R-r)\left[f'(R) - f'(r)\right], \qquad (4.1)$$
$$S_f(P,Q) \le ES_{f'}(P,Q) \le I_f(P,Q) \le \frac{(R-1)f(r) + (1-r)f(R)}{R-r}, \qquad (4.2)$$
$$S_f(P,Q) \le ES_{f'}(P,Q) \le I_f(P,Q) \le \frac{(R-1)f(r) + (1-r)f(R)}{R-r} \le \frac{1}{4}\,(R-r)\left[f'(R) - f'(r)\right]. \qquad (4.3)$$
Proof. Since $f$ is differentiable and convex, we have
$$(x - y)\, f'(y) \le f(x) - f(y), \quad x, y \in I.$$
Now we take $y = \frac{x+1}{2}$, which gives
$$\frac{x-1}{2}\, f'\!\left(\frac{x+1}{2}\right) \le f(x) - f\!\left(\frac{x+1}{2}\right). \qquad (4.4)$$
Putting $x = \frac{p_i}{q_i}$ in (4.4), we obtain
$$\frac{p_i-q_i}{2q_i}\, f'\!\left(\frac{p_i+q_i}{2q_i}\right) \le f\!\left(\frac{p_i}{q_i}\right) - f\!\left(\frac{p_i+q_i}{2q_i}\right).$$
Multiplying by $q_i$ and summing over $i = 1, 2, \ldots, n$ yields
$$\sum_{i=1}^{n} \frac{p_i-q_i}{2}\, f'\!\left(\frac{p_i+q_i}{2q_i}\right) \le \sum_{i=1}^{n} q_i f\!\left(\frac{p_i}{q_i}\right) - \sum_{i=1}^{n} q_i f\!\left(\frac{p_i+q_i}{2q_i}\right),$$
that is,
$$ES_{f'}(P,Q) + S_f(P,Q) \le I_f(P,Q). \qquad (4.5)$$
Hence, since $S_f(P,Q) \ge 0$, we get
$$ES_{f'}(P,Q) \le I_f(P,Q). \qquad (4.6)$$
Combining (4.6) with Corollary 3.1 gives
$$S_f(P,Q) \le ES_{f'}(P,Q) \le I_f(P,Q) \qquad (4.7)$$
and, together with $I_f(P,Q) \le W\!I_f(P,Q)$ from (3.2),
$$S_f(P,Q) \le ES_{f'}(P,Q) \le I_f(P,Q) \le W\!I_f(P,Q). \qquad (4.8)$$
Now (3.2) and (4.7) give the result (4.1), while (3.3) and (4.8) give the results (4.2) and (4.3), respectively.
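The full chain (4.1) can likewise be checked on a concrete pair of distributions; the sketch below (ours) uses $f(t) = t\log t$, for which $f'(t) = \log t + 1$:

```python
# Sketch: verifying the chain (4.1),
#   S_f <= ES_f' <= I_f <= WI_f <= (R - r)(f'(R) - f'(r))/4,
# for f(t) = t*log(t) on a concrete pair (P, Q).
import math

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
f = lambda t: t * math.log(t)
fp = lambda t: math.log(t) + 1

ratios = [p / q for p, q in zip(P, Q)]
r, R = min(ratios), max(ratios)    # r <= p_i/q_i <= R

S  = sum(q * f((p + q) / (2 * q)) for p, q in zip(P, Q))
ES = sum((p - q) / 2 * fp((p + q) / (2 * q)) for p, q in zip(P, Q))
I  = sum(q * f(p / q) for p, q in zip(P, Q))
WI = sum((p - q) * fp(p / q) for p, q in zip(P, Q))
U  = (R - r) * (fp(R) - fp(r)) / 4

assert S <= ES <= I <= WI <= U
```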
5. APPLICATIONS OF FUNCTIONAL INFORMATION INEQUALITIES

In this section we establish relationships between various known generalized information measures of type s, in the form of inequalities, using the results (4.1), (4.2), and (4.3) of Theorem 4.1. In the following theorem we consider the applications of functional information inequalities to relations between the unified relative Jensen-Shannon and arithmetic-geometric divergence of type s, the relative J-divergence of type s, the relative information of type s, etc.
Theorem 5.1. Let $P, Q \in \Gamma_n$ and $0 < r \le \frac{p_i}{q_i} \le R < \infty$, $i = 1, 2, \ldots, n$. Then for any real $s$ (the cases $s = 0, 1$ being understood as limits),
$$\xi_s(Q,P) \le \frac{1}{2}\,\Omega_s(P,Q) \le \Phi_s(P,Q) \le \zeta_s(P,Q) \le U_s(r,R), \qquad (5.1)$$
$$\xi_s(Q,P) \le \frac{1}{2}\,\Omega_s(P,Q) \le \Phi_s(P,Q) \le T_s(r,R), \qquad (5.2)$$
$$\xi_s(Q,P) \le \frac{1}{2}\,\Omega_s(P,Q) \le \Phi_s(P,Q) \le T_s(r,R) \le U_s(r,R), \qquad (5.3)$$
where
$$\xi_s(Q,P) = [s(s-1)]^{-1}\left[\sum_{i=1}^{n} q_i\left(\frac{p_i+q_i}{2q_i}\right)^{s} - 1\right]$$
is the unified relative Jensen-Shannon and arithmetic-geometric divergence of type s (with $\xi_0(Q,P) = F(Q,P)$ and $\xi_1(Q,P) = G(Q,P)$ in the limits), and
$$T_s(r,R) = \frac{(R-1)(r^s - 1) + (1-r)(R^s - 1)}{s(s-1)(R-r)}, \qquad U_s(r,R) = \frac{(R-r)\left(R^{s-1} - r^{s-1}\right)}{4(s-1)}.$$

Proof. Consider
$$f_s(t) = [s(s-1)]^{-1}(t^s - 1) \ \ \text{if} \ s \neq 0, 1, \qquad (5.4)$$
with the limiting cases $f_0(t) = -\log t$ and $f_1(t) = t \log t$. Since $f_s''(t) = t^{s-2} > 0$ and $f_s(1) = 0$, this $f_s$ is convex and normalized, with $f_s'(t) = (s-1)^{-1} t^{s-1}$. For this choice,
$$S_{f_s}(P,Q) = \xi_s(Q,P), \qquad (5.5)$$
$$ES_{f_s'}(P,Q) = \frac{1}{2}\,\Omega_s(P,Q), \qquad (5.6)$$
$$I_{f_s}(P,Q) = \Phi_s(P,Q), \qquad (5.7)$$
$$W\!I_{f_s}(P,Q) = \zeta_s(P,Q). \qquad (5.8)$$
Moreover,
$$f_s(R) = [s(s-1)]^{-1}(R^s - 1) \ \ \text{if} \ s \neq 0, 1, \qquad (5.9)$$
$$f_s(r) = [s(s-1)]^{-1}(r^s - 1) \ \ \text{if} \ s \neq 0, 1, \qquad (5.10)$$
so that
$$T_s(r,R) = \frac{(R-1)f_s(r) + (1-r)f_s(R)}{R-r} = \frac{(R-1)(r^s - 1) + (1-r)(R^s - 1)}{s(s-1)(R-r)} \qquad (5.11)$$
and
$$U_s(r,R) = \frac{1}{4}\,(R-r)\left[f_s'(R) - f_s'(r)\right] = \frac{(R-r)\left(R^{s-1} - r^{s-1}\right)}{4(s-1)}. \qquad (5.12)$$
Using equations (5.4)-(5.12) in (4.1), (4.2), and (4.3) gives the relations (5.1), (5.2), and (5.3).
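A numerical spot check of relation (5.3) (our own sketch; the identifiers mirror the symbols above) for one admissible value, $s = 2$:

```python
# Sketch: the bounds T_s(r,R) and U_s(r,R) of (5.11)-(5.12) and a spot
# check of relation (5.3) for s = 2.
def T_s(r, R, s):
    """T_s(r,R) = [(R-1)(r^s - 1) + (1-r)(R^s - 1)] / [s(s-1)(R-r)]."""
    return ((R - 1) * (r**s - 1) + (1 - r) * (R**s - 1)) / (s * (s - 1) * (R - r))

def U_s(r, R, s):
    """U_s(r,R) = (R-r)(R^(s-1) - r^(s-1)) / (4(s-1))."""
    return (R - r) * (R**(s - 1) - r**(s - 1)) / (4 * (s - 1))

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
s = 2
ratios = [p / q for p, q in zip(P, Q)]
r, R = min(ratios), max(ratios)

xi    = (sum(q * ((p + q) / (2 * q))**s for p, q in zip(P, Q)) - 1) / (s * (s - 1))
omega = sum((p - q) * ((p + q) / (2 * q))**(s - 1) for p, q in zip(P, Q)) / (s - 1)
phi   = (sum(p**s * q**(1 - s) for p, q in zip(P, Q)) - 1) / (s * (s - 1))

assert xi <= omega / 2 <= phi <= T_s(r, R, s) <= U_s(r, R, s)
```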
In the following corollaries, we discuss some cases obtained for particular values of $s$, in Corollaries 5.1, 5.2, 5.3, and 5.4 respectively.
Corollary 5.1:- If $s = -1$, then
$$\frac{1}{4}\,\Delta(P,Q) \le \sum_{i=1}^{n} \frac{q_i^2(q_i-p_i)}{(p_i+q_i)^2} \le \frac{1}{2}\,\chi^2(Q,P) \le \frac{1}{2}\sum_{i=1}^{n} \frac{q_i^2(q_i-p_i)}{p_i^2} \le \frac{(R-r)^2(R+r)}{8R^2r^2}, \qquad (5.13)$$
$$\frac{1}{4}\,\Delta(P,Q) \le \sum_{i=1}^{n} \frac{q_i^2(q_i-p_i)}{(p_i+q_i)^2} \le \frac{1}{2}\,\chi^2(Q,P) \le \frac{1}{2(R-r)}\left[\frac{(R-1)(1-r)}{r} + \frac{(1-r)(1-R)}{R}\right], \qquad (5.14)$$
$$\frac{1}{4}\,\Delta(P,Q) \le \sum_{i=1}^{n} \frac{q_i^2(q_i-p_i)}{(p_i+q_i)^2} \le \frac{1}{2}\,\chi^2(Q,P) \le \frac{1}{2(R-r)}\left[\frac{(R-1)(1-r)}{r} + \frac{(1-r)(1-R)}{R}\right] \le \frac{(R-r)^2(R+r)}{8R^2r^2}. \qquad (5.15)$$
Corollary 5.2:- If $s = 0$, then
$$F(Q,P) \le \frac{1}{2}\,\Delta(P,Q) \le D(Q,P) \le \chi^2(Q,P) \le \frac{(R-r)^2}{4Rr}, \qquad (5.16)$$
$$F(Q,P) \le \frac{1}{2}\,\Delta(P,Q) \le D(Q,P) \le \frac{(1-R)\log r + (r-1)\log R}{R-r}, \qquad (5.17)$$
$$F(Q,P) \le \frac{1}{2}\,\Delta(P,Q) \le D(Q,P) \le \frac{(1-R)\log r + (r-1)\log R}{R-r} \le \frac{(R-r)^2}{4Rr}. \qquad (5.18)$$
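The $s = 0$ chain can be confirmed on a small example as well (our own sketch; F, Delta, D, and chi-square are computed directly from their definitions):

```python
# Sketch: the s = 0 case (Corollary 5.2) on a concrete pair. F, Delta, D
# and chi^2 are the relative Jensen-Shannon divergence, triangular
# discrimination, Kullback-Leibler divergence and chi-square divergence.
import math

P, Q = [0.2, 0.5, 0.3], [0.3, 0.4, 0.3]
ratios = [p / q for p, q in zip(P, Q)]
r, R = min(ratios), max(ratios)

F_QP    = sum(q * math.log(2 * q / (p + q)) for p, q in zip(P, Q))
Delta   = sum((p - q)**2 / (p + q) for p, q in zip(P, Q))
D_QP    = sum(q * math.log(q / p) for p, q in zip(P, Q))
chi2_QP = sum(q * q / p for p, q in zip(P, Q)) - 1
U0      = (R - r)**2 / (4 * r * R)

assert F_QP <= Delta / 2 <= D_QP <= chi2_QP <= U0
```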
Corollary 5.3:- If $s = \frac{1}{2}$, then
$$4\left[1 - B\!\left(\frac{P+Q}{2}, Q\right)\right] \le \sqrt{2}\sum_{i=1}^{n} (q_i-p_i)\sqrt{\frac{q_i}{p_i+q_i}} \le 4h(P,Q) \le 2\sum_{i=1}^{n} (q_i-p_i)\sqrt{\frac{q_i}{p_i}} \le \frac{(R-r)\left(\sqrt{R}-\sqrt{r}\right)}{2\sqrt{Rr}}, \qquad (5.19)$$
$$4\left[1 - B\!\left(\frac{P+Q}{2}, Q\right)\right] \le \sqrt{2}\sum_{i=1}^{n} (q_i-p_i)\sqrt{\frac{q_i}{p_i+q_i}} \le 4h(P,Q) \le 4\,\frac{(1-R)(\sqrt{r}-1) + (r-1)(\sqrt{R}-1)}{R-r}, \qquad (5.20)$$
$$4\left[1 - B\!\left(\frac{P+Q}{2}, Q\right)\right] \le \sqrt{2}\sum_{i=1}^{n} (q_i-p_i)\sqrt{\frac{q_i}{p_i+q_i}} \le 4h(P,Q) \le 4\,\frac{(1-R)(\sqrt{r}-1) + (r-1)(\sqrt{R}-1)}{R-r} \le \frac{(R-r)\left(\sqrt{R}-\sqrt{r}\right)}{2\sqrt{Rr}}. \qquad (5.21)$$

Corollary 5.4 yields the corresponding bounds (5.22), (5.23), and (5.24).
6. NUMERICAL ILLUSTRATIONS
In this section we consider some numerical bounds of the f-divergence measure using the functional information inequalities and the binomial distribution.

Example 6.1. Let P be the binomial probability distribution for the random variable X with parameters (n = 8, p = 0.5) and let Q be its approximating normal probability distribution.
x            0      1      2       3      4      5      6       7      8
p(x)       .004   .031   .109    .219   .274   .219   .109    .031   .004
q(x)       .005   .030   .104    .220   .282   .220   .104    .030   .005
p(x)/q(x)  .774  1.042  1.0503   .997   .968   .997  1.0503  1.042   .774
It is noted that $r = \min_x p(x)/q(x) = .774 \approx .77$ and $R = \max_x p(x)/q(x) = 1.0503 \approx 1.05$, so that
$$T_s(.77, 1.05) = \frac{.05\left[(.77)^s - 1\right] + .23\left[(1.05)^s - 1\right]}{.28\, s(s-1)},$$
$$U_s(.77, 1.05) = \frac{(1.05 - .77)\left[(1.05)^{s-1} - (.77)^{s-1}\right]}{4(s-1)},$$
which give
$$T_{1/2}(.77, 1.05) = .1533, \quad T_0(.77, 1.05) = .01047, \quad T_1(.77, 1.05) = .20,$$
$$U_{1/2}(.77, 1.05) = .026819, \quad U_0(.77, 1.05) = .011, \quad U_1(.77, 1.05) = .023, \quad U_{-1}(.77, 1.05) = .009.$$
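For readers who want to reproduce the example, the following sketch (ours) discretizes and renormalizes the normal density at the integers, which is one of several reasonable choices, so the computed values may differ slightly from those above in the last digits:

```python
# Sketch: reproducing the setup of Example 6.1 -- a Binomial(8, 0.5)
# distribution P, a discretized normal approximation Q, the ratio bounds
# r and R, and the resulting T_s and U_s for a few values of s.
import math

n, prob = 8, 0.5
mu, sigma = n * prob, math.sqrt(n * prob * (1 - prob))

P = [math.comb(n, x) * prob**x * (1 - prob)**(n - x) for x in range(n + 1)]
# Normal density at the integers, renormalized to sum to 1.
dens = [math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
        for x in range(n + 1)]
Q = [d / sum(dens) for d in dens]

ratios = [p / q for p, q in zip(P, Q)]
r, R = min(ratios), max(ratios)
print(round(r, 3), round(R, 4))   # compare with r = .774, R = 1.0503 above

def T_s(r, R, s):
    return ((R - 1) * (r**s - 1) + (1 - r) * (R**s - 1)) / (s * (s - 1) * (R - r))

def U_s(r, R, s):
    return (R - r) * (R**(s - 1) - r**(s - 1)) / (4 * (s - 1))

for s in (-1, 0.5, 2):   # s = 0, 1 would need the limiting formulas
    print(s, T_s(r, R, s), U_s(r, R, s))
```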
REFERENCES

[1]. Bhattacharya A., Some analogues to amount of information and their uses in statistical estimation, Sankhya 8 (1946), 1-14.
[2]. Csiszar I., Information measures: a critical survey, in: Trans. 7th Prague Conf. on Information Theory, Statistical Decision Functions, Random Processes and 8th European Meeting of Statisticians, Vol. B, Academia, Prague, 1978, 73-86.
[3]. Csiszar I., Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar. 2 (1967), 299-318.
[4]. Dacunha-Castelle D., Ecole d'Ete de Probabilites de Saint-Flour VII-1977, Berlin, Heidelberg, New York: Springer, 1978.
[5]. Dragomir S. S., Gluscevic V. and Pearce C. E. M., Approximation for the Csiszar f-divergence via midpoint inequalities, in: Inequality Theory and Applications, Vol. 1, Y. J. Cho, J. K. Kim and S. S. Dragomir (Eds.), Nova Science Publishers, Inc., Huntington, New York, 2001, 139-154.
[6]. Hellinger E., Neue Begrundung der Theorie der quadratischen Formen von unendlichen vielen Veranderlichen, J. Reine Angew. Math. 136 (1909), 210-271.
[7]. Jain K. C. and Saraswat R. N., A new information inequality and its application in establishing relation among various f-divergence measures, Journal of Applied Mathematics, Statistics and Informatics 8(1) (2012), 17-32.
[8]. Jain K. C. and Saraswat R. N., Some bounds of information divergence measure in terms of relative arithmetic-geometric divergence measure, International Journal of Applied Mathematics and Statistics 32(2) (2013), 48-58.
[9]. Pearson K., On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, Phil. Mag. 50 (1900), 157-172.
[10]. Taneja I. J., New developments in generalized information measures, in: Advances in Imaging and Electron Physics, Ed. P. W. Hawkes, 91 (1995), 37-135.
[11]. Taneja I. J. and Kumar P., Relative information of type s, Csiszar's f-divergence, and information inequalities, Information Sciences 166(1-4) (2004), 105-125.
[12]. Taneja I. J. and Kumar P., Generalized non-symmetric divergence measures and inequalities, research report (supported by the Natural Sciences and Engineering Research Council's Discovery Grant to Pranesh Kumar).
[13]. Sibson R., Information radius, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 14 (1969), 149-160.
[14]. Wang Yufeng and Wang Wendong, A novel interest coverage method based on Jensen-Shannon divergence in sensor networks, Journal of Electronics (China) 24(4), July 2007.