
Module 24

EXPECTATIONS OF RANDOM VECTORS

() Module 24 EXPECTATIONS OF RANDOM VECTORS 1 / 19


X = (X1, . . . , Xp)′: a p-dimensional random vector of either discrete or absolutely continuous (A.C.) type;

SX : support of X ;

fX (·): p.m.f./ p.d.f. of X ;

SXi : support of Xi , i = 1, . . . , p;

fXi (·): marginal p.m.f./ p.d.f. of Xi , i = 1, . . . , p.



Result 1:

Let ψ : R^p → R be a function such that E(ψ(X)) is finite.

(i) If X is of discrete type, then

E(ψ(X)) = Σ_{x ∈ S_X} ψ(x) f_X(x).

(ii) If X is A.C., then

E(ψ(X)) = ∫_{R^p} ψ(x) f_X(x) dx.
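For the discrete case (i), the sum can be evaluated directly by iterating over the support; a minimal Python sketch, using a made-up p.m.f. (not from the slides):

```python
# E(psi(X)) for a discrete random vector: sum psi(x) * f_X(x) over the support S_X.
# The joint p.m.f. below is an illustrative assumption, not from the slides.
pmf = {(-1, 1): 0.25, (0, 0): 0.5, (1, 1): 0.25}  # x -> f_X(x), x in S_X

def expect(psi, pmf):
    """Result 1(i): E(psi(X)) = sum over x in S_X of psi(x) * f_X(x)."""
    return sum(psi(x) * p for x, p in pmf.items())

print(expect(lambda x: x[0] * x[1], pmf))  # E(X1 X2) = -0.25 + 0 + 0.25 = 0.0
```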



Definition 1:
Let X and Y be two random variables.
(a) The quantity

Cov(X , Y ) = E [(X − E (X ))(Y − E (Y ))],

provided the expectations exist, is called the covariance between r.v.s


X and Y .
(b) Suppose that Var(X ) > 0 and Var(Y ) > 0. The quantity

ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)),

provided the expectations exist, is called the correlation between X


and Y .
(c) Random variables X and Y are called uncorrelated (correlated) if
ρ(X, Y) = 0 (ρ(X, Y) ≠ 0).
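Both quantities are straightforward to compute from a joint p.m.f.; a small Python sketch of Definition 1, with illustrative values that are not from the slides:

```python
import math

# Covariance and correlation per Definition 1, computed from a joint p.m.f.
# (the p.m.f. values below are made up for illustration).
pmf = {(0, 0): 0.2, (1, 1): 0.3, (1, 2): 0.2, (2, 1): 0.3}

def E(g):
    return sum(g(x, y) * p for (x, y), p in pmf.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: (x - mx) * (y - my))            # Definition 1(a)
rho = cov / math.sqrt(E(lambda x, y: (x - mx) ** 2)  # Definition 1(b)
                      * E(lambda x, y: (y - my) ** 2))
print(round(cov, 6), round(rho, 4))  # 0.2 and about 0.4518
```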
Remark 1:

(a) Cov(X , X ) = E [(X − E (X ))2 ] = Var(X );


(b) ρ(X , X ) = 1;
(c)

Cov(X , Y ) = E [(X − E (X ))(Y − E (Y ))]


= E [XY − E (Y )X − E (X )Y + E (X )E (Y )]
= E (XY ) − E (Y )E (X ) − E (X )E (Y ) + E (X )E (Y )
= E (XY ) − E (X )E (Y ).

(d) Cov(X , Y ) = Cov(Y , X ) and ρ(X , Y ) = ρ(Y , X );


(e) If X and Y are independent, then Cov(X, Y) = 0 (equivalently, ρ(X, Y) = 0).
The converse may fail: uncorrelated r.v.s need not be independent (see Example 1).



Example 1 :
Let (X , Y ) be an A.C. bivariate r.v. with joint p.d.f.
f(x, y) = 1, if 0 < |y| ≤ x < 1, and 0, otherwise.
Show that X and Y are uncorrelated but not independent.
Solution.

E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy = ∫_0^1 ∫_{−x}^{x} xy dy dx = 0,

E(Y) = ∫_0^1 ∫_{−x}^{x} y dy dx = 0,

so Cov(X, Y) = E(XY) − E(X)E(Y) = 0, and hence ρ(X, Y) = 0.
On the other hand, S_X = [0, 1], S_Y = [−1, 1], while

S_{X,Y} = {(x, y) ∈ R²: 0 < |y| ≤ x < 1} ≠ S_X × S_Y,

so X and Y are not independent.
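The conclusion can also be checked numerically. The sketch below uses the marginals f_X(x) = 2x and f_Y(y) = 1 − |y| (obtained by integrating the joint density; this computation is mine, not stated on the slides) and approximates E(XY) with a midpoint Riemann sum:

```python
# Numeric check of Example 1: X and Y are uncorrelated but not independent.
# Joint density f(x, y) = 1 on {0 < |y| <= x < 1}; integrating it gives the
# marginals f_X(x) = 2x on (0, 1) and f_Y(y) = 1 - |y| on (-1, 1).
def f(x, y):
    return 1.0 if 0 < abs(y) <= x < 1 else 0.0

def f_X(x):
    return 2 * x if 0 < x < 1 else 0.0

def f_Y(y):
    return 1 - abs(y) if -1 < y < 1 else 0.0

# E(XY) via a midpoint Riemann sum; the inner y-integral vanishes by symmetry.
n = 500
e_xy = 0.0
for i in range(n):
    x = (i + 0.5) / n
    for j in range(n):
        y = -x + (j + 0.5) * (2 * x / n)
        e_xy += x * y * f(x, y) * (1.0 / n) * (2 * x / n)
print(round(e_xy, 6))  # ~0, so Cov(X, Y) = E(XY) - E(X)E(Y) = 0 since E(Y) = 0

# Dependence: at (x, y) = (0.2, 0.5) the joint density is 0,
# yet f_X(0.2) * f_Y(0.5) = 0.4 * 0.5 = 0.2 > 0.
print(f(0.2, 0.5), f_X(0.2) * f_Y(0.5))
```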


Result 2: Let X = (X_1, . . . , X_{p1})′ and Y = (Y_1, . . . , Y_{p2})′ be random vectors and let
a_1, . . . , a_{p1}, b_1, . . . , b_{p2} be real constants. Then, provided the involved
expectations are finite,

(a) E(Σ_{i=1}^{p1} a_i X_i) = Σ_{i=1}^{p1} a_i E(X_i);

(b) Cov(Σ_{i=1}^{p1} a_i X_i, Σ_{j=1}^{p2} b_j Y_j) = Σ_{i=1}^{p1} Σ_{j=1}^{p2} a_i b_j Cov(X_i, Y_j);



(c)
Var(Σ_{i=1}^{p1} a_i X_i) = Σ_{i=1}^{p1} a_i² Var(X_i) + Σ_{i≠j} a_i a_j Cov(X_i, X_j)
                          = Σ_{i=1}^{p1} a_i² Var(X_i) + 2 Σ_{1≤i<j≤p1} a_i a_j Cov(X_i, X_j)
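Identity (c) can be sanity-checked numerically for p1 = 2; the joint p.m.f. and the constants below are made up for illustration:

```python
# Check Result 2(c): Var(a1 X1 + a2 X2)
#   = a1^2 Var(X1) + a2^2 Var(X2) + 2 a1 a2 Cov(X1, X2).
# Joint p.m.f. and constants are illustrative assumptions.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
a1, a2 = 2.0, -3.0

def E(g):
    return sum(g(x1, x2) * p for (x1, x2), p in pmf.items())

# Left side: variance of the linear combination, computed directly.
var_lhs = (E(lambda x1, x2: (a1 * x1 + a2 * x2) ** 2)
           - E(lambda x1, x2: a1 * x1 + a2 * x2) ** 2)

# Right side: assembled from the marginal variances and the covariance.
var1 = E(lambda x1, x2: x1 ** 2) - E(lambda x1, x2: x1) ** 2
var2 = E(lambda x1, x2: x2 ** 2) - E(lambda x1, x2: x2) ** 2
cov12 = E(lambda x1, x2: x1 * x2) - E(lambda x1, x2: x1) * E(lambda x1, x2: x2)
var_rhs = a1 ** 2 * var1 + a2 ** 2 * var2 + 2 * a1 * a2 * cov12

print(var_lhs, var_rhs)  # the two sides agree (up to rounding)
```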

Proof. For the A.C. case:

(a)
E(Σ_{i=1}^{p1} a_i X_i) = ∫_{R^{p1}} (Σ_{i=1}^{p1} a_i x_i) f_X(x) dx
                        = Σ_{i=1}^{p1} a_i ∫_{R^{p1}} x_i f_X(x) dx
                        = Σ_{i=1}^{p1} a_i E(X_i).
(b)
Σ_{i=1}^{p1} a_i X_i − E(Σ_{i=1}^{p1} a_i X_i) = Σ_{i=1}^{p1} a_i (X_i − E(X_i)),

and similarly,

Σ_{j=1}^{p2} b_j Y_j − E(Σ_{j=1}^{p2} b_j Y_j) = Σ_{j=1}^{p2} b_j (Y_j − E(Y_j)).



Then,
Cov(Σ_{i=1}^{p1} a_i X_i, Σ_{j=1}^{p2} b_j Y_j) = E[Σ_{i=1}^{p1} Σ_{j=1}^{p2} a_i b_j (X_i − E(X_i))(Y_j − E(Y_j))]
 = Σ_{i=1}^{p1} Σ_{j=1}^{p2} a_i b_j E[(X_i − E(X_i))(Y_j − E(Y_j))]
 = Σ_{i=1}^{p1} Σ_{j=1}^{p2} a_i b_j Cov(X_i, Y_j).

(c)
Var(Σ_{i=1}^{p1} a_i X_i) = Cov(Σ_{i=1}^{p1} a_i X_i, Σ_{i=1}^{p1} a_i X_i)
                          = Σ_{i=1}^{p1} Σ_{j=1}^{p1} a_i a_j Cov(X_i, X_j)



 = Σ_{i=1}^{p1} a_i² Cov(X_i, X_i) + Σ_{i≠j} a_i a_j Cov(X_i, X_j)
 = Σ_{i=1}^{p1} a_i² Var(X_i) + Σ_{i≠j} a_i a_j Cov(X_i, X_j)
 = Σ_{i=1}^{p1} a_i² Var(X_i) + 2 Σ_{1≤i<j≤p1} a_i a_j Cov(X_i, X_j),

since Cov(X_i, X_j) = Cov(X_j, X_i) for i ≠ j.
Result 3: Let X_1, . . . , X_p be independent random vectors, where X_i is
r_i-dimensional, i = 1, . . . , p.

(i) Let ψ_i : R^{r_i} → R, i = 1, . . . , p, be given functions. Then

E(Π_{i=1}^{p} ψ_i(X_i)) = Π_{i=1}^{p} E(ψ_i(X_i)),

provided the involved expectations exist.
(ii) For A_i ⊆ R^{r_i}, i = 1, . . . , p,

P({X_i ∈ A_i, i = 1, . . . , p}) = Π_{i=1}^{p} P({X_i ∈ A_i}).

Proof. Let r = Σ_{i=1}^{p} r_i and X = (X_1, . . . , X_p). By independence, we have

f_X(x_1, . . . , x_p) = Π_{i=1}^{p} f_{X_i}(x_i), x_i ∈ R^{r_i}, i = 1, . . . , p.

(i)
E(Π_{i=1}^{p} ψ_i(X_i)) = ∫_{R^r} (Π_{i=1}^{p} ψ_i(x_i)) f_X(x) dx
 = ∫_{R^{r_1}} . . . ∫_{R^{r_p}} (Π_{i=1}^{p} ψ_i(x_i)) (Π_{i=1}^{p} f_{X_i}(x_i)) dx_p . . . dx_1
 = ∫_{R^{r_1}} . . . ∫_{R^{r_p}} Π_{i=1}^{p} (ψ_i(x_i) f_{X_i}(x_i)) dx_p . . . dx_1
 = Π_{i=1}^{p} ∫_{R^{r_i}} ψ_i(x_i) f_{X_i}(x_i) dx_i
 = Π_{i=1}^{p} E(ψ_i(X_i)).

(ii) Follows from (i) by taking

ψ_i(x_i) = 1, if x_i ∈ A_i, and 0, otherwise, i = 1, . . . , p.
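A quick numeric illustration of (i) for two independent discrete r.v.s; the marginal p.m.f.s below are made up, and the joint p.m.f. is built as their product, which is exactly the independence used in the proof:

```python
# Result 3(i) for two independent discrete r.v.s:
# E(psi1(X1) psi2(X2)) = E(psi1(X1)) * E(psi2(X2)).
# Marginal p.m.f.s are illustrative assumptions.
pmf1 = {0: 0.5, 1: 0.5}      # marginal p.m.f. of X1
pmf2 = {-1: 0.3, 2: 0.7}     # marginal p.m.f. of X2
joint = {(x1, x2): p1 * p2   # independence: joint p.m.f. = product of marginals
         for x1, p1 in pmf1.items() for x2, p2 in pmf2.items()}

psi1 = lambda x: x ** 2
psi2 = lambda x: x + 1

lhs = sum(psi1(x1) * psi2(x2) * p for (x1, x2), p in joint.items())
rhs = (sum(psi1(x) * p for x, p in pmf1.items())
       * sum(psi2(x) * p for x, p in pmf2.items()))
print(lhs, rhs)  # equal (up to rounding)
```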



Result 4 (Cauchy–Schwarz inequality for random variables):
Let (X, Y) be a bivariate r.v. Then, provided the involved expectations are
finite,
(a)
(E(XY))² ≤ E(X²)E(Y²).  (1)
The equality is attained iff P({Y = cX}) = 1 (or P({X = cY}) = 1),
for some real constant c.
(b) Let E(X) = µ_X ∈ (−∞, ∞), E(Y) = µ_Y ∈ (−∞, ∞),
Var(X) = σ_X² ∈ (0, ∞) and Var(Y) = σ_Y² ∈ (0, ∞). Then
−1 ≤ ρ(X, Y) ≤ 1
and
ρ(X, Y) = ±1 ⇔ (X − µ_X)/σ_X = ±(Y − µ_Y)/σ_Y,
with probability one.
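Part (b) can be illustrated numerically: for any distribution of X (a made-up one below), the correlation between X and an affine function of X is exactly ±1, while other choices stay inside [−1, 1]:

```python
import math

# Illustration of Result 4(b): |rho| <= 1, with |rho| = 1 exactly when one
# variable is an affine function of the other. The p.m.f. of X is made up.
pmf = {-1: 0.25, 0: 0.5, 2: 0.25}

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

def rho(f, g):
    """Correlation between f(X) and g(X), per Definition 1(b)."""
    mf, mg = E(f), E(g)
    cov = E(lambda x: (f(x) - mf) * (g(x) - mg))
    return cov / math.sqrt(E(lambda x: (f(x) - mf) ** 2)
                           * E(lambda x: (g(x) - mg) ** 2))

print(rho(lambda x: x, lambda x: 3 * x - 5))         # 1.0  (Y = 3X - 5, slope > 0)
print(rho(lambda x: x, lambda x: -2 * x + 1))        # -1.0 (negative slope)
print(abs(rho(lambda x: x, lambda x: x ** 2)) <= 1)  # True
```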
Proof.
(a) Consider the following two cases:
Case 1: E (X 2 ) = 0.
In this case P({X = 0}) = 1 and therefore P({XY = 0}) = 1.
It follows that E (XY ) = 0, E (X ) = 0, P(X = cY ) = 1 (for c = 0)
and the equality in (1) is attained.
Case 2: E (X 2 ) > 0.
Then
0 ≤ E((Y − λX)²) = λ²E(X²) − 2λE(XY) + E(Y²), ∀ λ ∈ R.
Since this quadratic in λ is non-negative for every λ, its discriminant must be
non-positive, i.e.,
4(E(XY))² − 4E(X²)E(Y²) ≤ 0
⇒ (E(XY))² ≤ E(X²)E(Y²),
with equality iff the discriminant is zero, i.e., iff the quadratic has a real root c with

E((Y − cX)²) = 0, for some c ∈ R

⇔ P(Y = cX) = 1, for some c ∈ R.

(b) Let Z_1 = (X − µ_X)/σ_X and Z_2 = (Y − µ_Y)/σ_Y, so that E(Z_1) = E(Z_2) = 0,
Var(Z_1) = E(Z_1²), Var(Z_2) = E(Z_2²), Var(Z_1) = Var(Z_2) = 1, and

E(Z_1 Z_2) = E[((X − µ_X)/σ_X)((Y − µ_Y)/σ_Y)]
           = E((X − µ_X)(Y − µ_Y))/(σ_X σ_Y)
           = ρ(X, Y).



By the Cauchy–Schwarz inequality,

(E(Z_1 Z_2))² ≤ E(Z_1²) E(Z_2²)

⇔ (ρ(X, Y))² ≤ 1.

By (a), equality is attained iff

P({Z_1 = c Z_2}) = 1, for some c ∈ R

⇔ P({(X − µ_X)/σ_X = c (Y − µ_Y)/σ_Y}) = 1, for some c ∈ R.

Since Var((X − µ_X)/σ_X) = Var((Y − µ_Y)/σ_Y) = 1, we have c² = 1, i.e., c = ±1.



Take Home Problem

Let (X , Y ) be a bivariate discrete r.v. with p.m.f. given by:

(x, y)     (−1, 1)   (0, 0)   (1, 1)
f(x, y)      p1        p2       p1

where pi ∈ (0, 1), i = 1, 2 and 2p1 + p2 = 1.


(a) Find ρ(X , Y );

(b) Are X and Y independent?



Abstract of Next Module

We will discuss the concept of conditional expectation of A.C. r.v.s.



Thank you for your patience

