
E2 202 (Aug–Dec 2015)

Homework Assignment 3
Discussion: Friday, Sept. 4

1. Basic properties of variance.


(a) Verify that $\mathrm{Var}(aX) = a^2\,\mathrm{Var}(X)$ for any $a \in \mathbb{R}$.
(b) Show that $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$.
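A minimal Monte Carlo sanity check of the two identities above (not part of the assignment), assuming NumPy is available; the distributions and the constant a below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, a = 1_000_000, 3.0                   # a is a hypothetical constant, any real works
X = rng.exponential(2.0, N)             # any distribution with finite variance works
Y = 0.5 * X + rng.normal(0.0, 1.0, N)   # correlated with X so that Cov(X, Y) != 0

cov = np.cov(X, Y, ddof=0)              # 2x2 sample covariance matrix (ddof=0 to match np.var)

# (a) Var(aX) = a^2 Var(X)
print(np.var(a * X), a**2 * np.var(X))

# (b) Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)  -- an exact algebraic identity
#     for sample moments as well, so the two printed numbers coincide
print(np.var(X + Y), np.var(X) + np.var(Y) + 2 * cov[0, 1])
```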
2. Moment generating functions.
(a) Determine the moment generating function of a random variable $X$ having a $\mathrm{Geom}(p)$ distribution, i.e.,
$$P(X = k) = (1-p)^{k-1} p \quad \text{for } k = 1, 2, 3, \ldots$$
Use the MGF to compute the first two moments of $X$.
(b) Verify that for an r.v. $X$ taking values in $\{\pm 1, \pm 2, \pm 3, \ldots\}$, with
$$P(X = k) = \frac{3}{\pi^2} \cdot \frac{1}{k^2}, \quad \text{for } k = \pm 1, \pm 2, \pm 3, \ldots,$$
the moment generating function $M_X(t)$ is not finite for $t \neq 0$.
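A short symbolic/numeric sketch for this problem (not part of the assignment), assuming SymPy: it starts from the closed form of the $\mathrm{Geom}(p)$ MGF obtained by summing the geometric series (valid for $e^t(1-p) < 1$) and reads off the first two moments, then illustrates why the series in part (b) cannot converge for $t \neq 0$.

```python
import sympy as sp

p, t = sp.symbols('p t', positive=True)

# Closed form of the Geom(p) MGF, from summing p * e^t * (e^t (1-p))^{k-1} over k >= 1
M = p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))

EX  = sp.simplify(sp.diff(M, t, 1).subs(t, 0))  # first moment,  equals 1/p
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # second moment, equals (2 - p)/p^2
print(EX, EX2)

# Part (b): whenever t != 0, the terms e^{tk}/k^2 on the side where t*k > 0 grow
# without bound, so the series defining M_X(t) cannot converge; e.g. for t = 0.1:
import math
print([math.exp(0.1 * k) / k**2 for k in (10, 100, 1000)])
```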


3. Characteristic functions.
(a) Let $\Phi_X$ denote the characteristic function of a random variable $X$. Let $W = a + X$ and $Y = aX$ for some constant $a \in \mathbb{R}$. Express the characteristic functions $\Phi_W$ and $\Phi_Y$ in terms of $\Phi_X$.
(b) Recall that the pdf of a Gaussian random variable $Z$ with mean 0 and variance 1 is given by
$$f_Z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2} z^2}, \quad z \in \mathbb{R}.$$
Compute the characteristic function of $Z$.


(c) From (a) and (b), deduce the characteristic function of a Gaussian r.v. with mean $\mu$ and variance $\sigma^2$.
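A quick Monte Carlo cross-check for this problem (not part of the assignment), assuming NumPy; the values a = 3 and t = 0.7 below are arbitrary. It compares the empirical characteristic function of a standard Gaussian with the closed form $e^{-t^2/2}$, and checks the standard shift and scale rules $\Phi_{a+X}(t) = e^{iat}\,\Phi_X(t)$ and $\Phi_{aX}(t) = \Phi_X(at)$.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal(200_000)
a, t = 3.0, 0.7                                    # hypothetical values for illustration

# Empirical characteristic function: sample average of e^{itX}
cf = lambda samples, t: np.mean(np.exp(1j * t * samples))

print(cf(Z, t), np.exp(-t**2 / 2))                 # Phi_Z(t) vs. closed form
print(cf(a + Z, t), np.exp(1j * a * t) * cf(Z, t)) # shift rule, two estimates agree
print(cf(a * Z, t), cf(Z, a * t))                  # scale rule, two estimates agree
```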
4. Let $X_1, X_2, \ldots, X_n$ be $n$ independent random variables, with each $X_i$ satisfying $P(X_i = +1) = P(X_i = -1) = 1/2$.
(a) Determine the mean and variance of $X = \frac{1}{n} \sum_{i=1}^{n} X_i$.

(b) Use Chebyshev's inequality to bound the probability $P(|X| > \epsilon)$ for $\epsilon > 0$.
(c) Use the Chernoff bounding technique to show that for $0 < \epsilon < 1$,
$$P(X > \epsilon) \le 2^{-n\left[1 - h\left(\frac{1+\epsilon}{2}\right)\right]},$$
where $h(\cdot)$ is the binary entropy function defined by $h(x) = -x \log_2(x) - (1-x) \log_2(1-x)$ for $x \in [0, 1]$. Hence, by symmetry,
$$P(|X| > \epsilon) = 2\, P(X > \epsilon) \le 2 \cdot 2^{-n\left[1 - h\left(\frac{1+\epsilon}{2}\right)\right]}.$$
The point of this exercise is that the Chernoff bound yields an exponential decay in $n$ for the tail probability of $X$, while Chebyshev's inequality only yields a $1/n$ decay.
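A small simulation sketch of this comparison (not part of the assignment), assuming NumPy; $\epsilon = 0.2$ and the trial count are arbitrary choices. It compares the empirical tail probability $P(|X| > \epsilon)$ with the Chebyshev bound $1/(n\epsilon^2)$ (since $\mathrm{Var}(X) = 1/n$) and the Chernoff bound $2 \cdot 2^{-n[1 - h((1+\epsilon)/2)]}$.

```python
import numpy as np

def h(x):
    """Binary entropy in bits."""
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

eps, trials = 0.2, 200_000                 # hypothetical parameters for illustration
rng = np.random.default_rng(0)

for n in (10, 50, 100, 200):
    heads = rng.binomial(n, 0.5, size=trials)   # number of +1's among X_1, ..., X_n
    X = (2 * heads - n) / n                     # X = (1/n) * sum_i X_i
    empirical = np.mean(np.abs(X) > eps)
    chebyshev = 1.0 / (n * eps**2)                        # Var(X)/eps^2 = 1/(n eps^2)
    chernoff  = 2 * 2.0 ** (-n * (1 - h((1 + eps) / 2)))  # 2 * 2^{-n[1 - h((1+eps)/2)]}
    print(f"n={n:4d}  empirical={empirical:.2e}  "
          f"Chebyshev={chebyshev:.2e}  Chernoff={chernoff:.2e}")
```

As $n$ grows, the Chernoff column shrinks exponentially while the Chebyshev column only shrinks like $1/n$, which is the contrast the exercise is meant to highlight.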
