
EE 761 Question Bank Autumn 2019

1. Let X be an Exponential random variable with parameter 1. Plot the bounds on P(X > a) obtained using Markov's inequality, Chebyshev's inequality and the Chernoff bound, as a function of a. Also plot the actual value of P(X > a) on the same plot. Repeat these steps for a Geometric random variable with parameter 0.5.
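For the exponential case the three bounds have closed forms (Markov: 1/a; Chebyshev: 1/(a − 1)² for a > 1; optimised Chernoff: a·e^(1−a), using the minimiser s = 1 − 1/a of e^(−sa)/(1 − s)), so the plot reduces to evaluating them on a grid. A minimal sketch, assuming numpy is available; the grid endpoints are arbitrary choices:

```python
import numpy as np

a = np.linspace(1.5, 10.0, 200)       # grid of thresholds; a > 1 keeps all forms below valid
exact     = np.exp(-a)                # P(X > a) for X ~ Exp(1)
markov    = 1.0 / a                   # Markov: E[X]/a with E[X] = 1
chebyshev = 1.0 / (a - 1.0) ** 2      # Chebyshev: Var(X)/(a - E[X])^2 with Var(X) = 1
chernoff  = a * np.exp(1.0 - a)       # Chernoff: inf_{0<s<1} e^{-sa}/(1-s) at s = 1 - 1/a

# Sanity check: every bound must lie above the true tail probability
assert np.all(exact <= markov)
assert np.all(exact <= chebyshev)
assert np.all(exact <= chernoff)
```

The matplotlib call would then be, e.g., `plt.semilogy(a, curve)` for each of the four arrays; the geometric case follows the same template with P(X > a) = 0.5^⌊a⌋ and the MGF of a Geometric(0.5) variable.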
2. Let Z = Σ_{i=1}^n Xi, where the Xi s are i.i.d. with Xi ∼ Ber(p). Show that, ∀ ε > 0,

   (a) P(Z > (1 + ε)E[Z]) ≤ ( e^ε / (1 + ε)^(1+ε) )^E[Z],
   (b) P(Z < (1 − ε)E[Z]) ≤ ( e^(−ε) / (1 − ε)^(1−ε) )^E[Z],
   (c) P(Z > (1 + ε)E[Z]) ≤ exp( −ε² E[Z] / 3 ),
   (d) P(Z < (1 − ε)E[Z]) ≤ exp( −ε² E[Z] / 2 ).
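Bounds (c) and (d) can be spot-checked by Monte Carlo, since Z is simply a Binomial(n, p) variable. A sketch assuming numpy; the parameters n = 200, p = 0.3, ε = 0.5 are arbitrary illustrations, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, eps = 200, 0.3, 0.5
mu = n * p                               # E[Z]
Z = rng.binomial(n, p, size=200_000)     # samples of Z = sum of n i.i.d. Ber(p)

emp_upper = np.mean(Z > (1 + eps) * mu)  # empirical P(Z > (1+eps)E[Z])
emp_lower = np.mean(Z < (1 - eps) * mu)  # empirical P(Z < (1-eps)E[Z])
bound_c = np.exp(-eps**2 * mu / 3)       # bound (c)
bound_d = np.exp(-eps**2 * mu / 2)       # bound (d)

# The empirical frequencies must sit below the claimed bounds
assert emp_upper <= bound_c and emp_lower <= bound_d
```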
3. Let Z = Σ_{i=1}^n Xi, where the Xi s are i.i.d. Bernoulli random variables. Let µL ≤ E[Z] ≤ µH. Show that

   (a) For any t > 0,
       P(Z > µH + t) ≤ exp(−2t²/n)   and   P(Z < µL − t) ≤ exp(−2t²/n).
   (b) For 0 < ε < 1,
       P(Z > (1 + ε)µH) ≤ exp(−ε² µH / 3)   and   P(Z < (1 − ε)µL) ≤ exp(−ε² µL / 2).
4. Let Z = Σ_{i=1}^n Xi, where the Xi s are i.i.d. with Xi ∼ Ber(p). Show that

   P(Z ≥ na) ≤ exp( −n [ (1 − a) log((1 − a)/(1 − p)) + a log(a/p) ] ).

5. Let Y be a Poisson random variable with parameter 1.

   (i) Compute the moment generating function of Y.
   (ii) Show that, ∀ t > 0,
        P(Y ≥ t) ≤ exp(−t log t + t − 1).
6. Let W be a Gaussian random variable with mean 0 and variance σ². Show that

   (i) P(W ≥ t) ≤ exp( −t² / (2σ²) ),
   (ii) sup_{t>0} P(W ≥ t) exp( t² / (2σ²) ) = 1/2, i.e., the bound obtained in Part (i) is tight, up to a constant.


7. Prove that ∀ u ≥ 0,

   (1 + u) log(1 + u) − u ≥ u² / (2 + 2u/3).

   (This completes the proof of Bernstein's inequality.)


8. Prove the following statements.
   (i) If X̄ = (X1, ..., Xm) and Ȳ = (Y1, ..., Yn) are each negatively associated and mutually independent of each other, then (X1, ..., Xm, Y1, ..., Yn) is negatively associated.
   (ii) Non-increasing or non-decreasing functions of disjoint subsets of negatively associated random variables are also negatively associated; i.e., if I1, ..., Ik ⊆ [n] are disjoint, and the functions hj : R^|Ij| → R with Yj = hj(Xi, i ∈ Ij) are all non-increasing (or all non-decreasing) for j = 1, 2, ..., k, then (Y1, ..., Yk) is negatively associated.
9. Let Xi, i ∈ [n], be random variables such that for any subset I ⊆ [n] and any t > 0, the distribution of (Xi, i ∈ I), conditioned on Σ_{i∈I} Xi = t, is conditionally independent of the remaining variables and stochastically increasing in t; that is, for any non-decreasing f, E[f(Xi, i ∈ I) | Σ_{i∈I} Xi = t] is increasing in t. Further, suppose the distribution of (Xi, i ∈ [n]) is concentrated on the event Σ_{i∈[n]} Xi = c for some constant c; that is, these are the only points with non-zero probability mass. Prove that Xi, i ∈ [n], are negatively associated.

10. Let n points be thrown uniformly at random on the unit circle. This splits the unit circle into n arcs, which we can number 1, ..., n in counterclockwise order starting from an arbitrary point. Let li be the length of arc i and let Zi = 1 if li ≥ c/n and 0 otherwise.
    (i) Prove that the li s are negatively associated.
    (ii) Prove that the Zi s are negatively associated.

11. Prove that the sum of sub-Gaussian random variables is sub-Gaussian.


12. Prove that the sum of sub-Exponential random variables is sub-Exponential.
13. Suppose we throw m balls into n bins independently and uniformly at random. Let M be the maximum number of balls in any bin. Prove that

    E[M] ≤ (m/n) exp( 1 + W( (log n − m/n) / (e · m/n) ) ),

    where the Lambert W function is defined over [−1/e, ∞) by the equation W(x) exp(W(x)) = x.
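The Lambert W bound can be sanity-checked numerically: `scipy.special.lambertw` evaluates W on the principal branch, and a multinomial draw simulates the bin loads. The values m = 1000, n = 100 below are an arbitrary illustration:

```python
import numpy as np
from scipy.special import lambertw

rng = np.random.default_rng(1)
m, n, trials = 1000, 100, 2000

# Monte Carlo estimate of E[M], the expected maximum bin load
loads = rng.multinomial(m, np.ones(n) / n, size=trials)
emp_EM = loads.max(axis=1).mean()

# Bound: E[M] <= (m/n) * exp(1 + W((log n - m/n) / (e * m/n)))
r = m / n
bound = r * np.exp(1 + lambertw((np.log(n) - r) / (np.e * r)).real)

assert emp_EM <= bound   # the simulated mean max load respects the bound
```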
14. Let X0 = 0 and for j ≥ 0 let Xj+1 be chosen uniformly over the real interval [Xj, 1]. Show that, for k ≥ 0, the sequence Yk = 2^k (1 − Xk) is a martingale.
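A useful observation here is that 1 − Xj+1 = (1 − Xj)·U with U uniform on [0, 1], so the martingale property forces E[Yk] = Y0 = 1 for every k. A simulation sketch of that consequence (the trial count and horizon are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
trials, K = 200_000, 10

X = np.zeros(trials)                      # X_0 = 0 in every trial
for _ in range(K):
    X = X + (1 - X) * rng.random(trials)  # X_{j+1} ~ Uniform[X_j, 1]

Y = 2**K * (1 - X)                        # Y_K = 2^K (1 - X_K)
assert abs(Y.mean() - 1.0) < 0.05         # martingale => E[Y_K] = Y_0 = 1
```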
15. Let Z1, Z2, ... be i.i.d. Gaussian random variables with mean zero and variance σ², and let Xn = Sn² − nσ², where Sn = Z1 + · · · + Zn. Prove that {Xn} forms a martingale sequence.
16. Suppose X1, X2, ... are i.i.d. N(0, 1) and let Zn = exp(tSn − nt²/2), where Sn = Σ_{i=1}^n Xi and t is a fixed real number. Prove that for any real t the sequence {Zn} forms a martingale.


17. Consider the matching problem: suppose N people, each wearing a hat, have gathered at a party, and at the end of the party the N hats are returned to them uniformly at random. Those that get their own hats back then leave the room. The remaining hats are redistributed among the remaining guests at random, and so on. The process continues until all the hats have been given away. Let Xn denote the number of guests present after the nth round of this hat-returning process. Prove that the sequence {Xn + n} forms a martingale.
18. Use the martingale stopping theorem to prove the following:

    Theorem 1 (Wald's Equation). Let X1, X2, ... be non-negative, independent, identically distributed random variables. Let T be a stopping time for this sequence. If T and the Xi s have bounded expectations, then

        E[ Σ_{i=1}^T Xi ] = E[T] E[X1].

19. Consider an i.i.d. sequence X1, X2, ... with a discrete distribution, uniform over the integers {1, 2, ..., 10}; P(X = i) = 1/10 for 1 ≤ i ≤ 10. Let τ be the first instance when Xτ = 6. Compute E[ Σ_{i=1}^τ Xi ].
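If Wald's equation (Problem 18) applies, the answer should be E[τ]·E[X1] = 10 × 5.5 = 55, since τ is geometric with success probability 1/10. A direct simulation sketch to corroborate that value:

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 20_000
totals = np.empty(trials)

for t in range(trials):
    s = 0
    while True:
        x = rng.integers(1, 11)   # uniform draw from {1, ..., 10}
        s += x
        if x == 6:                # tau = first time a 6 appears
            break
    totals[t] = s

# Wald: E[sum_{i<=tau} X_i] = E[tau] * E[X_1] = 10 * 5.5 = 55
assert abs(totals.mean() - 55.0) < 1.5
```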

20. A sequence of random variables X1 , X2 , .., Xn is called a supermartingale if for all k ≥ 1,

E[Xk |X1 , X2 , .., Xk−1 ] ≤ Xk−1 .

Prove that the Azuma-Hoeffding inequality is valid for supermartingales.


21. Let X = (X1 , X2 , .., Xn ) be a sequence of characters chosen independently and uniformly at random
from an alphabet Σ, where s = |Σ|. Let B = (B1 , B2 , ..Bk ) be a fixed string of k characters from Σ.
Let F be the number of occurrences of the string B in the random string X.
(i) Compute E[F ].
(ii) Obtain a concentration result for F .
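For part (i), linearity of expectation over the n − k + 1 window positions gives E[F] = (n − k + 1)/s^k. A simulation sketch; the alphabet size, string length and pattern below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(4)
n, s, trials = 20, 4, 20_000
B = np.array([0, 1, 2])                 # hypothetical fixed pattern, k = 3
k = len(B)

X = rng.integers(0, s, size=(trials, n))             # random strings over an s-letter alphabet
windows = np.stack([X[:, j:j + k] for j in range(n - k + 1)], axis=1)
F = (windows == B).all(axis=2).sum(axis=1)           # occurrences of B in each string

expected = (n - k + 1) / s**k                        # E[F] by linearity of expectation
assert abs(F.mean() - expected) < 0.02
```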
22. Consider a random graph G(V, E) with |V | = n and |E| = N chosen uniformly at random from all
graphs of n vertices with N edges. Let X be the number of isolated vertices (vertices with no edges)
in G.
(i) Compute E[X].
(ii) Obtain a concentration result for X.
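For a small instance, E[X] can be computed exactly by counting: a fixed vertex v is isolated iff all N edges avoid its n − 1 incident edges, so P(v isolated) = C(M − (n − 1), N)/C(M, N) with M = C(n, 2), and E[X] is n times that. A simulation sketch with arbitrary small parameters:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(7)
n, N, trials = 12, 15, 20_000

# All possible edges on n labelled vertices
edges = np.array([(i, j) for i in range(n) for j in range(i + 1, n)])
M = len(edges)

iso_counts = np.empty(trials)
for t in range(trials):
    chosen = edges[rng.choice(M, size=N, replace=False)]  # uniform N-edge graph
    deg = np.zeros(n, int)
    np.add.at(deg, chosen.ravel(), 1)                     # vertex degrees
    iso_counts[t] = (deg == 0).sum()                      # isolated vertices

expected = n * comb(M - (n - 1), N) / comb(M, N)          # exact E[X]
assert abs(iso_counts.mean() - expected) < 0.05
```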
23. Let X = (X1 , X2 , .., Xn ) and Y = (Y1 , Y2 , .., Yn ) be points in Rn such that every coordinate of X and
Y is drawn independently and uniformly at random from the interval [0, 1]. Let W = ||X − Y ||2 .
(i) Compute E[W ].
(ii) Obtain a concentration result for W .
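If ‖·‖2 here denotes the Euclidean norm, E[W] itself has no simple closed form, but E[W²] = n·E[(U − V)²] = n/6 is exact for independent U, V ∼ Uniform[0, 1], and W concentrates tightly around √(n/6). A simulation sketch (dimension and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 300, 10_000
X = rng.random((trials, n))
Y = rng.random((trials, n))
W = np.linalg.norm(X - Y, axis=1)        # Euclidean distance per trial

# E[W^2] = n * E[(U - V)^2] = n/6 by linearity over coordinates
assert abs((W**2).mean() - n / 6) < 0.5
# Concentration: the fluctuation of W is O(1) even though W itself is ~ sqrt(n/6)
assert np.std(W) < 0.5
```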
28. First toss a fair coin. Now for i ≥ 1, if the ith toss is heads, for the (i + 1)st toss, use a biased coin which
comes out heads with probability 2/3, else use a biased coin which comes out tails with probability
2/3. Let X be the number of heads in the first n tosses.
(i) Compute E[X].
(ii) Obtain a concentration result for X.
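The toss rule is a symmetric two-state Markov chain, and with a fair first toss the chain stays exactly at P(heads) = 1/2 at every step, so E[X] = n/2. A simulation sketch (n and the trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
n, trials = 50, 100_000

heads = np.zeros(trials)
state = rng.random(trials) < 0.5              # first toss: fair coin (True = heads)
heads += state
for _ in range(n - 1):
    p = np.where(state, 2/3, 1/3)             # P(next toss = heads) given current toss
    state = rng.random(trials) < p
    heads += state

# Symmetry keeps the marginal at 1/2 each step, so E[X] = n/2
assert abs(heads.mean() - n / 2) < 0.2
```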
29. Let X ∼ N(0, 1). Show that the following statements hold for all u > 0:

    (a) P(|X| ≥ u) ≤ exp(−u²/2).
    (b) P(|X| ≥ u) ≤ √(2/π) · (1/u) · exp(−u²/2).
    (c) P(|X| ≥ u) ≥ √(2/π) · (1/u) (1 − 1/u²) · exp(−u²/2).
    (d) P(|X| ≥ u) ≥ ( 1 − √(2/π) u ) · exp(−u²/2).
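These four bounds sandwich the true tail 2Φ̄(u), which `scipy.stats.norm` evaluates directly. A sketch checking (a)–(c) on a grid (the grid is an arbitrary choice; (d) is negative for u > √(π/2) and thus trivially valid there):

```python
import numpy as np
from scipy.stats import norm

u = np.linspace(1.2, 5.0, 100)
tail = 2 * norm.sf(u)                                    # P(|X| >= u) for X ~ N(0, 1)

upper_a = np.exp(-u**2 / 2)                              # bound (a)
upper_b = np.sqrt(2 / np.pi) / u * np.exp(-u**2 / 2)     # bound (b)
lower_c = np.sqrt(2 / np.pi) / u * (1 - 1 / u**2) * np.exp(-u**2 / 2)  # bound (c)

# The true tail must sit between the lower and upper bounds everywhere
assert np.all(lower_c <= tail)
assert np.all(tail <= upper_b)
assert np.all(tail <= upper_a)
```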
30. Let X ∼ SubG(σ²), i.e., E[e^(sX)] ≤ e^(σ²s²/2) for all s ∈ R. Show that:

    (a) P(|X| > t) ≤ 2 e^(−t²/(2σ²)).
    (b) Var(X) ≤ σ².
    (c) There exists c > 0 such that E[e^(cX²)] ≤ 2.
31. Show, without using Hoeffding's lemma, that a Rademacher variable (ε = +1 or −1, each with probability 1/2) is 1-sub-Gaussian.
32. Show that for any set of points in a right-angled triangle, there is a tour that starts at one endpoint of the hypotenuse, ends at the other endpoint and goes through all the points, such that the sum of the squares of the lengths of the edges is bounded by the square of the length of the hypotenuse.
    Use this to deduce that for any set of points in the unit square, there is a tour going through all the points such that the sum of squares of the lengths of the edges in the tour is bounded by 4.
33. (Jensen’s inequality for conditional expectation) Prove that for a convex function f such that E[|f (X)|] <
∞, E[f (X)|G] ≥ f (E[X|G]).
34. Compute a bound on the variance of the solution (length of the shortest tour) of the stochastic traveling
salesman problem. Use this bound and Chebyshev’s inequality to obtain a concentration result for the
length of the shortest tour. Compare this bound with the three other bounds discussed in class.
