
CMPE683: Spring 2007-2008 Handout # 5

Information Theory

Quiz Number 1 – Solutions


Closed Book; Closed Notes; Time Given = 20 minutes
April 3, 2008

“I certify that I have neither received nor given unpermitted aid on this examination and
that I have reported all such incidents observed by me in which unpermitted aid is given.”

SOLUTIONS
Signature

Name Student ID

Question 1: In each of the following statements, all the random variables are discrete.
A statement will be considered true if it is true under all conditions. State whether the
following are true or false by circling the correct choice (Justification NOT required).

1. T / F   H(X) > 0
False. If Pr(X = x0) = 1 for some x0, then H(X) = 0.

2. T / F   H(X|X) = 0
True. H(X|X = x) = 0 for every x.

3. T / F   H(X) > H(X|Y)
False. When X and Y are independent, H(X) = H(X|Y), so the strict inequality fails.

4. T / F   I(X; Y) ≥ I(X; Y|Z)
False. See problem 10, chapter 2 of the book (first edition), also assigned in homework. A standard counterexample: let X and Y be independent fair bits and Z = X ⊕ Y; then I(X; Y) = 0 but I(X; Y|Z) = 1 bit (a numeric check appears after item 10 below).

5. T / F   I(X; Y) ≤ H(Y)
True. I(X; Y) = H(Y) − H(Y|X) ≤ H(Y), since H(Y|X) ≥ 0.

6. T / F   I(X; Y) = H(X) − H(Y|X)
False. The correct identity is I(X; Y) = H(Y) − H(Y|X) = H(X) − H(X|Y).

7. T / F   H(X|Y) ≥ H(X|Y, Z)
True. Information never hurts: conditioning never increases entropy.

8. T / F   H(X, Y, Z) ≤ H(X) + H(Y) + H(Z)
True. By the chain rule, H(X, Y, Z) = H(X) + H(Y|X) + H(Z|X, Y), and conditioning never increases entropy.

9. T / F   If X and Y are independent, then H(X|Y) = H(Y|X)
False. If X and Y are independent, then H(X|Y) = H(X) and H(Y|X) = H(Y), which are not equal in general.

10. T / F   If X and Y are not independent, then I(X; Y) > 0
True. I(X; Y) ≥ 0, with equality if and only if X and Y are independent.
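
As a numeric sanity check of the counterexample in statement 4, here is a minimal Python sketch computing I(X; Y) and I(X; Y|Z) from the joint pmf, assuming X and Y are independent fair bits and Z = X ⊕ Y; the helpers H and marginal are ad hoc, not library functions.

```python
import math

# Counterexample for statement 4: X, Y i.i.d. fair bits, Z = X XOR Y.
# Then I(X;Y) = 0 but I(X;Y|Z) = 1 bit, so I(X;Y) >= I(X;Y|Z) fails.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def H(pmf):
    """Entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, keep):
    """Marginalize the joint pmf onto the coordinates in `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I_xy = (H(marginal(joint, (0,))) + H(marginal(joint, (1,)))
        - H(marginal(joint, (0, 1))))

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
I_xy_given_z = (H(marginal(joint, (0, 2))) + H(marginal(joint, (1, 2)))
                - H(joint) - H(marginal(joint, (2,))))

print(I_xy, I_xy_given_z)  # prints 0.0 and 1.0
```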

Question 2: A random variable X can take six possible values.

(a) What is the maximum entropy the random variable X can possess?
Solution: log 6 ≈ 2.585 bits (where log is taken to base 2).

(b) For what probability distribution will X have the entropy of part (a)?
Solution: The uniform distribution, i.e., X takes all six values with equal probability.
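
As a quick numerical check of parts (a) and (b), a minimal Python sketch (the helper H is ad hoc) confirming that the uniform pmf on six values attains log 6 bits and that an arbitrary pmf does not exceed it:

```python
import math
import random

def H(pmf):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# (a)/(b): the uniform pmf on six values attains the maximum entropy log2(6).
uniform = [1 / 6] * 6
print(H(uniform), math.log2(6))  # both ~2.585 bits

# Any other pmf on six values has entropy at most log2(6).
w = [random.random() for _ in range(6)]
arbitrary = [x / sum(w) for x in w]
assert H(arbitrary) <= math.log2(6) + 1e-12
```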

(c) Let us assume that X has a probability distribution for which a code achieving the
entropy can be found. If the probability mass function values can only be powers of 2
(i.e., of the form 2^(-k)), what is the probability mass function of X?
Solution: One possible solution is (1/2, 1/4, 1/8, 1/16, 1/32, 1/32). Another possibility is (1/4, 1/4, 1/8, 1/8, 1/8, 1/8).

(d) Write a code to describe X which achieves the entropy for the probability mass
function given in part (c).
Solution: For the first pmf above, the codewords (0, 10, 110, 1110, 11110, 11111) may be
assigned to the values of X in order of decreasing probability, so that shorter codewords
go to more likely values.
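
A small Python sketch verifying this claim: for the first pmf of part (c), each codeword length equals −log2 p(x), so the expected code length coincides with the entropy (1.9375 bits). The pmf and code listing are taken from the solution above.

```python
import math

# First dyadic pmf from part (c) and the matching prefix code from part (d),
# both listed in order of decreasing probability.
pmf = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
code = ["0", "10", "110", "1110", "11110", "11111"]

# For a dyadic pmf, len(codeword) = -log2 p, so E[length] = H(X).
for p, c in zip(pmf, code):
    assert len(c) == -math.log2(p)

expected_length = sum(p * len(c) for p, c in zip(pmf, code))
entropy = -sum(p * math.log2(p) for p in pmf)
print(expected_length, entropy)  # both equal 1.9375 bits
```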

Question 3: Consider a discrete random variable X ∼ p(x), x = 1, 2, . . . , m. We are
given a set S ⊂ {1, 2, . . . , m}. We ask whether X ∈ S and, based on the answer, define a
random variable Y such that:

Y = 1 if X ∈ S, and Y = 0 if X ∉ S.

Suppose Pr{X ∈ S} = p. Find the decrease in uncertainty H(X) − H(X|Y) in terms of p.


Solution: By intuition, this should be H(p). To see this,

H(X) − H(X|Y) = I(X; Y) = H(Y) − H(Y|X)
              = H(p) − H(Y|X)
              = H(p),

since Y ∼ Bernoulli(p) gives H(Y) = H(p), and Y is a deterministic function of X, so
H(Y|X) = 0.
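
A numerical sketch of this identity, using an arbitrary example pmf and set S (both chosen here for illustration, not taken from the quiz):

```python
import math

def H(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example pmf on {1, ..., 5} and a set S (both arbitrary choices).
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.25, 5: 0.15}
S = {2, 4}
p = sum(pmf[x] for x in S)  # Pr{X in S}

# H(X|Y) = p * H(X | Y=1) + (1-p) * H(X | Y=0), with conditional pmfs
# obtained by restricting to S (or its complement) and renormalizing.
H_given_1 = H([pmf[x] / p for x in pmf if x in S])
H_given_0 = H([pmf[x] / (1 - p) for x in pmf if x not in S])
H_X_given_Y = p * H_given_1 + (1 - p) * H_given_0

decrease = H(list(pmf.values())) - H_X_given_Y
binary_entropy = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
print(decrease, binary_entropy)  # both equal H(p), ~0.9928 bits here
```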
