Information Theory
“I certify that I have neither received nor given unpermitted aid on this examination and
that I have reported all such incidents observed by me in which unpermitted aid is given.”
SOLUTIONS
Signature
Name Student ID
Question 1: In each of the following statements, all the random variables are discrete.
A statement will be considered true if it is true under all conditions. State whether the
following are true or false by circling the correct choice (Justification NOT required).
1. T / F H(X) > 0
False. If Pr(X = x0) = 1 for some x0, then H(X) = 0, so strict positivity fails.
2. T / F H(X|X) = 0
True. H(X|X = x) = 0 for every x, so the average over x is 0 as well.
5. T / F I(X; Y) ≤ H(Y)
True. I(X; Y) = H(Y) − H(Y|X) ≤ H(Y), since H(Y|X) ≥ 0 for discrete random variables.
7. T / F H(X|Y) ≥ H(X|Y, Z)
True. Information never hurts: conditioning on additional variables never increases entropy.
9. T / F If X and Y are independent, then H(X|Y) = H(Y|X)
False. If X and Y are independent, then H(X|Y) = H(X) and H(Y|X) = H(Y), which are not equal in general.
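A quick numerical check of statement 9 (a sketch; the distributions below are illustrative, not from the exam): for independent X and Y with alphabets of different sizes, H(X|Y) = H(X) and H(Y|X) = H(Y), and these come out different.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical independent pair: X uniform on 2 values, Y uniform on 4 values.
# Independence gives H(X|Y) = H(X) and H(Y|X) = H(Y).
H_X = entropy([0.5, 0.5])   # 1 bit
H_Y = entropy([0.25] * 4)   # 2 bits
print(H_X, H_Y)  # 1.0 2.0 -> H(X|Y) and H(Y|X) differ here
```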
Question 2: Let X be a random variable taking six possible values.
(a) What is the maximum entropy the random variable X can possess?
Solution: log 6 (where the log is taken in base 2), i.e., about 2.585 bits.
(b) For what probability distribution will X have the entropy of part (a)?
Solution: The uniform distribution, i.e., X takes all its values with equal probability.
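Parts (a) and (b) can be sketched numerically (assuming, as above, a six-value alphabet): the entropy of the uniform distribution matches log2 6.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Uniform distribution over the 6 values attains the maximum H(X) = log2(6).
uniform = [1 / 6] * 6
print(entropy(uniform), math.log2(6))  # both are about 2.585 bits
```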
(c) Let us assume that X has a probability distribution for which a corresponding entropy can be found. If the probability mass function values can only be powers of 2, what is the probability mass function of X?
Solution: One possible solution is (1/2, 1/4, 1/8, 1/16, 1/32, 1/32). Another possibility is (1/4, 1/4, 1/8, 1/8, 1/8, 1/8).
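Neither pmf is uniform, so their entropies fall below log 6. A short check (illustrative, not part of the exam) computes both:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

pmf1 = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
pmf2 = [1/4, 1/4, 1/8, 1/8, 1/8, 1/8]
print(entropy(pmf1))  # 1.9375 bits
print(entropy(pmf2))  # 2.5 bits
```

Because every probability is a power of 2 (a dyadic pmf), each entropy is an exact multiple of 1/16 bit.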
(d) Write a code to describe X that achieves the entropy for the probability mass function given in part (c).
Solution: For the first possible solution, the codewords (0, 10, 110, 1110, 11110, 11111) may be assigned to the values of X in order of decreasing probability, so the shortest codeword corresponds to the most likely value.
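The assignment above can be verified directly: the code is prefix-free, and because the pmf is dyadic its expected codeword length equals the entropy. A minimal sketch:

```python
code = ["0", "10", "110", "1110", "11110", "11111"]
pmf = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]

# Prefix-free: no codeword is a prefix of any other codeword.
prefix_free = all(
    not b.startswith(a)
    for i, a in enumerate(code)
    for j, b in enumerate(code)
    if i != j
)

# Expected codeword length equals the entropy for a dyadic pmf.
avg_len = sum(p * len(c) for p, c in zip(pmf, code))
print(prefix_free, avg_len)  # True 1.9375
```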