1. (Ans. a)
2. The capacity of a binary symmetric channel with crossover probability P, where H(P) is the
binary entropy function, is
a) 1 - H(P)
b) H(P) – 1
c) 1 - H(P)²
d) H(P)² – 1
(Ans. a)
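Note: here C = 1 - H(P) with H(P) = -P log2 P - (1-P) log2 (1-P). At P = 0.5, H(P) = 1 and C = 0; at P = 0 or P = 1, C = 1 bit per channel use.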
3. According to the Shannon–Hartley theorem,
a) The channel capacity becomes infinite with infinite bandwidth
b) The channel capacity does not become infinite with infinite bandwidth
c) There is a tradeoff between bandwidth and signal-to-noise ratio
d) Both b and c are correct
(Ans. d)
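Note: the theorem gives C = B log2(1 + S/(N0·B)) for noise power spectral density N0. Letting B → ∞, C → (S/N0) log2 e ≈ 1.44 S/N0, which is finite, so (b) holds; and for a fixed C, bandwidth can be exchanged for signal-to-noise ratio, so (c) holds as well.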
5. The technique that may be used to increase average information per bit is
a) Shannon-Fano algorithm
b) ASK
c) FSK
d) Digital modulation techniques
(Ans. a)
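Note: Shannon-Fano coding raises the average information carried per transmitted bit by giving shorter codewords to more probable symbols. A minimal Python sketch, with an illustrative probability set (the function name and example values are my own, not from the question bank):

def shannon_fano(symbols):
    # symbols: list of (symbol, probability) pairs summing to 1
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    codes = {s: "" for s, _ in symbols}

    def split(group):
        # Stop when a group holds a single symbol.
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        acc, cut, best_diff = 0.0, 1, float("inf")
        # Choose the cut that best balances total probability.
        for i in range(1, len(group)):
            acc += group[i - 1][1]
            diff = abs(2 * acc - total)
            if diff < best_diff:
                best_diff, cut = diff, i
        # Prefix the two halves with 0 and 1, then recurse.
        for s, _ in group[:cut]:
            codes[s] += "0"
        for s, _ in group[cut:]:
            codes[s] += "1"
        split(group[:cut])
        split(group[cut:])

    split(symbols)
    return codes

print(shannon_fano([("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}: average length 1.9 bits
# against an entropy of about 1.85 bits/symbol.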
6. (Ans. a)
7. (Ans. c)
8. (Ans. a)
9. Entropy is defined as
a) Average information per message
b) Information in a signal
c) Amplitude of signal
d) All of the above
(Ans. a)
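Note: formally, for symbol probabilities p1, ..., pn, the entropy is H = -Σ pi log2 pi bits per message. A minimal Python sketch (the helper name is my own):

from math import log2

def entropy(probs):
    # Average information per message, in bits; zero-probability
    # symbols contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))                    # 1.0 bit
print(entropy([0.125, 0.375, 0.375, 0.125]))  # about 1.811 bits (cf. Q17)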
10. (Ans. a)
11. (Ans. a)
12. (Ans. c)
13. (Ans. b)
14. When the base of the logarithm is 2, then the unit of measure of information is
a) Bits
b) Bytes
c) Nats
d) None of the mentioned
(Ans. a)
15. (Ans. b)
16. (Ans. c)
17. An analog baseband signal, bandlimited to 100 Hz, is sampled at the Nyquist rate. The
samples are quantized into four message symbols that occur independently with
probabilities 𝑝1 = 𝑝4 = 0.125 and 𝑝2 = 𝑝3. The information rate (bits/sec) of the message
source is (approximately)
a) 181
b) 362
c) 724
d) None of the above
(Ans. b)
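Solution sketch: the Nyquist rate is 2 × 100 = 200 samples/sec, and since the probabilities must sum to 1, p2 = p3 = 0.375. Then H = 2(0.125 log2 8) + 2(0.375 log2(1/0.375)) ≈ 0.75 + 1.061 ≈ 1.811 bits/symbol, giving R = 200 × 1.811 ≈ 362 bits/sec.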
18. A communication channel with AWGN operating at a signal-to-noise ratio SNR >> 1 and
bandwidth B has capacity 𝐶1. If the SNR is doubled keeping B constant, the resulting
capacity 𝐶2 is given by
a) 𝐶2 ≈ 2𝐶1
b) 𝐶2 ≈ 𝐶1 + 𝐵
c) 𝐶2 ≈ 𝐶1 + 2𝐵
d) 𝐶2 ≈ 𝐶1 + 0.3𝐵
(Ans. b)
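Solution sketch: for SNR >> 1, C ≈ B log2(SNR). Doubling the SNR gives C2 ≈ B log2(2·SNR) = B log2(SNR) + B log2 2 = C1 + B.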
19. A memoryless source emits n symbols each with a probability p. The entropy of the
source as a function of n
a) Increases as log 𝑛
b) Decreases as log (1/𝑛)
c) Increases as n
d) Increases as n log 𝑛
(Ans. a)
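Note: equal probabilities force p = 1/n, so H = -Σ (1/n) log2(1/n) = log2 n, which grows as log n.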
20. For M equally likely messages, M>>1, if the rate of information R > C, the
probability of error is
a) Arbitrarily small
b) Close to unity
c) Not predictable
d) Unknown
(Ans. b)
21. For a binary symmetric channel, the random input bits are given as
a) Logic 1 with probability P and logic 0 with probability (1-P)
b) Logic 1 with probability 1-P and logic 0 with probability P
c) Logic 1 with probability P² and logic 0 with probability 1-P
d) Logic 1 with probability P and logic 0 with probability (1-P)²
(Ans. a)
22. (Ans. b)
23. (Ans. a)
24. The number of bits required for unique coding, when L is a power of 2, is
a) 𝑅 = 𝑙𝑜𝑔2 𝐿
b) 𝑅 = 𝑝 ∗ 𝑙𝑜𝑔2 𝐿
c) 𝑅 = 𝑙𝑜𝑔2 𝐿 + 1
d) 𝑅 = 2 − 𝑙𝑜𝑔2 𝐿
(Ans. a)
25. The number of bits required for unique coding, when L is NOT a power of 2, is
a) 𝑅 = 𝑙𝑜𝑔2 𝐿
b) 𝑅 = 𝑝 ∗ 𝑙𝑜𝑔2 𝐿
c) 𝑅 = 𝑙𝑜𝑔2 𝐿 + 1
d) 𝑅 = 2 − 𝑙𝑜𝑔2 𝐿
(Ans. c)
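Note: for L = 8 (a power of 2), R = log2 8 = 3 bits. When L is not a power of 2, log2 L is not an integer and option (c) is to be read as R = ⌊log2 L⌋ + 1; for example, L = 6 gives R = ⌊2.585⌋ + 1 = 3 bits.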
26. In a Binary Symmetric Channel, the mutual information conveyed about the event X = 0
by the observation Y = 0 is
a) 𝑙𝑜𝑔2 (𝑝)
b) 𝑙𝑜𝑔2 (1 − 𝑝)
c) 𝑙𝑜𝑔2 [2(1 − 𝑝)]
d) 𝑙𝑜𝑔2 (2𝑝)
(Ans. c)
27. In a Binary Symmetric Channel, the mutual information conveyed about the event X = 1
by the observation Y = 0 is
a) 𝑙𝑜𝑔2 (𝑝)
b) 𝑙𝑜𝑔2 (1 − 𝑝)
c) 𝑙𝑜𝑔2 [2(1 − 𝑝)]
d) 𝑙𝑜𝑔2 (2𝑝)
(Ans. d)
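Solution sketch for Q26-Q27 (assuming equiprobable inputs and crossover probability p): I(X = x; Y = 0) = log2[P(Y = 0 | X = x) / P(Y = 0)], and by symmetry P(Y = 0) = 1/2. For X = 0, P(Y = 0 | X = 0) = 1 - p, giving log2[2(1 - p)]; for X = 1, P(Y = 0 | X = 1) = p, giving log2(2p).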
28. For a Binary Symmetric Channel, the capacity, C is given by
a) 𝐶 = 1 + 𝑝𝑙𝑜𝑔2 𝑝 + (1 − 𝑝)𝑙𝑜𝑔2 (1 − 𝑝)
b) 𝐶 = 1 + 𝑝𝑙𝑜𝑔2 𝑝 + (1 + 𝑝)𝑙𝑜𝑔2 (1 + 𝑝)
c) 𝐶 = 1 + (1 + 𝑝)𝑙𝑜𝑔2 (1 + 𝑝) + (1 − 𝑝)𝑙𝑜𝑔2 (1 − 𝑝)
d) 𝐶 = 1 + (1 + 𝑝)𝑙𝑜𝑔2 (1 + 𝑝) − (1 − 𝑝)𝑙𝑜𝑔2 (1 − 𝑝)
(Ans. a)
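Note: this is Q2 restated, since 1 + p log2 p + (1 - p) log2(1 - p) = 1 - H(p).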
29. (Ans. a)
30. (Ans. c)