
C. V. Raman College of Engineering, Bhubaneswar


Department of Electronics and Telecommunication Engineering

Information Theory, Coding and Cryptography

1. The channel capacity is


a) The maximum information transmitted by one symbol over the channel
b) Information contained in a signal
c) The amplitude of the modulated signal
d) All of the above

(Ans. a)

2. The capacity of a binary symmetric channel, where H(P) is the binary entropy function,
is
a) 1 - H(P)
b) H(P) - 1
c) 1 - H(P)^2
d) H(P)^2 - 1

(Ans. a)
3. According to Shannon Hartley theorem,
a) The channel capacity becomes infinite with infinite bandwidth
b) The channel capacity does not become infinite with infinite bandwidth
c) There is a tradeoff between bandwidth and signal-to-noise ratio
d) Both b and c are correct

(Ans. d)
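
Note: with fixed signal power S and noise power spectral density N0, letting the bandwidth B
grow without bound gives C = B log2(1 + S/(N0*B)) → (S/N0) log2(e) ≈ 1.44*(S/N0), a finite
limit; this is why options b and c are both correct.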

4. For M equally likely messages, M>>1, if the rate of information R ≤ C, the
probability of error is
a) Arbitrarily small
b) Close to unity
c) Not predictable
d) Unknown
(Ans. a)

5. The technique that may be used to increase average information per bit is
a) Shannon-Fano algorithm
b) ASK
c) FSK
d) Digital modulation techniques
(Ans. a)

6. Code rate r, for k information bits and n as total bits, is defined as


a) r = k/n
b) k = n/r
c) r = k*n
d) n = r*k

(Ans. a)

7. The mutual information


a) Is symmetric
b) Is always non-negative
c) Both a and b are correct
d) None of the above

(Ans. c)

8. The relation between entropy and mutual information is


a) I(X;Y) = H(X) - H(X/Y)
b) I(X;Y) = H(X/Y) - H(Y/X)
c) I(X;Y) = H(X) - H(Y)
d) I(X;Y) = H(Y) - H(X)

(Ans. a)

9. Entropy is defined as
a) Average information per message
b) Information in a signal
c) Amplitude of signal
d) All of the above

(Ans. a)

10. The information I contained in a message with probability of occurrence P is given
by (k is constant)
a) I = k log2(1/P)
b) I = k log2(P)
c) I = k log2(1/2P)
d) I = k log2(1/P^2)

(Ans. a)
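
For example, a message with probability of occurrence P = 1/8 carries
I = log2(1/P) = log2(8) = 3 bits of information (taking k = 1 and base-2 logarithms).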

11. The expected information contained in a message is called


a) Entropy
b) Efficiency
c) Coded signal
d) None of the above

(Ans. a)

12. When probability of error during transmission is 0.5, it indicates that


a) Channel is very noisy
b) No information is received
c) Both of the mentioned
d) None of the mentioned

(Ans. c)
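
Note: with p = 0.5 the capacity of a binary symmetric channel is 1 - H(0.5) = 0 (see
question 28), so the channel conveys no information at all, which is why both statements hold.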

13. The method of converting a word to a stream of bits is called


a) Binary coding
b) Source coding
c) Bit coding
d) Cipher coding

(Ans. b)

14. When the base of the logarithm is 2, then the unit of measure of information is
a) Bits
b) Bytes
c) Nats
d) None of the mentioned

(Ans. a)

15. When X and Y are statistically independent, the mutual information I(X;Y) is


a) 1
b) 0
c) Ln 2
d) Cannot be determined

(Ans. b)
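
This follows from question 8: when X and Y are independent, H(X|Y) = H(X), so
I(X;Y) = H(X) - H(X|Y) = 0.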

16. The self-information of a random variable is


a) 0
b) 1
c) Infinite
d) Cannot be determined

(Ans. c)

17. An analog baseband signal, bandlimited to 100 Hz, is sampled at the Nyquist rate. The
samples are quantized into four message symbols that occur independently with
probabilities p1 = p4 = 0.125 and p2 = p3. The information rate (bits/sec) of the message
source is (approximately)
a) 181
b) 362
c) 724
d) None of the above

(Ans. b)
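
A quick numerical check of this answer (a sketch; the values p2 = p3 = 0.375 are inferred
from the requirement that the four probabilities sum to 1):

    from math import log2

    # Nyquist sampling of a 100 Hz baseband signal gives 2 * 100 = 200 symbols per second.
    symbol_rate = 2 * 100

    # p1 = p4 = 0.125; p2 = p3 = 0.375 so that the probabilities sum to 1.
    probs = [0.125, 0.375, 0.375, 0.125]

    # Source entropy in bits per symbol: H = sum of p * log2(1/p).
    H = sum(p * log2(1 / p) for p in probs)

    # Information rate in bits per second.
    R = symbol_rate * H
    print(f"H = {H:.3f} bits/symbol, R = {R:.0f} bits/sec")  # about 1.811 and 362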

18. A communication channel with AWGN operating at a signal to noise ratio SNR >> 1 and
bandwidth B has capacity C1. If the SNR is doubled keeping B constant, the resulting
capacity C2 is given by
a) C2 ≈ 2C1
b) C2 ≈ C1 + B
c) C2 ≈ C1 + 2B
d) C2 ≈ C1 + 0.3B

(Ans. b)
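
A brief reasoning sketch for this answer: the Shannon capacity is C = B log2(1 + SNR), which
for SNR >> 1 is approximately B log2(SNR). Doubling the SNR gives
C2 ≈ B log2(2*SNR) = B log2(2) + B log2(SNR) ≈ C1 + B.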

19. A memoryless source emits n symbols each with a probability p. The entropy of the
source as a function of n
a) Increases as log n
b) Decreases as log(1/n)
c) Increases as n
d) Increases as n log n

(Ans. a)
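
Reasoning: for n equally likely symbols, p = 1/n, so the entropy is
H = n * (1/n) * log2(n) = log2(n), which increases as log n.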

20. For M equally likely messages, M>>1, if the rate of information R > C, the
probability of error is
a) Arbitrarily small
b) Close to unity
c) Not predictable
d) Unknown

(Ans. b)

21. For a binary symmetric channel, the random bits are given as
a) Logic 1 given by probability P and logic 0 by (1-P)
b) Logic 1 given by probability 1-P and logic 0 by P
c) Logic 1 given by probability P^2 and logic 0 by 1-P
d) Logic 1 given by probability P and logic 0 by (1-P)^2

(Ans. a)

22. The Hartley-Shannon theorem sets a limit on the:


a) Highest frequency that may be sent over a given channel.
b) Maximum capacity of a channel with a given noise level.
c) Maximum number of coding levels in a channel with a given noise level
d) Maximum number of quantizing levels in a channel for a given bandwidth.

(Ans. b)

23. The inequality which satisfies the Source Coding Theorem is


a) H(X) ≤ R̄ < H(X) + 1
b) H(X) ≥ R̄ > H(X) + 1
c) H(X) ≤ R̄ < H(X) - 1
d) H(X) ≥ R̄ > H(X) - 1

(Ans. a)

24. The number of bits required for unique coding, when L is a power of 2 is
a) R = log2 L
b) R = p * log2 L
c) R = log2 L + 1
d) R = 2 - log2 L

(Ans. a)

25. The number of bits required for unique coding, when L is NOT a power of 2 is
a) R = log2 L
b) R = p * log2 L
c) R = log2 L + 1
d) R = 2 - log2 L

(Ans. c)
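
Worked example for questions 24 and 25: for L = 8 (a power of 2), R = log2(8) = 3 bits
suffice; for L = 10, log2(10) ≈ 3.32, so taking the integer part of 3.32 and adding one
gives R = 4 bits for unique coding.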

26. In a Binary Symmetric Channel, the mutual information about the occurrence of the
event X=0, given that Y=0 is observed, is
a) log2(p)
b) log2(1 - p)
c) log2(2(1 - p))
d) log2(2p)

(Ans. c)

27. In a Binary Symmetric Channel, the mutual information about the occurrence of the
event X=1, given that Y=0 is observed, is
a) log2(p)
b) log2(1 - p)
c) log2(2(1 - p))
d) log2(2p)
(Ans. d)
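
A short derivation covering questions 26 and 27, assuming equally likely inputs
P(X=0) = P(X=1) = 1/2 and crossover probability p: by symmetry P(Y=0) = 1/2, while
P(X=0|Y=0) = 1 - p and P(X=1|Y=0) = p. Using I(x;y) = log2[P(x|y)/P(x)] gives
log2[(1 - p)/(1/2)] = log2(2(1 - p)) for question 26 and log2[p/(1/2)] = log2(2p) for
question 27.
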
28. For a Binary Symmetric Channel, the capacity, C is given by
a) C = 1 + p log2(p) + (1 - p) log2(1 - p)
b) C = 1 + p log2(p) + (1 + p) log2(1 + p)
c) C = 1 + (1 + p) log2(1 + p) + (1 - p) log2(1 - p)
d) C = 1 + (1 + p) log2(1 + p) - (1 - p) log2(1 - p)

(Ans. a)
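
A small Python sketch (not part of the original question set) checking that the formula in
option (a) equals 1 - H(p) from question 2, and that it is 0 when p = 0.5 (question 12):

    from math import log2

    def binary_entropy(p: float) -> float:
        """Binary entropy H(p) in bits, for 0 < p < 1."""
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p: float) -> float:
        """BSC capacity written as in question 28, option (a)."""
        return 1 + p * log2(p) + (1 - p) * log2(1 - p)

    for p in (0.1, 0.25, 0.5):
        print(p, bsc_capacity(p), 1 - binary_entropy(p))  # the two columns agree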

29. If the probability is high, then the information will be


a) Low
b) High
c) Cannot determine
d) Information does not depend on probability.

(Ans. a)

30. Which of the following relations is true,


Where H is entropy, Xk are the transmitted samples, Yk are the received samples, and
Nk are the noise samples
a) H(Yk|Xk) = H(Xk)
b) H(Yk|Xk) = H(Yk)
c) H(Yk|Xk) = H(Nk)
d) None of the above

(Ans. c)
