
ASSIGNMENT 1

Question 1:

Solution:
Let the entropy of the source be H(x).
H(x) = -sum(p_i * log2(p_i))
     = 2.4086950 bits/sample

For a uniformly distributed source, the probability of each element of the source is equal:
p1 = p2 = p3 = p4 = p5 = p6 = 1/6

Finding the entropy of the uniformly distributed source,
H(x) = -6 * (1/6) * log2(1/6) = log2(6)
     = 2.58496 bits/sample
H(x) of the non-uniformly distributed source: 2.4086950 bits/sample
H(x) of the uniformly distributed source:     2.58496 bits/sample

Comment: The entropy of the source increases when the source is uniformly distributed over the same alphabet; the uniform distribution maximizes entropy for a fixed alphabet size, giving H_max = log2(6) = 2.58496 bits/sample.
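The uniform-source entropy above can be checked numerically. A minimal sketch (the per-symbol probabilities of the non-uniform source are not reproduced in this document, so only the uniform case is computed):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits/sample: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Uniform source over a 6-symbol alphabet: each p_i = 1/6.
uniform = [1 / 6] * 6
H_uniform = entropy(uniform)

print(round(H_uniform, 5))  # 2.58496, matching log2(6)
```

For equal probabilities the sum collapses to log2(6), which is why the uniform source attains the maximum entropy for this alphabet.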
Question 2:

Solution:
Let the entropy of the source be H(x).
H(x) = -sum(p_i * log2(p_i))
     = 2.4086950 bits/sample
Given,
Bandwidth = 6000 Hz

Nyquist rate = 2 x Bandwidth
             = 2 x 6000
             = 12000 Hz

Sampling rate = Nyquist rate + Guard band
              = 12000 + 2000
              = 14000 samples/second (Hz)

Calculating the bit rate in bits/second,

Bit rate = Sampling rate x Entropy of source
         = 14000 x 2.4086950
         = 33,721.73 bits/second (correct to 2 decimal places)
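The bit-rate arithmetic above can be reproduced in a few lines, following the same steps (bandwidth, Nyquist rate, guard band, entropy):

```python
# Bit rate for Question 2, following the steps above.
bandwidth = 6000          # Hz
guard_band = 2000         # Hz
entropy_bits = 2.4086950  # bits/sample, from the entropy calculation

nyquist_rate = 2 * bandwidth               # 12000 Hz
sampling_rate = nyquist_rate + guard_band  # 14000 samples/second
bit_rate = sampling_rate * entropy_bits

print(round(bit_rate, 2))  # 33721.73 bits/second
```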
Question 3:

Solution:
1. Let the entropy of the source be H(x),
H(x) = -sum(p_i * log2(p_i))
     = 2.528213 bits/sample
2. Quantizing the source according to the given quantization rule,

Alphabet    Alphabet after    Probability before    Probability after
            quantization      quantization          quantization
                              0.3                   0.3 + 0.25 = 0.55
                              0.25
                              0.05                  0.05 + 0.15 + 0.1 = 0.3
                              0.15
  -1                          0.1
  -3          -4              0.1                   0.1 + 0.05 = 0.15
  -5          -4              0.05

(Several alphabet entries did not survive extraction and are left blank.)

Finding the entropy of the source after quantization,
H(x) = -(0.55 log2(0.55) + 0.3 log2(0.3) + 0.15 log2(0.15))
     = 1.406008 bits/sample
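The before- and after-quantization entropies can be verified numerically. A short sketch using the probabilities from the table (symbols themselves are not needed, only their probabilities):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits/sample: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Per-symbol probabilities before quantization (from the table).
before = [0.3, 0.25, 0.05, 0.15, 0.1, 0.1, 0.05]

# Quantization merges symbols, so their probabilities add:
# 0.3 + 0.25 = 0.55, 0.05 + 0.15 + 0.1 = 0.3, 0.1 + 0.05 = 0.15.
after = [0.3 + 0.25, 0.05 + 0.15 + 0.1, 0.1 + 0.05]

print(round(entropy(before), 6))  # 2.528213
print(round(entropy(after), 6))   # 1.406008
```

Merging symbols can only decrease (or preserve) entropy, which is consistent with the drop from 2.528213 to about 1.406 bits/sample.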
