Copyright: Attribution Non-Commercial (BY-NC)
III B.Tech I Semester(R05) Supplementary Examinations, November 2010
DIGITAL COMMUNICATION (Electronics & Communication Engineering)
Time: 3 hours                                   Max Marks: 80
Answer any FIVE questions. All questions carry equal marks.
1. (a) Explain µ-law and A-law companding techniques.
   (b) Approximate the µ-law curve with µ = 100 by two linear segments, one for 0 < x ≤ 0.1 and the other for 0.1 < x ≤ 1, where x(t) is the input signal. Calculate S/Nq for S = -40 and -10. [8+8]

2. (a) Explain, with a neat block diagram, the operation of a continuously variable slope delta modulator (CVSD).
   (b) Compare delta modulation with the pulse code modulation technique. [8+8]

3. (a) Explain, with neat diagrams, a coherent BFSK transmitter and receiver. Also explain the signal space diagram for coherent BFSK systems.
   (b) Give the comparisons between BFSK and PSK schemes. [10+6]

4. (a) Draw the block diagram of a duobinary PAM system and explain it.
   (b) Mention the drawbacks of duobinary correlative-level coding, with neat waveforms. [10+6]

5. One experiment has four mutually exclusive outcomes Ai, i = 1, 2, 3, 4, and a second experiment has three mutually exclusive outcomes Bj, j = 1, 2, 3. The joint probabilities P(Ai, Bj) are:
   P(A1, B1) = 0.10   P(A1, B2) = 0.08   P(A1, B3) = 0.13
   P(A2, B1) = 0.05   P(A2, B2) = 0.03   P(A2, B3) = 0.09
   P(A3, B1) = 0.11   P(A3, B2) = 0.04   P(A3, B3) = 0.06
   Using the given joint probabilities P(Ai, Bj), determine the average mutual information I(B; A). [16]

6. (a) Define the channel capacity in terms of average signal power and noise power.
   (b) Mention two important implications of the Shannon-Hartley theorem. [16]

7. What is the parity-check matrix H? Explain the procedure to verify whether a code word C is generated by the generator matrix G or not. [16]

8. (a) What is the importance of the Viterbi algorithm? Explain.
   (b) Explain the Viterbi algorithm, considering an appropriate encoder. [16]
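For question 1(a), the two companding laws can be evaluated numerically. The following is a minimal Python sketch of the standard compression formulas; the default values µ = 255 and A = 87.6 are the common G.711 choices, used here only as illustrative defaults.

```python
import math

def mu_law_compress(x, mu=255.0):
    """mu-law: y = sgn(x) * ln(1 + mu*|x|) / ln(1 + mu), for |x| <= 1."""
    return math.copysign(math.log(1 + mu * abs(x)) / math.log(1 + mu), x)

def a_law_compress(x, A=87.6):
    """A-law: linear below 1/A, logarithmic above, for |x| <= 1."""
    ax = abs(x)
    if ax < 1.0 / A:
        y = A * ax / (1 + math.log(A))
    else:
        y = (1 + math.log(A * ax)) / (1 + math.log(A))
    return math.copysign(y, x)

# Small inputs are boosted, and the endpoint maps to 1 (with mu = 100,
# as in question 1(b)):
print(round(mu_law_compress(0.01, mu=100), 2))  # 0.15
print(round(mu_law_compress(1.0, mu=100), 2))   # 1.0
```

The boost of small amplitudes is exactly what makes the quantization S/Nq nearly independent of signal level, which question 1(b) asks to quantify.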
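The average mutual information asked for in question 5 follows directly from the joint and marginal probabilities. A minimal sketch of the computation is below; note that the exam table omits the P(A4, Bj) entries, so the demo uses a hypothetical 2×3 joint distribution (not the exam's data) purely to illustrate the formula.

```python
import math

def mutual_information(joint):
    """I(A;B) = sum_{i,j} P(ai,bj) * log2( P(ai,bj) / (P(ai) * P(bj)) )."""
    p_a = [sum(row) for row in joint]          # marginal P(A_i)
    p_b = [sum(col) for col in zip(*joint)]    # marginal P(B_j)
    total = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:                           # 0 * log 0 -> 0 by convention
                total += p * math.log2(p / (p_a[i] * p_b[j]))
    return total

# Hypothetical joint distribution (entries sum to 1), for illustration only:
joint = [[0.25, 0.10, 0.05],
         [0.05, 0.30, 0.25]]
print(round(mutual_information(joint), 4))  # 0.2564 bits
```

With the full exam table (including the missing A4 row), the same function applied to the 4×3 matrix gives I(B; A) directly.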
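Question 6(a)'s definition is the Shannon-Hartley law, C = B·log2(1 + S/N). A small sketch of the computation; the 3.1 kHz / 30 dB figures are a typical telephone-channel example assumed here, not values from the exam.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3.1 kHz channel at 30 dB SNR.
snr = 10 ** (30 / 10)                        # 30 dB -> 1000 (linear ratio)
print(round(shannon_capacity(3100, snr)))    # roughly 30.9 kb/s
```

The two implications asked for in 6(b) fall out of this formula: bandwidth and SNR can be traded against each other for a fixed C, and a finite capacity exists even as bandwidth grows without bound (since the noise power grows with B).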
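The verification procedure in question 7 is the syndrome test: C is generated by G if and only if C·Hᵀ = 0 over GF(2). A minimal sketch using the systematic (7,4) Hamming code as the example code (an assumption; any linear block code works the same way):

```python
# Systematic (7,4) Hamming code: G = [I4 | P], H = [P^T | I3].
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

def encode(m):
    """c = m * G over GF(2): the 4 message bits followed by 3 parity bits."""
    parity = [sum(m[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return list(m) + parity

def syndrome(c):
    """s = c * H^T over GF(2); an all-zero syndrome means c is a codeword of G."""
    # With H = [P^T | I3]:  s_j = sum_i c_i * P[i][j] + c_{4+j}  (mod 2)
    return [(sum(c[i] * P[i][j] for i in range(4)) + c[4 + j]) % 2
            for j in range(3)]

c = encode([1, 0, 1, 1])
print(syndrome(c))   # [0, 0, 0] -> c is generated by G
c[2] ^= 1            # flip one bit
print(syndrome(c))   # nonzero syndrome -> not a codeword of G
```

The same test is the first step of syndrome decoding: a nonzero syndrome not only rejects the word but also indexes the most likely error pattern.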
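For question 8(b), an "appropriate encoder" is commonly the rate-1/2, constraint-length-3 convolutional encoder with generators (7, 5) in octal; the hard-decision Viterbi decoder below is a minimal sketch built on that assumed encoder.

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, K = 3, generators 111 and 101 (octal 7, 5)."""
    s1 = s2 = 0                               # shift-register state (last two inputs)
    out = []
    for b in bits:
        out += [b ^ s1 ^ s2, b ^ s2]          # the two coded bits per input bit
        s1, s2 = b, s1
    return out

def viterbi_decode(received):
    """Hard-decision Viterbi decoding over the 4-state trellis of the encoder above."""
    INF = float("inf")
    metric = {0: 0, 1: INF, 2: INF, 3: INF}   # state = s1*2 + s2, start in state 0
    paths = {s: [] for s in metric}
    for k in range(0, len(received), 2):
        r = received[k:k + 2]
        new_metric = {s: INF for s in metric}
        new_paths = {}
        for state in metric:                   # extend every surviving path
            if metric[state] == INF:
                continue
            s1, s2 = state >> 1, state & 1
            for b in (0, 1):
                out = [b ^ s1 ^ s2, b ^ s2]
                dist = (out[0] != r[0]) + (out[1] != r[1])  # Hamming branch metric
                nxt = (b << 1) | s1
                cand = metric[state] + dist
                if cand < new_metric[nxt]:     # keep only the best path per state
                    new_metric[nxt] = cand
                    new_paths[nxt] = paths[state] + [b]
        metric, paths = new_metric, new_paths
    best = min(metric, key=metric.get)         # survivor with the smallest metric
    return paths[best]

msg = [1, 0, 1, 1, 0, 0]                       # two trailing zeros flush the encoder
coded = conv_encode(msg)
coded[3] ^= 1                                  # inject a single channel bit error
print(viterbi_decode(coded) == msg)            # True: the error is corrected
```

This illustrates the importance asked about in 8(a): the decoder performs maximum-likelihood sequence estimation with complexity linear in the message length, rather than exponential.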