
Assignment 02

Discrete Channels
A discrete communication channel has a discrete input and a discrete output:
the symbols applied to the channel input for transmission are drawn from a
finite alphabet, described by an input ensemble (X, p), while the symbols
appearing at the channel output are drawn from another finite alphabet,
described by an output ensemble (Y, q) and the channel transition probability
matrix F.

In many situations the input and output alphabets X and Y are identical, but
in the general case they are different. Instead of using X and Y, it is common
practice to use the symbols H and D, and thus to define the two alphabets and
the associated probabilities as
\[ H = \{H_1, H_2, \ldots, H_M\}, \qquad p = [\Pr(H_1), \Pr(H_2), \ldots, \Pr(H_M)]^T = [p_1, p_2, \ldots, p_M]^T \]
\[ D = \{D_1, D_2, \ldots, D_K\}, \qquad q = [\Pr(D_1), \Pr(D_2), \ldots, \Pr(D_K)]^T = [q_1, q_2, \ldots, q_K]^T \]


where p_m abbreviates the probability Pr(H_m) that the symbol H_m may appear at
the input, while q_k abbreviates the probability Pr(D_k) that the symbol D_k may
appear at the output of the channel.
The probabilistic relationship between the input symbols H and the output
symbols D is described by the so-called channel transition probability matrix F,
which is defined as the K x M matrix whose (k, m)-th entry is the transition
probability Pr(D_k | H_m).
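As a numerical sketch of this relationship (all values are illustrative, not taken from the text): for a hypothetical binary symmetric channel, the columns of F hold the conditional probabilities Pr(D_k | H_m), and the output distribution follows as q = F p.

```python
import numpy as np

# Input ensemble (H, p): M = 2 input symbols with illustrative priors.
p = np.array([0.6, 0.4])

# Transition matrix F (K x M): F[k, m] = Pr(D_k | H_m).
# Hypothetical binary symmetric channel with crossover probability 0.1.
eps = 0.1
F = np.array([[1 - eps, eps],
              [eps,     1 - eps]])

# Every column of F is a conditional distribution, so each sums to 1.
assert np.allclose(F.sum(axis=0), 1.0)

# Output ensemble probabilities: q_k = sum_m Pr(D_k | H_m) p_m, i.e. q = F p.
q = F @ p
print(q)  # approximately [0.58, 0.42]
```

The same matrix-vector product gives q for any K x M transition matrix, not just this symmetric 2 x 2 case.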

Converting a Continuous to a Discrete Channel

A continuous channel is converted into (becomes) a discrete channel when a
digital modulator is used to feed the channel and a digital demodulator
provides the channel output.
A digital modulator is described by M different channel symbols.
These channel symbols are ENERGY SIGNALS of duration T_cs.
Digital Modulator:

If M = 2 ⇒ Binary Digital Modulator ⇒ Binary Comm. System
If M > 2 ⇒ M-ary Digital Modulator ⇒ M-ary Comm. System
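A brief numeric sketch of the distinction (the symbol duration Tcs below is an assumed value, not from the text): an M-ary modulator carries log2(M) bits per channel symbol, so for a fixed symbol duration the bit rate grows with log2(M).

```python
import math

Tcs = 1e-3  # assumed channel-symbol duration, in seconds

for M in (2, 4, 16):
    bits_per_symbol = math.log2(M)    # bits conveyed by one channel symbol
    bit_rate = bits_per_symbol / Tcs  # bits per second at one symbol per Tcs
    print(f"M={M}: {bits_per_symbol:.0f} bit(s)/symbol, {bit_rate:.0f} bit/s")
```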

Equivocation & Mutual Information of a Discrete Channel

Consider a discrete source (H, p) followed by a discrete channel, as shown
below. The average uncertainty that remains about the H source (channel input)
after observing the outcome of the D source (channel output) is given by the
conditional entropy H_{H|D}, which is defined as follows:

\[ H_{H|D} \triangleq H_{H|D}(J) = \sum_{m=1}^{M} \sum_{k=1}^{K} J_{km} \log_2 \frac{1}{\Pr(H_m \mid D_k)} \]

A similar expression can also be given for the uncertainty that remains about
the channel output D when the channel input H is observed, i.e.

\[ H_{D|H} \triangleq H_{D|H}(J) = \sum_{m=1}^{M} \sum_{k=1}^{K} J_{km} \log_2 \frac{1}{\Pr(D_k \mid H_m)} \]

The conditional entropy H_{H|D} is also known as the equivocation; it is the
uncertainty about the input of the channel that remains from the receiver's
point of view.
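These quantities can be checked numerically. In the sketch below (all probabilities are illustrative), the joint matrix J with J[k, m] = Pr(D_k, H_m) is built from an assumed binary symmetric channel, and the equivocation H_{H|D} and the mutual information I(H; D) = H(H) - H_{H|D} are evaluated directly from the defining sums.

```python
import numpy as np

# Illustrative binary symmetric channel: priors p and transition matrix F.
eps = 0.1
p = np.array([0.6, 0.4])
F = np.array([[1 - eps, eps],
              [eps,     1 - eps]])

# Joint matrix J (K x M): J[k, m] = Pr(D_k, H_m) = Pr(D_k | H_m) * p_m.
J = F * p
q = J.sum(axis=1)  # output marginals q_k

# Posterior Pr(H_m | D_k) = J[k, m] / q_k, then the equivocation
# H_{H|D} = sum_{k,m} J[k, m] * log2( 1 / Pr(H_m | D_k) ).
posterior = J / q[:, None]
H_H_given_D = np.sum(J * np.log2(1.0 / posterior))

# Mutual information: I(H; D) = H(H) - H_{H|D}.
H_H = -np.sum(p * np.log2(p))
I_HD = H_H - H_H_given_D
print(H_H_given_D, I_HD)
```

For a noiseless channel (eps = 0) the equivocation drops to zero and I(H; D) equals H(H); the zero posteriors would then need masking before taking the logarithm, which this sketch omits.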
Submitted by:

1) Vikas N. Theng
2) Shubham A. Dandge
3) M. Fayaj
4) A. K. Abdul
5) Sumit J. Chauhan