Discrete Channels
A discrete communication channel has a discrete input and a discrete output, where
the symbols applied to the channel input for transmission are drawn from a
finite alphabet, described by an input ensemble (X, p), while
the symbols appearing at the channel output are also drawn from a finite alphabet, which
is described by an output ensemble (Y, q) and by the channel transition probability matrix F.
In many situations the input and output alphabets X and Y are identical, but in
the general case they are different. Instead of using X and Y, it is common
practice to use the symbols H and D and thus define the two alphabets and the
associated probabilities as
\[
\text{input:} \quad H = \{H_1, H_2, \ldots, H_M\}, \qquad
\mathbf{p} = [\,\overbrace{\Pr(H_1)}^{p_1},\ \overbrace{\Pr(H_2)}^{p_2},\ \ldots,\ \overbrace{\Pr(H_M)}^{p_M}\,]^T
\]
\[
\text{output:} \quad D = \{D_1, D_2, \ldots, D_K\}, \qquad
\mathbf{q} = [\,\overbrace{\Pr(D_1)}^{q_1},\ \overbrace{\Pr(D_2)}^{q_2},\ \ldots,\ \overbrace{\Pr(D_K)}^{q_K}\,]^T
\]
where p_m abbreviates the probability Pr(H_m) that the symbol H_m may appear at
the input, while q_k abbreviates the probability Pr(D_k) that the symbol D_k may
appear at the output of the channel.
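The two ensembles above can be sketched in code as alphabet lists paired with probability vectors. This is only an illustration; the symbol names and numeric values below are assumptions, not taken from the notes.

```python
# Hedged sketch: input ensemble (H, p) and output ensemble (D, q) as
# alphabet lists plus probability vectors (illustrative values only).
H = ["H1", "H2", "H3", "H4"]     # input alphabet, M = 4 symbols
p = [0.5, 0.25, 0.125, 0.125]    # p[m] = Pr(H_m)

D = ["D1", "D2", "D3"]           # output alphabet, K = 3 symbols
q = [0.5, 0.3, 0.2]              # q[k] = Pr(D_k)

# Each vector must be a valid probability distribution over its alphabet.
for dist in (p, q):
    assert all(x >= 0 for x in dist) and abs(sum(dist) - 1.0) < 1e-12
```

Note that M and K need not be equal, mirroring the general case where the input and output alphabets differ.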
The probabilistic relationship between the input symbols H and the output symbols D is
described by the so-called channel transition probability matrix F, which is defined
as
\[
F = [F_{km}], \qquad F_{km} = \Pr(D_k \mid H_m), \qquad k = 1, \ldots, K, \quad m = 1, \ldots, M.
\]
The average information gained about the channel input H by observing the channel
output D is then
\[
I(H, D) = \sum_{m=1}^{M} \sum_{k=1}^{K} J_{km} \log_2 \frac{F_{km}}{q_k}
\quad \frac{\text{bits}}{\text{symbol}}
\]
where J_{km} = Pr(H_m ∩ D_k) = p_m F_{km} is the joint probability of the input
symbol H_m and the output symbol D_k; by Bayes' rule, F_{km}/q_k = Pr(H_m | D_k)/p_m.
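The quantities above can be computed numerically. The following is a minimal sketch, assuming as an example a binary symmetric channel with crossover probability 0.1 and uniform input; these numbers are illustrative and not from the notes.

```python
import math

# Hedged sketch: mutual information for an example binary symmetric
# channel (crossover 0.1, uniform input); values are assumptions.
p = [0.5, 0.5]            # input probabilities p_m = Pr(H_m)
F = [[0.9, 0.1],          # F[k][m] = F_km = Pr(D_k | H_m)
     [0.1, 0.9]]
M, K = len(p), len(F)

# Output probabilities q_k = sum_m F_km * p_m.
q = [sum(F[k][m] * p[m] for m in range(M)) for k in range(K)]
# Joint probabilities J_km = Pr(H_m, D_k) = p_m * F_km.
J = [[p[m] * F[k][m] for m in range(M)] for k in range(K)]

# Average information gained: sum_km J_km * log2(F_km / q_k).
I_HD = sum(J[k][m] * math.log2(F[k][m] / q[k])
           for k in range(K) for m in range(M) if J[k][m] > 0)
# Equivalent form log2(Pr(H_m | D_k) / p_m), with Pr(H_m | D_k) = J_km / q_k.
I_DH = sum(J[k][m] * math.log2((J[k][m] / q[k]) / p[m])
           for k in range(K) for m in range(M) if J[k][m] > 0)
print(round(I_HD, 4))     # mutual information in bits/symbol
```

For this channel the result is 1 − H_b(0.1) ≈ 0.531 bits/symbol, and the two forms agree, reflecting the symmetry of the mutual information.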
A similar expression can also be given for the average information gained about
the channel output D by observing the channel input H, i.e.
\[
I(D, H) = \sum_{m=1}^{M} \sum_{k=1}^{K} J_{km} \log_2 \frac{\Pr(H_m \mid D_k)}{p_m}
\quad \frac{\text{bits}}{\text{symbol}} \tag{16}
\]
Since Pr(H_m | D_k)/p_m = Pr(D_k | H_m)/q_k, the two quantities are equal:
mutual information is symmetric, I(D, H) = I(H, D).
The conditional entropy H(H|D) is also known as the equivocation; it is the entropy
of the noise or, in other words, the uncertainty about the channel input from the
receiver's point of view.
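The equivocation can be computed directly from the joint and output probabilities, and it connects to the information gained via I(H, D) = H(H) − H(H|D). A minimal sketch, again assuming an illustrative binary symmetric channel (the numbers are not from the notes):

```python
import math

# Hedged sketch: equivocation H(H|D) and the identity
# I(H, D) = H(H) - H(H|D) for an example binary symmetric channel.
p = [0.5, 0.5]            # input probabilities Pr(H_m)
F = [[0.9, 0.1],          # F[k][m] = Pr(D_k | H_m)
     [0.1, 0.9]]
M, K = len(p), len(F)
q = [sum(F[k][m] * p[m] for m in range(M)) for k in range(K)]
J = [[p[m] * F[k][m] for m in range(M)] for k in range(K)]

# Input entropy H(H) = -sum_m p_m log2 p_m.
H_in = -sum(pm * math.log2(pm) for pm in p if pm > 0)
# Equivocation H(H|D) = -sum_km J_km log2 Pr(H_m | D_k),
# the receiver's residual uncertainty about the transmitted symbol,
# using Pr(H_m | D_k) = J_km / q_k.
H_eq = -sum(J[k][m] * math.log2(J[k][m] / q[k])
            for k in range(K) for m in range(M) if J[k][m] > 0)
I = H_in - H_eq           # information actually delivered by the channel
```

A noiseless channel gives H(H|D) = 0, so all of H(H) is delivered; here part of it is lost to the channel noise.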
Submitted by:
1) Vikas N. Theng
2) Shubham A. Dandge
3) M. Fayaj
4) A. K. Abdul
5) Sumit J. Chauhan