- Definition of a discrete channel
  - Transition matrix
  - Transition probability diagram
- Probability of error
  - Conditional probability of error
  - Average probability of error
- Decision criteria
  - Minimum average probability of decision error criterion
  - Maximum a posteriori probability (MAP) decision rule/criterion
  - Maximum likelihood (ML) decision rule/criterion
- Performance analysis
  - Why performance analysis?
  - Closed-form evaluation
  - Semi-closed-form evaluation
  - Tight upper and/or lower bounds
  - Approximations
  - Monte-Carlo methods
Discrete Channel
Def: A communication channel is a triplet $(X, p(y|x), Y)$, where
- $X$ is the input alphabet,
- $p(y|x)$ is the transition probability of output $y$ given input $x$, and
- $Y$ is the output alphabet.
A discrete channel can be specified in two equivalent ways. Let $X = \{1, 2, \ldots, L\}$ and $Y = \{1, 2, \ldots, N\}$.

1. By specifying the transition matrix $P$:

$$P = \begin{bmatrix} p(1|1) & p(2|1) & \cdots & p(N|1) \\ p(1|2) & p(2|2) & \cdots & p(N|2) \\ \vdots & \vdots & & \vdots \\ p(1|L) & p(2|L) & \cdots & p(N|L) \end{bmatrix}$$

where $p(y|x)$ is the $(x, y)$th entry of $P$. Note that the sum of each row is 1, i.e., $\sum_{y=1}^{N} p(y|x) = 1$.

Def: If $P$ is an $L \times L$ square matrix and $p(y|x) = p(x|y)$, then the channel is called an L-ary symmetric channel.

- Note that, by row or column permutations, an asymmetric channel can become a symmetric channel.
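The two defining checks above (unit row sums, and squareness plus $p(y|x) = p(x|y)$ for a symmetric channel) can be sketched in code. The matrix values below are illustrative, not from the notes:

```python
# Transition matrix of a hypothetical channel with L = 2 inputs and
# N = 3 outputs; P[x][y] holds p(y|x).  (Values are illustrative.)
P = [[0.8, 0.1, 0.1],
     [0.1, 0.1, 0.8]]

def row_sums_ok(P, tol=1e-9):
    """Check the defining property: sum_{y=1}^{N} p(y|x) = 1 for every x."""
    return all(abs(sum(row) - 1.0) < tol for row in P)

def is_symmetric_channel(P, tol=1e-9):
    """Per the definition above: P square with p(y|x) = p(x|y) for all x, y."""
    L = len(P)
    if any(len(row) != L for row in P):
        return False
    return all(abs(P[x][y] - P[y][x]) < tol
               for x in range(L) for y in range(L))

print(row_sums_ok(P), is_symmetric_channel(P))  # True False
```

Here the channel passes the row-sum check but is not symmetric, since $P$ is not even square.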
2. By specifying the transition probability diagram with inputs and outputs properly labeled:
-Note that the number of input nodes equals the number of possible inputs to the channel and that the number of output nodes equals the number of possible outputs from the channel.
Example: the binary symmetric channel (BSC).
- Note that the input and the output alphabets are both $\{0, 1\}$.
- The quantity $\epsilon$, $0 \le \epsilon \le 1$, is called the crossover probability.
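A minimal simulation of the BSC makes the crossover probability concrete: each transmitted bit is flipped independently with probability $\epsilon$. The function name and parameter values below are illustrative:

```python
import random

# Pass a bit sequence through a binary symmetric channel: each bit is
# flipped independently with crossover probability eps.  Sketch only.
def bsc(bits, eps, rng):
    return [b ^ (rng.random() < eps) for b in bits]

# The empirical flip rate approaches eps for a long input block.
n, eps = 100_000, 0.1
rx = bsc([0] * n, eps, random.Random(0))
print(sum(rx) / n)  # close to eps = 0.1
```

With a long all-zero input, the fraction of received ones is an estimate of $\epsilon$.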
Probability of Error
Now we consider transmitting a message symbol $m_i \in \{m_1, \ldots, m_M\}$ through a discrete channel with $|X| = L$ and $|Y| = N$. We need to design a transmitter that converts the message symbol into an admissible channel input and a receiver that infers from the channel output which message symbol was actually transmitted.
Def: $P_{e,i} \triangleq \Pr(\text{Rx decides (says/infers) } m_j\ (j \neq i) \text{ is the transmitted message} \mid \text{Tx transmits } m_i)$ is called the conditional (symbol) error probability given the $i$th symbol. We also write it as $P_{e,i} = P_{e|m=m_i}$, i.e.,

$$P_{e,i} = \Pr(\hat{m} \neq m_i \mid m = m_i).$$
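For a channel given by its transition matrix, $P_{e,i}$ can be computed exactly by summing the transition probabilities of the outputs that the receiver decodes to some $m_j$ with $j \neq i$. The channel values and the identity decision rule below are illustrative assumptions:

```python
# Exact computation of P_{e,i} = Pr(mhat != m_i | m = m_i) for a
# hypothetical 2x2 channel, assuming the transmitter sends input i for
# message m_i and the receiver decodes output y_j to message m_{decide(j)}.
P = [[0.9, 0.1],        # P[i][j] = Pr(Y = y_j | m = m_i); toy values
     [0.2, 0.8]]

def conditional_error(P, i, decide=lambda j: j):
    # Sum Pr(Y = y_j | m = m_i) over outputs decoded to some m_k != m_i.
    return sum(p for j, p in enumerate(P[i]) if decide(j) != i)

print(conditional_error(P, 0), conditional_error(P, 1))  # 0.1 0.2
```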
Def: When a message symbol $m_i$, $i = 1, 2, \ldots, M$, is chosen with probability $\Pr(m = m_i)$, $i = 1, 2, \ldots, M$, the average probability of (symbol) error is defined as

$$P_e \triangleq \Pr(\text{decision error}) = \sum_{i=1}^{M} \Pr(\text{decision error} \mid m = m_i) \Pr(m = m_i) = \sum_{i=1}^{M} P_{e,i} \Pr(m = m_i) = E\{\Pr(\text{decision error} \mid m)\} \qquad (1)$$

which is the statistical average of the conditional error probabilities. In (1), we used a partition equation.

Example: Consider the following probability transition diagram. [Transition probability diagram] The source generates a message symbol 0 with probability $p$ and 1 with probability $1 - p$. The transmitter maps the symbol $i$ to the input $i$, and the receiver maps the output $i$ to the symbol $i$. What is the average probability of decision error?
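The diagram for this example is not reproduced here, so the sketch below assumes a binary symmetric channel with crossover probability `eps`; that assumption, the function name, and the numeric values are illustrative, not necessarily the channel drawn on the original slide:

```python
# Average probability of decision error for the example, assuming a BSC
# with crossover probability eps (an assumption; the original diagram is
# not shown here).  With the identity transmitter/receiver maps,
# Pr(error | m = i) = eps for both symbols.
def average_error(p, eps):
    return p * eps + (1 - p) * eps    # = eps, independent of the prior p

for p in (0.1, 0.5, 0.9):
    print(average_error(p, 0.05))     # eps = 0.05 each time (up to rounding)
```

Under this assumed channel, the conditional error probabilities are equal, so the average equals $\epsilon$ whatever the prior $p$ is.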
Decision Criteria
In this course, we consider two decision rules: the minimum average probability of symbol error decision rule and the maximum likelihood (ML) decision rule. In what follows, we will show that the decision rule with the minimum average probability of symbol error is the same as the maximum a posteriori probability (MAP) decision rule, and that the MAP decision rule is the same as the ML decision rule if the a priori probabilities of the message symbols are the same.

Derivation (MAP decision rule): First, recall the definition of a partition equation:
$$P(A) = \sum_i P(A \mid E_i) P(E_i), \quad \text{where } \Omega = \bigcup_i E_i,$$

and

$$P(A \mid B) = \sum_i P(A \mid E_i, B) P(E_i \mid B), \quad \text{where } \Omega = \bigcup_i E_i,$$

the $E_i$ being disjoint events.
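The first partition equation can be sanity-checked on a toy experiment; the die-roll setup below is an illustrative choice, not from the notes:

```python
from fractions import Fraction as F

# Check P(A) = sum_i P(A|E_i) P(E_i) on a toy experiment: roll a fair die,
# partition the sample space as E_1 = {1,2}, E_2 = {3,4}, E_3 = {5,6},
# and let A = "the outcome is even".
P_E = [F(1, 3), F(1, 3), F(1, 3)]          # P(E_i)
P_A_given_E = [F(1, 2), F(1, 2), F(1, 2)]  # one even face in each pair

P_A = sum(pa * pe for pa, pe in zip(P_A_given_E, P_E))
print(P_A)  # 1/2, matching the direct count of even faces, 3/6
```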
Then,

$$\begin{aligned}
P_e &= \sum_{i=1}^{M} P_{e,i} \Pr(m = m_i) \\
&= \sum_{i=1}^{M} \left[1 - \Pr(\hat{m} = m_i \mid m = m_i)\right] \Pr(m = m_i) \\
&= 1 - \sum_{i=1}^{M} \Pr(\hat{m} = m_i \mid m = m_i) \Pr(m = m_i) \\
&= 1 - \sum_{i=1}^{M} \left[\sum_{j=1}^{N} \Pr(\hat{m} = m_i \mid Y = y_j, m = m_i) \Pr(Y = y_j \mid m = m_i)\right] \Pr(m = m_i) \\
&= 1 - \sum_{j=1}^{N} \left[\sum_{i=1}^{M} \Pr(\hat{m} = m_i \mid Y = y_j) \Pr(Y = y_j \mid m = m_i) \Pr(m = m_i)\right],
\end{aligned}$$

where the fourth equality uses the partition equation and the last equality uses the fact that, given the channel output $Y$, the decision $\hat{m}$ does not depend on the transmitted symbol $m$, i.e., $\Pr(\hat{m} = m_i \mid Y = y_j, m = m_i) = \Pr(\hat{m} = m_i \mid Y = y_j)$.
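The rearrangement of $P_e$ into a double sum over $j$ and $i$ can be verified numerically. The sketch below uses a hypothetical 2-input/2-output channel, an illustrative prior, and the deterministic rule $\hat{m}(y_j) = m_j$ (none of these values are from the notes):

```python
# Check the rearrangement on a toy channel with the deterministic rule
# mhat(y_j) = m_j, so Pr(mhat = m_i | Y = y_j) is 1 if i == j, else 0.
P = [[0.9, 0.1],          # P[i][j] = Pr(Y = y_j | m = m_i); toy values
     [0.2, 0.8]]
prior = [0.6, 0.4]        # Pr(m = m_i)

# P_e as the average of the conditional error probabilities.
pe_direct = sum(prior[i] * sum(P[i][j] for j in range(2) if j != i)
                for i in range(2))

# P_e from the last line of the derivation (double sum over j and i).
pe_rearranged = 1 - sum((i == j) * P[i][j] * prior[i]
                        for j in range(2) for i in range(2))

print(abs(pe_direct - pe_rearranged) < 1e-12)  # True: the two forms agree
```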
Hence, the optimization problem "minimize $P_e$" is equivalent to the following $N$ separate optimization problems:

$$\begin{aligned}
\underset{\Pr(\hat{m} = m_i \mid Y = y_j),\ i = 1, 2, \ldots, M}{\text{maximize}} \quad & \sum_{i=1}^{M} \Pr(\hat{m} = m_i \mid Y = y_j) \Pr(Y = y_j \mid m = m_i) \Pr(m = m_i) \\
\text{subject to} \quad & \sum_{i=1}^{M} \Pr(\hat{m} = m_i \mid Y = y_j) = 1
\end{aligned}$$

for $j = 1, 2, \ldots, N$.

- Note that there exists a unique solution of the optimization problem

$$\underset{p_1, p_2, \ldots, p_M}{\text{maximize}} \quad \sum_{i=1}^{M} p_i q_i \qquad \text{subject to} \quad \sum_{i=1}^{M} p_i = 1, \quad 0 \le p_i \le 1 \ \ \forall i,$$

which is

$$p_i = \begin{cases} 1, & \text{if } i = \arg\max_j q_j \\ 0, & \text{otherwise} \end{cases}$$

if there is only one index $i$ with the largest value of $q_i$. What if there is more than one such index?

Hence, the minimum average probability of symbol error decision rule becomes

$$\hat{i} = \arg\max_i \Pr(Y = y_j \mid m = m_i) \Pr(m = m_i) = \arg\max_i \Pr(m = m_i \mid Y = y_j),$$

which is the maximum a posteriori probability (MAP) decision rule.

- Note that we do not need a randomized decision rule to obtain the minimum probability of symbol error.
Def: The maximum likelihood (ML) decision rule is defined as

$$\hat{i} = \arg\max_i \Pr(Y = y_j \mid m = m_i)$$

for $j = 1, 2, \ldots, N$.

- Note that the ML decision rule is defined even if the message symbols are not randomly chosen.
- Note that the ML decision rule is equivalent to the MAP decision rule if the message symbols are chosen with equal probability.
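Both rules are one-line argmax computations over the transition matrix, and their equivalence under equal priors can be seen directly. The channel below is a hypothetical 2-input/3-output example with illustrative values:

```python
# MAP and ML decisions for a hypothetical 2-input/3-output channel; with
# equal priors the two rules pick the same index, as claimed above.
P = [[0.7, 0.2, 0.1],     # P[i][j] = Pr(Y = y_j | m = m_i); toy values
     [0.1, 0.6, 0.3]]

def map_decide(j, prior):
    # i_hat = argmax_i Pr(Y = y_j | m = m_i) Pr(m = m_i)
    return max(range(len(P)), key=lambda i: P[i][j] * prior[i])

def ml_decide(j):
    # i_hat = argmax_i Pr(Y = y_j | m = m_i)
    return max(range(len(P)), key=lambda i: P[i][j])

print([ml_decide(j) for j in range(3)])               # [0, 1, 1]
print([map_decide(j, [0.5, 0.5]) for j in range(3)])  # same under equal priors
```

With a uniform prior the factor $\Pr(m = m_i)$ is constant in $i$ and drops out of the argmax, which is exactly the equivalence noted above.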
Performance Analysis
Objectives of performance analysis:
- To reveal key parameters that affect system performance
- To compare different systems
- To design a better system

MAP decision rule and the evaluation of $P_e$:
- a closed-form expression of $P_e$
- very tight upper and lower bounds on $P_e$
- a semi-closed form of $P_e$
- an approximate expression of $P_e$
- numerical results by simulation
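The last item, evaluation by simulation, can be sketched as a Monte-Carlo estimate and compared with a closed form. The example below assumes a BSC with crossover probability `eps` $< 0.5$ and an equiprobable binary source, for which the MAP rule reduces to $\hat{m} = y$ and the closed-form value is $P_e = \epsilon$ (the setup and parameter values are illustrative):

```python
import random

# Monte-Carlo estimate of P_e for a BSC with crossover eps (< 0.5) and an
# equiprobable binary source, under the MAP rule (which here reduces to
# mhat = y).  Compared against the closed-form value P_e = eps.
def monte_carlo_pe(eps, trials=200_000, seed=1):
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        m = rng.randint(0, 1)             # equiprobable source symbol
        y = m ^ (rng.random() < eps)      # channel output
        errors += (y != m)                # MAP decision is mhat = y
    return errors / trials

eps = 0.1
print(monte_carlo_pe(eps))  # close to the closed-form value 0.1
```

The gap between the estimate and the closed form shrinks as the number of trials grows, which is the basic trade-off of Monte-Carlo evaluation.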