
Handout 3: Transmission of a Single Symbol over a Discrete Channel
ECE 645, Professor Joon Ho Cho

- Definition of a discrete channel
  - Transition matrix
  - Transition probability diagram
- Probability of error
  - Conditional probability of error
  - Average probability of error
- Decision criteria
  - Minimum average probability of decision error criterion
  - Maximum a posteriori probability (MAP) decision rule/criterion
  - Maximum likelihood (ML) decision rule/criterion


- Performance analysis
  - Why performance analysis?
  - Closed-form evaluation
  - Semi-closed-form evaluation
  - Tight upper and/or lower bounds
  - Approximations
  - Monte Carlo methods


Discrete Channel

Def: A communication channel is a triplet (X, p(y|x), Y), where

   X      : input alphabet
   p(y|x) : transition probability
   Y      : output alphabet

Def: A discrete channel is a communication channel with the following properties:

   X      : a countable set
   p(y|x) : the probability of Y = y ∈ Y given X = x ∈ X
   Y      : a countable set

- X and Y can be any type of countable sets, not necessarily sets of real numbers, real vectors, complex numbers, or complex vectors.
- Any discrete channel with |X| = L and |Y| = N is equivalent to the discrete channel with X = {1, 2, ..., L} and Y = {1, 2, ..., N}.


Representation of a Discrete Channel

When |X| < ∞ and |Y| < ∞, a discrete channel can be represented in two ways.

1. By specifying the input alphabet X = {1, 2, ..., L}, the output alphabet Y = {1, 2, ..., N}, and the transition probability matrix

       [ p(1|1)  p(2|1)  ...  p(N|1) ]
   P = [ p(1|2)  p(2|2)  ...  p(N|2) ]
       [   ...     ...   ...    ...  ]
       [ p(1|L)  p(2|L)  ...  p(N|L) ]

where p(y|x) is the (x, y)th entry of P. Note that each row of P sums to 1, i.e., Σ_{y=1}^{N} p(y|x) = 1.

Def: If P is an L × L square matrix and p(y|x) = p(x|y), then the channel is called an L-ary symmetric channel.
- Note that by row or column permutations, an asymmetric channel can become a symmetric channel.
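As a quick illustration (the matrix values below are hypothetical, and 0-based Python indexing stands in for the alphabets {1, ..., L} and {1, ..., N}), a transition matrix can be stored row by row and validated by checking that each row sums to 1:

```python
# Hypothetical transition matrix for a channel with L = 2 inputs and
# N = 3 outputs; P[x][y] holds p(y+1 | x+1) in the handout's notation.
P = [
    [0.8, 0.1, 0.1],  # transition probabilities given input 1
    [0.2, 0.3, 0.5],  # transition probabilities given input 2
]

# Each row of a valid transition matrix must sum to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12, "row does not sum to 1"
print("all rows sum to 1")
```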

Transmission of a Single Symbol over a Discrete Channel

3{5

2. By specifying the transition probability diagram, with the inputs and outputs properly labeled.

- Note that the number of input nodes equals the number of possible inputs to the channel and that the number of output nodes equals the number of possible outputs from the channel.


Def: A binary symmetric channel (BSC) has the following transition probability diagram:

   [diagram: input 0 → output 0 and input 1 → output 1, each with probability 1 − ε; input 0 → output 1 and input 1 → output 0, each with probability ε]

- Note that the input and the output alphabets are both {0, 1}.
- The quantity ε, 0 ≤ ε ≤ 1, is called the crossover probability.
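A minimal simulation sketch (the crossover probability 0.1 and the all-zeros input are assumed values, not from the handout): each input bit is flipped independently with the crossover probability:

```python
import random

def bsc(bits, eps, seed=0):
    """Pass a bit sequence through a binary symmetric channel:
    each bit is flipped independently with probability eps."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < eps) for b in bits]

# The empirical flip rate approaches the crossover probability.
n = 100_000
out = bsc([0] * n, eps=0.1)
print(sum(out) / n)  # close to 0.1
```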


Probability of Error
Now we consider transmitting a message symbol m_i ∈ {m_1, ..., m_M} through a discrete channel with |X| = L and |Y| = N. We need to design a transmitter that converts the message symbol into an admissible input to the channel, and a receiver that infers from the channel output which message symbol was actually transmitted.

Def: P_{e,i} ≜ Pr(Rx decides (says/infers) m_j (j ≠ i) is the transmitted message | Tx transmits m_i) is called the conditional (symbol) error probability given the ith symbol. We also write it as P_{e,i} = P_{e|m=m_i} or Pr(m̂ ≠ m_i | m = m_i).

- We have M conditional error probabilities P_{e,1}, ..., P_{e,M}.
- Note that

   P_{e,i} = Pr(m̂ ≠ m_i | m = m_i) = 1 − Pr(m̂ = m_i | m = m_i).


Def: When a message symbol m_i, i = 1, 2, ..., M, is chosen with probability Pr(m = m_i), i = 1, 2, ..., M, the average probability of (symbol) error is defined as

   P_e ≜ Pr(decision error)
       = Σ_{i=1}^{M} Pr(decision error | m = m_i) Pr(m = m_i)      (1)
       = E{Pr(decision error | m)} = E{P_{e,m}},

which is the statistical average of the conditional error probabilities. In (1), we used a partition equation.

Example: Consider a binary asymmetric channel with the following transition probability diagram:

   [diagram omitted]

The source generates the message symbol 0 with probability p and 1 with probability 1 − p. The transmitter maps symbol i to the channel input i, and the receiver maps the channel output i to the symbol i. What is the average probability of decision error?
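The answer follows from the partition equation. Writing ε0 for Pr(output 1 | input 0) and ε1 for Pr(output 0 | input 1) (symbol names assumed here, since the diagram is not reproduced), the average error probability is P_e = p·ε0 + (1 − p)·ε1:

```python
def pe_binary_asymmetric(p, eps0, eps1):
    """Average decision error probability for the binary asymmetric channel:
    symbol 0 (prior p) errs with prob eps0, symbol 1 (prior 1-p) with eps1."""
    return p * eps0 + (1 - p) * eps1

# Sanity check with assumed numbers: 0.5*0.1 + 0.5*0.2 = 0.15
print(pe_binary_asymmetric(0.5, 0.1, 0.2))
```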


Decision Criteria

In this course, we consider two decision rules: the minimum average probability of symbol error decision rule and the maximum likelihood (ML) decision rule. In what follows, we will show that the minimum average probability of symbol error decision rule is the same as the maximum a posteriori probability (MAP) decision rule, and that the MAP decision rule is the same as the ML decision rule if the a priori probabilities of the message symbols are all equal.

Derivation (MAP decision rule): First, recall the definition of a partition equation:

   P(A) = Σ_i P(A | E_i) P(E_i),

where Ω = ∪_i E_i. For a conditional probability, we have

   P(A | B) = Σ_i P(A | B, E_i) P(E_i | B),

where Ω = ∪_i E_i.
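A quick numerical check of the partition equation, with a hypothetical two-event partition E1, E2:

```python
# Hypothetical probabilities: E1, E2 partition the sample space.
P_E = [0.3, 0.7]          # P(E_i)
P_A_given_E = [0.5, 0.2]  # P(A | E_i)

# Partition equation: P(A) = sum_i P(A | E_i) P(E_i)
P_A = sum(pa * pe for pa, pe in zip(P_A_given_E, P_E))

# Mathematically, 0.3*0.5 + 0.7*0.2 = 0.29.
assert abs(P_A - 0.29) < 1e-12
print("partition equation checks out")
```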


Suppose that we have an M-ary symbol. Then,

   P_e = Σ_{i=1}^{M} P_{e,i} Pr(m = m_i)
       = Σ_{i=1}^{M} [1 − Pr(m̂ = m_i | m = m_i)] Pr(m = m_i)
       = 1 − Σ_{i=1}^{M} Pr(m = m_i) Pr(m̂ = m_i | m = m_i)
       = 1 − Σ_{i=1}^{M} Pr(m = m_i) [Σ_{j=1}^{N} Pr(m̂ = m_i | m = m_i, Y = y_j) Pr(Y = y_j | m = m_i)]
       = 1 − Σ_{i=1}^{M} Pr(m = m_i) [Σ_{j=1}^{N} Pr(m̂ = m_i | Y = y_j) Pr(Y = y_j | m = m_i)]
       = 1 − Σ_{j=1}^{N} [Σ_{i=1}^{M} Pr(m = m_i) Pr(m̂ = m_i | Y = y_j) Pr(Y = y_j | m = m_i)],

where the second-to-last equality holds because the decision m̂ depends on the transmitted symbol m only through the channel output Y.


Hence, the optimization problem

   minimize P_e over the decision rule Pr(m̂ = m_i | Y = y_j)

is equivalent to the following N separate optimization problems, one for each j = 1, 2, ..., N:

   maximize over Pr(m̂ = m_i | Y = y_j), i = 1, 2, ..., M:
       Σ_{i=1}^{M} Pr(m = m_i) Pr(Y = y_j | m = m_i) Pr(m̂ = m_i | Y = y_j)
   subject to
       0 ≤ Pr(m̂ = m_i | Y = y_j) ≤ 1, ∀i,
       Σ_{i=1}^{M} Pr(m̂ = m_i | Y = y_j) = 1.

- Note that the optimization problem

   maximize over p_1, p_2, ..., p_M:   Σ_{i=1}^{M} p_i q_i
   subject to 0 ≤ p_i ≤ 1 ∀i, and Σ_{i=1}^{M} p_i = 1,

has a unique solution, which


is

   p_i = 1 if i = argmax_j q_j, and p_i = 0 otherwise,

provided that only one index i attains the largest value of q_i. (What if more than one index attains it?)

Hence, the minimum average probability of symbol error decision rule becomes

   î = argmax_i Pr(Y = y_j | m = m_i) Pr(m = m_i) = argmax_i Pr(m = m_i | Y = y_j),

which is the maximum a posteriori probability (MAP) decision rule.
- Note that we do not need a randomized decision rule to obtain the minimum probability of symbol error.
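The MAP rule can be sketched directly (the transition matrix and priors below are hypothetical, with 0-based indices standing in for the message indices):

```python
def map_decide(y, P, prior):
    """MAP decision: choose the i maximizing Pr(Y = y | m = m_i) * Pr(m = m_i),
    where P[i][y] = p(y | i) and prior[i] = Pr(m = m_i)."""
    return max(range(len(prior)), key=lambda i: P[i][y] * prior[i])

# Hypothetical binary asymmetric channel with unequal priors.
P = [[0.9, 0.1],
     [0.3, 0.7]]
prior = [0.8, 0.2]

print(map_decide(0, P, prior))  # 0.9*0.8 > 0.3*0.2, so decides 0
print(map_decide(1, P, prior))  # 0.1*0.8 < 0.7*0.2, so decides 1
```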


Def: The maximum-likelihood (ML) decision rule is defined as

   î = argmax_i Pr(Y = y_j | m = m_i),

for j = 1, 2, ..., N.
- Note that the ML decision rule is defined even if the message symbols are not randomly chosen.
- Note that the ML decision rule is equivalent to the MAP decision rule if the message symbols are chosen with equal probability.
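Dropping the prior factor gives the ML rule; with equal priors the prior is a constant and the two argmax problems coincide. A quick check under an assumed 3 × 3 transition matrix:

```python
def ml_decide(y, P):
    """ML decision: choose the i maximizing the likelihood Pr(Y = y | m = m_i)."""
    return max(range(len(P)), key=lambda i: P[i][y])

def map_decide(y, P, prior):
    """MAP decision: choose the i maximizing Pr(Y = y | m = m_i) * Pr(m = m_i)."""
    return max(range(len(prior)), key=lambda i: P[i][y] * prior[i])

# Hypothetical 3-ary channel; equal a priori probabilities.
P = [[0.7, 0.2, 0.1],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]
uniform = [1 / 3] * 3

# With equal priors, ML and MAP decide identically for every output.
for y in range(3):
    assert ml_decide(y, P) == map_decide(y, P, uniform)
print("ML and MAP agree under equal priors")
```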


Performance Analysis
- Objectives of performance analysis
  - To reveal key parameters that affect system performance
  - To compare different systems
  - To design a better system
- MAP decision rule and the evaluation of P_e
  - a closed-form expression of P_e
  - very tight upper and/or lower bounds on P_e
  - a semi-closed form of P_e
  - an approximate expression of P_e
  - numerical results by simulation (Monte Carlo methods)
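As a sketch of the last item (the crossover probability and sample size below are assumed values): estimate P_e of a BSC with equiprobable inputs and the decision rule "decide m̂ = y" by simulation, and compare it against the closed form P_e = ε for that rule:

```python
import random

def monte_carlo_pe(eps, trials=200_000, seed=1):
    """Monte Carlo estimate of the average error probability of a BSC
    with equiprobable inputs and the identity decision rule (decide y)."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        m = rng.randint(0, 1)            # equiprobable source symbol
        y = m ^ (rng.random() < eps)     # crossover with probability eps
        errors += (y != m)               # rule m_hat = y errs iff a flip occurred
    return errors / trials

# For this channel and rule, the closed form is P_e = eps.
print(monte_carlo_pe(0.05))  # close to 0.05
```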
