
ENEE621: HW Assignment #1

Spring 2013

1. (Exercise 4 from Poor) Suppose that the probability density function (PDF) of observation $Y$ under hypothesis $H_0$ is given by
\[
p_0(y) = \begin{cases} e^{-y}, & y \ge 0, \\ 0, & y < 0, \end{cases}
\]
and the PDF under hypothesis $H_1$ is
\[
p_1(y) = \begin{cases} \sqrt{2/\pi}\, e^{-y^2/2}, & y \ge 0, \\ 0, & y < 0. \end{cases}
\]

(a) Find a Bayes rule and the minimum Bayes risk for testing $H_0$ versus $H_1$ with uniform costs and equal priors.

(b) Find a minimax rule and the minimax risk for uniform costs.

(c) Find a Neyman-Pearson rule and the corresponding detection probability for false-alarm probability $\alpha \in (0, 1)$.

Ans: (a) Bayes rule: with uniform costs and equal priors, we decide $H_1$ when
\[
\frac{p_1(y)}{p_0(y)} = \sqrt{\frac{2}{\pi}}\, e^{-\frac{y^2}{2} + y} \ge 1.
\]
Taking logarithms, this is equivalent to $(y - 1)^2 \le 1 - \log(\pi/2)$. Therefore, we will decide $H_1$ when
\[
A = 1 - \sqrt{1 - \log\frac{\pi}{2}} \;\le\; y \;\le\; 1 + \sqrt{1 - \log\frac{\pi}{2}} = B.
\]
The Bayes risk can be calculated as

\[
r(\delta_B) = \frac{1}{2}\bigl(P_0(\Gamma_1) + P_1(\Gamma_0)\bigr),
\]
where
\[
P_0(\Gamma_1) = \int_A^B e^{-y}\,dy = e^{-A} - e^{-B}
\]

and
\[
P_1(\Gamma_0) = \int_0^A \sqrt{\frac{2}{\pi}}\, e^{-y^2/2}\,dy + \int_B^\infty \sqrt{\frac{2}{\pi}}\, e^{-y^2/2}\,dy = 1 - 2Q(A) + 2Q(B),
\]
where $Q$ denotes the standard Gaussian tail function.
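As a quick numerical sanity check, here is a minimal sketch that evaluates $A$, $B$, and the Bayes risk from the closed forms above (the helper name Q is ours; it uses statistics.NormalDist for the Gaussian CDF):

```python
import math
from statistics import NormalDist

Q = lambda x: 1 - NormalDist().cdf(x)      # standard Gaussian tail Q(x)

s = math.sqrt(1 - math.log(math.pi / 2))   # half-width of the H1 region
A, B = 1 - s, 1 + s                        # decide H1 when A <= y <= B

P0_gamma1 = math.exp(-A) - math.exp(-B)    # P_0(Gamma_1) = e^{-A} - e^{-B}
P1_gamma0 = 1 - 2 * Q(A) + 2 * Q(B)        # P_1(Gamma_0)
bayes_risk = 0.5 * (P0_gamma1 + P1_gamma0)

print(A, B, bayes_risk)   # A ~= 0.259, B ~= 1.741, risk ~= 0.441
```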

(b) Minimax rule: the Bayes rule for a general prior $\pi_0$ decides $H_1$ when
\[
\frac{p_1(y)}{p_0(y)} = \sqrt{\frac{2}{\pi}}\, e^{-\frac{y^2}{2} + y} \ge \frac{\pi_0}{1 - \pi_0},
\]
i.e., when $(y - 1)^2 \le 1 - 2\log\bigl(\sqrt{\pi/2}\,\tfrac{\pi_0}{1 - \pi_0}\bigr)$. If $1 - 2\log\bigl(\sqrt{\pi/2}\,\tfrac{\pi_0}{1 - \pi_0}\bigr) \ge 0$, i.e., $\pi_0 \le \frac{1}{1 + \sqrt{\pi/(2e)}}$, then we will decide that $H_1$ is true when

\[
A'(\pi_0) = 1 - \sqrt{1 - 2\log\Bigl(\sqrt{\tfrac{\pi}{2}}\,\tfrac{\pi_0}{1 - \pi_0}\Bigr)} \;\le\; y \;\le\; 1 + \sqrt{1 - 2\log\Bigl(\sqrt{\tfrac{\pi}{2}}\,\tfrac{\pi_0}{1 - \pi_0}\Bigr)} = B'(\pi_0).
\]

Therefore, referring to the results from part (a), we can calculate that
\[
V(\pi_0) = \pi_0 \cdot \bigl(e^{-A'} - e^{-B'}\bigr) + (1 - \pi_0) \cdot \bigl(1 - 2Q(A') + 2Q(B')\bigr).
\]

If $\pi_0 > \frac{1}{1 + \sqrt{\pi/(2e)}}$, we will always decide that $H_0$ is true, and then $V(\pi_0) = 1 - \pi_0$.

$V(\pi_0)$ attains its maximum at the prior $\pi_0$ satisfying the equalizer condition $e^{-A'} - e^{-B'} = 1 - 2Q(A') + 2Q(B')$, i.e., $P_0(\Gamma_1) = P_1(\Gamma_0)$. Let $\pi_L$ denote this maximizing prior. Then the minimax rule is the $\pi_L$-Bayes rule above, and the minimax risk equals $e^{-A'(\pi_L)} - e^{-B'(\pi_L)}$.
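Since $\pi_L$ has no closed form, one way to locate it is bisection on the equalizer equation. A minimal sketch (the bracket $(0.50, 0.568)$ is our choice by inspection, picked so that the gap changes sign and $A'(\pi_0) > 0$ throughout):

```python
import math
from statistics import NormalDist

Q = lambda x: 1 - NormalDist().cdf(x)      # standard Gaussian tail

def region(pi0):
    # endpoints A'(pi0), B'(pi0) of the pi0-Bayes decision region for H1
    s = math.sqrt(1 - 2 * math.log(math.sqrt(math.pi / 2) * pi0 / (1 - pi0)))
    return 1 - s, 1 + s

def gap(pi0):
    # P_0(Gamma_1) - P_1(Gamma_0); the equalizing prior pi_L is its root
    A, B = region(pi0)
    return (math.exp(-A) - math.exp(-B)) - (1 - 2 * Q(A) + 2 * Q(B))

lo, hi = 0.50, 0.568                       # bracket with gap(lo) > 0 > gap(hi)
for _ in range(60):                        # plain bisection
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if gap(mid) > 0 else (lo, mid)

pi_L = 0.5 * (lo + hi)
A, B = region(pi_L)
print(pi_L, math.exp(-A) - math.exp(-B))   # pi_L ~= 0.53, risk ~= 0.45
```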

(c) NP rule: we will decide that $H_1$ is true when
\[
\frac{p_1(y)}{p_0(y)} = \sqrt{\frac{2}{\pi}}\, e^{-\frac{y^2}{2} + y} \ge \eta.
\]

Since the likelihood ratio is maximized at $y = 1$, where it equals $\sqrt{2e/\pi}$, we know that if $\eta > \sqrt{2e/\pi}$, then $P_F = 0$.

If $1 - 2\log\bigl(\sqrt{\pi/2}\,\eta\bigr) \ge 0$, i.e., $\eta \le \sqrt{2e/\pi}$, then we will decide that $H_1$ is true when
\[
A^* = 1 - \sqrt{1 - 2\log\Bigl(\sqrt{\tfrac{\pi}{2}}\,\eta\Bigr)} \;\le\; y \;\le\; 1 + \sqrt{1 - 2\log\Bigl(\sqrt{\tfrac{\pi}{2}}\,\eta\Bigr)} = B^*.
\]
Therefore, the false-alarm probability is

\[
P_F = \alpha = e^{-A^*} - e^{-B^*},
\]
which gives us
\[
\eta = \sqrt{\frac{2}{\pi}}\, \exp\!\left( \frac{1}{2}\left( 1 - \left[ \log\frac{e\alpha + \sqrt{e^2\alpha^2 + 4}}{2} \right]^2 \right) \right).
\]
And the detection probability is

\[
P_D = 2Q(A^*) - 2Q(B^*),
\]
where $A^* = 1 - \log\frac{e\alpha + \sqrt{e^2\alpha^2 + 4}}{2}$ and $B^* = 1 + \log\frac{e\alpha + \sqrt{e^2\alpha^2 + 4}}{2}$. (The quantity $1 - 2Q(A^*) + 2Q(B^*)$ is the miss probability $P_1(\Gamma_0)$, exactly as in part (a); the detection probability is its complement $P_1(\Gamma_1) = 2Q(A^*) - 2Q(B^*)$.)
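A hedged sketch of part (c) as a function of the level: for a given $\alpha$ (the value 0.1 below is purely illustrative), it recovers $\eta$, $A^*$, $B^*$, confirms $P_F = \alpha$, and evaluates $P_D$ (the function name np_rule is ours):

```python
import math
from statistics import NormalDist

Q = lambda x: 1 - NormalDist().cdf(x)   # standard Gaussian tail

def np_rule(alpha):
    # s = arcsinh(e*alpha/2) solves e^{-(1-s)} - e^{-(1+s)} = alpha
    s = math.log((math.e * alpha + math.sqrt(math.e**2 * alpha**2 + 4)) / 2)
    A, B = 1 - s, 1 + s                      # decide H1 when A <= y <= B
    eta = math.sqrt(2 / math.pi) * math.exp(0.5 * (1 - s * s))  # LRT threshold
    PF = math.exp(-A) - math.exp(-B)         # reproduces alpha
    PD = 2 * Q(A) - 2 * Q(B)                 # detection probability
    return eta, A, B, PF, PD

print(np_rule(0.1))   # eta ~= 1.30, PF ~= 0.100, PD ~= 0.131
```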

2. (Exercise 9 from Poor) Suppose that we have a real observation $Y$ and binary hypotheses described by the following pair of PDFs:
\[
p_0(y) = \begin{cases} 1 - |y|, & \text{if } |y| \le 1, \\ 0, & \text{if } |y| > 1, \end{cases}
\]
and
\[
p_1(y) = \begin{cases} \dfrac{2 - |y|}{4}, & \text{if } |y| \le 2, \\ 0, & \text{if } |y| > 2. \end{cases}
\]

(a) Assume that the costs are given by $C_{01} = 2C_{10} > 0$ and $C_{00} = C_{11} = 0$. Find a minimax test of $H_0$ versus $H_1$ and the corresponding minimax risk.

(b) Find a Neyman-Pearson test of $H_0$ versus $H_1$ with false-alarm probability $\alpha$ and the corresponding detection probability.

Ans: (a) Bayes rule for a prior $\pi_0$: First, when $1 < |y| \le 2$, we decide that $H_1$ is true directly from the given conditional densities, since $p_0(y) = 0$ there. When $|y| \le 1$, we decide $H_1$ if
\[
\frac{p_1(y)}{p_0(y)} = \frac{2 - |y|}{4(1 - |y|)} \ge \frac{\pi_0 C_{10}}{(1 - \pi_0) C_{01}} = \frac{\pi_0}{2(1 - \pi_0)}
\quad\Longleftrightarrow\quad
(3\pi_0 - 1)\,|y| \ge 2(2\pi_0 - 1).
\]
This splits into the following three cases.

Case 1, $\pi_0 \in (\tfrac{1}{2}, 1]$: decide $H_1$ when $|y| \ge \dfrac{4\pi_0 - 2}{3\pi_0 - 1} = a_1$. Then
\[
P_0(\Gamma_1) = 2\int_{a_1}^{1} (1 - y)\,dy = \left(\frac{1 - \pi_0}{3\pi_0 - 1}\right)^2,
\qquad
P_1(\Gamma_0) = 2\int_0^{a_1} \frac{2 - y}{4}\,dy = \frac{(4\pi_0 - 1)(2\pi_0 - 1)}{(3\pi_0 - 1)^2},
\]
so
\[
V(\pi_0) = \pi_0 C_{10} P_0(\Gamma_1) + (1 - \pi_0) C_{01} P_1(\Gamma_0)
= C_{10}\left( \pi_0 \left(\frac{1 - \pi_0}{3\pi_0 - 1}\right)^2 + 2(1 - \pi_0)\,\frac{(4\pi_0 - 1)(2\pi_0 - 1)}{(3\pi_0 - 1)^2} \right).
\]

Case 2, $\pi_0 \in [0, \tfrac{1}{3})$: decide $H_1$ when $|y| \le \dfrac{4\pi_0 - 2}{3\pi_0 - 1}$ (the inequality flips because $3\pi_0 - 1 < 0$). Here $\dfrac{4\pi_0 - 2}{3\pi_0 - 1} \ge 1$ for this range of $\pi_0$, which means that for all $|y| \le 1$ we decide $H_1$. Hence
\[
P_0(\Gamma_1) = 1, \qquad P_1(\Gamma_0) = 0, \qquad
V(\pi_0) = \pi_0 C_{10} P_0(\Gamma_1) + (1 - \pi_0) C_{01} P_1(\Gamma_0) = \pi_0 C_{10}.
\]

Case 3, $\pi_0 \in [\tfrac{1}{3}, \tfrac{1}{2}]$: here $3\pi_0 - 1 \ge 0$ while $2(2\pi_0 - 1) \le 0$, so the threshold inequality holds for every $y$ and we always decide $H_1$. Again
\[
P_0(\Gamma_1) = 1, \qquad P_1(\Gamma_0) = 0, \qquad V(\pi_0) = \pi_0 C_{10}.
\]

Finally, plotting $V(\pi_0)$, one can check that its maximum is achieved at
\[
\pi_0 = \pi_L = \frac{5 + \sqrt{10}}{15} \approx 0.544,
\]
which satisfies the equalizer condition $P_0(\Gamma_1) = 2P_1(\Gamma_0)$ (i.e., $C_{10} P_0(\Gamma_1) = C_{01} P_1(\Gamma_0)$) in Case 1: it is the root of $15\pi_0^2 - 10\pi_0 + 1 = 0$ lying in $(\tfrac{1}{2}, 1]$. Given this $\pi_0$, the minimax risk is
\[
C_{10}\left(\frac{\sqrt{10} - 1}{3}\right)^2 \approx 0.52\,C_{10}.
\]
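To corroborate the equalizing prior and the minimax risk, a minimal grid-search sketch over Case 1, normalizing $C_{10} = 1$ (the function name V mirrors the notation above):

```python
import math

def V(p):
    # Bayes risk of the p-Bayes rule in Case 1 (1/2 < p <= 1), with C10 = 1
    P0 = ((1 - p) / (3 * p - 1)) ** 2                   # P_0(Gamma_1)
    P1 = (2 * p - 1) * (4 * p - 1) / (3 * p - 1) ** 2   # P_1(Gamma_0)
    return p * P0 + 2 * (1 - p) * P1                    # C01 = 2 C10 = 2

best = max(V(0.5 + 0.5 * k / 10**5) for k in range(1, 10**5))
pi_L = (5 + math.sqrt(10)) / 15
print(best, V(pi_L), ((math.sqrt(10) - 1) / 3) ** 2)    # all ~= 0.5195
```

For $\pi_0 \le \tfrac{1}{2}$ we have $V(\pi_0) = \pi_0 C_{10} \le \tfrac{1}{2} C_{10}$, which is below the Case 1 maximum, so the latter is indeed the global maximum.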


(b) Let us define $\tau \ge 0$ as the threshold such that we decide $H_0$ ($H_1$) when $|y| < \tau$ ($|y| \ge \tau$); this is a valid likelihood-ratio test since $p_1(y)/p_0(y)$ is nondecreasing in $|y|$. Then the false-alarm probability is $P_F = (1 - \tau)^2 = \alpha$, which gives us $\tau = 1 - \sqrt{\alpha}$, and the corresponding detection probability is
\[
P_D = \frac{(2 - \tau)^2}{4} = \frac{(1 + \sqrt{\alpha})^2}{4}.
\]
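The closed forms are simple enough to check in a few lines; a minimal sketch (the level $\alpha = 0.25$ is just an example, and the function name np_test is ours):

```python
import math

def np_test(alpha):
    tau = 1 - math.sqrt(alpha)     # decide H1 when |y| >= tau
    PF = (1 - tau) ** 2            # equals alpha by construction
    PD = (2 - tau) ** 2 / 4        # = (1 + sqrt(alpha))^2 / 4
    return tau, PF, PD

print(np_test(0.25))   # tau = 0.5, PF = 0.25, PD = 0.5625
```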

3. (Exercise 12 from Poor) Consider a simple binary hypothesis testing problem. For a decision rule $\delta$, denote the false-alarm and miss probabilities by $P_F(\delta)$ and $P_M(\delta)$, respectively. Consider the performance measure

\[
\rho(\delta) = (P_F(\delta))^2 + (P_M(\delta))^2,
\]
and let $\delta_o$ denote a decision rule minimizing $\rho(\delta)$ over all randomized decision rules $\delta$.

(a) Show that $\delta_o$ must be a likelihood-ratio test.

(b) For $\pi_0 \in [0, 1]$, define the function $V$ by
\[
V(\pi_0) = \min_{\delta}\,\bigl[\pi_0 P_F(\delta) + (1 - \pi_0) P_M(\delta)\bigr].
\]
Suppose that $V(\pi_0)$ achieves its maximum on $[0, 1]$ at the point $\pi_0 = 0.5$. Show that $\delta_o$ is a Bayes rule for prior $\pi_0 = 0.5$.

Ans: (a) The performance measure can be rewritten as
\[
\rho(\delta) = (P_F(\delta))^2 + (1 - P_D(\delta))^2.
\]
Suppose that the minimizing rule $\delta_o$ is not a likelihood-ratio test, and let $P_F(\delta_o) = \alpha$. By the Neyman-Pearson lemma, there is a likelihood-ratio test $\tilde{\delta}$ with $P_F(\tilde{\delta}) = \alpha$ and $P_D(\tilde{\delta}) \ge P_D(\delta_o)$, and the inequality is strict unless $\delta_o$ is itself equivalent to an NP test. A strict inequality gives $\rho(\tilde{\delta}) < \rho(\delta_o)$, contradicting the assumption that $\delta_o$ minimizes $\rho(\delta)$. Therefore $\delta_o$ is a likelihood-ratio test.

(b) (with uniform costs and differentiable $V(\pi_0)$) Suppose that $\delta_B$ is a Bayes rule for prior $\pi_0 = 0.5$. We will show that $\delta_B$ minimizes $\rho(\delta)$. Since $\delta_B$ is a Bayes rule for prior $\pi_0 = 0.5$,
\[
\delta_B = \arg\min_{\delta}\, \tfrac{1}{2} P_F(\delta) + \tfrac{1}{2} P_M(\delta) = \arg\min_{\delta}\, \bigl(P_F(\delta) + P_M(\delta)\bigr)^2.
\]
From minimax test theory, because $V$ attains its maximum at $\pi_0 = 0.5$, this Bayes rule is an equalizer rule: $P_F(\delta_B) = P_M(\delta_B)$. Using the identity $(a + b)^2 + (a - b)^2 = 2(a^2 + b^2)$, for any rule $\delta$,
\[
2\rho(\delta) = \bigl(P_F(\delta) + P_M(\delta)\bigr)^2 + \bigl(P_F(\delta) - P_M(\delta)\bigr)^2 \ge \bigl(P_F(\delta_B) + P_M(\delta_B)\bigr)^2 + 0 = 2\rho(\delta_B).
\]
Hence $\delta_B = \arg\min_{\delta} \rho(\delta)$, i.e., $\delta_o$ is a Bayes rule for prior $\pi_0 = 0.5$.

4. (Exercise 2.1 from Levy) In the binary communication system shown in Figure 1, the message values $X = 0$ and $X = 1$ occur with a priori probabilities $1/4$ and $3/4$, respectively. The random variable $V$ takes the values $-1$, $0$, and $1$ with probabilities $1/8$, $3/4$, and $1/8$, respectively. The received message is $Y = X + V$.

(a) Given the received signal $Y$, the receiver must decide whether the transmitted message was 0 or 1. The estimated message $\hat{X}$ takes the values 0 or 1. Find the receiver that achieves the maximum probability of a correct decision.


Figure 1: Binary communication system model.

(b) Find $P[\hat{X} = X]$ for the receiver of part (a).

Ans: On the next pages.
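Although the written answer appears on the following pages, the MAP receiver is easy to cross-check by enumeration, since $Y = X + V$ can only take the values $-1, 0, 1, 2$. A minimal sketch (every probability comes straight from the problem statement; the variable names are ours):

```python
from fractions import Fraction as F

prior = {0: F(1, 4), 1: F(3, 4)}               # P(X = x)
noise = {-1: F(1, 8), 0: F(3, 4), 1: F(1, 8)}  # P(V = v)

joint = {}                                     # P(X = x, Y = x + v)
for x, px in prior.items():
    for v, pv in noise.items():
        joint[(x, x + v)] = joint.get((x, x + v), F(0)) + px * pv

ys = {y for (_, y) in joint}
map_rule = {y: max((0, 1), key=lambda x: joint.get((x, y), F(0))) for y in ys}
p_correct = sum(joint[(map_rule[y], y)] for y in ys)
print(map_rule, p_correct)   # MAP decisions and P(X_hat = X) (= 7/8 here)
```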

5. (Exercise 2.2 from Levy) The system shown in Figure 2 is an idealized model for binary communication through a fading channel. The message $X$ that we want to transmit takes one of two values, 0 or 1, with a priori probabilities $1/3$ and $2/3$, respectively. The channel fading $A$ is a $N(1, 3)$ Gaussian random variable. The channel noise $V$ is a $N(0, 1)$ Gaussian random variable.


Figure 2: Idealized model of a fading communication channel.

(a) Find the minimum probability of error decision rule. Simplify your answer as much as possible.

(b) Sketch the decision regions on the $y$-axis.

Ans: On the next pages.
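The written answer again appears on the following pages. As a hedged numeric sketch, assume (as Figure 2 suggests, though the text here does not state it) that the channel is $Y = AX + V$; then $Y \sim N(0, 1)$ under $X = 0$ and $Y \sim N(1, 4)$ under $X = 1$, since the fading and noise variances add. Scanning the prior-weighted densities locates where the MAP decision flips (helper names are ours):

```python
import math

def g0(y):
    # prior 1/3 times the N(0,1) density of Y given X = 0
    return (1 / 3) * math.exp(-y * y / 2) / math.sqrt(2 * math.pi)

def g1(y):
    # prior 2/3 times the N(1,4) density of Y given X = 1 (assumes Y = AX + V)
    return (2 / 3) * math.exp(-((y - 1) ** 2) / 8) / math.sqrt(8 * math.pi)

prev = None
for k in range(-8000, 8001):          # scan y over [-8, 8] in steps of 0.001
    y = k / 1000
    decide = 1 if g1(y) >= g0(y) else 0
    if prev is not None and decide != prev:
        print(f"decision flips near y = {y}")
    prev = decide                     # under these assumptions the flips
                                      # appear near y = -1 and y = 1/3
```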
