
Assignment 3
Reza Oftadeh

1. Given that p0(y) and p1(y) must each integrate to one, the constants c0 and c1 can be found as c0 = 3/2 and c1 = 1/9. Hence, the prior ratio is η = 1 and the likelihood ratio L(y) is
$$L(y) = \frac{p_1(y)}{p_0(y)} = \begin{cases} \dfrac{2(3-|y|)}{27y^2} & |y| \le 1 \\[1mm] \infty & 1 < |y| \le 3 \end{cases},$$

which, by using L(y) ≷ η, gives the Bayes rule ĤB(y) and the minimum risk rB as
$$\hat{H}_B(y) = \begin{cases} 1 & y \in \Lambda_1 = \{y : |y| \le c \ \vee\ 1 < |y| \le 3\} \\ 0 & y \in \Lambda_0 = \{y : c < |y| \le 1\} \end{cases}, \qquad c = \frac{1}{27}\left(\sqrt{163} - 1\right) \simeq 0.43582,$$
$$r_B = \frac{1}{2}\Big(\Pr(y \in \Lambda_1 \mid H = 0) + \Pr(y \in \Lambda_0 \mid H = 1)\Big) = \frac{11423 - 326\sqrt{163}}{39366} \simeq 0.184446.$$
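A quick numerical sketch of my own (not part of the original derivation) verifying the normalization constants and the value of rB:

% Numerical check of problem 1: normalization, threshold c, and minimum risk r_B.
p0 = @(y) (3/2)*y.^2 .* (abs(y) <= 1);
p1 = @(y) (1/9)*(3 - abs(y)) .* (abs(y) <= 3);
fprintf('int p0 = %.4f, int p1 = %.4f\n', integral(p0, -1, 1), integral(p1, -3, 3));
c  = (sqrt(163) - 1)/27;                          % boundary where L(y) = 1
PF = integral(p0, -c, c);                         % Pr(y in Lambda_1 | H0) = c^3
PM = integral(p1, -1, -c) + integral(p1, c, 1);   % Pr(y in Lambda_0 | H1)
rB = (PF + PM)/2                                  % approximately 0.184446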

3. In the first problem we derived the likelihood ratio L(y) as
$$L(y) = \frac{p_1(y)}{p_0(y)} = \begin{cases} \dfrac{2(3-|y|)}{27y^2} & |y| \le 1 \\[1mm] \infty & 1 < |y| \le 3 \end{cases},$$

which, under the assumption of uniform costs and by using L(y) ≷ η = π0/(1 − π0), gives the Bayes rule ĤB(y, π0) as
$$\hat{H}_B(y, \pi_0) = \begin{cases} 1 & y \in \Lambda_1(\pi_0) = \{y : |y| \le c(\pi_0) \ \vee\ 1 < |y| \le 3\} \\ 0 & y \in \Lambda_0(\pi_0) = \{y : c(\pi_0) < |y| \le 1\} \end{cases},$$
in which c(π0) is
$$c(\pi_0) = \begin{cases} 1 & 0 \le \pi_0 \le \frac{4}{31} \\[2mm] \dfrac{\pi_0 - 1 + \sqrt{-161\pi_0^2 + 160\pi_0 + 1}}{27\pi_0} & \frac{4}{31} < \pi_0 \le 1 \end{cases}.$$

Therefore, substituting c(π0) into rB(π0) = π0 c(π0)³ + (1 − π0)[5/9 − (2/3)c(π0) + c(π0)²/9], the minimum risk rB(π0) is
$$r_B(\pi_0) = \begin{cases} \pi_0 & 0 \le \pi_0 < \frac{4}{31} \\[2mm] \dfrac{(\pi_0 - 1)\left[(322\pi_0 + 2)\sqrt{-(\pi_0 - 1)(161\pi_0 + 1)} - 10451\pi_0^2 - 482\pi_0 - 2\right]}{19683\,\pi_0^2} & \frac{4}{31} \le \pi_0 \le 1 \end{cases}.$$


For the minimax rule we have to find the least favorable prior πl for which rB(πl) is maximum. The above function is continuous, and a simple plot shows that it attains its maximum on the interval 4/31 ≤ π0 ≤ 1.


Figure 1: The concave plot of the Bayes minimum risk rB (π0 ).

Hence, the least favorable prior πl and the minimax risk are

πl = 0.341424, rM = rB (πl ) = 0.202424,

and the minimax rule is
$$\hat{H}_M(y) = \begin{cases} 1 & y \in \Lambda_1 = \{y : |y| \le c \ \vee\ 1 < |y| \le 3\} \\ 0 & y \in \Lambda_0 = \{y : c < |y| \le 1\} \end{cases}, \qquad c = c(\pi_l) = 0.587156.$$
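As a short sketch of my own, the plot in Figure 1 and the minimax quantities above can be reproduced by maximizing rB(π0) numerically:

% Maximize the Bayes minimum risk r_B(pi0) over the prior to get the minimax point.
c  = @(p) min(1, (p - 1 + sqrt(-161*p.^2 + 160*p + 1)) ./ (27*p));
rB = @(p) p .* c(p).^3 + (1 - p) .* (5/9 - (2/3)*c(p) + c(p).^2/9);
p  = linspace(0.01, 0.99, 500);
plot(p, rB(p)); xlabel('\pi_0'); ylabel('r_B(\pi_0)');    % reproduces Figure 1
pil = fminbnd(@(p) -rB(p), 4/31, 1);                      % least favorable prior
fprintf('pi_l = %.6f, r_M = %.6f, c = %.6f\n', pil, rB(pil), c(pil));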

4. (a) The minimum Bayes risk V(π0) is a concave function; the least favorable prior πl is the prior that maximizes it, and the minimax risk is its value V(πl). Since V(π0) = 2 − π0² is decreasing on [0, 1], it attains its maximum at πl = 0, and the minimax risk is V(πl) = 2.
(b) Under the prior πl = 0 the Bayes rule always decides 1, since the risk reduces to C11 PD + C01(1 − PD) = 4 − 2PD, which is minimized by PD = 1. Hence Λ1 = ℝ and Λ0 = ∅, and the conditional probability of error given H0 is
$$\Pr(y \in \Lambda_1 \mid H_0) = \Pr(y \in \mathbb{R} \mid H_0) = 1.$$

5. We have Hi : Y ∼ N((i − 3)μ, σ²), i = 1, . . . , 5. Under uniform costs, the Bayes risk, as a function of the priors π = (π1, . . . , π5), becomes
$$r(\delta, \pi) = \sum_{i=1}^{5}\sum_{j=1}^{5} c_{ij}\,\pi_j \Pr(\delta(y) = i \mid H_j) = \sum_{i=1}^{5}\sum_{\substack{j=1 \\ j \ne i}}^{5} \pi_j \Pr(\delta(y) = i \mid H_j) = \sum_{i=1}^{5} \pi_i\big(1 - \Pr(\delta(y, \pi) = i \mid H_i)\big).$$


The Bayes decision rule δ(y, π) minimizes the above risk and can be written as
$$\begin{aligned} \delta(y, \pi) &= \arg\max_{1 \le i \le 5} \left\{\ln f(y \mid H_i) + \ln \pi_i\right\} \\ &= \arg\max_{1 \le i \le 5} \left\{-\frac{1}{2\sigma^2}\big(y - (i-3)\mu\big)^2 - \frac{1}{2}\ln 2\pi - \frac{1}{2}\ln \sigma^2 + \ln \pi_i\right\} \\ &= \arg\max_{1 \le i \le 5} \left\{\ln \pi_i - \frac{1}{2\sigma^2}\big(y - (i-3)\mu\big)^2\right\}. \end{aligned}$$

The above maximization is over the second-order polynomials si(y) = ln πi − (y − (i − 3)μ)²/(2σ²). The crossing point of si and sj is where the decision changes from i to j, but there are many subtleties to account for, since some priors can cause some regions to become empty. In general, si(y) = sj(y) results in
$$y_{ij} = \frac{\sigma^2}{\mu(i - j)} \ln\frac{\pi_j}{\pi_i} + \frac{1}{2}(i + j - 6)\mu,$$
so that the rule simplifies to
$$\delta(y, \pi) = \begin{cases} 1 & y \in \big(-\infty,\ \min_{j>1} y_{1j}\big] \\[1mm] 2 & y \in \big(\max_{j<2} y_{2j},\ \min_{j>2} y_{2j}\big] \\[1mm] 3 & y \in \big(\max_{j<3} y_{3j},\ \min_{j>3} y_{3j}\big] \\[1mm] 4 & y \in \big(\max_{j<4} y_{4j},\ \min_{j>4} y_{4j}\big] \\[1mm] 5 & y \in \big(\max_{j<5} y_{5j},\ \infty\big) \end{cases}.$$

By setting d = μ/σ, let
$$y'_{ij}(d, \pi) = \frac{y_{ij} - (i - 3)\mu}{\sigma} = \frac{1}{d(j - i)} \ln\frac{\pi_i}{\pi_j} + \frac{j - i}{2}\,d,$$

we have
$$\Pr(\delta(y, \pi) = i \mid H_i) = \begin{cases} \Phi\big(\min_{j>1} y'_{1j}(d, \pi)\big) & i = 1 \\[1mm] \max\Big(\Phi\big(\min_{j>i} y'_{ij}(d, \pi)\big) - \Phi\big(\max_{j<i} y'_{ij}(d, \pi)\big),\ 0\Big) & i \in \{2, 3, 4\} \\[1mm] 1 - \Phi\big(\max_{j<5} y'_{5j}(d, \pi)\big) & i = 5 \end{cases}.$$


Hence, the average Bayes risk of the Bayes rule is r(d, π) = Σ_{i=1}^{5} πi (1 − Pr(δ(y, π) = i | Hi)), which is a function of d and the priors π. For the minimax rule we have to find the least favorable prior πl that maximizes r(d, π); as an equalizer, it makes the conditional error R = 1 − Pr(δ(y, πl) = i | Hi) the same for all i. The minimax rule is then δ(y, πl), with minimax error r(d, πl), which is a function of d only. As requested, I wrote MATLAB code to calculate πl for d = 3, shown below:


function v = prob_error(pi)
    % Negative of the Bayes risk r(d, pi), so that fmincon (a minimizer) maximizes it.
    % Note: the input argument pi shadows the built-in constant inside this function.
    d = 5;
    yp = zeros(5, 5);                  % normalized decision boundaries y'_ij
    for i = 1:5
        for j = 1:5
            if j == i
                yp(i, j) = 0;
            else
                yp(i, j) = 1/(d*(j - i))*log(pi(i)/pi(j)) + (j - i)*d/2;
            end
        end
    end
    pr(1) = phi(min(yp(1, 2:5)));
    pr(2) = phi(min(yp(2, 3:5))) - phi(yp(2, 1));
    pr(3) = phi(min(yp(3, 4:5))) - phi(max(yp(3, 1:2)));
    pr(4) = phi(yp(4, 5))        - phi(max(yp(4, 1:3)));
    pr(5) = 1 - phi(max(yp(5, 1:4)));
    pr(pr < 0) = 0;                    % empty decision regions contribute zero
    v = -sum(pi.*(1 - pr'));
end

function pr = phi(x)
    pr = (1/2)*erfc(-x/sqrt(2));       % standard normal CDF
end

% Invoked separately (e.g., from the command window), with the priors
% constrained to be nonnegative and to sum to one:
pil = fmincon(@prob_error, [1/5; 1/15; 1/15; 1/15; 1/5], ...
    [], [], [1, 1, 1, 1, 1], 1, [0; 0; 0; 0; 0], [1; 1; 1; 1; 1]);
The optimization yields the following least favorable prior πl and the corresponding equalized conditional error:

πl = [0.0916, 0.2473, 0.3222, 0.2473, 0.0916] =⇒ R = 1 − Pr(δ(y, πl) = i | Hi) = 0.0107 for all i.
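As an extra check of my own (a sketch, using the same d as in the listing above and only the adjacent decision boundaries, which are the binding ones for this prior), the conditional error probabilities can be evaluated directly:

% Equalizer check: conditional error probabilities under the reported prior.
d  = 5;  pil = [0.0916; 0.2473; 0.3222; 0.2473; 0.0916];
Q  = @(x) (1/2)*erfc(x/sqrt(2));                     % Gaussian tail probability
e1 = Q(d/2 + (1/d)*log(pil(1)/pil(2)));                                       % under H1 (and H5)
e2 = Q(d/2 + (1/d)*log(pil(2)/pil(1))) + Q(d/2 + (1/d)*log(pil(2)/pil(3)));   % under H2 (and H4)
e3 = 2*Q(d/2 + (1/d)*log(pil(3)/pil(2)));                                     % under H3
[e1, e2, e3]                                         % all approximately 0.0107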

For equal priors, πi = 1/5 for all i, and hence the Bayes rule becomes
$$\delta(y, \pi) = \arg\min_{1 \le i \le 5}\,\big(y - (i - 3)\mu\big)^2 = \begin{cases} 1 & y \in \big(-\infty, -\tfrac{3}{2}\mu\big] \\[1mm] 2 & y \in \big(-\tfrac{3}{2}\mu, -\tfrac{1}{2}\mu\big] \\[1mm] 3 & y \in \big(-\tfrac{1}{2}\mu, \tfrac{1}{2}\mu\big] \\[1mm] 4 & y \in \big(\tfrac{1}{2}\mu, \tfrac{3}{2}\mu\big] \\[1mm] 5 & y \in \big(\tfrac{3}{2}\mu, \infty\big) \end{cases}.$$
Its worst-case conditional error probability, attained under any of the middle hypotheses H2, H3, H4, is 2Q(d/2) = 2(1 − Φ(d/2)), which is larger than the equalized minimax error found above.
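A small numerical comparison of my own (a sketch, using the same d as in the listing above):

% Worst-case error of the equal-priors rule vs. the equalized minimax error.
d = 5;
Q = @(x) (1/2)*erfc(x/sqrt(2));
worst_equal_priors = 2*Q(d/2)      % error under H2, H3, or H4; about 0.0124
minimax_error      = 0.0107       % equalized value found by fmincon above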


6. (a) For a rule operating on the ROC, PD = PF^{1/4}, the Bayes risk r(δ(y)) is
$$r(\delta(y)) = \pi_0\big(C_{10}P_F + C_{00}(1 - P_F)\big) + \pi_1\big(C_{11}P_D + C_{01}(1 - P_D)\big) = 2(\pi_0 - 1)P_F^{1/4} + \pi_0 P_F + 2.$$
For a given π0, the Bayes rule minimizes the above risk through the operating point PF(δ(y)) set by δ(y). The minimizing value PF = PF,m, subject to 0 ≤ PF,m ≤ 1, satisfies
$$\frac{\partial r(\delta(y))}{\partial P_F} = 0 \ \Longrightarrow\ P_{F,m} = \left(\frac{1 - \pi_0}{2\pi_0}\right)^{4/3}.$$
For π0 = 1/3 this expression gives PF,m = 1, and it exceeds 1 for π0 < 1/3. Hence, for π0 ≤ 1/3 the minimum Bayes risk is achieved by setting
$$P_{F,m} = \Pr(\delta(y) = 1 \mid H_0) = 1 \ \Longrightarrow\ \delta(y) = 1.$$
(b) From the previous part we know that PF,m = ((1 − π0)/(2π0))^{4/3} minimizes the Bayes risk. For π0 = 1/2 we have PF,m = 2^{−4/3}, and the minimum Bayes risk becomes
$$r_B = 2(\pi_0 - 1)P_{F,m}^{1/4} + \pi_0 P_{F,m} + 2 = 2 - \frac{3}{\sqrt[3]{128}} \simeq 1.40472.$$
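A one-line numerical check of my own for this value:

% Minimize the Bayes risk of part (b) over the ROC operating point P_F.
pi0 = 1/2;
r   = @(pf) 2*(pi0 - 1)*pf.^(1/4) + pi0*pf + 2;
[pfm, rB] = fminbnd(r, 0, 1)       % pfm = 2^(-4/3) ~ 0.397, rB ~ 1.40472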
Optional Problems:
1. For a rule with operating point (PF, PD), the Bayes risk is
$$r = \pi_0\big(C_{10}P_F + C_{00}(1 - P_F)\big) + \pi_1\big(C_{11}P_D + C_{01}(1 - P_D)\big) = \pi_0(P_F + 1) + (1 - \pi_0)(4 - 2P_D) = \pi_0(P_F + 2P_D - 3) + (4 - 2P_D).$$
The minimum Bayes risk V(π0) is the lower envelope of these functions, which are affine in π0, so the operating point of the optimal detector at prior π0 lies on the line tangent to V at π0; matching slope and intercept gives
$$P_F + 2P_D - 3 = V'(\pi_0) = -2\pi_0, \qquad 4 - 2P_D = V(\pi_0) - \pi_0 V'(\pi_0) = 2 + \pi_0^2,$$
so PD = 1 − π0²/2 and PF = (1 − π0)². Eliminating π0 = 1 − √PF, the ROC of the optimal detectors is
$$P_D(P_F) = \frac{1}{2} + \sqrt{P_F} - \frac{P_F}{2}, \qquad 0 \le P_F \le 1,$$
a concave curve rising from (0, 1/2) to (1, 1).
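A numerical sanity check of my own: minimizing the Bayes risk over operating points on this ROC should reproduce V(π0) = 2 − π0² for every prior.

% Check that the derived ROC is consistent with the given V(pi0) = 2 - pi0^2.
pf = linspace(0, 1, 1e5);
pd = 1/2 + sqrt(pf) - pf/2;                       % derived ROC
for pi0 = [0.2, 0.5, 0.8]
    r = pi0*(pf + 1) + (1 - pi0)*(4 - 2*pd);      % Bayes risk of each point
    fprintf('pi0 = %.1f: min risk = %.4f, V(pi0) = %.4f\n', pi0, min(r), 2 - pi0^2);
end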


Figure 2: ROC of the optimal detectors.

Assignment 3
Problems:

1. Find the Bayes rule and the minimum Bayes risk for a binary hypothesis testing problem
with equal priors and uniform costs, with
$$p_0(y) = \begin{cases} c_0 y^2 & |y| \le 1 \\ 0 & \text{otherwise} \end{cases}, \qquad p_1(y) = \begin{cases} c_1(3 - |y|) & |y| \le 3 \\ 0 & \text{otherwise} \end{cases},$$

where c0 and c1 are constants.

2. A communication system uses binary signaling by sending one of the following two signals: s1(t) to send one, and s0(t) to send zero, where s1(t) = −As(t), s0(t) = As(t), A > 0, and
$$s(t) = \begin{cases} 1 & 0 < t < 1 \\ 0 & \text{otherwise} \end{cases}.$$
The transmitted signal is corrupted by zero-mean white Gaussian noise n(t) with autocorrelation function Rn(τ) = σ²δ(τ), so that the received signal is given by
$$y(t) = \begin{cases} -As(t) + n(t) & 1 \text{ sent} \\ As(t) + n(t) & 0 \text{ sent} \end{cases}.$$

The receiver computes the decision statistic
$$Z = \int_{-\infty}^{\infty} y(t)\,s(t)\,dt.$$

The receiver would like to decide whether 0 or 1 was sent, or (if it is not confident about a
0/1 decision) erase the signal. We would like to obtain a Bayesian decision rule for doing this
as follows.
Let Θ = {0, 1} and A = {0, 1, e}. The observation is Z. Assume the cost structure

$$C(a, \theta) = \begin{cases} 1 & a \ne \theta,\ a \in \{0, 1\} \\ 0 & a = \theta \\ c & a = e \end{cases},$$
where θ ∈ {0, 1} and where c ∈ (0, 1) is the cost of an erasure.

(a) Find the Bayes rule for equal priors. Simplify the form of the rule as much as possible
and specify its dependence on the problem parameters c, A, and σ².
(b) For d = A/σ = 5, find c and a corresponding Bayesian decision rule δA so that the probability of erasure p1 is twice the probability of error p2. Compare with the values of p1 and p2 for a Bayesian decision rule δB corresponding to c = 1/2.

3. Find the minimax rule and minimax risk for the binary hypothesis testing problem with
$$p_0(y) = \begin{cases} c_0 y^2 & |y| \le 1 \\ 0 & \text{otherwise} \end{cases}, \qquad p_1(y) = \begin{cases} c_1(3 - |y|) & |y| \le 3 \\ 0 & \text{otherwise} \end{cases},$$

where c0 and c1 are constants.

4. The minimum Bayes risk for a binary hypothesis testing problem with costs C00 = 1, C11 =
2, C10 = 2, C01 = 4 is given by
V(π0) = 2 − π0²
where π0 is the prior probability of hypothesis H0 .

(a) Find the minimax risk and the least favorable prior.
(b) What is the conditional probability of error given H0 for the minimax rule in (a)?

5. A communication system uses multi-amplitude signaling to send one of 5 signals. The decision
statistic when i (i = 1, . . . , 5) is sent is given by

Y = (i − 3)µ + N, i = 1, . . . , 5

where N is Gaussian noise with mean zero and variance σ². For uniform costs, specify in
its simplest form a Bayes rule which is an equalizer, and hence is minimax. Show that the
minimax error probability depends only on d = µ/σ, and give a numerical value for the
minimax error probability (use a computer program if needed) for d = 3. You may use a
computer program if you wish, but you may save time by deriving an approximation based on
large d instead. Compare with the worst-case error probability using a Bayes rule for uniform
costs and equal priors.

6. Consider a binary hypothesis testing problem in which the ROC for the Neyman-Pearson rule
is given by PD = (PF)^{1/4}. For the same hypotheses, consider a Bayesian problem with cost
structure C10 = 3, C01 = 2, C00 = 2, C11 = 0.

(a) Show that the decision rule δ(y) = 1 is optimal for π0 ≤ 1/3.
(b) Find the minimum Bayes risk for π0 = 1/2.

Optional Problems:

1. The minimum Bayes risk for a binary hypothesis testing problem with costs C00 = 1, C11 =
2, C10 = 2, C01 = 4 is given by
V(π0) = 2 − π0²
where π0 is the prior probability of hypothesis H0 . Find the receiver operating characteristic
of optimal detectors.
