1. Given that ∫_{−∞}^{∞} p1(y) dy = ∫_{−∞}^{∞} p0(y) dy = 1, the constants c0 and c1 can be found as c0 = 3/2, c1 = 1/9. Hence, the prior ratio is η = 1, and the likelihood ratio L(y) is

L(y) = p1(y)/p0(y) = { 2(3 − |y|)/(27y²),  |y| ≤ 1;  ∞,  1 < |y| ≤ 3 },
which, by using the test L(y) ≷ η, gives the Bayes rule ĤB(y) and the minimum risk rB as

ĤB(y) = { 1,  y ∈ Λ1 = {y : |y| ≤ c or 1 < |y| ≤ 3};  0,  y ∈ Λ0 = {y : c < |y| ≤ 1} },   c = (√163 − 1)/27 ≈ 0.43582,

rB = (1/2)(Pr(y ∈ Λ1 | H = 0) + Pr(y ∈ Λ0 | H = 1)) = (11423 − 326√163)/39366 ≈ 0.184446.
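These values can be sanity-checked numerically. The sketch below assumes the region probabilities Pr(y ∈ Λ1 | H0) = c³ and Pr(y ∈ Λ0 | H1) = (5 − 6c + c²)/9, which follow by integrating p0 and p1 over the regions above:

```python
import math

# Densities: p0(y) = (3/2) y^2 on |y| <= 1, p1(y) = (1/9)(3 - |y|) on |y| <= 3.
# The threshold c solves L(c) = 1, i.e. 2(3 - c) = 27 c^2.
c = (math.sqrt(163) - 1) / 27
assert abs(27 * c**2 + 2 * c - 6) < 1e-12   # root check

# Conditional error probabilities (closed-form integrals of p0 and p1):
# Pr(y in Lambda1 | H0) = integral of (3/2) y^2 over |y| <= c = c^3
# Pr(y in Lambda0 | H1) = (2/9) * integral_c^1 (3 - y) dy = (5 - 6c + c^2)/9
r = 0.5 * (c**3 + (5 - 6 * c + c**2) / 9)
print(round(c, 5), round(r, 6))   # 0.43582 0.184446
```

The averaged value agrees with the closed form (11423 − 326√163)/39366 quoted above.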
3. With the same densities and likelihood ratio as in Problem 1, under the assumption of uniform costs, the test L(y) ≷ η = π0/(1 − π0) gives the Bayes rule ĤB(y, π0) as
ĤB(y, π0) = { 1,  y ∈ Λ1(π0) = {y : |y| ≤ c(π0) or 1 < |y| ≤ 3};  0,  y ∈ Λ0(π0) = {y : c(π0) < |y| ≤ 1} },
in which c(π0) is

c(π0) = { 1,  0 ≤ π0 ≤ 4/31;  (π0 − 1 + √(−161π0² + 160π0 + 1))/(27π0),  4/31 < π0 ≤ 1 }.
The corresponding minimum Bayes risk is

rB(π0) = { π0,  0 ≤ π0 ≤ 4/31;  (π0 − 1)(2(161π0 + 1)√(−(π0 − 1)(161π0 + 1)) − 10451π0² − 482π0 − 2)/(19683π0²),  4/31 < π0 ≤ 1 }.
Reza Oftadeh
For the minimax rule, we have to find the least favorable prior πl for which rB(πl) is maximal. The function above is continuous, and a simple plot shows that it attains its maximum on 4/31 ≤ π0 ≤ 1.
[Figure: plot of rB(π0) versus π0, with vertical axis from 0.05 to 0.20.]
Hence, the least favorable prior and the minimax risk are πl ≈ 0.3414 and rB(πl) ≈ 0.2024, the point at which the two conditional error probabilities are equalized.
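Carrying out the maximization numerically (a rough sketch using the closed-form c(π0) above, together with the region probabilities Pr(Λ1 | H0) = c³ and Pr(Λ0 | H1) = (5 − 6c + c²)/9 obtained by integrating the densities):

```python
import math

def c_of(pi0):
    """Threshold c(pi0) from the piecewise expression above."""
    if pi0 <= 4 / 31:
        return 1.0
    disc = -161 * pi0**2 + 160 * pi0 + 1    # = -(pi0 - 1)(161 pi0 + 1)
    return (pi0 - 1 + math.sqrt(disc)) / (27 * pi0)

def bayes_risk(pi0):
    """pi0 * Pr(y in Lambda1 | H0) + (1 - pi0) * Pr(y in Lambda0 | H1)."""
    c = c_of(pi0)
    return pi0 * c**3 + (1 - pi0) * (5 - 6 * c + c**2) / 9

# Locate the least favorable prior by grid search on [4/31, 1).
grid = [4 / 31 + k * 1e-5 for k in range(87096)]
pi_l = max(grid, key=bayes_risk)
print(pi_l, bayes_risk(pi_l))   # ~ 0.3414 and ~ 0.2024
```

As a consistency check, bayes_risk(0.5) reproduces the equal-priors value 0.184446 from Problem 1.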
4. (a) The minimum Bayes risk V(π0) is a concave function; the least favorable prior πl maximizes it, and the minimax risk is its value V(πl). V(π0) = 2 − π0² has its maximum at πl = 0, and the minimax risk is V(πl) = 2.
(b) Under the prior πl = 0, the Bayes rule is simply Ĥ(y) = 1 for every y ∈ R, and hence Λ1 = R and Λ0 = ∅. Therefore, the conditional probability of error given H0 is Pr(y ∈ Λ1 | H0) = 1.
5. We have Hi : Y ∼ N((i − 3)µ, σ²). Under uniform costs, the Bayes risk becomes, as a function of the priors π = (π1, …, π5),

r(δ, π) = Σ_{i=1}^{5} Σ_{j=1}^{5} Cij πj Pr(δ(y) = i | Hj) = Σ_{i=1}^{5} Σ_{j≠i} πj Pr(δ(y) = i | Hj) = Σ_{i=1}^{5} πi (1 − Pr(δ(y) = i | Hi)).
The Bayes decision rule δ(y, π) minimizes the above risk and can be written as

δ(y, π) = arg max_{1 ≤ i ≤ 5} si(y),

where the maximization is over the second-order polynomials si(y) = ln πi − (1/(2σ²))(y − (i − 3)µ)². The crossing point of si and sj is where the decision changes from i to j, but there are subtleties to account for, since some priors can make some regions empty. In general, si(y) = sj(y) results in

yij = (σ²/(µ(i − j))) ln(πj/πi) + ((i + j − 6)/2) µ,
which is simplified to
1 y ∈ (−∞, min y1j ]
j>1
2 y ∈ (max y2j , min y2j ]
j<2 j>2
δ(y, π) = 3 y ∈ (max y3j , min y3j ] .
j<3 j>3
4 y ∈ (max y4j , min y4j ]
j<4 j>4
5 y ∈ (max y5j , ∞)
j<5
By setting d = µ/σ and letting y′ij = (yij − (i − 3)µ)/σ denote the thresholds normalized under Hi (these depend only on d and π), we have

Pr(δ(y, π) = i | Hi) = { Φ(min_{j>1} y′1j(d, π)),  i = 1;
                         max( Φ(min_{j>i} y′ij(d, π)) − Φ(max_{j<i} y′ij(d, π)), 0 ),  i ∈ {2, 3, 4};
                         1 − Φ(max_{j<5} y′5j(d, π)),  i = 5 }.
Hence, the average Bayes risk for the Bayes rule is r(d, π) = Σ_{i=1}^{5} πi (1 − Pr(δ(y, π) = i | Hi)), which is a function of d and the priors π. Now, for the minimax rule we have to find the least favorable prior πl that maximizes r(d, π); as an equalizer rule, it makes R = 1 − Pr(δ(y, π) = i | Hi) the same for all i. The minimax rule is then δ(y, πl), with minimax error r(d, πl), which is a function of d only.
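The risk r(d, π) can be evaluated directly from the formulas above. The sketch below is a rough Python equivalent (Φ is implemented with math.erf; the normalized threshold is expanded as y′ij = ln(πj/πi)/(d(i − j)) + (j − i)d/2, which follows from the yij formula):

```python
import math

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bayes_risk(d, pi):
    """Average error probability r(d, pi) of the Bayes rule for the
    5-ary problem H_i: Y ~ N((i-3)*mu, sigma^2), with d = mu/sigma."""
    def yp(i, j):
        # Normalized crossing point y'_ij = (y_ij - (i-3)*mu)/sigma.
        return math.log(pi[j - 1] / pi[i - 1]) / (d * (i - j)) + (j - i) * d / 2
    risk = 0.0
    for i in range(1, 6):
        if i == 1:
            p = Phi(min(yp(1, j) for j in range(2, 6)))
        elif i == 5:
            p = 1 - Phi(max(yp(5, j) for j in range(1, 5)))
        else:
            p = max(Phi(min(yp(i, j) for j in range(i + 1, 6)))
                    - Phi(max(yp(i, j) for j in range(1, i))), 0.0)
        risk += pi[i - 1] * (1 - p)
    return risk

print(bayes_risk(3.0, [0.2] * 5))   # equal priors
```

For equal priors the boundaries reduce to the midpoints ±d/2 and ±3d/2, so r = (8/5)Q(d/2) ≈ 0.1069 at d = 3.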
As requested, I wrote a MATLAB script to calculate πl for d = 3, shown below:
pil = fmincon(@proberror, [1/5;1/15;1/15;1/15;1/5], ...
    [], [], [1,1,1,1,1], 1, [0;0;0;0;0], [1;1;1;1;1]);
The optimization yields the following result for the least favorable priors πl and the respective errors:
[Figure: least favorable priors and corresponding errors produced by the optimization.]
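As an independent cross-check on the fmincon result, the equalizer for d = 3 can also be found directly. Assuming the least favorable prior is symmetric, the four thresholds reduce to two normalized unknowns: a, with outer threshold t1 = −2µ + aσ, and g = t3/σ (with t4 = −t1, t2 = −t3); the equalizer conditions are Q(a) = 2Q(g) = Q(d − a) + Q(d − g), where Q(x) = 1 − Φ(x). A rough pure-Python sketch:

```python
import math

def Q(x):
    """Standard normal tail probability."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def equalizer_risk(d):
    """Solve Q(a) = 2 Q(g) = Q(d-a) + Q(d-g) by bisection on a.
    a: outer threshold offset t1 = -2*mu + a*sigma; g = t3/sigma."""
    def g_of(a):
        # Inner threshold from Q(g) = Q(a)/2, by bisection on [0, 10].
        lo, hi = 0.0, 10.0
        for _ in range(80):
            mid = (lo + hi) / 2
            if Q(mid) > Q(a) / 2:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    def F(a):
        # Mismatch between the hypothesis-1 and hypothesis-2 errors.
        return Q(a) - Q(d - a) - Q(d - g_of(a))
    lo, hi = 0.0, d   # F(0) > 0 and F(d) < 0 bracket the root
    for _ in range(80):
        mid = (lo + hi) / 2
        if F(mid) > 0:
            lo = mid
        else:
            hi = mid
    a = (lo + hi) / 2
    return Q(a)   # common conditional error = minimax error

print(equalizer_risk(3.0))   # ~ 0.114
```

The resulting common conditional error (≈ 0.114 at d = 3) can be compared with the worst-case error of the equal-priors Bayes rule, 2Q(d/2) ≈ 0.134.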
Assignment 3
Problems:
1. Find the Bayes rule and the minimum Bayes risk for a binary hypothesis testing problem
with equal priors and uniform costs, with
p0(y) = { c0 y²,  |y| ≤ 1;  0, otherwise }

p1(y) = { c1(3 − |y|),  |y| ≤ 3;  0, otherwise }
2. A communication system uses binary signaling by sending one of the following two signals: s1(t) to send one, and s0(t) to send zero, where s1(t) = −As(t), s0(t) = As(t), A > 0, and

s(t) = { 1,  0 < t < 1;  0, otherwise }.

The transmitted signal is corrupted by zero-mean white Gaussian noise n(t) with autocorrelation function Rn(τ) = σ²δ(τ), so that the received signal is given by

y(t) = { −As(t) + n(t),  1 sent;  As(t) + n(t),  0 sent }.
The receiver would like to decide whether 0 or 1 was sent, or (if it is not confident about a
0/1 decision) erase the signal. We would like to obtain a Bayesian decision rule for doing this
as follows.
Let Θ = {0, 1} and A = {0, 1, e}. The observation is Z. Assume the cost structure

C(a, θ) = { 1,  a ≠ θ, a ∈ {0, 1};  0,  a = θ;  c,  a = e }.
(a) Find the Bayes rule for equal priors. Simplify the form of the rule as much as possible
and specify its dependence on the problem parameters c, A, and σ 2 .
(b) For d = A/σ = 5, find c and a corresponding Bayesian decision rule δA so that the
probability of erasure p1 is twice the probability of error p2 . Compare with the values
of p1 and p2 for a Bayesian decision rule δB corresponding to c = 1/2.
3. Find the minimax rule and minimax risk for the binary hypothesis testing problem with
p0(y) = { c0 y²,  |y| ≤ 1;  0, otherwise }

p1(y) = { c1(3 − |y|),  |y| ≤ 3;  0, otherwise }
4. The minimum Bayes risk for a binary hypothesis testing problem with costs C00 = 1, C11 =
2, C10 = 2, C01 = 4 is given by
V(π0) = 2 − π0²
where π0 is the prior probability of hypothesis H0 .
(a) Find the minimax risk and the least favorable prior.
(b) What is the conditional probability of error given H0 for the minimax rule in (a)?
5. A communication system uses multi-amplitude signaling to send one of 5 signals. The decision statistic when i (i = 1, …, 5) is sent is given by

Y = (i − 3)µ + N,   i = 1, …, 5

where N is Gaussian noise with mean zero and variance σ². For uniform costs, specify in its simplest form a Bayes rule which is an equalizer, and hence is minimax. Show that the minimax error probability depends only on d = µ/σ, and give a numerical value for it for d = 3. You may use a computer program if you wish, but you may save time by deriving an approximation based on large d instead. Compare with the worst-case error probability using a Bayes rule for uniform costs and equal priors.
6. Consider a binary hypothesis testing problem in which the ROC for the Neyman-Pearson rule is given by PD = (PF)^(1/4). For the same hypotheses, consider a Bayesian problem with cost structure C10 = 3, C01 = 2, C00 = 2, C11 = 0.
(a) Show that the decision rule δ(y) = 1 is optimal for π0 ≤ 1/3.
(b) Find the minimum Bayes risk for π0 = 1/2.
Optional Problems:
1. The minimum Bayes risk for a binary hypothesis testing problem with costs C00 = 1, C11 =
2, C10 = 2, C01 = 4 is given by
V(π0) = 2 − π0²
where π0 is the prior probability of hypothesis H0 . Find the receiver operating characteristic
of optimal detectors.