

Detection and Estimation

Max. Marks 5

EE5340

I. ASSIGNMENT 2
Problem 1. Let $R$ be the observation for the binary message case, with $p_0 = 1/3$, and
$$H_0 : f(R|H_0) = \tfrac{1}{2}\,[u(R) - u(R-2)]$$
$$H_1 : f(R|H_1) = \tfrac{R}{2}\,[u(R) - u(R-2)],$$
where $u(\cdot)$ is the standard unit step function. Determine the following:
The minimum $P_e$ test.
The N-P test with $P_F = 0.1$.
The min-max test with the minimum-$P_e$ cost assignment.
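For the densities as reconstructed above, the likelihood ratio is $\Lambda(R) = f(R|H_1)/f(R|H_0) = R$ on $[0, 2]$, so the tests asked for reduce to threshold tests on $R$. The following Python sketch evaluates $P_F$ and $P_D$ for a candidate threshold by direct integration; the helper name error_probs and the sample thresholds are illustrative, not part of the problem statement.

from scipy.integrate import quad

def f0(r):          # f(R|H0): uniform density on [0, 2]
    return 0.5

def f1(r):          # f(R|H1): triangular density on [0, 2]
    return r / 2.0

def error_probs(gamma):
    """P_F and P_D for the rule: decide H1 when R > gamma."""
    p_f, _ = quad(f0, gamma, 2.0)   # P(decide H1 | H0)
    p_d, _ = quad(f1, gamma, 2.0)   # P(decide H1 | H1)
    return p_f, p_d

for g in (0.5, 1.0, 1.8):           # illustrative thresholds only
    p_f, p_d = error_probs(g)
    print(f"gamma = {g:.2f}  P_F = {p_f:.3f}  P_D = {p_d:.3f}")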
Problem 2. Consider the binary observation of the following form: $R|H_0 = s_0 + n$ and $R|H_1 = s_1 + n$, where
$n$ is a zero-mean Gaussian random variable with variance $\sigma^2$. Find the N-P test with $P_F = 0.2$.
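For a Gaussian shift-in-mean problem of this form, the N-P test compares $R$ with a threshold chosen so that $P_F = 0.2$. The Python sketch below computes that threshold, assuming $s_1 > s_0$ so the rule is one-sided; the numerical values of $s_0$, $s_1$, $\sigma$ are placeholders, not given in the problem.

from scipy.stats import norm

def np_threshold(s0, sigma, p_f):
    """Threshold gamma with P(R > gamma | H0) = p_f (one-sided rule)."""
    return s0 + sigma * norm.isf(p_f)     # norm.isf is the inverse Q-function

def detection_prob(gamma, s1, sigma):
    """P(R > gamma | H1) for the same rule."""
    return norm.sf((gamma - s1) / sigma)  # norm.sf is the Q-function

s0, s1, sigma = 0.0, 1.0, 1.0             # placeholder values, not from the problem
gamma = np_threshold(s0, sigma, p_f=0.2)
print(f"gamma = {gamma:.3f}, P_D = {detection_prob(gamma, s1, sigma):.3f}")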
Problem 3. Consider the following decision problem:
$$f(R|H_0) = \begin{cases} \tfrac{1}{2}\,|R-1|, & |R| < 1 \\ 0, & \text{otherwise} \end{cases}$$
$$f(R|H_1) = \begin{cases} \tfrac{1}{8}\,|R-2|, & |R| < 2 \\ 0, & \text{otherwise} \end{cases}$$
Let $C_{00} = 1$, $C_{10} = 3$, $C_{01} = 4$, $C_{11} = 2$, and $p_0 = 3/4$. Find the Bayes decision rule. Using the same cost
matrix, find the min-max decision rule. Find the optimum Bayes cost in both cases.
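For a Bayes test with these costs, the likelihood-ratio threshold is $\eta = p_0(C_{10}-C_{00})/[p_1(C_{01}-C_{11})]$. The Python sketch below, assuming the piecewise densities reconstructed above, evaluates this threshold and the corresponding Bayes risk by numerical integration over the decision regions; the grid size is arbitrary.

import numpy as np

# Costs C[decide, true] and priors as given in the problem.
C = {(0, 0): 1.0, (1, 0): 3.0, (0, 1): 4.0, (1, 1): 2.0}
p0, p1 = 0.75, 0.25

def f0(r):  # f(R|H0) = |R - 1|/2 on |R| < 1 (as reconstructed above)
    return np.where(np.abs(r) < 1, np.abs(r - 1) / 2.0, 0.0)

def f1(r):  # f(R|H1) = |R - 2|/8 on |R| < 2 (as reconstructed above)
    return np.where(np.abs(r) < 2, np.abs(r - 2) / 8.0, 0.0)

eta = p0 * (C[(1, 0)] - C[(0, 0)]) / (p1 * (C[(0, 1)] - C[(1, 1)]))

r = np.linspace(-2.0, 2.0, 400_001)   # arbitrary integration grid
dr = r[1] - r[0]
decide_h1 = f1(r) > eta * f0(r)       # Bayes LRT decision region for H1

def prob(mask, f):                    # integral of f over the masked region
    return np.sum(np.where(mask, f(r), 0.0)) * dr

risk = (p0 * (C[(0, 0)] * prob(~decide_h1, f0) + C[(1, 0)] * prob(decide_h1, f0))
        + p1 * (C[(0, 1)] * prob(~decide_h1, f1) + C[(1, 1)] * prob(decide_h1, f1)))
print(f"eta = {eta:.2f}, Bayes risk ~ {risk:.4f}")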
Problem 4. Let $p_0 = 1/2$ and
$$H_0 : f(R|H_0) = \tfrac{1}{2}\,e^{-R}\,u(R) + e^{2R}\,u(-R)$$
$$H_1 : f(R|H_1) = e^{-2R}\,u(R) + \tfrac{1}{2}\,e^{R}\,u(-R).$$

Find the minimum $P_e$ decision rule.
Find the N-P decision rule with $\alpha_0 = 0.2$ ($P_F = 0.2$).
Find the min-max decision rule using the $P_e$ cost assignment.
Find the associated probability of error in all the cases. Assume $p_0 = 1/2$.
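A numerical cross-check for the error-probability parts, assuming the exponential densities exactly as reconstructed above (the sign placements are inferred from normalization) and the standard likelihood-ratio test; pe_for_threshold is an illustrative helper, not a prescribed method.

import numpy as np

def f0(r):  # f(R|H0) as reconstructed above (signs inferred)
    return np.where(r >= 0, 0.5 * np.exp(-r), np.exp(2.0 * r))

def f1(r):  # f(R|H1) as reconstructed above
    return np.where(r >= 0, np.exp(-2.0 * r), 0.5 * np.exp(r))

def pe_for_threshold(eta, p0=0.5, lo=-40.0, hi=40.0, n=400_001):
    """P_e of the LRT 'decide H1 if f1(R) > eta * f0(R)' by numerical integration."""
    r = np.linspace(lo, hi, n)
    dr = r[1] - r[0]
    decide_h1 = f1(r) > eta * f0(r)
    p_f = np.sum(np.where(decide_h1, f0(r), 0.0)) * dr    # P(decide H1 | H0)
    p_m = np.sum(np.where(~decide_h1, f1(r), 0.0)) * dr   # P(decide H0 | H1)
    return p0 * p_f + (1.0 - p0) * p_m

print(f"P_e at eta = 1 (minimum-P_e threshold for p0 = 1/2): {pe_for_threshold(1.0):.4f}")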

Problem 5. Consider the binary decision problem with


$$H_0 : f(R|H_0) = \begin{cases} 1, & 0 \le R \le 1 \\ 0, & \text{otherwise} \end{cases}$$
$$H_1 : f(R|H_1) = \begin{cases} e^{-R}, & 0 \le R \\ 0, & \text{otherwise} \end{cases}$$

Find the ROC for this problem.
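The ROC can be traced by sweeping the likelihood-ratio threshold $\eta$ and recording $(P_F(\eta), P_D(\eta))$. The Python sketch below does this numerically, assuming the uniform/exponential densities above; the grid limits, sample thresholds, and the helper name roc_point are illustrative.

import numpy as np

def f0(r):  # f(R|H0): uniform on [0, 1]
    return np.where((r >= 0) & (r <= 1), 1.0, 0.0)

def f1(r):  # f(R|H1): exp(-R) on [0, inf)
    return np.where(r >= 0, np.exp(-r), 0.0)

def roc_point(eta, lo=0.0, hi=40.0, n=400_001):
    """(P_F, P_D) of the LRT 'decide H1 if f1(R) > eta * f0(R)'."""
    r = np.linspace(lo, hi, n)
    dr = r[1] - r[0]
    decide_h1 = f1(r) > eta * f0(r)
    p_f = np.sum(np.where(decide_h1, f0(r), 0.0)) * dr
    p_d = np.sum(np.where(decide_h1, f1(r), 0.0)) * dr
    return p_f, p_d

for eta in (0.2, np.exp(-1.0), 0.6, 1.0):   # a few illustrative thresholds
    p_f, p_d = roc_point(eta)
    print(f"eta = {eta:.3f}  P_F = {p_f:.3f}  P_D = {p_d:.3f}")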


Problem 6. A simple binary decision problem has a two dimensional signal space and a two dimensional
observation space. The signal vectors are:
 
 
$$S_0 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \qquad S_1 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}.$$
The observation vector is the sum of the signal vector and a noise vector $[n_1\ n_2]^T$, where $n_1$ and
$n_2$ are zero-mean, unit-variance i.i.d. Gaussian random variables. Find the decision rule that minimizes the
probability of error for equi-probable messages, and the resulting $P_e$.
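Since the noise is white and the messages equi-probable, the minimum-$P_e$ rule is the minimum-distance (nearest-signal) rule. The Monte Carlo sketch below estimates the resulting $P_e$ and compares it with the expression $Q(\|S_1 - S_0\|/2)$; the sample size and seed are arbitrary.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
s0, s1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

n_trials = 200_000                        # arbitrary sample size
labels = rng.integers(0, 2, n_trials)     # equi-probable messages
signals = np.where(labels[:, None] == 0, s0, s1)
r = signals + rng.standard_normal((n_trials, 2))   # unit-variance white noise

# Minimum-distance (nearest-signal) decision rule.
decide_1 = np.linalg.norm(r - s1, axis=1) < np.linalg.norm(r - s0, axis=1)
pe_mc = np.mean(decide_1.astype(int) != labels)

d = np.linalg.norm(s1 - s0)               # distance between the signal vectors
print(f"Monte Carlo P_e ~ {pe_mc:.4f}, Q(d/2) = {norm.sf(d / 2.0):.4f}")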
Problem 7. Consider the multiple observation binary decision problem with the signal vectors as

$$S_0 = \begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix}, \qquad S_1 = \begin{bmatrix} -1 \\ \vdots \\ -1 \end{bmatrix}.$$

The observations $R_i$ are the components of the sum of the signal vector and an $l$-dimensional Gaussian vector with zero mean and covariance matrix equal to the $l$-dimensional identity matrix. For the equi-probable case, what is the smallest
value of $l$ such that $P_e \le 0.01$?
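Under the antipodal all-ones reading of the signal vectors above (the signs are inferred), the distance between the signals is $2\sqrt{l}$, so the equi-probable minimum-$P_e$ error with unit-variance white noise is $Q(\sqrt{l})$. The short Python scan below finds the smallest $l$ meeting the target under that assumption.

import numpy as np
from scipy.stats import norm

target = 0.01
for l in range(1, 21):
    d = 2.0 * np.sqrt(l)        # ||S1 - S0|| for antipodal all-ones signals
    pe = norm.sf(d / 2.0)       # Q(d/2): equi-probable minimum-P_e error
    if pe <= target:
        print(f"smallest l with P_e <= {target}: l = {l} (P_e = {pe:.4f})")
        break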
Problem 8. Prove that any real symmetric matrix can always be diagonalized.
Problem 9. Prove that the ROC for a binary detection problem is always convex.
Problem 10. Let $n$ denote a zero-mean Gaussian random vector with the covariance matrix
$$V = \begin{bmatrix} 2 & 2 \\ 2 & 5 \end{bmatrix}.$$
Find a suitable transformation of $n$, denoted by $n'$, such that $n'$ has a covariance matrix equal to the identity
and mean equal to $[1\ 1]^T$.
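One standard construction for such a transformation is to whiten with a Cholesky (or eigen-) factor of $V$ and then add the desired mean. The Python sketch below assumes $V$ as reconstructed above; the Monte Carlo check and its sample size are purely illustrative.

import numpy as np

V = np.array([[2.0, 2.0], [2.0, 5.0]])    # covariance as reconstructed above
L = np.linalg.cholesky(V)                 # V = L @ L.T
A = np.linalg.inv(L)                      # whitening matrix: cov(A n) = A V A^T = I
b = np.array([1.0, 1.0])                  # desired mean of the transformed vector

print("A V A^T =\n", A @ V @ A.T)         # should print the 2x2 identity

# Monte Carlo sanity check of the transformed vector n' = A n + b.
rng = np.random.default_rng(0)
n = rng.multivariate_normal(mean=[0.0, 0.0], cov=V, size=100_000)
n_prime = n @ A.T + b
print("sample mean ~", n_prime.mean(axis=0))
print("sample cov  ~\n", np.cov(n_prime, rowvar=False))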
