
Detection Theory

Example
1- Radar
2- Communications
3- Speech
4- Sonar
5- Control
6- . . .
Definition
Assume a set of data \{x[0], x[1], \ldots, x[N-1]\} is available. To arrive at a decision, we first form a function of the data, T(x[0], x[1], \ldots, x[N-1]), and then make a decision based on its value. Determining the function T and its mapping to a decision is the central problem addressed in detection theory.
Introduction
Example: BPSK phase detection
Detection Problem
The simplest detection problem is to determine whether a signal is present or not. Note that such a signal is always embedded in noise. This type of detection problem is called a binary hypothesis testing problem. Denoting the received data at time n by x[n], the signal by s[n], and the noise by w[n], the binary hypothesis testing problem is defined as follows:

H_0 : x[n] = w[n]
H_1 : x[n] = s[n] + w[n]
Note that if the number of hypotheses is more than two, then the problem becomes
a multiple hypothesis testing problem. One example is detection of different digits
in speech processing.
Example
Detection of a DC level of amplitude A = 1 embedded in white Gaussian noise w[n] with variance \sigma^2, using only one sample:

H_0 : x[0] = w[0]
H_1 : x[0] = 1 + w[0]
One possible detection rule:

decide H_0 if x[0] < 1/2
decide H_1 if x[0] > 1/2

For the case where x[0] = 1/2, we might arbitrarily choose one of the two possibilities. However, the probability of such a case is zero!
Example continued
The probability density function of x[0] under each hypothesis is as follows:

p(x[0]; H_0) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{1}{2\sigma^2} x^2[0]\right)

p(x[0]; H_1) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{1}{2\sigma^2} (x[0] - 1)^2\right)

In deciding between H_0 and H_1, we are essentially asking whether x[0] has been generated according to the pdf p(x[0]; H_0) or the pdf p(x[0]; H_1).
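As a quick numerical check, the small Python sketch below (my own illustration, not part of the original notes) compares the two pdfs at an observed value of x[0]; with A = 1, picking the hypothesis with the larger pdf is equivalent to the threshold test x[0] > 1/2:

```python
import math

def gauss_pdf(x, mean, var):
    # Gaussian pdf with the given mean and variance
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def decide(x0, var=1.0):
    # Decide H1 (return 1) when p(x[0]; H1) > p(x[0]; H0);
    # for a DC level of amplitude 1 this reduces to the test x[0] > 1/2
    return 1 if gauss_pdf(x0, 1.0, var) > gauss_pdf(x0, 0.0, var) else 0

print(decide(0.7), decide(0.2))  # → 1 0
```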
Important pdfs
Gaussian
Gaussian pdf:

p(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{1}{2\sigma^2} (x - \mu)^2\right), \quad -\infty < x < +\infty

where \mu is the mean and \sigma^2 is the variance of x.
Standard normal pdf: \mu = 0 and \sigma^2 = 1.
The cumulative distribution function (cdf) of a standard normal pdf:

\Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{1}{2} t^2\right) dt

A more convenient description is the right-tail probability, defined as Q(x) = 1 - \Phi(x). This function, called the Q-function, is used frequently in detection problems where the signal and noise are normally distributed.
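In code, the Q-function is easily evaluated through the complementary error function, since Q(x) = ½ erfc(x/√2). A minimal Python sketch (my addition, stdlib only):

```python
import math

def Q(x):
    # Right-tail probability of the standard normal pdf:
    # Q(x) = 1 - Phi(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print(Q(0.0))  # → 0.5
```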
Chi-Squared (Central)
A chi-squared pdf arises as the pdf of x = \sum_{i=1}^{v} x_i^2, where the x_i are independent standard normally distributed random variables. The chi-squared pdf with v degrees of freedom is defined as

p(x) = \begin{cases} \dfrac{1}{2^{v/2} \Gamma(v/2)} x^{v/2 - 1} \exp\left(-\dfrac{1}{2} x\right), & x > 0 \\ 0, & x < 0 \end{cases}

and is denoted by \chi^2_v. Here v is assumed to be an integer with v \ge 1. The function \Gamma(u) is the Gamma function, defined as

\Gamma(u) = \int_{0}^{\infty} t^{u-1} \exp(-t) \, dt
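The Python sketch below (my addition, stdlib only) evaluates this pdf with math.gamma and sanity-checks by Monte Carlo that a sum of v squared standard normals has mean v, as a chi-squared variable with v degrees of freedom should:

```python
import math
import random

def chi2_pdf(x, v):
    # Central chi-squared pdf with v degrees of freedom, valid for x > 0
    return x ** (v / 2 - 1) * math.exp(-x / 2) / (2 ** (v / 2) * math.gamma(v / 2))

# Monte Carlo check: x = sum of v squared N(0, 1) variables has mean v
random.seed(0)
v = 4
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(v)) for _ in range(20000)]
print(sum(samples) / len(samples))  # close to v = 4
```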
Chi-Squared (Noncentral)
If x = \sum_{i=1}^{v} x_i^2, where the x_i are independent Gaussian random variables with means \mu_i and variance \sigma^2 = 1, then x has a noncentral chi-squared pdf with v degrees of freedom and noncentrality parameter \lambda = \sum_{i=1}^{v} \mu_i^2. The pdf then becomes

p(x) = \begin{cases} \dfrac{1}{2} \left(\dfrac{x}{\lambda}\right)^{(v-2)/4} \exp\left(-\dfrac{1}{2}(x + \lambda)\right) I_{v/2 - 1}(\sqrt{\lambda x}), & x > 0 \\ 0, & x < 0 \end{cases}

where I_{v/2-1}(\cdot) is the modified Bessel function of the first kind.
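A quick Monte Carlo sanity check in Python (my own illustration; the specific means are arbitrary) uses the known moment result that such an x has mean v + λ:

```python
import random

# With x_i ~ N(mu_i, 1) independent, x = sum_i x_i^2 has mean v + lambda,
# where lambda = sum_i mu_i^2 is the noncentrality parameter
random.seed(1)
mus = [1.0, 2.0, 0.5]                 # arbitrary example means
v, lam = len(mus), sum(m * m for m in mus)
samples = [sum(random.gauss(m, 1) ** 2 for m in mus) for _ in range(20000)]
print(sum(samples) / len(samples))    # close to v + lam = 8.25
```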
Neyman-Pearson Theorem
Detection performance measures
Detection performance of a system is measured mainly by two quantities:
1. Probability of false alarm: P_{FA} = p(H_1; H_0)
2. Probability of detection: P_D = p(H_1; H_1)
Note that sometimes, instead of the probability of detection, the probability of missed detection, P_M = 1 - P_D, is used.
Problem statement
Assume a data set x = [x[0], x[1], \ldots, x[N-1]]^T is available. The detection problem is defined as follows:

decide H_0 if T(x) < \gamma
decide H_1 if T(x) > \gamma

where T is the decision function and \gamma is the detection threshold. Our goal is to design T so as to maximize P_D subject to P_{FA} \le \alpha.
Neyman-Pearson Theorem
To maximize P_D for a given P_{FA} = \alpha, decide H_1 if

L(x) = \frac{p(x; H_1)}{p(x; H_0)} > \gamma

where the threshold \gamma is found from

P_{FA} = \int_{\{x : L(x) > \gamma\}} p(x; H_0) \, dx = \alpha

The function L(x) is called the likelihood ratio, and the entire test is called the likelihood ratio test (LRT).
Example: DC level in WGN
Consider the following signal detection problem:

H_0 : x[n] = w[n], \quad n = 0, 1, \ldots, N-1
H_1 : x[n] = s[n] + w[n], \quad n = 0, 1, \ldots, N-1

where the signal is s[n] = A for A > 0 and w[n] is AWGN with variance \sigma^2. The NP detector decides H_1 if

\frac{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left(-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n] - A)^2\right)}{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left(-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} x^2[n]\right)} > \gamma

Taking the logarithm of both sides and simplifying results in

\frac{A}{\sigma^2} \sum_{n=0}^{N-1} x[n] > \ln\gamma + \frac{NA^2}{2\sigma^2}

Since A > 0, we finally have

\frac{1}{N} \sum_{n=0}^{N-1} x[n] > \frac{\sigma^2}{NA} \ln\gamma + \frac{A}{2} = \gamma'
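The resulting sample-mean detector is easy to implement. The sketch below (my addition, using Python's statistics.NormalDist for the inverse Q-function) sets γ' from a desired false-alarm probability via the relation P_{FA} = Q(γ'/√(σ²/N)); the parameter values are hypothetical:

```python
import random
from statistics import NormalDist

def np_threshold(N, var, pfa):
    # gamma' chosen so that P_FA = Q(gamma' / sqrt(var / N)) equals pfa;
    # Q^{-1}(p) = Phi^{-1}(1 - p) via the standard normal inverse cdf
    return NormalDist().inv_cdf(1 - pfa) * (var / N) ** 0.5

def np_detect(x, var, pfa):
    # Decide H1 (True) when the sample mean exceeds gamma'
    return sum(x) / len(x) > np_threshold(len(x), var, pfa)

# Example with hypothetical parameters N = 25, A = 1, sigma^2 = 1
random.seed(0)
N, A, var = 25, 1.0, 1.0
h1_data = [A + random.gauss(0, var ** 0.5) for _ in range(N)]
print(np_detect(h1_data, var, pfa=0.01))
```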
Example continued
The NP detector compares the sample mean \bar{x} = \frac{1}{N} \sum_{n=0}^{N-1} x[n] to a threshold \gamma'. To determine the detection performance, we first note that the test statistic T(x) = \bar{x} is Gaussian under each hypothesis, and its distribution is as follows:

T(x) \sim \begin{cases} \mathcal{N}(0, \sigma^2/N) & \text{under } H_0 \\ \mathcal{N}(A, \sigma^2/N) & \text{under } H_1 \end{cases}

We then have

P_{FA} = \Pr(T(x) > \gamma'; H_0) = Q\left(\frac{\gamma'}{\sqrt{\sigma^2/N}}\right)

and

P_D = \Pr(T(x) > \gamma'; H_1) = Q\left(\frac{\gamma' - A}{\sqrt{\sigma^2/N}}\right)

P_D and P_{FA} are related to each other according to the following equation:

P_D = Q\left(Q^{-1}(P_{FA}) - \sqrt{\frac{NA^2}{\sigma^2}}\right)
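This relationship is straightforward to evaluate numerically. The Python sketch below (my illustration, using statistics.NormalDist for Φ and its inverse; parameter values hypothetical) computes P_D for a given P_FA, and sweeping pfa over (0, 1) traces out the detector's operating curve:

```python
from statistics import NormalDist

_nd = NormalDist()

def pd_from_pfa(pfa, N, A, var):
    # P_D = Q(Q^{-1}(P_FA) - sqrt(N A^2 / sigma^2)), with
    # Q(x) = 1 - Phi(x) and Q^{-1}(p) = Phi^{-1}(1 - p)
    d = (N * A * A / var) ** 0.5   # deflection coefficient
    return 1 - _nd.cdf(_nd.inv_cdf(1 - pfa) - d)

# Example with hypothetical parameters N = 25, A = 1, sigma^2 = 1
print(pd_from_pfa(0.1, 25, 1.0, 1.0))
```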
Receiver Operating Characteristics
An alternative way of summarizing the detection performance of an NP detector is to plot P_D versus P_{FA}. This plot is called the Receiver Operating Characteristic (ROC). For the earlier DC level detection example, the ROC is shown in the accompanying figure; note that there NA^2/\sigma^2 = 1.
Minimum Probability of Error
Assume the prior probabilities of H_0 and H_1 are known and denoted by P(H_0) and P(H_1), respectively. The probability of error, P_e, is then defined as

P_e = P(H_1) P(H_0|H_1) + P(H_0) P(H_1|H_0) = P(H_1) P_M + P(H_0) P_{FA}

Our goal is to design a detector that minimizes P_e. It can be shown that the following detector is optimal in this case:

\frac{p(x|H_1)}{p(x|H_0)} > \frac{P(H_0)}{P(H_1)} = \gamma

In the case P(H_0) = P(H_1), the detector is called the maximum likelihood detector.
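For the single-sample DC level example from earlier (A = 1, σ² = 1; my own parameter choice for illustration), the minimum-P_e rule can be sketched in Python as:

```python
import math

def gauss_pdf(x, mean, var=1.0):
    # Gaussian pdf with the given mean and variance
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def min_pe_decide(x0, p0, p1):
    # Decide H1 (return 1) when the likelihood ratio p(x|H1)/p(x|H0)
    # exceeds the prior ratio P(H0)/P(H1)
    return 1 if gauss_pdf(x0, 1.0) / gauss_pdf(x0, 0.0) > p0 / p1 else 0

# Equal priors reduce to the ML rule, i.e. a threshold at x[0] = 1/2
print(min_pe_decide(0.6, 0.5, 0.5), min_pe_decide(0.4, 0.5, 0.5))  # → 1 0
```

A larger prior P(H_0) raises the threshold, demanding stronger evidence before declaring H_1.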