12.6 Sequential LMMSE Estimation


Same kind of setting as for Sequential LS
Fixed number of parameters (but here they are modeled as random)
Increasing number of data samples
Data Model:

$$\mathbf{x}[n] = \mathbf{H}[n]\,\boldsymbol{\theta} + \mathbf{w}[n]$$

where:
$\mathbf{x}[n] = [x[0]\ \ldots\ x[n]]^T$ is $(n+1)\times 1$: unknown PDF, known mean & covariance
$\boldsymbol{\theta}$ is $p\times 1$: unknown PDF, known mean & covariance
$\mathbf{w}[n] = [w[0]\ \ldots\ w[n]]^T$ is $(n+1)\times 1$: unknown PDF, known mean & covariance; $\mathbf{C}_w$ must be diagonal with elements $\sigma_n^2$, and $\boldsymbol{\theta}$ & $\mathbf{w}$ are uncorrelated
$\mathbf{H}[n] = \begin{bmatrix}\mathbf{H}[n-1]\\ \mathbf{h}^T[n]\end{bmatrix}$ is $(n+1)\times p$: known
Goal: Given an estimate $\hat{\boldsymbol{\theta}}[n-1]$ based on $\mathbf{x}[n-1]$, when the new data sample $x[n]$ arrives, update the estimate to $\hat{\boldsymbol{\theta}}[n]$.
Development of the Sequential LMMSE Estimate

Our approach here: use vector space ideas to derive the solution for the DC Level in White Noise, then write down the general solution.
Data model: $x[n] = A + w[n]$

For convenience, assume both $A$ and $w[n]$ have zero mean.

Given $x[0]$ we can find the LMMSE estimate:

$$\hat{A}_0 = \frac{E\{A\,x[0]\}}{E\{x^2[0]\}}\,x[0]
= \frac{E\{A(A+w[0])\}}{E\{(A+w[0])^2\}}\,x[0]
= \frac{\sigma_A^2}{\sigma_A^2+\sigma^2}\,x[0]$$

Now we seek to sequentially update this estimate with the info from $x[1]$.
From the Vector Space View:

(Figure: vector-space diagram showing $A$, the data vectors $x[0]$ and $x[1]$, and the projection $\hat{A}_0$ of $A$ onto $x[0]$.)

First project $x[1]$ onto $x[0]$ to get $\hat{x}[1|0]$: an estimate of the new data given the old data. Prediction! (Notation: the estimate at time 1 based on data up to time 0.)

Use the Orthogonality Principle: $\tilde{x}[1] = x[1] - \hat{x}[1|0]$ is $\perp$ to $x[0]$.

This is the new, non-redundant info provided by data $x[1]$. It is called the innovation.

(Figure: the same diagram with the innovation $\tilde{x}[1]$ drawn orthogonal to $x[0]$.)
Find the Estimation Update by Projecting A onto the Innovation

$$\Delta\hat{A}_1 = \frac{\langle A,\tilde{x}[1]\rangle}{\langle\tilde{x}[1],\tilde{x}[1]\rangle}\,\tilde{x}[1]
= \frac{E\{A\,\tilde{x}[1]\}}{E\{\tilde{x}^2[1]\}}\,\tilde{x}[1]
= k_1\,\tilde{x}[1]$$

Gain: $k_1$

Recall the property: two estimates from $\perp$ data just add:

$$\hat{A}_1 = \hat{A}_0 + k_1\,\tilde{x}[1]$$

$$\hat{A}_1 = \underbrace{\hat{A}_0}_{\text{old estimate}} + \underbrace{k_1}_{\text{gain}}\,\big(\underbrace{x[1]}_{\text{new data}} - \underbrace{\hat{x}[1|0]}_{\text{predicted new data}}\big)$$

The innovation is $\perp$ to the old data.

(Figure: vector-space diagram showing $\hat{A}_1$ as the sum of $\hat{A}_0$ and the projection of $A$ onto $\tilde{x}[1]$.)
The Innovations Sequence

The innovations sequence is:
key to the derivation & implementation of sequential LMMSE
a sequence of orthogonal (i.e., uncorrelated) RVs
broadly significant in Signal Processing and Controls

$$\{\tilde{x}[0],\ \tilde{x}[1],\ \tilde{x}[2],\ \ldots\}
= \{x[0],\ \ x[1]-\hat{x}[1|0],\ \ x[2]-\hat{x}[2|1],\ \ldots\}$$

(Here $\hat{x}[2|1]$ means: based on ALL data up to n = 1, inclusive.)
General Sequential LMMSE Estimation

Initialization (no data yet! use prior information):

$$\hat{\boldsymbol{\theta}}_{-1} = E\{\boldsymbol{\theta}\} \quad\text{(estimate)} \qquad
\mathbf{M}_{-1} = \mathbf{C}_{\boldsymbol{\theta}} \quad\text{(MMSE matrix)}$$

Update loop, for n = 0, 1, 2, ...:

Gain vector calculation:

$$\mathbf{k}_n = \frac{\mathbf{M}_{n-1}\mathbf{h}_n}{\sigma_n^2 + \mathbf{h}_n^T\mathbf{M}_{n-1}\mathbf{h}_n}$$

Estimate update:

$$\hat{\boldsymbol{\theta}}_n = \hat{\boldsymbol{\theta}}_{n-1} + \mathbf{k}_n\left(x[n] - \mathbf{h}_n^T\hat{\boldsymbol{\theta}}_{n-1}\right)$$

MMSE matrix update:

$$\mathbf{M}_n = \left(\mathbf{I} - \mathbf{k}_n\mathbf{h}_n^T\right)\mathbf{M}_{n-1}$$
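To make the recursion concrete, here is a minimal NumPy sketch (the function name and the DC-level demo at the end are illustrative choices, not from the text):

```python
import numpy as np

def sequential_lmmse(x, H, sigma2, theta_prior, C_theta):
    """Sequential LMMSE: gain, estimate, and MMSE-matrix update per sample.

    x           : (N,)   data samples x[0..N-1]
    H           : (N, p) observation rows; H[n] is h_n^T
    sigma2      : (N,)   noise variances sigma_n^2
    theta_prior : (p,)   prior mean E{theta}  -> initial estimate
    C_theta     : (p, p) prior covariance     -> initial MMSE matrix
    """
    theta = np.array(theta_prior, dtype=float)      # theta_hat_{-1} = E{theta}
    M = np.array(C_theta, dtype=float)              # M_{-1} = C_theta
    for n, xn in enumerate(x):
        h = H[n]
        Mh = M @ h
        k = Mh / (sigma2[n] + h @ Mh)               # gain vector k_n (no matrix inverse!)
        theta = theta + k * (xn - h @ theta)        # update with the innovation
        M = M - np.outer(k, h) @ M                  # M_n = (I - k_n h_n^T) M_{n-1}
    return theta, M

# Demo: DC level A in white noise (p = 1, h_n = 1 for all n)
rng = np.random.default_rng(0)
N, var_A, var_w = 50, 4.0, 1.0
A = rng.normal(0.0, np.sqrt(var_A))
x = A + rng.normal(0.0, np.sqrt(var_w), N)
A_hat, M = sequential_lmmse(x, np.ones((N, 1)), np.full(N, var_w),
                            np.zeros(1), np.array([[var_A]]))
print(A, A_hat[0], M[0, 0])    # true value, estimate, final MMSE
```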
Sequential LMMSE Block Diagram

Data model: $\mathbf{x}[n] = \mathbf{H}[n]\boldsymbol{\theta} + \mathbf{w}[n]$, with
$\mathbf{H}[n] = \begin{bmatrix}\mathbf{H}[n-1]\\ \mathbf{h}^T[n]\end{bmatrix}$.

(Block diagram: the observation $x[n]$ enters a summing node where the predicted observation $\hat{x}[n|n-1] = \mathbf{h}_n^T\hat{\boldsymbol{\theta}}_{n-1}$ is subtracted, producing the innovation $\tilde{x}[n]$. A "Compute Gain" block forms $\mathbf{k}_n$ from $\sigma_n^2$, $\mathbf{M}_{n-1}$, and $\mathbf{h}_n$. The gained innovation is added to the previous estimate to give the updated estimate $\hat{\boldsymbol{\theta}}_n = \hat{\boldsymbol{\theta}}_{n-1} + \mathbf{k}_n(x[n] - \mathbf{h}_n^T\hat{\boldsymbol{\theta}}_{n-1})$, which is fed back through a delay $z^{-1}$ to form the next prediction.)

Exact same structure as for Sequential Linear LS!!
Comments on Sequential LMMSE Estimation

1. Same structure as for sequential linear LS, BUT they solve the estimation problem under very different assumptions.

2. No matrix inversion required, so it is computationally efficient.

3. The gain vector $\mathbf{k}_n$ weighs confidence in the new data ($\sigma_n^2$) against all previous data ($\mathbf{M}_{n-1}$):
when the previous data is better, the gain is small: don't use the new data much
when the new data is better, the gain is large: the new data is heavily used

4. If you know the noise statistics $\sigma_n^2$ and the observation rows $\mathbf{h}_n^T$ over the desired range of n, you can run the MMSE matrix recursion without any data measurements!!! This provides a Predictive Performance Analysis, sketched below.
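A brief sketch of that idea (the helper name is my own): run only the gain and MMSE-matrix updates, with no measurements, to predict estimator performance ahead of time.

```python
import numpy as np

def predicted_mmse(H, sigma2, C_theta):
    """Run only the MMSE-matrix recursion -- no data needed."""
    M = np.array(C_theta, dtype=float)        # M_{-1} = C_theta
    history = []
    for n in range(len(sigma2)):
        h = H[n]
        Mh = M @ h
        k = Mh / (sigma2[n] + h @ Mh)         # k_n
        M = M - np.outer(k, h) @ M            # M_n = (I - k_n h_n^T) M_{n-1}
        history.append(np.diag(M).copy())     # predicted MMSE of each parameter
    return np.array(history)

# DC level in white noise: the predicted MMSE shrinks as samples accumulate
mmse = predicted_mmse(np.ones((50, 1)), np.full(50, 1.0), np.array([[4.0]]))
print(mmse[0, 0], mmse[-1, 0])
```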
12.7 Examples: Wiener Filtering

During WWII, Norbert Wiener developed the mathematical ideas that led to the Wiener filter while he was working on ways to improve anti-aircraft guns.

He posed the problem in continuous-time form and sought the best linear filter that would reduce the effect of noise in the observed aircraft trajectory.

He modeled the aircraft motion as a wide-sense stationary random process and used the MMSE as the criterion for optimality. The solutions were not simple, and there were many different ways of interpreting and casting the results. The results were difficult for engineers of the time to understand.

Others (Kolmogorov, Hopf, Levinson, etc.) developed these ideas for the discrete-time case and various special cases.
10
Weiner Filter: Model and Problem Statement
Signal Model: x[n] = s[n] + w[n]
Observed: Noisy Signal
Model as WSS, Zero-Mean
C
xx
= R
xx
covariance matrix
correlation
matrix
} {
T
E xx R
xx
=
} }) { })( { {(
T
E E E x x x x C
xx
=
Desired Signal
Model as WSS, Zero-Mean
C
ss
= R
ss
Noise
Model as WSS, Zero-Mean
C
ww
= R
ww
Same if zero-mean
Problem Statement: Process x[n] using a linear filter to provide
a de-noised version of the signal that has minimum MSE
relative to the desired signal
LMMSE Problem!
Filtering, Smoothing, Prediction

Terminology for three different ways to cast the Wiener filter problem:

Filtering: Given $x[0], x[1], \ldots, x[n]$; Find $\hat{s}[n]$
Smoothing: Given $x[0], x[1], \ldots, x[N-1]$; Find $\hat{s}[0], \hat{s}[1], \ldots, \hat{s}[N-1]$
Prediction: Given $x[0], x[1], \ldots, x[N-1]$; Find $\hat{x}[N-1+\ell]$, $\ell > 0$ (note: $x$, not $s$!)

(Figure: timelines illustrating each case: filtering estimates $\hat{s}[n]$ at the current time from all data so far; smoothing estimates $\hat{s}[0], \ldots, \hat{s}[3]$ from the whole data block; prediction estimates $\hat{x}[5]$ from the samples up to n = 4.)

All three are solved using the general LMMSE estimator $\hat{\boldsymbol{\theta}} = \mathbf{C}_{\theta x}\mathbf{C}_{xx}^{-1}\mathbf{x}$.
Filtering, Smoothing, and Prediction Solutions

All three come from $\hat{\boldsymbol{\theta}} = \mathbf{C}_{\theta x}\mathbf{C}_{xx}^{-1}\mathbf{x}$.

Filtering: $\theta = s[n]$ (scalar)

$$\mathbf{C}_{\theta x} = E\{s[n]\,\mathbf{x}^T\} = E\{s[n](\mathbf{s}+\mathbf{w})^T\}
= [\,r_{ss}[n]\ \cdots\ r_{ss}[0]\,] = \tilde{\mathbf{r}}_{ss}^T \quad\text{(a vector!)}$$

$$\mathbf{C}_{xx} = E\{(\mathbf{s}+\mathbf{w})(\mathbf{s}+\mathbf{w})^T\} = \mathbf{R}_{ss} + \mathbf{R}_{ww}$$

$$\hat{s}[n] = \underbrace{\tilde{\mathbf{r}}_{ss}^T}_{1\times(n+1)}\underbrace{(\mathbf{R}_{ss}+\mathbf{R}_{ww})^{-1}}_{(n+1)\times(n+1)}\underbrace{\mathbf{x}}_{(n+1)\times 1}$$

Smoothing: $\boldsymbol{\theta} = \mathbf{s}$ (vector)

$$\mathbf{C}_{\theta x} = E\{\mathbf{s}\,\mathbf{x}^T\} = E\{\mathbf{s}(\mathbf{s}+\mathbf{w})^T\} = \mathbf{R}_{ss} \quad\text{(a matrix!)}$$

$$\mathbf{C}_{xx} = \mathbf{R}_{ss}+\mathbf{R}_{ww}$$

$$\hat{\mathbf{s}} = \underbrace{\mathbf{R}_{ss}}_{N\times N}\underbrace{(\mathbf{R}_{ss}+\mathbf{R}_{ww})^{-1}}_{N\times N}\underbrace{\mathbf{x}}_{N\times 1}$$

Prediction: $\theta = x[N-1+\ell]$ (scalar; note: $x$, not $s$!)

$$\mathbf{C}_{\theta x} = E\{x[N-1+\ell]\,\mathbf{x}^T\}
= [\,r_{xx}[N-1+\ell]\ \cdots\ r_{xx}[\ell]\,] = \tilde{\mathbf{r}}_{xx}^T \quad\text{(a vector!)}$$

$$\mathbf{C}_{xx} = E\{\mathbf{x}\mathbf{x}^T\} = \mathbf{R}_{xx}$$

$$\hat{x}[N-1+\ell] = \underbrace{\tilde{\mathbf{r}}_{xx}^T}_{1\times N}\underbrace{\mathbf{R}_{xx}^{-1}}_{N\times N}\underbrace{\mathbf{x}}_{N\times 1}$$
Comments on Filtering: FIR Wiener

$$\hat{s}[n] = \underbrace{\tilde{\mathbf{r}}_{ss}^T(\mathbf{R}_{ss}+\mathbf{R}_{ww})^{-1}}_{\mathbf{a}^T}\,\mathbf{x} = \mathbf{a}^T\mathbf{x}$$

Define the filter taps as the flipped weights:

$$\mathbf{h}^{(n)} = [\,h^{(n)}[0]\ h^{(n)}[1]\ \ldots\ h^{(n)}[n]\,]^T = [\,a_n\ a_{n-1}\ \ldots\ a_0\,]^T$$

$$\hat{s}[n] = \sum_{k=0}^{n} h^{(n)}[k]\,x[n-k]$$

The Wiener filter as a time-varying FIR filter: causal! Length grows!

Wiener-Hopf Filtering Equations:

$$\underbrace{(\mathbf{R}_{ss}+\mathbf{R}_{ww})}_{\mathbf{R}_{xx}}\,\mathbf{h}^{(n)} = \mathbf{r}_{ss},
\qquad \mathbf{r}_{ss} = [\,r_{ss}[0]\ r_{ss}[1]\ \ldots\ r_{ss}[n]\,]^T$$

Written out ($\mathbf{R}_{xx}$ is Toeplitz & symmetric):

$$\begin{bmatrix}
r_{xx}[0] & r_{xx}[1] & \cdots & r_{xx}[n]\\
r_{xx}[1] & r_{xx}[0] & \cdots & r_{xx}[n-1]\\
\vdots & \vdots & \ddots & \vdots\\
r_{xx}[n] & r_{xx}[n-1] & \cdots & r_{xx}[0]
\end{bmatrix}
\begin{bmatrix} h^{(n)}[0]\\ h^{(n)}[1]\\ \vdots\\ h^{(n)}[n]\end{bmatrix}
=
\begin{bmatrix} r_{ss}[0]\\ r_{ss}[1]\\ \vdots\\ r_{ss}[n]\end{bmatrix}$$

In principle: solve the W-H equations for the filter h at each n.
In practice: use the Levinson recursion to solve recursively, as in the sketch below.
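A small numerical sketch of solving the Wiener-Hopf filtering equations (the ACF values below are invented for illustration); SciPy's `solve_toeplitz` exploits the Toeplitz structure in the same Levinson-style way:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def fir_wiener(r_ss, r_ww, n):
    """Solve (R_ss + R_ww) h = r_ss for the length-(n+1) Wiener filter.

    r_ss, r_ww : ACF lags 0..n of signal and noise (length n+1 arrays)
    """
    r_xx = np.asarray(r_ss[: n + 1]) + np.asarray(r_ww[: n + 1])   # 1st column of Toeplitz R_xx
    return solve_toeplitz(r_xx, np.asarray(r_ss[: n + 1]))          # O(n^2) Levinson-style solve

# Illustrative ACFs: low-pass signal r_ss[k] = 0.9^k, white noise of variance 0.5
n = 9
r_ss = 0.9 ** np.arange(n + 1)
r_ww = np.r_[0.5, np.zeros(n)]
h = fir_wiener(r_ss, r_ww, n)
print(h)    # filter taps h^(n)[0..n]
```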
Comments on Filtering: IIR Wiener

Can show: as n → ∞ the Wiener filter becomes time-invariant. Thus $h^{(n)}[k] \to h[k]$. Then the Wiener-Hopf equations become:

$$\sum_{k=0}^{\infty} h[k]\,r_{xx}[l-k] = r_{ss}[l], \qquad l = 0, 1, \ldots$$

and these are solved using so-called Spectral Factorization. And the Wiener filter becomes IIR, time-invariant:

$$\hat{s}[n] = \sum_{k=0}^{\infty} h[k]\,x[n-k]$$
Revisit the FIR Wiener: Fixed Length L

$$\hat{s}[n] = \sum_{k=0}^{L-1} h[k]\,x[n-k]$$

The way the Wiener filter was formulated above, the length of the filter grew so that the current estimate was based on all the past data. Reformulate so that the current estimate is based on only the L most recent data; e.g., with L = 4: $\hat{s}[6]$ from x[3] x[4] x[5] x[6], $\hat{s}[7]$ from x[4] x[5] x[6] x[7], $\hat{s}[8]$ from x[5] x[6] x[7] x[8], and so on.

Wiener-Hopf Filtering Equations for a WSS process with fixed FIR length:

$$\underbrace{(\mathbf{R}_{ss}+\mathbf{R}_{ww})}_{\mathbf{R}_{xx}}\,\mathbf{h} = \mathbf{r}_{ss},
\qquad \mathbf{r}_{ss} = [\,r_{ss}[0]\ r_{ss}[1]\ \ldots\ r_{ss}[L-1]\,]^T$$

$$\begin{bmatrix}
r_{xx}[0] & r_{xx}[1] & \cdots & r_{xx}[L-1]\\
r_{xx}[1] & r_{xx}[0] & \cdots & r_{xx}[L-2]\\
\vdots & \vdots & \ddots & \vdots\\
r_{xx}[L-1] & r_{xx}[L-2] & \cdots & r_{xx}[0]
\end{bmatrix}
\begin{bmatrix} h[0]\\ h[1]\\ \vdots\\ h[L-1]\end{bmatrix}
=
\begin{bmatrix} r_{ss}[0]\\ r_{ss}[1]\\ \vdots\\ r_{ss}[L-1]\end{bmatrix}
\qquad\text{(Toeplitz \& symmetric)}$$

Solve the W-H filtering equations ONCE for the filter h; a sketch follows.
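Since h is now fixed, it can be designed once and run as an ordinary FIR filter. A brief sketch on synthetic data, reusing the hypothetical `fir_wiener` helper from the previous example:

```python
import numpy as np
from scipy.signal import lfilter

L = 8
h = fir_wiener(0.9 ** np.arange(L), np.r_[0.5, np.zeros(L - 1)], L - 1)

rng = np.random.default_rng(1)
x = rng.standard_normal(200)       # stand-in for the observed noisy signal
s_hat = lfilter(h, [1.0], x)       # s_hat[n] = sum_{k=0}^{L-1} h[k] x[n-k], causal FIR
```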
Comments on Smoothing: FIR Smoother

$$\hat{\mathbf{s}} = \underbrace{\mathbf{R}_{ss}(\mathbf{R}_{ss}+\mathbf{R}_{ww})^{-1}}_{\mathbf{W}}\,\mathbf{x} = \mathbf{W}\mathbf{x}$$

Each row of $\mathbf{W}$ is like a FIR filter: time-varying, non-causal!, block-based.

To interpret this, consider the N = 1 case:

$$\hat{s}[0] = \frac{r_{ss}[0]}{r_{ss}[0]+r_{ww}[0]}\,x[0]
= \underbrace{\frac{\mathrm{SNR}}{\mathrm{SNR}+1}}_{\approx 1\ \text{high SNR};\ \approx 0\ \text{low SNR}}\,x[0]$$
Comments on Smoothing: IIR Smoother

Estimate $s[n]$ based on $\{\ldots, x[-1], x[0], x[1], \ldots\}$:

$$\hat{s}[n] = \sum_{k=-\infty}^{\infty} h[k]\,x[n-k] \qquad\text{(time-invariant \& non-causal IIR filter)}$$

The Wiener-Hopf equations become:

$$\sum_{k=-\infty}^{\infty} h[k]\,r_{xx}[l-k] = r_{ss}[l], \qquad -\infty < l < \infty$$

(This differs from the filtering case: sum over all k, solve for all l.) The left side is a convolution, $h[n] * r_{xx}[n] = r_{ss}[n]$, so taking Fourier transforms:

$$H(f) = \frac{P_{ss}(f)}{P_{xx}(f)} = \frac{P_{ss}(f)}{P_{ss}(f)+P_{ww}(f)}$$

$H(f) \approx 1$ when $P_{ss}(f) \gg P_{ww}(f)$; $H(f) \approx 0$ when $P_{ss}(f) \ll P_{ww}(f)$.
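A quick frequency-domain sketch of this smoother (the PSDs below are invented for illustration): build H(f) on an FFT grid and apply it by pointwise multiplication, a circular approximation of the non-causal convolution:

```python
import numpy as np

def iir_wiener_smoother(x, P_ss, P_ww):
    """Apply H(f) = P_ss(f) / (P_ss(f) + P_ww(f)) on an FFT grid."""
    H = P_ss / (P_ss + P_ww)      # ~1 where signal dominates, ~0 where noise does
    return np.fft.ifft(H * np.fft.fft(x)).real

n = 256
f = np.fft.fftfreq(n)                        # digital frequencies
P_ss = 1.0 / (1.0 + (8.0 * f) ** 2)          # made-up low-pass signal PSD
P_ww = 0.25 * np.ones(n)                     # white-noise PSD
rng = np.random.default_rng(2)
x = np.cos(2 * np.pi * 0.02 * np.arange(n)) + 0.5 * rng.standard_normal(n)
s_hat = iir_wiener_smoother(x, P_ss, P_ww)   # de-noised signal
```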
Relationship of Prediction to AR Estimation & Yule-Walker

Wiener-Hopf Prediction Equations:

$$\mathbf{R}_{xx}\,\mathbf{h} = \tilde{\mathbf{r}}_{xx},
\qquad \tilde{\mathbf{r}}_{xx} = [\,r_{xx}[l]\ r_{xx}[l+1]\ \ldots\ r_{xx}[l+N-1]\,]^T$$

$$\begin{bmatrix}
r_{xx}[0] & r_{xx}[1] & \cdots & r_{xx}[N-1]\\
r_{xx}[1] & r_{xx}[0] & \cdots & r_{xx}[N-2]\\
\vdots & \vdots & \ddots & \vdots\\
r_{xx}[N-1] & r_{xx}[N-2] & \cdots & r_{xx}[0]
\end{bmatrix}
\begin{bmatrix} h[0]\\ h[1]\\ \vdots\\ h[N-1]\end{bmatrix}
=
\begin{bmatrix} r_{xx}[l]\\ r_{xx}[l+1]\\ \vdots\\ r_{xx}[l+N-1]\end{bmatrix}
\qquad\text{(Toeplitz \& symmetric)}$$

For $l = 1$ we get EXACTLY the Yule-Walker equations used in Ex. 7.18 to solve for the ML estimates of the AR parameters!!

⇒ FIR prediction coefficients are estimated AR parameters.

Recall: we first estimated the ACF lags $r_{xx}[k]$ using the data, then used the estimates to find estimates of the AR parameters:

$$\hat{\mathbf{h}} = \hat{\mathbf{R}}_{xx}^{-1}\,\hat{\tilde{\mathbf{r}}}_{xx}$$
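A compact sketch of that two-step recipe (the ACF estimator and names are my own illustration), again exploiting the Toeplitz structure:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def acf(x, max_lag):
    """Biased sample ACF estimates r_xx[0..max_lag]."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.array([x[: len(x) - k] @ x[k:] / len(x) for k in range(max_lag + 1)])

def ar_via_yule_walker(x, p):
    """1-step (l = 1) Wiener-Hopf prediction = Yule-Walker AR(p) estimate."""
    r = acf(x, p)
    return solve_toeplitz(r[:p], r[1 : p + 1])    # solves R_xx h = [r[1] ... r[p]]

# Demo: recover the parameters of a synthetic (stable) AR(3) process
rng = np.random.default_rng(3)
a_true = np.array([1.5, -0.9, 0.2])               # x[k] = sum_i a_i x[k-i] + u[k]
x = np.zeros(5000)
for k in range(3, len(x)):
    x[k] = a_true @ x[k - 3 : k][::-1] + rng.standard_normal()
print(ar_via_yule_walker(x, 3))                   # close to a_true
```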
Relationship of Prediction to the Inverse/Whitening Filter

(Block diagram: on the "imagination & modeling" side, white noise $u[k]$ drives the AR model $1/(1-a(z))$ to produce $x[k]$; on the "physical reality" side, $x[k]$ is the observed signal. Feeding $x[k]$ through the FIR predictor $a(z)$ gives the 1-step prediction $\hat{x}[k]$, and the prediction error $x[k]-\hat{x}[k] = u[k]$ is white noise. Thus the inverse filter $1-a(z)$ whitens the observed signal.)
Results for 1-Step Prediction: For AR(3)

(Figure: signal value vs. sample index k for 100 samples, overlaying the signal, its 1-step prediction, and the prediction error; the error trace has a much smaller dynamic range than the signal.)

At each k we predict x[k] using the past 3 samples.

Application to data compression: the smaller dynamic range of the error gives more efficient binary coding (e.g., DPCM: Differential Pulse Code Modulation).
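A tiny illustration of that compression idea (all numbers synthetic), reusing `ar_via_yule_walker` and the AR(3) realization `x` from the sketch above: the 1-step prediction residual has a much smaller spread than the raw signal, so it codes more efficiently.

```python
import numpy as np

h = ar_via_yule_walker(x, 3)                       # 3-tap 1-step predictor
x_hat = np.array([h @ x[k - 3 : k][::-1] for k in range(3, len(x))])
err = x[3:] - x_hat                                # prediction error (innovation)
print(np.ptp(x), np.ptp(err))                      # error dynamic range is much smaller
```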