
Detection and Estimation Theory

ECE 531

Saeed Hajizadeh

Department of Electrical and Computer Engineering


University of Illinois at Chicago

29 March 2016

Saeed Hajizadeh (UIC) Detection and Estimation Theory ECE 531 29 March 2016 1 / 24
Lecture 16

Kalman Filters

Review

Recall that in the last lecture we discussed Bayesian estimation


The Bayesian risk function to be minimized was introduced
Different cost functions led to different estimators
We also derived the MMSE estimator (the posterior mean under a
quadratic cost function)
The Bayesian linear model resembles the classical linear model with a prior

Introduction

Today we will discuss Kalman filters


So far, we have had a fixed parameter to be estimated
What if the parameter evolves over time?
x[n] = f(θ[n], u[n])  (dynamical signal model)
Wiener filters and Kalman filters are used for such problems
When the process is nonstationary or the noise is vector-valued, the
Kalman filter is used
Applications:
Guidance, navigation, and control of vehicles
Time series analysis in econometrics, finance, and signal processing

Outline

1 Scalar Version
Scalar Gauss-Markov Signal Model
Scalar Kalman Filter

2 Vector Version
Vector Gauss-Markov Signal Model
Kalman Filter

3 Application

Dynamical Signal Model

Example of a DC level in WGN:

x[n] = A[n] + u[n],   n ≥ 0

Model the level with a first-order Gauss-Markov process:

s[n] = a s[n−1] + u[n],   n ≥ 0   (1)

s[−1] ~ N(μ_s, σ_s²)
u[n] ~ N(0, σ_u²)  (WGN)
where everything is pairwise independent.
The model looks like an AR(1) process

Dynamical Signal Model

Non-recursive equivalent:

s[n] = a^(n+1) s[−1] + Σ_{k=0}^{n} a^k u[n−k]   (2)

s[n] is a Gaussian random process (proved by induction)


Mean, covariance, and variance:

E(s[n]) = a^(n+1) μ_s   (3)

Cov(s[m], s[n]) = E[(s[m] − E(s[m]))(s[n] − E(s[n]))]
               = a^(m+n+2) σ_s² + σ_u² a^(m−n) Σ_{k=0}^{n} a^(2k),   m ≥ n   (4)

Var(s[n]) = a^(2n+2) σ_s² + σ_u² Σ_{k=0}^{n} a^(2k)   (5)

Dynamical Signal Model

E(s[n]) = a^(n+1) μ_s

Cov(s[m], s[n]) = a^(m+n+2) σ_s² + σ_u² a^(m−n) Σ_{k=0}^{n} a^(2k),   m ≥ n

Var(s[n]) = a^(2n+2) σ_s² + σ_u² Σ_{k=0}^{n} a^(2k)

Mean and variance depend on n


Covariance depends on m and n individually, not only on the lag m − n
Hence s[n] is not a WSS random process
As n → ∞ (with |a| < 1) the process becomes WSS

Dynamical Signal Model

Recursive formulas for the mean, variance, and covariance:

E(s[n]) = a E(s[n−1])   (6)

Var(s[n]) = a² Var(s[n−1]) + σ_u²   (7)

Cov(s[m], s[n]) = a^(m−n) Var(s[n]),   m ≥ n   (8)

These are the mean, variance, and covariance propagation equations


The variance is asymptotically bounded for |a| < 1 (Why? Iterating (7)
gives a geometric series that converges to σ_u²/(1 − a²))

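The propagation equations (6)-(7) can be iterated numerically. A minimal sketch (the values a = 0.9, μ_s = 1, σ_s² = 1, σ_u² = 1 and the horizon are illustrative assumptions, not from the lecture):

```python
# Iterate the mean/variance propagation equations (6)-(7) for the
# scalar Gauss-Markov model s[n] = a*s[n-1] + u[n].

def propagate_moments(a, mu_s, var_s, var_u, n_steps):
    """Return lists of E(s[n]) and Var(s[n]) for n = 0, ..., n_steps-1."""
    mean, var = mu_s, var_s          # moments of the initial state s[-1]
    means, variances = [], []
    for _ in range(n_steps):
        mean = a * mean              # E(s[n]) = a E(s[n-1])        (6)
        var = a**2 * var + var_u     # Var(s[n]) = a^2 Var(s[n-1]) + var_u  (7)
        means.append(mean)
        variances.append(var)
    return means, variances

means, variances = propagate_moments(0.9, 1.0, 1.0, 1.0, 200)
# For |a| < 1 the variance settles at var_u / (1 - a^2) ~ 5.26 here,
# which is why it is asymptotically bounded.
```

Running this shows the mean decaying toward zero and the variance converging to the geometric-series limit σ_u²/(1 − a²).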
Scalar Kalman Filter

s[n] = a s[n−1] + u[n]

x[n] = s[n] + w[n]   (9)

u[n] ~ N(0, σ_u²)  (WGN)
w[n] ~ N(0, σ_n²)  (observation noise; its variance may vary with n)

We want to estimate s[n] based on {x[0], x[1], ..., x[n]}; the estimate is denoted ŝ[n|n]


Recursive estimation: ŝ[n|n] is computed from ŝ[n−1|n−1]
This process is called Kalman filtering

ŝ[n|n] = arg min E[(s[n] − ŝ[n|n])²]

ŝ[n|n] = E(s[n] | x[0], x[1], ..., x[n])
Scalar Kalman Filter

Some properties of the MMSE estimator, for zero-mean jointly Gaussian quantities and two uncorrelated data vectors:

θ̂ = E(θ | x₁, x₂) = E(θ | x₁) + E(θ | x₂)

θ̂ = E(θ₁ + θ₂ | x) = E(θ₁ | x) + E(θ₂ | x)

Using these properties we can prove:

ŝ[n|n−1] = a ŝ[n−1|n−1]   (10)

M[n|n−1] = a² M[n−1|n−1] + σ_u²   (11)

K[n] = M[n|n−1] / (σ_n² + M[n|n−1])   (12)

ŝ[n|n] = ŝ[n|n−1] + K[n] (x[n] − ŝ[n|n−1])   (13)

M[n|n] = (1 − K[n]) M[n|n−1]   (14)

Scalar Kalman Filter

Example:

s[n] = (1/2) s[n−1] + u[n]

s[−1] ~ N(0, 1),   σ_u² = 2,   σ_n² = (1/2)^n

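As a sketch of how the recursions (10)-(14) run on this example, the following simulates the model and the filter (the random seed and the 50-sample horizon are illustrative assumptions):

```python
import random

# Scalar Kalman filter, equations (10)-(14), applied to the example:
# a = 1/2, s[-1] ~ N(0, 1), var_u = 2, var_w[n] = (1/2)^n.
random.seed(0)

a, var_u = 0.5, 2.0
mu_s, var_s = 0.0, 1.0
N = 50

s = random.gauss(mu_s, var_s**0.5)   # initial state s[-1]
s_hat, M = mu_s, var_s               # estimator initialised at the prior
estimates = []
for n in range(N):
    s = a * s + random.gauss(0.0, var_u**0.5)      # state evolution
    var_w = 0.5**n                                 # time-varying obs. noise
    x = s + random.gauss(0.0, var_w**0.5)          # observation (9)

    s_pred = a * s_hat                             # prediction        (10)
    M_pred = a**2 * M + var_u                      # prediction MSE    (11)
    K = M_pred / (var_w + M_pred)                  # Kalman gain       (12)
    s_hat = s_pred + K * (x - s_pred)              # correction        (13)
    M = (1 - K) * M_pred                           # minimum MSE       (14)
    estimates.append(s_hat)
```

Because σ_n² → 0, the gain K[n] approaches 1 and the filter ends up trusting the (nearly noiseless) observations almost completely.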
Kalman Filter Properties

Extension of sequential MMSE estimation to a parameter that evolves in
time
No matrix inversion is needed, so it is computationally more efficient
The prediction stage increases the error; the correction stage decreases it
Best one-step prediction vs. best two-step prediction:

ŝ[n|n−1] vs. ŝ[n+1|n−1]   (15)

Two-step prediction is optimal as n → ∞


The Kalman filter is a whitening filter: the innovations x[n] − ŝ[n|n−1] are uncorrelated

Vector Gauss-Markov Signal Model

Vector state, scalar observation:

s[n] = A s[n−1] + B u[n]   (16)

x[n] = hᵀ[n] s[n] + w[n]   (17)

where A is p × p, B is p × r, u[n] is r × 1 with u[n] ~ N(0, Q) (WGN),
s[−1] ~ N(μ_s, C_s), h[n] is p × 1, and w[n] ~ N(0, σ_n²)

Vector Gauss-Markov Signal Model

Vector state, vector observation:


The state model is the same

x[n] = H[n] s[n] + w[n]   (18)

where H[n] is M × p, x[n] is M × 1, and w[n] is M × 1 with w[n] ~ N(0, C[n])

Kalman Filter

Vector state, scalar observation filter:


Prediction
Minimum prediction MSE matrix (p × p)
Kalman gain vector (p × 1)
Correction
Minimum MSE matrix (p × p)

ŝ[n|n−1] = A ŝ[n−1|n−1]   (19)

M[n|n−1] = A M[n−1|n−1] Aᵀ + B Q Bᵀ   (20)

K[n] = M[n|n−1] h[n] / (σ_n² + hᵀ[n] M[n|n−1] h[n])   (21)

ŝ[n|n] = ŝ[n|n−1] + K[n] (x[n] − hᵀ[n] ŝ[n|n−1])   (22)

M[n|n] = (I − K[n] hᵀ[n]) M[n|n−1]   (23)

Kalman Filter

Vector state, vector observation filter:


Prediction
Minimum prediction MSE matrix (p × p)
Kalman gain matrix (p × M)
Correction
Minimum MSE matrix (p × p)

ŝ[n|n−1] = A ŝ[n−1|n−1]   (24)

M[n|n−1] = A M[n−1|n−1] Aᵀ + B Q Bᵀ   (25)

K[n] = M[n|n−1] Hᵀ[n] (C[n] + H[n] M[n|n−1] Hᵀ[n])⁻¹   (26)

ŝ[n|n] = ŝ[n|n−1] + K[n] (x[n] − H[n] ŝ[n|n−1])   (27)

M[n|n] = (I − K[n] H[n]) M[n|n−1]   (28)

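Equations (24)-(28) translate almost line for line into code. A minimal sketch with NumPy; the matrices A, B, Q, H, C, the prior, and the 2-state/2-observation dimensions are illustrative assumptions:

```python
import numpy as np

def kalman_step(s_hat, M, x, A, B, Q, H, C):
    """One vector Kalman prediction + correction, equations (24)-(28)."""
    s_pred = A @ s_hat                                  # (24)
    M_pred = A @ M @ A.T + B @ Q @ B.T                  # (25)
    S = C + H @ M_pred @ H.T                            # innovation covariance
    K = M_pred @ H.T @ np.linalg.inv(S)                 # (26)
    s_new = s_pred + K @ (x - H @ s_pred)               # (27)
    M_new = (np.eye(len(s_hat)) - K @ H) @ M_pred       # (28)
    return s_new, M_new

# Tiny 2-state, 2-observation example with stand-in observations.
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.eye(2)
Q = 0.1 * np.eye(2)
H = np.eye(2)
C = 0.5 * np.eye(2)
s_hat, M = np.zeros(2), np.eye(2)
for _ in range(20):
    x = rng.normal(size=2)
    s_hat, M = kalman_step(s_hat, M, x, A, B, Q, H, C)
```

The only matrix inversion is of the M × M innovation covariance in (26); for the scalar-observation filter (19)-(23) it degenerates to a scalar division, which is why no matrix inversion is needed there.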
Application

The Kalman filter has found many applications in finance


It is used to model tick-by-tick returns in market microstructure
Efficient price, transaction price, and informational price diffusion:

p_{t,i} = m_{t,i} + u_{t,i},   m_{t,i} = m_{t,i−1} + σ_{t,i} ε_{t,i},   ε_{t,i} i.i.d. ~ N(0, 1)

An ARMA model is assumed for the microstructure noise u_{t,i}


Returns r_{t,i} = p_{t,i} − p_{t,i−1} are then analyzed using Kalman filter techniques

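The efficient-price recursion above is a local-level state-space model, i.e. the scalar Gauss-Markov model with a = 1 (a random-walk state), so the scalar Kalman recursions apply directly. A minimal sketch, simplifying the ARMA microstructure noise to i.i.d. Gaussian noise; the variances, seed, and sample count are illustrative assumptions:

```python
import random

# Filter the efficient price m out of noisy transaction prices p,
# using the scalar Kalman recursions with a = 1 (random-walk state).
random.seed(1)
sigma2, var_u = 1e-6, 4e-6      # efficient-price and microstructure variances
m = 10.0                        # true efficient price
m_hat, M = 10.0, 1e-4           # prior mean and MSE for the state

for _ in range(500):
    m = m + random.gauss(0.0, sigma2**0.5)    # random-walk efficient price
    p = m + random.gauss(0.0, var_u**0.5)     # observed transaction price
    M_pred = M + sigma2                       # prediction MSE (a = 1)
    K = M_pred / (var_u + M_pred)             # Kalman gain
    m_hat = m_hat + K * (p - m_hat)           # filtered efficient price
    M = (1 - K) * M_pred
```

The filtered m_hat tracks the efficient price while averaging out the microstructure noise u_{t,i}, which is the basis for decomposing observed returns into informational and noise components.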
Summary

The Gauss-Markov model for estimating an evolving parameter


The Kalman filter estimates the parameter recursively
Mean, variance, and covariance propagation equations
Scalar and vector state and observation models were considered
One application was examined

