
IV.

Stochastic Filtering

Inference Problem
[Block diagram: Signal Y (1 x N) -> Intervening Process -> Observation X (1 x M) -> Estimator -> Estimation Y* (1 x N)]
The problem is to infer Y from the observation X so as to minimize the mean square error (mse): E||Y - Y*||^2

Example Applications:
De-noising
Prediction

Tallal Elshabrawy

Inference Problem

Assume:

X, Y are jointly distributed multi-dimensional random variables
Y is 1 x N with covariance matrix R_Y
X is 1 x M with covariance matrix R_X
The cross-covariance matrix is R_XY = E[(Y - m_Y)^T (X - m_X)], where m_Y, m_X are the mean vectors of the random variables Y, X respectively
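As a concrete sketch (not part of the slides), these matrices can be estimated from samples. The joint model below is made up purely for illustration; the shapes follow the row-vector convention Y (1 x N), X (1 x M).

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, M = 100_000, 2, 1  # number of samples and the dimensions of Y, X

# Jointly distributed Y (1 x N) and X (1 x M); here X is a noisy mixture of Y
Y = rng.normal(size=(T, N))                       # each row is one sample of Y
X = Y @ np.array([[1.0], [0.5]]) + 0.1 * rng.normal(size=(T, M))

mY, mX = Y.mean(axis=0), X.mean(axis=0)           # mean vectors m_Y, m_X
R_Y  = (Y - mY).T @ (Y - mY) / T                  # N x N covariance of Y
R_X  = (X - mX).T @ (X - mX) / T                  # M x M covariance of X
R_XY = (Y - mY).T @ (X - mX) / T                  # N x M: E[(Y - m_Y)^T (X - m_X)]

print(R_Y.shape, R_X.shape, R_XY.shape)           # (2, 2) (1, 1) (2, 1)
```

Note that R_XY is N x M, matching the outer-product order (Y - m_Y)^T (X - m_X) in the definition above.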


Inference Problem Example


[Block diagram: Signal Y (1 x N) -> noisy channel -> Observation X (1 x N) -> Estimator h(X) -> Estimation Y* (1 x N)]

Signal Y is affected by noise W

We would like to estimate Y given that we see an observation X
Design Problem:
What is the h(X) that minimizes the mse E||Y - Y*||^2?


Inference Problem Example



Suppose f_XY(x, y) = f_X(x) f_Y(y) (i.e., X, Y are independent)

Y* = h(X) = E[Y]

Why?
Because the observation X carries no information about Y.
Therefore the only thing we are left with is to estimate Y by its mean E[Y].
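A quick numerical check of this case (a sketch with made-up numbers, not from the slides): when X is independent of Y the cross-covariance vanishes, so the best mse estimate degenerates to the constant E[Y].

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200_000
Y = 3.0 + rng.normal(size=(T, 1))   # signal with mean E[Y] = 3
X = rng.normal(size=(T, 1))         # observation independent of Y

# Cross-covariance is (approximately) zero: X carries no information about Y
R_XY = (Y - Y.mean(0)).T @ (X - X.mean(0)) / T
print(abs(R_XY[0, 0]) < 0.05)       # True

# So the mse-optimal estimate is the constant E[Y], estimated by the sample mean
print(round(float(Y.mean())))       # 3
```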

Inference Problem Example



Suppose X, Y are totally dependent (e.g., X = Y)

Y* = h(X) = X
Why?
Because there is a deterministic (non-random) relation between X and Y

Inference Problem Example



Therefore we can conclude that the correlation between X and Y affects the design of the optimal estimator.
The estimator h(X) could be linear or non-linear. We shall only consider linear estimators, of the form
Y*^T = H X^T

Inference Problem Example


[Block diagram: Signal Y (1 x N), plus noise W (1 x N), gives Observation X (1 x N); X -> H^T (N x N) -> Estimation Y* (1 x N); Z (1 x N) is the error Y - Y*]

In the model above:

W represents the noise
Z = Y - Y* represents the error between the estimation and reality
Y^T = Z^T + Y*^T, hence Y^T = H X^T + Z^T

Inference Problem Example:
Linear Optimal Estimator

Claim:
There exists a matrix H such that the error Z = Y - Y* is uncorrelated with X (i.e., E[Z^T X] = 0, equivalently E[Z_j X_i] = 0 for all i, j).
Assuming zero-mean X and Y (otherwise work with the mean-subtracted variables):
E[(Y^T - H X^T) X] = 0
E[Y^T X] - E[H X^T X] = 0
E[Y^T X] - H E[X^T X] = 0
R_XY = H R_X, hence H = R_XY R_X^{-1}
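The closed form H = R_XY R_X^{-1} and the orthogonality E[Z^T X] = 0 can be verified numerically. The sketch below assumes a zero-mean signal and the slides' additive-noise model X = Y + W; the particular mixing matrix and noise level are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 200_000, 3

# Zero-mean correlated signal Y and observation X = Y + W
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.5],
              [0.0, 0.0, 1.0]])
Y = rng.normal(size=(T, N)) @ A          # covariance of each row is A^T A
X = Y + 0.5 * rng.normal(size=(T, N))    # W is white noise

R_X  = X.T @ X / T                       # E[X^T X] (zero-mean, so = covariance)
R_XY = Y.T @ X / T                       # E[Y^T X]
H = R_XY @ np.linalg.inv(R_X)            # H = R_XY R_X^{-1}

Y_star = X @ H.T                         # row-wise form of Y*^T = H X^T
Z = Y - Y_star                           # error

# Orthogonality: every entry of E[Z^T X] is (numerically) zero
print(np.abs(Z.T @ X / T).max() < 1e-6)  # True
```

Because H is built from the same sample covariances, the sample version of E[Z^T X] vanishes up to floating-point error, mirroring the derivation above.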

Inference Problem Example:
Linear Optimal Estimator

Claim:
H = R_XY R_X^{-1} is an optimal linear estimator


Proof
Let H, G be two linear estimators, and suppose H is the one that makes the error Z uncorrelated with X, i.e., H = R_XY R_X^{-1}.
mse_G = E||Y^T - G X^T||^2 = E||(Y^T - H X^T) + (H X^T - G X^T)||^2
mse_G = E||Y^T - H X^T||^2 + E||H X^T - G X^T||^2 + 2 E[(Y^T - H X^T)^T (H X^T - G X^T)]
Since BY DESIGN Z^T = Y^T - H X^T is uncorrelated with X^T, and (H - G) X^T is a linear function of X^T, the cross term vanishes:
E[(Y^T - H X^T)^T (H X^T - G X^T)] = E[Z (H - G) X^T] = sum over i, j of (H - G)_{ji} E[Z_j X_i] = 0
because every entry E[Z_j X_i] is zero by the choice of H.


Proof
Therefore
mse_G = E||Y^T - H X^T||^2 + E||H X^T - G X^T||^2 + 2 E[(Y^T - H X^T)^T (H X^T - G X^T)]
mse_G = mse_H + E||H X^T - G X^T||^2
E||H X^T - G X^T||^2 is always non-negative, and strictly positive unless H X^T = G X^T
Therefore
mse_G > mse_H unless G = H
Therefore
H is the optimal linear estimator
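The conclusion can be checked numerically: any other linear estimator G pays the extra term E||H X^T - G X^T||^2 on top of mse_H. The setup below (additive-noise model X = Y + W, illustrative dimensions and noise level) is a sketch, not part of the slides.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 200_000, 2
Y = rng.normal(size=(T, N))              # zero-mean signal
X = Y + 0.7 * rng.normal(size=(T, N))    # observation X = Y + W

H = (Y.T @ X / T) @ np.linalg.inv(X.T @ X / T)   # H = R_XY R_X^{-1}
G = H + 0.1 * rng.normal(size=H.shape)           # any other linear estimator

def mse(A):
    """Sample mean of ||Y^T - A X^T||^2 over all T samples."""
    return np.mean(np.sum((Y - X @ A.T) ** 2, axis=1))

# mse_G = mse_H + E||H X^T - G X^T||^2 > mse_H
print(mse(H) < mse(G))  # True
```

Because H is computed from the sample covariances, the sample cross term is exactly zero, so the inequality holds for every G different from H on this data set.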
