Stochastic Filtering
Inference Problem
[Block diagram: the signal Y (1xM) passes through an intervening process to produce the observation X (1xN); an estimator maps X to the estimate Y* (1xN).]
Tallal Elshabrawy
[The same block diagram: signal Y -> intervening process -> observation X (1xN) -> estimator -> estimate Y* (1xN).]
Assume: the observation X carries no information about the signal Y (X and Y are independent).
[Block diagram: the observation X (1xN) enters an estimator h(X), which outputs the estimate Y* (1xN).]
Why is the best estimate Y* = h(X) = E[Y]?
Because the observation X has no information about Y, the only thing we are left with is to estimate Y by its mean E[Y], the constant that minimizes the mean-squared error.
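As an illustrative sketch (not from the slides), the fact that the mean minimizes the mean-squared error among constant estimates can be checked numerically; the distribution below is a hypothetical example:

```python
import numpy as np

# When X carries no information about Y, the estimate must be a constant c.
# MSE(c) = E[(Y - c)^2] is minimized at c = E[Y]; check on hypothetical samples.
rng = np.random.default_rng(0)
y = rng.normal(loc=3.0, scale=1.0, size=100_000)  # samples of the signal Y

def mse(c):
    return np.mean((y - c) ** 2)

c_star = y.mean()                       # sample estimate of E[Y]
print(mse(c_star) < mse(c_star + 0.5))  # the mean beats any shifted constant
```

Since mse(c) = mse(c_star) + (c - c_star)^2 for the sample mean c_star, any other constant strictly increases the error.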
[Block diagram of the linear estimator: the observation X (1xN) is multiplied by H^T (NxN) to form the estimate Y* (1xN); W (1xN) and the error Z (1xN) also appear in the diagram.]
Claim:
There exists a matrix H such that the error Z = Y - Y* is
uncorrelated with X (i.e., E[Z^T X] = 0, so E[Z_j X_i] = 0 for all i, j).

Writing Y*^T = H X^T and setting the cross-correlation to zero:
E[(Y^T - H X^T) X] = 0
E[Y^T X] - E[H X^T X] = 0
E[Y^T X] - H E[X^T X] = 0
R_XY = H R_X  =>  H = R_XY R_X^{-1}

where R_XY = E[Y^T X] and R_X = E[X^T X].
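The derivation above can be checked numerically. The sketch below uses a hypothetical setup (the mixing matrix and noise level are illustrative, not from the slides): it forms H from sample correlation matrices and verifies that the resulting error is uncorrelated with the observation.

```python
import numpy as np

# Row-vector convention from the slides: Y* = X H^T with H = R_XY R_X^{-1},
# where R_X = E[X^T X] and R_XY = E[Y^T X]. Zero-mean signals are assumed.
rng = np.random.default_rng(1)
n_samples, N = 200_000, 3
Y = rng.normal(size=(n_samples, N))                  # rows are 1xN signals
A = rng.normal(size=(N, N))                          # hypothetical channel
X = Y @ A.T + 0.1 * rng.normal(size=(n_samples, N))  # noisy observation

R_X = (X.T @ X) / n_samples
R_XY = (Y.T @ X) / n_samples
H = R_XY @ np.linalg.inv(R_X)

Z = Y - X @ H.T                  # error Z = Y - Y*
cross = (Z.T @ X) / n_samples    # sample E[Z^T X]
print(np.abs(cross).max())       # ~0: error is uncorrelated with X
```

Because H is built from the same sample moments, the sample cross-correlation R_XY - H R_X vanishes exactly up to floating-point error.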
Claim:
H = R_XY R_X^{-1} is an optimal linear estimator.
Proof
Let H and G be two linear estimators, and suppose H = R_XY R_X^{-1}, so that the error Z is uncorrelated with X.

mse_G = E||Y^T - G X^T||^2 = E||(Y^T - H X^T) + (H X^T - G X^T)||^2
mse_G = E||Y^T - H X^T||^2 + E||H X^T - G X^T||^2 + 2 E[(Y^T - H X^T)^T (H X^T - G X^T)]

Since by design Z^T = Y^T - H X^T is uncorrelated with X^T, it is orthogonal to every linear function of X^T, in particular to (H - G) X^T. Hence

E[(Y^T - H X^T)^T (H X^T - G X^T)] = 0
Proof (continued)
Therefore
mse_G = E||Y^T - H X^T||^2 + E||H X^T - G X^T||^2
mse_G = mse_H + E||H X^T - G X^T||^2
Since E||H X^T - G X^T||^2 is nonnegative,
mse_G >= mse_H, with equality only if G X^T = H X^T (i.e., G = H whenever R_X is nonsingular).
Therefore
H is the optimal linear estimator.
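The optimality argument can also be illustrated numerically. This is a sketch on hypothetical data (not part of the slides): perturbing H in any direction can only increase the mean-squared error.

```python
import numpy as np

# mse_G = mse_H + E||(H - G) X^T||^2 >= mse_H for any linear estimator G.
rng = np.random.default_rng(2)
n_samples, N = 100_000, 3
Y = rng.normal(size=(n_samples, N))
X = Y @ rng.normal(size=(N, N)).T + 0.2 * rng.normal(size=(n_samples, N))

R_X = (X.T @ X) / n_samples
R_XY = (Y.T @ X) / n_samples
H = R_XY @ np.linalg.inv(R_X)
G = H + 0.1 * rng.normal(size=(N, N))   # an arbitrary competing estimator

def mse(M):
    return np.mean(np.sum((Y - X @ M.T) ** 2, axis=1))

print(mse(H) < mse(G))  # H achieves the smaller error
```

Here H computed from sample moments is exactly the least-squares minimizer over the samples, so any distinct G incurs a strictly larger sample MSE.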