Maximilian Kasy
Residual regression

Roadmap, Part I
▶ Consider regression with K regressors:
  E*(Y | 1, X_1, ..., X_K) = β_0 + β_1 X_1 + ... + β_K X_K,
  E*(X_K | 1, X_1, ..., X_{K-1}) = γ_0 + γ_1 X_1 + ... + γ_{K-1} X_{K-1},
  X̃_K = X_K − E*(X_K | 1, X_1, ..., X_{K-1}).
▶ Claim:
  β_K = E(Y X̃_K) / E(X̃_K^2).
Solution:
▶ Rewrite X_K in terms of its linear predictor and residual:
  E*(Y | 1, X_1, ..., X_K) = β_0 + β_1 X_1 + ... + β_{K-1} X_{K-1}
    + β_K (γ_0 + γ_1 X_1 + ... + γ_{K-1} X_{K-1} + X̃_K)
  = β̃_0 + β̃_1 X_1 + ... + β̃_{K-1} X_{K-1} + β_K X̃_K,
  with β̃_j = β_j + β_K γ_j (j = 0, 1, ..., K − 1).
▶ Use orthogonality of X̃_K and the other X's: since E(X̃_K X_j) = 0 for j = 1, ..., K − 1 and E(X̃_K) = 0, multiplying the linear predictor by X̃_K and taking expectations leaves E(Y X̃_K) = β_K E(X̃_K^2), which gives the claim.
Sample Counterpart
▶ Data:
  y = (y_1, ..., y_n)', x_j = (x_{1j}, ..., x_{nj})' (j = 0, 1, ..., K).
▶ Linear predictor and residual regressor:
  x̃_K = x_K − (c_0 x_0 + c_1 x_1 + ... + c_{K-1} x_{K-1}), the residual from the sample regression of x_K on x_0, ..., x_{K-1}, so that
  b_K = (x̃_K' y) / (x̃_K' x̃_K).
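The sample version of the claim holds exactly, not just in expectation, and can be checked numerically. A minimal sketch with numpy; the simulated design and coefficient values are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 1000, 3

# Illustrative simulated data; first column is the constant regressor x_0 = 1.
x = np.column_stack([np.ones(n), rng.normal(size=(n, K))])
y = x @ np.array([1.0, 2.0, -1.0, 0.5]) + rng.normal(size=n)

# Long regression of y on all columns of x.
b = np.linalg.lstsq(x, y, rcond=None)[0]

# Auxiliary regression: residual of the last column on the others.
c = np.linalg.lstsq(x[:, :-1], x[:, -1], rcond=None)[0]
xK_tilde = x[:, -1] - x[:, :-1] @ c

# Sample counterpart of the claim: b_K = (x̃_K' y) / (x̃_K' x̃_K), exactly.
bK = (xK_tilde @ y) / (xK_tilde @ xK_tilde)
print(np.isclose(bK, b[-1]))  # True
```

The equality is an algebraic identity of least squares (the Frisch–Waugh–Lovell result), so it holds up to floating-point error for any data, not just this simulation.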
Omitted variables
▶ Long regression:
  E*(Y | 1, X_1, ..., X_K) = β_0 + β_1 X_1 + ... + β_K X_K
▶ Short regression:
  E*(Y | 1, X_1, ..., X_{K-1}) = α_0 + α_1 X_1 + ... + α_{K-1} X_{K-1}
▶ Auxiliary regression:
  E*(X_K | 1, X_1, ..., X_{K-1}) = γ_0 + γ_1 X_1 + ... + γ_{K-1} X_{K-1}
▶ Claim:
  α_j = β_j + β_K γ_j (j = 0, 1, ..., K − 1)
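The claim can be seen by projecting the long regression onto (1, X_1, ..., X_{K−1}) and using the law of iterated linear projections; a sketch of the derivation:

```latex
% The law of iterated projections replaces X_K by its linear predictor:
\begin{align*}
E^*(Y \mid 1, X_1, \dots, X_{K-1})
  &= \beta_0 + \beta_1 X_1 + \dots + \beta_{K-1} X_{K-1}
     + \beta_K \, E^*(X_K \mid 1, X_1, \dots, X_{K-1}) \\
  &= (\beta_0 + \beta_K \gamma_0)
     + (\beta_1 + \beta_K \gamma_1) X_1 + \dots
     + (\beta_{K-1} + \beta_K \gamma_{K-1}) X_{K-1}.
\end{align*}
% Matching coefficients with the short regression gives
% \alpha_j = \beta_j + \beta_K \gamma_j.
```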
Sample Counterpart
▶ Long regression: regress y on x_0, x_1, ..., x_K; coefficients b_0, b_1, ..., b_K.
▶ Short regression: regress y on x_0, x_1, ..., x_{K-1}; coefficients a_0, a_1, ..., a_{K-1}.
▶ Auxiliary regression: regress x_K on x_0, x_1, ..., x_{K-1}; coefficients c_0, c_1, ..., c_{K-1}.
▶ Claim:
  a_j = b_j + b_K c_j (j = 0, 1, ..., K − 1)
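The sample identity a_j = b_j + b_K c_j holds exactly for any data set, which a short numpy sketch can confirm; the data-generating process below is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 500, 3

x = np.column_stack([np.ones(n), rng.normal(size=(n, K))])
# Correlate the last regressor with the others so the bias term is nonzero.
x[:, -1] += x[:, 1] - 0.5 * x[:, 2]
y = x @ np.array([1.0, 2.0, -1.0, 0.5]) + rng.normal(size=n)

b = np.linalg.lstsq(x, y, rcond=None)[0]                  # long regression
a = np.linalg.lstsq(x[:, :-1], y, rcond=None)[0]          # short regression
c = np.linalg.lstsq(x[:, :-1], x[:, -1], rcond=None)[0]   # auxiliary regression

# Omitted-variable identity: a_j = b_j + b_K c_j for every j < K.
print(np.allclose(a, b[:-1] + b[-1] * c))  # True
```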
Matrix version
▶ Linear predictor:
  X = (X_0, X_1, ..., X_K)', β = (β_0, β_1, ..., β_K)',
  E*(Y | X) = Σ_{j=0}^{K} X_j β_j = X'β.
▶ Orthogonality conditions:
  E[X_j (Y − X'β)] = 0 (j = 0, 1, ..., K),
  equivalently E[X (Y − X'β)] = 0,
  E(XY) − E(XX')β = 0.
▶ Solution:
  β = [E(XX')]^{-1} E(XY).
Sample Counterpart
▶ Data:
  y = (y_1, ..., y_n)', x_j = (x_{1j}, ..., x_{nj})' (j = 0, 1, ..., K), b = (b_0, b_1, ..., b_K)',
  x = (x_0, x_1, ..., x_K), the n × (K + 1) matrix with entries x_{ij}.
▶ Linear predictor:
  ŷ = x_0 b_0 + x_1 b_1 + ... + x_K b_K = xb.
▶ Orthogonality conditions:
  Σ_{i=1}^{n} x_{ij} (y_i − ŷ_i) = x_j'(y − xb) = 0 (j = 0, 1, ..., K),
  stacking the K + 1 conditions: x'(y − xb) = 0,
  x'y − x'xb = 0.
▶ Solution:
  b = (x'x)^{-1} x'y.
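The normal equations can be solved directly rather than by inverting x'x; a minimal numpy sketch with simulated data (the design is illustrative) that also verifies the orthogonality conditions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, K = 200, 2
x = np.column_stack([np.ones(n), rng.normal(size=(n, K))])
y = x @ np.array([0.5, 1.0, -2.0]) + rng.normal(size=n)

# Normal equations x'y - x'x b = 0, solved as a linear system
# (numerically preferable to forming the explicit inverse of x'x).
b = np.linalg.solve(x.T @ x, x.T @ y)

# Orthogonality check: residuals are orthogonal to every column of x.
print(np.allclose(x.T @ (y - x @ b), 0))  # True
```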