
Introductory Econometrics - Cheat Sheet for the Final Exam

You are allowed to use the following cheat sheet for the final exam.

1. Properties of the log function:
$100 \cdot \Delta \log(x) \approx \%\Delta x$
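A quick numerical check of the approximation above (a minimal sketch; the numbers are made up for illustration and are not part of the original sheet):

```python
import numpy as np

x0, x1 = 50.0, 52.5                           # x increases by 5%
log_approx = 100 * (np.log(x1) - np.log(x0))  # 100 * change in log(x)
pct_change = 100 * (x1 - x0) / x0             # exact percentage change in x
print(log_approx, pct_change)                 # ~4.88 vs 5.00: close for small changes
```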
2. Multiple linear regression assumptions:
MLR.1: $Y = \beta_0 + \beta_1 X_1 + \ldots + \beta_k X_k + U$ (model)
MLR.2: The observed data $\{(Y_i, X_{1i}, \ldots, X_{ki}),\ i = 1, \ldots, n\}$ is a random sample from the population.
MLR.3: In the sample, none of the explanatory variables has constant values and there is no perfect collinearity among the explanatory variables.
MLR.4: (Zero conditional mean) $E(U \mid X_1, \ldots, X_k) = 0$ for any possible value of $X_1, \ldots, X_k$.
MLR.5: (Homoskedasticity) $Var(U \mid X_1, \ldots, X_k) = Var(U) = \sigma^2$ for any possible value of $X_1, \ldots, X_k$.
MLR.6: (Normality) $U \mid X_1, \ldots, X_k \sim N(0, \sigma^2)$ for any possible value of $X_1, \ldots, X_k$.

3. The OLS estimator and its algebraic properties
The OLS estimator solves
$$\min_{\hat\beta_0, \ldots, \hat\beta_k} \sum_{i=1}^{n} \left(Y_i - \hat\beta_0 - \hat\beta_1 X_{1i} - \hat\beta_2 X_{2i} - \ldots - \hat\beta_k X_{ki}\right)^2$$

Solution: In the general case no explicit solution was given. If $k = 1$, i.e. there is only one explanatory variable, then
$$\hat\beta_1 = \frac{\sum_{i=1}^{n} (X_{1i} - \bar X_1)(Y_i - \bar Y)}{\sum_{i=1}^{n} (X_{1i} - \bar X_1)^2}$$
and $\hat\beta_0 = \bar Y - \hat\beta_1 \bar X_1$.

Predicted value for individual $i$ in the sample: $\hat Y_i = \hat\beta_0 + \hat\beta_1 X_{1i} + \ldots + \hat\beta_k X_{ki}$.
Residual for individual $i$ in the sample: $\hat U_i = Y_i - \hat Y_i$.
$$\sum_{i=1}^{n} \hat U_i = 0, \qquad \sum_{i=1}^{n} \hat U_i X_{ji} = 0, \quad j = 1, \ldots, k$$

The estimated regression line goes through $(\bar X_1, \ldots, \bar X_k, \bar Y)$.
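A minimal Python sketch of the $k = 1$ formulas and the algebraic properties above; the data and variable names are made up for illustration and are not part of the original sheet:

```python
import numpy as np

# Toy data for a simple regression (k = 1); the true model here is illustrative only
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 0.5 * x + rng.normal(size=100)

# OLS slope and intercept from the closed-form k = 1 formulas
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# Fitted values and residuals
y_hat = beta0_hat + beta1_hat * x
u_hat = y - y_hat

# Algebraic properties: residuals sum to zero and are orthogonal to the regressor
print(np.isclose(u_hat.sum(), 0.0), np.isclose((u_hat * x).sum(), 0.0))
```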

4. SST, SSE, SSR, $R^2$
$$SST = \sum_{i=1}^{n} (Y_i - \bar Y)^2, \qquad SSE = \sum_{i=1}^{n} (\hat Y_i - \bar Y)^2, \qquad SSR = \sum_{i=1}^{n} \hat U_i^2$$
$$SST = SSE + SSR, \qquad R^2 = SSE/SST = 1 - SSR/SST$$
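A minimal sketch of the sums-of-squares decomposition and $R^2$, again on made-up simple-regression data (not part of the original sheet):

```python
import numpy as np

# Toy simple regression; data are illustrative only
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 0.5 * x + rng.normal(size=100)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x
u_hat = y - y_hat

SST = np.sum((y - y.mean()) ** 2)      # total sum of squares
SSE = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
SSR = np.sum(u_hat ** 2)               # residual sum of squares
print(np.isclose(SST, SSE + SSR))      # decomposition SST = SSE + SSR
print(SSE / SST, 1 - SSR / SST)        # two equivalent ways to compute R^2
```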

5. Variance of the OLS estimator, estimated standard error, etc.
For $j = 1, \ldots, k$:
$$Var(\hat\beta_j) = \frac{\sigma^2}{SST_{X_j}(1 - R_j^2)}$$
where $SST_{X_j} = \sum_{i=1}^{n} (X_{ji} - \bar X_j)^2$ and $R_j^2$ is the R-squared from the regression of $X_j$ on all the other X variables. The estimated standard error of $\hat\beta_j$, $j = 1, \ldots, k$, is given by
$$se(\hat\beta_j) = \sqrt{\frac{\hat\sigma^2}{SST_{X_j}(1 - R_j^2)}}, \qquad \text{where } \hat\sigma^2 = \frac{SSR}{n - k - 1}.$$
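A minimal sketch of the standard-error formula in the simple-regression case ($k = 1$, where $R_1^2 = 0$); the data are made up for illustration:

```python
import numpy as np

# Toy simple regression (k = 1); data are illustrative only
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 0.5 * x + rng.normal(size=100)
n, k = len(y), 1
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
u_hat = y - (b0 + b1 * x)

sigma2_hat = np.sum(u_hat ** 2) / (n - k - 1)  # sigma^2-hat = SSR / (n - k - 1)
SST_x = np.sum((x - x.mean()) ** 2)
se_b1 = np.sqrt(sigma2_hat / SST_x)            # R_1^2 = 0 with a single regressor
print(b1, se_b1)
```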

6. Simple omitted variables formula:
True model: $Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + U$
Estimated model: $Y = \beta_0 + \beta_1 X_1 + V$
$$E(\hat\beta_1) = \beta_1 + \beta_2 \frac{Cov(X_1, X_2)}{Var(X_1)}$$
where $\hat\beta_1$ is the OLS estimator of $\beta_1$ in the estimated model.
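A small simulation sketch of the omitted-variables formula; all numbers and variable names here are illustrative assumptions, not from the original sheet:

```python
import numpy as np

# Simulate the "true" model Y = 1 + 2*X1 + 3*X2 + U with X2 correlated with X1
rng = np.random.default_rng(0)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Short regression of Y on X1 only (X2 omitted)
b1_short = np.sum((x1 - x1.mean()) * (y - y.mean())) / np.sum((x1 - x1.mean()) ** 2)

# Omitted-variables formula: beta1 + beta2 * Cov(X1, X2) / Var(X1)
c = np.cov(x1, x2)
predicted = 2.0 + 3.0 * c[0, 1] / c[0, 0]
print(b1_short, predicted)   # both close to 2 + 3*0.6 = 3.8
```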

7. Hypothesis testing
1. Basic notions
Type 1 error = rejecting a true null hypothesis
Type 2 error = accepting a false null hypothesis
Significance level = probability of a Type 1 error
2. t-statistic for testing $H_0: \beta_j = a$ (a numerical sketch of the t- and F-statistics follows item 7.3 below):
$$\frac{\hat\beta_j - a}{se(\hat\beta_j)} \sim t(n - k - 1) \quad \text{if } H_0 \text{ is true}$$

3. The F-statistic for testing hypotheses involving more than one regression coefficient (ur = unrestricted, r = restricted, q = # of restrictions in $H_0$, k = # of slope coefficients in the unrestricted regression):
$$\frac{(SSR_r - SSR_{ur})/q}{SSR_{ur}/(n - k - 1)} \sim F(q, n - k - 1) \quad \text{if } H_0 \text{ is true}$$
or, under certain conditions,
$$\frac{(R^2_{ur} - R^2_r)/q}{(1 - R^2_{ur})/(n - k - 1)} \sim F(q, n - k - 1) \quad \text{if } H_0 \text{ is true}$$
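A minimal sketch of the t- and F-statistics above, computed by hand on made-up data; the helper function, data, and hypotheses are illustrative assumptions, not part of the original sheet:

```python
import numpy as np

# Illustrative data: Y depends on X1 only; X2 and X3 are irrelevant regressors
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))
y = 1.0 + 0.5 * X[:, 0] + rng.normal(size=n)

def ols(y, X):
    """OLS with an intercept; returns (coefficients, R^2, SSR)."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    u = y - Xc @ beta
    ssr = np.sum(u ** 2)
    return beta, 1 - ssr / np.sum((y - y.mean()) ** 2), ssr

# Unrestricted model uses X1, X2, X3; restricted model imposes H0: beta2 = beta3 = 0
k, q = 3, 2
beta_ur, r2_ur, ssr_ur = ols(y, X)
_, r2_r, ssr_r = ols(y, X[:, [0]])

# F-statistic, SSR form and R^2 form (identical here because both use the same Y)
F_ssr = ((ssr_r - ssr_ur) / q) / (ssr_ur / (n - k - 1))
F_r2 = ((r2_ur - r2_r) / q) / ((1 - r2_ur) / (n - k - 1))
print(F_ssr, F_r2)               # compare with an F(q, n-k-1) critical value

# t-statistic for H0: beta1 = 0, using the se formula from item 5
sigma2_hat = ssr_ur / (n - k - 1)
x1 = X[:, 0]
_, r2_aux, _ = ols(x1, X[:, 1:])  # R^2 of X1 on the other regressors
se_b1 = np.sqrt(sigma2_hat / (np.sum((x1 - x1.mean()) ** 2) * (1 - r2_aux)))
print((beta_ur[1] - 0) / se_b1)   # compare with a t(n-k-1) critical value
```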

8. Heteroskedasticity
Testing for heteroskedasticity: Regress the squared residuals on all explanatory variables; test for joint significance of the explanatory variables (see the sketch at the end of this sheet).
Correcting for heteroskedasticity: Use heteroskedasticity-robust standard errors; correct the F-test formula. OR: Use GLS.

9. Regression with time series data
TS.1: $Y_t = \beta_0 + \beta_1 X_{1t} + \beta_2 X_{2t} + \ldots + \beta_k X_{kt} + U_t$
TS.2: Same as MLR.3. In the assumptions below let the matrix X contain the values of all explanatory variables in all time periods (past, present and future).
TS.3: (Zero conditional mean) $E[U_t \mid X] = 0$, $t = 1, 2, \ldots, n$.
TS.4: (No heteroskedasticity) $Var(U_t \mid X) = Var(U_t) = \sigma^2$, $t = 1, 2, \ldots, n$.
TS.5: (No autocorrelation) $Corr(U_t, U_s) = 0$ for $t \neq s$.
TS.6: (Normality) $U_t \mid X \sim N(0, \sigma^2)$.
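Referring back to item 8, a minimal sketch of the heteroskedasticity test (regress the squared residuals on the explanatory variables and test their joint significance); all data and names are illustrative assumptions:

```python
import numpy as np

# Illustrative data where the error variance grows with |X1| (heteroskedasticity)
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))
y = 1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] + np.abs(X[:, 0]) * rng.normal(size=n)

def ols_resid_r2(y, X):
    """OLS with an intercept; returns (residuals, R^2)."""
    Xc = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    u = y - Xc @ beta
    return u, 1 - np.sum(u ** 2) / np.sum((y - y.mean()) ** 2)

# Step 1: OLS of Y on the explanatory variables, keep the squared residuals
u_hat, _ = ols_resid_r2(y, X)

# Step 2: regress the squared residuals on the explanatory variables
_, r2_aux = ols_resid_r2(u_hat ** 2, X)

# Step 3: F-statistic for joint significance of the regressors in the auxiliary regression
k = X.shape[1]
F = (r2_aux / k) / ((1 - r2_aux) / (n - k - 1))
print(F)   # compare with an F(k, n-k-1) critical value; a large F suggests heteroskedasticity
```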
