
Exercises 1 and 2

. clear all

. set seed 123456798

. set obs 50
obs was 0, now 50

. gen x1 = invnorm(uniform())

. gen e = invnorm(uniform())

. gen x2 = x1 + e

. gen u = invnorm(uniform())

. gen x3 = .5*x1 + .7*x2

. gen y = 1 + (2/3)*x1 + (1/3)*x2 + u

. reg y x1 x2, r

Linear regression                               Number of obs =      50
                                                F(  2,    47) =   37.77
                                                Prob > F      =  0.0000
                                                R-squared     =  0.6080
                                                Root MSE      =  .99448

             |               Robust
           y |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
          x1 |   1.005005   .2340611     4.29   0.000     .5341351    1.475876
          x2 |   .2286828   .1176094     1.94   0.058    -.0079168    .4652824
       _cons |   1.277729   .1402353     9.11   0.000     .9956125    1.559846

. vce

Covariance matrix of coefficients of regress model

        e(V) |         x1          x2       _cons
-------------+------------------------------------
          x1 |  .05478461
          x2 | -.02028526   .01383198
       _cons |

In both cases we are going to use the Student's t distribution, because we are testing single hypotheses.

- For H0: b1 = 0 at the 5% significance level: looking at the table above, the t value is 4.29 > 1.96, so there is strong statistical evidence to reject the null hypothesis.

- For H0: b2 = 1/3 at the 5% significance level:

. display (x2 - 1/3)/0.1176094
2.347888

Taking the absolute value, 2.34 > 1.96, so there is moderate evidence to reject the null hypothesis at the 5% significance level.
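The same hypothesis can also be checked from Stata's stored results, which avoids retyping the coefficient and the robust standard error by hand. A minimal sketch, assuming the regression above is still the active estimate (`test` then reports the equivalent F = t^2 statistic):

* t-statistic for H0: b2 = 1/3, built from the stored coefficient and robust SE
display (_b[x2] - 1/3) / _se[x2]
* equivalent built-in test; Stata reports F(1, 47) = t^2 and its p-value
test x2 = 1/3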

Exercise 3

. test x1 = x2 = 2/3

 ( 1)  x1 - x2 = 0
 ( 2)  x1 = .6666667

       F(  2,    47) =    8.78
            Prob > F =    0.0006

Because q = 2, the 5% critical value of the F statistic is approximately 3; since 8.78 > 3, we reject the null hypothesis. I have used the F distribution, and the assumptions are the following:
- The conditional mean of u given the regressors is zero: E(ui | X1i, X2i, ..., Xki) = 0.
- (X1i, X2i, ..., Xki, Yi) are iid (independently and identically distributed).
- X1, X2, ..., Xk and u have finite fourth moments.
- There is no perfect multicollinearity.
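As a brief sketch (not part of the original log), the same joint test can be written with one restriction per equation, and the 5% critical values used in the comparison above can be looked up directly; the "approximately 3" comes from the large-sample chi-squared(2)/2 value:

* equivalent two-restriction form of the joint test above
test (x1 = 2/3) (x2 = 2/3)
* 5% critical values: exact F(2, 47) and the large-sample chi2(2)/2 value (about 3)
display invFtail(2, 47, 0.05)
display invchi2tail(2, 0.05)/2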

Exercise 4

. test x1 - 2*x2 = 0

 ( 1)  x1 - 2*x2 = 0

       F(  1,    47) =    1.57
            Prob > F =    0.2167

Because q = 1, the 5% critical value is about 3.84; since 1.57 < 3.84, we do not reject the null hypothesis.

. test x1 + x2 = 1

 ( 1)  x1 + x2 = 1

       F(  1,    47) =    1.95
            Prob > F =    0.1695

Again, because q = 1, 1.95 < 3.84, hence we do not reject the null hypothesis at the 5% significance level.
- We can also compute these results manually; a sketch is given below.

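A minimal sketch of the manual computation for the first restriction, H0: b1 - 2*b2 = 0, assuming the estimates from `reg y x1 x2, r` are still in memory; the names V, num and varnum are only illustrative. The variance of the linear combination is built from the robust covariance matrix e(V), and the F reported by `test` equals t^2 for a single restriction (`lincom x1 - 2*x2` would show the same linear combination with its standard error):

* by-hand t and F statistics for H0: b1 - 2*b2 = 0
matrix V = e(V)
scalar num = _b[x1] - 2*_b[x2]
* Var(b1) + 4*Var(b2) - 4*Cov(b1,b2), taken from the robust covariance matrix
scalar varnum = V[1,1] + 4*V[2,2] - 4*V[1,2]
display "t = " num/sqrt(varnum) "   F = t^2 = " num^2/varnum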

Exercise 5
. reg y x1 x2 x3, r
note: x3 omitted because of collinearity

Linear regression                               Number of obs =      50
                                                F(  2,    47) =   37.77
                                                Prob > F      =  0.0000
                                                R-squared     =  0.6080
                                                Root MSE      =  .99448

             |               Robust
           y |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
          x1 |   1.005005   .2340611     4.29   0.000     .5341351    1.475876
          x2 |   .2286828   .1176094     1.94   0.058    -.0079168    .4652824
          x3 |  (omitted)
       _cons |   1.277729   .1402353     9.11   0.000     .9956125    1.559846
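A quick check one could run (not part of the original log) to confirm that x3 is an exact linear combination of x1 and x2, which is why Stata drops it; the variable name check is only illustrative:

* x3 was generated as .5*x1 + .7*x2, so this difference is identically zero
gen check = x3 - (.5*x1 + .7*x2)
summarize check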

- As we can see, the coefficient on x3 is omitted because of perfect multicollinearity: x3 was generated as an exact linear combination of x1 and x2 (x3 = 0.5*x1 + 0.7*x2), so its coefficient cannot be estimated.

Exercise 6

1st case) We must test H0: b1 = 0 against H1: b1 ≠ 0.
. use "C:\Users\Roger\Documents\POMPEU - ADE\2n\Intro to Econometrics\STAR1.dta", clear . reg sck boy, r Linear regression Number of obs F( 1, 6323) Prob > F R-squared Root MSE Robust Std. Err. .0115346 .0082665 = = = = = 6325 0.00 0.9687 0.0000 .4585

sck boy _cons

Coef. .0004528 .3001626

t 0.04 36.31

P>|t| 0.969 0.000

[95% Conf. Interval] -.0221589 .2839574 .0230645 .3163678

As we can see, the t value is 0.04; since 0.04 < 1.96, at the 5% significance level we cannot reject the null hypothesis: there is no evidence of a relationship between class size and being a boy.
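A short sketch (assuming the regression above is still the active estimate) of how the reported p-value and the 5% critical value behind this comparison can be recovered:

* two-sided p-value for H0: b_boy = 0 and the 5% critical value of the t distribution
display 2*ttail(e(df_r), abs(_b[boy]/_se[boy]))
display invttail(e(df_r), 0.025)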

2nd case) We must test H0: b1 = 0 against H1: b1 ≠ 0.


. reg sck freelunk, r

Linear regression                               Number of obs =    6301
                                                F(  1,  6299) =    1.96
                                                Prob > F      =  0.1619
                                                R-squared     =  0.0003
                                                Root MSE      =  .45838

             |               Robust
         sck |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
    freelunk |  -.0161551   .0115492    -1.40   0.162    -.0387954    .0064853
       _cons |   .3080948   .0081014    38.03   0.000     .2922133    .3239763

Therefore, at the 5% significance level we do not reject the null hypothesis (|t| = 1.40 < 1.96), even though the relationship appears somewhat stronger than in the previous case.
Exercise 7

- The procedure might lead to bad inference: if the allocation is not random, the regressors can be correlated with each other and with class size, and if we have assumed that they are not, we will draw incorrect inferences.

Joint test: H0: b_boy = b_freelunk = 0 against H1: at least one of them is different from 0.

. reg sck boy freelunk, r

Linear regression                               Number of obs =    6301
                                                F(  2,  6298) =    0.98
                                                Prob > F      =  0.3760
                                                R-squared     =  0.0003
                                                Root MSE      =  .45841

             |               Robust
         sck |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         boy |   .0000771   .0115558      0.01   0.995   -.0225763    .0227305
    freelunk |  -.0161542   .0115515     -1.40   0.162   -.0387991    .0064907
       _cons |   .3080547   .0100992     30.50   0.000    .2882568    .3278526

. test boy = freelunk = 0

 ( 1)  boy - freelunk = 0
 ( 2)  boy = 0

       F(  2,  6298) =    0.98
            Prob > F =    0.3760

. vce

Covariance matrix of coefficients of regress model

        e(V) |        boy    freelunk       _cons
-------------+------------------------------------
         boy |  .00013354
    freelunk |  2.157e-06   .00013344
       _cons | -.00006967  -.00006677   .00010199

Therefore we cannot reject the null hypothesis at the 5% level with q = 2 (two restrictions), because 0.98 < 3: there is no evidence of a relationship between either variable and class size.
- By hand (a sketch is given below).

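A minimal sketch of the by-hand version of this joint test, assuming the estimates from `reg sck boy freelunk, r` are still in memory; the matrix names b, V, Vq and W are only illustrative. The Wald statistic is built from the coefficient subvector (boy, freelunk) and the corresponding block of e(V), and dividing by q = 2 gives the F form reported above:

* Wald statistic W = b' * inv(Vq) * b for the subvector (boy, freelunk); F = W/q
matrix b  = (_b[boy] \ _b[freelunk])
matrix V  = e(V)
matrix Vq = V[1..2, 1..2]
matrix W  = b' * invsym(Vq) * b
display "F(2, " e(df_r) ") = " W[1,1]/2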
