Hypothesis Testing
H0: Homoskedasticity
H1: Heteroskedasticity
α=0.05 or 0.01
Region of Rejection
1) General Regression
2) Auxiliary Regression
Run Auxiliary Regression: regress the log of the squared residuals (Lresid2) on the logs of the explanatory variables (LFord,
LGM, LMS)
Result
The t-statistics are insignificant, as they are less than 2 in absolute value, so we may accept the Null Hypothesis of
Homoskedasticity.
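For reference, the Park-type auxiliary regression above can be reproduced outside EViews. The sketch below is a minimal Python/statsmodels version (the DataFrame prices and the column names sandp, microsoft, gm and ford are assumptions, not part of the original workfile):

# Minimal sketch of the Park-type auxiliary regression (tool and names are assumptions).
import numpy as np
import pandas as pd
import statsmodels.api as sm

def park_test(df: pd.DataFrame, dep: str, regressors: list[str]):
    # 1) General regression: dependent variable on the explanatory variables.
    X = sm.add_constant(df[regressors])
    resid = sm.OLS(df[dep], X).fit().resid

    # 2) Auxiliary regression: log of squared residuals on logs of the explanatory
    #    variables (assumes the regressors are strictly positive, e.g. price levels).
    lresid2 = np.log(resid ** 2)
    log_x = sm.add_constant(np.log(df[regressors]))
    aux = sm.OLS(lresid2, log_x).fit()

    # |t| < 2 on the slope terms -> no evidence of heteroskedasticity.
    return aux.tvalues, aux.pvalues

# Example (hypothetical): park_test(prices, "sandp", ["microsoft", "gm", "ford"])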
PARK LM TEST
Hypothesis Testing
H0: Homoskedasticity
H1: Heteroskedasticity
α=0.05 or 0.01
Region of Rejection
Auxiliary Regression
Run Auxiliary Regression: regress the log of the squared residuals (Lresid2) on the log of the explanatory variable (LGM)
Result
The t-statistics are insignificant, as they are less than 2 in absolute value, so we may accept the Null Hypothesis of
Homoskedasticity.
PARK LM TEST - Second Dataset (MACRO)
Hypothesis Testing
H0: Homoskedasticity
H1: Heteroskedasticity
α=0.05 or 0.01
Region of Rejection
1) General Regression
Interpretation
The explanatory variables are statistically significant, as their t-statistics are greater than 2. The
explanatory power of the model is 57%. The model's fit is good, as the F-statistic is greater than
3.5 and Prob(F-statistic) is zero.
2. Auxiliary Regression
Run Auxiliary Regression: regress the log of the squared residuals (Lresid2) on the logs of the
explanatory variables (LCPI, LConsumer Credit and LIndustrial Production)
Interpretation
The auxiliary regression results show that all explanatory variables except CPI are statistically significant,
indicating heteroskedasticity; CPI is statistically insignificant.
Glejser Test (Dataset-1)
Hypothesis Testing
H0: Homoskedasticity
H1: Heteroskedasticity
α=0.05 or 0.01
LM =N.R2
LM=Lagrange Multiplier
N= No. of Observations
R2= Coefficient of determination
Values of R2 are obtained from the different auxiliary regressions, according to the test applied
Or
F = [R2/(1-R2)] * [(N-K)/(K-1)]
Region of Rejection
LM (Lagrange Multiplier) > Chi-Square critical value with K degrees of freedom, X2(K)
Or
F > F(α, K-1, N-K)
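These two decision rules are easy to reproduce outside Excel/EViews. The sketch below is a small Python version (an assumption about tooling), written in terms of p, the number of slope regressors in the auxiliary regression, so that K = p + 1 in the F formula above:

# Sketch: LM and F statistics from an auxiliary regression's R2, with critical values.
from scipy.stats import chi2, f

def lm_and_f(r2: float, n: int, p: int, alpha: float = 0.05):
    lm = n * r2                                      # LM = N * R2
    f_stat = (r2 / (1 - r2)) * ((n - p - 1) / p)     # F = [R2/(1-R2)] * [(N-K)/(K-1)]
    lm_crit = chi2.isf(alpha, df=p)                  # Excel: CHISQ.INV.RT(alpha, p)
    f_crit = f.isf(alpha, dfn=p, dfd=n - p - 1)      # F(alpha, K-1, N-K)
    return lm, lm_crit, f_stat, f_crit

# Example with the dataset-1 Breusch-Pagan figures used later (N=64, p=3, R2=0.100):
# lm_and_f(0.100, 64, 3)  ->  LM ~ 6.4 vs 7.81, F ~ 2.22 vs ~ 2.76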
1. General Regression
Auxiliary Regressions
In the following auxiliary regressions for the Glejser test, we regress the absolute value of the residuals on the
explanatory variables. Here we run three auxiliary regressions with the same absolute value of the error but
different forms (absolute value, square root and reciprocal) of the explanatory variables to
check for heteroskedasticity.
Auxiliary Regression (1) - With Absolute Values of Dependent and Independent Variables
|µ| = α0 + α1·MS + α2·GM + α3·Ford
CHSQ(0.05,3)=CHISQ.INV.RT(0.05,3)=7.81
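As a cross-check, the three Glejser auxiliary regressions and their LM statistics can be sketched in Python/statsmodels as below (the DataFrame and column names are assumptions; each LM would then be compared with the 7.81 critical value above):

# Sketch of the three Glejser auxiliary regressions: |residual| on the regressors in
# absolute-value, square-root and reciprocal form (tool and names are assumptions).
import numpy as np
import pandas as pd
import statsmodels.api as sm

def glejser_lm(df: pd.DataFrame, dep: str, regressors: list[str]):
    X = sm.add_constant(df[regressors])
    abs_resid = np.abs(sm.OLS(df[dep], X).fit().resid)
    forms = {"absolute": lambda z: np.abs(z),
             "square root": lambda z: np.sqrt(np.abs(z)),
             "reciprocal": lambda z: 1.0 / z}
    lm_values = {}
    for name, transform in forms.items():
        aux = sm.OLS(abs_resid, sm.add_constant(transform(df[regressors]))).fit()
        lm_values[name] = len(df) * aux.rsquared   # LM = N * R2 for this form
    return lm_values

# Example (hypothetical): glejser_lm(prices, "sandp", ["microsoft", "gm", "ford"])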
Result Interpretation
As the condition LM > Chi-Square is not met in any of the three cases, we have strong evidence not to
reject H0 and conclude that the errors are not heteroskedastic.
In other words, as LM < the Chi-Square critical value, we cannot reject H0 and conclude that the errors are
not heteroskedastic.
Glejser Test (Dataset-2)
General Regression
Result Interpretation
As the condition LM > Chi-Square is met in all three cases, we have strong evidence to reject H0
and conclude that the errors are heteroskedastic.
BREUSCH-PAGAN TEST
Step 1
Run a regression, obtain the residuals, and then square them. We therefore ran the following
regression: SANDP on Microsoft, GM and Ford, and obtained the residuals and squared residuals.
Step-2
Run the auxiliary regression of the squared residuals on the explanatory variables and obtain R2.
Results
a) LM TEST
LM = N.R2
LM=64*0.100 =6.4
CHSQ(0.05,3)=CHISQ.INV.RT(0.05,3)=7.81
As LM < the Chi-Square critical value, we cannot reject H0 and conclude that the errors are not
heteroskedastic.
b) F-stats
The F-value, 2.23, is also less than 3.5, so we do not reject H0 and conclude that the errors are not
heteroskedastic.
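For reference, statsmodels ships a ready-made version of this test; the sketch below would reproduce the same steps (the DataFrame and column names are assumptions, and the example call is hypothetical):

# Sketch: Breusch-Pagan test via statsmodels (tool and names are assumptions).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

def breusch_pagan(df: pd.DataFrame, dep: str, regressors: list[str]):
    X = sm.add_constant(df[regressors])
    fit = sm.OLS(df[dep], X).fit()
    # Auxiliary regression of the squared residuals on X; returns
    # (LM statistic, LM p-value, F statistic, F p-value).
    return het_breuschpagan(fit.resid, X)

# Example (hypothetical): breusch_pagan(prices, "sandp", ["microsoft", "gm", "ford"])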
BREUSCH-PAGAN TEST - Dataset-2
Step 1
Run a regression, obtain the residuals, and then square them. We therefore ran the following regression:
Microsoft on CPI, Industrial Production and Consumer Credit, and obtained the residuals and
squared residuals.
Step-2
Results
a) LM TEST
LM = N.R2
LM=254*0.091 =23.11
CHSQ(0.05,3)=CHISQ.INV.RT(0.05,3)=7.81
Result Interpretation
As LM is greater than the Chi-Square critical value, we have strong evidence to reject H0 and conclude
that the errors are heteroskedastic.
b) F-stats
The F-value, 8.42, is also greater than 3.5, so we reject H0 and conclude that the errors are
heteroskedastic.
White Test-(Dataset-1)
Step 1
Run a regression, obtain the residuals, and then square them. We therefore ran the following
regression: SANDP on Microsoft, GM and Ford, and obtained the residuals and squared residuals.
Run an Auxiliary Regression, which is different from the previous tests: regress the squared residuals on
MS, Ford, GM, MS2, Ford2, GM2, (MS*Ford), (MS*GM), (GM*Ford)
Results
a) LM TEST
LM = N.R2
LM=64*0.23 =15
CHSQ(0.05,9)=CHISQ.INV.RT(0.05,9)=16.9
As LM < the Chi-Square critical value, we cannot reject H0 and conclude that the errors are not
heteroskedastic.
b) F-stats
The F-value, 1.85, is also less than 3.5, so we do not reject H0 and conclude that the errors are not
heteroskedastic.
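For reference, statsmodels also provides the White test directly; a minimal sketch (the tool, DataFrame and column names are assumptions):

# Sketch: White test via statsmodels (tool and names are assumptions).
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

def white_test(df: pd.DataFrame, dep: str, regressors: list[str]):
    X = sm.add_constant(df[regressors])
    fit = sm.OLS(df[dep], X).fit()
    # Auxiliary regression of the squared residuals on the regressors, their squares
    # and cross products; returns (LM statistic, LM p-value, F statistic, F p-value).
    return het_white(fit.resid, X)

# Example (hypothetical): white_test(prices, "sandp", ["microsoft", "gm", "ford"])
# With 3 regressors the auxiliary regression has 9 terms, so compare LM with CHISQ.INV.RT(0.05, 9) = 16.9.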
White Test-(Dataset-2)
Step-1
Run a regression, obtain the residuals, and then square them. We therefore ran the following regression:
Microsoft on CPI, Industrial Production and Consumer Credit, and obtained the residuals and
squared residuals.
Run an Auxiliary Regression, which is different from the previous tests: regress the squared residuals on
CPI, CC, IP, CPI2, CC2, IP2, (CPI*CC), (CPI*IP), (CC*IP) and obtain R2
Results
a) LM TEST
LM = N.R2
LM=254*0.14 =35.56
CHSQ(0.05,9)=CHISQ.INV.RT(0.05,9)=16.9
As LM is greater than the Chi-Square critical value, we have strong evidence to reject H0 and conclude
that the errors are heteroskedastic.
b) F-Stats
As the F-statistic is also greater than 3.5, we have strong evidence to reject H0 and conclude that
the errors are heteroskedastic.
Dependent Variable: MS_CPI
Method: Least Squares
Sample: 1986M03 2007M04
Included observations: 254
' Deflate each series by CPI
genr ms_cpi=microsoft/cpi
genr cc_cpi=consumer_credit/cpi
genr ip_cpi=industrial_production/cpi
genr resid_cpi=resid/cpi
Dependent Variable: MS_CPI
Method: Least Squares
Sample: 1986M03 2007M04
Included observations: 254
' CPI-deflated regressors, reciprocal of CPI, and squared residuals
genr ip_cpi=industrial_production/cpi
genr reci_cpi=1/cpi
genr resid_cpi=resid/cpi
genr resid2=resid*resid
' Squares and cross products of the transformed regressors (White-test terms)
genr recip_cpi2=(1/cpi)*(1/cpi)
genr cc_cpi2=cc_cpi*cc_cpi
genr ip_cpi2=ip_cpi*ip_cpi
genr reci_cpi_cc_cpi=reci_cpi*cc_cpi
genr reci_cpi_ip_cp=reci_cpi*ip_cpi
genr cc_cpi_ip_cpi=cc_cpi*ip_cpi
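The same transformations can also be written outside EViews; a pandas sketch for reference (the DataFrame macro, its column names and the resid series are assumptions):

# Sketch: CPI-deflation and the squared/cross-product terms in pandas (names are assumptions).
import pandas as pd

def deflate_by_cpi(macro: pd.DataFrame, resid: pd.Series) -> pd.DataFrame:
    out = pd.DataFrame(index=macro.index)
    # CPI-deflated series and squared residuals
    out["ms_cpi"] = macro["microsoft"] / macro["cpi"]
    out["cc_cpi"] = macro["consumer_credit"] / macro["cpi"]
    out["ip_cpi"] = macro["industrial_production"] / macro["cpi"]
    out["reci_cpi"] = 1 / macro["cpi"]
    out["resid_cpi"] = resid / macro["cpi"]
    out["resid2"] = resid ** 2
    # Squares and cross products of the transformed regressors
    out["recip_cpi2"] = out["reci_cpi"] ** 2
    out["cc_cpi2"] = out["cc_cpi"] ** 2
    out["ip_cpi2"] = out["ip_cpi"] ** 2
    out["reci_cpi_cc_cpi"] = out["reci_cpi"] * out["cc_cpi"]
    out["reci_cpi_ip_cp"] = out["reci_cpi"] * out["ip_cpi"]
    out["cc_cpi_ip_cpi"] = out["cc_cpi"] * out["ip_cpi"]
    return out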