SIMPLE LINEAR REGRESSION

1. Prediction Equation

   ŷᵢ = β̂₀ + β̂₁xᵢ

2. Sample Slope

   β̂₁ = SSxy / SSxx = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²
      = (Σxᵢyᵢ − n·x̄·ȳ) / (Σxᵢ² − n·x̄²) = cov(x, y) / var(x)

   where SSxx = Σx² − (Σx)²/n = (n − 1)·Sx² and SSxy = Σxy − (Σx)(Σy)/n.

3. Sample Y-Intercept

   β̂₀ = ȳ − β̂₁x̄

4. Coefficient of Determination

   R² = SSR/SST = 1 − SSE/SST

5. Standard Error of Estimate

   Sₑ = √[ Σ(Yᵢ − Ŷᵢ)² / (n − k − 1) ]

6. Standard Errors of β̂₀ and β̂₁

   S(β̂₀) = Sₑ √[ 1/n + x̄² / ((n − 1)Sx²) ] = Sₑ √[ Σx² / (n·SSxx) ]

   S(β̂₁) = Sₑ / √SSxx = (Sₑ / √(n − 1)) · (1/Sx)

7. Test Statistic for β₁

   t(n−2) = (Estimate − Parameter) / (Est. std. error of Estimate)
          = (β̂₁ − β₁) / Sₑ(β̂₁)

8. Confidence Intervals for β₀ and β₁

   β̂₀ ± t(α/2, n−2) · Sₑ(β̂₀)
   β̂₁ ± t(α/2, n−2) · Sₑ(β̂₁)

9. Confidence Interval for the Mean Value of Y Given x

   A (1 − α)100% confidence interval for E(Y|X):

   Ŷᵢ ± t(α/2, n−2) · Sₑ · √[ 1/n + (Xᵢ − X̄)² / SSx ]

   Here Ŷᵢ is the estimate of E(Y|X).

10. Prediction Interval for a Randomly Chosen Value of Y Given x

   A (1 − α)100% prediction interval for Y is:

   Ŷᵢ ± t(α/2, n−2) · Sₑ · √[ 1 + 1/n + (Xᵢ − X̄)² / SSx ]

   where Xᵢ is the observed value of the independent variable, Ŷᵢ is the
   estimate of Y, n is the sample size, and Sₑ is the standard error.

Coefficient of Correlation (for simple regression)

   r = ±√R² = SSxy / √(SSxx · SSyy)

11. Adjusted R²

   R²_A = 1 − [SSE/(n − k − 1)] / [SST/(n − 1)] = 1 − (1 − R²)·(n − 1)/(n − (k + 1))

   R²_A = adjusted coefficient of determination; R² = unadjusted coefficient of
   determination; n = number of observations; k = number of explanatory variables.

12. Variance Inflation Factor

   VIF(Xⱼ) = 1 / (1 − Rⱼ²)

   Rⱼ² is the coefficient of determination for the regression of Xⱼ as dependent
   variable on all the other Xᵢ's as independent variables. If VIF > 4,
   multicollinearity is suspected.

13. Tolerance Factor

   1 − Rⱼ² = 1/VIF

14. Beta Weights (Standardized Beta)

   Betaᵢ = β̂ᵢ · (Sx / Sy)

   Sx = std. dev. of X; Sy = std. dev. of Y

Forward Regression: enter a variable if F_in > 3.84 (P_in < 0.05)
Backward Regression: remove a variable if F_out < 2.71 (P_out > 0.10)
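Items 1-8 can be sanity-checked numerically. Below is a minimal pure-Python sketch on made-up toy data (every number is illustrative, not from the text); the hard-coded critical value t(0.025, 4) = 2.776 is taken from a standard t-table.

```python
# Simple linear regression quantities computed directly from the formulas
# in items 1-8. Toy data chosen to lie near the line y ≈ 2x (illustrative).
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
ss_xx = sum(xi**2 for xi in x) - n * xbar**2                   # SSxx = Σx² − n·x̄²
ss_yy = sum(yi**2 for yi in y) - n * ybar**2                   # SSyy
ss_xy = sum(xi * yi for xi, yi in zip(x, y)) - n * xbar * ybar # SSxy

b1 = ss_xy / ss_xx          # 2. sample slope
b0 = ybar - b1 * xbar       # 3. sample Y-intercept

y_hat = [b0 + b1 * xi for xi in x]                 # 1. prediction equation
sse = sum((yi - yh)**2 for yi, yh in zip(y, y_hat))
sst = ss_yy
r2 = 1 - sse / sst          # 4. coefficient of determination

k = 1                                  # one explanatory variable
se = (sse / (n - k - 1)) ** 0.5        # 5. standard error of estimate
se_b1 = se / ss_xx ** 0.5              # 6. standard error of the slope
t_stat = b1 / se_b1                    # 7. test statistic for H0: β1 = 0

t_crit = 2.776                         # t(0.025, n−2) with n−2 = 4, from tables
ci_low, ci_high = b1 - t_crit * se_b1, b1 + t_crit * se_b1  # 8. CI for β1

print(round(b1, 4), round(b0, 4), round(r2, 4), round(t_stat, 2))
print((round(ci_low, 4), round(ci_high, 4)))
```

The slope comes out near 2 and R² close to 1, as expected for data built around y ≈ 2x, and the 95% confidence interval for β₁ covers 2.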
ANOVA TABLE

Source of     Sum of     Degrees of     Mean                    F Statistic
Variation     Squares    Freedom        Square
Regression    SSR        k              MSR = SSR/k             F(k, n−k−1) = MSR/MSE
Error         SSE        n − (k+1)      MSE = SSE/(n − (k+1))
Total         SST        n − 1
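The decomposition behind the table can be verified numerically. A short pure-Python sketch for the simple-regression case (k = 1), on made-up data, fills in each row and checks the identity SST = SSR + SSE:

```python
# Filling in the ANOVA table for a simple regression (k = 1) on toy data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.8, 4.1, 5.9, 8.2, 9.8]
n, k = len(x), 1

xbar, ybar = sum(x) / n, sum(y) / n
b1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
      / sum((xi - xbar)**2 for xi in x))
b0 = ybar - b1 * xbar
y_hat = [b0 + b1 * xi for xi in x]

sst = sum((yi - ybar)**2 for yi in y)                    # total sum of squares
ssr = sum((yh - ybar)**2 for yh in y_hat)                # regression sum of squares
sse = sum((yi - yh)**2 for yi, yh in zip(y, y_hat))      # error sum of squares

msr = ssr / k
mse = sse / (n - (k + 1))
f_stat = msr / mse                                       # F(k, n−k−1)

print(f"{'Source':<12}{'SS':>10}{'df':>5}{'MS':>10}")
print(f"{'Regression':<12}{ssr:>10.4f}{k:>5}{msr:>10.4f}  F = {f_stat:.2f}")
print(f"{'Error':<12}{sse:>10.4f}{n - (k + 1):>5}{mse:>10.4f}")
print(f"{'Total':<12}{sst:>10.4f}{n - 1:>5}")
```

The printed table reproduces the layout above, and SSR + SSE matches SST to floating-point precision.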
15. Partial F Test

   F(r, n−(k+1)) = [(SSE_R − SSE_F)/r] / MSE_F
                 = [(R²_F − R²_R)/r] / [(1 − R²_F)/(n − k − 1)]

   SSE_R = sum of squared errors of the reduced model; SSE_F = sum of squared
   errors of the full model; r = number of variables dropped from the full
   model (or added to the reduced model).

16. F-Test (Overall Significance of the Model)

   F(k, n−(k+1)) = MSR/MSE = (SSR/k) / (SSE/(n − (k + 1)))
                 = (R²/k) / [(1 − R²)/(n − (k + 1))]

MULTIPLE LINEAR REGRESSION

17. Prediction Interval

   A (1 − α)100% PI (prediction interval) for the value of a randomly chosen Y,
   given values of the Xᵢ:

   ŷ ± t(α/2, n−(k+1)) · √[ s²(ŷ) + MSE ]

   A (1 − α)100% CI (confidence interval) for the conditional mean of Y, given
   values of the Xᵢ:

   ŷ ± t(α/2, n−(k+1)) · S[Ê(Y)]

18. Partial Correlation

   The correlation between y and x₁ when the influence of x₂ is removed from
   both y and x₁:

   pr_{y1.2} = (r_{y1} − r_{y2}·r_{12}) / [ √(1 − r²_{y2}) · √(1 − r²_{12}) ]

19. Semi-Partial Correlation (Part Correlation)

   The correlation between y and x₁ when the influence of x₂ is removed from
   x₁ (but not from y):

   sr_{y1.2} = (r_{y1} − r_{y2}·r_{12}) / √(1 − r²_{12})

   The square of the part correlation of an explanatory variable equals the
   unique contribution of that variable to R². When this variable is added:

   sr² = change in R² = R²_new − R²_old
   pr² = (change in R²) / (1 − R²_old) = sr² / (1 − R²_old)

20. Omitted Variable Bias

   Actual relationship:  Y = β₀ + β₁X₁ + β₂X₂
   Fitted model:         Y = α₀ + α₁X₁

   Then α₁ = β₁ + β₂ · Cov(X₁, X₂)/Var(X₁).
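Items 18-19 can be checked from the three pairwise correlations alone. The sketch below uses made-up correlation values (not from the text) and verifies both that sr² equals the change in R² when x₁ joins a model already containing x₂, and that pr² = sr²/(1 − R²_old); the two-predictor R² is computed from the standard formula for two explanatory variables.

```python
# Partial and semi-partial (part) correlation from pairwise correlations.
# The three correlation values are made up for illustration.
r_y1, r_y2, r_12 = 0.70, 0.50, 0.40

num = r_y1 - r_y2 * r_12
pr = num / ((1 - r_y2**2) ** 0.5 * (1 - r_12**2) ** 0.5)  # 18. partial
sr = num / (1 - r_12**2) ** 0.5                           # 19. semi-partial

# R² of the two-predictor model, from the standard two-variable formula
r2_full = (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)
r2_old = r_y2**2          # R² of the model containing x2 only

print(round(pr, 4), round(sr, 4))
print(round(sr**2, 4), round(r2_full - r2_old, 4))  # equal: unique share of x1
```

Running it shows sr² and R²_new − R²_old agreeing exactly, confirming the "unique contribution" reading of the part correlation.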
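The omitted-variable-bias formula in item 20 can be demonstrated numerically: build Y exactly as β₀ + β₁X₁ + β₂X₂ on made-up data with correlated X₁ and X₂, regress Y on X₁ alone, and compare the short-regression slope with β₁ + β₂·Cov(X₁, X₂)/Var(X₁).

```python
# Numerical check of omitted variable bias (item 20). All numbers are made up.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [1.5, 1.0, 2.5, 3.5, 3.0, 4.5]       # deliberately correlated with x1
b0_true, b1_true, b2_true = 1.0, 2.0, 3.0
y = [b0_true + b1_true * a + b2_true * b for a, b in zip(x1, x2)]

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    """Sample covariance; cov(u, u) is the sample variance."""
    ub, vb = mean(u), mean(v)
    return sum((a - ub) * (b - vb) for a, b in zip(u, v)) / (len(u) - 1)

alpha1 = cov(x1, y) / cov(x1, x1)                           # slope of Y on X1 alone
predicted = b1_true + b2_true * cov(x1, x2) / cov(x1, x1)   # formula of item 20
print(round(alpha1, 6), round(predicted, 6))                # the two coincide
```

Because Y is an exact linear function of X₁ and X₂ here, the short-regression slope matches the formula to floating-point precision; with noise added, the match holds in expectation.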