
Chapter 9

Regression with Time Series Data:


Stationary Variables

Walter R. Paczkowski
Rutgers University
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 1
Stationary Variables
Chapter Contents

 9.1 Introduction
 9.2 Finite Distributed Lags
 9.3 Serial Correlation
 9.4 Other Tests for Serially Correlated Errors
 9.5 Estimation with Serially Correlated Errors
 9.6 Autoregressive Distributed Lag Models
 9.7 Forecasting
 9.8 Multiplier Analysis

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 2
Stationary Variables
9.1
Introduction

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 3
Stationary Variables
9.1
Introduction

When modeling relationships between variables,


the nature of the data that have been collected has
an important bearing on the appropriate choice of
an econometric model
– Two features of time-series data to consider:
1. Time-series observations on a given
economic unit, observed over a number of
time periods, are likely to be correlated
2. Time-series data have a natural ordering
according to time

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 4
Stationary Variables
9.1
Introduction

There is also the possible existence of dynamic


relationships between variables
– A dynamic relationship is one in which the
change in a variable now has an impact on that
same variable, or other variables, in one or
more future time periods
– These effects do not occur instantaneously but
are spread, or distributed, over future time
periods

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 5
Stationary Variables
9.1
Introduction FIGURE 9.1 The distributed lag effect

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 6
Stationary Variables
9.1
Introduction

9.1.1
Dynamic Nature of
Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a
function of current and past values of an
explanatory variable x
Eq. 9.1: y_t = f(x_t, x_{t-1}, x_{t-2}, ...)

• Because of the existence of these lagged


effects, Eq. 9.1 is called a distributed lag
model

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 7
Stationary Variables
9.1
Introduction

9.1.1
Dynamic Nature of
Relationships
Ways to model the dynamic relationship (Continued):
2. Capturing the dynamic characteristics of time-series
by specifying a model with a lagged dependent
variable as one of the explanatory variables

Eq. 9.2: y_t = f(y_{t-1}, x_t)
• Or have:

Eq. 9.3: y_t = f(y_{t-1}, x_t, x_{t-1}, x_{t-2})
– Such models are called autoregressive
distributed lag (ARDL) models, with
‘‘autoregressive’’ meaning a regression of yt on
its own lag or lags
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 8
Stationary Variables
9.1
Introduction

9.1.1
Dynamic Nature of
Relationships
Ways to model the dynamic relationship (Continued):
3. Model the continuing impact of change over
several periods via the error term

Eq. 9.4: y_t = f(x_t) + e_t,  with  e_t = f(e_{t-1})

• In this case et is correlated with et - 1


• We say the errors are serially correlated or
autocorrelated

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 9
Stationary Variables
9.1
Introduction

9.1.2
Least Squares
Assumptions
The primary assumption is Assumption MR4:

cov  yi , y j   cov  ei , e j   0 for i  j


• For time series, this is written as:

cov  yt , ys   cov  et , es   0 for t  s

– The dynamic models in Eqs. 9.2, 9.3 and 9.4


imply correlation between yt and yt - 1 or et and et
- 1 or both, so they clearly violate assumption
MR4
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 10
Stationary Variables
9.1
Introduction

9.1.2a
Stationarity

 A stationary variable is one that is neither explosive, nor trending, nor wandering aimlessly without returning to its mean

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 11
Stationary Variables
9.1
Introduction FIGURE 9.2 (a) Time series of a stationary variable

9.1.2a
Stationarity

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 12
Stationary Variables
9.1 FIGURE 9.2 (b) time series of a nonstationary variable that is ‘‘slow-turning’’
Introduction
or ‘‘wandering’’

9.1.2a
Stationarity

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 13
Stationary Variables
9.1
Introduction FIGURE 9.2 (c) time series of a nonstationary variable that ‘‘trends”

9.1.2a
Stationarity

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 14
Stationary Variables
9.1 FIGURE 9.3 (a) Alternative paths through the chapter starting with finite
Introduction
distributed lags

9.1.3
Alternative Paths
Through the
Chapter

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 15
Stationary Variables
9.1 FIGURE 9.3 (b) Alternative paths through the chapter starting with
Introduction
serial correlation

9.1.3
Alternative Paths
Through the
Chapter

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 16
Stationary Variables
9.2
Finite Distributed Lags

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 17
Stationary Variables
9.2
Finite Distributed
Lags

Consider a linear model in which, after q time


periods, changes in x no longer have an impact on
y
Eq. 9.5: y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + ... + β_q x_{t-q} + e_t

– Note the notation change: βs is used to denote


the coefficient of xt-s and α is introduced to
denote the intercept

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 18
Stationary Variables
9.2
Finite Distributed
Lags

Model 9.5 has two uses:


– Forecasting

Eq. 9.6: y_{T+1} = α + β_0 x_{T+1} + β_1 x_T + β_2 x_{T-1} + ... + β_q x_{T-q+1} + e_{T+1}

– Policy analysis
• What is the effect of a change in x on y?
Eq. 9.7: ∂E(y_t)/∂x_{t-s} = ∂E(y_{t+s})/∂x_t = β_s

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 19
Stationary Variables
9.2
Finite Distributed
Lags

Assume xt is increased by one unit and then


maintained at its new level in subsequent periods
– The immediate impact will be β0
– the total effect in period t + 1 will be β0 + β1, in
period t + 2 it will be β0 + β1 + β2, and so on
• These quantities are called interim
multipliers
– The total multiplier is the final effect on y of the sustained increase after q or more periods have elapsed: Σ_{s=0}^{q} β_s

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 20
Stationary Variables
9.2
Finite Distributed
Lags

The effect of a one-unit change in xt is distributed


over the current and next q periods, from which we
get the term ‘‘distributed lag model’’
– It is called a finite distributed lag model of
order q
• It is assumed that after a finite number of
periods q, changes in x no longer have an
impact on y
– The coefficient βs is called a distributed-lag
weight or an s-period delay multiplier
– The coefficient β0 (s = 0) is called the impact
multiplier
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 21
Stationary Variables
9.2
Finite Distributed
Lags
ASSUMPTIONS OF THE DISTRIBUTED LAG MODEL

9.2.1
Assumptions

TSMR1. y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + ... + β_q x_{t-q} + e_t,  t = q+1, ..., T
TSMR2. y and x are stationary random variables, and et is independent of
current, past and future values of x.
TSMR3. E(et) = 0
TSMR4. var(et) = σ2
TSMR5. cov(et, es) = 0 t ≠ s
TSMR6. et ~ N(0, σ2)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 22
Stationary Variables
9.2
Finite Distributed
Lags

9.2.2
An Example:
Okun’s Law
Consider Okun’s Law
– In this model the change in the unemployment rate
from one period to the next depends on the rate of
growth of output in the economy:

Eq. 9.8: U_t − U_{t-1} = −γ(G_t − G_N)

– We can rewrite this as:


Eq. 9.9: DU_t = α + β_0 G_t + e_t

where DU_t = ΔU_t = U_t − U_{t-1},  β_0 = −γ,  and  α = γ G_N
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 23
Stationary Variables
9.2
Finite Distributed
Lags

9.2.2
An Example:
Okun’s Law

We can expand this to include lags:

Eq. 9.10: DU_t = α + β_0 G_t + β_1 G_{t-1} + β_2 G_{t-2} + ... + β_q G_{t-q} + e_t

We can calculate the growth in output, G, as:

Eq. 9.11: G_t = 100 × (GDP_t − GDP_{t-1}) / GDP_{t-1}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 24
Stationary Variables
9.2
Finite Distributed FIGURE 9.4 (a) Time series for the change in the U.S. unemployment rate:
Lags 1985Q3 to 2009Q3

9.2.2
An Example:
Okun’s Law

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 25
Stationary Variables
9.2
Finite Distributed
Lags
FIGURE 9.4 (b) Time series for U.S. GDP growth: 1985Q2 to 2009Q3

9.2.2
An Example:
Okun’s Law

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 26
Stationary Variables
9.2
Finite Distributed
Lags
Table 9.1 Spreadsheet of Observations for Distributed Lag Model

9.2.2
An Example:
Okun’s Law

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 27
Stationary Variables
9.2
Finite Distributed
Lags
Table 9.2 Estimates for Okun’s Law Finite Distributed Lag Model

9.2.2
An Example:
Okun’s Law

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 28
Stationary Variables
9.3
Serial Correlation

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 29
Stationary Variables
9.3
Serial Correlation

When is assumption TSMR5, cov(et, es) = 0 for


t ≠ s likely to be violated, and how do we assess
its validity?
– When a variable exhibits correlation over time,
we say it is autocorrelated or serially
correlated
• These terms are used interchangeably

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 30
Stationary Variables
9.3
Serial Correlation FIGURE 9.5 Scatter diagram for Gt and Gt-1

9.3.1
Serial Correlation
in Output Growth

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 31
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

Recall that the population correlation between two


variables x and y is given by:
cov  x, y 
ρ xy 
var  x  var  y 

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 32
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation
For the Okun’s Law problem, we have:

Eq. 9.12: ρ_1 = cov(G_t, G_{t-1}) / √(var(G_t) var(G_{t-1})) = cov(G_t, G_{t-1}) / var(G_t)

The notation ρ1 is used to denote the population


correlation between observations that are one period apart
in time
– This is known also as the population autocorrelation
of order one.
– The second equality in Eq. 9.12 holds because
var(Gt) = var(Gt-1) , a property of time series that are
stationary
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 33
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

The first-order sample autocorrelation for G is


obtained from Eq. 9.12 using the estimates:
cov(G_t, G_{t-1}) estimated by (1/(T−1)) Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ)

var(G_t) estimated by (1/(T−1)) Σ_{t=1}^{T} (G_t − Ḡ)²

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 34
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

 Making the substitutions, we get:

Eq. 9.13: r_1 = Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ) / Σ_{t=1}^{T} (G_t − Ḡ)²

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 35
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

More generally, the k-th order sample


autocorrelation for a series y that gives the
correlation between observations that are k periods
apart is:

Eq. 9.14: r_k = Σ_{t=k+1}^{T} (y_t − ȳ)(y_{t-k} − ȳ) / Σ_{t=1}^{T} (y_t − ȳ)²

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 36
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

Because (T - k) observations are used to compute


the numerator and T observations are used to
compute the denominator, an alternative that leads
to larger estimates in finite samples is:
Eq. 9.15: r_k = [ (1/(T−k)) Σ_{t=k+1}^{T} (y_t − ȳ)(y_{t-k} − ȳ) ] / [ (1/T) Σ_{t=1}^{T} (y_t − ȳ)² ]

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 37
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

Applying this to our problem, we get for the first


four autocorrelations:

Eq. 9.16: r_1 = 0.494,  r_2 = 0.414,  r_3 = 0.154,  r_4 = 0.200

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 38
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation

How do we test whether an autocorrelation is


significantly different from zero?
– The null hypothesis is H0: ρk = 0
– A suitable test statistic is:
Eq. 9.17: Z = (r_k − 0) / √(1/T) = √T · r_k ~ N(0, 1)  (approximately)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 39
Stationary Variables
9.3
Serial Correlation

9.3.1a
Computing
Autocorrelation
For our problem, we have:

Z_1 = √98 × 0.494 = 4.89,   Z_2 = √98 × 0.414 = 4.10
Z_3 = √98 × 0.154 = 1.52,   Z_4 = √98 × 0.200 = 1.98
– We reject the hypotheses H0: ρ_1 = 0 and H0: ρ_2 = 0
– We have insufficient evidence to reject H0: ρ_3 = 0
– ρ_4 is on the borderline of being significant.
– We conclude that G, the quarterly growth rate in
U.S. GDP, exhibits significant serial correlation at
lags one and two
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 40
Stationary Variables
9.3
Serial Correlation

9.3.1b
The Correlogram

The correlogram, also called the sample


autocorrelation function, is the sequence of
autocorrelations r1, r2, r3, …
– It shows the correlation between observations
that are one period apart, two periods apart,
three periods apart, and so on

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 41
Stationary Variables
9.3
Serial Correlation FIGURE 9.6 Correlogram for G

9.3.1b
The Correlogram

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 42
Stationary Variables
9.3
Serial Correlation

9.3.2
Serially Correlated
Errors

The correlogram can also be used to check


whether the multiple regression assumption
cov(et, es) = 0 for t ≠ s is violated

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 43
Stationary Variables
9.3
Serial Correlation

9.3.2a
A Phillips Curve

Consider a model for a Phillips Curve:

Eq. 9.18: INF_t = INF_t^E − γ(U_t − U_{t-1})

– If we initially assume that inflationary expectations are constant over time (β_1 = INF_t^E), set β_2 = −γ, and add an error term:

Eq. 9.19: INF_t = β_1 + β_2 DU_t + e_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 44
Stationary Variables
9.3
Serial Correlation FIGURE 9.7 (a) Time series for Australian price inflation

9.3.2a
A Phillips Curve

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 45
Stationary Variables
9.3 FIGURE 9.7 (b) Time series for the quarterly change in the Australian
Serial Correlation
unemployment rate

9.3.2a
A Phillips Curve

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 46
Stationary Variables
9.3
Serial Correlation

9.3.2a
A Phillips Curve

To determine if the errors are serially correlated,


we compute the least squares residuals:
Eq. 9.20: ê_t = INF_t − b_1 − b_2 DU_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 47
Stationary Variables
9.3 FIGURE 9.8 Correlogram for residuals from least-squares estimated
Serial Correlation
Phillips curve

9.3.2a
A Phillips Curve

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 48
Stationary Variables
9.3
Serial Correlation

9.3.2a
A Phillips Curve
The k-th order autocorrelation for the residuals can be written as:

Eq. 9.21: r_k = Σ_{t=k+1}^{T} ê_t ê_{t-k} / Σ_{t=1}^{T} ê_t²
– The least squares equation is:

Eq. 9.22: INF̂ = 0.7776 − 0.5279 DU
          (se)  (0.0658)   (0.2294)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 49
Stationary Variables
9.3
Serial Correlation

9.3.2a
A Phillips Curve

The values at the first five lags are:

r_1 = 0.549,  r_2 = 0.456,  r_3 = 0.433,  r_4 = 0.420,  r_5 = 0.339

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 50
Stationary Variables
9.4
Other Tests for Serially Correlated
Errors

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 51
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test

An advantage of this test is that it readily


generalizes to a joint test of correlations at more
than one lag

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 52
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test

If et and et-1 are correlated, then one way to model


the relationship between them is to write:
Eq. 9.23: e_t = ρ e_{t-1} + v_t

– We can substitute this into a simple regression equation:

Eq. 9.24: y_t = β_1 + β_2 x_t + ρ e_{t-1} + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 53
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test

We have one complication: ê0 is unknown


– Two ways to handle this are:
1. Delete the first observation and use a total of T − 1 observations
2. Set ê_0 = 0 and use all T observations

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 54
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test

 For the Phillips Curve:

(i)  t = 6.219,  F = 38.67,  p-value = 0.000
(ii) t = 6.202,  F = 38.47,  p-value = 0.000
– The results are almost identical
– The null hypothesis H0: ρ = 0 is rejected at all
conventional significance levels
– We conclude that the errors are serially
correlated

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 55
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test
To derive the relevant auxiliary regression for the
autocorrelation LM test, we write the test equation
as:
Eq. 9.25: y_t = β_1 + β_2 x_t + ρ ê_{t-1} + v_t

– But since we know that y_t = b_1 + b_2 x_t + ê_t, we get:

b_1 + b_2 x_t + ê_t = β_1 + β_2 x_t + ρ ê_{t-1} + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 56
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test

Rearranging, we get:

Eq. 9.26: ê_t = (β_1 − b_1) + (β_2 − b_2) x_t + ρ ê_{t-1} + v_t
              = γ_1 + γ_2 x_t + ρ ê_{t-1} + v_t
– If H0: ρ = 0 is true, then LM = T x R2 has an
approximate χ2(1) distribution
• T and R2 are the sample size and goodness-
of-fit statistic, respectively, from least
squares estimation of Eq. 9.26
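A minimal sketch of the T × R² version of the LM test, assuming y and x are numpy arrays; pre-sample residuals are set to zero, which corresponds to option 2 above. This is an illustration of the auxiliary regression in Eq. 9.26, not the textbook’s own code.

import numpy as np

def lm_autocorr(y, x, lags=1):
    """LM = T * R^2 from regressing e_hat_t on [1, x_t, e_hat_{t-1}, ..., e_hat_{t-lags}]."""
    T = len(y)
    X = np.column_stack([np.ones(T), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    ehat = y - X @ b
    lagged = np.column_stack([np.r_[np.zeros(j), ehat[:-j]] for j in range(1, lags + 1)])
    Z = np.column_stack([X, lagged])
    g, *_ = np.linalg.lstsq(Z, ehat, rcond=None)
    u = ehat - Z @ g
    r2 = 1.0 - (u @ u) / np.sum((ehat - ehat.mean()) ** 2)
    return T * r2      # compare with the chi-square(lags) critical value (3.84 for one lag)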

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 57
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1
A Lagrange
Multiplier Test
 Considering the two alternative ways to handle ê_0:

(iii) LM = (T − 1) × R² = 89 × 0.3102 = 27.61
(iv)  LM = T × R² = 90 × 0.3066 = 27.59
– These values are much larger than 3.84, which
is the 5% critical value from a χ2(1)-distribution
• We reject the null hypothesis of no
autocorrelation
– Alternatively, we can reject H0 by examining
the p-value for LM = 27.61, which is 0.000

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 58
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.1a
Testing Correlation
at Longer Lags

 For a four-period lag, we obtain:

(iii) LM = (T − 4) × R² = 86 × 0.3882 = 33.4
(iv)  LM = T × R² = 90 × 0.4075 = 36.7
– Because the 5% critical value from a χ2(4)-
distribution is 9.49, these LM values lead us to
conclude that the errors are serially correlated

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 59
Stationary Variables
9.4
Other Tests for
Serially Correlated
Errors

9.4.2
The Durbin-
Watson Test

This is used less frequently today because its


critical values are not available in all software
packages, and one has to examine upper and lower
critical bounds instead
– Also, unlike the LM and correlogram tests, its
distribution no longer holds when the equation
contains a lagged dependent variable

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 60
Stationary Variables
9.5
Estimation with Serially Correlated
Errors

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 61
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

Three estimation procedures are considered:


1. Least squares estimation
2. An estimation procedure that is relevant when
the errors are assumed to follow what is
known as a first-order autoregressive model
e_t = ρ e_{t-1} + v_t
3. A general estimation strategy for estimating
models with serially correlated errors

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 62
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

We will encounter models with a lagged


dependent variable, such as:
y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 63
Stationary Variables
9.5
Estimation with
Serially Correlated ASSUMPTION FOR MODELS WITH A LAGGED DEPENDENT VARIABLE
Errors

TSMR2A. In the multiple regression model y_t = β_1 + β_2 x_{t2} + ... + β_K x_{tK} + v_t, where some of the x_{tk} may be lagged values of y, v_t is uncorrelated with all x_{tk} and their past values.

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 64
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation
Suppose we proceed with least squares estimation
without recognizing the existence of serially
correlated errors. What are the consequences?
1. The least squares estimator is still a linear
unbiased estimator, but it is no longer best
2. The formulas for the standard errors usually
computed for the least squares estimator are
no longer correct
• Confidence intervals and hypothesis tests
that use these standard errors may be
misleading
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 65
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation

It is possible to compute correct standard errors


for the least squares estimator:
– HAC (heteroskedasticity and autocorrelation
consistent) standard errors, or Newey-West
standard errors
• These are analogous to the heteroskedasticity
consistent standard errors

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 66
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation
Consider the model yt = β1 + β2xt + et
– The variance of b2 is:

var  b2    wt2 var  et     wt ws cov  et , es 


t ts

Eq. 9.27    wt ws cov  et , es  


  wt2 var  et  1  ts 
t

  t
wt var  et 
2 


where
wt   xt  x    x  x
2
t t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 67
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation
When the errors are not correlated, cov(et, es) = 0,
and the term in square brackets is equal to one.
– The resulting expression

var  b2    t wt2 var  et 

is the one used to find heteroskedasticity-


consistent (HC) standard errors
– When the errors are correlated, the term in
square brackets is estimated to obtain HAC
standard errors
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 68
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation

If we call the quantity in square brackets g and its


estimate ĝ , then the relationship between the two
estimated variances is:

Eq. 9.28: var_HAC(b_2) = var_HC(b_2) × ĝ   (estimated variances)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 69
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation

Let’s reconsider the Phillips Curve model:


Eq. 9.29: INF̂ = 0.7776 − 0.5279 DU
               (0.0658)  (0.2294)  (incorrect se)
               (0.1030)  (0.3127)  (HAC se)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 70
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.1
Least Squares
Estimation

The t and p-values for testing H0: β2 = 0 are:


t  0.5279 0.2294  2.301 p  0.0238  from LS standard errors 
t  0.5279 0.3127  1.688 p  0.0950  from HAC standard errors 

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 71
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2
Estimating an
AR(1) Error Model

Return to the Lagrange multiplier test for serially


correlated errors where we used the equation:
Eq. 9.30: e_t = ρ e_{t-1} + v_t

– Assume the vt are uncorrelated random errors


with zero mean and constant variances:

Eq. 9.31: E(v_t) = 0,  var(v_t) = σ_v²,  cov(v_t, v_s) = 0 for t ≠ s

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 72
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2
Estimating an
AR(1) Error Model
Eq. 9.30 describes a first-order autoregressive
model or a first-order autoregressive process for
et
– The term AR(1) model is used as an
abbreviation for first-order autoregressive
model
– It is called an autoregressive model because it
can be viewed as a regression model
– It is called first-order because the right-hand-
side variable is et lagged one period

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 73
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2a
Properties of an
AR(1) Error
We assume that:

Eq. 9.32: −1 < ρ < 1

The mean and variance of e_t are:

Eq. 9.33: E(e_t) = 0,  var(e_t) = σ_e² = σ_v² / (1 − ρ²)

The covariance term is:

Eq. 9.34: cov(e_t, e_{t-k}) = ρ^k σ_v² / (1 − ρ²),  k > 0

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 74
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2a
Properties of an
AR(1) Error
The correlation implied by the covariance is:

Eq. 9.35: ρ_k = corr(e_t, e_{t-k})
              = cov(e_t, e_{t-k}) / √(var(e_t) var(e_{t-k}))
              = cov(e_t, e_{t-k}) / var(e_t)
              = [ρ^k σ_v² / (1 − ρ²)] / [σ_v² / (1 − ρ²)]
              = ρ^k
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 75
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2a
Properties of an
AR(1) Error
Setting k = 1:

Eq. 9.36: ρ_1 = corr(e_t, e_{t-1}) = ρ

– ρ represents the correlation between two errors that are


one period apart
• It is the first-order autocorrelation for e,
sometimes simply called the autocorrelation
coefficient
• It is the population autocorrelation at lag one for a
time series that can be described by an AR(1) model
• r1 is an estimate for ρ when we assume a series is
AR(1)
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 76
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2a
Properties of an
AR(1) Error
Each et depends on all past values of the errors vt:

Eq. 9.37: e_t = v_t + ρ v_{t-1} + ρ² v_{t-2} + ρ³ v_{t-3} + ...

– For the Phillips Curve, we find for the first five lags:

r_1 = 0.549,  r_2 = 0.456,  r_3 = 0.433,  r_4 = 0.420,  r_5 = 0.339

– For an AR(1) model, we have:

ρ̂_1 = ρ̂ = r_1 = 0.549

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 77
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2a
Properties of an
AR(1) Error

For longer lags, we have:

ρ̂_2 = ρ̂² = (0.549)² = 0.301
ρ̂_3 = ρ̂³ = (0.549)³ = 0.165
ρ̂_4 = ρ̂⁴ = (0.549)⁴ = 0.091
ρ̂_5 = ρ̂⁵ = (0.549)⁵ = 0.050

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 78
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2b
Nonlinear Least
Squares Estimation

Our model with an AR(1) error is:

Eq. 9.38: y_t = β_1 + β_2 x_t + e_t   with   e_t = ρ e_{t-1} + v_t

with -1 < ρ < 1


– For the vt, we have:

Eq. 9.39: E(v_t) = 0,  var(v_t) = σ_v²,  cov(v_t, v_s) = 0 for t ≠ s

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 79
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2b
Nonlinear Least
Squares Estimation
With the appropriate substitutions, we get:

Eq. 9.40: y_t = β_1 + β_2 x_t + ρ e_{t-1} + v_t

– For the previous period, the error is:

Eq. 9.41: e_{t-1} = y_{t-1} − β_1 − β_2 x_{t-1}

– Multiplying by ρ:

Eq. 9.42: ρ e_{t-1} = ρ y_{t-1} − ρ β_1 − ρ β_2 x_{t-1}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 80
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2b
Nonlinear Least
Squares Estimation

Substituting, we get:
Eq. 9.43: y_t = β_1(1 − ρ) + β_2 x_t + ρ y_{t-1} − ρ β_2 x_{t-1} + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 81
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2b
Nonlinear Least
Squares Estimation

The coefficient of xt-1 equals -ρβ2


– Although Eq. 9.43 is a linear function of the
variables xt , yt-1 and xt-1, it is not a linear
function of the parameters (β1, β2, ρ)
– The usual linear least squares formulas cannot be obtained by using calculus to find the values of (β1, β2, ρ) that minimize S_v = Σ v_t²
• These are nonlinear least squares estimates
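A minimal sketch of nonlinear least squares for Eq. 9.43, choosing (β1, β2, ρ) to minimize Σ v_t² with scipy; the function and variable names and the zero starting values are assumptions made here for illustration only.

import numpy as np
from scipy.optimize import least_squares

def ar1_error_nls(y, x):
    """Fit y_t = b1(1-rho) + b2*x_t + rho*y_{t-1} - rho*b2*x_{t-1} + v_t (Eq. 9.43)."""
    def v(params):
        b1, b2, rho = params
        return y[1:] - (b1 * (1 - rho) + b2 * x[1:] + rho * y[:-1] - rho * b2 * x[:-1])
    fit = least_squares(v, x0=np.zeros(3))
    return fit.x          # [b1_hat, b2_hat, rho_hat]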

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 82
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2b
Nonlinear Least
Squares Estimation
Our Phillips Curve model assuming AR(1) errors
is:
Eq. 9.44: INF_t = β_1(1 − ρ) + β_2 DU_t + ρ INF_{t-1} − ρ β_2 DU_{t-1} + v_t

– Applying nonlinear least squares and presenting


the estimates in terms of the original
untransformed model, we have:


Eq. 9.45: INF̂ = 0.7609 − 0.6944 DU,   ê_t = 0.557 ê_{t-1} + v̂_t
          (se)  (0.1245)  (0.2479)           (0.090)
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 83
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.2c
Generalized Least
Squares Estimation

Nonlinear least squares estimation of Eq. 9.43 is


equivalent to using an iterative generalized least
squares estimator called the Cochrane-Orcutt
procedure

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 84
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.3
Estimating a More
General Model
We have the model:

Eq. 9.46: y_t = β_1(1 − ρ) + β_2 x_t + ρ y_{t-1} − ρ β_2 x_{t-1} + v_t

– Suppose now that we consider the model:

Eq. 9.47: y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t

• This new notation will be convenient when


we discuss a general class of autoregressive
distributed lag (ARDL) models
– Eq. 9.47 is a member of this class
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 85
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.3
Estimating a More
General Model

 Note that Eq. 9.46 is the same as Eq. 9.47 since:

Eq. 9.48: δ = β_1(1 − ρ),   δ_0 = β_2,   δ_1 = −ρ β_2,   θ_1 = ρ

– Eq. 9.46 is a restricted version of Eq. 9.47 with


the restriction δ1 = -θ1δ0 imposed

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 86
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.3
Estimating a More
General Model

Applying the least squares estimator to Eq. 9.47


using the data for the Phillips curve example
yields:

Eq. 9.49: INF̂_t = 0.3336 + 0.5593 INF_{t-1} − 0.6882 DU_t + 0.3200 DU_{t-1}
          (se)    (0.0899)  (0.0908)          (0.2575)      (0.2499)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 87
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.3
Estimating a More
General Model

The equivalent AR(1) estimates are:


δˆ  βˆ 1  1  ρˆ   0.7609   1  0.5574   0.3368
θˆ1  ρˆ  0.5574
δˆ  βˆ  0.6944
0 2

ˆ ˆ 2  0.5574   0.6944   0.3871


δˆ1  ρβ
– These are similar to our other estimates

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 88
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.3
Estimating a More
General Model

The original economic model for the Phillips


Curve was:

Eq. 9.50: INF_t = INF_t^E − γ(U_t − U_{t-1})

– Re-estimation of the model after omitting DUt-1


yields:

Eq. 9.51: INF̂_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t
          (se)    (0.0876)  (0.0851)          (0.1921)
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 89
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.3
Estimating a More
General Model

In this model inflationary expectations are given


by:

INF̂_t^E = 0.3548 + 0.5282 INF_{t-1}

– A 1% rise in the unemployment rate leads to an


approximate 0.5% fall in the inflation rate

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 90
Stationary Variables
9.5
Estimation with
Serially Correlated
Errors

9.5.4
Summary of
Section 9.5 and
We have described three ways of overcoming the
Looking Ahead
effect of serially correlated errors:
1. Estimate the model using least squares with
HAC standard errors
2. Use nonlinear least squares to estimate the
model with a lagged x, a lagged y, and the
restriction implied by an AR(1) error
specification
3. Use least squares to estimate the model with a
lagged x and a lagged y, but without the
restriction implied by an AR(1) error
specification
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 91
Stationary Variables
9.6
Autoregressive Distributed Lag
Models

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 92
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

An autoregressive distributed lag (ARDL) model


is one that contains both lagged xt’s and lagged yt’s

Eq. 9.52: y_t = δ + δ_0 x_t + δ_1 x_{t-1} + ... + δ_q x_{t-q} + θ_1 y_{t-1} + ... + θ_p y_{t-p} + v_t

– Two examples:

ARDL(1,1): INF̂_t = 0.3336 + 0.5593 INF_{t-1} − 0.6882 DU_t + 0.3200 DU_{t-1}

ARDL(1,0): INF̂_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 93
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

An ARDL model can be transformed into one with


only lagged x’s which go back into the infinite
past:
Eq. 9.53: y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + β_3 x_{t-3} + ... + e_t
              = α + Σ_{s=0}^{∞} β_s x_{t-s} + e_t

– This model is called an infinite distributed


lag model

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 94
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

Four possible criteria for choosing p and q:


1. Has serial correlation in the errors been
eliminated?
2. Are the signs and magnitudes of the estimates
consistent with our expectations from
economic theory?
3. Are the estimates significantly different from
zero, particularly those at the longest lags?
4. What values for p and q minimize information
criteria such as the AIC and SC?

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 95
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

 The Akaike information criterion (AIC) is:

Eq. 9.54: AIC = ln(SSE/T) + 2K/T

where K = p + q + 2

The Schwarz criterion (SC), also known as the Bayes information criterion (BIC), is:

Eq. 9.55: SC = ln(SSE/T) + K ln(T)/T

– Because Kln(T)/T > 2K/T for T ≥ 8, the SC


penalizes additional lags more heavily than
does the AIC
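A minimal sketch of Eqs. 9.54 and 9.55 for lag-order selection; sse would be the sum of squared residuals from a candidate ARDL(p, q) fit, and all candidates should be estimated over the same sample before the criteria are compared. The helper name is hypothetical.

import numpy as np

def aic_sc(sse, T, p, q):
    """AIC and SC for an ARDL(p, q) model with K = p + q + 2 coefficients."""
    K = p + q + 2
    aic = np.log(sse / T) + 2 * K / T
    sc = np.log(sse / T) + K * np.log(T) / T
    return aic, sc
# choose the (p, q) pair with the smallest AIC or SC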
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 96
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.1
The Phillips Curve

Consider the previously estimated ARDL(1,0)


model:

Eq. 9.56: INF̂_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t,   obs = 90
          (se)    (0.0876)  (0.0851)          (0.1921)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 97
Stationary Variables
9.6
Autoregressive
Distributed Lag FIGURE 9.9 Correlogram for residuals from Phillips curve ARDL(1,0) model
Models

9.6.1
The Phillips Curve

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 98
Stationary Variables
9.6
Autoregressive
Distributed Lag Table 9.3 p-values for LM Test for Autocorrelation
Models

9.6.1
The Phillips Curve

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 99
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.1
The Phillips Curve

 For an ARDL(4,0) version of the model:

Eq. 9.57: INF̂_t = 0.1001 + 0.2354 INF_{t-1} + 0.1213 INF_{t-2} + 0.1677 INF_{t-3}
          (se)    (0.0983)  (0.1016)          (0.1038)          (0.1050)
                 + 0.2819 INF_{t-4} − 0.7902 DU_t,   obs = 87
                   (0.1014)           (0.1885)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 100
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.1
The Phillips Curve

Inflationary expectations are given by:

INF̂_t^E = 0.1001 + 0.2354 INF_{t-1} + 0.1213 INF_{t-2} + 0.1677 INF_{t-3} + 0.2819 INF_{t-4}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 101
Stationary Variables
9.6
Autoregressive
Distributed Lag Table 9.4 AIC and SC Values for Phillips Curve ARDL Models
Models

9.6.1
The Phillips Curve

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 102
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.2
Okun’s Law

 Recall the model for Okun’s Law:

Eq. 9.58: DÛ_t = 0.5836 − 0.2020 G_t − 0.1653 G_{t-1} − 0.0700 G_{t-2},   obs = 96
          (se)   (0.0472)  (0.0324)    (0.0335)        (0.0331)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 103
Stationary Variables
9.6
Autoregressive
Distributed Lag FIGURE 9.10 Correlogram for residuals from Okun’s law ARDL(0,2) model
Models

9.6.2
Okun’s Law

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 104
Stationary Variables
9.6
Autoregressive
Distributed Lag Table 9.5 AIC and SC Values for Okun’s Law ARDL Models
Models

9.6.2
Okun’s Law

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 105
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.2
Okun’s Law

 Now consider this version:

Eq. 9.59: DÛ_t = 0.3780 + 0.3501 DU_{t-1} − 0.1841 G_t − 0.0992 G_{t-1},   obs = 96
          (se)   (0.0578)  (0.0846)         (0.0307)     (0.0368)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 106
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.3
Autoregressive
Models

An autoregressive model of order p, denoted


AR(p), is given by:

Eq. 9.60: y_t = δ + θ_1 y_{t-1} + θ_2 y_{t-2} + ... + θ_p y_{t-p} + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 107
Stationary Variables
9.6
Autoregressive
Distributed Lag
Models

9.6.3
Autoregressive
Models

 Consider a model for growth in real GDP:

Eq. 9.61: Ĝ_t = 0.4657 + 0.3770 G_{t-1} + 0.2462 G_{t-2},   obs = 96
          (se)  (0.1433)  (0.1000)        (0.1029)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 108
Stationary Variables
9.6
Autoregressive
Distributed Lag FIGURE 9.11 Correlogram for residuals from AR(2) model for GDP
Models
growth
9.6.3
Autoregressive
Models

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 109
Stationary Variables
9.6
Autoregressive
Distributed Lag Table 9.6 AIC and SC Values for AR Model of Growth in U.S. GDP
Models

9.6.3
Autoregressive
Models

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 110
Stationary Variables
9.7
Forecasting

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 111
Stationary Variables
9.7
Forecasting

We consider forecasting using three different


models:
1. AR model
2. ARDL model
3. Exponential smoothing model

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 112
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

Consider an AR(2) model for real GDP growth:

Eq. 9.62: G_t = δ + θ_1 G_{t-1} + θ_2 G_{t-2} + v_t

The model to forecast G_{T+1} is:

G_{T+1} = δ + θ_1 G_T + θ_2 G_{T-1} + v_{T+1}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 113
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

The growth values for the two most recent


quarters are:
GT = G2009Q3 = 0.8
GT-1 = G2009Q2 = -0.2
The forecast for G2009Q4 is:

GˆT 1  δˆ  θˆ1GT  θˆ 2GT 1


Eq. 9.63  0.46573  0.37700  0.8  0.24624   0.2
 0.7181

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 114
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model
 For two quarters ahead, the forecast for G_{2010Q1} is:

Eq. 9.64: Ĝ_{T+2} = δ̂ + θ̂_1 Ĝ_{T+1} + θ̂_2 G_T
                  = 0.46573 + 0.37700 × 0.71808 + 0.24624 × 0.8
                  = 0.9334

For three periods out, it is:

Eq. 9.65: Ĝ_{T+3} = δ̂ + θ̂_1 Ĝ_{T+2} + θ̂_2 Ĝ_{T+1}
                  = 0.46573 + 0.37700 × 0.93343 + 0.24624 × 0.71808
                  = 0.9945

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 115
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

Summarizing our forecasts:


– Real GDP growth rates for 2009Q4, 2010Q1,
and 2010Q2 are approximately 0.72%, 0.93%,
and 0.99%, respectively
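The recursion behind Eqs. 9.63-9.65 can be written compactly; a minimal sketch, with the estimates of Eq. 9.61 supplied as arguments (the function name is hypothetical).

def ar2_forecast(delta, theta1, theta2, g_tm1, g_t, horizon=3):
    """Iterate G_hat_{T+j} = delta + theta1*G_{T+j-1} + theta2*G_{T+j-2}, feeding forecasts back in."""
    history = [g_tm1, g_t]
    out = []
    for _ in range(horizon):
        nxt = delta + theta1 * history[-1] + theta2 * history[-2]
        out.append(nxt)
        history.append(nxt)
    return out

# ar2_forecast(0.46573, 0.37700, 0.24624, -0.2, 0.8) -> approximately [0.7181, 0.9334, 0.9945]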

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 116
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

A 95% interval forecast for j periods into the


future is given by:

Ĝ_{T+j} ± t_{(0.975, df)} × σ̂_j
where σ̂ j is the standard error of the forecast
error and df is the number of degrees of freedom
in the estimation of the AR model
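The standard error σ̂_j grows with the horizon because forecast errors accumulate. The sketch below computes it for an AR(2) from recursively built weights on future shocks; these are exactly the expressions σ_1², σ_2², σ_3² derived on the following pages. sigma_v is the estimated standard deviation of v_t, 1.96 stands in for the exact t critical value, and the function name is an assumption.

import numpy as np

def ar2_forecast_se(sigma_v, theta1, theta2, horizon=3):
    """Forecast-error std errors: var(u_j) = sigma_v^2 * sum of squared psi-weights."""
    psi = [1.0]                   # weight of v_{T+j} in u_j
    se = []
    for _ in range(horizon):
        se.append(sigma_v * np.sqrt(sum(w * w for w in psi)))
        prev2 = psi[-2] if len(psi) > 1 else 0.0
        psi.append(theta1 * psi[-1] + theta2 * prev2)   # psi_j = theta1*psi_{j-1} + theta2*psi_{j-2}
    return se

# 95% interval: G_hat_{T+j} +/- 1.96 * se[j-1]  (use the exact t value in small samples)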

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 117
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

 The first forecast error, occurring at time T+1, is:

u_1 = G_{T+1} − Ĝ_{T+1} = (δ − δ̂) + (θ_1 − θ̂_1) G_T + (θ_2 − θ̂_2) G_{T-1} + v_{T+1}

Ignoring the error from estimating the coefficients, we get:

Eq. 9.66: u_1 = v_{T+1}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 118
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

The forecast error for two periods ahead is:

Eq. 9.67: u_2 = θ_1 (G_{T+1} − Ĝ_{T+1}) + v_{T+2} = θ_1 u_1 + v_{T+2} = θ_1 v_{T+1} + v_{T+2}

The forecast error for three periods ahead is:

Eq. 9.68: u_3 = θ_1 u_2 + θ_2 u_1 + v_{T+3} = (θ_1² + θ_2) v_{T+1} + θ_1 v_{T+2} + v_{T+3}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 119
Stationary Variables
9.7
Forecasting

9.7.1
Forecasting with
an AR Model

 Because the v_t’s are uncorrelated with constant variance σ_v², we can show that:

σ_1² = var(u_1) = σ_v²
σ_2² = var(u_2) = σ_v² (1 + θ_1²)
σ_3² = var(u_3) = σ_v² [(θ_1² + θ_2)² + θ_1² + 1]

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 120
Stationary Variables
9.7
Forecasting Table 9.7 Forecasts and Forecast Intervals for GDP Growth

9.7.1
Forecasting with
an AR Model

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 121
Stationary Variables
9.7
Forecasting

9.7.2
Forecasting with
an ARDL Model

Consider forecasting future unemployment using


the Okun’s Law ARDL(1,1):

Eq. 9.69: DU_t = δ + θ_1 DU_{t-1} + δ_0 G_t + δ_1 G_{t-1} + v_t

The value of DU in the first post-sample quarter is:

Eq. 9.70: DU_{T+1} = δ + θ_1 DU_T + δ_0 G_{T+1} + δ_1 G_T + v_{T+1}

– But we need a value for GT+1

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 122
Stationary Variables
9.7
Forecasting

9.7.2
Forecasting with
an ARDL Model

Now consider the change in unemployment


– Rewrite Eq. 9.70 as:

U T 1  UT  δ  θ1  UT  UT 1   δ0GT 1  δ1GT  vT 1

– Rearranging:

U T 1  δ   θ1  1 U T  θ1U T 1  δ0GT 1  δ1GT  vT 1


Eq. 9.71
 δ  θ1*U T  θ*2U T 1  δ 0GT 1  δ1GT  vT 1

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 123
Stationary Variables
9.7
Forecasting

9.7.2
Forecasting with
an ARDL Model

For the purpose of computing point and interval


forecasts, the ARDL(1,1) model for a change in
unemployment can be written as an ARDL(2,1)
model for the level of unemployment
– This result holds not only for ARDL models
where a dependent variable is measured in
terms of a change or difference, but also for
pure AR models involving such variables

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 124
Stationary Variables
9.7
Forecasting

9.7.3
Exponential
Smoothing

Another popular model used for predicting the


future value of a variable on the basis of its history
is the exponential smoothing method
– Like forecasting with an AR model, forecasting
using exponential smoothing does not utilize
information from any other variable

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 125
Stationary Variables
9.7
Forecasting

9.7.3
Exponential
Smoothing

One possible forecasting method is to use the


average of past information, such as:

ŷ_{T+1} = (y_T + y_{T-1} + y_{T-2}) / 3
– This forecasting rule is an example of a simple
(equally-weighted) moving average model with
k=3

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 126
Stationary Variables
9.7
Forecasting

9.7.3
Exponential
Smoothing

Now consider a form in which the weights decline


exponentially as the observations get older:

yˆT 1  αyT  α  1  α  yT 1  α  1  α  yT 2 
1 2
Eq. 9.72

– We assume that 0 < α < 1


– Also, it can be shown that:
s
 α  1 α  1

s 0

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 127
Stationary Variables
9.7
Forecasting

9.7.3
Exponential
Smoothing

 For forecasting, recognize that:

Eq. 9.73: (1 − α) ŷ_T = α(1 − α) y_{T-1} + α(1 − α)² y_{T-2} + α(1 − α)³ y_{T-3} + ...

– We can simplify to:

Eq. 9.74: ŷ_{T+1} = α y_T + (1 − α) ŷ_T

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 128
Stationary Variables
9.7
Forecasting

9.7.3
Exponential
Smoothing
The value of α can reflect one’s judgment about
the relative weight of current information
– It can be estimated from historical information
by obtaining within-sample forecasts:

Eq. 9.75: ŷ_t = α y_{t-1} + (1 − α) ŷ_{t-1},   t = 2, 3, ..., T

• Choosing α that minimizes the sum of


squares of the one-step forecast errors:
Eq. 9.76: v_t = y_t − ŷ_t = y_t − [α y_{t-1} + (1 − α) ŷ_{t-1}]
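A minimal sketch of the within-sample recursion in Eq. 9.75 and a grid search for the α that minimizes the sum of squared one-step errors in Eq. 9.76; starting the recursion at ŷ_1 = y_1 is an assumed convention, not something fixed by the text.

import numpy as np

def smooth(y, alpha):
    """y_hat_t = alpha*y_{t-1} + (1 - alpha)*y_hat_{t-1}, with y_hat_1 = y_1 (assumed start)."""
    yhat = np.empty(len(y))
    yhat[0] = y[0]
    for t in range(1, len(y)):
        yhat[t] = alpha * y[t - 1] + (1 - alpha) * yhat[t - 1]
    return yhat

def best_alpha(y, grid=np.linspace(0.01, 0.99, 99)):
    """Pick alpha minimizing the sum of squared one-step errors v_t = y_t - y_hat_t."""
    sse = [np.sum((y[1:] - smooth(y, a)[1:]) ** 2) for a in grid]
    return grid[int(np.argmin(sse))]

# out-of-sample forecast (Eq. 9.74): alpha*y[-1] + (1 - alpha)*smooth(y, alpha)[-1]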

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 129
Stationary Variables
9.7 FIGURE 9.12 (a) Exponentially smoothed forecasts for GDP growth with α
Forecasting
= 0.38

9.7.3
Exponential
Smoothing

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 130
Stationary Variables
9.7 FIGURE 9.12 (b) Exponentially smoothed forecasts for GDP growth with α
Forecasting
= 0.8

9.7.3
Exponential
Smoothing

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 131
Stationary Variables
9.7
Forecasting

9.7.3
Exponential
Smoothing

 The forecasts for 2009Q4 from each value of α are:

α = 0.38:  Ĝ_{T+1} = α G_T + (1 − α) Ĝ_T = 0.38 × 0.8 + (1 − 0.38) × (−0.403921) = 0.0536
α = 0.8:   Ĝ_{T+1} = α G_T + (1 − α) Ĝ_T = 0.8 × 0.8 + (1 − 0.8) × (−0.393578) = 0.5613

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 132
Stationary Variables
9.8
Multiplier Analysis

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 133
Stationary Variables
9.8
Multiplier Analysis

Multiplier analysis refers to the effect, and the


timing of the effect, of a change in one variable on
the outcome of another variable

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 134
Stationary Variables
9.8
Multiplier Analysis

Let’s find multipliers for an ARDL model of the


form:

Eq. 9.77: y_t = δ + θ_1 y_{t-1} + ... + θ_p y_{t-p} + δ_0 x_t + δ_1 x_{t-1} + ... + δ_q x_{t-q} + v_t

– We can transform this into an infinite distributed lag model:

Eq. 9.78: y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + β_3 x_{t-3} + ... + e_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 135
Stationary Variables
9.8
Multiplier Analysis

 The multipliers are defined as:

β_s = ∂y_t / ∂x_{t-s}    (s-period delay multiplier)
Σ_{j=0}^{s} β_j          (s-period interim multiplier)
Σ_{j=0}^{∞} β_j          (total multiplier)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 136
Stationary Variables
9.8
Multiplier Analysis

 The lag operator is defined as:

L y_t = y_{t-1}

– Lagging twice, we have:
L(L y_t) = L y_{t-1} = y_{t-2}

– Or:
L² y_t = y_{t-2}

– More generally, we have:
L^s y_t = y_{t-s}

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 137
Stationary Variables
9.8
Multiplier Analysis

Now rewrite our model as:

Eq. 9.79: y_t = δ + θ_1 L y_t + θ_2 L² y_t + ... + θ_p L^p y_t + δ_0 x_t + δ_1 L x_t + δ_2 L² x_t + ... + δ_q L^q x_t + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 138
Stationary Variables
9.8
Multiplier Analysis

Rearranging terms:

Eq. 9.80: (1 − θ_1 L − θ_2 L² − ... − θ_p L^p) y_t = δ + (δ_0 + δ_1 L + δ_2 L² + ... + δ_q L^q) x_t + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 139
Stationary Variables
9.8
Multiplier Analysis

Let’s apply this to our Okun’s Law model


– The model:

Eq. 9.81: DU_t = δ + θ_1 DU_{t-1} + δ_0 G_t + δ_1 G_{t-1} + v_t

can be rewritten as:

Eq. 9.82: (1 − θ_1 L) DU_t = δ + (δ_0 + δ_1 L) G_t + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 140
Stationary Variables
9.8
Multiplier Analysis

 Define the inverse of (1 − θ_1 L) as (1 − θ_1 L)^{-1} such that:

(1 − θ_1 L)^{-1} (1 − θ_1 L) = 1

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 141
Stationary Variables
9.8
Multiplier Analysis

Multiply both sides of Eq. 9.82 by (1 – θ1L)-1:

Eq. 9.83: DU_t = (1 − θ_1 L)^{-1} δ + (1 − θ_1 L)^{-1} (δ_0 + δ_1 L) G_t + (1 − θ_1 L)^{-1} v_t

– Equating this with the infinite distributed lag representation:

Eq. 9.84: DU_t = α + β_0 G_t + β_1 G_{t-1} + β_2 G_{t-2} + β_3 G_{t-3} + ... + e_t
               = α + (β_0 + β_1 L + β_2 L² + β_3 L³ + ...) G_t + e_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 142
Stationary Variables
9.8
Multiplier Analysis

For Eqs. 9.83 and 9.84 to be identical, it must be


true that:

Eq. 9.85: α = (1 − θ_1 L)^{-1} δ

Eq. 9.86: β_0 + β_1 L + β_2 L² + β_3 L³ + ... = (1 − θ_1 L)^{-1} (δ_0 + δ_1 L)

Eq. 9.87: e_t = (1 − θ_1 L)^{-1} v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 143
Stationary Variables
9.8
Multiplier Analysis

 Multiply both sides of Eq. 9.85 by (1 − θ_1 L) to obtain (1 − θ_1 L) α = δ.
– Note that a constant does not change when it is lagged, so Lα = α
– Now we have:

(1 − θ_1) α = δ   and   α = δ / (1 − θ_1)

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 144
Stationary Variables
9.8
Multiplier Analysis

 Multiply both sides of Eq. 9.86 by (1 − θ_1 L):

Eq. 9.88: δ_0 + δ_1 L = (1 − θ_1 L)(β_0 + β_1 L + β_2 L² + β_3 L³ + ...)
                      = β_0 + β_1 L + β_2 L² + β_3 L³ + ...
                        − β_0 θ_1 L − β_1 θ_1 L² − β_2 θ_1 L³ − ...
                      = β_0 + (β_1 − β_0 θ_1) L + (β_2 − β_1 θ_1) L² + (β_3 − β_2 θ_1) L³ + ...

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 145
Stationary Variables
9.8
Multiplier Analysis

Rewrite Eq. 9.86 as:

Eq. 9.89: δ_0 + δ_1 L + 0·L² + 0·L³ + ... = β_0 + (β_1 − β_0 θ_1) L + (β_2 − β_1 θ_1) L² + (β_3 − β_2 θ_1) L³ + ...

– Equating coefficients of like powers in L yields:

δ_0 = β_0
δ_1 = β_1 − β_0 θ_1
0 = β_2 − β_1 θ_1
0 = β_3 − β_2 θ_1
and so on
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 146
Stationary Variables
9.8
Multiplier Analysis

 We can now find the β’s using the recursive equations:

Eq. 9.90: β_0 = δ_0
          β_1 = δ_1 + β_0 θ_1
          β_j = β_{j-1} θ_1   for j ≥ 2

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 147
Stationary Variables
9.8
Multiplier Analysis

You can start from the equivalent of Eq. 9.88


which, in its general form, is:


Eq. 9.91: δ_0 + δ_1 L + δ_2 L² + ... + δ_q L^q = (1 − θ_1 L − θ_2 L² − ... − θ_p L^p)(β_0 + β_1 L + β_2 L² + β_3 L³ + ...)
– Given the values p and q for your ARDL
model, you need to multiply out the above
expression, and then equate coefficients of like
powers in the lag operator

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 148
Stationary Variables
9.8
Multiplier Analysis

 For the Okun’s Law model:

DÛ_t = 0.3780 + 0.3501 DU_{t-1} − 0.1841 G_t − 0.0992 G_{t-1}

– The impact and delay multipliers for the first four quarters are:

β̂_0 = δ̂_0 = −0.1841
β̂_1 = δ̂_1 + β̂_0 θ̂_1 = −0.099155 + (−0.184084)(0.350116) = −0.1636
β̂_2 = β̂_1 θ̂_1 = (−0.163606)(0.350116) = −0.0573
β̂_3 = β̂_2 θ̂_1 = (−0.057281)(0.350116) = −0.0201
β̂_4 = β̂_3 θ̂_1 = (−0.020055)(0.350116) = −0.0070
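The recursion of Eq. 9.90 is easy to automate; a minimal sketch that reproduces the delay multipliers above and also returns the interim and total multipliers (the function name and argument order are assumptions).

import numpy as np

def ardl11_multipliers(delta0, delta1, theta1, horizon=4):
    """Delay multipliers beta_s (Eq. 9.90), interim multipliers, and the total multiplier."""
    betas = [delta0, delta1 + delta0 * theta1]
    for _ in range(2, horizon + 1):
        betas.append(betas[-1] * theta1)            # beta_j = beta_{j-1} * theta1, j >= 2
    interim = np.cumsum(betas)
    total = (delta0 + delta1) / (1 - theta1)        # sum of the infinite sequence
    return betas, interim, total

# ardl11_multipliers(-0.184084, -0.099155, 0.350116) gives delay multipliers
# -0.1841, -0.1636, -0.0573, -0.0201, -0.0070 and a total multiplier of about -0.4358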


Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 149
Stationary Variables
9.8
Multiplier Analysis FIGURE 9.13 Delay multipliers from Okun’s law ARDL(1,1) model

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 150
Stationary Variables
9.8
Multiplier Analysis

 We can estimate the total multiplier given by:

Σ_{j=0}^{∞} β_j

and the normal growth rate that is needed to maintain a constant rate of unemployment:

G_N = −α / Σ_{j=0}^{∞} β_j

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 151
Stationary Variables
9.8
Multiplier Analysis

 We can show that:

Σ_{j=0}^{∞} β̂_j = δ̂_0 + (δ̂_1 + δ̂_0 θ̂_1)/(1 − θ̂_1) = −0.184084 + (−0.163606)/(1 − 0.350116) = −0.4358

– An estimate for α is given by:

α̂ = δ̂ / (1 − θ̂_1) = 0.37801 / 0.649884 = 0.5817

– Therefore, the normal growth rate is:

Ĝ_N = −α̂ / Σ_{j=0}^{∞} β̂_j = 0.5817 / 0.4358 ≈ 1.3% per quarter

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 152
Stationary Variables
Key Words

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 153
Stationary Variables
Keywords

AIC criterion; AR(1) error; AR(p) model; ARDL(p,q) model; autocorrelation; autoregressive distributed lags; autoregressive error; autoregressive model; BIC criterion; correlogram; delay multiplier; distributed lag weight; dynamic models; exponential smoothing; finite distributed lag; forecast error; forecast intervals; forecasting; HAC standard errors; impact multiplier; infinite distributed lag; interim multiplier; lag length; lag operator; lagged dependent variable; LM test; multiplier analysis; nonlinear least squares; out-of-sample forecasts; sample autocorrelations; SC criterion; serial correlation; standard error of forecast error; T × R² form of LM test; total multiplier; within-sample forecasts
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 154
Stationary Variables
Appendices

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 155
Stationary Variables
9A
The Durbin-
Watson Test

 For the Durbin-Watson test, the hypotheses are:

H0: ρ = 0    H1: ρ > 0

The test statistic is:

Eq. 9A.1: d = Σ_{t=2}^{T} (ê_t − ê_{t-1})² / Σ_{t=1}^{T} ê_t²

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 156
Stationary Variables
9A
The Durbin-
Watson Test

 We can expand the test statistic as:

Eq. 9A.2: d = [ Σ_{t=2}^{T} ê_t² + Σ_{t=2}^{T} ê_{t-1}² − 2 Σ_{t=2}^{T} ê_t ê_{t-1} ] / Σ_{t=1}^{T} ê_t²
            = Σ_{t=2}^{T} ê_t² / Σ_{t=1}^{T} ê_t² + Σ_{t=2}^{T} ê_{t-1}² / Σ_{t=1}^{T} ê_t² − 2 Σ_{t=2}^{T} ê_t ê_{t-1} / Σ_{t=1}^{T} ê_t²
            ≈ 1 + 1 − 2 r_1
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 157
Stationary Variables
9A
The Durbin-
Watson Test

We can now write:

Eq. 9A.3: d ≈ 2(1 − r_1)
– If the estimated value of ρ is r1 = 0, then the
Durbin-Watson statistic d ≈ 2
• This is taken as an indication that the model
errors are not autocorrelated
– If the estimate of ρ happened to be r1 = 1 then d ≈
0
• A low value for the Durbin-Watson statistic
implies that the model errors are correlated, and
ρ>0
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 158
Stationary Variables
9A
The Durbin-
Watson Test FIGURE 9A.1 Testing for positive autocorrelation

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 159
Stationary Variables
9A
The Durbin- FIGURE 9A.2 Upper and lower critical value bounds for the Durbin-
Watson Test
Watson test

9A.1
The Durbin-
Watson Bounds
Test

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 160
Stationary Variables
9A
The Durbin-
Watson Test

9A.1
The Durbin-
Watson Bounds
Test

Decision rules, known collectively as the Durbin-


Watson bounds test:
– If d < dLc: reject H0: ρ = 0 and accept H1:
ρ>0
– If d > dUc do not reject H0: ρ = 0
– If dLc < d < dUc, the test is inconclusive

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 161
Stationary Variables
9B
Properties of the
AR(1) Error

Note that:

Eq. 9B.1: e_t = ρ e_{t-1} + v_t
              = ρ(ρ e_{t-2} + v_{t-1}) + v_t
              = ρ² e_{t-2} + ρ v_{t-1} + v_t

Further substitution shows that:

Eq. 9B.2: e_t = ρ²(ρ e_{t-3} + v_{t-2}) + ρ v_{t-1} + v_t
              = ρ³ e_{t-3} + ρ² v_{t-2} + ρ v_{t-1} + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 162
Stationary Variables
9B
Properties of the
AR(1) Error

 Repeating the substitution k times and rearranging:

Eq. 9B.3: e_t = ρ^k e_{t-k} + v_t + ρ v_{t-1} + ρ² v_{t-2} + ... + ρ^{k-1} v_{t-k+1}

If we let k → ∞, then we have:

Eq. 9B.4: e_t = v_t + ρ v_{t-1} + ρ² v_{t-2} + ρ³ v_{t-3} + ...

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 163
Stationary Variables
9B
Properties of the
AR(1) Error

 We can now find the properties of e_t:

E(e_t) = E(v_t) + ρ E(v_{t-1}) + ρ² E(v_{t-2}) + ρ³ E(v_{t-3}) + ...
       = 0 + ρ·0 + ρ²·0 + ρ³·0 + ...
       = 0

var(e_t) = var(v_t) + ρ² var(v_{t-1}) + ρ⁴ var(v_{t-2}) + ρ⁶ var(v_{t-3}) + ...
         = σ_v² + ρ² σ_v² + ρ⁴ σ_v² + ρ⁶ σ_v² + ...
         = σ_v² (1 + ρ² + ρ⁴ + ρ⁶ + ...)
         = σ_v² / (1 − ρ²)
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 164
Stationary Variables
9B
Properties of the
AR(1) Error

 The covariance for one period apart is:

cov(e_t, e_{t-1}) = E(e_t e_{t-1})
  = E[(v_t + ρ v_{t-1} + ρ² v_{t-2} + ρ³ v_{t-3} + ...)(v_{t-1} + ρ v_{t-2} + ρ² v_{t-3} + ρ³ v_{t-4} + ...)]
  = ρ E(v_{t-1}²) + ρ³ E(v_{t-2}²) + ρ⁵ E(v_{t-3}²) + ...
  = ρ σ_v² (1 + ρ² + ρ⁴ + ...)
  = ρ σ_v² / (1 − ρ²)
Chapter 9: Regression with Time Series Data:
Principles of Econometrics, 4th Edition Page 165
Stationary Variables
9B
Properties of the
AR(1) Error

Similarly, the covariance for k periods apart is:

cov(e_t, e_{t-k}) = ρ^k σ_v² / (1 − ρ²),   k > 0

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 166
Stationary Variables
9C
Generalized Least
Squares Estimation

We are considering the simple regression model


with AR(1) errors:

y_t = β_1 + β_2 x_t + e_t,   e_t = ρ e_{t-1} + v_t

To specify the transformed model we begin with:

Eq. 9C.1: y_t = β_1 + β_2 x_t + ρ y_{t-1} − ρ β_1 − ρ β_2 x_{t-1} + v_t

– Rearranging terms:

Eq. 9C.2: y_t − ρ y_{t-1} = β_1(1 − ρ) + β_2(x_t − ρ x_{t-1}) + v_t

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 167
Stationary Variables
9C
Generalized Least
Squares Estimation

 Defining the following transformed variables:

y*_t = y_t − ρ y_{t-1},   x*_{t2} = x_t − ρ x_{t-1},   x*_{t1} = 1 − ρ

Substituting the transformed variables, we get:

Eq. 9C.3: y*_t = x*_{t1} β_1 + x*_{t2} β_2 + v_t
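A minimal sketch of the GLS idea in Eqs. 9C.2-9C.3: for a given ρ, transform the data and apply ordinary least squares, then iterate between re-estimating ρ from the residuals and re-estimating the βs, which is the Cochrane-Orcutt procedure referred to in Section 9.5.2c. Array names are placeholders, and the first observation is simply dropped here rather than transformed as in Eqs. 9C.5-9C.6.

import numpy as np

def gls_step(y, x, rho):
    """OLS on the transformed observations y*_t = x*_{t1} b1 + x*_{t2} b2 + v_t (Eq. 9C.3)."""
    ystar = y[1:] - rho * y[:-1]
    X = np.column_stack([np.full(len(y) - 1, 1 - rho), x[1:] - rho * x[:-1]])
    b, *_ = np.linalg.lstsq(X, ystar, rcond=None)
    return b                                   # [b1_hat, b2_hat]

def cochrane_orcutt(y, x, n_iter=20):
    """Iterate: rho from a lag-one regression of the residuals, then GLS for the betas."""
    X = np.column_stack([np.ones(len(y)), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    rho = 0.0
    for _ in range(n_iter):
        e = y - X @ b
        rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])
        b = gls_step(y, x, rho)
    return b, rho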

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 168
Stationary Variables
9C
Generalized Least
Squares Estimation

There are two problems:


1. Because lagged values of yt and xt had to be
formed, only (T - 1) new observations were
created by the transformation
2. The value of the autoregressive parameter ρ is
not known

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 169
Stationary Variables
9C
Generalized Least
Squares Estimation

For the second problem, we can write Eq. 9C.1 as:

Eq. 9C.4: y_t = β_1 + β_2 x_t + ρ(y_{t-1} − β_1 − β_2 x_{t-1}) + v_t

For the first problem, note that:

y_1 = β_1 + β_2 x_1 + e_1

and that

√(1 − ρ²) y_1 = √(1 − ρ²) β_1 + √(1 − ρ²) x_1 β_2 + √(1 − ρ²) e_1

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 170
Stationary Variables
9C
Generalized Least
Squares Estimation

Or:

Eq. 9C.5: y*_1 = x*_{11} β_1 + x*_{12} β_2 + e*_1

where

Eq. 9C.6: y*_1 = √(1 − ρ²) y_1,   x*_{11} = √(1 − ρ²),   x*_{12} = √(1 − ρ²) x_1,   e*_1 = √(1 − ρ²) e_1

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 171
Stationary Variables
9C
Generalized Least
Squares Estimation

 To confirm that the variance of e*_1 is the same as that of the errors (v_2, v_3, …, v_T), note that:

var(e*_1) = (1 − ρ²) var(e_1) = (1 − ρ²) σ_v² / (1 − ρ²) = σ_v²

Chapter 9: Regression with Time Series Data:


Principles of Econometrics, 4th Edition Page 172
Stationary Variables
