In this section we consider another correlation function which, together with the ACF, will help us identify models. It is called the Partial Autocorrelation Function (PACF). Before introducing a formal definition of the PACF we motivate the idea for AR(1). Let
\[
X_t = \phi X_{t-1} + Z_t
\]
be a causal AR(1) process. Then
\[
\begin{aligned}
\gamma(2) &= \operatorname{cov}(X_t, X_{t-2}) \\
&= \operatorname{cov}(\phi X_{t-1} + Z_t,\, X_{t-2}) \\
&= \operatorname{cov}(\phi^2 X_{t-2} + \phi Z_{t-1} + Z_t,\, X_{t-2}) \\
&= E\bigl[(\phi^2 X_{t-2} + \phi Z_{t-1} + Z_t) X_{t-2}\bigr] \\
&= \phi^2 \gamma(0).
\end{aligned}
\]
The autocorrelation at lag 2 is not zero because $X_t$ depends on $X_{t-2}$ through $X_{t-1}$. Due to the recursive nature of AR models there is a chain of dependence. We can break this dependence by removing the influence of $X_{t-1}$ from both $X_t$ and $X_{t-2}$, obtaining
\[
X_t - \phi X_{t-1} \quad \text{and} \quad X_{t-2} - \phi X_{t-1},
\]
for which the covariance is zero, i.e.,
\[
\operatorname{cov}(X_t - \phi X_{t-1},\, X_{t-2} - \phi X_{t-1}) = \operatorname{cov}(Z_t,\, X_{t-2} - \phi X_{t-1}) = 0.
\]
Similarly, we obtain zero covariance for $X_t$ and $X_{t-3}$ after breaking the chain of dependence, i.e. removing the dependence of the two variables on $X_{t-1}$ and $X_{t-2}$: we consider $X_t - f(X_{t-1}, X_{t-2})$ and $X_{t-3} - f(X_{t-1}, X_{t-2})$ for a suitable function $f$. Continuing in this way, we obtain zero covariances for the variables $X_t - f(X_{t-1}, X_{t-2}, \ldots, X_{t-h+1})$ and $X_{t-h} - f(X_{t-1}, X_{t-2}, \ldots, X_{t-h+1})$ at every lag $h > 1$. The only nonzero covariance is then the one between $X_t$ and $X_{t-1}$, since there is nothing in between to break the chain of dependence. These covariances, with an appropriate function $f$, divided by the variance of the process, are the partial autocorrelations. Hence, for a causal AR(1) process the PACF equals $\rho(1)$ at lag 1 and $0$ at lags $h > 1$. This, together with the tailing-off shape of the ACF, identifies the process.
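A minimal R sketch of this signature (the simulation settings and variable names are our choice, not from the text):

# Simulate a causal AR(1) with phi = 0.9 and inspect its sample ACF and PACF.
set.seed(1)
phi <- 0.9
x <- arima.sim(model = list(ar = phi), n = 200)
op <- par(mfrow = c(1, 2))
acf(x)    # tails off, roughly like phi^h
pacf(x)   # single spike near phi at lag 1, approximately zero afterwards
par(op)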
Formally, for a stationary process the PACF is defined by
\[
\phi_{11} = \operatorname{corr}(X_1, X_0) = \rho(1),
\]
\[
\phi_{hh} = \operatorname{corr}\bigl(X_h - f_{(h-1)},\, X_0 - f_{(h-1)}\bigr), \quad h \geq 2, \tag{6.17}
\]
where
\[
f_{(h-1)} = f(X_{h-1}, \ldots, X_1)
\]
minimizes the mean square linear prediction error
\[
E\bigl(X_h - f_{(h-1)}\bigr)^2.
\]
Remark 6.4. The subscript on the function $f$ denotes the number of variables the function depends on.
Remark 6.5. By stationarity, $\phi_{hh}$ is the correlation between the variables $X_t$ and $X_{t-h}$ with the linear effect of the intermediate variables,
\[
f(X_{t-1}, \ldots, X_{t-h+1}) = \beta_1 X_{t-1} + \cdots + \beta_{h-1} X_{t-h+1},
\]
removed from both.
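To make the definition concrete, the lag-$h$ partial autocorrelation can be estimated from data by regressing both $X_t$ and $X_{t-h}$ on the intermediate variables and correlating the residuals. A minimal R sketch for $h = 3$ (simulation settings and variable names are ours):

# Estimate the lag-3 partial autocorrelation directly from definition (6.17):
# regress X_t and X_{t-3} on X_{t-1}, X_{t-2}, then correlate the residuals.
set.seed(2)
x <- as.numeric(arima.sim(model = list(ar = 0.9), n = 500))
n <- length(x); h <- 3
xt  <- x[(h + 1):n]          # X_t
xth <- x[1:(n - h)]          # X_{t-h}
w1  <- x[h:(n - 1)]          # X_{t-1}
w2  <- x[(h - 1):(n - 2)]    # X_{t-2}
r1 <- resid(lm(xt  ~ w1 + w2))   # X_t with the linear effect of the middle removed
r2 <- resid(lm(xth ~ w1 + w2))   # the same for X_{t-h}
cor(r1, r2)                   # regression-based estimate of phi_hh
pacf(x, plot = FALSE)$acf[h]  # close: pacf() uses the Durbin-Levinson recursion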
Consider a causal AR(1) process
\[
X_t = \phi X_{t-1} + Z_t, \quad Z_t \sim \mathrm{WN}(0, \sigma^2).
\]
Then
\[
\phi_{11} = \rho(1) = \phi.
\]
For $h = 2$ we take $f_{(1)} = \beta X_1$ and choose $\beta$ to minimize
\[
E(X_2 - \beta X_1)^2.
\]
Hence
\[
\beta = \frac{\gamma(1)}{\gamma(0)} = \rho(1) = \phi
\]
and
\[
f_{(1)} = \phi X_1.
\]
Then
\[
\phi_{22} = \operatorname{corr}(X_2 - \phi X_1,\, X_0 - \phi X_1) = \operatorname{corr}(Z_2,\, X_0 - \phi X_1) = 0.
\]
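The same conclusion can be checked against the theoretical PACF that R computes from the model coefficients (a minimal sketch; the value $\phi = 0.9$ is our choice):

# Theoretical PACF of an AR(1): equal to phi at lag 1, zero at all higher lags.
ARMAacf(ar = 0.9, lag.max = 5, pacf = TRUE)
# lags 1-5: 0.9 followed by (numerically) zero entries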
Let
\[
X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t, \quad Z_t \sim \mathrm{WN}(0, \sigma^2),
\]
be a causal AR(p) process, i.e., we assume that the roots of $\phi(z)$ lie outside the unit circle. When $h > p$, the linear combination minimizing the mean square linear prediction error is
\[
f_{(p)} = \sum_{j=1}^{p} \phi_j X_{h-j}.
\]
We will discuss this result later. Now we use it to obtain the PACF for $h > p$, namely
\[
\phi_{hh} = \operatorname{corr}\bigl(X_h - f_{(p)},\, X_0 - f_{(p)}\bigr) = \operatorname{corr}\bigl(Z_h,\, X_0 - f_{(p)}\bigr) = 0,
\]
since by causality the variables $X_{h-j}$, $j = 1, \ldots, p$, and $X_0$ do not depend on the future noise value $Z_h$.
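Hence the PACF of a causal AR(p) process cuts off after lag $p$. A quick numerical check for an AR(2) (the coefficient values below are our choice):

# Theoretical PACF of a causal AR(2): nonzero at lags 1 and 2, zero beyond.
ARMAacf(ar = c(0.5, 0.25), lag.max = 6, pacf = TRUE)
# lag 2 equals phi_2 = 0.25; lags 3-6 are (numerically) zero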
Figure 6.3: AR(1) for various values of the parameter $\phi = 0.9, -0.9, 0.5, -0.5$ (panels AR1.phi.0.9, AR1.phi.minus0.9, AR1.phi.0.5, AR1.phi.minus0.5; horizontal axis $t$).
[Figure: four pairs of sample ACF (left, lags 0-100) and sample PACF (right, lags 0-20) plots for the simulated AR(1) series.]
[Figure: sample PACF of the simulated series AR2$x, plotted against lag.]
[Figure: sample path of a simulated MA(1) series, plotted against $t$.]
[Figure: sample ACF (left) and sample PACF (right) of the simulated MA(1) series MA1$theta09, plotted against lag.]
An invertible MA(1) process $X_t = Z_t + \theta Z_{t-1}$, $|\theta| < 1$, can be inverted to give
\[
X_t = -\sum_{j=1}^{\infty} (-\theta)^j X_{t-j} + Z_t.
\]
This is an AR($\infty$) representation ($p = \infty$), so the PACF never cuts off, unlike for an AR(p) with finite $p$. In general, the PACF of MA models behaves like the ACF of AR models, and the PACF of AR models behaves like the ACF of MA models.
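A minimal R sketch of this duality for an MA(1) with $\theta = 0.9$ (settings ours):

# For an MA(1), the ACF cuts off after lag 1 while the PACF tails off.
set.seed(3)
y <- arima.sim(model = list(ma = 0.9), n = 200)  # X_t = Z_t + 0.9 Z_{t-1}
op <- par(mfrow = c(1, 2))
acf(y)    # cuts off after lag 1, like the PACF of an AR(1)
pacf(y)   # tails off, alternating in sign for theta > 0
par(op)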
It can be shown that the PACF of the MA(1) process is
\[
\phi_{hh} = -\frac{(-\theta)^h (1 - \theta^2)}{1 - \theta^{2(h+1)}}, \quad h \geq 1.
\]
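This formula can be checked against the theoretical PACF computed by R's ARMAacf, which uses the same sign convention $X_t = Z_t + \theta Z_{t-1}$ (a minimal sketch with $\theta = 0.9$, our choice):

theta <- 0.9
h <- 1:10
phi_hh <- -(-theta)^h * (1 - theta^2) / (1 - theta^(2 * (h + 1)))
# Compare with the theoretical PACF from stats::ARMAacf:
max(abs(phi_hh - ARMAacf(ma = theta, lag.max = 10, pacf = TRUE)))  # ~ 0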