Recall that stationary processes vary about a fixed level, while nonstationary processes have no natural constant mean level. The Autoregressive Integrated Moving Average (ARIMA) models, also known as the Box-Jenkins methodology, are a class of linear models capable of representing both stationary and nonstationary time series.
ARIMA models rely heavily on the autocorrelation patterns in the data: the sample ACF and PACF of the time series are matched with the theoretical autocorrelation patterns associated with particular ARIMA models, and both are used to select an initial model.
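As a minimal sketch of this matching idea (assuming Python with numpy; the values and the AR(1) coefficient below are illustrative, not from the text), we can simulate an AR(1) series and check that its sample ACF decays geometrically, as theory predicts:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate an AR(1) process y_t = 0.8*y_{t-1} + e_t
# (phi = 0.8 is an illustrative value, not from the text).
n, phi = 2000, 0.8
y = np.zeros(n)
e = rng.standard_normal(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

def sample_acf(x, nlags):
    """Sample autocorrelations r_1..r_nlags about the sample mean."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:-k], x[k:]) / len(x) / c0
                     for k in range(1, nlags + 1)])

r = sample_acf(y, 5)
# For AR(1), the theoretical ACF at lag k is phi**k: it dies out geometrically.
print(np.round(r, 2))
```

The printed sample autocorrelations should be close to 0.8, 0.64, 0.51, ..., i.e. the theoretical pattern of an AR(1) model.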
Autoregressive models are appropriate for stationary time series, and the constant coefficient φ0 is related to the constant level of the series.
Theoretical behavior of the ACF and PACF for AR(1) and AR(2) models:
- AR(1): ACF dies out; PACF cuts off after lag 1.
- AR(2): ACF dies out; PACF cuts off after lag 2.
The AR(p) model can be written as

    Y_t = φ0 + φ1 Y_{t-1} + ... + φp Y_{t-p} + ε_t

where φ0 is related to the constant mean of the process, φ1, ..., φp are the regression coefficients to be estimated, and ε_{t-k} is the error in time period t - k.
The term Moving Average is historical and should not be confused with the moving average smoothing procedures.
MA models are appropriate for stationary time series. The weights ωi do not necessarily sum to 1 and may be positive or negative.
Theoretical behavior of the ACF and PACF for MA(1) and MA(2) models:
- MA(1): ACF cuts off after lag 1; PACF dies out.
- MA(2): ACF cuts off after lag 2; PACF dies out.
ARMA(p,q) models can describe a wide variety of behaviors for stationary time series. Theoretical behavior of the ACF and PACF for autoregressive-moving average processes:

Process      ACF                                  PACF
AR(p)        Dies out                             Cuts off after the order p of the process
MA(q)        Cuts off after the order q           Dies out
ARMA(p,q)    Dies out                             Dies out
In some cases, it may be necessary to difference the differences before stationary data are obtained.
Note that ARIMA(p,0,q) = ARMA(p,q). By counting the number of significant sample autocorrelations and partial autocorrelations, the orders of the AR and MA parts can be determined.
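The differencing-and-counting idea can be sketched as follows (assuming Python with numpy; the random walk is an illustrative stand-in for a nonstationary series): a random walk has persistently large sample autocorrelations, while its first differences behave like white noise.

```python
import numpy as np

rng = np.random.default_rng(1)

# A random walk is nonstationary; its first difference is white noise,
# i.e. the walk itself is ARIMA(0,1,0).
walk = np.cumsum(rng.standard_normal(500))
diff1 = np.diff(walk)          # first differences

def n_significant(x, nlags):
    """Count sample autocorrelations outside the +-2/sqrt(n) error limits."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    bound = 2 / np.sqrt(len(x))
    count = 0
    for k in range(1, nlags + 1):
        rk = np.dot(x[:-k], x[k:]) / len(x) / c0
        if abs(rk) > bound:
            count += 1
    return count

print("significant lags, original series :", n_significant(walk, 16))
print("significant lags, first differences:", n_significant(diff1, 16))
```

The original series shows many significant autocorrelations; after one difference almost none remain, which is how the order d is chosen in practice.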
Advice: start with a model containing few rather than many parameters. The need for additional parameters will be evident from an examination of the residual ACF and PACF.
Many of the same residual plots that are useful in regression analysis can be developed for the residuals from an ARIMA model (histogram, normal probability plot, time sequence plot, etc.)
After an adequate model has been found, forecasts can be made, and prediction intervals based on the forecasts can also be constructed. As more data become available, it is a good idea to monitor the forecast errors, since the model may need to be reevaluated if:
- the magnitudes of the most recent errors tend to be consistently larger than previous errors, or
- the recent forecast errors tend to be consistently positive or negative.
In general, the longer the forecast lead time, the larger the prediction interval (due to greater uncertainty)
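This widening of the prediction interval can be made concrete for an AR(1) model, whose h-step-ahead forecast error variance has a closed form. A minimal sketch (assuming Python with numpy; φ and σ are illustrative values, not from the text):

```python
import numpy as np

# For an AR(1) model, the h-step-ahead forecast error variance is
#   Var(e_h) = sigma^2 * (1 + phi^2 + ... + phi^(2(h-1))),
# so the 95% prediction interval widens as the lead time h grows.
phi, sigma = 0.7, 1.0          # illustrative values, not from the text
widths = []
for h in range(1, 6):
    var_h = sigma**2 * sum(phi**(2 * j) for j in range(h))
    widths.append(1.96 * np.sqrt(var_h))
    print(f"lead time {h}: 95% interval half-width = {widths[-1]:.3f}")
```

The half-widths increase monotonically with the lead time, converging to the interval implied by the unconditional variance of the process.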
Seasonal ARIMA models include seasonal AR and MA terms that account for the correlation at the seasonal lags. In addition, for nonstationary seasonal series, a seasonal difference is often required.
File: PORTFOLIO_INVESTMENT.MTW
Stat > Time Series >
Time Series Plot of Index
A consulting corporation wants to try the Box-Jenkins technique for forecasting the Transportation Index of the Dow Jones.
The first several autocorrelations are persistently large and trail off to zero rather slowly, so a trend exists and this time series is nonstationary (it does not vary about a fixed level).
Idea: difference the data to see if the trend can be eliminated and a stationary series obtained.
[Figure: sample ACF of the Transportation Index (lags 1-16) and time series plot of the first differences, Diff1, observations 1-60]
Comparing the autocorrelations with their error limits, the only significant autocorrelation is at lag 1. Similarly, only the lag 1 partial autocorrelation is significant. The PACF appears to cut off after lag 1, indicating AR(1) behavior; the ACF appears to cut off after lag 1, indicating MA(1) behavior. We will therefore try ARIMA(1,1,0) and ARIMA(0,1,1).
A constant term in each model will be included to allow for the fact that the series of differences appears to vary about a level greater than zero.
The LBQ statistics are not significant, as indicated by the large p-values, for either model.
Finally, there is no significant residual autocorrelation for the ARIMA(1,1,0) model. The results for the ARIMA(0,1,1) are similar.
[Figure: ACF of residuals from the ARIMA(1,1,0) model, lags 1-16]
Therefore, either model is adequate, and the two provide nearly the same one-step-ahead forecasts.
File: READINGS.MTW
Stat > Time Series >
Time Series Plot of Readings
A consulting corporation wants to try the Box-Jenkins technique for forecasting a process.
The time series of readings appears to vary about a fixed level of around 80, and the autocorrelations die out rapidly toward zero, so the time series seems to be stationary.
The first sample ACF coefficient is significantly different from zero. The autocorrelation at lag 2 is close to significant and opposite in sign from the lag 1 autocorrelation. The remaining autocorrelations are small. This suggests either an AR(1) model or an MA(2) model. The first PACF coefficient is significantly different from zero, but none of the other partial autocorrelations approaches significance. This suggests an AR(1) model, i.e. ARIMA(1,0,0).
[Figure: sample ACF and PACF of Readings, lags 1-18]
MA(2) = ARIMA(0,0,2)
Both models appear to fit the data well. The estimated coefficients are significantly different from zero and the mean square (MS) errors are similar.
Finally, there is no significant residual autocorrelation for the ARIMA(1,0,0) model. The results for the ARIMA(0,0,2) are similar.
[Figure: ACF of residuals from the ARIMA(1,0,0) model, lags 1-18]
Therefore, either model is adequate, and the two provide nearly the same three-step-ahead forecasts. Since the AR(1) model has two parameters (including the constant term) and the MA(2) model has three parameters, applying the principle of parsimony we would use the simpler AR(1) model to forecast future readings.