
2.15. Suppose that $\{X_t,\ t = 0, \pm 1, \ldots\}$ is a stationary process satisfying the equations

$$X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + Z_t,$$

where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $Z_t$ is uncorrelated with $X_s$ for each $s < t$. Show
that the best linear predictor $P_n X_{n+1}$ of $X_{n+1}$ in terms of $1, X_1, \ldots, X_n$, assuming
$n > p$, is

$$P_n X_{n+1} = \phi_1 X_n + \cdots + \phi_p X_{n+1-p}.$$

What is the mean squared error of $P_n X_{n+1}$?
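The claimed predictor and its mean squared error can be checked numerically. The following sketch (not part of the exercise; the coefficients $\phi_1 = 0.5$, $\phi_2 = -0.3$ and $\sigma^2 = 1$ are hypothetical) simulates a causal AR(2) and confirms that the one-step prediction errors $X_{t} - (\phi_1 X_{t-1} + \phi_2 X_{t-2})$ have mean square close to $\sigma^2$:

```python
import numpy as np

# Sanity check (not from the text): for a causal AR(2), the one-step
# predictor phi1*X[t-1] + phi2*X[t-2] leaves residual Z_t, so its
# empirical MSE should be close to sigma^2.
rng = np.random.default_rng(0)
phi1, phi2, sigma2 = 0.5, -0.3, 1.0   # hypothetical parameter values
n = 200_000
z = rng.normal(0.0, np.sqrt(sigma2), n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + z[t]

burn = 1000                            # discard start-up transient
pred = phi1 * x[burn - 1:-1] + phi2 * x[burn - 2:-2]
mse = np.mean((x[burn:] - pred) ** 2)
print(round(mse, 2))                   # should be close to sigma^2 = 1.0
```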

2.16. Use the program ITSM to plot the sample ACF and PACF up to lag 40 of the
sunspot series $D_t,\ t = 1, \ldots, 100$, contained in the ITSM file SUNSPOTS.TSM.
(Open the project SUNSPOTS.TSM and click on the second yellow button at
the top of the screen to see the graphs. Repeated clicking on this button will
toggle between graphs of the sample ACF, sample PACF, and both. To see the
numerical values, right-click on the graph and select Info.) Fit an AR(2) model
to the mean-corrected data by selecting Model>Estimation>Preliminary
and click Yes to subtract the sample mean from the data. In the dialog box that
follows, enter 2 for the AR order and make sure that the MA order is zero and
that the Yule-Walker algorithm is selected without AICC minimization. Click
OK and you will obtain a model of the form
Xt " 1 Xt1 + 2 Xt2 + Zt ,

*
+
where {Zt } WN 0, 2 ,

for the mean-corrected series Xt " Dt 46.93. Record the values of the estimated parameters 1 , 2 , and 2 . Compare the model and sample ACF and
PACF by selecting the third yellow button at the top of the screen. Print the
graphs by right-clicking and selecting Print.
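Outside ITSM, the same preliminary Yule-Walker AR(2) fit can be reproduced in a few lines. This is a sketch of the standard Yule-Walker equations, not ITSM's actual code; the function names and the demonstration coefficients $\phi = (0.6, -0.2)$ are my own:

```python
import numpy as np

# Sketch of a Yule-Walker AR(2) fit (the method the exercise selects in
# ITSM): solve Gamma * phi = gamma using sample autocovariances.
def acvf(x, h):
    """Sample autocovariance at lag h (divisor n, the usual convention)."""
    n = len(x)
    xm = x - x.mean()
    return np.dot(xm[: n - h], xm[h:]) / n

def yule_walker_ar2(x):
    g0, g1, g2 = acvf(x, 0), acvf(x, 1), acvf(x, 2)
    Gamma = np.array([[g0, g1], [g1, g0]])
    gamma = np.array([g1, g2])
    phi = np.linalg.solve(Gamma, gamma)
    sigma2 = g0 - phi @ gamma          # innovation variance estimate
    return phi, sigma2

# Demo on simulated data (hypothetical true values phi = (0.6, -0.2)):
rng = np.random.default_rng(1)
n = 100_000
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + z[t]
phi_hat, sigma2_hat = yule_walker_ar2(x)
print(np.round(phi_hat, 2), round(sigma2_hat, 2))
```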

2.18. Let $\{X_t\}$ be the stationary process defined by the equations

$$X_t = Z_t - \theta Z_{t-1}, \qquad t = 0, \pm 1, \ldots,$$

where $|\theta| < 1$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. Show that the best linear predictor
$\tilde{P}_n X_{n+1}$ of $X_{n+1}$ based on $\{X_j,\ -\infty < j \le n\}$ is

$$\tilde{P}_n X_{n+1} = -\sum_{j=1}^{\infty} \theta^j X_{n+1-j}.$$

What is the mean squared error of the predictor $\tilde{P}_n X_{n+1}$?
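A quick numerical check (not part of the exercise; $\theta = 0.6$, $\sigma^2 = 1$ are hypothetical values): truncating the infinite sum at a moderate lag $J$ and predicting one step ahead should give prediction errors with mean square close to $\sigma^2$, since the error reduces to $Z_{n+1}$.

```python
import numpy as np

# Check (assumed theta = 0.6, sigma^2 = 1): the truncated infinite-past
# predictor -sum_{j=1}^{J} theta^j X_{n+1-j} should leave error Z_{n+1},
# so the empirical one-step MSE should be close to sigma^2.
rng = np.random.default_rng(2)
theta, sigma2 = 0.6, 1.0
n, J = 20_000, 50                       # J = truncation of the infinite sum
z = rng.normal(0.0, np.sqrt(sigma2), n)
x = z.copy()
x[1:] -= theta * z[:-1]                 # X_t = Z_t - theta * Z_{t-1}

errs = []
for t in range(J, n - 1):
    pred = -sum(theta**j * x[t + 1 - j] for j in range(1, J + 1))
    errs.append(x[t + 1] - pred)
mse = np.mean(np.square(errs))
print(round(mse, 2))                    # should be close to sigma^2 = 1.0
```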

3.1. Determine which of the following ARMA processes are causal and which of
them are invertible. (In each case $\{Z_t\}$ denotes white noise.)
a. $X_t + 0.2X_{t-1} - 0.48X_{t-2} = Z_t$.
b. $X_t + 1.9X_{t-1} + 0.88X_{t-2} = Z_t + 0.2Z_{t-1} + 0.7Z_{t-2}$.
c. $X_t + 0.6X_{t-1} = Z_t + 1.2Z_{t-1}$.
d. $X_t + 1.8X_{t-1} + 0.81X_{t-2} = Z_t$.
e. $X_t + 1.6X_{t-1} = Z_t - 0.4Z_{t-1} + 0.04Z_{t-2}$.
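The causality and invertibility checks reduce to locating polynomial roots: $\phi(B)X_t = \theta(B)Z_t$ is causal iff all roots of $\phi(z)$ lie outside the unit circle, and invertible iff all roots of $\theta(z)$ do. A small helper (not from the text) that automates the check:

```python
import numpy as np

# Helper (not in the text): coefficients are given in the order
# [1, c1, c2, ...] for the polynomial 1 + c1*z + c2*z^2 + ...
def roots_outside_unit_circle(coeffs):
    # np.roots expects the highest-degree coefficient first
    roots = np.roots(coeffs[::-1])
    return bool(np.all(np.abs(roots) > 1.0))

# part (a), AR side: phi(z) = 1 + 0.2z - 0.48z^2
causal_a = roots_outside_unit_circle(np.array([1.0, 0.2, -0.48]))
# part (d), AR side: phi(z) = 1 + 1.8z + 0.81z^2 = (1 + 0.9z)^2
causal_d = roots_outside_unit_circle(np.array([1.0, 1.8, 0.81]))
# part (c), MA side: theta(z) = 1 + 1.2z has root -1/1.2 inside the circle
invertible_c = roots_outside_unit_circle(np.array([1.0, 1.2]))
print(causal_a, causal_d, invertible_c)
```

For example, in part (a) $\phi(z) = (1 + 0.8z)(1 - 0.6z)$ has roots $-1.25$ and $5/3$, both outside the unit circle, so the process is causal.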

3.3. For those processes in Problem 3.1 that are causal, compute
the first six coefficients $\psi_0, \psi_1, \ldots, \psi_5$ in the causal representation $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$ of
$\{X_t\}$.
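The $\psi_j$ satisfy the standard recursion $\psi_j = \theta_j + \sum_{k=1}^{\min(j,p)} \phi_k \psi_{j-k}$ (with $\theta_0 = 1$) for the model $X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \cdots$. A sketch of that recursion (the function name is my own), applied to part (a) of Problem 3.1, where $\phi_1 = -0.2$ and $\phi_2 = 0.48$:

```python
# Sketch (standard psi-weight recursion, not ITSM output):
# psi_j = theta_j + sum_{k=1}^{min(j,p)} phi_k * psi_{j-k}, theta_0 = 1.
def psi_coeffs(phi, theta, m=6):
    psi = []
    for j in range(m):
        val = 1.0 if j == 0 else (theta[j - 1] if j - 1 < len(theta) else 0.0)
        for k in range(1, min(j, len(phi)) + 1):
            val += phi[k - 1] * psi[j - k]
        psi.append(val)
    return psi

# Problem 3.1(a): X_t + 0.2 X_{t-1} - 0.48 X_{t-2} = Z_t,
# i.e. phi_1 = -0.2, phi_2 = 0.48, no MA part.
psi_a = psi_coeffs([-0.2, 0.48], [])
print([round(p, 4) for p in psi_a])
```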

3.6. Show that the two MA(1) processes

$$X_t = Z_t + \theta Z_{t-1}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2),$$

$$Y_t = \tilde{Z}_t + \frac{1}{\theta}\tilde{Z}_{t-1}, \qquad \{\tilde{Z}_t\} \sim \mathrm{WN}(0, \sigma^2\theta^2),$$

where $0 < |\theta| < 1$, have the same autocovariance functions.


3.7. Suppose that $\{X_t\}$ is the noninvertible MA(1) process

$$X_t = Z_t + \theta Z_{t-1}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2),$$

where $|\theta| > 1$. Define a new process $\{W_t\}$ as

$$W_t = \sum_{j=0}^{\infty} (-\theta)^{-j} X_{t-j}$$

and show that $\{W_t\} \sim \mathrm{WN}(0, \sigma_W^2)$. Express $\sigma_W^2$ in terms of $\theta$ and $\sigma^2$ and show
that $\{X_t\}$ has the invertible representation (in terms of $\{W_t\}$)

$$X_t = W_t + \frac{1}{\theta} W_{t-1}.$$

3.8. Let $\{X_t\}$ denote the unique stationary solution of the autoregressive equations

$$X_t = \phi X_{t-1} + Z_t, \qquad t = 0, \pm 1, \ldots,$$

where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $|\phi| > 1$. Then $X_t$ is given by the expression
(2.2.11). Define the new sequence

$$W_t = X_t - \frac{1}{\phi} X_{t-1},$$

show that $\{W_t\} \sim \mathrm{WN}(0, \sigma_W^2)$, and express $\sigma_W^2$ in terms of $\sigma^2$ and $\phi$. These
calculations show that $\{X_t\}$ is the (unique stationary) solution of the causal AR
equations

$$X_t = \frac{1}{\phi} X_{t-1} + W_t, \qquad t = 0, \pm 1, \ldots.$$

3.9. a. Calculate the autocovariance function $\gamma(\cdot)$ of the stationary time series

$$Y_t = \mu + Z_t + \theta_1 Z_{t-1} + \theta_{12} Z_{t-12}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$

b. Use the program ITSM to compute the sample mean and sample autocovariances $\hat{\gamma}(h)$, $0 \le h \le 20$, of $\{\nabla\nabla_{12} X_t\}$, where $\{X_t,\ t = 1, \ldots, 72\}$ is the
accidental deaths series DEATHS.TSM of Example 1.1.3.
c. By equating $\hat{\gamma}(1)$, $\hat{\gamma}(11)$, and $\hat{\gamma}(12)$ from part (b) to $\gamma(1)$, $\gamma(11)$, and $\gamma(12)$,
respectively, from part (a), find a model of the form defined in (a) to represent
$\{\nabla\nabla_{12} X_t\}$.
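For part (a), $\gamma(h) = \sigma^2 \sum_j \theta_j \theta_{j+h}$ with $\theta_0 = 1$ and nonzero coefficients only at lags 0, 1, and 12, so only $\gamma(0)$, $\gamma(1)$, $\gamma(11)$, and $\gamma(12)$ are nonzero. A short sketch that evaluates this (the values $\theta_1 = 0.3$, $\theta_{12} = -0.5$, $\sigma^2 = 1$ are hypothetical, not the fitted ones from part (c)):

```python
import numpy as np

# Sketch for part (a): gamma(h) = sigma^2 * sum_j theta_j * theta_{j+h},
# coefficient vector (1, theta_1, 0, ..., 0, theta_12).
theta1, theta12, sigma2 = 0.3, -0.5, 1.0   # hypothetical values
coef = np.zeros(13)
coef[0], coef[1], coef[12] = 1.0, theta1, theta12

gamma = {h: sigma2 * np.dot(coef[: 13 - h], coef[h:]) for h in range(14)}
# only lags 0, 1, 11, and 12 survive
print({h: round(g, 3) for h, g in gamma.items() if abs(g) > 1e-12})
```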
