
Análisis Bayesiano de Series Temporales

BASIC DEFINITIONS AND INFERENCE

Raquel Prado
Universidad de California, Santa Cruz

July 2013

Definitions

Applications and Objectives
Univariate time series analysis

Modeling and inference: describing the latent structure of a single series.

[Figure: an EEG time series, voltage plotted against time.]

Multivariate time series analysis

What if we have multiple time series, or a time series vector,
y_t = (y_{1,t}, ..., y_{k,t})', at each time t?

For instance, the electroencephalogram (EEG) time series just displayed is one of several EEGs recorded at different locations over the scalp of a patient. We are interested in discovering common features across the multiple EEG signals.

We want to describe the latent components of y_t in terms of ...


Univariate and multivariate time series analysis

Forecasting. Example: annual per capita gross domestic product (GDP).

[Figure: annual per capita GDP series, roughly 1950-1980, for Canada, France, Austria, Greece, Italy, Germany, Sweden, the UK, and the USA.]

Online monitoring and control

Monitoring a time series to detect possible changes in real time.

Example: TAR(1)

y_t = φ^(1) y_{t-1} + ε_t^(1),  if θ + y_{t-d} > 0   (M_1)
y_t = φ^(2) y_{t-1} + ε_t^(2),  if θ + y_{t-d} ≤ 0   (M_2),     (1)

where ε_t^(1) ~ N(0, v_1) and ε_t^(2) ~ N(0, v_2).

y_{1:T} and δ_{1:T}

[Figure: (a) a simulated TAR(1) series y_{1:T}; (b) the corresponding regime indicator δ_{1:T} over time.]

Here φ^(1) = 0.9, φ^(2) = 0.3, d = 1, θ = 1.5, and v_1 = v_2 = 1. Also, δ_t = 1 if y_t ~ M_1 and δ_t = 2 if y_t ~ M_2.

Goals of time series analysis in the example

If transitions between modes occur in response to a controller's action, δ_{1:T} is known, and so we can:

- infer the parameters of the stochastic models that control the settings, i.e., infer φ^(i) and v_i, and
- learn about the transition rule, i.e., infer θ and d.

If transitions do not occur in response to a controller's action, we need to make inferences on δ_{1:T} as well.
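The TAR(1) mechanism in equation (1) can be sketched in code. This is a minimal simulation, not from the slides: it uses the slide's parameter values (φ^(1) = 0.9, φ^(2) = 0.3, d = 1, θ = 1.5, v_1 = v_2 = 1), with a seed and sample size chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_tar1(T, phi1=0.9, phi2=0.3, d=1, theta=1.5, v1=1.0, v2=1.0):
    """Simulate the two-regime TAR(1) of equation (1); assumes d >= 1."""
    y = np.zeros(T)
    delta = np.zeros(T, dtype=int)   # regime indicator: 1 (M1) or 2 (M2)
    for t in range(d, T):
        if theta + y[t - d] > 0:     # mode M1
            delta[t] = 1
            y[t] = phi1 * y[t - 1] + rng.normal(0, np.sqrt(v1))
        else:                        # mode M2
            delta[t] = 2
            y[t] = phi2 * y[t - 1] + rng.normal(0, np.sqrt(v2))
    return y, delta

y, delta = simulate_tar1(1500)
```

When δ_{1:T} is recorded (the "controller's action" case), the two regimes can be fitted separately by conditioning on delta; otherwise δ_{1:T} must be inferred jointly with the parameters.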


Other goals

- Clustering.
- Time series models as components of models with additional structure: e.g., regression models, spatio-temporal models, etc.
- Tracking and online learning.

Stationarity

Definition
{y_t, t ∈ T} is completely or strongly stationary if, for any sequence of times t_1, ..., t_n and any lag h, the distribution of (y_{t_1}, ..., y_{t_n})' is identical to the distribution of (y_{t_1+h}, ..., y_{t_n+h})'.

Definition
{y_t, t ∈ T} is weakly or second-order stationary if, for any sequence t_1, ..., t_n and any lag h, the first and second joint moments of (y_{t_1}, ..., y_{t_n})' and those of (y_{t_1+h}, ..., y_{t_n+h})' exist and are identical.

Complete stationarity implies second-order stationarity, but the converse is not necessarily true.

Second-order stationarity

Gaussian time series processes: strong and weak stationarity are equivalent.

If {y_t} is weakly stationary, then E(y_t) = μ, V(y_t) = v, and Cov(y_t, y_s) = γ(s - t).

[Figure: a simulated time series and two shorter segments of it, illustrating second-order stationary behavior.]


The ACF

The autocorrelation function (ACF)

Definition
The autocovariance of {y_t} is defined as
γ(s, t) = Cov(y_t, y_s) = E{(y_t - μ_t)(y_s - μ_s)}.

If {y_t} is stationary we can write γ(h) = Cov(y_t, y_{t-h}).

Definition
The autocorrelation function (ACF) is given by
ρ(s, t) = γ(s, t) / [γ(s, s) γ(t, t)]^{1/2}.

The sample autocorrelation function

Definition
The sample autocovariance is given by
γ̂(h) = (1/T) Σ_{t=1}^{T-h} (y_t - ȳ)(y_{t+h} - ȳ),
where ȳ = Σ_{t=1}^T y_t / T is the sample mean.

Definition
The sample autocorrelation is given by ρ̂(h) = γ̂(h)/γ̂(0).

For stationary processes we can write ρ(h) = γ(h)/γ(0).
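The sample autocovariance and autocorrelation above translate directly into code. A minimal NumPy sketch (the white-noise data and seed are illustrative only):

```python
import numpy as np

def sample_acov(y, h):
    """gamma_hat(h) = (1/T) * sum_{t=1}^{T-h} (y_t - ybar)(y_{t+h} - ybar)."""
    T = len(y)
    ybar = y.mean()
    return np.sum((y[:T - h] - ybar) * (y[h:] - ybar)) / T

def sample_acf(y, max_lag):
    """rho_hat(h) = gamma_hat(h) / gamma_hat(0), for h = 0, ..., max_lag."""
    g0 = sample_acov(y, 0)
    return np.array([sample_acov(y, h) / g0 for h in range(max_lag + 1)])

rng = np.random.default_rng(1)
y = rng.normal(size=2000)     # white noise: sample ACF near 0 for h > 0
rho = sample_acf(y, 10)
```

Note the divisor is T (not T - h), matching the definition on the slide; this makes the implied sample autocovariance matrix nonnegative definite.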

The ACF: Examples

White noise. Let y_t ~ N(0, v) for all t, with Cov(y_t, y_s) = 0 if t ≠ s. Then γ(0) = v, γ(h) = 0 for all h ≠ 0, and so ρ(0) = 1 and ρ(h) = 0 for all h ≠ 0.

AR(1). Let y_t = φ y_{t-1} + ε_t, ε_t ~ N(0, v). Then

γ(0) = v / (1 - φ²),   γ(h) = φ^{|h|} γ(0).

[Figure: theoretical ACFs of AR(1) processes with φ = 0.9, φ = -0.9, and φ = 0.3, for lags h = 0, ..., 50.]


Sample ACF of AR(1)

[Figure: sample ACFs of simulated AR(1) series for lags up to 20: (a) φ = 0.9, (b) φ = 0.3, (c) φ = -0.9.]

ML and Bayesian Inference

Bayes' theorem: Independent Observations

p(θ|y_{1:T}) = p(y_{1:T}|θ) p(θ) / p(y_{1:T}),

with likelihood p(y_{1:T}|θ), prior p(θ), and predictive

p(y_{1:T}) = ∫ p(y_{1:T}|θ) p(θ) dθ.

Alternatively, we can write:

p(θ|y_{1:T}) ∝ p(y_T|y_{1:(T-1)}, θ) p(θ|y_{1:(T-1)}) = p(y_T|θ) p(θ|y_{1:(T-1)}),

where p(y_T|θ) is the likelihood and p(θ|y_{1:(T-1)}) acts as the prior.

Bayes' theorem: Dependence on (t - 1)

Example
AR(1): y_t = φ y_{t-1} + ε_t, ε_t ~ N(0, v), and so θ = (φ, v).

p(θ|y_{1:T}) ∝ p(θ) p(y_1|θ) Π_{t=2}^T p(y_t|y_{t-1}, θ),

with prior p(θ) and likelihood p(y_1|θ) Π_{t=2}^T p(y_t|y_{t-1}, θ).

- Conditional likelihood: p(y_t|y_{t-1}, θ) = N(y_t|φ y_{t-1}, v);
- p(y_1|θ) = N(0, v/(1 - φ²)).

Then,

p(θ|y_{1:T}) ∝ p(θ) (1 - φ²)^{1/2} / (2πv)^{T/2} exp(-Q*(φ)/(2v)),

with

Q*(φ) = y_1²(1 - φ²) + Σ_{t=2}^T (y_t - φ y_{t-1})²,

where the sum in the last expression is denoted Q(φ).
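The full log-likelihood implied by Q*(φ) is easy to evaluate on a grid of φ values. A sketch, with v held at its true value purely for illustration (data simulated with an arbitrary seed; none of these numbers come from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
T, phi_true, v = 500, 0.9, 1.0
y = np.zeros(T)
y[0] = rng.normal(0, np.sqrt(v / (1 - phi_true**2)))  # stationary initial value
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal(0, np.sqrt(v))

def full_loglik(phi, y, v):
    """log of (1-phi^2)^{1/2} (2 pi v)^{-T/2} exp(-Q*(phi)/(2v))."""
    Qstar = y[0]**2 * (1 - phi**2) + np.sum((y[1:] - phi * y[:-1])**2)
    return (0.5 * np.log(1 - phi**2)
            - 0.5 * len(y) * np.log(2 * np.pi * v)
            - Qstar / (2 * v))

grid = np.linspace(-0.99, 0.99, 199)
ll = np.array([full_loglik(p, y, v) for p in grid])
phi_best = grid[np.argmax(ll)]
```

Multiplying exp(ll) by a prior p(φ) on the same grid gives the unnormalized posterior of φ for fixed v.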


Example
AR(1), continued: We can also use the conditional likelihood p(y_{2:T}|θ, y_1) as an approximation to the full likelihood and obtain the posterior

p(θ|y_{1:T}) ∝ p(θ) v^{-(T-1)/2} exp(-Q(φ)/(2v)).

Estimation

- Maximum likelihood estimation (MLE): find θ̂ = θ_MLE that maximizes p(y_{1:T}|θ).
- Maximum a posteriori estimation (MAP): find θ̂ = θ_MAP that maximizes p(θ|y_{1:T}).
- Least squares estimation (LSE): write the model as
  y = F'β + ε,  ε ~ N(0, v I),
  with dim(y) = n and dim(β) = p, so that
  p(y|F, β, v) = (2πv)^{-n/2} exp(-Q(y, β)/(2v)),
  and find β̂ that minimizes Q(y, β).

Bayesian Estimation

Consider again the model y = F'β + ε, with ε ~ N(0, v I). The posterior density is given by

p(β, v|y) ∝ p(β, v) p(y|β, v) ∝ p(β, v) (2πv)^{-n/2} exp(-Q(y, β)/(2v)),

where

Q(y, β) = (y - F'β)'(y - F'β) = (β - β̂)'(FF')(β - β̂) + R,

with β̂ = (FF')^{-1} F y and R = (y - F'β̂)'(y - F'β̂).

- The MLE of β is β̂.
- The MLE of v is R/n; however, s² = R/(n - p) is used instead.

Reference prior: p(β, v) ∝ 1/v

- p(β|y, F) is Student-t with n - p degrees of freedom, mode β̂, and density
  p(β|y, F) ∝ |FF'|^{1/2} { 1 + (β - β̂)' FF' (β - β̂)/((n - p) s²) }^{-n/2}.
  When n → ∞, p(β|y, F) ≈ N(β|β̂, s²(FF')^{-1}).
- p(v|y) = IG((n - p)/2, (n - p)s²/2).
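The reference-analysis summaries above (β̂, R, s², and the Student-t scale) reduce to a few linear-algebra lines. A sketch on simulated data (design, true coefficients, and seed are illustrative assumptions, not from the slides; note the slides' F is p × n, so the usual design matrix is X = F'):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # X = F'
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(0, 0.5, size=n)

FFt = X.T @ X                              # FF'
beta_hat = np.linalg.solve(FFt, X.T @ y)   # (FF')^{-1} F y: MLE and posterior mode
R = np.sum((y - X @ beta_hat)**2)          # residual sum of squares
s2 = R / (n - p)                           # s^2 = R/(n - p)
cov_beta = s2 * np.linalg.inv(FFt)         # scale of the Student-t posterior
```

Draws from the joint reference posterior would follow by sampling v from IG((n-p)/2, (n-p)s²/2) and then β from N(β̂, v (FF')^{-1}).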


Bayesian Estimation (Conjugate prior)

Conjugate prior:

p(β, v) = p(β|v) p(v) = N(β|m_0, v C_0) IG(v|n_0/2, d_0/2)

- p(β, v|F, y) ∝ v^{-[(p+n+n_0)/2+1]} exp( -[(β - m_0)' C_0^{-1} (β - m_0) + (y - F'β)'(y - F'β) + d_0] / (2v) ).
- (v|F, y) ~ IG(n*/2, d*/2), with n* = n + n_0 and d* = e' Q^{-1} e + d_0, where e = (y - F'm_0) and Q = (F' C_0 F + I_n).
- (y|F, v) ~ N(F'm_0, v(F' C_0 F + I_n)).
- (β|F, y, v) ~ N(m, v C), with
  m = m_0 + C_0 F [F' C_0 F + I_n]^{-1} (y - F'm_0),
  C = C_0 - C_0 F [F' C_0 F + I_n]^{-1} F' C_0.
- (β|y_{1:n}, F) ~ T_{n*}[m, d* C/n*].

Estimation

Example
ML, MAP, and LS estimators for the AR(1) model: y_t = φ y_{t-1} + ε_t, with ε_t ~ N(0, 1). In this case θ = φ.

- The conditional MLE is found by maximizing exp{-Q(φ)/2} (or by minimizing Q(φ)). Therefore,
  φ̂ = φ_ML = Σ_{t=2}^n y_t y_{t-1} / Σ_{t=2}^n y_{t-1}².
- The MLE of the unconditional likelihood is obtained by maximizing p(y_{1:n}|φ), or by minimizing
  -0.5 [log(1 - φ²) - Q*(φ)].
  We need methods such as Newton-Raphson or scoring to find φ_ML.

[Figure: AR(1) conditional and unconditional likelihoods for simulated data with φ = 0.9; the corresponding MLEs are 0.9069 and 0.8979.]
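The contrast between the two MLEs can be sketched numerically: the conditional MLE is available in closed form, while the unconditional one needs an iterative method such as Newton-Raphson, here with numerical derivatives and v = 1. The data, seed, and step size h are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
T, phi_true = 500, 0.9
y = np.zeros(T)
y[0] = rng.normal(0, np.sqrt(1 / (1 - phi_true**2)))
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal()

# Conditional MLE: phi_hat = sum y_t y_{t-1} / sum y_{t-1}^2
phi_cond = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1]**2)

def loglik(phi):
    """Full log-likelihood with v = 1: 0.5 log(1 - phi^2) - Q*(phi)/2."""
    Qstar = y[0]**2 * (1 - phi**2) + np.sum((y[1:] - phi * y[:-1])**2)
    return 0.5 * np.log(1 - phi**2) - 0.5 * Qstar

phi = phi_cond               # start Newton-Raphson at the conditional MLE
h = 1e-4                     # finite-difference step (tuning choice)
for _ in range(20):
    g = (loglik(phi + h) - loglik(phi - h)) / (2 * h)      # gradient
    H = (loglik(phi + h) - 2 * loglik(phi) + loglik(phi - h)) / h**2  # Hessian
    phi = float(np.clip(phi - g / H, -0.999, 0.999))       # stay in (-1, 1)
phi_full = phi
```

For data like these the two estimates differ only slightly, mirroring the closeness of the two MLE values quoted for the figure.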


Bayesian Estimation (Conjugate Analysis)

[Figure: AR(1) conditional and unconditional posteriors under the priors φ ~ N(0, c), with c = 1 and c = 0.01.]

Reference analysis in the AR(1) model

- For the conditional likelihood, φ_ML = Σ_{t=2}^n y_t y_{t-1} / Σ_{t=1}^{n-1} y_t².
- Under the reference prior, φ_MAP = φ_ML. Also,
  R = Σ_{t=2}^n y_t² - (Σ_{t=2}^n y_t y_{t-1})² / Σ_{t=1}^{n-1} y_t²,
  and so s² = R/(n - 2) estimates v.
- Marginal posterior for φ: Student-t with n - 2 degrees of freedom, centered at φ_ML with scale s²(FF')^{-1}.
- Marginal posterior for v: Inv-χ²(v|n - 2, s²) or, equivalently, IG(v|(n - 2)/2, (n - 2)s²/2).

AR(1) reference analysis

[Figure: AR(1) reference analysis for 500 simulated observations with φ = 0.9 and v = 100; histograms summarizing the posteriors of (a) φ and (b) v.]

Bayesian Estimation: Non-Conjugate Analysis

AR(1) with full likelihood: the prior p(φ, v) ∝ 1/v does not lead to a closed-form posterior distribution when the full likelihood is used. We obtain

p(φ, v|y_{1:n}) ∝ v^{-(n/2+1)} (1 - φ²)^{1/2} exp(-Q*(φ)/(2v)).

How can we summarize posterior inference in this case?

Via simulation-based methods such as Markov chain Monte Carlo...


MCMC: The Metropolis-Hastings Algorithm

Creates a sequence of random draws θ^(1), θ^(2), ..., whose distributions converge to the target distribution, p(θ|y_{1:n}).

1. Draw θ^(0) with p(θ^(0)|y_{1:n}) > 0 from p_0(θ).
2. For m = 1, 2, ..., (until convergence):
   (a) Sample θ* ~ J(θ*|θ^(m-1)).
   (b) Compute the importance ratio
       r = [p(θ*|y_{1:n}) / J(θ*|θ^(m-1))] / [p(θ^(m-1)|y_{1:n}) / J(θ^(m-1)|θ*)].
   (c) Set θ^(m) = θ* with probability min(r, 1), and θ^(m) = θ^(m-1) otherwise.

MCMC: AR(1) case

MCMC for AR(1) with full likelihood:

- Sample v^(m) from (v|φ, y_{1:n}) ~ IG(n/2, Q*(φ)/2) (Gibbs step; every draw will be accepted).
- Sample φ* ~ N(φ^(m-1), c).
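The two steps just listed can be sketched as a Metropolis-within-Gibbs sampler for the full-likelihood AR(1) posterior under p(φ, v) ∝ 1/v. The data, seed, chain length, and proposal standard deviation c are illustrative choices, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(5)
T, phi_true, v_true = 500, 0.9, 1.0
y = np.zeros(T)
y[0] = rng.normal(0, np.sqrt(v_true / (1 - phi_true**2)))
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal(0, np.sqrt(v_true))

def Qstar(phi):
    return y[0]**2 * (1 - phi**2) + np.sum((y[1:] - phi * y[:-1])**2)

def log_target_phi(phi, v):
    """log p(phi | v, y) up to a constant: (1-phi^2)^{1/2} exp(-Q*(phi)/2v)."""
    if abs(phi) >= 1:
        return -np.inf
    return 0.5 * np.log(1 - phi**2) - Qstar(phi) / (2 * v)

n_iter, c = 2000, 0.05          # c: random-walk proposal sd (tuning choice)
phi, v = 0.0, 1.0
draws = np.empty((n_iter, 2))
for m in range(n_iter):
    # Gibbs step: v | phi, y ~ IG(n/2, Q*(phi)/2); every draw is accepted
    v = 1.0 / rng.gamma(T / 2, 2.0 / Qstar(phi))
    # Metropolis step for phi with symmetric proposal N(phi, c^2)
    phi_prop = rng.normal(phi, c)
    if np.log(rng.uniform()) < log_target_phi(phi_prop, v) - log_target_phi(phi, v):
        phi = phi_prop
    draws[m] = phi, v

phi_mean = draws[500:, 0].mean()   # posterior mean after discarding burn-in
v_mean = draws[500:, 1].mean()
```

Because the proposal is symmetric, the ratio r reduces to a ratio of target densities; the tuning constant c trades off acceptance rate against step size.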

MCMC: AR(1) example

[Figure: MCMC output for the AR(1) model: trace plots of (a) φ and (b) v across iterations, and histograms of the posterior draws of (c) φ and (d) v.]

Time Domain Models

Autoregressions

AR(p) Models

An autoregression of order p, or AR(p), has the form

y_t = Σ_{j=1}^p φ_j y_{t-j} + ε_t,

where ε_t is a sequence of uncorrelated error terms, typically assumed Gaussian, i.e., ε_t ~ N(0, v).

Under Gaussianity, if y = (y_T, y_{T-1}, ..., y_{p+1})', we have

p(y|y_{1:p}) = Π_{t=p+1}^T p(y_t|y_{(t-p):(t-1)}) = Π_{t=p+1}^T N(y_t|f_t'φ, v) = N(y|F'φ, v I_n),

with φ = (φ_1, ..., φ_p)', f_t = (y_{t-1}, ..., y_{t-p})', and F = [f_T, ..., f_{p+1}].


AR Models: Causality and Stationarity

Definition
An AR(p) process y_t is causal if it can be written as

y_t = Ψ(B) ε_t = Σ_{j=0}^∞ ψ_j ε_{t-j},

with B the backshift operator (B ε_t = ε_{t-1}), ψ_0 = 1, and Σ_{j=0}^∞ |ψ_j| < ∞.

Definition
The AR characteristic polynomial is defined as

Φ(u) = 1 - Σ_{j=1}^p φ_j u^j.

y_t is causal only when Φ(u) has all its roots outside the unit circle (or, equivalently, all reciprocal roots inside the unit circle). In other words, y_t is causal if Φ(u) = 0 only when |u| > 1.

Causality ⇒ stationarity.
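The root condition above is a one-liner to check numerically: form the coefficients of Φ(u) and test whether every root has modulus greater than one. A sketch (the example coefficient vectors are illustrative):

```python
import numpy as np

def is_causal(phi):
    """True if Phi(u) = 1 - phi_1 u - ... - phi_p u^p has all roots with |u| > 1."""
    # np.roots expects coefficients from the highest power down:
    # -phi_p, ..., -phi_1, 1
    coeffs = np.r_[-np.asarray(phi, dtype=float)[::-1], 1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))
```

For an AR(1), this reproduces the familiar condition |φ| < 1: the single root of 1 - φu is 1/φ.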

AR Models: State-space representation

y_t ~ AR(p) can be written as

y_t = F' x_t,
x_t = G x_{t-1} + ω_t,

with x_t = (y_t, y_{t-1}, ..., y_{t-p+1})', ω_t = (ε_t, 0, ..., 0)', F = (1, 0, ..., 0)', and

G = [ φ_1  φ_2  φ_3  ...  φ_{p-1}  φ_p
       1    0    0   ...    0       0
       0    1    0   ...    0       0
      ...
       0    0    0   ...    1       0 ].

- The eigenvalues of the matrix G, α_1, ..., α_p, are the reciprocal roots of the AR characteristic polynomial, so the polynomial can be written as Φ(u) = Π_{j=1}^p (1 - α_j u).
- The expected behavior of the process in the future is given by

  f_t(h) = E(y_{t+h}|y_{1:t}) = F' G^h x_t = Σ_{j=1}^p c_{t,j} α_j^h,

  with c_{t,j} = d_j e_{t,j}, where d_j and e_{t,j} are elements of d = E'F and e_t = E^{-1} x_t, and E is an eigenmatrix of G.
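A small sketch of this representation for an AR(2): build the companion matrix G, verify that its eigenvalues are reciprocal roots of Φ(u), and compute f_t(h) = F' G^h x_t. The coefficients and current state are illustrative assumptions.

```python
import numpy as np

phi = np.array([0.9, -0.2])            # illustrative AR(2) coefficients
p = len(phi)
G = np.zeros((p, p))
G[0, :] = phi
G[1:, :-1] = np.eye(p - 1)             # companion (state transition) matrix
F = np.zeros(p); F[0] = 1.0

alphas = np.linalg.eigvals(G)          # reciprocal roots of Phi(u)
# Check: Phi(1/alpha) = 1 - phi_1/alpha - phi_2/alpha^2 = 0 for each eigenvalue
check = [1 - phi[0] / a - phi[1] / a**2 for a in alphas]

x_t = np.array([1.0, 0.5])             # current state (y_t, y_{t-1})'
f = [F @ np.linalg.matrix_power(G, h) @ x_t for h in range(1, 11)]
```

Here both reciprocal roots are real with modulus below one, so the forecast function decays to zero as h grows, as described below.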


AR Models: Forecast function

- If y_t is such that |α_j| < 1 for all j, then f_t(h) → 0 as h increases.
- If α_j is real, its contribution to the forecast function is c_{t,j} α_j^h.
- If α_j and α_{j+1} are complex conjugates, then c_{t,j} and c_{t,j+1} are also complex conjugates that can be written as a_t exp(∓ i b_t). The contribution of this pair of complex reciprocal roots to f_t(h) is 2 a_t r^h cos(ωh + b_t), with r and ω the modulus and frequency of α_j and α_{j+1}.

AR Models: ACF

The autocorrelation structure of an AR(p) process is given in terms of the solution of the homogeneous difference equation

ρ(h) - φ_1 ρ(h-1) - ... - φ_p ρ(h-p) = 0,  h ≥ p.

If α_1, ..., α_r are the reciprocal characteristic roots with multiplicities m_1, ..., m_r and Σ_{i=1}^r m_i = p, the general solution of the equation is

ρ(h) = α_1^h p_1(h) + ... + α_r^h p_r(h),  h ≥ p,

where p_i(h) is a polynomial of degree m_i - 1.

AR Models: Forecast function and ACF of an AR(2)

Let y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + ε_t, and let α_1, α_2 be the reciprocal roots of Φ(u) = 1 - φ_1 u - φ_2 u².

Forecast function:

- α_1, α_2 real and distinct: f_t(h) = c_{t,1} α_1^h + c_{t,2} α_2^h.
- α_1 = α_2 = α real: f_t(h) = p(h) α^h, with p(h) a polynomial of degree one, i.e., p(h) = d + e h.
- α_1, α_2 complex conjugates: f_t(h) = 2 a_t r^h cos(ωh + b_t).

Autocorrelation function:

- α_1, α_2 real and distinct: ρ(h) = a α_1^h + b α_2^h, h ≥ 2.
- α_1 = α_2 = α real: ρ(h) = (a + b h) α^h, h ≥ 2.
- α_1, α_2 complex conjugates: ρ(h) = a r^h cos(hω + b), h ≥ 2.


AR Models: PACF

Let φ(h, h) be the partial autocorrelation coefficient at lag h, given by

φ(1, 1) = ρ(y_1, y_0) = ρ(1),
φ(h, h) = ρ(y_h - y_h^{h-1}, y_0 - y_0^{h-1}),  h > 1,

with y_h^{h-1} the minimum mean square linear predictor of y_h given y_{h-1}, ..., y_1, and y_0^{h-1} the minimum mean square linear predictor of y_0 given y_1, ..., y_{h-1}.

Result: if y_t ~ AR(p), then φ(h, h) = 0 for all h > p.

AR Models: Computing the PACF

- Solve Γ_n φ_n = γ_n, with Γ_n an n × n matrix with elements {γ(h - j)}, γ_n = (γ(1), ..., γ(n))', and φ_n = (φ(n, 1), ..., φ(n, n))'.
- Durbin-Levinson recursion: φ(0, 0) = 0 and, for n ≥ 1,

  φ(n, n) = [ρ(n) - Σ_{h=1}^{n-1} φ(n-1, h) ρ(n-h)] / [1 - Σ_{h=1}^{n-1} φ(n-1, h) ρ(h)],

  with

  φ(n, h) = φ(n-1, h) - φ(n, n) φ(n-1, n-h),

  for n ≥ 2 and h = 1 : (n-1).

The sample PACF can also be computed using these algorithms.
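The Durbin-Levinson recursion above can be sketched directly; feeding it the theoretical ACF of an AR(1), ρ(h) = φ^h, should give a PACF that is φ at lag 1 and zero afterwards, matching the result that φ(h, h) = 0 for h > p.

```python
import numpy as np

def pacf_from_acf(rho, max_lag):
    """Durbin-Levinson: phi(n, n) from autocorrelations rho[0..max_lag], rho[0] = 1."""
    pacf = np.zeros(max_lag + 1)
    prev = np.zeros(max_lag + 1)          # holds phi(n-1, h)
    for n in range(1, max_lag + 1):
        num = rho[n] - np.sum(prev[1:n] * rho[n - 1:0:-1])
        den = 1.0 - np.sum(prev[1:n] * rho[1:n])
        phin = num / den
        curr = prev.copy()
        curr[n] = phin
        for h in range(1, n):             # phi(n, h) = phi(n-1, h) - phin * phi(n-1, n-h)
            curr[h] = prev[h] - phin * prev[n - h]
        pacf[n] = phin
        prev = curr
    return pacf

rho = 0.7 ** np.arange(11)                # theoretical ACF of AR(1), phi = 0.7
pacf = pacf_from_acf(rho, 10)
```

Replacing rho with sample autocorrelations yields the sample PACF, as the slide notes.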

AR Models: Yule-Walker Estimation

Solve Γ_p φ = γ_p, i.e.,

φ̂ = Γ_p^{-1} γ_p,  v̂ = γ(0) - γ_p' Γ_p^{-1} γ_p.

It can be shown that, approximately,

√T (φ̂ - φ) ~ N(0, v Γ_p^{-1}),

and that v̂ is close to v when T is large.

AR Models: MLE and Bayesian estimation

- MLE: find θ̂ that maximizes

  p(y|φ, v, y_{1:p}) = Π_{t=p+1}^T p(y_t|φ, v, y_{(t-p):(t-1)}) = Π_{t=p+1}^T N(y_t|f_t'φ, v) = N(y|F'φ, v I_n).

- Bayesian: combine p(y|φ, v, y_{1:p}) with a prior p(φ, v):
  - reference prior p(φ, v) ∝ 1/v;
  - conjugate prior p(φ|v) = N(φ|m_0, v C_0) and p(v) = IG(n_0/2, d_0/2);
  - non-conjugate priors.
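Yule-Walker estimation amounts to plugging sample autocovariances into the Toeplitz system above. A sketch on simulated AR(1) data (series length, coefficient, and seed are illustrative):

```python
import numpy as np

def yule_walker(y, p):
    """Solve Gamma_p phi = gamma_p with sample autocovariances."""
    T = len(y)
    yc = y - y.mean()
    g = np.array([np.sum(yc[:T - h] * yc[h:]) / T for h in range(p + 1)])
    Gamma = np.array([[g[abs(i - j)] for j in range(p)] for i in range(p)])
    gamma_p = g[1:p + 1]
    phi_hat = np.linalg.solve(Gamma, gamma_p)
    v_hat = g[0] - gamma_p @ phi_hat      # v_hat = gamma(0) - gamma_p' phi_hat
    return phi_hat, v_hat

rng = np.random.default_rng(6)
T, phi_true = 3000, 0.8
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal()

phi_hat, v_hat = yule_walker(y, 1)
```

Consistent with the asymptotic result on the slide, both estimates sit close to the true values for a series this long.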


AR Models: EEG data analysis

Posterior mean from an AR(8) reference analysis (n = 392):

φ̂ = (0.27, 0.07, 0.13, 0.15, 0.11, 0.15, 0.23, 0.14)'

and s = 61.52. These estimates lead to the following estimates of the (modulus, wavelength) pairs of the reciprocal characteristic roots:

(0.97, 12.73); (0.81, 5.10); (0.72, 2.99); (0.66, 2.23).

[Figure: the EEG series (voltage in mcv against time), together with a future sample and the forecast function implied by the fitted AR(8) model.]

AR Models: Model Order Assessment

Choose a value p* and, for all p ≤ p*, compute:

- Akaike's Information Criterion (AIC):
  2p + n log(s_p²).
- Bayesian Information Criterion (BIC):
  log(n) p + n log(s_p²).
- Marginal likelihood:
  p(y_{(p*+1):T}|y_{1:p*}, p) = ∫∫ p(y_{(p*+1):T}|φ_p, v, y_{1:p*}) p(φ_p, v) dφ_p dv.

Here n = T - p*.

Order assessment in the EEG example: take p* = 25 and n = 400 - p*.

[Figure: log-likelihood-based criteria against AR order p = 1, ..., 25 for the EEG data, with curves marked "a" (AIC), "b" (BIC), and "m" (marginal likelihood).]
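A sketch of AIC/BIC order assessment as defined above, fitting each candidate order by conditional least squares on the same n = T - p* observations so the criteria are comparable across p. The simulated AR(2) data, p*, and seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
T = 600
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.75 * y[t - 1] - 0.5 * y[t - 2] + rng.normal()

p_star = 10
n = T - p_star                      # same effective sample size for all orders
aic, bic = {}, {}
for p in range(1, p_star + 1):
    # Design rows f_t' = (y_{t-1}, ..., y_{t-p}) for t = p*+1, ..., T
    X = np.column_stack([y[p_star - j: T - j] for j in range(1, p + 1)])
    yy = y[p_star:]
    phi_hat, *_ = np.linalg.lstsq(X, yy, rcond=None)
    s2 = np.sum((yy - X @ phi_hat)**2) / (n - p)   # residual variance estimate
    aic[p] = 2 * p + n * np.log(s2)
    bic[p] = np.log(n) * p + n * np.log(s2)

best_aic = min(aic, key=aic.get)
best_bic = min(bic, key=bic.get)
```

With its heavier log(n) penalty, BIC tends to select the true order here, while AIC may occasionally favor a slightly larger model.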


AR Models: Initial Observations

Full likelihood:

p(y_{1:T}|φ, v) = p(y_{(p+1):T}|φ, v, y_{1:p}) p(y_{1:p}|φ, v) = p(y|φ, v, x_p) p(x_p|φ, v).

What about p(x_p|φ, v)?

- N(x_p|0, A) with A known.
- N(x_p|0, v A(φ)) with A(φ) depending on φ through the autocorrelation function, and

  p(y_{1:T}|φ, v) ∝ v^{-T/2} |A(φ)|^{-1/2} exp(-Q(y_{1:T}, φ)/(2v)),

  where

  Q(y_{1:T}, φ) = Σ_{t=p+1}^T (y_t - f_t'φ)² + x_p' A(φ)^{-1} x_p.

It can be shown (e.g., see Box, Jenkins, and Reinsel, 2008) that

Q(y_{1:T}, φ) = a - 2 b'φ + φ'Cφ,

with a, b, and C obtained from

D = [ a  b'
      b  C ],

where D is a (p + 1) × (p + 1) matrix with elements D_{ij} = Σ_{r=0}^{T+1-j-i} y_{i+r} y_{j+r}.

- If |A(φ)|^{-1/2} is ignored when computing p(y_{1:T}|φ, v), the likelihood function has the standard linear model form, and so, if p(φ, v) ∝ 1/v, we have a normal/inverse-gamma posterior with φ̂ = C^{-1} b.
- Jeffreys' prior is approximately p(φ, v) ∝ |A(φ)|^{1/2} v^{-1/2}.

AR Models: Structured non-conjugate priors

If y_t ~ AR(p), y_t is causal and stationary if all the AR reciprocal roots have moduli less than one. Huerta and West (1999) proposed priors on the reciprocal characteristic roots as follows.

- Let C be the maximum number of pairs of complex roots and R the maximum number of real roots, with p = 2C + R. Denote the complex roots as (r_j, λ_j), for j = 1 : C, and the real roots as r_j, for j = (C + 1) : (R + C).
- Prior on the real reciprocal roots:

  r_j ~ π_{r,-1} I_{(-1)}(r_j) + π_{r,0} I_0(r_j) + π_{r,1} I_1(r_j) + (1 - π_{r,0} - π_{r,1} - π_{r,-1}) g_r(r_j),

  with g_r(·) a continuous distribution on (-1, 1), e.g., g_r(·) = U(·|-1, 1).
- Prior on the complex reciprocal roots:

  r_j ~ π_{c,0} I_0(r_j) + π_{c,1} I_1(r_j) + (1 - π_{c,1} - π_{c,0}) g_c(r_j),
  λ_j ~ h(λ_j),

  with g_c(r_j) and h(λ_j) continuous distributions on 0 < r_j < 1 and 2 < λ_j < λ_u, with λ_u ≤ T/2.
- Then: Dir(π_{r,-1}, π_{r,0}, π_{r,1}|1, 1, 1) and Dir(π_{c,0}, π_{c,1}|1, 1) on the mixture weights, and IG(v|a, b).

ARMA Models

y_t follows an autoregressive moving average model, ARMA(p, q), if

y_t = Σ_{i=1}^p φ_i y_{t-i} + Σ_{j=1}^q θ_j ε_{t-j} + ε_t.

We can also write

(1 - φ_1 B - ... - φ_p B^p) y_t = (1 + θ_1 B + ... + θ_q B^q) ε_t,

i.e., Φ(B) y_t = Θ(B) ε_t. We typically assume ε_t ~ N(0, v). If q = 0, y_t ~ AR(p), and if p = 0, y_t ~ MA(q).

Definition
An MA(q) process is identifiable or invertible if the roots of the MA characteristic polynomial Θ(u) lie outside the unit circle. In this case it is possible to write the process as an infinite-order AR.

Example
Let y_t ~ MA(1) with MA coefficient θ. The process is stationary for all θ, and

ρ(h) = 1 if h = 0;  ρ(h) = θ/(1 + θ²) if h = 1;  ρ(h) = 0 otherwise.

Note that an MA process with coefficient 1/θ has the same ACF, so the identifiability condition is |1/θ| > 1.

ARMA Models: Causality

An ARMA(p, q) process is causal if the roots of Φ(u) lie outside the unit circle. In this case:

y_t = Φ(B)^{-1} Θ(B) ε_t = Ψ(B) ε_t = Σ_{j=0}^∞ ψ_j ε_{t-j},

with Φ(B) Ψ(B) = Θ(B). The ψ_j's can be found by solving the homogeneous difference equations

ψ_j - Σ_{h=1}^p φ_h ψ_{j-h} = 0,  j ≥ max(p, q + 1),

with initial conditions

ψ_j - Σ_{h=1}^j φ_h ψ_{j-h} = θ_j,  0 ≤ j < max(p, q + 1),

and ψ_0 = 1.

The general solution is given by

ψ_j = α_1^j p_1(j) + ... + α_r^j p_r(j),

where α_1, ..., α_r are the reciprocal roots of Φ(u), with multiplicities m_1, ..., m_r, respectively, and each p_i(j) is a polynomial of degree m_i - 1.
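The ψ-weight recursions above give a direct algorithm: march j upward, taking θ_j = 0 for j > q and ψ_j = 0 for j < 0. A sketch (the ARMA(1,1) coefficients are illustrative):

```python
import numpy as np

def psi_weights(phi, theta, J):
    """psi_0, ..., psi_J from psi_j = theta_j + sum_h phi_h psi_{j-h}, psi_0 = 1."""
    phi = np.asarray(phi, dtype=float)
    theta = np.asarray(theta, dtype=float)
    psi = np.zeros(J + 1)
    psi[0] = 1.0
    for j in range(1, J + 1):
        t_j = theta[j - 1] if j <= len(theta) else 0.0   # theta_j = 0 for j > q
        psi[j] = t_j + sum(phi[h - 1] * psi[j - h]
                           for h in range(1, min(j, len(phi)) + 1))
    return psi

# ARMA(1,1): psi_1 = phi + theta, then psi_j = phi * psi_{j-1} for j >= 2
psi = psi_weights([0.5], [0.4], 5)
```

With an empty theta this reduces to the AR(p) case, where ψ_j = φ^j for an AR(1).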


ARMA Models: ACF of ARMA(p, q)

The autocovariance function can be written in terms of the general homogeneous equations

γ(h) - φ_1 γ(h-1) - ... - φ_p γ(h-p) = 0,  h ≥ max(p, q + 1),

with initial conditions

γ(h) - Σ_{j=1}^p φ_j γ(h-j) = v Σ_{j=h}^q θ_j ψ_{j-h},  0 ≤ h < max(p, q + 1).

ARMA Models: ACF of MA(q)

If y_t ~ MA(q), its ACF is

ρ(h) = 1 for h = 0;
ρ(h) = [Σ_{j=0}^{q-h} θ_j θ_{j+h}] / [1 + Σ_{j=1}^q θ_j²] for h = 1 : q (with θ_0 = 1);
ρ(h) = 0 for h > q.

The PACF can also be computed. The PACF coefficients of MA and ARMA processes will never drop to zero.

ARMA Models: Inverting AR Components

Assume that y_t ~ AR(p), i.e., Φ(B) y_t = Π_{i=1}^p (1 - α_i B) y_t = ε_t. Then

Π_{i=1}^r (1 - α_i B) y_t = Π_{i=r+1}^p (1 - α_i B)^{-1} ε_t = Θ*(B) ε_t,

with Θ*(u) = 1 + Σ_{j=1}^∞ θ*_j u^j such that Θ*(u) Π_{i=r+1}^p (1 - α_i u) = 1. Truncating the expansion at q terms,

y_t ≈ Σ_{j=1}^r φ*_j y_{t-j} + ε_t + Σ_{j=1}^q θ*_j ε_{t-j},

where Φ*(u) = Π_{i=1}^r (1 - α_i u).

Algorithm
1. Initialize the algorithm by setting θ*_i = 0 for all i = 1 : q.
2. For i = (r + 1) : p, update θ*_1 = θ*_1 + α_i, and then, for j = 2 : q, update θ*_j = θ*_j + α_i θ*_{j-1}.

EEG data, AR(8): from the Bayesian reference analysis we obtained estimates of the (modulus, wavelength) pairs of the reciprocal characteristic roots given by (0.97, 12.73), (0.81, 5.10), (0.72, 2.99), and (0.66, 2.23). Retaining the dominant complex pair,

y_t ≈ φ*_1 y_{t-1} + φ*_2 y_{t-2} + ε_t + Σ_{j=1}^q θ*_j ε_{t-j},

where φ*_1 = 2 r_1 cos(2π/λ_1) and φ*_2 = -r_1². Taking q = 8 we have...
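Step 2 of the algorithm multiplies the current Θ* series by one factor (1 - α_i u)^{-1} at a time; because θ*_{j-1} is already updated when θ*_j is touched, each inner sweep implements the convolution with the geometric series 1 + α_i u + α_i² u² + .... A minimal sketch with real roots (the root values are illustrative; with complex conjugate pairs the imaginary parts cancel in the final coefficients):

```python
import numpy as np

def invert_roots(alphas, q):
    """theta*_1, ..., theta*_q of Theta*(u) = prod_i (1 - alpha_i u)^{-1}, truncated."""
    theta = np.zeros(q + 1)
    theta[0] = 1.0                        # theta*_0 = 1
    for a in alphas:                      # one factor (1 - a u)^{-1} at a time
        for j in range(1, q + 1):
            theta[j] += a * theta[j - 1]  # uses the already-updated theta_{j-1}
    return theta[1:]

# Single root a: (1 - a u)^{-1} = 1 + a u + a^2 u^2 + ...
theta = invert_roots([0.5], 4)
```

For two roots a and b the first coefficient is a + b, matching the product of the two geometric series.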


ARMA Models: Inference

[Figure: MA coefficients θ*_j against coefficient index j, obtained by inverting the AR components fitted to the EEG data.]

Note: the optimal ARMA(p, q) model for these data, among all the models with p, q ≤ 8, is an ARMA(2, 2). The MLEs for the MA coefficients are θ̂_1 = 1.37 and θ̂_2 = 0.51.

- MLE and least squares estimation. See, e.g., Shumway and Stoffer (2006).
- Inference via the state-space representation. E.g., Kohn and Ansley (1985); Harvey (1981, 1991).
- Bayesian estimation: Monahan (1983); Marriott & Smith (1992); Chib and Greenberg (1994); Box, Jenkins, and Reinsel (2008); Zellner (1996); Marriott, Ravishanker, Gelfand, and Pai (1996); Barnett, Kohn, and Sheather (1997), among others...

Other Related Models

- ARIMA: y_t ~ ARIMA(p, d, q) if
  (1 - φ_1 B - ... - φ_p B^p)(1 - B)^d y_t = (1 + θ_1 B + ... + θ_q B^q) ε_t.
- SARMA:
  (1 - Φ_1 B^s - ... - Φ_p B^{ps}) y_t = (1 + Θ_1 B^s + ... + Θ_q B^{qs}) ε_t.
- Multiplicative seasonal ARMA:
  φ_p(B) Φ_P(B^s)(1 - B)^d y_t = θ_q(B) Θ_Q(B^s) ε_t.
