
14. Stochastic Processes
Introduction
Let $\xi$ denote the random outcome of an experiment. To every such outcome suppose a waveform $X(t, \xi)$ is assigned. The collection of such waveforms forms a stochastic process. The set $\{\xi_k\}$ and the time index t can be continuous or discrete (countably infinite or finite) as well.

[Fig. 14.1: an ensemble of realizations $X(t, \xi_1), X(t, \xi_2), \ldots, X(t, \xi_n)$ plotted against t, with sampling instants $t_1, t_2$ marked.]

For fixed $\xi_i \in S$ (the set of all experimental outcomes), $X(t, \xi_i)$ is a specific time function. For fixed t,
$$X_1 = X(t_1, \xi)$$
is a random variable. The ensemble of all such realizations $X(t, \xi)$ over time represents the stochastic process X(t) (see Fig. 14.1). For example,
$$X(t) = a\cos(\omega_0 t + \varphi),$$
where $\varphi$ is a uniformly distributed random variable in $(0, 2\pi)$, represents a stochastic process. Stochastic processes are everywhere: Brownian motion, stock market fluctuations, and various queuing systems all represent stochastic phenomena.
If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable. Its distribution function is given by
$$F_X(x, t) = P\{X(t) \le x\}. \qquad (14\text{-}1)$$
Notice that $F_X(x,t)$ depends on t, since for a different t we obtain a different random variable. Further,
$$f_X(x, t) \triangleq \frac{dF_X(x, t)}{dx} \qquad (14\text{-}2)$$
represents the first-order probability density function of the process X(t).
For t = t₁ and t = t₂, X(t) represents two different random variables X₁ = X(t₁) and X₂ = X(t₂), respectively. Their joint distribution is given by
$$F_X(x_1, x_2, t_1, t_2) = P\{X(t_1) \le x_1,\; X(t_2) \le x_2\} \qquad (14\text{-}3)$$
and
$$f_X(x_1, x_2, t_1, t_2) \triangleq \frac{\partial^2 F_X(x_1, x_2, t_1, t_2)}{\partial x_1\, \partial x_2} \qquad (14\text{-}4)$$
represents the second-order density function of the process X(t).

Similarly, $f_X(x_1, x_2, \ldots, x_n, t_1, t_2, \ldots, t_n)$ represents the nth-order density function of the process X(t). Complete specification of the stochastic process X(t) requires the knowledge of $f_X(x_1, x_2, \ldots, x_n, t_1, t_2, \ldots, t_n)$ for all $t_i$, i = 1, 2, …, n, and for all n (an almost impossible task in reality).
Mean of a Stochastic Process:
$$\mu(t) \triangleq E\{X(t)\} = \int_{-\infty}^{\infty} x\, f_X(x, t)\,dx \qquad (14\text{-}5)$$
represents the mean value of the process X(t). In general, the mean of a process can depend on the time index t.

The autocorrelation function of a process X(t) is defined as
$$R_{XX}(t_1, t_2) \triangleq E\{X(t_1)\,X^*(t_2)\} = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} x_1 x_2^*\, f_X(x_1, x_2, t_1, t_2)\,dx_1\,dx_2, \qquad (14\text{-}6)$$
and it represents the interrelationship between the random variables X₁ = X(t₁) and X₂ = X(t₂) generated from the process X(t).

Properties:
1. $R_{XX}(t_1, t_2) = R^*_{XX}(t_2, t_1) = [E\{X(t_2)\,X^*(t_1)\}]^*$. (14-7)
2. $R_{XX}(t, t) = E\{|X(t)|^2\} \ge 0$ (average instantaneous power).
3. $R_{XX}(t_1, t_2)$ represents a nonnegative definite function, i.e., for any set of constants $\{a_i\}_{i=1}^{n}$,
$$\sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j^*\, R_{XX}(t_i, t_j) \ge 0. \qquad (14\text{-}8)$$
Eq. (14-8) follows by noticing that $E\{|Y|^2\} \ge 0$ for $Y = \sum_{i=1}^{n} a_i X(t_i)$.

The function
$$C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) - \mu_X(t_1)\,\mu_X^*(t_2) \qquad (14\text{-}9)$$
represents the autocovariance function of the process X(t).

Example 14.1
Let
$$z = \int_{-T}^{T} X(t)\,dt.$$
Then
$$E[|z|^2] = \int_{-T}^{T}\!\!\int_{-T}^{T} E\{X(t_1)\,X^*(t_2)\}\,dt_1\,dt_2 = \int_{-T}^{T}\!\!\int_{-T}^{T} R_{XX}(t_1, t_2)\,dt_1\,dt_2. \qquad (14\text{-}10)$$
Example 14.2
$$X(t) = a\cos(\omega_0 t + \varphi), \qquad \varphi \sim U(0, 2\pi). \qquad (14\text{-}11)$$
This gives
$$\mu_X(t) = E\{X(t)\} = a\,E\{\cos(\omega_0 t + \varphi)\} = a\cos\omega_0 t\; E\{\cos\varphi\} - a\sin\omega_0 t\; E\{\sin\varphi\} = 0, \qquad (14\text{-}12)$$
since $E\{\cos\varphi\} = \frac{1}{2\pi}\int_0^{2\pi} \cos\varphi\,d\varphi = 0 = E\{\sin\varphi\}$.
Similarly,
$$R_{XX}(t_1, t_2) = a^2\, E\{\cos(\omega_0 t_1 + \varphi)\cos(\omega_0 t_2 + \varphi)\} = \frac{a^2}{2}\, E\{\cos\omega_0(t_1 - t_2) + \cos(\omega_0(t_1 + t_2) + 2\varphi)\} = \frac{a^2}{2}\cos\omega_0(t_1 - t_2). \qquad (14\text{-}13)$$
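As a quick sanity check (an addition to these notes, not part of the original derivation), (14-12)-(14-13) can be verified by Monte Carlo; the amplitude, frequency, time instants, and sample size below are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo check of Example 14.2: X(t) = a*cos(w0*t + phi), phi ~ U(0, 2*pi).
# Theory: E{X(t)} = 0 and R_XX(t1, t2) = (a**2 / 2) * cos(w0*(t1 - t2)).
rng = np.random.default_rng(0)
a, w0 = 2.0, 1.5            # illustrative amplitude and frequency
t1, t2 = 0.7, 0.3           # two arbitrary time instants
phi = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

x1 = a * np.cos(w0 * t1 + phi)
x2 = a * np.cos(w0 * t2 + phi)

print(np.mean(x1))                          # ~ 0, as in (14-12)
print(np.mean(x1 * x2))                     # empirical R_XX(t1, t2)
print(0.5 * a**2 * np.cos(w0 * (t1 - t2)))  # theoretical value, (14-13)
```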
Stationary Stochastic Processes
Stationary processes exhibit statistical properties that are invariant to shifts of the time index. Thus, for example, second-order stationarity implies that the statistical properties of the pairs {X(t₁), X(t₂)} and {X(t₁+c), X(t₂+c)} are the same for any c. Similarly, first-order stationarity implies that the statistical properties of X(tᵢ) and X(tᵢ+c) are the same for any c.
In strict terms, the statistical properties are governed by the joint probability density function. Hence a process is nth-order Strict-Sense Stationary (S.S.S.) if
$$f_X(x_1, x_2, \ldots, x_n, t_1, t_2, \ldots, t_n) = f_X(x_1, x_2, \ldots, x_n, t_1 + c, t_2 + c, \ldots, t_n + c) \qquad (14\text{-}14)$$
for any c, where the left side represents the joint density function of the random variables $X_1 = X(t_1), X_2 = X(t_2), \ldots, X_n = X(t_n)$ and the right side corresponds to the joint density function of the random variables $X_1' = X(t_1 + c), X_2' = X(t_2 + c), \ldots, X_n' = X(t_n + c)$.
A process X(t) is said to be strict-sense stationary if (14-14) is true for all $t_i$, i = 1, 2, …, n, for all n = 1, 2, …, and for any c.
For a first-order strict-sense stationary process, from (14-14) we have
$$f_X(x, t) = f_X(x, t + c) \qquad (14\text{-}15)$$
for any c. In particular, c = −t gives
$$f_X(x, t) = f_X(x), \qquad (14\text{-}16)$$
i.e., the first-order density of X(t) is independent of t. In that case
$$E[X(t)] = \int_{-\infty}^{\infty} x\, f(x)\,dx = \mu, \text{ a constant.} \qquad (14\text{-}17)$$
Similarly, for a second-order strict-sense stationary process we have from (14-14)
$$f_X(x_1, x_2, t_1, t_2) = f_X(x_1, x_2, t_1 + c, t_2 + c)$$
for any c. For c = −t₂ we get
$$f_X(x_1, x_2, t_1, t_2) = f_X(x_1, x_2, t_1 - t_2), \qquad (14\text{-}18)$$
i.e., the second-order density function of a strict-sense stationary process depends only on the difference of the time indices, $t_1 - t_2 = \tau$. In that case the autocorrelation function is given by
$$R_{XX}(t_1, t_2) \triangleq E\{X(t_1)\,X^*(t_2)\} = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} x_1 x_2^*\, f_X(x_1, x_2, \tau = t_1 - t_2)\,dx_1\,dx_2 = R_{XX}(t_1 - t_2) \triangleq R_{XX}(\tau) = R^*_{XX}(-\tau), \qquad (14\text{-}19)$$
i.e., the autocorrelation function of a second-order strict-sense stationary process depends only on the difference of the time indices $\tau = t_1 - t_2$.
Notice that (14-17) and (14-19) are consequences of the stochastic process being first- and second-order strict-sense stationary. On the other hand, the basic conditions for first- and second-order stationarity, Eqs. (14-16) and (14-18), are usually difficult to verify. In that case, we often resort to a looser definition of stationarity, known as Wide-Sense Stationarity (W.S.S.), by making use of
(14-17) and (14-19) as the necessary conditions. Thus, a process X(t) is said to be Wide-Sense Stationary if
(i) $E\{X(t)\} = \mu$ (14-20)
and
(ii) $E\{X(t_1)\,X^*(t_2)\} = R_{XX}(t_1 - t_2)$, (14-21)
i.e., for wide-sense stationary processes, the mean is a constant and the autocorrelation function depends only on the difference between the time indices. Notice that (14-20)-(14-21) do not say anything about the nature of the probability density functions; instead they deal with the average behavior of the process. Since (14-20)-(14-21) follow from (14-16) and (14-18), strict-sense stationarity always implies wide-sense stationarity. However, the converse is not true in general, the only exception being the Gaussian process.
This follows since, if X(t) is a Gaussian process, then by definition $X_1 = X(t_1), X_2 = X(t_2), \ldots, X_n = X(t_n)$ are jointly Gaussian random variables for any $t_1, t_2, \ldots, t_n$, whose joint characteristic function
is given by
$$\phi_X(\omega_1, \omega_2, \ldots, \omega_n) = \exp\Big(j \sum_{k=1}^{n} \mu(t_k)\,\omega_k - \tfrac{1}{2} \sum_{l,k=1}^{n} C_{XX}(t_l, t_k)\,\omega_l\,\omega_k\Big), \qquad (14\text{-}22)$$
where $C_{XX}(t_l, t_k)$ is as defined in (14-9). If X(t) is wide-sense stationary, then using (14-20)-(14-21) in (14-22) we get
$$\phi_X(\omega_1, \omega_2, \ldots, \omega_n) = \exp\Big(j\mu \sum_{k=1}^{n} \omega_k - \tfrac{1}{2} \sum_{l,k=1}^{n} C_{XX}(t_l - t_k)\,\omega_l\,\omega_k\Big), \qquad (14\text{-}23)$$
and hence if the set of time indices is shifted by a constant c to generate a new set of jointly Gaussian random variables $X_1' = X(t_1 + c), X_2' = X(t_2 + c), \ldots, X_n' = X(t_n + c)$, then their joint characteristic function is identical to (14-23). Thus the sets of random variables $\{X_i\}_{i=1}^{n}$ and $\{X_i'\}_{i=1}^{n}$ have the same joint probability distribution for all n and all c, establishing the strict-sense stationarity of Gaussian processes from their wide-sense stationarity.
To summarize, if X(t) is a Gaussian process, then
wide-sense stationarity (w.s.s.) ⟹ strict-sense stationarity (s.s.s.).
Notice that since the joint p.d.f. of Gaussian random variables depends only on their second-order statistics, which is also the basis for wide-sense stationarity, we obtain strict-sense stationarity as well.
From (14-12)-(14-13) (refer to Example 14.2), the process $X(t) = a\cos(\omega_0 t + \varphi)$ in (14-11) is wide-sense stationary, but not necessarily strict-sense stationary, since (14-12)-(14-13) establish only its first- and second-order moments.

Similarly, if X(t) is a zero-mean wide-sense stationary process in Example 14.1, then $\sigma_z^2$ in (14-10) reduces to
$$\sigma_z^2 = E\{|z|^2\} = \int_{-T}^{T}\!\!\int_{-T}^{T} R_{XX}(t_1 - t_2)\,dt_1\,dt_2.$$
As $t_1, t_2$ vary from −T to +T, $\tau = t_1 - t_2$ varies from −2T to +2T. Moreover, $R_{XX}(\tau)$ is constant over each strip $t_1 - t_2 = \tau$ of the square (the shaded region in Fig. 14.2), whose area is given by, for $\tau > 0$,
$$\tfrac{1}{2}(2T - \tau)^2 - \tfrac{1}{2}(2T - \tau - d\tau)^2 \approx (2T - \tau)\,d\tau,$$
and hence the above integral reduces to
$$\sigma_z^2 = \int_{-2T}^{2T} R_{XX}(\tau)\,(2T - |\tau|)\,d\tau = 2T \int_{-2T}^{2T} R_{XX}(\tau)\Big(1 - \frac{|\tau|}{2T}\Big)\,d\tau. \qquad (14\text{-}24)$$

[Fig. 14.2: the square $-T \le t_1, t_2 \le T$ in the $(t_1, t_2)$ plane, with the strip $t_1 - t_2 = \tau$ shaded.]
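The area argument above can be checked numerically. The sketch below (an addition to these notes) assumes $R_{XX}(\tau) = e^{-|\tau|}$ and T = 1, and compares a Riemann-sum evaluation of the double integral with the single-integral form (14-24).

```python
import numpy as np

# Riemann-sum check of (14-24), assuming R_XX(tau) = exp(-|tau|) and T = 1.
T, m = 1.0, 2000
t = np.linspace(-T, T, m)
dt = t[1] - t[0]
t1, t2 = np.meshgrid(t, t)
R = lambda tau: np.exp(-np.abs(tau))

lhs = R(t1 - t2).sum() * dt * dt           # double integral over the square

tau = np.linspace(-2 * T, 2 * T, 2 * m)
dtau = tau[1] - tau[0]
rhs = (R(tau) * (2 * T - np.abs(tau))).sum() * dtau   # right side of (14-24)

print(lhs, rhs)   # both ~ 2 * (1 + exp(-2)) = 2.2707
```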
Systems with Stochastic Inputs
A deterministic system1 transforms each input waveform X (t , i ) into
an output waveform Y (t , i )  T [ X (t , i )] by operating only on the
time variable t. Thus a set of realizations at the input corresponding
to a process X(t) generates a new set of realizations {Y (t , )} at the
output associated with a new process Y(t).

Y (t,  i )
X (t,  i )


X (t )
T [] Y

(t )

t t

Fig. 14.3

Our goal is to study the output process statistics in terms of the input
process statistics and the system function.
1
A stochastic system on the other hand operates on both the variables t and  .
13
PILLAI/Cha
Deterministic systems fall into two classes (Fig. 14.3):
- Memoryless systems: $Y(t) = g[X(t)]$.
- Systems with memory: time-varying systems, time-invariant systems, and linear systems $Y(t) = L[X(t)]$; in particular, linear time-invariant (LTI) systems, for which
$$Y(t) = \int_{-\infty}^{\infty} h(t - \tau)\,X(\tau)\,d\tau = \int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau,$$
where h(t) is the impulse response of the LTI system.
Memoryless Systems:
The output Y(t) in this case depends only on the present value of the input X(t), i.e.,
$$Y(t) = g\{X(t)\}. \qquad (14\text{-}25)$$
- A strict-sense stationary input to a memoryless system yields a strict-sense stationary output (see (9-76) of the text for a proof).
- A wide-sense stationary input to a memoryless system yields an output that need not be stationary in any sense.
- A stationary Gaussian input X(t) to a memoryless system yields a stationary output Y(t) that is, however, not Gaussian, with $R_{XY}(\tau) = \eta R_{XX}(\tau)$ (Fig. 14.4; see (14-26)).
Theorem: If X(t) is a zero-mean stationary Gaussian process and Y(t) = g[X(t)], where g(·) represents a memoryless device, then
$$R_{XY}(\tau) = \eta\, R_{XX}(\tau), \qquad \eta = E\{g'(X)\}. \qquad (14\text{-}26)$$
Proof: Verify it, or see page 397 of the book.
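As a numeric illustration (added here, not part of the original proof), take the assumed nonlinearity $g(x) = x^3$, for which $\eta = E\{g'(X)\} = E\{3X^2\} = 3\sigma^2$; the correlation coefficient and variance below are arbitrary.

```python
import numpy as np

# Numeric illustration of (14-26) with g(x) = x**3, so eta = 3*sigma**2.
rng = np.random.default_rng(1)
n, rho, sigma = 500_000, 0.6, 1.0

cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])   # cov of (X(t), X(t+tau))
x_t, x_tau = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

r_xy = np.mean(x_t * x_tau**3)       # empirical R_XY(tau) = E{X(t) g(X(t+tau))}
eta = 3 * sigma**2
print(r_xy, eta * rho * sigma**2)    # both ~ 1.8 for these values
```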
Linear Systems: L[·] represents a linear system if
$$L\{a_1 X(t_1) + a_2 X(t_2)\} = a_1 L\{X(t_1)\} + a_2 L\{X(t_2)\}. \qquad (14\text{-}28)$$
Let
$$Y(t) = L\{X(t)\} \qquad (14\text{-}29)$$
represent the output of a linear system.
Time-Invariant System: L[·] represents a time-invariant system if
$$Y(t) = L\{X(t)\} \;\Longrightarrow\; L\{X(t - t_0)\} = Y(t - t_0), \qquad (14\text{-}30)$$
i.e., a shift in the input results in the same shift in the output.
If L[·] satisfies both (14-28) and (14-30), then it corresponds to a linear time-invariant (LTI) system.
LTI systems can be uniquely represented in terms of their output to a delta function: the impulse response h(t) of the system (Fig. 14.5).
For an arbitrary input X(t), the output of the LTI system is (Fig. 14.6)
$$Y(t) = \int_{-\infty}^{\infty} h(t - \tau)\,X(\tau)\,d\tau = \int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau. \qquad (14\text{-}31)$$
Eq. (14-31) follows by expressing X(t) as
$$X(t) = \int_{-\infty}^{\infty} X(\tau)\,\delta(t - \tau)\,d\tau \qquad (14\text{-}32)$$
and applying (14-28) and (14-30) to $Y(t) = L\{X(t)\}$. Thus
$$Y(t) = L\{X(t)\} = L\Big\{\int_{-\infty}^{\infty} X(\tau)\,\delta(t - \tau)\,d\tau\Big\}$$
$$= \int_{-\infty}^{\infty} L\{X(\tau)\,\delta(t - \tau)\}\,d\tau \quad \text{(by linearity)}$$
$$= \int_{-\infty}^{\infty} X(\tau)\,L\{\delta(t - \tau)\}\,d\tau \quad \text{(by time-invariance)}$$
$$= \int_{-\infty}^{\infty} X(\tau)\,h(t - \tau)\,d\tau = \int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau. \qquad (14\text{-}33)$$
Output Statistics: Using (14-33), the mean of the output process is given by
$$\mu_Y(t) = E\{Y(t)\} = \int_{-\infty}^{\infty} E\{X(\tau)\}\,h(t - \tau)\,d\tau = \int_{-\infty}^{\infty} \mu_X(\tau)\,h(t - \tau)\,d\tau = \mu_X(t) * h(t). \qquad (14\text{-}34)$$
Similarly, the cross-correlation function between the input and output processes is given by
$$R_{XY}(t_1, t_2) = E\{X(t_1)\,Y^*(t_2)\} = E\Big\{X(t_1) \int_{-\infty}^{\infty} X^*(t_2 - \alpha)\,h^*(\alpha)\,d\alpha\Big\}$$
$$= \int_{-\infty}^{\infty} E\{X(t_1)\,X^*(t_2 - \alpha)\}\,h^*(\alpha)\,d\alpha = \int_{-\infty}^{\infty} R_{XX}(t_1, t_2 - \alpha)\,h^*(\alpha)\,d\alpha = R_{XX}(t_1, t_2) * h^*(t_2). \qquad (14\text{-}35)$$
Finally, the output autocorrelation function is given by
$$R_{YY}(t_1, t_2) = E\{Y(t_1)\,Y^*(t_2)\} = E\Big\{\int_{-\infty}^{\infty} X(t_1 - \beta)\,h(\beta)\,d\beta\; Y^*(t_2)\Big\}$$
$$= \int_{-\infty}^{\infty} E\{X(t_1 - \beta)\,Y^*(t_2)\}\,h(\beta)\,d\beta = \int_{-\infty}^{\infty} R_{XY}(t_1 - \beta, t_2)\,h(\beta)\,d\beta = R_{XY}(t_1, t_2) * h(t_1), \qquad (14\text{-}36)$$
or
$$R_{YY}(t_1, t_2) = R_{XX}(t_1, t_2) * h^*(t_2) * h(t_1). \qquad (14\text{-}37)$$

[Fig. 14.7: (a) X(t) → h(t) → Y(t); (b) $R_{XX}(t_1, t_2) \to h^*(t_2) \to R_{XY}(t_1, t_2) \to h(t_1) \to R_{YY}(t_1, t_2)$.]
In particular, if X(t) is wide-sense stationary, then we have $\mu_X(t) = \mu_X$, so that from (14-34)
$$\mu_Y(t) = \mu_X \int_{-\infty}^{\infty} h(\tau)\,d\tau = \mu_X\, c, \text{ a constant.} \qquad (14\text{-}38)$$
Also $R_{XX}(t_1, t_2) = R_{XX}(t_1 - t_2)$, so that (14-35) reduces to
$$R_{XY}(t_1, t_2) = \int_{-\infty}^{\infty} R_{XX}(t_1 - t_2 + \alpha)\,h^*(\alpha)\,d\alpha = R_{XX}(\tau) * h^*(-\tau) = R_{XY}(\tau), \qquad \tau = t_1 - t_2. \qquad (14\text{-}39)$$
Thus X(t) and Y(t) are jointly w.s.s. Further, from (14-36), the output autocorrelation simplifies to
$$R_{YY}(t_1, t_2) = \int_{-\infty}^{\infty} R_{XY}(t_1 - \beta - t_2)\,h(\beta)\,d\beta = R_{XY}(\tau) * h(\tau) = R_{YY}(\tau), \qquad \tau = t_1 - t_2. \qquad (14\text{-}40)$$
From (14-37), we obtain
$$R_{YY}(\tau) = R_{XX}(\tau) * h^*(-\tau) * h(\tau). \qquad (14\text{-}41)$$
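As an illustration (added here), (14-39)-(14-41) can be evaluated on a discrete grid. The causal exponential h and Gaussian-shaped $R_{XX}$ below are assumed for the example, and np.convolve with mode="same" is accurate only up to a one-sample alignment, which is negligible for this grid spacing.

```python
import numpy as np

# Discrete-grid sketch of (14-39)-(14-41): R_XY = R_XX * h*(-tau), then
# R_YY = R_XY * h(tau). h and R_XX are illustrative assumptions.
dt = 0.01
tau = np.arange(-10.0, 10.0, dt)
h = np.where(tau >= 0.0, np.exp(-tau), 0.0)   # h(t) = e^{-t} u(t)
r_xx = np.exp(-tau**2)                        # assumed input autocorrelation

r_xy = dt * np.convolve(r_xx, h[::-1], mode="same")   # R_XX * h(-tau)
r_yy = dt * np.convolve(r_xy, h, mode="same")         # R_XY * h(tau)

# For real h and even R_XX, R_YY is real and even with its maximum at 0:
i0 = np.argmin(np.abs(tau))
print(r_yy[i0], r_yy.max())   # approximately equal
```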
From (14-38)-(14-40), the output process is also wide-sense stationary. This gives rise to the following representation (Fig. 14.8):
(a) a wide-sense stationary process X(t) applied to an LTI system h(t) yields a wide-sense stationary output Y(t);
(b) a strict-sense stationary process X(t) applied to an LTI system h(t) yields a strict-sense stationary output Y(t) (see the text for a proof);
(c) a Gaussian process X(t) (also stationary) applied to a linear system yields a Gaussian process Y(t) (also stationary).
White Noise Process:
W(t) is said to be a white noise process if
$$R_{WW}(t_1, t_2) = q(t_1)\,\delta(t_1 - t_2), \qquad (14\text{-}42)$$
i.e., $E[W(t_1)\,W^*(t_2)] = 0$ unless $t_1 = t_2$.
W(t) is said to be wide-sense stationary (w.s.s.) white noise if E[W(t)] = constant and
$$R_{WW}(t_1, t_2) = q\,\delta(t_1 - t_2) = q\,\delta(\tau). \qquad (14\text{-}43)$$
If W(t) is also a Gaussian process (white Gaussian noise), then all of its samples are independent random variables (why?).

[Fig. 14.9: white noise W(t) applied to an LTI system h(t) produces colored noise $N(t) = h(t) * W(t)$.]

For w.s.s. white noise input W(t), we have
$$E[N(t)] = \mu_W \int_{-\infty}^{\infty} h(\tau)\,d\tau, \text{ a constant,} \qquad (14\text{-}44)$$
and
$$R_{NN}(\tau) = q\,\delta(\tau) * h^*(-\tau) * h(\tau) = q\,h^*(-\tau) * h(\tau) = q\,\rho(\tau), \qquad (14\text{-}45)$$
where
$$\rho(\tau) = h(\tau) * h^*(-\tau) = \int_{-\infty}^{\infty} h(\alpha + \tau)\,h^*(\alpha)\,d\alpha. \qquad (14\text{-}46)$$
Thus the output of a white noise process through an LTI system represents a (colored) noise process.
Note: White noise need not be Gaussian. "White" and "Gaussian" are two different concepts!
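A quick discrete-time illustration of Fig. 14.9 (an added sketch, with an arbitrary moving-average filter standing in for h): filtering white noise spreads its autocorrelation from a delta into the triangular shape $q\,\rho(\tau)$.

```python
import numpy as np

# White noise through an LTI filter becomes colored noise.
rng = np.random.default_rng(2)
w = rng.standard_normal(100_000)     # white noise, q = 1
h = np.ones(10) / 10.0               # assumed 10-tap moving-average filter
n_t = np.convolve(w, h, mode="full")[:w.size]

def sample_autocorr(x, lag):
    return np.mean(x[:x.size - lag] * x[lag:]) if lag else np.mean(x * x)

for lag in (0, 1, 5, 12):
    print(lag, sample_autocorr(w, lag), sample_autocorr(n_t, lag))
# w: ~1 at lag 0, ~0 elsewhere. n_t: rho(lag) = sum_k h[k]*h[k+lag],
# i.e. (10 - lag)/100 for lag < 10 and 0 beyond (triangular).
```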
Discrete-Time Stochastic Processes:
A discrete-time stochastic process $X_n = X(nT)$ is a sequence of random variables. The mean, autocorrelation, and autocovariance functions of a discrete-time process are given by
$$\mu_n = E\{X(nT)\}, \qquad (14\text{-}57)$$
$$R(n_1, n_2) = E\{X(n_1 T)\,X^*(n_2 T)\}, \qquad (14\text{-}58)$$
and
$$C(n_1, n_2) = R(n_1, n_2) - \mu_{n_1}\,\mu^*_{n_2}, \qquad (14\text{-}59)$$
respectively. As before, the strict-sense and wide-sense stationarity definitions apply here also. For example, X(nT) is wide-sense stationary if
$$E\{X(nT)\} = \mu, \text{ a constant,} \qquad (14\text{-}60)$$
and
$$E[X\{(k + n)T\}\,X^*\{kT\}] = R(n) = r_n = r^*_{-n}, \qquad (14\text{-}61)$$
i.e., $R(n_1, n_2) = R(n_1 - n_2) = R^*(n_2 - n_1)$.
If X(nT) represents a wide-sense stationary input to a discrete-time system {h(nT)}, and Y(nT) the system output, then as before the cross-correlation function satisfies
$$R_{XY}(n) = R_{XX}(n) * h^*(-n) \qquad (14\text{-}65)$$
and the output autocorrelation function is given by
$$R_{YY}(n) = R_{XY}(n) * h(n), \qquad (14\text{-}66)$$
or
$$R_{YY}(n) = R_{XX}(n) * h^*(-n) * h(n). \qquad (14\text{-}67)$$
Thus wide-sense stationarity from input to output is preserved for discrete-time systems also.
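To make (14-65)-(14-67) concrete, here is a small numeric sketch (added here, with hypothetical taps). For a unit-variance white input, $R_{XX}(n) = \delta(n)$, so (14-67) reduces $R_{YY}(n)$ to the deterministic autocorrelation of h, which np.correlate computes directly; a Monte Carlo estimate agrees.

```python
import numpy as np

# (14-67) for a unit-variance white input: R_YY(n) = h(n) * h*(-n).
h = np.array([1.0, 0.5, 0.25])            # hypothetical filter taps
r_yy = np.correlate(h, h, mode="full")    # lags n = -2, ..., 2
print(r_yy)                               # [0.25 0.625 1.3125 0.625 0.25]

# Monte Carlo cross-check: filter white noise and estimate R_YY(n).
rng = np.random.default_rng(3)
w = rng.standard_normal(200_000)
y = np.convolve(w, h)[:w.size]
print([np.mean(y[k:] * y[:y.size - k]) for k in (1, 2)])  # ~ 0.625, 0.25
print(np.mean(y * y))                                     # ~ 1.3125
```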
Auto-Regressive Moving Average (ARMA) Processes
Consider an input-output representation
$$X(n) = -\sum_{k=1}^{p} a_k X(n - k) + \sum_{k=0}^{q} b_k W(n - k), \qquad (14\text{-}68)$$
where X(n) may be considered as the output of a system {h(n)} driven by the input W(n) (Fig. 14.12). The Z-transform of (14-68) gives
$$X(z) \sum_{k=0}^{p} a_k z^{-k} = W(z) \sum_{k=0}^{q} b_k z^{-k}, \qquad a_0 \equiv 1, \qquad (14\text{-}69)$$
or
$$H(z) \triangleq \frac{X(z)}{W(z)} = \sum_{k=0}^{\infty} h(k)\,z^{-k} = \frac{b_0 + b_1 z^{-1} + b_2 z^{-2} + \cdots + b_q z^{-q}}{1 + a_1 z^{-1} + a_2 z^{-2} + \cdots + a_p z^{-p}} \triangleq \frac{B(z)}{A(z)} \qquad (14\text{-}70)$$
represents the transfer function of the associated system response {h(n)} in Fig. 14.12, so that
$$X(n) = \sum_{k=0}^{\infty} h(n - k)\,W(k). \qquad (14\text{-}71)$$
Notice that the transfer function H(z) in (14-70) is rational with p poles and q zeros that determine the model order of the underlying system. From (14-68), the output undergoes regression over p of its previous values, and at the same time a moving average based on the (q + 1) input values $W(n), W(n-1), \ldots, W(n-q)$ is added to it, thus generating an Auto-Regressive Moving Average (ARMA(p, q)) process X(n). Generally the input {W(n)} represents a sequence of uncorrelated random variables of zero mean and constant variance $\sigma_W^2$, so that
$$R_{WW}(n) = \sigma_W^2\,\delta(n). \qquad (14\text{-}72)$$
If in addition {W(n)} is normally distributed, then the output {X(n)} also represents a strict-sense stationary normal process.
If q = 0, then (14-68) represents an AR(p) process (all-pole process), and if p = 0, then (14-68) represents an MA(q) process (all-zero process). Next, we shall discuss AR(1) and AR(2) processes through explicit calculations.
AR(1) process: An AR(1) process has the form (see (14-68))
$$X(n) = a\,X(n - 1) + W(n), \qquad (14\text{-}73)$$
and from (14-70) the corresponding system transfer function is
$$H(z) = \frac{1}{1 - a z^{-1}} = \sum_{n=0}^{\infty} a^n z^{-n}, \qquad (14\text{-}74)$$
provided |a| < 1. Thus
$$h(n) = a^n, \qquad |a| < 1, \qquad (14\text{-}75)$$
represents the impulse response of a stable AR(1) system. Using (14-67) together with (14-72) and (14-75), we get the output autocorrelation sequence of the AR(1) process to be
$$R_{XX}(n) = \sigma_W^2\,\delta(n) * h^*(-n) * h(n) = \sigma_W^2 \sum_{k=0}^{\infty} a^{|n| + k}\,a^{k} = \sigma_W^2\,\frac{a^{|n|}}{1 - a^2}, \qquad (14\text{-}76)$$
where we have made use of the discrete version of (14-46). The normalized (in terms of $R_{XX}(0)$) output autocorrelation sequence is given by
$$\rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = a^{|n|}, \qquad |n| \ge 0. \qquad (14\text{-}77)$$
It is instructive to compare the AR(1) model discussed above with one obtained by superimposing a random component on it, which may be an error term associated with observing a first-order AR process X(n). Thus
$$Y(n) = X(n) + V(n), \qquad (14\text{-}78)$$
where X(n) ~ AR(1) as in (14-73), and V(n) is an uncorrelated random sequence with zero mean and variance $\sigma_V^2$ that is also uncorrelated with {W(n)}. From (14-73) and (14-78) we obtain the autocorrelation of the observed process Y(n) to be
$$R_{YY}(n) = R_{XX}(n) + R_{VV}(n) = R_{XX}(n) + \sigma_V^2\,\delta(n) = \sigma_W^2\,\frac{a^{|n|}}{1 - a^2} + \sigma_V^2\,\delta(n), \qquad (14\text{-}79)$$
so that its normalized version is given by
$$\rho_Y(n) = \frac{R_{YY}(n)}{R_{YY}(0)} = \begin{cases} 1, & n = 0, \\ c\,a^{|n|}, & n = \pm 1, \pm 2, \ldots, \end{cases} \qquad (14\text{-}80)$$
where
$$c = \frac{\sigma_W^2}{\sigma_W^2 + \sigma_V^2 (1 - a^2)} < 1. \qquad (14\text{-}81)$$
Eqs. (14-77) and (14-80) demonstrate the effect of superimposing an error sequence on an AR(1) model. For non-zero lags, the autocorrelation of the observed sequence {Y(n)} is reduced by a constant factor compared to that of the original process {X(n)}; see Fig. 14.13, where $\rho_X(0) = \rho_Y(0) = 1$ but $\rho_X(k) > \rho_Y(k)$ for $k \ne 0$.
From (14-78), the superimposed error sequence V(n) only affects the corresponding term in Y(n) (term by term). However, a particular term in the "input sequence" W(n) affects X(n) and Y(n) as well as all subsequent observations.

[Fig. 14.13: normalized autocorrelations $\rho_X(k)$ and $\rho_Y(k)$ versus lag k, with $\rho_Y$ uniformly lower for $k \ne 0$.]
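The attenuation factor c of (14-80)-(14-81) can be checked by simulation (added sketch; $\sigma_W = 1$, and a and $\sigma_V$ are illustrative values):

```python
import numpy as np

# Verify (14-80)-(14-81): observation noise scales rho_Y(n) by c for n != 0.
rng = np.random.default_rng(5)
a, sigma_v, n_samples = 0.8, 1.0, 500_000

w = rng.standard_normal(n_samples)
x = np.zeros(n_samples)
for n in range(1, n_samples):
    x[n] = a * x[n - 1] + w[n]                        # AR(1), (14-73)
y = x + sigma_v * rng.standard_normal(n_samples)      # Y(n) = X(n) + V(n)

r0_y = np.mean(y * y)
c = 1.0 / (1.0 + sigma_v**2 * (1 - a**2))             # (14-81) with sigma_W = 1
for lag in (1, 2, 5):
    print(lag, np.mean(y[lag:] * y[:-lag]) / r0_y, c * a**lag)
```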
AR(2) Process: An AR(2) process has the form
$$X(n) = a_1 X(n - 1) + a_2 X(n - 2) + W(n), \qquad (14\text{-}82)$$
and from (14-70) the corresponding transfer function is given by
$$H(z) = \sum_{n=0}^{\infty} h(n)\,z^{-n} = \frac{1}{1 - a_1 z^{-1} - a_2 z^{-2}} = \frac{b_1}{1 - \lambda_1 z^{-1}} + \frac{b_2}{1 - \lambda_2 z^{-1}}, \qquad (14\text{-}83)$$
so that
$$h(0) = 1, \quad h(1) = a_1, \quad h(n) = a_1 h(n - 1) + a_2 h(n - 2), \quad n \ge 2, \qquad (14\text{-}84)$$
and in terms of the poles $\lambda_1$ and $\lambda_2$ of the transfer function, from (14-83) we have
$$h(n) = b_1 \lambda_1^n + b_2 \lambda_2^n, \qquad n \ge 0, \qquad (14\text{-}85)$$
which represents the impulse response of the system.
From (14-84)-(14-85), we also have $b_1 + b_2 = 1$, $b_1 \lambda_1 + b_2 \lambda_2 = a_1$. From (14-83),
$$\lambda_1 + \lambda_2 = a_1, \qquad \lambda_1 \lambda_2 = -a_2, \qquad (14\text{-}86)$$
and H(z) stable implies $|\lambda_1| < 1$, $|\lambda_2| < 1$.
Further, using (14-82), the output autocorrelations satisfy the recursion
$$R_{XX}(n) = E\{X(n + m)\,X^*(m)\} = E\{[a_1 X(n + m - 1) + a_2 X(n + m - 2) + W(n + m)]\,X^*(m)\} = a_1 R_{XX}(n - 1) + a_2 R_{XX}(n - 2), \qquad (14\text{-}87)$$
since $E\{W(n + m)\,X^*(m)\} = 0$ for n > 0, and hence their normalized version is given by
$$\rho_X(n) \triangleq \frac{R_{XX}(n)}{R_{XX}(0)} = a_1 \rho_X(n - 1) + a_2 \rho_X(n - 2). \qquad (14\text{-}88)$$
By direct calculation using (14-67), the output autocorrelations are given by
$$R_{XX}(n) = R_{WW}(n) * h^*(-n) * h(n) = \sigma_W^2\,h^*(-n) * h(n) = \sigma_W^2 \sum_{k=0}^{\infty} h^*(n + k)\,h(k)$$
$$= \sigma_W^2 \left[\frac{|b_1|^2\,(\lambda_1^*)^n}{1 - |\lambda_1|^2} + \frac{b_1^* b_2\,(\lambda_1^*)^n}{1 - \lambda_1^* \lambda_2} + \frac{b_1 b_2^*\,(\lambda_2^*)^n}{1 - \lambda_1 \lambda_2^*} + \frac{|b_2|^2\,(\lambda_2^*)^n}{1 - |\lambda_2|^2}\right], \qquad (14\text{-}89)$$
where we have made use of (14-85). From (14-89), the normalized output autocorrelations may be expressed as
$$\rho_X(n) = \frac{R_{XX}(n)}{R_{XX}(0)} = c_1 (\lambda_1^*)^n + c_2 (\lambda_2^*)^n, \qquad (14\text{-}90)$$
where $c_1$ and $c_2$ are appropriate constants.
Damped Exponentials: When the second-order system in (14-83)-(14-85) is real and corresponds to a damped exponential response, the poles are complex conjugates, which gives $a_1^2 + 4a_2 < 0$ in (14-83). Thus
$$\lambda_1 = r\,e^{j\theta}, \qquad \lambda_2 = \lambda_1^*, \qquad r < 1. \qquad (14\text{-}91)$$
In that case $c_1 = c_2^* = c\,e^{j\xi}$ in (14-90), so that the normalized correlations there reduce to
$$\rho_X(n) = 2\,\mathrm{Re}\{c_1 (\lambda_1^*)^n\} = 2c\,r^n \cos(n\theta - \xi). \qquad (14\text{-}92)$$
But from (14-86),
$$\lambda_1 + \lambda_2 = 2r\cos\theta = a_1, \qquad r^2 = -a_2 \le 1, \qquad (14\text{-}93)$$
and hence $2r\sin\theta = \sqrt{-(a_1^2 + 4a_2)} > 0$, which gives
$$\tan\theta = \frac{\sqrt{-(a_1^2 + 4a_2)}}{a_1}. \qquad (14\text{-}94)$$
Also from (14-88),
$$\rho_X(1) = a_1 \rho_X(0) + a_2 \rho_X(-1) = a_1 + a_2 \rho_X(1),$$
so that
$$\rho_X(1) = \frac{a_1}{1 - a_2} = 2c\,r\cos(\theta - \xi), \qquad (14\text{-}95)$$
where the latter form is obtained from (14-92) with n = 1. But $\rho_X(0) = 1$ in (14-92) gives
$$2c\cos\xi = 1, \quad \text{or} \quad c = \frac{1}{2\cos\xi}. \qquad (14\text{-}96)$$
Substituting (14-96) into (14-92) and (14-95), we obtain the normalized output autocorrelations to be
$$\rho_X(n) = (-a_2)^{n/2}\,\frac{\cos(n\theta - \xi)}{\cos\xi}, \qquad -a_2 \le 1, \qquad (14\text{-}97)$$
where $\xi$ satisfies
$$\frac{\cos(\theta - \xi)}{\cos\xi} = \frac{a_1}{(1 - a_2)\sqrt{-a_2}}. \qquad (14\text{-}98)$$
Thus the normalized autocorrelations of a damped second-order system with real coefficients, subject to random uncorrelated impulses, satisfy (14-97).
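The closed form (14-97)-(14-98) can be cross-checked against the recursion (14-88); the sketch below (added here) uses an assumed underdamped coefficient pair.

```python
import numpy as np

# Compare the closed form (14-97)-(14-98) with the recursion (14-88) for an
# assumed underdamped AR(2): a1**2 + 4*a2 < 0, poles r*exp(+/- j*theta).
a1, a2 = 1.0, -0.5

theta = np.arctan2(np.sqrt(-(a1**2 + 4 * a2)), a1)       # (14-94)
K = a1 / ((1 - a2) * np.sqrt(-a2))                       # right side of (14-98)
xi = np.arctan((K - np.cos(theta)) / np.sin(theta))      # solve (14-98) for xi

rho_closed = [(-a2) ** (n / 2) * np.cos(n * theta - xi) / np.cos(xi)
              for n in range(8)]                         # (14-97)

rho_rec = [1.0, a1 / (1 - a2)]                           # rho(0), rho(1)
for n in range(2, 8):
    rho_rec.append(a1 * rho_rec[-1] + a2 * rho_rec[-2])  # (14-88)

print(np.allclose(rho_closed, rho_rec))                  # True
```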

More on ARMA processes
From (14-70), an ARMA(p, q) system has only p + q + 1 independent coefficients ($a_k$, k = 1, …, p; $b_i$, i = 0, …, q), and hence its impulse response sequence {h_k} must also exhibit a similar dependence among its terms. In fact, according to P. Dienes (The Taylor Series, 1931),
an old result due to Kronecker¹ (1881) states that the necessary and sufficient condition for $H(z) = \sum_{k=0}^{\infty} h_k z^{-k}$ to represent a rational system (ARMA) is that
$$\det H_n = 0, \qquad n \ge N \text{ (for all sufficiently large } n\text{)}, \qquad (14\text{-}99)$$
where
$$H_n = \begin{bmatrix} h_0 & h_1 & h_2 & \cdots & h_n \\ h_1 & h_2 & h_3 & \cdots & h_{n+1} \\ \vdots & \vdots & \vdots & & \vdots \\ h_n & h_{n+1} & h_{n+2} & \cdots & h_{2n} \end{bmatrix}. \qquad (14\text{-}100)$$
That is, in the case of rational systems, for all sufficiently large n the Hankel matrices $H_n$ in (14-100) all have the same rank.
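A small numeric sketch of Kronecker's criterion (added here): Hankel matrices built from the AR(1) impulse response $h_k = a^k$ of (14-74) have rank 1 for every n, whereas $h_k = 1/(k+1)$, whose H(z) is not rational and whose Hankel matrices are Hilbert matrices, keeps full rank.

```python
import numpy as np

# Kronecker rank test (14-99)-(14-100): bounded Hankel rank iff H(z) rational.
def hankel_rank(h, n):
    H = np.array([[h[i + j] for j in range(n + 1)] for i in range(n + 1)])
    return np.linalg.matrix_rank(H)

a = 0.5
h_ar1 = [a**k for k in range(9)]          # rational H(z) = 1/(1 - a z^-1)
h_hilb = [1.0 / (k + 1) for k in range(9)]  # non-rational H(z)

for n in (1, 2, 3, 4):
    print(n, hankel_rank(h_ar1, n), hankel_rank(h_hilb, n))
# AR(1): rank stays at 1 for all n; Hilbert case: rank grows as n + 1.
```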

¹Among other things: "God created the integers and the rest is the work of man." (Leopold Kronecker)
