
EE 5375/7375 Random Processes September 30, 2003

Homework #4 solutions

Problem 1. textbook problem 6.4


A discrete-time random process is defined by Xn = s^n for n ≥ 0, where s is selected at random from the
interval (0,1). (a) Sketch a sample path of the process. (b) Find the PDF of Xn . (c) Find the joint PDF of
Xn and Xn+1 . (d) Find the mean and autocovariance functions of Xn .

(a) The sample path decreases geometrically with ratio s: X0 = 1, X1 = s, X2 = s^2, and so on, as shown in Fig. 1.
(b) The CDF of Xn at a specific n is

P(Xn ≤ x) = P(s^n ≤ x) = P(s ≤ x^{1/n}) = x^{1/n},   0 < x < 1

The last step follows because s is uniformly distributed on (0,1). Differentiating the CDF gives the PDF
f_{Xn}(x) = (1/n) x^{1/n − 1} for 0 < x < 1.


(c) By definition, the joint CDF is

P(Xn ≤ x, Xn+1 ≤ y) = P(s^n ≤ x, s^{n+1} ≤ y)
                    = P(s ≤ x^{1/n}, s ≤ y^{1/(n+1)})
                    = P(s ≤ min(x^{1/n}, y^{1/(n+1)}))
                    = min(x^{1/n}, y^{1/(n+1)})

We use the minimum of x^{1/n} and y^{1/(n+1)} because the joint event that s is less than both of them means
that s must be less than the smaller of the two. Again, the last step follows because s is uniformly distributed.
(d) The mean of Xn depends on knowledge of the pdf of s which is fs (s) = 1 over the interval (0,1):
E(Xn) = E(s^n) = ∫_0^1 s^n f_s(s) ds = ∫_0^1 s^n ds = 1/(n+1)

The autocovariance function for lag k is defined as

C_X(n, n+k) = E(Xn Xn+k) − E(Xn) E(Xn+k)
            = E(s^n s^{n+k}) − 1/((n+1)(n+k+1))
            = E(s^{2n+k}) − 1/((n+1)(n+k+1))
            = ∫_0^1 s^{2n+k} ds − 1/((n+1)(n+k+1))
            = 1/(2n+k+1) − 1/((n+1)(n+k+1))
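
These formulas are easy to check by simulation. A minimal MATLAB sketch (the sample size and the choices
n = 2, k = 3 are arbitrary):
> s = rand(1e6,1);                                 % s uniform on (0,1)
> n = 2; k = 3;
> mean(s.^n)                                       % compare to 1/(n+1) = 0.3333
> mean(s.^(2*n+k)) - mean(s.^n)*mean(s.^(n+k))     % empirical C_X(n, n+k)
> 1/(2*n+k+1) - 1/((n+1)*(n+k+1))                  % theoretical C_X(n, n+k)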

Problem 2. textbook problem 6.11


Find an expression for E(|X_{t2} − X_{t1}|^2) in terms of the autocorrelation function.

The absolute value does not matter because it is squared.

E(|X_{t2} − X_{t1}|^2) = E((X_{t2} − X_{t1})^2)
                       = E(X_{t2}^2 − 2 X_{t1} X_{t2} + X_{t1}^2)
                       = E(X_{t2}^2) − 2 E(X_{t1} X_{t2}) + E(X_{t1}^2)
                       = R_X(t2, t2) − 2 R_X(t1, t2) + R_X(t1, t1)

where R_X(t, s) is the autocorrelation function.
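
As a sanity check, the identity holds for any two jointly defined random variables; a quick MATLAB sketch
with an arbitrary correlated pair:
> x1 = randn(1e6,1);
> x2 = 0.7*x1 + randn(1e6,1);                      % any correlated pair will do
> mean((x2 - x1).^2)                               % E(|X_t2 - X_t1|^2)
> mean(x2.^2) - 2*mean(x1.*x2) + mean(x1.^2)       % R(t2,t2) - 2R(t1,t2) + R(t1,t1)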

Problem 3. textbook problem 6.20


Let Yn be the process that results when individual 1’s in a Bernoulli process are erased with probability α.
Find the PMF of S′n, the counting process for Yn. Does Yn have independent and stationary increments?

Let In be a Bernoulli process where a 1 occurs with probability p and a 0 occurs with probability 1 − p.
Whenever a 1 occurs, it is not erased with probability (1 − α) in the derived process Yn . The probability of a
1 in Yn is therefore (1 − α)p. The counting process S′n gives the number of ones in the first n Bernoulli trials,
where each trial has probability (1 − α)p of success. Therefore S′n is binomial with parameters (n, (1 − α)p)
and has PMF

P(S′n = x) = (n choose x) ((1 − α)p)^x (1 − (1 − α)p)^{n−x},   x = 0, 1, . . . , n

Yn has independent and stationary increments because it is itself a Bernoulli process (a sequence of independent
Bernoulli random variables).
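
A Monte Carlo check of this PMF (a sketch; the values p = 0.4, α = 0.3, n = 20, x = 5 are arbitrary):
> p = 0.4; alpha = 0.3; n = 20; trials = 1e5;
> I = rand(trials,n) < p;                          % Bernoulli process In
> Y = I & (rand(trials,n) >= alpha);               % erase each 1 with probability alpha
> Sn = sum(Y,2);                                   % counting process S'_n
> mean(Sn == 5)                                    % empirical P(S'_n = 5)
> nchoosek(n,5)*((1-alpha)*p)^5*(1-(1-alpha)*p)^(n-5)   % binomial PMF at x = 5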

Problem 4. textbook problem 6.24


Consider the following autoregressive processes:

Wn = 2 Wn−1 + Xn,        W0 = 0
Zn = (1/2) Zn−1 + Xn,    Z0 = 0

(a) Flip a coin 10 times to obtain a realization of the Bernoulli process. Find the resulting realizations for Wn
and Zn. What trends do the processes exhibit? Is the sample mean meaningful for either of these processes?
(b) Express Wn and Zn in terms of Xn , Xn−1 , . . . , X1 and then find E(Wn ) and E(Zn ). Do these results
agree with the trends observed in part (a)? (c) Does Wn or Zn have independent increments? Stationary
increments?

(a) Realizations of Xn, Wn, and Zn are shown in Fig. 2. Wn grows roughly exponentially, approximately doubling
at each step, so the sample mean is not meaningful for Wn. Zn decays by a factor of 1/2 at each step except
when a 1 occurs in Xn. The sample mean of Zn is roughly twice the sample mean of Xn.
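
One way to generate a realization like the one in Fig. 2 (a MATLAB sketch, using a simulated fair coin in
place of an actual coin flip):
> x = double(rand(10,1) < 0.5);                    % 10 fair coin flips (Bernoulli process)
> w = filter(1, [1 -2], x);                        % Wn = 2*W(n-1) + Xn, W0 = 0
> z = filter(1, [1 -0.5], x);                      % Zn = 0.5*Z(n-1) + Xn, Z0 = 0
> [x w z]                                          % the three realizations side by side
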
(b) Consider Wn first.
Wn = 2 Wn−1 + Xn
   = 2(2 Wn−2 + Xn−1) + Xn
   = 4(2 Wn−3 + Xn−2) + 2 Xn−1 + Xn
   = 8(2 Wn−4 + Xn−3) + 4 Xn−2 + 2 Xn−1 + Xn
   = 2^{n−1} X1 + · · · + 4 Xn−2 + 2 Xn−1 + Xn

E(Wn) = E(2^{n−1} X1) + · · · + E(4 Xn−2) + E(2 Xn−1) + E(Xn)
      = (2^{n−1} + · · · + 4 + 2 + 1) E(X)
      = ((1 − 2^n)/(1 − 2)) E(X) = (2^n − 1) E(X)

Next, Zn is handled similarly.

Zn = (1/2) Zn−1 + Xn
   = (1/2)((1/2) Zn−2 + Xn−1) + Xn
   = (1/4)((1/2) Zn−3 + Xn−2) + (1/2) Xn−1 + Xn
   = (1/8)((1/2) Zn−4 + Xn−3) + (1/4) Xn−2 + (1/2) Xn−1 + Xn
   = (1/2)^{n−1} X1 + · · · + (1/4) Xn−2 + (1/2) Xn−1 + Xn

E(Zn) = ((1/2)^{n−1} + · · · + 1/4 + 1/2 + 1) E(X)
      = ((1 − (1/2)^n)/(1 − 1/2)) E(X) = 2(1 − 2^{−n}) E(X)

These results agree with part (a): E(Wn) grows like 2^n, while for a fair coin (E(X) = 1/2) E(Zn) approaches 1,
about twice the sample mean of Xn.
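
A quick Monte Carlo check of E(Wn) = (2^n − 1)E(X) and E(Zn) = 2(1 − 2^{−n})E(X) for a fair coin, i.e.
E(X) = 1/2 (a sketch; n and the number of trials are arbitrary):
> trials = 1e5; n = 10;
> X = double(rand(trials,n) < 0.5);                % each row is a fair Bernoulli realization
> W = filter(1, [1 -2], X, [], 2);                 % Wn recursion applied along each row
> Z = filter(1, [1 -0.5], X, [], 2);               % Zn recursion applied along each row
> [mean(W(:,n))  (2^n - 1)*0.5]                    % empirical vs (2^n - 1)E(X)
> [mean(Z(:,n))  2*(1 - 2^(-n))*0.5]               % empirical vs 2(1 - 2^(-n))E(X)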

Problem 5. textbook problem 6.25


Let Mn be the discrete-time process defined as the sequence of sample means of an iid sequence:

Mn = (X1 + X2 + · · · + Xn)/n

(a) Find the mean, variance, and covariance of Mn . (b) Does Mn have independent increments? Stationary
increments?

(a) The mean is


E(Mn) = (1/n) E(X1 + X2 + · · · + Xn) = (1/n) n E(X) = E(X)
In statistical terms, this property means that Mn is an unbiased estimator of E(X). The variance is

var(Mn) = var((X1 + X2 + · · · + Xn)/n) = (1/n^2) var(X1 + X2 + · · · + Xn) = (1/n^2) n var(X) = var(X)/n

The covariance is
cov(n, k) = E[(Mn − E(X))(Mk − E(X))]
          = E[(1/n)(Sn − n E(X)) · (1/k)(Sk − k E(X))]
          = (1/nk) E[(Sn − E(Sn))(Sk − E(Sk))]
          = (1/nk) C_S(n, k)
          = (1/nk) min(n, k) var(X)

where Sn = X1 + X2 + · · · + Xn is the sum process and C_S(n, k) is the autocovariance of Sn.
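
These moments can be checked numerically; a MATLAB sketch with zero-mean, unit-variance samples (so E(X) = 0
and var(X) = 1) and arbitrary n, k:
> trials = 1e5; n = 5; k = 12;
> X = randn(trials, k);                            % iid N(0,1) samples, var(X) = 1
> Mn = mean(X(:,1:n), 2);  Mk = mean(X, 2);        % sample means over n and k terms
> [mean(Mn)  var(Mn)  1/n]                         % E(Mn) near 0, var(Mn) near var(X)/n
> [mean(Mn.*Mk)  min(n,k)/(n*k)]                   % cov(n,k) near min(n,k)var(X)/(nk)
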
1 1
(b) An increment of Mn is Mn+1 − Mn = n+1 Xn − n+1 Mn . The increment depends on the value of Mn .
Therefore, increments are not independent or stationary.

Problem 6. textbook problem 6.28

Let Xn consist of an iid sequence of Poisson random variables with mean α. (a) Find the PMF of the sum
process Sn . (Hint: use the probability generating function.) (b) Find the joint PMF of Sn and Sn+k .

(a) For a sum of independent random variables, the PGF of the sum is the product of the individual PGFs. In this case,

G_{Sn}(z) = E(z^{Sn}) = E(z^{X1+···+Xn}) = (e^{α(z−1)})^n = e^{nα(z−1)}

This PGF may be recognized as corresponding to a Poisson distribution with mean nα, i.e.,

P(Sn = k) = ((nα)^k / k!) e^{−nα},   k = 0, 1, 2, . . .
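
A numerical check that the sum of n iid Poisson(α) variables is Poisson with mean nα (a sketch; it assumes the
Statistics and Machine Learning Toolbox function poissrnd is available):
> alpha = 0.7; n = 6; k = 4; trials = 1e5;
> Sn = sum(poissrnd(alpha, trials, n), 2);         % sum of n iid Poisson(alpha) samples
> mean(Sn == k)                                    % empirical P(Sn = k)
> (n*alpha)^k * exp(-n*alpha) / factorial(k)       % Poisson(n*alpha) PMF at k
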
(b) We know that Sn is Poisson with mean nα. Also, the increment (Sn+k − Sn ) consists of the sum of k iid
Poisson random variables, which has the same distribution as Sk .

P(Sn = i, Sn+k = j) = P(Sn = i) P(Sn+k = j | Sn = i)
                    = P(Sn = i) P(Sn+k − Sn = j − i)
                    = P(Sn = i) P(Sk = j − i)
                    = ((nα)^i / i!) e^{−nα} · ((kα)^{j−i} / (j − i)!) e^{−kα},   j ≥ i

Problem 7. textbook problem 6.29


Let Xn be an iid sequence of zero-mean, unit-variance Gaussian random variables. (a) Find the PDF of Mn
defined in Problem 5. (b) Find the joint PDF of Mn and Mn+k . (Hint: use the independent increments
property of Sn .)

(a) Let us find the PDF using the characteristic function because the characteristic function of a sum of
independent random variables is the product of individual characteristic functions.

φ_{Mn}(ω) = E(e^{jω(X1+···+Xn)/n})
          = (E(e^{jωX/n}))^n
          = (φ_X(ω/n))^n
          = (e^{−(ω/n)^2/2})^n
          = e^{−ω^2/(2n)}

This characteristic function implies that Mn is Gaussian with zero mean and variance 1/n, i.e., the pdf is

f_{Mn}(x) = √(n/(2π)) e^{−n x^2/2}
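
A quick check that Mn behaves as N(0, 1/n) (a MATLAB sketch; n, the evaluation point x, and the sample size
are arbitrary):
> n = 8; trials = 1e5; x = 0.2;
> Mn = mean(randn(trials,n), 2);                   % sample means of n iid N(0,1) variables
> [var(Mn)  1/n]                                   % variance should be close to 1/n
> [mean(Mn <= x)  0.5*(1 + erf(x*sqrt(n/2)))]      % empirical CDF vs N(0,1/n) CDF at x
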
(b) We note that Mn and Mn+k are related to Sn and Sn+k by

Mn = (1/n) Sn,     Mn+k = (1/(n+k)) Sn+k

Thus the joint PDF of Mn and Mn+k can be expressed as

F_{Mn,Mn+k}(x, y) = P(Mn ≤ x, Mn+k ≤ y)
                  = P(Sn ≤ nx, Sn+k ≤ (n+k)y)
                  = F_{Sn,Sn+k}(nx, (n+k)y)

The joint pdf is found by differentiating with respect to x and y:

f_{Mn,Mn+k}(x, y) = n(n+k) f_{Sn,Sn+k}(nx, (n+k)y)

We note that Sn has independent increments and the increment (Sn+k − Sn ) is the sum of k iid N (0, 1)
random variables, so it is N (0, k). Hence the joint pdf can be rewritten as

f_{Mn,Mn+k}(x, y) = n(n+k) f_{Sn,Sn+k}(nx, (n+k)y)
                  = n(n+k) f_{Sn}(nx) f_{Sn+k|Sn}((n+k)y | nx)
                  = n(n+k) f_{Sn}(nx) f_{Sn+k−Sn}((n+k)y − nx)
                  = n(n+k) · (1/√(2πn)) e^{−(nx)^2/(2n)} · (1/√(2πk)) e^{−((n+k)y−nx)^2/(2k)}

Problem 8. textbook problem 6.54


Consider the following moving average process:

Yn = (1/2)(Xn + Xn−1),   X0 = 0

(a) Is Yn a stationary random process if Xn is an iid process? (b) Is Yn a stationary random process if Xn
is a stationary process?

(a) Let us look at the joint PDF of three arbitrary points of Yn at n1 < n2 < n3 :
P(Yn1 = y1, Yn2 = y2, Yn3 = y3)
  = P((1/2)(Xn1 + Xn1−1) = y1, (1/2)(Xn2 + Xn2−1) = y2, (1/2)(Xn3 + Xn3−1) = y3)
  = P((1/2)(X2 + X1) = y1, (1/2)(Xn2−n1+2 + Xn2−n1+1) = y2, (1/2)(Xn3−n1+2 + Xn3−n1+1) = y3)

The last step is true if Xn is stationary. Next, let us look at the joint PDF at the three points shifted by an
arbitrary T :
P(Yn1+T = y1, Yn2+T = y2, Yn3+T = y3)
  = P((1/2)(Xn1+T + Xn1+T−1) = y1, (1/2)(Xn2+T + Xn2+T−1) = y2, (1/2)(Xn3+T + Xn3+T−1) = y3)
  = P((1/2)(X2 + X1) = y1, (1/2)(Xn2−n1+2 + Xn2−n1+1) = y2, (1/2)(Xn3−n1+2 + Xn3−n1+1) = y3)

Again, the last step is true if Xn is stationary, which includes the case where Xn is an iid process. Since the
time shift T does not affect the joint PDF, Yn is stationary.
(b) The steps in part (a) are all true if Xn is stationary, so Yn is stationary if Xn is stationary.
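
A simple numerical illustration: with an iid Gaussian input, the mean and lag-1 correlation of Yn do not depend
on n (a MATLAB sketch; this is consistent with stationarity, though a full check would compare entire joint
distributions):
> trials = 1e5; N = 12;
> X = randn(trials, N);                            % iid input process
> Y = 0.5*(X(:,2:N) + X(:,1:N-1));                 % Yn = (Xn + Xn-1)/2 for n = 2..N
> [mean(Y(:,3))  mean(Y(:,9))]                     % means agree (about 0)
> [mean(Y(:,3).*Y(:,4))  mean(Y(:,9).*Y(:,10))]    % lag-1 correlations agree (about 1/4)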

Problem 9. textbook problem 6.62 (optional for EE 5375)


A ternary information source produces an iid, equiprobable sequence of symbols from the alphabet {a, b, c}.
Suppose that these three symbols are encoded into the respective binary codewords 00, 01, 10. Let Bn be
the sequence of binary symbols that result from encoding the ternary symbols. (a) Find the joint PMF of
Bn and Bn+1 . Is Bn stationary? Cyclostationary? (b) Find the mean and autocorrelation functions of Bn .
Is Bn wide-sense stationary? Wide-sense cyclostationary? (c) If Bn is cyclostationary, find the joint PMF,
mean, and autocorrelation functions of the randomly phase-shifted version of Bn as defined by equation
(6.65):
Xs (t) = X(t + θ) , θ uniform in [0, T ]

(a) Suppose B0 is the first binary symbol. If n is even,

P(Bn = 0, Bn+1 = 0) = 1/3
P(Bn = 0, Bn+1 = 1) = 1/3
P(Bn = 1, Bn+1 = 0) = 1/3
P(Bn = 1, Bn+1 = 1) = 0

If n is odd, Bn is the last bit of one codeword and Bn+1 is the first bit of the next codeword, so the two bits
are independent. Since 2 of the 3 codewords end with a 0 and 2 of the 3 codewords start with a 0, an odd symbol
is 0 with probability 2/3, and the following even symbol is 0 with probability 2/3.

P(Bn = 0, Bn+1 = 0) = (2/3)(2/3) = 4/9
P(Bn = 0, Bn+1 = 1) = (2/3)(1/3) = 2/9
P(Bn = 1, Bn+1 = 0) = (1/3)(2/3) = 2/9
P(Bn = 1, Bn+1 = 1) = (1/3)(1/3) = 1/9

This shows that Bn is not stationary but is cyclostationary with period 2.
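
These joint probabilities can be verified by simulating the encoder (a MATLAB sketch; the sample size is
arbitrary):
> trials = 1e5;
> cw = [0 0; 0 1; 1 0];                            % codewords for a, b, c
> sym = randi(3, trials, 2);                       % two consecutive equiprobable symbols
> B = [cw(sym(:,1),:) cw(sym(:,2),:)];             % four consecutive bits B0 B1 B2 B3
> mean(B(:,1)==0 & B(:,2)==0)                      % n even: should be about 1/3
> mean(B(:,2)==0 & B(:,3)==0)                      % n odd: should be about 4/9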


(b) Overall, a 0 appears in the codewords with probability 2/3, and a 1 appears with probability 1/3. The
mean is
E(Bn) = 0 · (2/3) + 1 · (1/3) = 1/3
If n is even, the autocorrelation function is

R(n, n+j) = E(Bn Bn+j) = P(Bn = 1, Bn+j = 1)
          = 0                     if j = 1
          = 1/3                   if j = 0
          = (1/3)(1/3) = 1/9      otherwise

If n is odd, the autocorrelation function is

R(n, n+j) = E(Bn Bn+j) = P(Bn = 1, Bn+j = 1)
          = 1/9      if j = 1
          = 1/3      if j = 0
          = 1/9      otherwise

To be WSS, Bn must have constant mean and an autocorrelation function that depends only on the lag (i.e.,
R(n, n + j) must be a function of j alone). As shown above, R(n, n + j) depends on whether n is even or odd, so
Bn is not WSS. The process is wide-sense cyclostationary if the mean and the autocorrelation function are
periodic in n with some period m, i.e., E(Bn+m) = E(Bn) and R(n + m, n + m + j) = R(n, n + j). Here the mean is
constant and R(n, n + j) is periodic in n with period 2, so Bn is wide-sense cyclostationary.
(c) Bn is cyclostationary with period 2, so the random phase shift θ is taken to be uniform over {0, 1}, i.e.,
each value with probability 1/2. The shifted process B^s_n = Bn+θ is stationary, with marginal PMF

P(B^s_n = 0) = 2/3,   P(B^s_n = 1) = 1/3
Now the joint PMF for the randomly shifted process is found by conditioning on the random phase θ, which makes
the shifted index n + θ even or odd with probability 1/2 each:

P(B^s_n = 0, B^s_{n+1} = 0) = (1/2)(1/3) + (1/2)(4/9) = 7/18
P(B^s_n = 0, B^s_{n+1} = 1) = (1/2)(1/3) + (1/2)(2/9) = 5/18
P(B^s_n = 1, B^s_{n+1} = 0) = (1/2)(1/3) + (1/2)(2/9) = 5/18
P(B^s_n = 1, B^s_{n+1} = 1) = (1/2)(0) + (1/2)(1/9) = 1/18
The mean is
E(B^s_n) = 0 · (2/3) + 1 · (1/3) = 1/3
The autocorrelation function is

R(n, n+j) = P(B^s_n = 1) = 1/3                     if j = 0
          = P(B^s_n = 1, B^s_{n+1} = 1) = 1/18     if j = 1
          = (1/3)(1/3) = 1/9                       if j > 1
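
The shifted process can be checked the same way as in part (a) (a sketch; it re-generates the bits and draws
the phase θ uniformly from {0, 1}):
> trials = 1e5;
> cw = [0 0; 0 1; 1 0];
> sym = randi(3, trials, 2);
> B = [cw(sym(:,1),:) cw(sym(:,2),:)];             % bits B0 B1 B2 B3
> theta = randi(2, trials, 1);                     % 1 -> start at B0 (even), 2 -> start at B1 (odd)
> b1 = B(sub2ind(size(B), (1:trials)', theta));
> b2 = B(sub2ind(size(B), (1:trials)', theta+1));
> [mean(b1==0 & b2==0)  7/18]                      % joint PMF checks
> [mean(b1==1 & b2==1)  1/18]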

Problem 10. Matlab (optional for EE 5375)


Start Matlab and use the uniform random number generator rand to generate 1000 random samples by
typing:
> u = rand(1000,1);
Create a sample Bernoulli sequence by typing:
> w = 0.5 >= u;
This sets elements of w equal to 1 where u ≤ 0.5 and to 0 where u > 0.5. This sequence can be visualized by plotting:
> stem(w)
Scale the Bernoulli sequence to a different range by typing:
> s = 0.1;
> w = s*(2*w - 1.0);
Generate a new sequence x by typing:
> x = cumsum(w);
and look at the process by plotting:
> plot(1:1000,x)
What kind of random process does this look like?

It should look like Brownian motion: the cumulative sum of iid ±0.1 steps is a symmetric random walk, which is
a discrete-time approximation of Brownian motion.
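
The variance of the walk grows linearly with time, as it does for Brownian motion. A quick MATLAB check (a
sketch with an arbitrary number of independent walks):
> s = 0.1; n = 1000; trials = 2000;
> steps = s*(2*(rand(trials,n) < 0.5) - 1);        % iid steps of +/- s
> x = cumsum(steps, 2);                            % one random walk per row
> [var(x(:,n))  n*s^2]                             % var of x at time n is about n*s^2 = 10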

Fig. 1 for Problem 1

Fig. 2 for Problem 4
