
AMS 623 Stochastic Processes Submitted by PEREZ, Kenneth P.

1 Chapter 1 Section 2 Exercises


1. Let A and B be arbitrary, not necessarily disjoint, events. Use the law of total probability to verify the formula

P(A) = P(A ∩ B) + P(A ∩ Bc )

where Bc is the complementary event to B. (That is, Bc occurs if and only if B does not occur.)
Proof :
To prove that P(A) = P(A ∩ B) + P(A ∩ Bc ) we need the following:

(a) Claim 1: A = (A ∩ B) ∪ (A ∩ Bc ).
(b) Claim 2: (A ∩ B) ∩ (A ∩ Bc ) = ∅.
(c) Definition 1.1 Let Ω be a sample space with σ-algebra of events A ; a function P(·) : A → [0, 1] is a probability
function or probability measure only if, in particular, the following hold:
i. P(Ac ) = 1 − P(A), ∀ A ∈ A
ii. P(A1 ∪ A2 ) = P(A1 ) + P(A2 ) if A1 ∩ A2 = ∅

Claim 1: A = (A ∩ B) ∪ (A ∩ Bc ).
Proof :
Note: Two sets A and B are equal if and only if A ⊆ B and B ⊆ A. Hence, to show that
A = (A ∩ B) ∪ (A ∩ Bc ), we need to show the subclaims: (A ∩ B) ∪ (A ∩ Bc ) ⊆ A and A ⊆ (A ∩ B) ∪ (A ∩ Bc ).
Subclaim 1: (A ∩ B) ∪ (A ∩ Bc ) ⊆ A.
Let x ∈ (A ∩ B) ∪ (A ∩ Bc ).

⇒ x ∈ A ∩ B or x ∈ A ∩ Bc
⇒ (x ∈ A and x ∈ B) or (x ∈ A and x ∈ Bc )
⇒ x ∈ A or x ∈ A
⇒ x ∈ A.

Therefore, (A ∩ B) ∪ (A ∩ Bc ) ⊆ A.
Note: For any sets C and D, the following hold,
1. C ∩ C c = ∅
2. D = D ∪ ∅
Thus, A = A ∪ ∅ = A ∪ (B ∩ Bc ).
Subclaim 2: A ⊆ (A ∩ B) ∪ (A ∩ Bc ).
Let x ∈ A.

⇒ x ∈ A ∪ (B ∩ Bc ), since A = A ∪ (B ∩ Bc )
⇒ x ∈ A, since B ∩ Bc = ∅
⇒ (x ∈ A and x ∈ B) or (x ∈ A and x ∈ Bc ), since x ∈ B or x ∈ Bc always holds
⇒ x ∈ A ∩ B or x ∈ A ∩ Bc
⇒ x ∈ (A ∩ B) ∪ (A ∩ Bc )


Therefore, A ⊆ (A ∩ B) ∪ (A ∩ Bc ).
Thus, (A ∩ B) ∪ (A ∩ Bc ) = A.
Claim 2: (A ∩ B) ∩ (A ∩ Bc ) = ∅.
Proof :
Since ∅ ⊆ (A ∩ B) ∩ (A ∩ Bc ) it remains to show that (A ∩ B) ∩ (A ∩ Bc ) ⊆ ∅. Now, let x ∈ (A ∩ B) ∩ (A ∩ Bc ).

⇒ x ∈ A ∩ B and x ∈ A ∩ Bc
⇒ x ∈ A and x ∈ B and x ∈ A and x ∈ Bc
⇒ x ∈ B and x ∈ Bc , a contradiction

Therefore, (A ∩ B) ∩ (A ∩ Bc ) = ∅.
Now,

P(A) = P [(A ∩ B) ∪ (A ∩ Bc )] , by Claim 1


P(A) = P(A ∩ B) + P(A ∩ Bc ), by Claim 2 and Definition 1.1(ii).

Therefore, P(A) = P(A ∩ B) + P(A ∩ Bc ).
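As an aside (not part of the exercise set), the identity can be illustrated numerically. In the sketch below, the events A = {1, 2, 3, 4} and B = {2, 4, 6} on a fair die are arbitrary illustrative choices; by construction the sample counts satisfy the decomposition exactly.

```python
import random

def estimate(trials=100_000, seed=2):
    """Estimate P(A), P(A and B), P(A and Bc) for two overlapping die events."""
    rng = random.Random(seed)
    n_a = n_ab = n_abc = 0
    for _ in range(trials):
        d = rng.randint(1, 6)
        in_a = d <= 4        # A = {1, 2, 3, 4}
        in_b = d % 2 == 0    # B = {2, 4, 6}: overlaps A but is not disjoint from it
        n_a += in_a
        n_ab += in_a and in_b
        n_abc += in_a and not in_b
    return n_a / trials, n_ab / trials, n_abc / trials
```

The estimate of P(A) coincides with the sum of the other two, mirroring the law of total probability.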


2. Let A and B be arbitrary, not necessarily disjoint, events. Establish the general addition law

P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Proof :
To prove that P(A ∪ B) = P(A) + P(B) − P(A ∩ B) we need the following:
(a) Claim 1: A ∪ B = A ∪ (B ∩ Ac ).
(b) Claim 2: A ∩ (B ∩ Ac ) = ∅.
(c) Claim 3: P(B ∩ Ac ) = P(B) − P(A ∩ B).
(d) Definition 1.2 Let Ω be a sample space with σ-algebra of events A ; a function P(·) : A → [0, 1] is a probability
function or probability measure only if, in particular, the following hold:
i. P(Ac ) = 1 − P(A), ∀ A ∈ A
ii. P(A1 ∪ A2 ) = P(A1 ) + P(A2 ) if A1 ∩ A2 = ∅
Claim 1: A ∪ B = A ∪ (B ∩ Ac ).
Proof :
To show that A∪ B = A∪(B∩Ac ), we need to prove the subclaims: A∪ B ⊆ A∪(B∩Ac ) and A∪(B∩Ac ) ⊆ A∪ B.
Subclaim 1: A ∪ (B ∩ Ac ) ⊆ A ∪ B.
Let x ∈ A ∪ (B ∩ Ac ).

⇒ x∈ A or x ∈ B ∩ Ac
⇒ x∈ A or x ∈ B and x ∈ Ac
⇒ x∈ A or x ∈ B
⇒ x∈ A∪B


Therefore, A ∪ (B ∩ Ac ) ⊆ A ∪ B.
Note: For any sets C and D, the following hold,
1. C = U ∩ C, U = universal set,
2. U = D ∪ Dc
Thus, A ∪ B = U ∩ (A ∪ B) = (A ∪ Ac ) ∩ (A ∪ B) = (A ∪ Ac ) ∩ (B ∪ A).
Subclaim 2: A ∪ B ⊆ A ∪ (B ∩ Ac ).
Let x ∈ A ∪ B.

⇒ x ∈ (A ∪ Ac ) ∩ (B ∪ A), since A ∪ B = (A ∪ Ac ) ∩ (B ∪ A)
⇒ x ∈ (A ∪ Ac ) and x ∈ (B ∪ A)
⇒ (x ∈ A or x ∈ Ac ) and (x ∈ B or x ∈ A)
⇒ x ∈ A or (x ∈ Ac and x ∈ B)
⇒ x ∈ A or x ∈ B ∩ Ac
⇒ x ∈ A ∪ (B ∩ Ac )

Therefore, A ∪ B ⊆ A ∪ (B ∩ Ac ).
Thus, A ∪ B = A ∪ (B ∩ Ac ).
Claim 2: A ∩ (B ∩ Ac ) = ∅.
Proof :
Since ∅ ⊆ A ∩ (B ∩ Ac ) it remains to show that A ∩ (B ∩ Ac ) ⊆ ∅. Now, let x ∈ A ∩ (B ∩ Ac ).

⇒ x ∈ A and (x ∈ B ∩ Ac )
⇒ x ∈ A and x ∈ B and x ∈ Ac
⇒ x ∈ A and x ∈ Ac , a contradiction

Therefore, A ∩ (B ∩ Ac ) = ∅.
Claim 3: P(B ∩ Ac ) = P(B) − P(A ∩ B).
Proof :
By interchanging the roles of A and B in problem no. 1, we have

P(B) = P(B ∩ A) + P(B ∩ Ac ) and thus


P(B ∩ Ac ) = P(B) − P(B ∩ A).

Now,

P(A ∪ B) = P [A ∪ (B ∩ Ac )] , by Claim 1
= P(A) + P(B ∩ Ac ), by Claim 2 and Definition 1.2(ii)
P(A ∪ B) = P(A) + P(B) − P(A ∩ B), by Claim 3.

Therefore, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).


3. (a) Plot the distribution function

    F(x) = 0    for x ≤ 0,
         = x^3  for 0 < x < 1,
         = 1    for x ≥ 1.

(b) Determine the corresponding density function f (x) in the three regions (i) x ≤ 0, (ii) 0 < x < 1, and (iii)
1 ≤ x.
(c) What is the mean of the distribution?
(d) If X is a random variable following the distribution specified in (a), evaluate P(1/4 ≤ X ≤ 3/4).
Solution:

(a) (Plot of F(x) omitted.)

(b) Since f (x) = d/dx F(x),

    f (x) = 0     for x ≤ 0,
          = 3x^2  for 0 < x < 1,
          = 0     for x ≥ 1.

(c) The mean of the distribution is

    E[X] = ∫_{−∞}^{∞} x f (x) dx = ∫_{−∞}^{0} 0 dx + ∫_{0}^{1} x f (x) dx + ∫_{1}^{∞} 0 dx
         = ∫_0^1 x · 3x^2 dx = ∫_0^1 3x^3 dx = [3x^4/4]_0^1 = 3/4 − 0 = 3/4.

Therefore, the mean of the distribution is E[X] = 3/4.


(d)

    P(1/4 ≤ X ≤ 3/4) = F(3/4) − F(1/4) = (3/4)^3 − (1/4)^3 = 27/64 − 1/64 = 26/64.

Therefore, P(1/4 ≤ X ≤ 3/4) = 26/64 = 13/32.
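As a numerical aside, the answers to (c) and (d) can be checked by simulation. Since F(x) = x^3 on (0, 1), the inverse-transform method samples X = U^{1/3} with U uniform on (0, 1); the sample size and seed below are arbitrary.

```python
import random

def sample_x(rng):
    # Inverse transform: F(x) = x^3 on (0, 1), so x = F^{-1}(u) = u^(1/3)
    return rng.random() ** (1.0 / 3.0)

def estimate(n=200_000, seed=1):
    rng = random.Random(seed)
    xs = [sample_x(rng) for _ in range(n)]
    mean = sum(xs) / n
    prob = sum(1 for x in xs if 0.25 <= x <= 0.75) / n
    return mean, prob
```

The sample mean lands near 3/4 and the sample frequency near 26/64.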

4. Let Z be a discrete random variable having possible values 0, 1, 2 and 3 and probability mass function

    p(0) = 1/4,  p(1) = 1/2,  p(2) = 1/8,  p(3) = 1/8.
(a) Plot the corresponding distribution function.
(b) Determine the mean E[Z].
(c) Evaluate the variance Var[Z].
Solution:

(a) (Plot of the distribution function omitted.)
(b) The mean is

    E[Z] = Σ_{i=1}^{4} xi p(xi ) = 0(1/4) + 1(1/2) + 2(1/8) + 3(1/8) = 1/2 + 2/8 + 3/8 = (4 + 2 + 3)/8 = 9/8.

Therefore, E[Z] = 9/8.


(c) Note that

    E[Z^2] = Σ_{i=1}^{4} xi^2 p(xi ) = 0^2 (1/4) + 1^2 (1/2) + 2^2 (1/8) + 3^2 (1/8) = 1/2 + 4/8 + 9/8 = (4 + 4 + 9)/8 = 17/8.

Thus, the variance is

    Var[Z] = E[Z^2] − E[Z]^2 = 17/8 − (9/8)^2 = 17/8 − 81/64 = (136 − 81)/64 = 55/64.

Therefore, Var[Z] = 55/64.
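The moments in (b) and (c) are easy to confirm with exact rational arithmetic; this small sketch (an aside, not part of the assigned solution) recomputes both directly from the probability mass function.

```python
from fractions import Fraction

# The pmf from the problem statement
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 8), 3: Fraction(1, 8)}

def mean_and_variance(p):
    # E[Z] and Var[Z] = E[Z^2] - E[Z]^2, computed exactly
    m = sum(k * w for k, w in p.items())
    m2 = sum(k * k * w for k, w in p.items())
    return m, m2 - m * m
```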
5. Let A, B, and C be arbitrary events. Establish the addition law

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

Proof :
Let D = B ∪ C. Then

    P(A ∪ B ∪ C) = P(A ∪ (B ∪ C)) = P(A ∪ D) = P(A) + P(D) − P(A ∩ D), by problem no. 2.   (1)

By the distributive law, A ∩ D = A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C), so

    P(A ∩ D) = P[(A ∩ B) ∪ (A ∩ C)]
             = P(A ∩ B) + P(A ∩ C) − P[(A ∩ B) ∩ (A ∩ C)], by problem no. 2
             = P(A ∩ B) + P(A ∩ C) − P(A ∩ B ∩ C),   (2)

since (A ∩ B) ∩ (A ∩ C) = A ∩ B ∩ C. Also

    P(D) = P(B ∪ C) = P(B) + P(C) − P(B ∩ C), by problem no. 2.   (3)

Substituting (2) and (3) into (1),

    P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(B ∩ C) − [P(A ∩ B) + P(A ∩ C) − P(A ∩ B ∩ C)]
                 = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

Therefore, P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).


2 Chapter 1 Section 2 Problems


1. Thirteen cards numbered 1, . . . , 13 are shuffled and dealt one at a time. Say a match occurs on deal k if the
kth card revealed is card number k. Let N be the total number of matches that occur in the thirteen cards.
Determine E[N]. Hint: Write N = 1A1 + · · · + 1A13 where Ak is the event that a match occurs on deal k.
Solution:
Let N = the number of matches. Define the indicator

    1_{Ak} = 1 if the kth card revealed is card number k, and 0 otherwise.

Then N = 1_{A1} + 1_{A2} + · · · + 1_{A13}, and E[1_{Ak}] = P(Ak ) = 1/13 for k = 1, 2, . . . , 13, since the card in position k is equally likely to be any of the thirteen cards. Now,

    E[N] = E[1_{A1} + 1_{A2} + · · · + 1_{A13}] = E[1_{A1}] + E[1_{A2}] + · · · + E[1_{A13}]
         = 1/13 + 1/13 + · · · + 1/13  (13 terms)  = 13(1/13) = 1.

Therefore, E[N] = 1.
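A quick Monte Carlo aside supports E[N] = 1; the trial count and seed below are arbitrary.

```python
import random

def mean_matches(n_cards=13, trials=100_000, seed=7):
    # Average number of positions k at which the shuffled deck shows card k
    rng = random.Random(seed)
    deck = list(range(1, n_cards + 1))
    total = 0
    for _ in range(trials):
        rng.shuffle(deck)
        total += sum(1 for k, card in enumerate(deck, start=1) if card == k)
    return total / trials
```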

2. Let N cards carry the distinct numbers x1 , . . . , xN . If two cards are drawn at random without replacement, show
that the correlation coefficient ρ between the numbers appearing on the two cards is −1/(N − 1).
Solution:

Let C1 = the random variable that represents the number on the first card, and
C2 = the random variable that represents the number on the second card.

The correlation coefficient is given by

    ρ = (E[C1 C2 ] − E[C1 ]E[C2 ]) / sqrt( (E[C1^2 ] − E[C1 ]^2)(E[C2^2 ] − E[C2 ]^2) ).

Write s = Σ_{i=1}^{N} xi and t = Σ_{i=1}^{N} xi^2. By symmetry,

    E[C1 ] = E[C2 ] = s/N,   E[C1^2 ] = E[C2^2 ] = t/N,

and, since the two cards drawn are distinct,

    E[C1 C2 ] = (1/(N(N − 1))) Σ_{i ≠ j} xi xj = (s^2 − t)/(N(N − 1)).

Then

    E[C1 C2 ] − E[C1 ]E[C2 ] = (s^2 − t)/(N(N − 1)) − s^2/N^2
                             = [N(s^2 − t) − (N − 1)s^2]/(N^2(N − 1))
                             = −(Nt − s^2)/(N^2(N − 1)),

while each variance equals t/N − s^2/N^2 = (Nt − s^2)/N^2. Hence

    ρ = [−(Nt − s^2)/(N^2(N − 1))] / [(Nt − s^2)/N^2] = −1/(N − 1).

Therefore, ρ = −1/(N − 1).
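Since the result holds for arbitrary distinct numbers, it can be verified exactly by enumerating all equally likely ordered pairs; the two number sets below are arbitrary illustrative choices.

```python
from itertools import permutations
from math import sqrt

def pair_correlation(xs):
    # All ordered pairs of two distinct cards are equally likely
    pairs = list(permutations(xs, 2))
    n = len(pairs)
    e1 = sum(a for a, _ in pairs) / n
    e2 = sum(b for _, b in pairs) / n
    e11 = sum(a * a for a, _ in pairs) / n
    e22 = sum(b * b for _, b in pairs) / n
    e12 = sum(a * b for a, b in pairs) / n
    return (e12 - e1 * e2) / sqrt((e11 - e1 ** 2) * (e22 - e2 ** 2))
```

For any set of N distinct values the result is −1/(N − 1).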

3. A population having N distinct elements is sampled with replacement. Because of repetitions, a random sample
of size r may contain fewer than r distinct elements. Let S_r be the sample size necessary to get r distinct
elements. Show that

    E[S_r ] = N ( 1/N + 1/(N − 1) + · · · + 1/(N − r + 1) ).
Solution:
After i distinct elements have been observed (0 ≤ i ≤ r − 1), each further draw produces a not-yet-seen element with probability p_i = (N − i)/N. The number of draws X_i needed to obtain the (i + 1)st distinct element is therefore geometric with success probability p_i, so E[X_i ] = 1/p_i = N/(N − i). Since S_r = X_0 + X_1 + · · · + X_{r−1}, the sum of the expected numbers of draws is

    E[S_r ] = Σ_{i=0}^{r−1} N/(N − i) = N ( 1/N + 1/(N − 1) + · · · + 1/(N − (r − 1)) ).

Therefore, E[S_r ] = N ( 1/N + 1/(N − 1) + · · · + 1/(N − r + 1) ).
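A simulation aside comparing the formula against direct sampling with replacement; the values N = 10, r = 5 are arbitrary.

```python
import random

def expected_sample_size(N, r):
    # E[S_r] = N * (1/N + 1/(N-1) + ... + 1/(N-r+1))
    return N * sum(1.0 / (N - i) for i in range(r))

def simulate(N, r, trials=50_000, seed=3):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seen, draws = set(), 0
        while len(seen) < r:          # draw until r distinct elements appear
            seen.add(rng.randrange(N))
            draws += 1
        total += draws
    return total / trials
```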


3 Chapter 1 Section 3 Exercises


1. Consider tossing a fair coin five times and counting the total number of heads that appear. What is the proba-
bility that this total is three?
Solution:
There are 2^5 = 32 equally likely outcomes. The number of ways to place three heads among the five tosses is

    C(5, 3) = 5!/(3! 2!) = 10.

Therefore, the probability that the total is three is p = 10/32 = 5/16.
2. A fraction p = 0.05 of the items coming off a production process are defective. If a random sample of 10 items
is taken from the output of the process, what is the probability that the sample contains exactly one defective
item? What is the probability that the sample contains one or fewer defective items?
Solution:
The problem involves a binomial distribution.

Definition 3.1 Binomial Distribution. If a binomial trial can result in a success with probability p and a failure
with probability q = 1 − p, then the probability distribution of the binomial random variable X, the number of
successes in n independent trials is
    b(x; n, p) = C(n, x) p^x q^{n−x}, for x = 0, 1, 2, . . . , n.

Given p = 0.05, the probability that an item is defective, we have

    q = 1 − p = 1 − 0.05 = 0.95, the probability that an item is nondefective.

Now,

    P({1 defective}) = b(1; 10, 0.05) = C(10, 1)(0.05)^1 (0.95)^9 = 10(0.05)(0.95)^9 = 0.3151;
    P({0 defective}) = b(0; 10, 0.05) = C(10, 0)(0.05)^0 (0.95)^10 = (0.95)^10 = 0.5987;
    P({0 or 1 defective}) = P({0 defective}) + P({1 defective}) = 0.5987 + 0.3151 = 0.9139.

Therefore, P({1 defective}) = 0.3151, P({0 defective}) = 0.5987, and P({0 or 1 defective}) = 0.9139.
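These binomial probabilities are quick to recompute directly; the sketch below evaluates the pmf (math.comb requires Python 3.8+).

```python
from math import comb

def binom_pmf(x, n, p):
    # b(x; n, p) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

p_one = binom_pmf(1, 10, 0.05)   # exactly one defective item
p_zero = binom_pmf(0, 10, 0.05)  # no defective items
```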
3. A fraction p = 0.05 of the items coming off a production process are defective. The output of the process is
sampled, one by one, in a random manner. What is the probability that the first defective item found is the
tenth item sampled?
Solution:
The problem involves a negative binomial distribution.


Definition 3.2 Negative Binomial Distribution. If repeated independent trials can result in a success with
probability p and a failure with probability q = 1 − p, then the probability distribution of the random variable
X, the number of the trial on which the kth success occurs, is given by

    b*(x; k, p) = C(x − 1, k − 1) p^k q^{x−k}, for x = k, k + 1, k + 2, . . .

Given p = 0.05, the probability that an item is defective, we have q = 1 − p = 0.95. The first defective item appearing on the tenth draw corresponds to the first success (k = 1) occurring on trial x = 10:

    P(N = 10) = b*(10; 1, 0.05) = C(10 − 1, 1 − 1)(0.05)^1 (0.95)^{10−1} = C(9, 0)(0.05)(0.95)^9 = 0.0315.

Therefore, P(N = 10) = 0.0315.

4. A Poisson distributed random variable X has a mean of λ = 2. What is the probability that X equals 2? What
is the probability that X is less than or equal to 2?
Solution:
The problem involves Poisson distribution.

Definition 3.3 Poisson Distribution. The probability distribution of the Poisson random variable X, representing
the number of outcomes occurring in a given time interval or specified region, is

    p(x; λ) = e^{−λ} λ^x / x!, for x = 0, 1, 2, . . .

where λ is the average number of outcomes occurring in the given time interval or specified region.

Now,

    P(X = 2) = p(2; 2) = e^{−2} 2^2 / 2! = 2e^{−2} = 0.2707;
    p(1; 2) = e^{−2} 2^1 / 1! = 2e^{−2};   p(0; 2) = e^{−2} 2^0 / 0! = e^{−2};
    P(X ≤ 2) = p(0; 2) + p(1; 2) + p(2; 2) = e^{−2} + 2e^{−2} + 2e^{−2} = 5e^{−2} = 0.6767.

Therefore, P(X = 2) = 0.2707 and P(X ≤ 2) = 0.6767.
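A direct evaluation of the Poisson pmf confirms the two values for λ = 2 (a verification aside, not part of the assigned solution).

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # p(x; lam) = e^{-lam} lam^x / x!
    return exp(-lam) * lam ** x / factorial(x)

p_eq_2 = poisson_pmf(2, 2.0)
p_le_2 = sum(poisson_pmf(x, 2.0) for x in range(3))
```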


5. The number of bacteria in a prescribed area of a slide containing a sample of well water has a Poisson distri-
bution with parameter 5. What is the probability that the slide shows 8 or more bacteria?
Solution:

    P(X ≥ 8) = 1 − P(X < 8)
             = 1 − Σ_{x=0}^{7} p(x; 5)
             = 1 − 0.8666, by the Poisson table of values
    P(X ≥ 8) = 0.1334

Therefore, P(X ≥ 8) = 0.1334.

4 Chapter 1 Section 3 Problems


1. Suppose that X has a discrete uniform distribution on the integers 0, 1, . . . , 9, and Y is independent and has the
probability distribution P(Y = k) = a_k for k = 0, 1, . . . . What is the distribution of Z = X + Y (mod 10), their
sum modulo 10?
Solution:
Note that b ≡ a (mod 10) if and only if b = 10j + a for some integer j.
Let z ∈ {0, 1, . . . , 9}. Conditioning on X,

    P(Z = z) = Σ_{i=0}^{9} P(X = i) P(Y ≡ z − i (mod 10))
             = Σ_{i=0}^{9} (1/10) Σ_{j=0}^{∞} P(Y = 10j + z − i)
             = (1/10) Σ_{i=0}^{9} Σ_{j=0}^{∞} a_{10j+z−i}.

As i runs over 0, 1, . . . , 9 and j over 0, 1, 2, . . . , the index 10j + z − i takes every nonnegative integer value exactly once (negative indices contribute probability 0), so the double sum equals Σ_{k=0}^{∞} a_k = 1. Therefore,

    P(Z = z) = 1/10 for every z ∈ {0, 1, . . . , 9};

that is, Z is uniformly distributed on {0, 1, . . . , 9} regardless of the distribution {a_k}.

2. The mode of a probability mass function p(k) is any value k∗ for which p(k∗ ) ≥ p(k) for all k. Determine the
mode(s) for

(a) The Poisson distribution with parameter λ > 0.


(b) The binomial distribution with parameters n and p.

Solution:


(a) To find the mode of the Poisson distribution we will find k* satisfying p(k*) ≥ p(k) for all k. Compare successive probabilities:

    P(X = k + 1) / P(X = k) = [e^{−λ} λ^{k+1}/(k + 1)!] / [e^{−λ} λ^k / k!] = λ/(k + 1).   (4)

The ratio (4) is at least 1 if and only if k ≤ λ − 1, so the probabilities increase while k ≤ λ − 1 and decrease afterwards. Consider the cases:

    i. If 0 < λ < 1, then λ/(k + 1) < 1 for every k ≥ 0, so P(X = 0) > P(X = 1) > P(X = 2) > · · · . Thus, the mode is 0.
    ii. If λ > 1 is not an integer, the probabilities increase up to k = ⌊λ⌋ and decrease afterwards. Thus, the mode is ⌊λ⌋.
    iii. If λ = m is an integer, then P(X = m)/P(X = m − 1) = λ/m = 1, so P(X = m) = P(X = m − 1), and the probabilities strictly decrease after m. Thus, both m and m − 1 are modes.

Therefore, mode = 0 if 0 < λ < 1; ⌊λ⌋ if λ > 1 is not an integer; λ and λ − 1 if λ is a positive integer.

(b) To find the mode of the binomial distribution we again compare successive probabilities:

    P(X = k + 1) / P(X = k) = [ n!/((k + 1)!(n − k − 1)!) · p^{k+1}(1 − p)^{n−k−1} ] / [ n!/(k!(n − k)!) · p^k (1 − p)^{n−k} ]
                            = p(n − k) / ((1 − p)(k + 1)).   (5)

The ratio (5) is at least 1 if and only if

    p(n − k) ≥ (1 − p)(k + 1)
    np − pk ≥ k + 1 − pk − p
    (n + 1)p − 1 ≥ k,

so the probabilities increase while k ≤ (n + 1)p − 1 and decrease afterwards. Consider the cases:

    i. If (n + 1)p − 1 is an integer, then equality holds in (5) at k = (n + 1)p − 1, so both (n + 1)p − 1 and (n + 1)p are modes.
    ii. If (n + 1)p − 1 is not an integer, then the mode is ⌊(n + 1)p⌋.

Therefore, mode = (n + 1)p − 1 and (n + 1)p if (n + 1)p is an integer; ⌊(n + 1)p⌋ otherwise.

3. Let X be a Poisson random variable with parameter λ. Determine the probability that X is odd.
Solution:

    P(X is odd) = P(X = 1) + P(X = 3) + P(X = 5) + · · ·
                = λ^1 e^{−λ}/1! + λ^3 e^{−λ}/3! + λ^5 e^{−λ}/5! + · · · , since X is Poisson
                = e^{−λ} [λ + λ^3/3! + λ^5/5! + · · ·]
                = e^{−λ} Σ_{i=0}^{∞} λ^{2i+1}/(2i + 1)!
    P(X is odd) = e^{−λ} sinh(λ).

Therefore, P(X is odd) = e^{−λ} sinh(λ) = (1 − e^{−2λ})/2.


5 Chapter 1 Section 4 Exercises


1. The lifetime, in years, of a certain class of light bulbs has an exponential distribution with parameter λ = 2.
What is the probability that a bulb selected at random from this class will last more than 1.5 years? What is
the probability that a bulb selected at random will last exactly 1.5 years?
Solution:
For the exponential distribution, f (x) = λe^{−λx} for x > 0 and f (x) = 0 for x ≤ 0. With λ = 2,

    P(X > 1.5) = 1 − P(X ≤ 1.5) = 1 − ∫_0^{1.5} 2e^{−2x} dx = 1 + [e^{−2x}]_0^{1.5}
               = 1 + (e^{−2(1.5)} − e^0) = 1 + (e^{−3} − 1) = e^{−3}

    P(X > 1.5) = 0.0498, the probability that a bulb selected at random will last more than 1.5 years.

Since X is a continuous random variable,

    P(X = 1.5) = ∫_{1.5}^{1.5} 2e^{−2x} dx = 0.

Therefore, P(X > 1.5) = 0.0498, the probability that a bulb selected at random will last more than 1.5 years, and
P(X = 1.5) = 0, the probability that a bulb selected at random will last exactly 1.5 years.

2. The median of a random variable X is any value a for which P(X ≤ a) ≥ 1/2 and P(X ≥ a) ≥ 1/2. Determine the
median of an exponentially distributed random variable with parameter λ. Compare the median to the mean.
Solution:
We need P(X ≤ a) ≥ 1/2 and P(X ≥ a) ≥ 1/2. First,

    P(X ≤ a) = ∫_0^a λe^{−λx} dx = [−e^{−λx}]_0^a = 1 − e^{−λa} ≥ 1/2
    ⇔ e^{−λa} ≤ 1/2 ⇔ −λa ≤ −log 2 ⇔ a ≥ (1/λ) log 2.

Similarly, P(X ≥ a) = e^{−λa} ≥ 1/2 ⇔ a ≤ (1/λ) log 2.
Thus, the median of the exponentially distributed random variable X is a = (1/λ) log 2.

The exponential distribution has mean 1/λ. Since log 2 ≈ 0.693 < 1, the median (1/λ) log 2 is strictly smaller than the mean 1/λ.
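A short numerical aside confirming that F(median) = 1/2 and that the median lies below the mean, for several arbitrary rates.

```python
from math import exp, log

def exp_cdf(x, lam):
    # F(x) = 1 - e^{-lam x} for x >= 0
    return 1.0 - exp(-lam * x)

def exp_median(lam):
    return log(2) / lam
```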

3. The lengths, in inches, of cotton fibers used in a certain mill are exponentially distributed random variables
with parameter λ. It is decided to convert all measurements in this mill to the metric system. Describe the
probability distribution of the length, in centimeters, of cotton fibers in this mill.


Solution:
Since 1 inch = 2.54 centimeters, the length in centimeters is Y = 2.54X, where X is the length in inches. Then P(Y > y) = P(X > y/2.54) = e^{−(λ/2.54)y}, so Y is again exponentially distributed, now with parameter λ/2.54. Thus λ → λ/2.54 describes the probability distribution of the length, in centimeters, of cotton fibers in this mill.
4. Twelve independent random variables, each uniformly distributed over the interval (0 , 1] are added, and 6 is
subtracted from the total. Determine the mean and the variance of the resulting random variable.
Solution:
Let x1 , x2 , . . . , x12 be independent random variables, each uniformly distributed over the interval (0, 1]. The resulting random variable is

    (x1 + x2 + · · · + x12 ) − 6 = Σ_{i=1}^{12} xi − 6.

Solving for the mean,

    E[Σ_{i=1}^{12} xi − 6] = Σ_{i=1}^{12} E[xi ] + E[−6] = 12(1/2) − 6 = 0,   (6)

since E[xi ] = 1/2 for each xi ~ U(0, 1].

To solve for the variance we need the following claims:

(a) Claim 1: (Σ_{i=1}^{12} xi )^2 consists of the 12 square terms xi^2 and the 66 cross terms 2xi xj , i ≠ j.
(b) Claim 2: E[xi^2 ] = 1/3 for each xi ~ U(0, 1].
(c) Claim 3: E[2xi xj ] = 1/2 for i ≠ j.

Proof:
(a) Claim 1: Expanding,

    (Σ_{i=1}^{12} xi )^2 = (x1 + x2 + · · · + x12 )(x1 + x2 + · · · + x12 ) = Σ_{i=1}^{12} xi^2 + Σ_{i<j} 2xi xj .

The square terms number 12, and the cross terms 2xi xj number 11 + 10 + · · · + 1 = (11)(12)/2 = 66.


 12 2
X 
Therefore,  xi  = 12 terms of xi2 + 66 terms of 2xi x j , i , j.
i=1
(b) Claim 2: Note that 1
h i Z 1 xi3 1 h i 1 d
E xi =
2
xi f (xi )dxi = = . Therefore, E xi2 = , ∀ xi ∼ U (0 , 1].
2
0 3 0
3 3
(c) Claim 3: Observe that
h i h i h i
E 2xi x j = 2E xi x j = 2E [xi ] E x j , since xi , x j are independent
! !
1 1 1 d
=2 , since E [xi ] = , ∀ xi ∼ U (0 , 1]
2 2 2
h i 1
E 2xi x j = .
2
h i 1
Therefore, E 2xi x j , i , j = .
2
Now,

X 12
2 
h i
xi   = E 12 terms of xi2 + 66 terms of 2xi x j , i , j , by Claim 1
 
E 
i=1
h i h i h i h i
= E 12 terms of xi2 + E 66 terms of 2xi x j , i , j = 12E xi2 + 66E 2xi x j , i , j
! !
1 1
= 12 + 66 , by Claim 2 and Claim 3
3 2

X12
2 
xi   = 4 + 33 = 37.
 
E  (7)
i=1

Finally
 12  
X 12
2   12 2 
 12
2 
X X X
xi − 6 = E  xi − 6 = E 
 
  
 
 
   
var  xi − 6  − E  xi − 6  − 0, by (6)
i=1 i=1 i=1 i=1

12
2 12
 
12
2   12 
X X   X X 
= E  xi + 36 = E  xi  + E [36]

 
 
 
  

 
 xi  − 12

 
 
 
 xi   − 12E 
 
i=1 i=1 i=1 i=1

12
 2   12
 !
X   X 1
= E  xi  + 36 = 37 − 12(12) + 36 = 37 − 72 + 36, by (7)

xi   − 12E 
i=1 i=1
2
 12 
X
xi − 6 = 1.

var 
i=1

Therefore, the mean = 0 and the variance = 1 of the resulting random variable.
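A simulation aside: this sum-of-twelve-uniforms construction is the classical rough standard-normal generator, and its sample moments land near 0 and 1. (The sketch samples from [0, 1) rather than (0, 1], which does not affect the moments.)

```python
import random

def simulate_mean_var(trials=200_000, seed=11):
    rng = random.Random(seed)
    # Sum of 12 uniforms minus 6, repeated many times
    vals = [sum(rng.random() for _ in range(12)) - 6.0 for _ in range(trials)]
    m = sum(vals) / trials
    v = sum((x - m) ** 2 for x in vals) / trials
    return m, v
```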


5. Let X and Y have the joint normal distribution described in equation (4.16). What value of α minimizes the
variance of Z = αX + (1 − α)Y? Simplify your result when X and Y are independent.
Solution:
The variance of Z = αX + (1 − α)Y is

    var(Z) = α^2 σx^2 + 2α(1 − α)ρσx σy + (1 − α)^2 σy^2.

To minimize var(Z), differentiate with respect to α and set the derivative equal to zero:

    (d/dα) var(Z) = 2ασx^2 + 2(1 − 2α)ρσx σy − 2(1 − α)σy^2 = 0
    ⇒ α(σx^2 − 2ρσx σy + σy^2 ) = σy^2 − ρσx σy
    ⇒ α* = (σy^2 − ρσx σy ) / (σx^2 − 2ρσx σy + σy^2 ).

The coefficient of α^2 in var(Z) is σx^2 − 2ρσx σy + σy^2 = var(X − Y) ≥ 0, so this critical point is a minimum (assuming X − Y is not degenerate). When X and Y are independent, ρ = 0 and the result simplifies to

    α* = σy^2 / (σx^2 + σy^2 ).
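A numerical sanity check of the minimizer (the values σx = 2, σy = 1, ρ = 0.3 are arbitrary): var(Z) evaluated near α* does not fall below var(Z) at α*, and the independent case reduces to σy²/(σx² + σy²).

```python
def var_z(alpha, sx, sy, rho):
    # var(alpha X + (1 - alpha) Y) for standard deviations sx, sy and correlation rho
    return (alpha ** 2 * sx ** 2
            + 2 * alpha * (1 - alpha) * rho * sx * sy
            + (1 - alpha) ** 2 * sy ** 2)

def alpha_star(sx, sy, rho):
    # Critical point of the quadratic in alpha
    return (sy ** 2 - rho * sx * sy) / (sx ** 2 - 2 * rho * sx * sy + sy ** 2)
```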

6 Chapter 1 Section 4 Problems


1. Evaluate the moment E[eλz ], where λ is an arbitrary real number and Z is a random variable following a
standard normal distribution, by integrating
Z +∞
λz 1 z2
E[e ] = eλz √ e− 2 dz.
−∞ 2π

Solution:


By completing the square, we have

    λz − z^2/2 = −(1/2)[z^2 − 2λz] = −(1/2)[z^2 − 2λz + λ^2 ] + λ^2/2 = −(1/2)(z − λ)^2 + λ^2/2.   (9)

Then

    E[e^{λz}] = ∫_{−∞}^{+∞} e^{λz} (1/√(2π)) e^{−z^2/2} dz = ∫_{−∞}^{+∞} (1/√(2π)) e^{λz − z^2/2} dz
             = ∫_{−∞}^{+∞} (1/√(2π)) e^{−(1/2)(z−λ)^2 + λ^2/2} dz, by (9)
             = e^{λ^2/2} ∫_{−∞}^{+∞} (1/√(2π)) e^{−(1/2)(z−λ)^2} dz
             = e^{λ^2/2} · 1, since the last integrand is the N(λ, 1) density
    E[e^{λz}] = e^{λ^2/2}.

Therefore, E[e^{λz}] = e^{λ^2/2}.
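A Monte Carlo aside estimating E[e^{λZ}] for an arbitrary λ and comparing with e^{λ²/2}; the sample size and seed are arbitrary.

```python
import random
from math import exp

def mgf_mc(lam, trials=400_000, seed=17):
    # Monte Carlo estimate of E[e^{lam Z}] for Z ~ N(0, 1)
    rng = random.Random(seed)
    return sum(exp(lam * rng.gauss(0.0, 1.0)) for _ in range(trials)) / trials
```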
1 2

2. Let W be an exponentially distributed random variable with parameter θ and mean µ = 1/θ.
(a) Determine P(W > µ).
(b) What is the mode of the distribution?

Solution:

(a)

    P(W > µ) = P(W > 1/θ) = 1 − P(W ≤ 1/θ) = 1 − ∫_0^{1/θ} θe^{−θx} dx
             = 1 + [e^{−θx}]_0^{1/θ} = 1 + (e^{−1} − e^0 ) = 1 + (e^{−1} − 1) = e^{−1}.

Therefore, P(W > µ) = 1/e.


(b) For a continuous distribution, the mode is a point at which the density is largest. The exponential density f (w) = θe^{−θw}, w ≥ 0, is strictly decreasing in w, so it attains its maximum at w = 0.
Thus, the mode of the exponential distribution is 0.
" #
1 1
3. Let X and Y be independent random variables uniformly distributed over the interval θ − , θ + for some
2 2
fixed θ. Show that W = X − Y has a distribution that is independent of θ with density function
1 + w for − 1 ≤ w < 0,



fW (w) = 

1 − w for 0 ≤ w ≤ 1,

for |w| > 1.


 0

Solution:
Note that fX (x) = 1 if x ∈ [θ − 1/2, θ + 1/2] and 0 otherwise, and fY (y) = 1 if y ∈ [θ − 1/2, θ + 1/2] and 0 otherwise. For x, y ∈ [θ − 1/2, θ + 1/2],

    w = x − y ∈ [min(x − y), max(x − y)] = [(θ − 1/2) − (θ + 1/2), (θ + 1/2) − (θ − 1/2)] = [−1, 1].

The distribution function FW (w) = P(X − Y ≤ w) can be interpreted as the area of the region {x − y ≤ w} inside the square [θ − 1/2, θ + 1/2] × [θ − 1/2, θ + 1/2], which has unit area. (The supporting figures are omitted.)

Then we have the following cases:

(a) Case 1: If −1 ≤ w < 0, the region {x − y ≤ w} inside the square is a right triangle with both legs of length 1 + w, so

    FW (w) = (1/2)(1 + w)^2, −1 ≤ w < 0.

(b) Case 2: If 0 ≤ w ≤ 1, the complementary region {x − y > w} inside the square is a right triangle with both legs of length 1 − w, so

    FW (w) = 1 − (1/2)(1 − w)^2, 0 ≤ w ≤ 1.

Thus,

    FW (w) = (1/2)(1 + w)^2 for −1 ≤ w < 0,  and  FW (w) = 1 − (1/2)(1 − w)^2 for 0 ≤ w ≤ 1;

in either case the θ's from the endpoints of the square cancel, so FW involves no θ.

Therefore,

    fW (w) = (d/dw) FW (w) = 1 + w for −1 ≤ w < 0; 1 − w for 0 ≤ w ≤ 1; 0 for |w| > 1,

and fW is independent of θ.


7 Chapter 1 Section 5 Exercises


1. Let X have a binomial distribution with parameters n = 4 and p = 1/4. Compute the probabilities P(X ≥ k) for
k = 1, 2, 3, 4, and sum these to verify that the mean of the distribution is 1.
Solution:
    P(X ≥ 4) = 1 − P(X < 4) = 1 − Σ_{x=0}^{3} b(x; 4, 0.25) = 1 − 0.9960938 = 0.0039062
    P(X ≥ 3) = 1 − P(X < 3) = 1 − Σ_{x=0}^{2} b(x; 4, 0.25) = 1 − 0.9492188 = 0.0507812
    P(X ≥ 2) = 1 − P(X < 2) = 1 − Σ_{x=0}^{1} b(x; 4, 0.25) = 1 − 0.7382812 = 0.2617188
    P(X ≥ 1) = 1 − P(X < 1) = 1 − b(0; 4, 0.25) = 1 − 0.3164062 = 0.6835938

Therefore,

    P(X ≥ 1) + P(X ≥ 2) + P(X ≥ 3) + P(X ≥ 4) = 0.6835938 + 0.2617188 + 0.0507812 + 0.0039062 = 1,

which agrees with the mean E[X] = np = 4(1/4) = 1.

2. A jar has four chips colored red, green, blue, and yellow. A person draws a chip, observes its color, and returns
it. Chips are now drawn repeatedly, without replacement, until the first chip drawn is selected again. What is
the mean number of draws required?
Solution:
Let X = the number of draws required. By the tail-sum formula for the mean of a nonnegative integer-valued random variable,

    E[X] = P(X > 0) + P(X > 1) + P(X > 2) + · · ·

Note that

    P(X > 0) = 1, since at least one draw is required;
    P(X > k) = (n − 1)/n · (n − 2)/(n − 1) · · · (n − k)/(n − k + 1) for k = 1, 2, 3, since the draws are without replacement;
    P(X > k) = 0 for k ≥ 4, since there are only four chips.

Thus, with n = 4,

    E[X] = P(X > 0) + P(X > 1) + P(X > 2) + P(X > 3)
         = 1 + 3/4 + (3/4)(2/3) + (3/4)(2/3)(1/2)
         = 1 + 3/4 + 1/2 + 1/4 = (4 + 3 + 2 + 1)/4 = 10/4 = 5/2.

Therefore, the mean number of draws is E[X] = 5/2.
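A Monte Carlo check of the 5/2 answer; the drawing order is simulated as a random permutation of the four chips, with the observed first chip chosen at random.

```python
import random

def mean_draws(n_chips=4, trials=100_000, seed=5):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        target = rng.randrange(n_chips)   # first chip, observed and returned
        order = list(range(n_chips))
        rng.shuffle(order)                # subsequent draws, without replacement
        total += order.index(target) + 1  # draws until the first chip reappears
    return total / trials
```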


3. Let X be an exponentially distributed random variable with parameter λ. Determine the mean of X

(a) by integrating by parts in the definition in equation (2.7) with m = 1;


(b) by integrating the upper tail probabilities in accordance with equation (5.3).

Which method do you find easier?


Solution:
(a) By equation (2.7) with m = 1,

    E[X] = ∫_0^∞ x f (x) dx = ∫_0^∞ x λe^{−λx} dx.

Integrating by parts with u = x, dv = λe^{−λx} dx (so du = dx, v = −e^{−λx}),

    E[X] = [−x e^{−λx}]_0^∞ + ∫_0^∞ e^{−λx} dx = 0 + [−(1/λ) e^{−λx}]_0^∞ = 1/λ.

(b) Since f (z) = λe^{−λz} for z > 0, the cumulative distribution function is

    F(z) = ∫_0^z λe^{−λx} dx = [−e^{−λx}]_0^z = 1 − e^{−λz}.

Thus, integrating the upper tail probabilities in accordance with equation (5.3),

    E[X] = ∫_0^∞ [1 − F(z)] dz = ∫_0^∞ e^{−λz} dz = [−(1/λ) e^{−λz}]_0^∞ = 1/λ.

Therefore, E[X] = 1/λ by either method; the tail-probability computation is the easier of the two, since it avoids integration by parts.
4. A system has two components: A and B. The operating times until failure of the two components are indepen-
dent and exponentially distributed random variables with parameters 2 for the component A, and 3 for B. The
system fails at the first component failure.

(a) What is the mean time to failure for component A? For component B?
(b) What is the mean time to system failure?
(c) What is the probability that it is component A that causes system failure?
(d) Suppose that it is component A that fails first. What is the mean remaining operating life of component
B?

Solution:
Let XA = the operating time of component A and XB = the operating time of component B. The two are independent and exponentially distributed with parameters λA = 2 and λB = 3, respectively.

(a) The mean times to failure are E[XA ] = 1/2 and E[XB ] = 1/3.

(b) Let Z = min{XA , XB }, the time of the first component failure. For z ≥ 0,

    P(Z ≥ z) = P(XA ≥ z) P(XB ≥ z) = e^{−2z} e^{−3z} = e^{−5z},

since XA and XB are independent. Hence Z is exponentially distributed with parameter λA + λB = 5, and the mean time to system failure is E[Z] = 1/5.

(c) Component A causes the system failure exactly when XA < XB :

    P(XA < XB ) = ∫_0^∞ P(XB > x) λA e^{−λA x} dx = ∫_0^∞ 2e^{−5x} dx = λA /(λA + λB ) = 2/5.

(d) By the memoryless property of the exponential distribution, given that A failed first, the remaining operating life of B is again exponentially distributed with parameter 3, so its mean is 1/3.
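A simulation aside for the system questions: the empirical mean of min{XA, XB} and the empirical frequency with which A fails first land near 1/5 and 2/5 respectively. The trial count and seed are arbitrary.

```python
import random

def simulate_system(lam_a=2.0, lam_b=3.0, trials=200_000, seed=9):
    rng = random.Random(seed)
    time_sum = 0.0
    a_first = 0
    for _ in range(trials):
        xa = rng.expovariate(lam_a)   # lifetime of component A
        xb = rng.expovariate(lam_b)   # lifetime of component B
        time_sum += min(xa, xb)       # system fails at the first component failure
        a_first += xa < xb
    return time_sum / trials, a_first / trials
```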
5. Consider a post office with two clerks. John, Paul, and Naomi enter simultaneously. John and Paul go directly
to the clerks, while Naomi must wait until either John or Paul is finished before she begins service.

(a) If all of the service times are independent exponentially distributed random variables with the same mean
1/λ, what is the probability that Naomi is still in the post office after the other two have left?
(b) How does your answer change if the two clerks have different service rates, say λ1 = 3 and λ2 = 47?

(c) The mean time that Naomi spends in the post office is less than that for John or Paul provided that
max{λ1 , λ2 } > c min{λ1 , λ2 } for a certain constant c. What is the value of this constant?

Solution:
Let X = the service time of John with parameter: λ1 and Y = the service time of Paul with parameter: λ2 . Since
X and Y are independent, their joint density is:

fX,Y (x, y) = fX (x) fY (y) = λ1 e−λ1 x · λ2 e−λ2 y = λ1 λ2 e−(λ1 x+λ2 y)

defined on nonnegative reals.


We assumed WLOG that John went to clerk 1 and Paul went to clerk 2. Then the probability that Naomi is
still waiting when both have left is

P (Naomi is last) = P (Z > |X − Y|) = P ((Z > (X − Y)) ∩ (X > Y)) + P ((Z > (Y − X)) ∩ (X ≤ Y)) .

24
AMS 623 Stochastic Processes Submitted by PEREZ, Kenneth P.

Considering the first term: on the event {X > Y}, Naomi is served at clerk 2, so by the memoryless property Z is exponential with rate λ2 and P(Z > t) = e−λ2 t . Hence,

P ((Z > (X − Y)) ∩ (X > Y)) = ∫0∞ ∫y∞ e−λ2 (x−y) fX,Y (x, y) dx dy
                            = ∫0∞ ∫y∞ e−λ2 (x−y) λ1 λ2 e−(λ1 x+λ2 y) dx dy
                            = λ1 λ2 ∫0∞ ∫y∞ e−(λ1 +λ2 )x dx dy
                            = (λ1 λ2 /(λ1 + λ2 )) ∫0∞ e−(λ1 +λ2 )y dy
                            = λ1 λ2 /(λ1 + λ2 )².

By symmetry (on {X ≤ Y}, Naomi is served at clerk 1 with rate λ1 ),

P ((Z > (Y − X)) ∩ (X ≤ Y)) = λ1 λ2 /(λ1 + λ2 )².

Thus,

P (Naomi is last) = P (Z > |X − Y|) = λ1 λ2 /(λ1 + λ2 )² + λ1 λ2 /(λ1 + λ2 )² = 2λ1 λ2 /(λ1 + λ2 )².
If X and Y have the same parameter λ, then

P (Naomi is last) = P (Z > |X − Y|) = 2λ1 λ2 /(λ1 + λ2 )² = 2λ²/(2λ)² = 2λ²/(4λ²) = 1/2.

Therefore, P (Naomi is last) = 1/2.
If λ1 = 3 and λ2 = 47, then

P (Naomi is last) = P (Z > |X − Y|) = 2λ1 λ2 /(λ1 + λ2 )² = 2(3)(47)/(3 + 47)² = 282/2500 = 0.1128.

Therefore, P (Naomi is last) = 282/2500 = 0.1128.
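The closed form 2λ1 λ2 /(λ1 + λ2 )² can be spot-checked by simulation. The sketch below (function name and parameters are illustrative) replays the race: Naomi takes over whichever clerk frees up first and must outlast the remaining customer's residual service time:

```python
import random

def p_naomi_last(lam1, lam2, n=200_000, seed=2):
    """Monte Carlo estimate of the probability that Naomi leaves last."""
    rng = random.Random(seed)
    last = 0
    for _ in range(n):
        x = rng.expovariate(lam1)  # John's service time at clerk 1
        y = rng.expovariate(lam2)  # Paul's service time at clerk 2
        # Naomi is served by whichever clerk frees up first
        z = rng.expovariate(lam2 if x > y else lam1)
        if z > abs(x - y):
            last += 1
    return last / n
```

For equal rates the estimate should be near 1/2, and for λ1 = 3, λ2 = 47 near 0.1128.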


For part (c), let T denote the total time Naomi spends in the post office and let W = X − Y. Note that P(W > 0) = λ2 /(λ1 + λ2 ) and P(W < 0) = λ1 /(λ1 + λ2 ), and that the elapsed time min{X, Y} is exponential with rate λ1 + λ2 independently of which clerk finishes first. Then

E[T ] = E[T | W > 0] P(W > 0) + E[T | W < 0] P(W < 0)
      = E[Y + Z | W > 0] P(W > 0) + E[X + Z | W < 0] P(W < 0)
      = [E(Y | W > 0) + E(Z | W > 0)] P(W > 0) + [E(X | W < 0) + E(Z | W < 0)] P(W < 0)
      = (1/(λ1 + λ2 ) + 1/λ2 ) · λ2 /(λ1 + λ2 ) + (1/(λ1 + λ2 ) + 1/λ1 ) · λ1 /(λ1 + λ2 )
      = (λ1 + 2λ2 )/(λ1 + λ2 )² + (2λ1 + λ2 )/(λ1 + λ2 )²
      = 3(λ1 + λ2 )/(λ1 + λ2 )²
      = 3/(λ1 + λ2 ).
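The identity E[T ] = 3/(λ1 + λ2 ) can likewise be checked by simulating Naomi's waiting time plus her own service (a sketch with illustrative names):

```python
import random

def mean_naomi_time(lam1, lam2, n=200_000, seed=3):
    """Monte Carlo estimate of Naomi's mean total time in the post office."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(lam1)  # John's service time at clerk 1
        y = rng.expovariate(lam2)  # Paul's service time at clerk 2
        # Naomi waits min{x, y}, then is served at the clerk that freed up
        z = rng.expovariate(lam2 if x > y else lam1)
        total += min(x, y) + z
    return total / n
```

For λ1 = 2, λ2 = 3 the estimate should be close to 3/(2 + 3) = 0.6.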
The mean time that Naomi spends in the post office is less than that for John or Paul (each of whom spends on average (1/2)(1/λ1 + 1/λ2 ), averaging over the two clerks) provided

3/(λ1 + λ2 ) < (1/2)(1/λ1 + 1/λ2 ) = (λ1 + λ2 )/(2λ1 λ2 )
6λ1 λ2 < (λ1 + λ2 )²
6λ1 λ2 < λ1 ² + 2λ1 λ2 + λ2 ²
4λ1 λ2 < λ1 ² + λ2 ².

WLOG let λ1 = min{λ1 , λ2 } and λ2 = max{λ1 , λ2 }, and set r = λ2 /λ1 ≥ 1. Dividing 4λ1 λ2 < λ1 ² + λ2 ² by λ1 ² gives

4r < 1 + r², that is, r² − 4r + 1 > 0.

The roots of r² − 4r + 1 = 0 are r = 2 ± √3, so for r ≥ 1 the inequality holds exactly when r > 2 + √3, i.e., when

max{λ1 , λ2 } > (2 + √3) min{λ1 , λ2 }.

Therefore, the value of the constant is c = 2 + √3 ≈ 3.73.
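A small numeric check of the threshold (the helper name is illustrative): the inequality 3/(λ1 + λ2 ) < (1/2)(1/λ1 + 1/λ2 ) should flip exactly at the ratio 2 + √3:

```python
import math

c = 2 + math.sqrt(3)  # the threshold ratio, about 3.732

def naomi_faster(lam1, lam2):
    """True when Naomi's mean time 3/(l1+l2) beats (1/2)(1/l1 + 1/l2)."""
    return 3 / (lam1 + lam2) < 0.5 * (1 / lam1 + 1 / lam2)

# Just above the threshold the inequality holds; just below, it fails:
# naomi_faster(1, c + 0.01) and not naomi_faster(1, c - 0.01)
```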


8 Chapter 1 Section 5 Problems


1. Let X1 , X2 , . . . , Xn be independent random variables, all exponentially distributed with the same parameter λ.
Determine the distribution function for the minimum Z = min{X1 , . . . , Xn }.
Solution:
Let Z = min{X1 , . . . , Xn }. Now,

P(Z > z) = P(X1 > z, X2 > z, . . . , Xn > z)
         = P(X1 > z)P(X2 > z) · · · P(Xn > z), by independence. (11)

Hence, for z ≥ 0,

FZ (z) = P(Z ≤ z)
       = 1 − P(Z > z)
       = 1 − P(X1 > z)P(X2 > z) · · · P(Xn > z), by (11)
       = 1 − [1 − FX1 (z)][1 − FX2 (z)] · · · [1 − FXn (z)]
       = 1 − e−λz · e−λz · · · e−λz
       = 1 − e−nλz .

Taking the derivative gives the density,

fZ (z) = d/dz FZ (z) = d/dz [1 − e−nλz ] = nλ e−nλz , z ≥ 0.

Therefore, Z is exponentially distributed with parameter nλ, with distribution function FZ (z) = 1 − e−nλz .
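As a sanity check, the empirical CDF of the minimum of n simulated exponentials can be compared against 1 − e−nλz (a sketch; the function names are illustrative):

```python
import math
import random

def cdf_min_exp(z, n, lam):
    """F_Z(z) = 1 - exp(-n*lam*z) for Z = min of n iid Exp(lam)."""
    return 1 - math.exp(-n * lam * z)

def empirical_cdf_min(z, n, lam, trials=100_000, seed=4):
    """Fraction of simulated minima that fall at or below z."""
    rng = random.Random(seed)
    hits = sum(
        min(rng.expovariate(lam) for _ in range(n)) <= z
        for _ in range(trials)
    )
    return hits / trials
```

For example, with n = 5 and λ = 2 the two values at z = 0.1 should agree to about two decimal places.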


9 Chapter 2 Section 1 Exercises


1. I roll a six-sided die and observe the number N on the uppermost face. I then toss a fair coin N times and
observe X, the total number of heads to appear. What is the probability that N = 3 and X = 2? What is the
probability that X = 5? What is E[X], the expected number of heads to appear?
Solution:
The probability that N = 3 and X = 2 is

P (N = 3, X = 2) = P (X = 2 | N = 3) P (N = 3) = (3/8) · (1/6) = 1/16.

Therefore, P (N = 3, X = 2) = 1/16.
The probability that X = 5 is

P (X = 5) = P (N = 5, X = 5) + P (N = 6, X = 5) , since at least five tosses are needed for five heads
          = (1/6)(1/32) + (1/6)(6/64) = 1/192 + 1/64 = (1 + 3)/192 = 4/192 = 1/48.

Therefore, P (X = 5) = 1/48.
Finally, since X given N = n is binomial(n, 1/2), E[X | N] = N/2 and

E[X] = E[E[X | N]] = E[N/2] = (1/2)(7/2) = 7/4.
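These answers can be reproduced with exact rational arithmetic (a sketch; the function name is illustrative):

```python
from fractions import Fraction
from math import comb

def p_heads(x):
    """P(X = x): average the Binomial(n, 1/2) pmf over the die roll n."""
    return sum(
        Fraction(comb(n, x), 2**n) * Fraction(1, 6)
        for n in range(1, 7)
        if x <= n
    )

# p_heads(5) == Fraction(1, 48)
ex = sum(x * p_heads(x) for x in range(7))  # E[X], should equal 7/4
```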
2. Four nickels and six dimes are tossed, and the total number N of heads is observed. If N = 4, what is the
conditional probability that exactly two of the nickels were heads?
Solution:
The conditional probability that exactly two of the nickels were heads is

C(4, 2) C(6, 2) / C(10, 4) = 6(15)/210 = 90/210 = 3/7.

Therefore, the conditional probability is 3/7.
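The same count can be verified directly with binomial coefficients:

```python
from fractions import Fraction
from math import comb

# Given N = 4 heads among 4 nickels + 6 dimes (all fair), the heads are
# equally likely to be any 4 of the 10 coins, so the count of nickel-heads
# is hypergeometric.
p = Fraction(comb(4, 2) * comb(6, 2), comb(10, 4))
# p == Fraction(3, 7)
```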
3. A poker hand of five cards is dealt from a normal deck of 52 cards. Let X be the number of aces in the hand.
Determine P (X > 1 | X ≥ 1) . This is the probability that the hand contains more than one ace, given that it
has at least one ace. Compare this with the probability that the hand contains more than one ace, given that it
contains the ace of spades.
Solution:
The number of aces X in a 5-card hand is a case of sampling without replacement: X follows the hypergeometric distribution

P (X = x) = C(4, x) C(48, 5 − x) / C(52, 5),    x = 0, 1, 2, 3, 4,

where 4 is the number of aces and 48 the number of non-aces in the 52-card deck.

The probability that the hand has no ace is

P(X = 0) = C(4, 0) C(48, 5) / C(52, 5) = 35673/54145.

Thus, the probability of having at least one ace is

P(X ≥ 1) = 1 − P(X = 0) = 1 − 35673/54145 = 18472/54145. Hence,

P(X > 1 | X ≥ 1) = P(X = 2 | X ≥ 1) + P(X = 3 | X ≥ 1) + P(X = 4 | X ≥ 1)
                 = [P(X = 2) + P(X = 3) + P(X = 4)] / P(X ≥ 1)
                 = (54145/18472) · [2162/54145 + 94/54145 + 1/54145]
                 = 2257/18472 = 0.122184.

Therefore, P(X > 1 | X ≥ 1) = 0.122184.
For comparison, given that the hand contains the ace of spades, the other four cards are drawn from the remaining 51, so

P(X > 1 | hand contains the ace of spades) = 1 − C(48, 4)/C(51, 4) = 1 − 194580/249900 ≈ 0.2214,

which is nearly twice as large as 0.122184.
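Both conditional probabilities, including the ace-of-spades comparison the problem asks for, can be computed exactly (a sketch; the names are illustrative):

```python
from fractions import Fraction
from math import comb

def p_aces(x):
    """Hypergeometric pmf: x aces in a 5-card hand from a 52-card deck."""
    return Fraction(comb(4, x) * comb(48, 5 - x), comb(52, 5))

p_at_least_one = 1 - p_aces(0)
p_more_given_one = sum(p_aces(x) for x in (2, 3, 4)) / p_at_least_one
# p_more_given_one == Fraction(2257, 18472), about 0.1222

# Comparison case: given the ace of spades, the other 4 cards come from 51
p_more_given_spade = 1 - Fraction(comb(48, 4), comb(51, 4))  # about 0.2214
```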

4. A six-sided die is rolled, and the number N on the uppermost face is recorded. From a jar containing 10 tags
numbered 1, 2, . . . , 10 we then select N tags at random without replacement. Let X be the smallest number on
the drawn tags. Determine P(X=2).
Solution:
We want to obtain a general expression for the probability distribution of the random variable X representing
the minimum number drawn among N = n numbers chosen from {1, 2, . . . , 10} without replacement. Observe
that if the minimum number is X = x, then the remaining n − 1 numbers must be drawn from the set {x + 1, x +
2, . . . , 10}. Thus the desired probability is given by
P(X = x | N = n) = C(10 − x, n − 1) / C(10, n),    x = 1, 2, . . . , 11 − n,

where N is discrete uniform on the set {1, 2, . . . , 6}. Since C(8, n − 1)/C(10, n) = n(10 − n)/90, this gives

P(X = 2) = Σn=1..6 P(X = 2 | N = n) P(N = n) = Σn=1..6 [C(8, n − 1)/C(10, n)] · (1/6) = (1/540) Σn=1..6 n(10 − n) = 119/540 = 0.2204.

Therefore, P(X = 2) = 0.2204.
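The sum can be reproduced exactly (a sketch; the function name is illustrative):

```python
from fractions import Fraction
from math import comb

def p_min_tag(x):
    """P(X = x): average C(10-x, n-1)/C(10, n) over the die roll N = n."""
    return sum(
        Fraction(comb(10 - x, n - 1), comb(10, n)) * Fraction(1, 6)
        for n in range(1, 7)
        if n <= 11 - x
    )

# p_min_tag(2) == Fraction(119, 540), about 0.2204
```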


5. Let X be a Poisson random variable with parameter λ. Find the conditional mean of X given that X is odd.
Solution:
The probability that X is odd is

P(X is odd) = e−λ sinh(λ) (see Chapter 1 Section 3 Problem 3).

Thus, for odd n, the conditional probability mass function of X given that X is odd is

P(X = n | X is odd) = P(X = n) / P(X is odd) = (λn e−λ /n!) / (e−λ sinh(λ)) = λn / (n! sinh(λ)).

Therefore, the conditional mean of X given that X is odd is

E[X | X is odd] = 1 · λ/(1! sinh(λ)) + 3 · λ3 /(3! sinh(λ)) + 5 · λ5 /(5! sinh(λ)) + · · ·
                = (λ/ sinh(λ)) [1 + λ2 /2! + λ4 /4! + · · · ]
                = (λ/ sinh(λ)) cosh(λ) = λ coth(λ) = λ (eλ + e−λ )/(eλ − e−λ ).

Therefore, E[X | X is odd] = λ (eλ + e−λ )/(eλ − e−λ ).
