where Bc is the complementary event to B. (That is, Bc occurs if and only if B does not occur.)
Proof :
To prove that P(A) = P(A ∩ B) + P(A ∩ Bc ) we need the following:
(a) Claim 1: A = (A ∩ B) ∪ (A ∩ Bc ).
(b) Claim 2: (A ∩ B) ∩ (A ∩ Bc ) = ∅.
(c) Definition 1.1 Let (Ω, A) be a measurable space, with A the σ-algebra of events; then the function P(·) : A → [0, 1] is a probability function or probability measure if and only if the following hold:
i. P(Ac) = 1 − P(A), ∀ A ∈ A
ii. P(A1 ∪ A2) = P(A1) + P(A2) if A1 ∩ A2 = ∅
Claim 1: A = (A ∩ B) ∪ (A ∩ Bc ).
Proof :
Note: Two sets A and B are equal if and only if A ⊆ B and B ⊆ A. Hence, to show that
A = (A ∩ B) ∪ (A ∩ Bc ), we need to show the subclaims: (A ∩ B) ∪ (A ∩ Bc ) ⊆ A and A ⊆ (A ∩ B) ∪ (A ∩ Bc ).
Subclaim 1: (A ∩ B) ∪ (A ∩ Bc ) ⊆ A.
Let x ∈ (A ∩ B) ∪ (A ∩ Bc ).
⇒ x ∈ A ∩ B or x ∈ A ∩ Bc
⇒ (x ∈ A and x ∈ B) or (x ∈ A and x ∈ Bc)
⇒ x ∈ A or x ∈ A
⇒ x ∈ A.
Therefore, (A ∩ B) ∪ (A ∩ Bc ) ⊆ A.
Note: For any set B with universal set Ω, the following hold,
1. Ω = B ∪ Bc
2. A = A ∩ Ω
Thus, A = A ∩ Ω = A ∩ (B ∪ Bc).
Subclaim 2: A ⊆ (A ∩ B) ∪ (A ∩ Bc).
Let x ∈ A.
⇒ x ∈ A ∩ (B ∪ Bc), since A = A ∩ (B ∪ Bc)
⇒ x ∈ A and (x ∈ B or x ∈ Bc)
⇒ (x ∈ A and x ∈ B) or (x ∈ A and x ∈ Bc)
⇒ x ∈ A ∩ B or x ∈ A ∩ Bc
⇒ x ∈ (A ∩ B) ∪ (A ∩ Bc)
AMS 623 Stochastic Processes Submitted by PEREZ, Kenneth P.
Therefore, A ⊆ (A ∩ B) ∪ (A ∩ Bc ).
Thus, (A ∩ B) ∪ (A ∩ Bc ) = A.
Claim 2: (A ∩ B) ∩ (A ∩ Bc ) = ∅.
Proof :
Since ∅ ⊆ (A ∩ B) ∩ (A ∩ Bc ) it remains to show that (A ∩ B) ∩ (A ∩ Bc ) ⊆ ∅. Now, let x ∈ (A ∩ B) ∩ (A ∩ Bc ).
⇒ x ∈ A ∩ B and x ∈ A ∩ Bc
⇒ x ∈ A and x ∈ B and x ∈ A and x ∈ Bc
⇒ x ∈ B and x ∈ Bc , a contradiction
Therefore, (A ∩ B) ∩ (A ∩ Bc ) = ∅.
Now,

P(A) = P [(A ∩ B) ∪ (A ∩ Bc)] , by Claim 1
= P(A ∩ B) + P(A ∩ Bc), by Claim 2 and Definition 1.1(ii).

Therefore, P(A) = P(A ∩ B) + P(A ∩ Bc).
Proof :
To prove that P(A ∪ B) = P(A) + P(B) − P(A ∩ B) we need the following:
(a) Claim 1: A ∪ B = A ∪ (B ∩ Ac ).
(b) Claim 2: A ∩ (B ∩ Ac ) = ∅.
(c) Claim 3: P(B ∩ Ac ) = P(B) − P(A ∩ B).
(d) Definition 1.2 Let (Ω, A) be a measurable space, with A the σ-algebra of events; then the function P(·) : A → [0, 1] is a probability function or probability measure if and only if the following hold:
i. P(Ac) = 1 − P(A), ∀ A ∈ A
ii. P(A1 ∪ A2) = P(A1) + P(A2) if A1 ∩ A2 = ∅
Claim 1: A ∪ B = A ∪ (B ∩ Ac ).
Proof :
To show that A∪ B = A∪(B∩Ac ), we need to prove the subclaims: A∪ B ⊆ A∪(B∩Ac ) and A∪(B∩Ac ) ⊆ A∪ B.
Subclaim 1: A ∪ (B ∩ Ac ) ⊆ A ∪ B.
Let x ∈ A ∪ (B ∩ Ac ).
⇒ x ∈ A or x ∈ B ∩ Ac
⇒ x ∈ A or (x ∈ B and x ∈ Ac)
⇒ x ∈ A or x ∈ B
⇒ x ∈ A ∪ B
Therefore, A ∪ (B ∩ Ac ) ⊆ A ∪ B.
Note: For any sets C and D, the following hold,
1. C = U ∩ C, U = universal set,
2. U = D ∪ Dc
Thus, A ∪ B = U ∩ (A ∪ B) = (A ∪ Ac ) ∩ (A ∪ B) = (A ∪ Ac ) ∩ (B ∪ A).
Subclaim 2: A ∪ B ⊆ A ∪ (B ∩ Ac ).
Let x ∈ A ∪ B.
⇒ x ∈ (A ∪ Ac) ∩ (B ∪ A), since A ∪ B = (A ∪ Ac) ∩ (B ∪ A)
⇒ x ∈ (A ∪ Ac) and x ∈ (B ∪ A)
⇒ (x ∈ A or x ∈ Ac) and (x ∈ B or x ∈ A)
⇒ x ∈ A or (x ∈ Ac and x ∈ B), by the distributive law
⇒ x ∈ A or x ∈ B ∩ Ac
⇒ x ∈ A ∪ (B ∩ Ac)
Therefore, A ∪ B ⊆ A ∪ (B ∩ Ac ).
Thus, A ∪ B = A ∪ (B ∩ Ac ).
Claim 2: A ∩ (B ∩ Ac ) = ∅.
Proof :
Since ∅ ⊆ A ∩ (B ∩ Ac ) it remains to show that A ∩ (B ∩ Ac ) ⊆ ∅. Now, let x ∈ A ∩ (B ∩ Ac ).
⇒ x ∈ A and (x ∈ B ∩ Ac )
⇒ x ∈ A and x ∈ B and x ∈ Ac
⇒ x ∈ A and x ∈ Ac , a contradiction
Therefore, A ∩ (B ∩ Ac ) = ∅.
Claim 3: P(B ∩ Ac ) = P(B) − P(A ∩ B).
Proof :
By interchanging the roles of A and B in problem no. 1, we have

P(B) = P(B ∩ A) + P(B ∩ Ac),

so that P(B ∩ Ac) = P(B) − P(A ∩ B).
Now,
P(A ∪ B) = P [A ∪ (B ∩ Ac )] , by Claim 1
= P(A) + P(B ∩ Ac ), by Claim 2 and Definition 1.2(ii)
P(A ∪ B) = P(A) + P(B) − P(A ∩ B), by Claim 3.
(b) Determine the corresponding density function f (x) in the three regions (i) x ≤ 0, (ii) 0 < x < 1, and (iii)
1 ≤ x.
(c) What is the mean of the distribution?
(d) If X is a random variable following the distribution specified in (a), evaluate P(1/4 ≤ X ≤ 3/4).
Solution:
(a)
(b) Since f(x) = d/dx F(x), thus

f(x) = 0 for x ≤ 0,
f(x) = 3x² for 0 < x < 1,
f(x) = 0 for x ≥ 1.

(c) The mean is E[X] = ∫₀¹ x · 3x² dx = 3/4.

Therefore, the mean of the distribution is E[X] = 3/4.
(d)

P(1/4 ≤ X ≤ 3/4) = F(3/4) − F(1/4) = (3/4)³ − (1/4)³ = 27/64 − 1/64 = 26/64.

Therefore, P(1/4 ≤ X ≤ 3/4) = 26/64 = 13/32.
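As a quick numeric sanity check of parts (c) and (d), assuming (as parts (b) and (d) imply) that F(x) = x³ on (0, 1):

```python
# Sanity check assuming F(x) = x^3 on (0, 1); the names here are illustrative.
F = lambda x: x ** 3  # cumulative distribution function on (0, 1)

# Part (d): P(1/4 <= X <= 3/4) = F(3/4) - F(1/4)
prob = F(0.75) - F(0.25)

# Part (c): E[X] = integral of x * f(x) = x * 3x^2 over (0, 1),
# approximated by a midpoint Riemann sum
n = 100_000
mean = sum(((i + 0.5) / n) * 3 * ((i + 0.5) / n) ** 2 for i in range(n)) / n
```

Both values agree with the hand computation: `prob` is 26/64 and `mean` is 3/4 up to the quadrature error.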
4. Let Z be a discrete random variable having possible values 0, 1, 2 and 3 and probability mass function
p(0) = 1/4, p(1) = 1/2, p(2) = 1/8, p(3) = 1/8.
(a) Plot the corresponding distribution function.
(b) Determine the mean E[Z].
(c) Evaluate the variance Var[Z].
Solution:
(a)
(b) The mean is

E[Z] = Σ_{i=1}^{4} x_i · p(x_i) = 0 · (1/4) + 1 · (1/2) + 2 · (1/8) + 3 · (1/8) = 1/2 + 2/8 + 3/8 = (4 + 2 + 3)/8 = 9/8.

Therefore, E[Z] = 9/8.
(c) The second moment is E[Z²] = 0 · (1/4) + 1 · (1/2) + 4 · (1/8) + 9 · (1/8) = 17/8, so

Var[Z] = E[Z²] − (E[Z])² = 17/8 − (9/8)² = 136/64 − 81/64 = 55/64.

Therefore, Var[Z] = 55/64.
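A direct check of the mean and variance from the pmf:

```python
# Check of E[Z] = 9/8 and Var[Z] = 55/64 for the pmf of problem 4.
pmf = {0: 1 / 4, 1: 1 / 2, 2: 1 / 8, 3: 1 / 8}

mean = sum(z * p for z, p in pmf.items())            # E[Z]
second = sum(z ** 2 * p for z, p in pmf.items())     # E[Z^2]
var = second - mean ** 2                             # Var[Z] = E[Z^2] - E[Z]^2
```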
5. Let A, B, and C be arbitrary events. Establish the addition law

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

Proof :
Let D = B ∪ C; then, applying the two-event addition law three times, we have

P(A ∪ B ∪ C) = P(A ∪ D) = P(A) + P(D) − P(A ∩ D),
P(D) = P(B) + P(C) − P(B ∩ C),
P(A ∩ D) = P[(A ∩ B) ∪ (A ∩ C)] = P(A ∩ B) + P(A ∩ C) − P(A ∩ B ∩ C).

Therefore, P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).
Therefore, E[N] = 1.
2. Let N cards carry the distinct numbers x1, . . . , xN. If two cards are drawn at random without replacement, show that the correlation coefficient ρ between the numbers appearing on the two cards is −1/(N − 1).
Solution:
Let C1 = random variable that represents the number on the first card;
C2 = random variable that represents the number on the second card.
Write s = (Σ_{i=1}^{N} x_i)² and t = Σ_{i=1}^{N} x_i². Then,

E[C1] = E[C2] = (1/N) Σ_{i=1}^{N} x_i,
E[C1²] = E[C2²] = (1/N) Σ_{i=1}^{N} x_i² = t/N,
E[C1C2] = (1/(N(N − 1))) Σ_{i ≠ j} x_i x_j = ((Σ_i x_i)² − Σ_i x_i²)/(N(N − 1)) = (s − t)/(N(N − 1)).

Hence,

ρ = (E[C1C2] − E[C1]E[C2]) / √( (E[C1²] − E[C1]²)(E[C2²] − E[C2]²) )
= ( (s − t)/(N(N − 1)) − s/N² ) / ( t/N − s/N² )
= ( (N(s − t) − (N − 1)s) / (N²(N − 1)) ) · ( N² / (Nt − s) )
= (Ns − Nt − Ns + s) / ((N − 1)(Nt − s))
= −(Nt − s) / ((N − 1)(Nt − s))

ρ = −1/(N − 1).

Therefore, ρ = −1/(N − 1).
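The identity can be checked exactly by enumerating all equally likely ordered pairs of distinct cards; the card values below are arbitrary distinct numbers:

```python
# Exact check of rho = -1/(N-1) by enumerating all ordered draws
# without replacement.  The values in xs are arbitrary distinct numbers.
from itertools import permutations
from math import sqrt

xs = [1, 5, 7, 10]
N = len(xs)
pairs = list(permutations(xs, 2))  # equally likely ordered pairs, no replacement

e1 = sum(a for a, b in pairs) / len(pairs)     # E[C1]
e2 = sum(b for a, b in pairs) / len(pairs)     # E[C2]
e11 = sum(a * a for a, b in pairs) / len(pairs)  # E[C1^2]
e22 = sum(b * b for a, b in pairs) / len(pairs)  # E[C2^2]
e12 = sum(a * b for a, b in pairs) / len(pairs)  # E[C1 C2]

rho = (e12 - e1 * e2) / sqrt((e11 - e1 ** 2) * (e22 - e2 ** 2))
```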
3. A population having N distinct elements is sampled with replacement. Because of repetitions, a random sample of size r may contain fewer than r distinct elements. Let S_r be the sample size necessary to get r distinct elements. Show that

E[S_r] = N (1/N + 1/(N − 1) + · · · + 1/(N − r + 1)).
Solution:
Suppose i distinct elements have already been obtained. Each further draw then produces a new element with probability p_i = (N − i)/N, so the number of additional draws needed for the next new element is geometrically distributed with mean 1/p_i = N/(N − i). Summing these expected waiting times over i = 0, 1, . . . , r − 1,

E[S_r] = Σ_{i=0}^{r−1} N/(N − i) = N Σ_{i=0}^{r−1} 1/(N − i) = N (1/N + 1/(N − 1) + · · · + 1/(N − (r − 1))).

Therefore, E[S_r] = N (1/N + 1/(N − 1) + · · · + 1/(N − r + 1)).
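A Monte Carlo check of the formula; N, r, the seed, and the trial count are arbitrary choices for the test:

```python
# Monte Carlo check of E[S_r] = N(1/N + 1/(N-1) + ... + 1/(N-r+1)).
import random

random.seed(0)
N, r, trials = 10, 5, 20_000

def draws_until_r_distinct(N, r):
    """Sample with replacement until r distinct elements have appeared."""
    seen, count = set(), 0
    while len(seen) < r:
        seen.add(random.randrange(N))
        count += 1
    return count

simulated = sum(draws_until_r_distinct(N, r) for _ in range(trials)) / trials
exact = N * sum(1 / (N - i) for i in range(r))
```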
Definition 3.1 Binomial Distribution. If a binomial trial can result in a success with probability p and a failure
with probability q = 1 − p, then the probability distribution of the binomial random variable X, the number of
successes in n independent trials is
b(x; n, p) = (n choose x) p^x q^(n−x), for x = 0, 1, 2, . . . , n.
Now,

P({0 defective}) = b(0; 10, 0.05) = (10 choose 0)(0.05)^0 (0.95)^10 = (0.95)^10 = 0.5987;
P({1 defective}) = b(1; 10, 0.05) = (10 choose 1)(0.05)^1 (0.95)^9 = 10(0.05)(0.95)^9 = 0.3151;
P({0 or 1 defective}) = P({0 defective}) + P({1 defective}) = 0.5987 + 0.3151 = 0.9139.

Therefore, P({0 defective}) = 0.5987, P({1 defective}) = 0.3151, and P({0 or 1 defective}) = 0.9139.
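These binomial values can be confirmed directly:

```python
# Check of the binomial probabilities with p = 0.05, n = 10.
from math import comb

p, n = 0.05, 10
b = lambda x: comb(n, x) * p ** x * (1 - p) ** (n - x)

p0 = b(0)        # P(0 defective)
p1 = b(1)        # P(1 defective)
p01 = p0 + p1    # P(0 or 1 defective)
```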
3. A fraction p = 0.05 of the items coming off a production process are defective. The output of the process is sampled, one by one, in a random manner. What is the probability that the first defective item found is the tenth item sampled?
Solution:
The problem involves a negative binomial distribution.
Definition 3.2 Negative Binomial Distribution. If repeated independent trials can result in a success with
probability p and a failure with probability q = 1 − p, then the probability distribution of the random variable
X, the number of the trial on which the kth success occurs is given by
b*(x; k, p) = (x − 1 choose k − 1) p^k q^(x−k), for x = k, k + 1, k + 2, . . .

Now, P(N = 10) = b*(10; 1, 0.05) = (10 − 1 choose 1 − 1)(0.05)^1 (0.95)^(10−1) = (9 choose 0)(0.05)(0.95)^9

P(N = 10) = 0.0315.
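Since k = 1 here, this is just the geometric probability p q^9, which a one-line check confirms:

```python
# P(first defective is the tenth item): geometric with p = 0.05.
p = 0.05
prob = p * (1 - p) ** 9   # C(9, 0) * p^1 * q^9
```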
4. A Poisson distributed random variable X has a mean of λ = 2. What is the probability that X equals 2? What
is the probability that X is less than or equal to 2?
Solution:
The problem involves Poisson distribution.
Definition 3.3 Poisson Distribution. The probability distribution of the Poisson random variable X, representing the number of outcomes occurring in a given time interval or specified region, is

p(x; λ) = e^(−λ) λ^x / x!, for x = 0, 1, 2, . . .

where λ is the average number of outcomes occurring in the given time interval or specified region.

Now, P(X = 2) = p(2; 2) = e^(−2) 2² / 2! = 2e^(−2) = 0.2707;

p(1; 2) = e^(−2) 2¹ / 1! = 2e^(−2);  p(0; 2) = e^(−2) 2⁰ / 0! = e^(−2)

P(X ≤ 2) = p(0; 2) + p(1; 2) + p(2; 2) = e^(−2) + 2e^(−2) + 2e^(−2) = 5e^(−2)

P(X ≤ 2) = 0.6767.
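A direct evaluation of the Poisson pmf confirms both values:

```python
# Check of the Poisson probabilities with lambda = 2.
from math import exp, factorial

lam = 2.0
pmf = lambda x: exp(-lam) * lam ** x / factorial(x)

p2 = pmf(2)                          # P(X = 2) = 2 e^{-2}
p_le_2 = pmf(0) + pmf(1) + pmf(2)    # P(X <= 2) = 5 e^{-2}
```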
5. The number of bacteria in a prescribed area of a slide containing a sample of well water has a Poisson distri-
bution with parameter 5. What is the probability that the slide shows 8 or more bacteria?
Solution:
With X Poisson with parameter λ = 5,

P(X ≥ 8) = 1 − P(X ≤ 7) = 1 − Σ_{x=0}^{7} e^(−5) 5^x / x! = 1 − 0.8666 = 0.1334.

Therefore, the probability that the slide shows 8 or more bacteria is approximately 0.1334.
Therefore, P(Z = z) = (1/10) Σ_{i=0}^{9} Σ_{j=0}^{∞} a_{10j+z−i}.
2. The mode of a probability mass function p(k) is any value k* for which p(k*) ≥ p(k) for all k. Determine the mode(s) for (a) the Poisson distribution with parameter λ, and (b) the binomial distribution with parameters n and p.
Solution:
(a) To find the mode of the Poisson distribution we will find k* that will satisfy

p(k*) ≥ p(k) or p(k*)/p(k) ≥ 1.

Equivalently we will find the values of λ and k that will satisfy the following:

P(X = k + 1)/P(X = k) = ( e^(−λ) λ^(k+1)/(k + 1)! ) / ( e^(−λ) λ^k/k! ) = λ/(k + 1) ≥ 1    (4)

From (4), p(k + 1) ≥ p(k) exactly when k ≤ λ − 1, so the probabilities increase up to k = ⌊λ⌋ and decrease afterward. Thus the mode is k* = ⌊λ⌋ (when λ is an integer, both λ − 1 and λ are modes).
(b) To find the mode of the Binomial distribution we will find k* that will satisfy

p(k*) ≥ p(k) or p(k*)/p(k) ≥ 1.

Equivalently we will find the values of p and n that will satisfy the following:

P(X = k + 1)/P(X = k) = ( n!/((k + 1)!(n − k − 1)!) p^(k+1) (1 − p)^(n−k−1) ) / ( n!/(k!(n − k)!) p^k (1 − p)^(n−k) )

P(X = k + 1)/P(X = k) = p(n − k) / ((1 − p)(k + 1)) ≥ 1    (5)
From (5),

p(n − k) ≥ (1 − p)(k + 1)
np − pk ≥ k + 1 − pk − p
np + p − 1 ≥ k
(n + 1)p − 1 ≥ k.

So p(k + 1) ≥ p(k) exactly when k ≤ (n + 1)p − 1; the probabilities increase up to k = ⌊(n + 1)p⌋ and decrease afterward. Thus the mode is k* = ⌊(n + 1)p⌋ (when (n + 1)p is an integer, both (n + 1)p − 1 and (n + 1)p are modes).
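The two mode formulas can be checked by taking the argmax of each pmf directly; the parameter values below are arbitrary test choices:

```python
# Check of the mode formulas: k* = floor(lambda) for Poisson and
# k* = floor((n+1)p) for binomial, by direct argmax over the pmf.
from math import comb, exp, factorial, floor

lam = 3.7  # arbitrary test value
poisson_mode = max(range(50), key=lambda k: exp(-lam) * lam ** k / factorial(k))

n, p = 10, 0.3  # arbitrary test values
binom_mode = max(range(n + 1),
                 key=lambda k: comb(n, k) * p ** k * (1 - p) ** (n - k))
```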
3. Let X be a Poisson random variable with parameter λ. Determine the probability that X is odd.
Solution:
P(X is odd) = Σ_{k odd} e^(−λ) λ^k / k! = e^(−λ) (λ/1! + λ³/3! + λ⁵/5! + · · ·) = e^(−λ) sinh(λ) = (1 − e^(−2λ))/2.

Therefore, P(X is odd) = e^(−λ) sinh(λ).
P(X > 1.5) = ∫_{1.5}^{∞} 2e^(−2x) dx = e^(−3) = 0.0498,

and

P(X = 1.5) = ∫_{1.5}^{1.5} 2e^(−2x) dx = 0.

Therefore, P(X > 1.5) = 0.0498 is the probability that a bulb selected at random will last more than 1.5 years, and P(X = 1.5) = 0 is the probability that a bulb selected at random will last exactly 1.5 years.
2. The median of a random variable X is any value a for which P(X ≤ a) ≥ 1/2 and P(X ≥ a) ≥ 1/2. Determine the median of an exponentially distributed random variable with parameter λ. Compare the median to the mean.
Solution:

P(X ≤ a) = ∫₀^a λe^(−λx) dx = −e^(−λx) |₀^a = 1 − e^(−λa) ≥ 1/2 ⇒ e^(−λa) ≤ 1/2 ⇒ −λa ≤ −log 2 ⇒ a ≥ (1/λ) log 2.

Similarly, P(X ≥ a) = 1 − P(X < a) = e^(−λa) ≥ 1/2 ⇒ a ≤ (1/λ) log 2.

Thus, the median of the exponentially distributed random variable X is a = (1/λ) log 2.

The exponential distribution has mean 1/λ. Since log 2 ≈ 0.693 < 1 and λ > 0, the comparison is

median = (1/λ) log 2 < 1/λ = mean,

so the median is always smaller than the mean.
3. The lengths, in inches, of cotton fibers used in a certain mill are exponentially distributed random variables
with parameter λ. It is decided to convert all measurements in this mill to the metric system. Describe the
probability distribution of the length, in centimeters, of cotton fibers in this mill.
Solution:
Let X (in inches) be exponentially distributed with parameter λ, and let Y = 2.54X be the length in centimeters (1 inch = 2.54 centimeters). Then P(Y > y) = P(X > y/2.54) = e^(−(λ/2.54)y), so Y is exponentially distributed with parameter λ/2.54. This describes the probability distribution of the length, in centimeters, of cotton fibers in this mill.
4. Twelve independent random variables, each uniformly distributed over the interval (0 , 1] are added, and 6 is
subtracted from the total. Determine the mean and the variance of the resulting random variable.
Solution:
Let x1, x2, . . . , x12 be independent uniformly distributed random variables over the interval (0, 1]. The resulting random variable is (x1 + x2 + · · · + x12) − 6 = Σ_{i=1}^{12} x_i − 6. Solving for the mean,

E[Σ_{i=1}^{12} x_i − 6] = Σ_{i=1}^{12} E[x_i] + E[−6] = 12(1/2) + (−6), since E[x_i] = 1/2, ∀ x_i ∼ U(0, 1]
= 6 − 6

E[Σ_{i=1}^{12} x_i − 6] = 0.    (6)
(a) Claim 1: Expanding the square,

(Σ_{i=1}^{12} x_i)² = x1² + x2² + · · · + x12² + 2x1x2 + 2x1x3 + · · · + 2x11x12,

with one cross term 2x_i x_j for each pair i < j. Counting the cross terms, 11 + 10 + · · · + 1 = (11)(11 + 1)/2 = 66.

Therefore, (Σ_{i=1}^{12} x_i)² = 12 terms of x_i² + 66 terms of 2x_i x_j, i ≠ j.
i=1
(b) Claim 2: Note that

E[x_i²] = ∫₀¹ x_i² f(x_i) dx_i = x_i³/3 |₀¹ = 1/3.

Therefore, E[x_i²] = 1/3, ∀ x_i ∼ U(0, 1].
(c) Claim 3: Observe that

E[2x_i x_j] = 2E[x_i x_j] = 2E[x_i]E[x_j], since x_i, x_j are independent
= 2(1/2)(1/2), since E[x_i] = 1/2, ∀ x_i ∼ U(0, 1]

E[2x_i x_j] = 1/2.

Therefore, E[2x_i x_j] = 1/2 for i ≠ j.
Now,

E[(Σ_{i=1}^{12} x_i)²] = E[12 terms of x_i² + 66 terms of 2x_i x_j, i ≠ j], by Claim 1
= 12E[x_i²] + 66E[2x_i x_j], i ≠ j
= 12(1/3) + 66(1/2), by Claim 2 and Claim 3

E[(Σ_{i=1}^{12} x_i)²] = 4 + 33 = 37.    (7)
Finally,

var(Σ_{i=1}^{12} x_i − 6) = E[(Σ_{i=1}^{12} x_i − 6)²] − (E[Σ_{i=1}^{12} x_i − 6])²
= E[(Σ_{i=1}^{12} x_i)² − 12 Σ_{i=1}^{12} x_i + 36] − 0, by (6)
= E[(Σ_{i=1}^{12} x_i)²] − 12E[Σ_{i=1}^{12} x_i] + E[36]
= 37 − 12(12)(1/2) + 36 = 37 − 72 + 36, by (7)

var(Σ_{i=1}^{12} x_i − 6) = 1.

Therefore, the resulting random variable has mean = 0 and variance = 1.
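A Monte Carlo check of both moments (seed and trial count are arbitrary; `random.random()` samples [0, 1) rather than (0, 1], but the endpoint has probability zero, so the distinction does not matter):

```python
# Monte Carlo check that the sum of 12 independent uniform(0,1] variables,
# minus 6, has mean 0 and variance 1.
import random

random.seed(1)
trials = 200_000
samples = [sum(random.random() for _ in range(12)) - 6 for _ in range(trials)]

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / (trials - 1)
```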
5. Let X and Y have the joint normal distribution described in equation (4.16). What value of α minimizes the
variance of Z = αX + (1 − α)Y? Simplify your result when X and Y are independent.
Solution:
The variance of Z = αX + (1 − α)Y is

Var(Z) = α²σx² + (1 − α)²σy² + 2α(1 − α)ρσxσy.    (8)

To minimize Var(Z), set its derivative with respect to α equal to zero:

d/dα Var(Z) = 2ασx² − 2(1 − α)σy² + 2(1 − 2α)ρσxσy = 0,

which gives α(σx² + σy² − 2ρσxσy) = σy² − ρσxσy, hence

α* = (σy² − ρσxσy) / (σx² − 2ρσxσy + σy²).

(The second derivative is 2(σx² − 2ρσxσy + σy²) = 2 Var(X − Y) ≥ 0, so this critical point is a minimum.) When X and Y are independent, ρ = 0 and the result simplifies to

α* = σy² / (σx² + σy²).
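A grid search confirms that this α* minimizes the variance; the parameter values below are arbitrary test choices:

```python
# Grid check that alpha* = (sy^2 - rho sx sy) / (sx^2 - 2 rho sx sy + sy^2)
# minimizes Var(Z) = a^2 sx^2 + (1-a)^2 sy^2 + 2a(1-a) rho sx sy.
sx, sy, rho = 1.5, 0.8, 0.4  # arbitrary test values

def var_z(a):
    return a ** 2 * sx ** 2 + (1 - a) ** 2 * sy ** 2 + 2 * a * (1 - a) * rho * sx * sy

alpha_star = (sy ** 2 - rho * sx * sy) / (sx ** 2 - 2 * rho * sx * sy + sy ** 2)
grid_min = min((i / 10_000 for i in range(-10_000, 20_001)), key=var_z)
```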
Solution:
Completing the square,

λz − z²/2 = −(1/2)(z² − 2λz) = −(1/2)(z² − 2λz + λ² − λ²)
= −(1/2)(z² − 2λz + λ²) + (1/2)λ²

λz − z²/2 = −(1/2)(z − λ)² + (1/2)λ².    (9)

Then

E[e^(λZ)] = ∫_{−∞}^{+∞} e^(λz) (1/√(2π)) e^(−z²/2) dz = ∫_{−∞}^{+∞} (1/√(2π)) e^(λz − z²/2) dz
= ∫_{−∞}^{+∞} (1/√(2π)) e^(−(1/2)(z−λ)² + (1/2)λ²) dz, by (9)
= e^((1/2)λ²) ∫_{−∞}^{+∞} (1/√(2π)) e^(−(1/2)(z−λ)²) dz
= e^((1/2)λ²) · 1, since ∫_{−∞}^{+∞} (1/√(2π)) e^(−(1/2)(z−λ)²) dz = 1

E[e^(λZ)] = e^((1/2)λ²).

Therefore, E[e^(λZ)] = e^((1/2)λ²).
2. Let W be an exponentially distributed random variable with parameter θ and mean µ = 1/θ.
(a) Determine P(W > µ).
(b) What is the mode of the distribution?
Solution:
(a)

P(W > µ) = P(W > 1/θ) = 1 − P(W ≤ 1/θ)
= 1 − ∫₀^(1/θ) θe^(−θx) dx = 1 + e^(−θx) |₀^(1/θ)
= 1 + (e^(−1) − e⁰) = 1 + (e^(−1) − 1) = e^(−1)

P(W > µ) = 1/e.

Therefore, P(W > µ) = 1/e.
(b) For a continuous distribution the mode is a value x* at which the density is largest. The exponential density f(x) = θe^(−θx), x ≥ 0, is strictly decreasing in x, so it attains its maximum at x = 0:

f(0) = θ ≥ θe^(−θx) = f(x) for all x ≥ 0.

Thus, the mode of the exponential distribution is 0.
3. Let X and Y be independent random variables uniformly distributed over the interval [θ − 1/2, θ + 1/2] for some fixed θ. Show that W = X − Y has a distribution that is independent of θ with density function

f_W(w) = 1 + w for −1 ≤ w < 0,
f_W(w) = 1 − w for 0 ≤ w ≤ 1,
f_W(w) = 0 for |w| > 1.
Solution:
Note that f_X(x) = 1 if x ∈ [θ − 1/2, θ + 1/2] and 0 otherwise, and f_Y(y) = 1 if y ∈ [θ − 1/2, θ + 1/2] and 0 otherwise.

So for x, y ∈ [θ − 1/2, θ + 1/2],

w = x − y ∈ [min(x − y), max(x − y)] = [(θ − 1/2) − (θ + 1/2), (θ + 1/2) − (θ − 1/2)] = [−1, 1].
Consider the following figures
F_W(w) can be interpreted as the area of the region {x − y ≤ w} inside the unit square [θ − 1/2, θ + 1/2]²; note that θ cancels from every difference x − y. Then we have the following cases:

(a) Case 1: If −1 ≤ w < 0, the region {x − y ≤ w} is a triangle with legs of length 1 + w, so

F_W(w) = (1/2)(1 + w)², −1 ≤ w < 0.

(b) Case 2: If 0 ≤ w ≤ 1, the complementary region {x − y > w} is a triangle with legs of length 1 − w, so

F_W(w) = 1 − (1/2)(1 − w)² = 1/2 + w − (1/2)w², 0 ≤ w ≤ 1.

Thus, F_W(w) = (1/2)(1 + w)² for −1 ≤ w < 0, and F_W(w) = 1/2 + w − (1/2)w² for 0 ≤ w ≤ 1; in particular, F_W does not involve θ.
Therefore,

f_W(w) = d/dw F_W(w) = 1 + w for −1 ≤ w < 0; 1 − w for 0 ≤ w ≤ 1; 0 for |w| > 1,

and is independent of θ.
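A Monte Carlo check that the distribution is triangular regardless of θ: for any θ, F_W(0.5) should equal 1 − (1/2)(1 − 0.5)² = 0.875. The value of θ, the seed, and the trial count below are arbitrary:

```python
# Monte Carlo check that W = X - Y is triangular on [-1, 1] for any theta:
# F_W(0.5) should be 1 - (1/2)(0.5)^2 = 0.875.
import random

random.seed(2)
theta, trials = 5.0, 100_000

count = 0
for _ in range(trials):
    x = theta - 0.5 + random.random()  # uniform on [theta - 1/2, theta + 1/2)
    y = theta - 0.5 + random.random()
    if x - y <= 0.5:
        count += 1

F_half = count / trials
```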
2. A jar has four chips colored red, green, blue, and yellow. A person draws a chip, observes its color, and returns
it. Chips are now drawn repeatedly, without replacement, until the first chip drawn is selected again. What is
the mean number of draws required?
Solution:
Let X = number of draws after the first chip is returned. Note that for a nonnegative integer-valued random variable, E[X] = Σ_{k≥0} P(X > k). Thus,

E[X] = P(X > 0) + P(X > 1) + P(X > 2) + P(X > 3) + P(X > 4)
= 1 + 3/4 + (3/4)(2/3) + (3/4)(2/3)(1/2) + 0
= 1 + 3/4 + 1/2 + 1/4 = (4 + 3 + 2 + 1)/4 = 10/4 = 5/2.

Therefore, the mean number of draws is E[X] = 5/2.
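Equivalently, the returned chip is equally likely to occupy any of the four positions in the subsequent without-replacement ordering, which can be checked by exact enumeration:

```python
# Exact check of E[X] = 5/2: X is the position of the returned chip
# (chip 0 here) in a random ordering of the four chips.
from itertools import permutations
from statistics import mean

positions = [perm.index(0) + 1 for perm in permutations(range(4))]
expected = mean(positions)  # average over all equally likely orderings
```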
3. Let X be an exponentially distributed random variable with parameter λ. Determine the mean of X.

First, directly: E[X] = ∫₀^∞ x λe^(−λx) dx. Integrating by parts with

u = x, dv = e^(−λx) dx, du = dx, v = −(1/λ)e^(−λx),

E[X] = λ [ −(x/λ)e^(−λx) |₀^∞ + (1/λ) ∫₀^∞ e^(−λx) dx ] = ∫₀^∞ e^(−λx) dx = −(1/λ)e^(−λx) |₀^∞ = 1/λ.

Therefore, E[X] = 1/λ.
Alternatively, since f(z) = λe^(−λz), z > 0, with parameter λ, the cumulative distribution function is

F(z) = ∫₀^z λe^(−λx) dx = −e^(−λx) |₀^z = 1 − e^(−λz).

Thus, by computing the upper tail probabilities,

E[X] = ∫₀^∞ [1 − F(z)] dz = ∫₀^∞ (1 − (1 − e^(−λz))) dz = ∫₀^∞ e^(−λz) dz = −(1/λ)e^(−λz) |₀^∞ = 1/λ.

Therefore, E[X] = 1/λ.
4. A system has two components: A and B. The operating times until failure of the two components are indepen-
dent and exponentially distributed random variables with parameters 2 for the component A, and 3 for B. The
system fails at the first component failure.
(a) What is the mean time to failure for component A? For component B?
(b) What is the mean time to system failure?
(c) What is the probability that it is component A that causes system failure?
(d) Suppose that it is component A that fails first. What is the mean remaining operating life of component
B?
Solution:
Let XA = operating time of component A and XB = operating time of component B. Since the two components
are independent and exponentially distributed random variables with parameters 2 for the component A, and 3
for B, then E[XA] = 1/2 and E[XB] = 1/3, which answers (a).

(b) Define Z = min{XA, XB}. The event {Z > z} occurs if and only if XA > z and XB > z, so by independence

P(Z > z) = P(XA > z)P(XB > z) = e^(−2z) e^(−3z) = e^(−5z);

that is, Z is exponentially distributed with parameter 2 + 3 = 5.

Therefore, the mean time to system failure is E[min{XA, XB}] = 1/5.

(c) P(component A causes system failure) = P(XA < XB) = 2/(2 + 3) = 2/5.

(d) By the memoryless property of the exponential distribution, given that A fails first, the remaining operating life of B is again exponentially distributed with parameter 3, so its mean is 1/3.
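A Monte Carlo check of the mean time to failure and of P(A fails first), with rates 2 and 3 as in the problem (seed and trial count are arbitrary):

```python
# Monte Carlo check: E[min{XA, XB}] = 1/5 and P(XA < XB) = 2/5.
import random

random.seed(3)
trials = 100_000
total_min, a_first = 0.0, 0
for _ in range(trials):
    xa = random.expovariate(2)  # component A, parameter 2
    xb = random.expovariate(3)  # component B, parameter 3
    total_min += min(xa, xb)
    if xa < xb:
        a_first += 1

mean_min = total_min / trials
p_a_first = a_first / trials
```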
5. Consider a post office with two clerks. John, Paul, and Naomi enter simultaneously. John and Paul go directly
to the clerks, while Naomi must wait until either John or Paul is finished before she begins service.
(a) If all of the service times are independent exponentially distributed random variables with the same mean 1/λ, what is the probability that Naomi is still in the post office after the other two have left?
(b) How does your answer change if the two clerks have different service rates, say λ1 = 3 and λ2 = 47?
(c) The mean time that Naomi spends in the post office is less than that for John or Paul provided that
max{λ1 , λ2 } > c min{λ1 , λ2 } for a certain constant c. What is the value of this constant?
Solution:
Let X = the service time of John with parameter λ1, Y = the service time of Paul with parameter λ2, and Z = the service time of Naomi. Since X and Y are independent, their joint density is f(x, y) = λ1λ2 e^(−λ1x − λ2y) for x, y > 0, and

P(Naomi is last) = P(Z > |X − Y|) = P((Z > (X − Y)) ∩ (X > Y)) + P((Z > (Y − X)) ∩ (X ≤ Y)).

Carrying out the two integrals against this joint density gives P(Z > |X − Y|) = 2λ1λ2/(λ1 + λ2)², which equals 1/2 when λ1 = λ2.
Therefore, P(Naomi is last) = 1/2.

(b) If λ1 = 3 and λ2 = 47, then

P(Naomi is last) = P(Z > |X − Y|) = 2λ1λ2/(λ1 + λ2)² = 2(3)(47)/(3 + 47)² = 282/50² = 282/2500 = 0.1128.

Therefore, P(Naomi is last) = 282/2500 = 0.1128.
(c) Let T be Naomi's total time in the post office and W = X − Y. If W > 0, Paul finishes first and Naomi is served at his clerk with rate λ2; if W < 0, she is served at John's clerk with rate λ1. Conditioning on the sign of W, with P(W > 0) = λ2/(λ1 + λ2) and E[Z] = 1/(λ1 + λ2),

E[T] = E[E[T |W]] = E[T |W > 0]P(W > 0) + E[T |W < 0]P(W < 0)
= ( 1/(λ1 + λ2) + 1/λ2 ) · λ2/(λ1 + λ2) + ( 1/(λ1 + λ2) + 1/λ1 ) · λ1/(λ1 + λ2)
= (λ1 + 2λ2)/(λ1 + λ2)² + (λ2 + 2λ1)/(λ1 + λ2)²
= 3(λ1 + λ2)/(λ1 + λ2)²

E[T] = 3/(λ1 + λ2).
The mean time that Naomi spends in the post office is less than that for John or Paul, namely (1/2)(1/λ1 + 1/λ2), provided

3/(λ1 + λ2) < (1/2)(1/λ1 + 1/λ2) = (λ1 + λ2)/(2λ1λ2)
6λ1λ2 < (λ1 + λ2)²
6λ1λ2 < λ1² + 2λ1λ2 + λ2²
4λ1λ2 < λ1² + λ2².

Writing M = max{λ1, λ2} and m = min{λ1, λ2}, this says M² − 4mM + m² > 0, i.e. (M/m)² − 4(M/m) + 1 > 0, which holds exactly when M/m > 2 + √3. Therefore the constant is c = 2 + √3 ≈ 3.73.
Hence,

F_Z(z) = P(Z ≤ z) = 1 − P(Z > z)
= 1 − P(X1 > z)P(X2 > z) · · · P(Xn > z), by (11)
= 1 − (1 − F_X1(z))(1 − F_X2(z)) · · · (1 − F_Xn(z))
= 1 − e^(−nλz).
Therefore, P(N = 3, X = 2) = 1/16.

The probability that X = 5 is

Therefore, P(X = 5) = 1/48.
2. Four nickels and six dimes are tossed, and the total number N of heads is observed. If N = 4, what is the
conditional probability that exactly two of the nickels were heads?
Solution:
The conditional probability that exactly two of the nickels were heads is

(4 choose 2)(6 choose 2) / (10 choose 4) = 6(15)/210 = 90/210 = 3/7.

The conditional probability is 3/7.
3. A poker hand of five cards is dealt from a normal deck of 52 cards. Let X be the number of aces in the hand.
Determine P (X > 1 | X ≥ 1) . This is the probability that the hand contains more than one ace, given that it
has at least one ace. Compare this with the probability that the hand contains more than one ace, given that it
contains the ace of spades.
Solution:
The distribution of the number of ace cards in a 5-card hand is a problem in sampling without replacement: the hypergeometric distribution, where

P(X = x) = (n choose x)(T − n choose r − x) / (T choose r) = (5 choose x)(47 choose 4 − x) / (52 choose 4),

which, by the symmetry of the hypergeometric distribution, equals the usual form (4 choose x)(48 choose 5 − x) / (52 choose 5).
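The written comparison stops at this setup, so the two requested conditional probabilities can be evaluated numerically from the hypergeometric counts:

```python
# P(X > 1 | X >= 1) versus P(more than one ace | hand contains the ace
# of spades), evaluated from hypergeometric counts.
from math import comb

total = comb(52, 5)
p_ge_1 = 1 - comb(48, 5) / total                      # at least one ace
p_gt_1 = 1 - (comb(48, 5) + 4 * comb(48, 4)) / total  # more than one ace

cond_any = p_gt_1 / p_ge_1                   # P(X > 1 | X >= 1)
cond_spades = 1 - comb(48, 4) / comb(51, 4)  # P(X > 1 | ace of spades in hand)
```

The second conditional probability is noticeably larger, illustrating that "contains the ace of spades" is more informative than "contains at least one ace."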
4. A six-sided die is rolled, and the number N on the uppermost face is recorded. From a jar containing 10 tags
numbered 1, 2, . . . , 10 we then select N tags at random without replacement. Let X be the smallest number on
the drawn tags. Determine P(X=2).
Solution:
We want to obtain a general expression for the probability distribution of the random variable X representing
the minimum number drawn among N = n numbers chosen from {1, 2, . . . , 10} without replacement. Observe
that if the minimum number is X = x, then the remaining n − 1 numbers must be drawn from the set {x + 1, x +
2, . . . , 10}. Thus the desired probability is given by
P(X = x | N = n) = (10 − x choose n − 1) / (10 choose n), x = 1, 2, . . . , 11 − n,

where N is discrete uniform on the set {1, 2, . . . , 6}. This will give

P(X = 2) = Σ_{n=1}^{6} P(X = 2 | N = n)P(N = n) = Σ_{n=1}^{6} ( (8 choose n − 1)/(10 choose n) ) · (1/6) = (1/(6 · 90)) Σ_{n=1}^{6} n(10 − n) = 119/540 ≈ 0.2204.
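The value 119/540 can be confirmed by summing the conditional probabilities directly:

```python
# Check that P(X = 2) = 119/540 by direct summation over the die value.
from math import comb

p = sum(comb(8, n - 1) / comb(10, n) * (1 / 6) for n in range(1, 7))
```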
5. Let X be a Poisson random variable with parameter λ. Find the conditional mean of X given that X is odd.
Solution:
The probability that X is odd is
P(X is odd) = e−λ sinh(λ) (see Chapter 1 Section 3 Problem Number 3 answer).
Thus the probability mass function of X given that X is odd is
P(X = n | X is odd) = P(X = n, X is odd) / P(X is odd) = ( λ^n e^(−λ)/n! ) / ( e^(−λ) sinh(λ) ) = λ^n / (n! sinh(λ)), for odd n.

Therefore, the conditional mean of X given that X is odd is

E[X | X is odd] = 1 · λ/(1! sinh(λ)) + 3 · λ³/(3! sinh(λ)) + 5 · λ⁵/(5! sinh(λ)) + · · ·
= (λ/sinh(λ)) (1 + λ²/2! + λ⁴/4! + · · ·) = (λ/sinh(λ)) cosh(λ) = λ coth(λ)
= λ (e^λ + e^(−λ)) / (e^λ − e^(−λ)).

Therefore, E[X | X is odd] = λ (e^λ + e^(−λ)) / (e^λ − e^(−λ)).