f(x_1, x_2) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \left(\frac{x_1-\mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1-\mu_1}{\sigma_1}\right)\left(\frac{x_2-\mu_2}{\sigma_2}\right) + \left(\frac{x_2-\mu_2}{\sigma_2}\right)^2 \right] \right\}

where -\infty < x_1 < \infty and -\infty < x_2 < \infty.
(a) Define Z_1 = (X_1 - \mu_1)/\sigma_1 and Z_2 = (X_2 - \mu_2)/\sigma_2. Show that the joint pdf of Z_1 and Z_2 is given by
g(z_1, z_2) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left( z_1^2 - 2\rho z_1 z_2 + z_2^2 \right) \right\}

where -\infty < z_1 < \infty and -\infty < z_2 < \infty.
(b) Show that M(t_1, t_2) = \exp\{ \mu_1 t_1 + \mu_2 t_2 + \frac{1}{2}( \sigma_1^2 t_1^2 + \sigma_2^2 t_2^2 + 2\rho\sigma_1\sigma_2 t_1 t_2 ) \} for all t_1, t_2 \in \mathbb{R}. (Hint: Find the joint mgf of Z_1 and Z_2 first.)
(c) Find Cov(X_1, X_2).
(d) Show that X_1 \sim N(\mu_1, \sigma_1^2) and X_2 \sim N(\mu_2, \sigma_2^2).
(e) Use mgfs to show that X_1 and X_2 are independent random variables if and only if \rho = 0.
(f) Find f(x_1 | x_2) and f(x_2 | x_1). What are these distributions?
12. Suppose X and Y are continuous random variables with joint pdf
f(x, y) = 4xy, 0 < x < 1, 0 < y < 1.
Calculate P(X + Y \le 1/2).
SOLUTIONS TO EXTRA PROBLEMS
1. (a) (i) (0.7)^2/0.3
(ii) F(x) = 1 - (1 + 0.7x)(0.3)^x for x = 1, 2, 3, ...
(iii) 0.4263 and 0.4652
(iv) 1.8571 and 1.2245
(b) (i) 1/\pi
(ii) F(x) = \arctan(x)/\pi + 1/2, -\infty < x < \infty
(iii) 0.1324 and 0.1475
(iv) E(X) and Var(X) do not exist
(c) (i) 1/2
(ii) F(x) = e^{x}/2 for x \le 0 and F(x) = 1 - e^{-x}/2 for x > 0
(iii) 0.1415 and 0.1452
(iv) 0 and 2
(d) (i) \lambda^3/2
(ii) F(x) = 1 - \frac{1}{2} e^{-\lambda x} (\lambda^2 x^2 + 2\lambda x + 2), x \ge 0
(iii) F(3) - F(1.1) = \frac{1}{2} e^{-11\lambda/10} \left( 2 + \frac{11\lambda}{5} + \frac{121\lambda^2}{100} \right) - \frac{1}{2} e^{-3\lambda} (2 + 6\lambda + 9\lambda^2) and \frac{F(3) - F(1.1)}{1 - \frac{1}{2} e^{-3\lambda} (2 + 6\lambda + 9\lambda^2)}
(iv) 3/\lambda and 3/\lambda^2
(e) (i) \lambda
(ii) F(x) = 1 - (1/x)^{\lambda}, x \ge 1
(iii) (10/11)^{\lambda} - 3^{-\lambda} and \frac{(10/11)^{\lambda} - 3^{-\lambda}}{1 - 3^{-\lambda}}
(iv) \lambda/(\lambda - 1) for \lambda > 1 and \frac{\lambda}{(\lambda - 2)(\lambda - 1)^2} for \lambda > 2
2. Without loss of generality, assume that X is a continuous random variable with pdf f(x). The discrete
case is proven similarly.
For any j = 1, 2, ..., k - 1, consider

E(|X|^j) = \int_{-\infty}^{\infty} |x|^j f(x) \, dx = \int_{A} |x|^j f(x) \, dx + \int_{A^c} |x|^j f(x) \, dx

where A = \{ x \in \mathbb{R} : |x| \le 1 \} and A^c = \{ x \in \mathbb{R} : |x| > 1 \}. Note that |x|^j \le 1 for all x \in A. Therefore,

E(|X|^j) \le \int_{A} f(x) \, dx + \int_{A^c} |x|^j f(x) \, dx \le \int_{-\infty}^{\infty} f(x) \, dx + \int_{A^c} |x|^j f(x) \, dx = 1 + \int_{A^c} |x|^j f(x) \, dx.

However, |x|^j < |x|^k for all x \in A^c, which implies that \int_{A^c} |x|^j f(x) \, dx < \int_{A^c} |x|^k f(x) \, dx < \infty since E(|X|^k) exists (i.e. is finite). Therefore, we then have E(|X|^j) < \infty and so E(|X|^j) exists.
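The bound established above can be illustrated numerically. The sketch below takes X to be standard normal purely as an illustrative choice (it is not part of the problem), estimates E(|X|^j) and E(|X|^k) by Monte Carlo, and confirms E(|X|^j) \le 1 + E(|X|^k) for j < k:

```python
import random

random.seed(0)

n = 200_000
j, k = 1, 3  # any j < k works; these values are illustrative
xs = [random.gauss(0.0, 1.0) for _ in range(n)]  # X ~ N(0, 1), illustrative only

m_j = sum(abs(x) ** j for x in xs) / n  # Monte Carlo estimate of E(|X|^j)
m_k = sum(abs(x) ** k for x in xs) / n  # Monte Carlo estimate of E(|X|^k)

# The inequality from the proof: E(|X|^j) <= 1 + E(|X|^k)
assert m_j <= 1.0 + m_k
```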
3. M(t) = (1 - t^2)^{-1} for |t| < 1
E(X) = 0
Var(X) = 2
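These two values can be double-checked by numerically differentiating the mgf at t = 0, since M'(0) = E(X) and M''(0) = E(X^2), so Var(X) = M''(0) - M'(0)^2. A minimal sketch using central finite differences:

```python
def M(t):
    # mgf from the solution above: M(t) = (1 - t^2)^(-1), valid for |t| < 1
    return 1.0 / (1.0 - t * t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)              # central difference for M'(0) = E(X)
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2  # central difference for M''(0) = E(X^2)

assert abs(m1) < 1e-8        # E(X) = 0
assert abs(m2 - 2.0) < 1e-6  # Var(X) = E(X^2) - E(X)^2 = 2
```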
4. (a) \mu_r = (r + 2)!/2, r = 0, 1, 2, ...
(b) \mu_r = 2(r!), r = 1, 2, 3, ...
(c) \mu_{2k-1} = \sum_{j=1}^{k} \frac{(2k-1)!}{(2j-1)!} for k = 1, 2, 3, ... and \mu_{2k} = \sum_{j=0}^{k} \frac{(2k)!}{(2j)!} for k = 0, 1, 2, ...
5. Since M_X(t) = M_Y(t), we have

\sum_{j=0}^{\infty} e^{tj} p_j = \sum_{j=0}^{\infty} e^{tj} q_j for t \in (-h, h), h > 0.

Let t = \ln s where s \in (e^{-h}, e^{h}). Then, we get that

\sum_{j=0}^{\infty} s^j p_j = \sum_{j=0}^{\infty} s^j q_j for all s \in (e^{-h}, e^{h}).

This implies that the two power series are equal over a continuous range of values, and so p_j = q_j for all j = 0, 1, 2, .... Thus, X and Y have the same probability distribution.
6. (a) k = 1
(b) f_X(x) = q p^x for x = 0, 1, 2, ...
f_Y(y) = q p^y for y = 0, 1, 2, ...
X and Y are independent.
7. X \sim POI(1) and Y \sim POI(2)
X and Y are not independent.
8. (a) k = 5/4
(b) f_X(x) = 5(1 - x^4)/8 for -1 < x < 1
f_Y(y) = 5(1 + 2y)
11. (a) The joint cdf of Z_1 and Z_2 is

G(z_1, z_2) = P(Z_1 \le z_1, Z_2 \le z_2) = \int_{-\infty}^{\mu_2 + \sigma_2 z_2} \int_{-\infty}^{\mu_1 + \sigma_1 z_1} f(x_1, x_2) \, dx_1 \, dx_2

The joint pdf of Z_1 and Z_2 is then given by

g(z_1, z_2) = \frac{\partial^2}{\partial z_1 \partial z_2} G(z_1, z_2)
= f(\mu_1 + \sigma_1 z_1, \mu_2 + \sigma_2 z_2) \, \sigma_1 \sigma_2
= \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left( z_1^2 - 2\rho z_1 z_2 + z_2^2 \right) \right\}

for -\infty < z_1, z_2 < \infty.
(b) Let us first find

M_{Z_1, Z_2}(t_1, t_2) = E\left[ e^{t_1 Z_1 + t_2 Z_2} \right]
= \int_{-\infty}^{\infty} e^{t_2 z_2} \left[ \int_{-\infty}^{\infty} e^{t_1 z_1} \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left( z_1^2 - 2\rho z_1 z_2 + z_2^2 \right) \right\} dz_1 \right] dz_2
= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{t_2 z_2} e^{-\frac{z_2^2}{2(1-\rho^2)}} \left[ \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sqrt{1-\rho^2}} e^{t_1 z_1} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left( z_1^2 - 2\rho z_2 z_1 \right) \right\} dz_1 \right] dz_2. \quad (1)
Now consider

e^{t_1 z_1} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left( z_1^2 - 2\rho z_2 z_1 \right) \right\}
= \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ z_1^2 - 2\rho z_2 z_1 - 2(1-\rho^2) t_1 z_1 \right] \right\}
= \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ z_1^2 - 2\left( \rho z_2 + (1-\rho^2) t_1 \right) z_1 \right] \right\}
= \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \left( z_1 - \left( \rho z_2 + (1-\rho^2) t_1 \right) \right)^2 - \left( \rho z_2 + (1-\rho^2) t_1 \right)^2 \right] \right\}
= \exp\left\{ \frac{1}{2(1-\rho^2)} \left( \rho z_2 + (1-\rho^2) t_1 \right)^2 \right\} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left( z_1 - \left( \rho z_2 + (1-\rho^2) t_1 \right) \right)^2 \right\}.
Substituting this back into equation (1) and recognizing that the inner integral equals 1, we
subsequently obtain
M_{Z_1, Z_2}(t_1, t_2) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{t_2 z_2} e^{-\frac{z_2^2}{2(1-\rho^2)}} \exp\left\{ \frac{1}{2(1-\rho^2)} \left( \rho z_2 + (1-\rho^2) t_1 \right)^2 \right\} dz_2. \quad (2)
Next consider

e^{t_2 z_2} \, e^{-\frac{z_2^2}{2(1-\rho^2)}} \exp\left\{ \frac{1}{2(1-\rho^2)} \left( \rho z_2 + (1-\rho^2) t_1 \right)^2 \right\}
= \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ z_2^2 - 2(1-\rho^2) t_2 z_2 - \rho^2 z_2^2 - 2\rho(1-\rho^2) t_1 z_2 - (1-\rho^2)^2 t_1^2 \right] \right\}
= e^{\frac{1}{2}(1-\rho^2) t_1^2} \exp\left\{ -\frac{1}{2} \left( z_2^2 - 2 t_2 z_2 - 2\rho t_1 z_2 \right) \right\}
= e^{\frac{1}{2}(1-\rho^2) t_1^2} \exp\left\{ -\frac{1}{2} \left( z_2 - (t_2 + \rho t_1) \right)^2 \right\} \exp\left\{ \frac{1}{2} (t_2 + \rho t_1)^2 \right\}
= e^{\frac{1}{2}\left( t_1^2 - \rho^2 t_1^2 + t_2^2 + 2\rho t_1 t_2 + \rho^2 t_1^2 \right)} \exp\left\{ -\frac{1}{2} \left( z_2 - (t_2 + \rho t_1) \right)^2 \right\}
= e^{\frac{1}{2}\left( t_1^2 + 2\rho t_1 t_2 + t_2^2 \right)} \exp\left\{ -\frac{1}{2} \left( z_2 - (t_2 + \rho t_1) \right)^2 \right\}.
Substituting this back into equation (2), we obtain
M_{Z_1, Z_2}(t_1, t_2) = e^{\frac{1}{2}\left( t_1^2 + 2\rho t_1 t_2 + t_2^2 \right)}.
Finally, the joint mgf of X_1 and X_2 is given by

M(t_1, t_2) = E\left[ e^{t_1 X_1 + t_2 X_2} \right]
= E\left[ e^{t_1 (\mu_1 + \sigma_1 Z_1) + t_2 (\mu_2 + \sigma_2 Z_2)} \right]
= E\left[ e^{\mu_1 t_1 + \mu_2 t_2} \, e^{\sigma_1 t_1 Z_1 + \sigma_2 t_2 Z_2} \right]
= e^{\mu_1 t_1 + \mu_2 t_2} \, M_{Z_1, Z_2}(\sigma_1 t_1, \sigma_2 t_2)
= \exp\left\{ \mu_1 t_1 + \mu_2 t_2 + \frac{1}{2} \left( \sigma_1^2 t_1^2 + 2\rho \sigma_1 \sigma_2 t_1 t_2 + \sigma_2^2 t_2^2 \right) \right\}.
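The crux of this derivation, the closed form M_{Z_1,Z_2}(t_1, t_2) = e^{(t_1^2 + 2\rho t_1 t_2 + t_2^2)/2}, can be checked by numerically integrating e^{t_1 z_1 + t_2 z_2} g(z_1, z_2) over a grid. A sketch, where \rho, t_1, t_2 are arbitrary illustrative values:

```python
import math

rho = 0.5
t1, t2 = 0.3, -0.2

def g(z1, z2):
    # standardized bivariate normal pdf g(z1, z2) from part (a)
    q = (z1 * z1 - 2 * rho * z1 * z2 + z2 * z2) / (1 - rho * rho)
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(1 - rho * rho))

# midpoint-rule approximation of E[exp(t1*Z1 + t2*Z2)] over [-8, 8]^2
steps = 320
h = 16.0 / steps
mgf_numeric = 0.0
for i in range(steps):
    z1 = -8 + (i + 0.5) * h
    for j in range(steps):
        z2 = -8 + (j + 0.5) * h
        mgf_numeric += math.exp(t1 * z1 + t2 * z2) * g(z1, z2) * h * h

mgf_closed = math.exp(0.5 * (t1 ** 2 + 2 * rho * t1 * t2 + t2 ** 2))
assert abs(mgf_numeric - mgf_closed) < 1e-6
```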
(d) M_{X_1}(t) = M(t, 0) = \exp\left\{ \mu_1 t + \frac{\sigma_1^2}{2} t^2 \right\} for t \in \mathbb{R}.
We recognize this function as the mgf of a N(\mu_1, \sigma_1^2) random variable. By the uniqueness property of mgfs, X_1 \sim N(\mu_1, \sigma_1^2).
The proof is similarly done for X_2.

(c) E(X_1 X_2)
= \left. \frac{\partial^2}{\partial t_1 \partial t_2} M(t_1, t_2) \right|_{t_1 = 0, \, t_2 = 0}
= \left. \frac{\partial}{\partial t_2} \left[ \left( \mu_1 + \sigma_1^2 t_1 + \rho \sigma_1 \sigma_2 t_2 \right) M(t_1, t_2) \right] \right|_{t_1 = 0, \, t_2 = 0}
= \left. \left\{ \rho \sigma_1 \sigma_2 \, M(t_1, t_2) + \left( \mu_1 + \sigma_1^2 t_1 + \rho \sigma_1 \sigma_2 t_2 \right) \left( \mu_2 + \sigma_2^2 t_2 + \rho \sigma_1 \sigma_2 t_1 \right) M(t_1, t_2) \right\} \right|_{t_1 = 0, \, t_2 = 0}
= \rho \sigma_1 \sigma_2 (1) + \mu_1 \mu_2 (1)
= \rho \sigma_1 \sigma_2 + \mu_1 \mu_2

Thus,

Cov(X_1, X_2) = E(X_1 X_2) - E(X_1) E(X_2) = \rho \sigma_1 \sigma_2 + \mu_1 \mu_2 - \mu_1 \mu_2 = \rho \sigma_1 \sigma_2
(e) If X_1 and X_2 are independent, then we have by a previous theorem that Cov(X_1, X_2) = 0. Since Cov(X_1, X_2) = \rho \sigma_1 \sigma_2, this implies that \rho = 0.
Conversely, suppose that \rho = 0. Then,

M(t_1, t_2) = \exp\left\{ \mu_1 t_1 + \mu_2 t_2 + \frac{\sigma_1^2}{2} t_1^2 + \frac{\sigma_2^2}{2} t_2^2 \right\}
= \exp\left\{ \mu_1 t_1 + \frac{\sigma_1^2}{2} t_1^2 \right\} \exp\left\{ \mu_2 t_2 + \frac{\sigma_2^2}{2} t_2^2 \right\}
= M_{X_1}(t_1) \, M_{X_2}(t_2).

Since the joint mgf factors into a product of the marginal mgfs, we therefore have that X_1 and X_2 are independent random variables by a previous theorem.
(f) For the sake of notational convenience, let u = (x_1 - \mu_1)/\sigma_1 and v = (x_2 - \mu_2)/\sigma_2. Then,
f(x_2 | x_1) = \frac{f(x_1, x_2)}{f_1(x_1)}
= \frac{ \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1-\rho^2}} \, e^{-\frac{1}{2(1-\rho^2)} \left[ u^2 - 2\rho u v + v^2 \right]} }{ \frac{1}{\sqrt{2\pi} \, \sigma_1} \, e^{-\frac{u^2}{2}} }
= \frac{1}{\sqrt{2\pi} \, \sigma_2 \sqrt{1-\rho^2}} \, e^{-\frac{1}{2(1-\rho^2)} \left[ v^2 - 2\rho u v + \rho^2 u^2 \right]}
= \frac{1}{\sqrt{2\pi} \, \sigma_2 \sqrt{1-\rho^2}} \, e^{-\frac{1}{2} \left( \frac{v - \rho u}{\sqrt{1-\rho^2}} \right)^2}.
Expressing this result in terms of the original variables, we therefore obtain
f(x_2 | x_1) = \frac{1}{\sqrt{2\pi} \, \sigma_2 \sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2} \cdot \frac{ \left[ x_2 - \left( \mu_2 + \rho \frac{\sigma_2}{\sigma_1} (x_1 - \mu_1) \right) \right]^2 }{ \sigma_2^2 (1-\rho^2) } \right\}, \quad -\infty < x_2 < \infty.
By inspection, the above pdf is normal with mean \mu_2 + \rho \frac{\sigma_2}{\sigma_1} (x_1 - \mu_1) and variance \sigma_2^2 (1 - \rho^2).
The corresponding results for the conditional pdf of X_1 given X_2 = x_2 follow by symmetry.
12. 1/96
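The answer 1/96 follows from P(X + Y \le 1/2) = \int_0^{1/2} \int_0^{1/2 - x} 4xy \, dy \, dx = \int_0^{1/2} 2x(1/2 - x)^2 \, dx, and the last (polynomial) integral can be evaluated exactly with rational arithmetic; a sketch:

```python
from fractions import Fraction

# P(X + Y <= 1/2) = integral from 0 to 1/2 of 2x(1/2 - x)^2 dx
# Integrand expands to x/2 - 2x^2 + 2x^3, with antiderivative
# x^2/4 - 2x^3/3 + x^4/2; evaluate it at x = 1/2 using exact rationals.
half = Fraction(1, 2)
prob = half ** 2 / 4 - 2 * half ** 3 / 3 + half ** 4 / 2

assert prob == Fraction(1, 96)
```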