
STAT 330

EXTRA PROBLEMS CHAPTERS 1 TO 5


1. Consider the following functions:
(a) $f(x) = kx(0.3)^x$, $x = 1, 2, 3, \ldots$
(b) $f(x) = k(1 + x^2)^{-1}$, $-\infty < x < \infty$
(c) $f(x) = ke^{-|x|}$, $-\infty < x < \infty$
(d) $f(x) = kx^2 e^{-\theta x}$, $x > 0$, $\theta > 0$
(e) $f(x) = kx^{-(\alpha + 1)}$, $x > \beta > 0$, $\alpha > 0$
In each case:
(i) Determine $k$ so that $f(x)$ is a pmf/pdf.
(ii) Let $X$ be a random variable with pmf/pdf $f(x)$. Find the cdf of $X$.
(iii) Find $P(1.1 < X \le 3)$ and $P(X > 1.1 \mid X \le 3)$.
(iv) Find $E(X)$ and $\operatorname{Var}(X)$.
2. If $E(|X|^k)$ exists for some $k \in \mathbb{Z}^+$, then show that $E(|X|^j)$ exists for $j = 1, 2, \ldots, k - 1$.
3. Let $f(x) = e^{-|x|}/2$ for $-\infty < x < \infty$. Derive the moment generating function $M(t)$ for this distribution. State the values of $t$ for which $M(t)$ exists and use the mgf to find the mean and variance.
4. For each of the following, find all the moments about the origin of $X$ if $X$ is a random variable with mgf $M(t)$ given by:
(a) $M(t) = (1 - t)^{-3}$, $|t| < 1$
(b) $M(t) = \dfrac{1 + t}{1 - t}$, $|t| < 1$
(c) $M(t) = \dfrac{e^t}{1 - t^2}$, $|t| < 1$
5. Suppose $X$ and $Y$ are discrete random variables such that $P(X = j) = p_j$ and $P(Y = j) = q_j$ for $j = 0, 1, 2, \ldots$. Suppose also that $M_X(t) = M_Y(t)$ for $t \in (-h, h)$, $h > 0$. Show that $X$ and $Y$ have the same probability distribution. (Hint: Compare $M_X(\ln s)$ and $M_Y(\ln s)$ and recall that if two power series are equal, then their coefficients are equal.)
6. Suppose $X$ and $Y$ are discrete random variables with joint pmf
\[ f(x, y) = kq^2 p^{x+y}, \quad x = 0, 1, 2, \ldots; \; y = 0, 1, 2, \ldots; \; 0 < p < 1, \; q = 1 - p. \]
Determine:
(a) the value of $k$
(b) the marginal pmf of $X$ and the marginal pmf of $Y$. Are $X$ and $Y$ independent random variables?
7. Suppose $X$ and $Y$ are discrete random variables with joint pmf
\[ f(x, y) = \frac{e^{-2}}{x!\,(y - x)!}, \quad y = 0, 1, 2, \ldots; \; x = 0, 1, 2, \ldots, y. \]
Find the marginal pmf of $X$ and the marginal pmf of $Y$. Are $X$ and $Y$ independent random variables?
8. Suppose $X$ and $Y$ are continuous random variables with joint pdf
\[ f(x, y) = k(x^2 + y), \quad -1 < x < 1, \; 0 < y < 1 - x^2. \]
Determine:
(a) the value of $k$
(b) the marginal pdf of $X$ and the marginal pdf of $Y$. Are $X$ and $Y$ independent random variables?
(c) $P(Y \le X + 1)$.
9. Suppose $X$ and $Y$ are random variables with joint pmf given in Problem 7. Find the joint mgf of $X$ and $Y$ and use it to calculate $\operatorname{Cov}(X, Y)$.
10. Suppose $X$ and $Y$ are continuous random variables with joint pdf
\[ f(x, y) = 2e^{-x-y}, \quad 0 < x < y < \infty. \]
Find the joint mgf of $X$ and $Y$ and use it to calculate $\operatorname{Cov}(X, Y)$.
11. The random variables $X_1$ and $X_2$ are said to have a bivariate normal distribution if the joint pdf is given by
\[ f(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left( \frac{x_1 - \mu_1}{\sigma_1} \right)^2 - 2\rho \left( \frac{x_1 - \mu_1}{\sigma_1} \right)\left( \frac{x_2 - \mu_2}{\sigma_2} \right) + \left( \frac{x_2 - \mu_2}{\sigma_2} \right)^2 \right] \right\} \]
where $-\infty < x_1 < \infty$ and $-\infty < x_2 < \infty$.
(a) Define $Z_1 = (X_1 - \mu_1)/\sigma_1$ and $Z_2 = (X_2 - \mu_2)/\sigma_2$. Show that the joint pdf of $Z_1$ and $Z_2$ is given by
\[ g(z_1, z_2) = \frac{1}{2\pi \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( z_1^2 - 2\rho z_1 z_2 + z_2^2 \right) \right\} \]
where $-\infty < z_1 < \infty$ and $-\infty < z_2 < \infty$.
(b) Show that $M(t_1, t_2) = \exp\left\{ \mu_1 t_1 + \mu_2 t_2 + \frac{1}{2}\left( \sigma_1^2 t_1^2 + \sigma_2^2 t_2^2 + 2\rho \sigma_1 \sigma_2 t_1 t_2 \right) \right\}$ for all $t_1, t_2 \in \mathbb{R}$. (Hint: Find the joint mgf of $Z_1$ and $Z_2$ first.)
(c) Find $\operatorname{Cov}(X_1, X_2)$.
(d) Show that $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$.
(e) Use mgfs to show that $X_1$ and $X_2$ are independent random variables if and only if $\rho = 0$.
(f) Find $f(x_1 | x_2)$ and $f(x_2 | x_1)$. What are these distributions?
12. Suppose $X$ and $Y$ are continuous random variables with joint pdf
\[ f(x, y) = 4xy, \quad 0 < x < 1, \; 0 < y < 1. \]
Calculate $P\left( X + Y \le \frac{1}{2} \right)$.
SOLUTIONS TO EXTRA PROBLEMS
1. (a) (i) $k = (0.7)^2/0.3$
(ii) $F(x) = 1 - (1 + 0.7x)(0.3)^x$ for $x = 1, 2, 3, \ldots$
(iii) 0.4263 and 0.4652
(iv) 1.8571 and 1.2245
(b) (i) $k = 1/\pi$
(ii) $F(x) = \arctan(x)/\pi + 1/2$, $-\infty < x < \infty$
(iii) 0.1324 and 0.1475
(iv) $E(X)$ and $\operatorname{Var}(X)$ do not exist
(c) (i) $k = 1/2$
(ii) $F(x) = \begin{cases} e^{x}/2 & \text{for } x \le 0 \\ 1 - e^{-x}/2 & \text{for } x > 0 \end{cases}$
(iii) 0.1415 and 0.1452
(iv) 0 and 2
(d) (i) $k = \theta^3/2$
(ii) $F(x) = 1 - \frac{1}{2} e^{-\theta x}\left( \theta^2 x^2 + 2\theta x + 2 \right)$, $x \ge 0$
(iii) $F(3) - F(1.1) = \frac{1}{2} e^{-11\theta/10}\left( 2 + \frac{11\theta}{5} + \frac{121\theta^2}{100} \right) - \frac{1}{2} e^{-3\theta}\left( 2 + 6\theta + 9\theta^2 \right)$ and $\dfrac{F(3) - F(1.1)}{1 - \frac{1}{2} e^{-3\theta}\left( 2 + 6\theta + 9\theta^2 \right)}$
(iv) $3/\theta$ and $3/\theta^2$
(e) (i) $k = \alpha \beta^{\alpha}$
(ii) $F(x) = 1 - \left( \dfrac{\beta}{x} \right)^{\alpha}$, $x \ge \beta$
(iii) Assuming $\beta \le 1.1$: $\left( \dfrac{10\beta}{11} \right)^{\alpha} - \left( \dfrac{\beta}{3} \right)^{\alpha}$ and $\dfrac{\left( 10\beta/11 \right)^{\alpha} - \left( \beta/3 \right)^{\alpha}}{1 - \left( \beta/3 \right)^{\alpha}}$
(iv) $\dfrac{\alpha\beta}{\alpha - 1}$ for $\alpha > 1$, and $\dfrac{\alpha\beta^2}{(\alpha - 2)(\alpha - 1)^2}$ for $\alpha > 2$
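A quick numerical check of the answers in (a) (a sketch in Python; the cutoff of 200 terms is an arbitrary point past which the series is negligible):

# Check of 1(a): f(x) = k*x*(0.3)**x for x = 1, 2, 3, ...
k = 0.7**2 / 0.3
p = {x: k * x * 0.3**x for x in range(1, 200)}
print(sum(p.values()))                           # ~1.0, confirming k
print(p[2] + p[3])                               # P(1.1 < X <= 3) ~ 0.4263
print((p[2] + p[3]) / (p[1] + p[2] + p[3]))      # P(X > 1.1 | X <= 3) ~ 0.4652
m = sum(x * px for x, px in p.items())
print(m, sum(x * x * px for x, px in p.items()) - m**2)   # ~1.8571, ~1.2245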
2. Without loss of generality, assume that $X$ is a continuous random variable with pdf $f(x)$; the discrete case is proven similarly. For any $j = 1, 2, \ldots, k - 1$, consider
\[ E\left( |X|^j \right) = \int_{-\infty}^{\infty} |x|^j f(x)\,dx = \int_{A} |x|^j f(x)\,dx + \int_{A^c} |x|^j f(x)\,dx, \]
where $A = \{ x \in \mathbb{R} : |x| \le 1 \}$ and $A^c = \{ x \in \mathbb{R} : |x| > 1 \}$. Note that $|x|^j \le 1$ for all $x \in A$. Therefore,
\[ E(|X|^j) \le \int_{A} f(x)\,dx + \int_{A^c} |x|^j f(x)\,dx \le \int_{-\infty}^{\infty} f(x)\,dx + \int_{A^c} |x|^j f(x)\,dx = 1 + \int_{A^c} |x|^j f(x)\,dx. \]
However, $|x|^j < |x|^k$ for all $x \in A^c$, which implies that $\int_{A^c} |x|^j f(x)\,dx < \int_{A^c} |x|^k f(x)\,dx < \infty$ since $E(|X|^k)$ exists (i.e., is finite). Therefore, $E(|X|^j) < \infty$, and so $E(|X|^j)$ exists.
3. $M(t) = (1 - t^2)^{-1}$ for $|t| < 1$
$E(X) = 0$
$\operatorname{Var}(X) = 2$
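For reference, the mgf computation splits the integral at zero:
\[ M(t) = \frac{1}{2} \int_{-\infty}^{\infty} e^{tx - |x|}\,dx = \frac{1}{2} \int_{-\infty}^{0} e^{(t+1)x}\,dx + \frac{1}{2} \int_{0}^{\infty} e^{(t-1)x}\,dx = \frac{1}{2}\left( \frac{1}{1+t} + \frac{1}{1-t} \right) = \frac{1}{1 - t^2}, \]
with both pieces convergent exactly when $|t| < 1$; then $E(X) = M'(0) = 0$ and $E(X^2) = M''(0) = 2$.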
4. (a) $\mu'_r = (r + 2)!/2$, $r = 0, 1, 2, \ldots$
(b) $\mu'_r = 2 \cdot r!$, $r = 1, 2, 3, \ldots$
(c) $\mu'_{2k-1} = \displaystyle\sum_{j=1}^{k} \frac{(2k-1)!}{(2j-1)!}$ for $k = 1, 2, 3, \ldots$ and $\mu'_{2k} = \displaystyle\sum_{j=0}^{k} \frac{(2k)!}{(2j)!}$ for $k = 0, 1, 2, \ldots$
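These formulas can be spot-checked with a symbolic series expansion (a sketch using sympy; the $r$-th moment about the origin is $r!$ times the coefficient of $t^r$ in $M(t)$):

import sympy as sp

t = sp.symbols('t')
for M in [(1 - t)**-3, (1 + t)/(1 - t), sp.exp(t)/(1 - t**2)]:
    s = sp.series(M, t, 0, 7).removeO()
    # moments about the origin, r = 0, 1, ..., 6
    print([sp.factorial(r) * s.coeff(t, r) for r in range(7)])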
5. Since $M_X(t) = M_Y(t)$, we have $\sum_{j=0}^{\infty} e^{tj} p_j = \sum_{j=0}^{\infty} e^{tj} q_j$ for $t \in (-h, h)$, $h > 0$. Let $t = \ln s$ where $s \in \left( e^{-h}, e^{h} \right)$. Then we get $\sum_{j=0}^{\infty} s^j p_j = \sum_{j=0}^{\infty} s^j q_j$ for all $s \in \left( e^{-h}, e^{h} \right)$. This says that two power series are equal over a continuous range of values, and so $p_j = q_j$ for all $j = 0, 1, 2, \ldots$. Thus, $X$ and $Y$ have the same probability distribution.
6. (a) $k = 1$
(b) $f_X(x) = qp^x$ for $x = 0, 1, 2, \ldots$
$f_Y(y) = qp^y$ for $y = 0, 1, 2, \ldots$
$X$ and $Y$ are independent.
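The value of $k$ follows from a product of geometric series (a worked step for reference):
\[ 1 = \sum_{x=0}^{\infty} \sum_{y=0}^{\infty} kq^2 p^{x+y} = kq^2 \left( \sum_{x=0}^{\infty} p^x \right)\left( \sum_{y=0}^{\infty} p^y \right) = \frac{kq^2}{(1-p)^2} = k, \]
and independence is immediate because $f(x, y) = (qp^x)(qp^y) = f_X(x) f_Y(y)$ on the whole (rectangular) support.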
7. $X \sim \operatorname{POI}(1)$ and $Y \sim \operatorname{POI}(2)$
$X$ and $Y$ are not independent.
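For reference, the marginals come from summing out the other variable (substituting $m = y - x$ in the first sum):
\[ f_X(x) = \sum_{y=x}^{\infty} \frac{e^{-2}}{x!\,(y-x)!} = \frac{e^{-2}}{x!} \sum_{m=0}^{\infty} \frac{1}{m!} = \frac{e^{-1}}{x!}, \qquad f_Y(y) = \sum_{x=0}^{y} \frac{e^{-2}}{x!\,(y-x)!} = \frac{e^{-2}}{y!} \sum_{x=0}^{y} \binom{y}{x} = \frac{e^{-2}\,2^y}{y!}. \]
Since the support of $x$ depends on $y$, $f(x, y) \ne f_X(x) f_Y(y)$ and the variables cannot be independent.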
8. (a) $k = 5/4$
(b) $f_X(x) = 5(1 - x^4)/8$ for $-1 < x < 1$
$f_Y(y) = 5(1 + 2y)\sqrt{1 - y}/6$ for $0 < y < 1$
$X$ and $Y$ are not independent.
(c) 13/16
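A numerical sketch confirming (a) and (c) (assuming scipy is available; dblquad integrates with $y$ as the inner variable):

from scipy.integrate import dblquad

# Total mass of x**2 + y over -1 < x < 1, 0 < y < 1 - x**2 is 4/5, so k = 5/4.
total, _ = dblquad(lambda y, x: x**2 + y, -1, 1, lambda x: 0, lambda x: 1 - x**2)
print(1 / total)   # ~1.25

# P(Y <= X + 1): integrate the pdf over y up to min(1 - x**2, x + 1).
prob, _ = dblquad(lambda y, x: 1.25 * (x**2 + y), -1, 1,
                  lambda x: 0, lambda x: min(1 - x**2, x + 1))
print(prob)        # ~0.8125 = 13/16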
9. $M(t_1, t_2) = \exp\left\{ \left( 1 + e^{t_1} \right) e^{t_2} - 2 \right\}$ for $-\infty < t_1, t_2 < \infty$
$\operatorname{Cov}(X, Y) = 1$
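For reference, the covariance follows from partial derivatives of the mgf at the origin:
\[ E(X) = \left. \frac{\partial M}{\partial t_1} \right|_{(0,0)} = 1, \qquad E(Y) = \left. \frac{\partial M}{\partial t_2} \right|_{(0,0)} = 2, \qquad E(XY) = \left. \frac{\partial^2 M}{\partial t_1 \partial t_2} \right|_{(0,0)} = 3, \]
so $\operatorname{Cov}(X, Y) = 3 - (1)(2) = 1$.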
10. $M(t_1, t_2) = \dfrac{2}{(1 - t_2)(2 - t_1 - t_2)}$ for $t_2 < 1$ and $t_1 + t_2 < 2$ (in particular for $t_1 < 1$, $t_2 < 1$)
$\operatorname{Cov}(X, Y) = 1/4$
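For reference, the mgf integral evaluates as
\[ M(t_1, t_2) = \int_0^{\infty} \int_x^{\infty} 2 e^{t_1 x + t_2 y} e^{-x-y}\,dy\,dx = \int_0^{\infty} \frac{2 e^{(t_1 + t_2 - 2)x}}{1 - t_2}\,dx = \frac{2}{(1 - t_2)(2 - t_1 - t_2)}, \]
where the inner integral requires $t_2 < 1$ and the outer requires $t_1 + t_2 < 2$.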
11. (a) Let $G(z_1, z_2)$ denote the joint cdf of $Z_1$ and $Z_2$. Then
\begin{align*}
G(z_1, z_2) &= P\left( Z_1 \le z_1, Z_2 \le z_2 \right) = P\left( X_1 \le \mu_1 + \sigma_1 z_1, \; X_2 \le \mu_2 + \sigma_2 z_2 \right) \\
&= \int_{-\infty}^{\mu_2 + \sigma_2 z_2} \int_{-\infty}^{\mu_1 + \sigma_1 z_1} f(x_1, x_2)\,dx_1\,dx_2.
\end{align*}
The joint pdf of $Z_1$ and $Z_2$ is then given by
\begin{align*}
g(z_1, z_2) &= \frac{\partial^2}{\partial z_1 \partial z_2} G(z_1, z_2) = f\left( \mu_1 + \sigma_1 z_1, \mu_2 + \sigma_2 z_2 \right) \sigma_1 \sigma_2 \\
&= \frac{1}{2\pi \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( z_1^2 - 2\rho z_1 z_2 + z_2^2 \right) \right\}
\end{align*}
for $-\infty < z_1, z_2 < \infty$.
(b) Let us first find
\begin{align*}
M_{Z_1, Z_2}(t_1, t_2) &= E\left( e^{t_1 Z_1 + t_2 Z_2} \right) \\
&= \int_{-\infty}^{\infty} e^{t_2 z_2} \int_{-\infty}^{\infty} e^{t_1 z_1} \frac{1}{2\pi \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( z_1^2 - 2\rho z_1 z_2 + z_2^2 \right) \right\} dz_1\,dz_2 \\
&= \int_{-\infty}^{\infty} \frac{e^{t_2 z_2}}{\sqrt{2\pi}}\, e^{-\frac{z_2^2}{2(1 - \rho^2)}} \int_{-\infty}^{\infty} \frac{e^{t_1 z_1}}{\sqrt{2\pi} \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( z_1^2 - 2\rho z_2 z_1 \right) \right\} dz_1\,dz_2. \quad (1)
\end{align*}
Now consider
\begin{align*}
e^{t_1 z_1} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( z_1^2 - 2\rho z_2 z_1 \right) \right\}
&= \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ z_1^2 - 2\rho z_2 z_1 - 2(1 - \rho^2) t_1 z_1 \right] \right\} \\
&= \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ z_1^2 - 2\left( \rho z_2 + (1 - \rho^2) t_1 \right) z_1 \right] \right\} \\
&= \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left( z_1 - \left[ \rho z_2 + (1 - \rho^2) t_1 \right] \right)^2 - \left( \rho z_2 + (1 - \rho^2) t_1 \right)^2 \right] \right\} \\
&= \exp\left\{ \frac{\left( \rho z_2 + (1 - \rho^2) t_1 \right)^2}{2(1 - \rho^2)} \right\} \exp\left\{ -\frac{\left( z_1 - \left[ \rho z_2 + (1 - \rho^2) t_1 \right] \right)^2}{2(1 - \rho^2)} \right\}.
\end{align*}
Substituting this back into equation (1) and recognizing that the inner integral equals 1 (it is the integral of a $N\left( \rho z_2 + (1 - \rho^2) t_1, \; 1 - \rho^2 \right)$ pdf), we obtain
\[ M_{Z_1, Z_2}(t_1, t_2) = \int_{-\infty}^{\infty} \frac{e^{t_2 z_2}}{\sqrt{2\pi}}\, e^{-\frac{z_2^2}{2(1 - \rho^2)}} \exp\left\{ \frac{\left( \rho z_2 + (1 - \rho^2) t_1 \right)^2}{2(1 - \rho^2)} \right\} dz_2. \quad (2) \]
Next consider
\begin{align*}
e^{t_2 z_2}\, e^{-\frac{z_2^2}{2(1 - \rho^2)}} \exp\left\{ \frac{\left( \rho z_2 + (1 - \rho^2) t_1 \right)^2}{2(1 - \rho^2)} \right\}
&= \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ z_2^2 - 2(1 - \rho^2) t_2 z_2 - \rho^2 z_2^2 - 2\rho(1 - \rho^2) t_1 z_2 - (1 - \rho^2)^2 t_1^2 \right] \right\} \\
&= e^{\frac{1}{2}(1 - \rho^2) t_1^2} \exp\left\{ -\frac{1}{2} \left( z_2^2 - 2 t_2 z_2 - 2\rho t_1 z_2 \right) \right\} \\
&= e^{\frac{1}{2}(1 - \rho^2) t_1^2} \exp\left\{ -\frac{1}{2} \left( z_2 - (t_2 + \rho t_1) \right)^2 \right\} \exp\left\{ \frac{1}{2} (t_2 + \rho t_1)^2 \right\} \\
&= e^{\frac{1}{2}\left( t_1^2 - \rho^2 t_1^2 + t_2^2 + 2 t_1 t_2 \rho + \rho^2 t_1^2 \right)} \exp\left\{ -\frac{1}{2} \left( z_2 - (t_2 + \rho t_1) \right)^2 \right\} \\
&= e^{\frac{1}{2}\left( t_1^2 + 2\rho t_1 t_2 + t_2^2 \right)} \exp\left\{ -\frac{1}{2} \left( z_2 - (t_2 + \rho t_1) \right)^2 \right\}.
\end{align*}
Substituting this back into equation (2) (the remaining integral is that of a $N(t_2 + \rho t_1, 1)$ pdf against $1/\sqrt{2\pi}$ and hence equals 1), we obtain
\[ M_{Z_1, Z_2}(t_1, t_2) = e^{\frac{1}{2}\left( t_1^2 + 2\rho t_1 t_2 + t_2^2 \right)}. \]
Finally, the joint mgf of $X_1$ and $X_2$ is given by
\begin{align*}
M(t_1, t_2) &= E\left( e^{t_1 X_1 + t_2 X_2} \right) = E\left( e^{t_1 (\mu_1 + \sigma_1 Z_1) + t_2 (\mu_2 + \sigma_2 Z_2)} \right) \\
&= e^{\mu_1 t_1 + \mu_2 t_2}\, E\left( e^{\sigma_1 t_1 Z_1 + \sigma_2 t_2 Z_2} \right) = e^{\mu_1 t_1 + \mu_2 t_2}\, M_{Z_1, Z_2}\left( \sigma_1 t_1, \sigma_2 t_2 \right) \\
&= \exp\left\{ \mu_1 t_1 + \mu_2 t_2 + \frac{1}{2}\left( \sigma_1^2 t_1^2 + 2\rho \sigma_1 \sigma_2 t_1 t_2 + \sigma_2^2 t_2^2 \right) \right\}.
\end{align*}
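The mgf formula in (b) can be spot-checked by simulation (a Monte Carlo sketch; the parameter and argument values are arbitrary test choices):

import numpy as np

rng = np.random.default_rng(0)
mu1, mu2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.6   # arbitrary test parameters
t1, t2 = 0.2, -0.1
cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]
x1, x2 = rng.multivariate_normal([mu1, mu2], cov, size=1_000_000).T
print(np.exp(t1*x1 + t2*x2).mean())        # empirical E[exp(t1*X1 + t2*X2)]
print(np.exp(mu1*t1 + mu2*t2
             + 0.5*(s1**2*t1**2 + s2**2*t2**2 + 2*rho*s1*s2*t1*t2)))
# the two printed values should agree closely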
(c) Using the joint mgf from (b),
\begin{align*}
E(X_1 X_2) &= \left. \frac{\partial^2}{\partial t_1 \partial t_2} M(t_1, t_2) \right|_{t_1 = t_2 = 0} \\
&= \left. \frac{\partial}{\partial t_2} \left[ \left( \mu_1 + \sigma_1^2 t_1 + \rho \sigma_1 \sigma_2 t_2 \right) M(t_1, t_2) \right] \right|_{t_1 = t_2 = 0} \\
&= \left. \left[ \rho \sigma_1 \sigma_2\, M(t_1, t_2) + \left( \mu_1 + \sigma_1^2 t_1 + \rho \sigma_1 \sigma_2 t_2 \right) \left( \mu_2 + \sigma_2^2 t_2 + \rho \sigma_1 \sigma_2 t_1 \right) M(t_1, t_2) \right] \right|_{t_1 = t_2 = 0} \\
&= \rho \sigma_1 \sigma_2 (1) + \mu_1 \mu_2 (1) = \rho \sigma_1 \sigma_2 + \mu_1 \mu_2.
\end{align*}
Since $E(X_i) = \partial M / \partial t_i \big|_{t_1 = t_2 = 0} = \mu_i$,
\[ \operatorname{Cov}(X_1, X_2) = E(X_1 X_2) - E(X_1) E(X_2) = \rho \sigma_1 \sigma_2 + \mu_1 \mu_2 - \mu_1 \mu_2 = \rho \sigma_1 \sigma_2. \]
(d) $M_{X_1}(t) = M(t, 0) = \exp\left\{ \mu_1 t + \frac{\sigma_1^2 t^2}{2} \right\}$ for $t \in \mathbb{R}$. We recognize this function as the mgf of a $N\left( \mu_1, \sigma_1^2 \right)$ random variable. By the uniqueness property of mgfs, $X_1 \sim N\left( \mu_1, \sigma_1^2 \right)$. The proof for $X_2$ is similar.
(e) If $X_1$ and $X_2$ are independent, then we have by a previous theorem that $\operatorname{Cov}(X_1, X_2) = 0$. Since $\operatorname{Cov}(X_1, X_2) = \rho \sigma_1 \sigma_2$, this implies that $\rho = 0$.
Conversely, suppose that $\rho = 0$. Then
\begin{align*}
M(t_1, t_2) &= \exp\left\{ \mu_1 t_1 + \mu_2 t_2 + \frac{\sigma_1^2 t_1^2}{2} + \frac{\sigma_2^2 t_2^2}{2} \right\} \\
&= \exp\left\{ \mu_1 t_1 + \frac{\sigma_1^2 t_1^2}{2} \right\} \exp\left\{ \mu_2 t_2 + \frac{\sigma_2^2 t_2^2}{2} \right\} = M_{X_1}(t_1)\, M_{X_2}(t_2).
\end{align*}
Since the joint mgf factors into the product of the marginal mgfs, we therefore have that $X_1$ and $X_2$ are independent random variables by a previous theorem.
(f) For the sake of notational convenience, let $u = (x_1 - \mu_1)/\sigma_1$ and $v = (x_2 - \mu_2)/\sigma_2$. Then
\begin{align*}
f(x_2 | x_1) = \frac{f(x_1, x_2)}{f_1(x_1)} &= \frac{\dfrac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\, e^{-\frac{1}{2(1 - \rho^2)}\left[ u^2 - 2\rho u v + v^2 \right]}}{\dfrac{1}{\sqrt{2\pi}\, \sigma_1}\, e^{-\frac{1}{2} u^2}} \\
&= \frac{1}{\sqrt{2\pi}\, \sigma_2 \sqrt{1 - \rho^2}}\, e^{-\frac{1}{2(1 - \rho^2)}\left[ v^2 - 2\rho u v + \rho^2 u^2 \right]} \\
&= \frac{1}{\sqrt{2\pi}\, \sigma_2 \sqrt{1 - \rho^2}}\, e^{-\frac{1}{2}\left( \frac{v - \rho u}{\sqrt{1 - \rho^2}} \right)^2}.
\end{align*}
Expressing this result in terms of the original variables, we therefore obtain
\[ f(x_2 | x_1) = \frac{1}{\sqrt{2\pi}\, \sigma_2 \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2} \left( \frac{x_2 - \left[ \mu_2 + \frac{\rho \sigma_2}{\sigma_1} (x_1 - \mu_1) \right]}{\sigma_2 \sqrt{1 - \rho^2}} \right)^2 \right\}, \quad -\infty < x_2 < \infty. \]
By inspection, the above pdf is normal with mean $\mu_2 + \frac{\rho \sigma_2}{\sigma_1} (x_1 - \mu_1)$ and variance $\sigma_2^2 (1 - \rho^2)$. The corresponding results for the conditional pdf of $X_1$ given $X_2 = x_2$ follow by symmetry.
12. 1/96
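For reference, the probability is the integral of the pdf over the triangle $x + y \le 1/2$ inside the unit square:
\[ P\left( X + Y \le \tfrac{1}{2} \right) = \int_0^{1/2} \int_0^{\frac{1}{2} - x} 4xy\,dy\,dx = \int_0^{1/2} 2x \left( \tfrac{1}{2} - x \right)^2 dx = \frac{1}{96}. \]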