
Multivariate Probability Distributions: Part II
Cyr Emile MLAN, Ph.D.
mlan@stat.uconn.edu


Introduction

Text Reference: Introduction to Probability and Its Applications, Chapter 6.
Reading Assignment: Sections 6.4-6.5, April 13-April 15

When we encounter problems that involve more than one random variable, we often combine the variables into a single function. For example, we may be interested in the life lengths of n electronic components within the same system.
We now discuss expected values and covariances of multivariate random variables.

Expected Values and Covariances

Definition 6.11:
Let g(X, Y) be a function of two discrete random variables X and Y, which have joint probability mass function p(x, y). Then the expected value of g(X, Y) is

E[g(X, Y)] = \sum_{x=-\infty}^{\infty} \sum_{y=-\infty}^{\infty} g(x, y) \, p(x, y)

(The sum is in fact over all values of (x, y) for which p(x, y) > 0.)
If X and Y are continuous random variables with joint density function f(x, y), then the expected value of g(X, Y) is

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f(x, y) \, dx \, dy

(The double integral is in fact over all values of (x, y) for which f(x, y) > 0.)
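In the discrete case, Definition 6.11 is just a double sum over the support of p(x, y). A minimal Python sketch of that computation (the pmf below is a made-up illustration, not one of the examples in these notes):

```python
# Expected value of g(X, Y) for a discrete joint pmf:
# E[g(X, Y)] = sum over all (x, y) with p(x, y) > 0 of g(x, y) * p(x, y).

def expected_value(g, pmf):
    """pmf maps (x, y) pairs to probabilities; g is any function of (x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

# A small hypothetical pmf, chosen so the probabilities sum to 1.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

print(expected_value(lambda x, y: x * y, pmf))   # E[XY] = 0.25
print(expected_value(lambda x, y: x + y, pmf))   # E[X + Y] = 1.0
```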

If g(X, Y) = XY, we have

E[XY] = \sum_{x=-\infty}^{\infty} \sum_{y=-\infty}^{\infty} x y \, p(x, y), if discrete
E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y \, f(x, y) \, dx \, dy, if continuous

If g(X, Y) = X^k, we have

E[X^k] = \sum_{x=-\infty}^{\infty} \sum_{y=-\infty}^{\infty} x^k \, p(x, y), if discrete
E[X^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^k \, f(x, y) \, dx \, dy, if continuous

If g(X, Y) = Y^k, we have

E[Y^k] = \sum_{x=-\infty}^{\infty} \sum_{y=-\infty}^{\infty} y^k \, p(x, y), if discrete
E[Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y^k \, f(x, y) \, dx \, dy, if continuous

Example 6.9:
A process for producing an industrial chemical yields a product containing two types of impurities. For a specified sample from this process, let X denote the proportion of impurities in the sample and let Y denote the proportion of type I impurities among all impurities found. Suppose that the joint distribution of X and Y can be modeled by the following probability density function:

f(x, y) = 2(1 - x), if 0 \le x \le 1, 0 \le y \le 1,
f(x, y) = 0, otherwise.

Find the expected value of the proportion of type I impurities in the sample.

Solution:
The proportion of type I impurities in the sample is given by XY. Thus,

E(XY) = \int_0^1 \int_0^1 2xy(1 - x) \, dy \, dx
      = B(2, 2) \int_0^1 \frac{x^{2-1}(1 - x)^{2-1}}{B(2, 2)} \, dx
      = B(2, 2) = \frac{\Gamma(2)\,\Gamma(2)}{\Gamma(4)} = \frac{1!\,1!}{3!} = \frac{1}{6}

Thus, we would expect 1/6 of the sample to be made up of type I impurities.
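As a quick sanity check on the algebra (my addition, not part of the text), a midpoint Riemann sum over a fine grid should land near 1/6:

```python
# Numeric check of Example 6.9: a midpoint Riemann sum of
# x * y * f(x, y) with f(x, y) = 2 * (1 - x) over the unit square.

n = 400
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h          # cell midpoint in x
    for j in range(n):
        y = (j + 0.5) * h      # cell midpoint in y
        total += x * y * 2 * (1 - x) * h * h

print(total)  # ≈ 0.1667 = 1/6
```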


Theorem 6.5:
Let c_1, c_2, ..., c_k be k constants and let g_1(X, Y), g_2(X, Y), ..., g_k(X, Y) be functions of two random variables X and Y. Then

E[c_1 g_1(X, Y) + c_2 g_2(X, Y) + ... + c_k g_k(X, Y)] = c_1 E[g_1(X, Y)] + c_2 E[g_2(X, Y)] + ... + c_k E[g_k(X, Y)]

In particular,

E[X + Y] = E[X] + E[Y]
E[X - Y] = E[X] - E[Y]
E[c g(X, Y)] = c E[g(X, Y)]
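Linearity of expectation is easy to confirm numerically. A short check (the joint pmf is hypothetical, chosen only so the probabilities sum to one):

```python
# Theorem 6.5 checked numerically:
# E[c1*g1(X,Y) + c2*g2(X,Y)] = c1*E[g1(X,Y)] + c2*E[g2(X,Y)].

pmf = {(1, 2): 0.3, (1, 5): 0.2, (4, 2): 0.4, (4, 5): 0.1}

def E(g):
    """Expected value of g(X, Y) under the discrete joint pmf."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

lhs = E(lambda x, y: 3 * x * y + 2 * (x - y))
rhs = 3 * E(lambda x, y: x * y) + 2 * E(lambda x, y: x - y)
print(lhs, rhs)  # equal: expectation is linear
```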


Example 6.10:
A nut company markets cans of deluxe mixed nuts containing almonds, cashews, and peanuts. Suppose the weight of each can is exactly 1 lb, but the weight contribution of each type of nut is random. Let X = the weight of almonds in a selected can and Y = the weight of cashews. Suppose that the joint distribution of X and Y can be modeled by the following probability density function:

f(x, y) = 24xy, if 0 \le x \le 1, 0 \le y \le 1, x + y \le 1,
f(x, y) = 0, otherwise.

If 1 lb of almonds costs the company $2.00, 1 lb of cashews costs $3.00, and 1 lb of peanuts costs $1.00, find the expected value of the total cost of the contents of a can.

Solution:
The total cost of the contents of a can is given by

g(X, Y) = 2X + 3Y + 1(1 - X - Y) = 1 + X + 2Y

Thus,

E[g(X, Y)] = 1 + E[X] + 2E[Y]
           = 1 + 24 \int_0^1 \int_0^{1-x} x(xy) \, dy \, dx + 48 \int_0^1 \int_0^{1-x} y(xy) \, dy \, dx
           = 1 + 12 \int_0^1 x^2 (1 - x)^2 \, dx + 16 \int_0^1 x (1 - x)^3 \, dx
           = 1 + 12 B(3, 3) + 16 B(2, 4)
           = 1 + \frac{2}{5} + \frac{4}{5} = \frac{11}{5}

Thus, the expected total cost of the contents of a can is $2.20.
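A numeric cross-check of this answer (my addition): a midpoint Riemann sum of (1 + x + 2y) f(x, y) over the triangular support should come out near 11/5 = 2.2.

```python
# Numeric check of Example 6.10: E[1 + X + 2Y] with f(x, y) = 24xy
# on the triangle x >= 0, y >= 0, x + y <= 1.

n = 500
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:
            # midpoint-rule contribution of one grid cell
            total += (1 + x + 2 * y) * 24 * x * y * h * h

print(total)  # close to 2.2
```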



Joke

One day a man went to see his dentist. During the examination, the man said, "My teeth are great. But let me tell you something. I never brush my teeth. I never use a rinse on my teeth. I never use a breath mint. I eat garlic all day long. And I've never had bad breath."
The dentist replied, "You need an operation."
"What kind of operation?"
"On your nose!"

If the random variables X and Y are independent, the computation of the expectation of g(X, Y) can be made easier when the function g(X, Y) is the product of a function of X and a function of Y.

Theorem 6.6:
Let X and Y be independent random variables and g(X) and h(Y) be functions of only X and Y, respectively. Then

E[g(X) h(Y)] = E[g(X)] E[h(Y)]
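The factorization in Theorem 6.6 can be seen numerically by building a joint pmf as a product of two marginals, so X and Y are independent by construction (the marginals here are hypothetical):

```python
# Theorem 6.6: for independent X and Y, E[g(X) h(Y)] = E[g(X)] * E[h(Y)].

px = {0: 0.3, 1: 0.7}              # marginal pmf of X
py = {2: 0.4, 5: 0.6}              # marginal pmf of Y
pmf = {(x, y): px[x] * py[y] for x in px for y in py}  # product => independent

def E(g):
    return sum(g(x, y) * p for (x, y), p in pmf.items())

g = lambda x, y: x ** 2            # a function of X only
h = lambda x, y: y + 1             # a function of Y only

lhs = E(lambda x, y: g(x, y) * h(x, y))
rhs = E(g) * E(h)
print(lhs, rhs)  # equal because the joint pmf factorizes
```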


Definition 6.12:
If X and Y are two random variables with means \mu_X and \mu_Y, the covariance of X and Y, written Cov(X, Y), is defined by

Cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]
          = \sum_{x=-\infty}^{\infty} \sum_{y=-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) \, p(x, y), if X and Y are discrete
          = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) \, f(x, y) \, dx \, dy, if X and Y are continuous

In general, a positive covariance is an indication that an increase in X results in an increase in Y on average. A negative covariance is an indication that an increase in X results in a decrease in Y on average.

Rationale for this definition:
Suppose X and Y have a strong positive relationship to one another, by which we mean that large values of X tend to occur with large values of Y and small values of X with small values of Y. Then most of the probability mass function or density function will be associated with (X - \mu_X) and (Y - \mu_Y) either both positive or both negative, so the product (X - \mu_X)(Y - \mu_Y) will tend to be positive.
Thus, for a strong positive relationship, Cov(X, Y) should be quite positive.


Suppose X and Y have a strong negative relationship to one another, by which we mean that large values of X tend to occur with small values of Y and small values of X with large values of Y. Then most of the probability mass function or density function will be associated with (X - \mu_X) and (Y - \mu_Y) of opposite signs, so the product (X - \mu_X)(Y - \mu_Y) will tend to be negative.
Thus, for a strong negative relationship, Cov(X, Y) should be quite negative.
If X and Y are not strongly related, positive and negative products will tend to cancel one another, yielding a covariance near 0.


Shortcut Computational Formula:

Cov(X, Y) = E[XY] - E[X] E[Y]

Example 6.11: Example 6.1 revisited
Compute Cov(X, Y).

Solution:

E[XY] = (100)(0)(.20) + (100)(100)(.10) + (100)(200)(.20) + (250)(0)(.05) + (250)(100)(.15) + (250)(200)(.30) = 23,750
E[X] = 100(.50) + 250(.50) = 175
E[Y] = 0(.25) + 100(.25) + 200(.50) = 125

Cov(X, Y) = 23,750 - (175)(125) = 1,875
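The same computation, scripted with the probabilities exactly as listed in this example:

```python
# Shortcut formula Cov(X, Y) = E[XY] - E[X]E[Y] applied to the
# joint pmf of Example 6.1.

pmf = {(100, 0): .20, (100, 100): .10, (100, 200): .20,
       (250, 0): .05, (250, 100): .15, (250, 200): .30}

E_XY = sum(x * y * p for (x, y), p in pmf.items())
E_X = sum(x * p for (x, y), p in pmf.items())
E_Y = sum(y * p for (x, y), p in pmf.items())
cov = E_XY - E_X * E_Y

print(E_XY, E_X, E_Y)  # 23750, 175, 125 (up to float rounding)
print(cov)             # 1875
```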


Example 6.12: Example 6.10 revisited
Compute Cov(X, Y).

Solution:

E[X] = \frac{2}{5} = E[Y] (from Example 6.10)

E[XY] = 24 \int_0^1 \int_0^{1-x} x^2 y^2 \, dy \, dx = 8 \int_0^1 x^2 (1 - x)^3 \, dx = 8 B(3, 4) = \frac{2}{15}

Cov(X, Y) = \frac{2}{15} - \left(\frac{2}{5}\right)\left(\frac{2}{5}\right) = -\frac{2}{75}
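A numeric check of this covariance (my addition), reusing the midpoint-sum idea over the triangular support:

```python
# Numeric check of Example 6.12: Cov(X, Y) for f(x, y) = 24xy on the
# triangle x >= 0, y >= 0, x + y <= 1 should be near 2/15 - (2/5)**2 = -2/75.

n = 500
h = 1.0 / n

def E(g):
    # midpoint Riemann sum of g(x, y) * f(x, y) over the triangle
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            if x + y <= 1.0:
                total += g(x, y) * 24 * x * y * h * h
    return total

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(cov)  # close to -2/75 ≈ -0.0267
```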

We have the following results:

Cov(X, X) = Var(X)
Cov(X, Y) = Cov(Y, X)
Cov(aX, bY) = ab Cov(X, Y)
If X, Y, and U are three random variables and a and b are constants, then
Cov(aX + bY, U) = a Cov(X, U) + b Cov(Y, U)
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
If X and Y are independent, then Cov(X, Y) = 0
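Two of these properties can be checked on a product pmf, where X and Y are independent by construction (the marginals are hypothetical):

```python
# Independence => Cov(X, Y) = 0, and then
# Var(X + Y) = Var(X) + Var(Y) + 2*Cov(X, Y) reduces to Var(X) + Var(Y).

px = {1: 0.5, 3: 0.5}
py = {0: 0.2, 4: 0.8}
pmf = {(x, y): px[x] * py[y] for x in px for y in py}

def E(g):
    return sum(g(x, y) * p for (x, y), p in pmf.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2
var_x = E(lambda x, y: x ** 2) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y ** 2) - E(lambda x, y: y) ** 2

print(cov)                     # 0 (independence)
print(var_sum, var_x + var_y)  # equal when Cov = 0
```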

Unfortunately, the covariance has a serious flaw that makes its values difficult to interpret: its computed values depend critically on the scale of measurement. A standardized version is the correlation, defined below.

Definition 6.13:
The correlation between two random variables X and Y, written Corr(X, Y) or \rho, is defined by

\rho = Corr(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X) \, Var(Y)}}

and we have

-1 \le Corr(X, Y) \le +1

The absolute value of the correlation measures the strength of the linear relationship between X and Y. The sign of the correlation indicates the direction of the linear association.
We have the following results:

If a and c are either both positive or both negative constants, then
Corr(aX + b, cY + d) = Corr(X, Y)
If X and Y are independent, then Corr(X, Y) = 0.
Thus, independent variables must be uncorrelated. However, \rho = 0 does not imply independence.
\rho = 1 or \rho = -1 if and only if Y can be written as Y = aX + b for some numbers a and b with a \ne 0.


Example 6.13: Example 6.1 revisited
Compute Corr(X, Y).

Solution:
We have

E[X^2] = (100)^2(.50) + (250)^2(.50) = 36,250
Var(X) = 36,250 - 175^2 = 5,625
E[Y^2] = (0)^2(.25) + (100)^2(.25) + (200)^2(.50) = 22,500
Var(Y) = 22,500 - 125^2 = 6,875

Corr(X, Y) = \frac{1875}{\sqrt{(5625)(6875)}} \approx .3015
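The same correlation, computed directly from the Example 6.1 pmf:

```python
# Corr(X, Y) = Cov(X, Y) / sqrt(Var(X) * Var(Y)) for the Example 6.1 pmf.

import math

pmf = {(100, 0): .20, (100, 100): .10, (100, 200): .20,
       (250, 0): .05, (250, 100): .15, (250, 200): .30}

def E(g):
    return sum(g(x, y) * p for (x, y), p in pmf.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_x = E(lambda x, y: x ** 2) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y ** 2) - E(lambda x, y: y) ** 2
corr = cov / math.sqrt(var_x * var_y)

print(corr)  # ≈ 0.3015
```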

Example 6.14: Example 6.10 revisited
Compute Corr(X, Y).

Solution:

E[X^2] = 24 \int_0^1 \int_0^{1-x} x^3 y \, dy \, dx = \frac{1}{5} = E[Y^2]
Var(X) = \frac{1}{5} - \left(\frac{2}{5}\right)^2 = \frac{1}{25} = Var(Y)

Corr(X, Y) = \frac{-2/75}{\sqrt{\left(\frac{1}{25}\right)\left(\frac{1}{25}\right)}} = -\frac{2}{3}

The Expected Value and Variance of Linear Functions of Random Variables

Theorem 6.7:
Let Y_1, Y_2, ..., Y_n and X_1, X_2, ..., X_m be random variables. For any constants a_1, a_2, ..., a_n and b_1, b_2, ..., b_m, the following hold:

(a) E\left[\sum_{i=1}^{n} a_i Y_i\right] = \sum_{i=1}^{n} a_i E(Y_i)

(b) Var\left(\sum_{i=1}^{n} a_i Y_i\right) = \sum_{i=1}^{n} a_i^2 Var(Y_i) + 2 \sum_{1 \le i < j \le n} a_i a_j Cov(Y_i, Y_j)

(c) Cov\left(\sum_{i=1}^{n} a_i Y_i, \sum_{j=1}^{m} b_j X_j\right) = \sum_{i=1}^{n} \sum_{j=1}^{m} a_i b_j Cov(Y_i, X_j)

In particular, if Y_1, Y_2, ..., Y_n are independent, then

Var\left(\sum_{i=1}^{n} a_i Y_i\right) = \sum_{i=1}^{n} a_i^2 Var(Y_i)
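Part (b) of Theorem 6.7 can be checked numerically for n = 2 (the joint pmf and the coefficients below are hypothetical):

```python
# Theorem 6.7(b) with n = 2:
# Var(a1*Y1 + a2*Y2) = a1^2 Var(Y1) + a2^2 Var(Y2) + 2 a1 a2 Cov(Y1, Y2).

pmf = {(1, 2): 0.3, (1, 5): 0.2, (4, 2): 0.4, (4, 5): 0.1}
a1, a2 = 2.0, -3.0

def E(g):
    return sum(g(y1, y2) * p for (y1, y2), p in pmf.items())

var_lin = (E(lambda y1, y2: (a1 * y1 + a2 * y2) ** 2)
           - E(lambda y1, y2: a1 * y1 + a2 * y2) ** 2)

var1 = E(lambda y1, y2: y1 ** 2) - E(lambda y1, y2: y1) ** 2
var2 = E(lambda y1, y2: y2 ** 2) - E(lambda y1, y2: y2) ** 2
cov12 = E(lambda y1, y2: y1 * y2) - E(lambda y1, y2: y1) * E(lambda y1, y2: y2)

rhs = a1 ** 2 * var1 + a2 ** 2 * var2 + 2 * a1 * a2 * cov12
print(var_lin, rhs)  # equal, as Theorem 6.7(b) asserts
```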

Example 6.14:
Let X_1, X_2, ..., X_n be independent random variables with E(X_i) = \mu and Var(X_i) = \sigma^2. (These variables may denote the outcomes of n independent trials of an experiment.) Define

\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i

Show that E(\bar{X}) = \mu and Var(\bar{X}) = \frac{\sigma^2}{n}.

Solution:

E(\bar{X}) = \frac{1}{n} \sum_{i=1}^{n} E(X_i) = \frac{n\mu}{n} = \mu

Var(\bar{X}) = \frac{1}{n^2} \sum_{i=1}^{n} Var(X_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}
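A small simulation (my addition) illustrates both results, using uniform(0, 1) draws, for which \mu = 1/2 and \sigma^2 = 1/12:

```python
# Simulation check: sample means of n i.i.d. uniform(0, 1) draws should
# have mean near mu = 1/2 and variance near sigma^2 / n = (1/12)/n.

import random

random.seed(0)
n = 10           # sample size
reps = 100_000   # number of simulated sample means

means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]
m = sum(means) / reps
v = sum((xbar - m) ** 2 for xbar in means) / reps

print(m)  # close to mu = 0.5
print(v)  # close to (1/12)/10 ≈ 0.00833
```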


Joke

I have a friend who just started the "A" diet. He can only eat food that begins with the letter "A": a steak, a chicken, a hamburger, a bowl of French fries....
I have been on a diet for three weeks. The only thing that I lost was three weeks.
