
CE 29 Lecture No: 2C

February 4, 2019

JOINT
PROBABILITY
DISTRIBUTION
Asst. Prof. Maxell P. Lumbera
Institute of Civil Engineering
University of the Philippines Diliman
CE 29 Lec 2C Joint Probability Distribution

LESSON OBJECTIVES

At the end of the lecture, the student should be able to:


1. Use joint probability mass functions and joint probability density
functions to calculate probabilities.
2. Calculate marginal and conditional probability distributions from joint
probability mass functions.
3. Interpret and calculate covariances and correlations between random
variables.


INTRODUCTION

So far, our study of random variables and their probability distributions has
been restricted to one-dimensional sample spaces: we recorded the outcomes
of an experiment as values assumed by a single random variable.

There will be situations, however, where we may find it desirable to record
the simultaneous outcomes of several random variables.

For example, we might measure the amount of precipitate P and volume V of
gas released from a controlled chemical experiment, giving rise to a
two-dimensional sample space consisting of the outcomes (p, v).

JOINT PROBABILITY DISTRIBUTIONS

If X and Y are two discrete random variables, the probability distribution for
their simultaneous occurrence can be represented by a function with
values f(x,y) for any pair of values (x, y) within the range of the random
variables X and Y.

It is customary to refer to this function as the joint probability distribution


of X and Y.

f(x, y) = P(X = x and Y = y)


JOINT PROBABILITY MASS FUNCTION

The function f_XY(x, y) is a joint probability mass function of the discrete
random variables X and Y if

1. f_XY(x, y) ≥ 0
2. Σ_x Σ_y f_XY(x, y) = 1
3. P(X = x, Y = y) = f_XY(x, y)
4. P((X, Y) ∈ A) = Σ_{(x, y) ∈ A} f_XY(x, y) for X and Y within region A
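These properties can be checked mechanically for any finite table of probabilities. A minimal Python sketch (the pmf values below are a made-up illustration, not from the lecture):

```python
# Hypothetical joint pmf of (X, Y), stored as {(x, y): probability}.
f_xy = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.2}

# Property 1: every probability is non-negative.
assert all(p >= 0 for p in f_xy.values())

# Property 2: the probabilities sum to 1 (within floating-point tolerance).
assert abs(sum(f_xy.values()) - 1.0) < 1e-9

# Property 4: P((X, Y) in A) sums f_XY over the pairs in the region A.
def prob_region(f_xy, region):
    """Sum f_XY(x, y) over every (x, y) pair in the region A."""
    return sum(p for (x, y), p in f_xy.items() if (x, y) in region)

print(round(prob_region(f_xy, {(0, 0), (1, 1)}), 2))  # 0.1 + 0.2 = 0.3
```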


EXAMPLE 1

A large insurance agency services a number of customers who have purchased both a homeowner’s
policy and an automobile policy from the agency. For each type of policy, a deductible amount must
be specified. For an automobile policy, the choices are $100 and $250, whereas for a homeowner’s
policy, the choices are 0, $100, and $200. Suppose an individual with both types of policy is selected
at random from the agency’s files. Let X = the deductible amount on the auto policy and Y = the
deductible amount on the homeowner’s policy. Suppose the joint pmf is given in the accompanying
joint probability table

P(X, Y)              Y
              0       100     200
X    100      0.20    0.10    0.20
     250      0.05    0.15    0.30

What is P(100, 100)? P(Y ≥ 100)?
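One way to answer this is to store the table as a dictionary keyed by (x, y) and sum the relevant cells; a minimal Python sketch:

```python
# Joint pmf from the table: keys are (x, y) deductibles in dollars.
f_xy = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

p_100_100 = f_xy[(100, 100)]                                 # P(X=100, Y=100)
p_y_ge_100 = sum(p for (x, y), p in f_xy.items() if y >= 100)

print(p_100_100)              # 0.1
print(round(p_y_ge_100, 2))   # 0.10 + 0.20 + 0.15 + 0.30 = 0.75
```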



MARGINAL DISTRIBUTION (DISCRETE)

The marginal probability distribution for X is found by summing the
probabilities in each column, whereas the marginal probability distribution
for Y is found by summing the probabilities in each row.

f_X(x) = Σ_y f(x, y)

f_Y(y) = Σ_x f(x, y)


EXAMPLE 2

The joint probability distribution of the number of bars of signal strength
of a computer server and its response time is shown in the table. Find f_X(3).

y = Response time    x = Number of Bars of Signal Strength
(nearest second)     1       2       3       f(y)
1                    0.01    0.02    0.25    0.28
2                    0.02    0.03    0.20    0.25
3                    0.02    0.10    0.05    0.17
4                    0.15    0.10    0.05    0.30
f(x)                 0.20    0.25    0.55    1.00
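The marginal formulas can be applied programmatically by summing the joint pmf over the other variable; a Python sketch for this table:

```python
from collections import defaultdict

# Joint pmf from the table: keys are (x, y) = (bars of signal, response time).
f_xy = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

# Marginals: f_X(x) = sum over y, f_Y(y) = sum over x.
f_x, f_y = defaultdict(float), defaultdict(float)
for (x, y), p in f_xy.items():
    f_x[x] += p
    f_y[y] += p

print(round(f_x[3], 2))  # f_X(3) = 0.25 + 0.20 + 0.05 + 0.05 = 0.55
```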


MEAN AND VARIANCE OF A MARGINAL


DISTRIBUTION

E(X) and V(X) can be obtained by first calculating the marginal probability
distribution of X and then determining E(X) and V(X) by the usual method.

E(X) = Σ_R x · f_X(x)

V(X) = Σ_R x² · f_X(x) − μ_X²

E(Y) = Σ_R y · f_Y(y)

V(Y) = Σ_R y² · f_Y(y) − μ_Y²


EXAMPLE 3

The joint probability distribution of the number of bars of signal strength
of a computer server and its response time is shown in the table. What is the
expected value of X? What is the mean of Y? What is the variance of X? What
is the standard deviation of Y?

y = Response time    x = Number of Bars of Signal Strength
(nearest second)     1       2       3       f(y)
1                    0.01    0.02    0.25    0.28
2                    0.02    0.03    0.20    0.25
3                    0.02    0.10    0.05    0.17
4                    0.15    0.10    0.05    0.30
f(x)                 0.20    0.25    0.55    1.00


EXAMPLE 3

Append x·f(x) and x²·f(x) to the X marginal, and y·f(y) and y²·f(y) to the
Y marginal:

x            1      2      3      Sum
f(x)         0.20   0.25   0.55   1.00
x·f(x)       0.20   0.50   1.65   2.35
x²·f(x)      0.20   1.00   4.95   6.15

y            1      2      3      4      Sum
f(y)         0.28   0.25   0.17   0.30   1.00
y·f(y)       0.28   0.50   0.51   1.20   2.49
y²·f(y)      0.28   1.00   1.53   4.80   7.61

E(X) = 2.35     V(X) = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275
E(Y) = 2.49     V(Y) = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099

σ_X = √0.6275 ≈ 0.7922     σ_Y = √1.4099 ≈ 1.1874
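The same marginal-moment computation can be verified in a few lines of Python, using the marginal pmfs read off the table:

```python
# Marginal pmfs from the signal-strength table.
f_x = {1: 0.20, 2: 0.25, 3: 0.55}
f_y = {1: 0.28, 2: 0.25, 3: 0.17, 4: 0.30}

def mean_var(pmf):
    """E and V by the usual single-variable formulas."""
    mean = sum(v * p for v, p in pmf.items())
    var = sum(v**2 * p for v, p in pmf.items()) - mean**2
    return mean, var

ex, vx = mean_var(f_x)
ey, vy = mean_var(f_y)
print(round(ex, 4), round(vx, 4))  # 2.35 0.6275
print(round(ey, 4), round(vy, 4))  # 2.49 1.4099
```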



CONDITIONAL PROBABILITY

Given discrete random variables X and Y with joint probability mass function
f_XY(x, y), the conditional probability mass function of Y given X = x is

Recall: P(A | B) = P(A ∩ B) / P(B)

P(Y = y | X = x) = P(X = x and Y = y) / P(X = x) = f_XY(x, y) / f_X(x)


EXAMPLE 4

For the discrete random variables given, what is the conditional mean of Y
given X=1?

y = Response time    x = Number of Bars of Signal Strength
(nearest second)     1       2       3       f(y)
1                    0.01    0.02    0.25    0.28
2                    0.02    0.03    0.20    0.25
3                    0.02    0.10    0.05    0.17
4                    0.15    0.10    0.05    0.30
f(x)                 0.20    0.25    0.55    1.00


EXAMPLE 4

ANSWERS: Conditioning on each x, f(y | x) = f_XY(x, y) / f_X(x):

y            f(y|x=1)   f(y|x=2)   f(y|x=3)   y·f(y|x=1)   y²·f(y|x=1)
1            0.050      0.080      0.455      0.05         0.05
2            0.100      0.120      0.364      0.20         0.40
3            0.100      0.400      0.091      0.30         0.90
4            0.750      0.400      0.091      3.00         12.00
Sum          1.000      1.000      1.000      3.55         13.35

E(Y | X = 1) = 3.55
V(Y | X = 1) = 13.35 − 3.55² = 13.35 − 12.6025 = 0.7475

The mean response time given one bar of signal is 3.55 seconds, with a
variance of 0.7475.
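The conditional pmf, mean, and variance can be computed directly from the joint table; a Python sketch:

```python
# Joint pmf of (x, y) = (bars of signal, response time), from the table.
f_xy = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

# f_X(1): marginal probability of one bar of signal.
f_x1 = sum(p for (x, y), p in f_xy.items() if x == 1)   # 0.20

# Conditional pmf f(y | x=1) = f_XY(1, y) / f_X(1).
f_y_given_1 = {y: p / f_x1 for (x, y), p in f_xy.items() if x == 1}

mean = sum(y * p for y, p in f_y_given_1.items())
var = sum(y**2 * p for y, p in f_y_given_1.items()) - mean**2
print(round(mean, 4), round(var, 4))  # 3.55 0.7475
```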

INDEPENDENT RANDOM VARIABLES

For random variables X and Y, if any one of the following properties is true,
the others are also true, and X and Y are independent:

1. f_XY(x, y) = f_X(x) · f_Y(y) for all x and y
2. f_{Y|x}(y) = f_Y(y) for all x with f_X(x) > 0
3. f_{X|y}(x) = f_X(x) for all y with f_Y(y) > 0
4. P(X ∈ A, Y ∈ B) = P(X ∈ A) · P(Y ∈ B) for any sets A and B
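Independence requires f_XY(x, y) = f_X(x)·f_Y(y) at every point. For the signal-strength table from Example 2 this fails, so those variables are dependent; a quick Python check:

```python
# Joint pmf and marginals from the signal-strength table.
f_xy = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}
f_x = {1: 0.20, 2: 0.25, 3: 0.55}
f_y = {1: 0.28, 2: 0.25, 3: 0.17, 4: 0.30}

# Independence requires f_XY(x, y) = f_X(x)·f_Y(y) at EVERY cell.
independent = all(
    abs(p - f_x[x] * f_y[y]) < 1e-9 for (x, y), p in f_xy.items()
)
print(independent)  # False: f_XY(1,1) = 0.01 but f_X(1)·f_Y(1) = 0.056
```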

COVARIANCE

Covariance is a measure of the linear relationship between two random


variables.

First, we need the expected value of a function of two random variables. Let
h(X, Y) denote the function of interest; in the discrete case,

E[h(X, Y)] = Σ_x Σ_y h(x, y) · f_XY(x, y)


COVARIANCE

The covariance between the random variables X and Y, denoted Cov(X, Y) or
σ_XY, is

σ_XY = Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X · μ_Y

The units of σ_XY are the units of X times the units of Y.

−∞ < σ_XY < ∞


EXAMPLE 5

A large insurance agency services a number of customers who have purchased both a
homeowner’s policy and an automobile policy from the agency. For each type of policy, a
deductible amount must be specified. For an automobile policy, the choices are $100 and $250,
whereas for a homeowner’s policy, the choices are 0, $100, and $200. Suppose an individual
with both types of policy is selected at random from the agency’s files. Let X = the deductible
amount on the auto policy and Y = the deductible amount on the homeowner’s policy. Suppose
the joint pmf is given in the accompanying joint probability table:
P(X, Y)              Y
              0       100     200
X    100      0.20    0.10    0.20
     250      0.05    0.15    0.30
What is the covariance of X and Y?
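Using Cov(X, Y) = E(XY) − μ_X·μ_Y, the computation reduces to three sums over the table; a Python sketch:

```python
# Joint pmf of deductibles (x, y) in dollars, from the table.
f_xy = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

mu_x = sum(x * p for (x, y), p in f_xy.items())      # E(X) = 175
mu_y = sum(y * p for (x, y), p in f_xy.items())      # E(Y) = 125
e_xy = sum(x * y * p for (x, y), p in f_xy.items())  # E(XY) = 23750

cov = e_xy - mu_x * mu_y                             # 23750 - 21875 = 1875
print(round(cov, 2))
```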



CORRELATION

Correlation is another measure of the linear relationship between two random
variables.

The correlation between random variables X and Y, denoted ρ_XY, is

ρ_XY = Cov(X, Y) / √(V(X) · V(Y)) = σ_XY / (σ_X · σ_Y)

Since σ_X > 0 and σ_Y > 0, ρ_XY and Cov(X, Y) have the same sign.

Note: If X and Y are independent random variables,

σ_XY = ρ_XY = 0

ρ_XY = 0 is a necessary, but not a sufficient, condition for independence.

EXAMPLE 6

A large insurance agency services a number of customers who have purchased both a
homeowner’s policy and an automobile policy from the agency. For each type of policy, a
deductible amount must be specified. For an automobile policy, the choices are $100 and $250,
whereas for a homeowner’s policy, the choices are 0, $100, and $200. Suppose an individual
with both types of policy is selected at random from the agency’s files. Let X = the deductible
amount on the auto policy and Y = the deductible amount on the homeowner’s policy. Suppose
the joint pmf is given in the accompanying joint probability table:
P(X, Y)              Y
              0       100     200
X    100      0.20    0.10    0.20
     250      0.05    0.15    0.30

What is the correlation of X and Y?
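The correlation just rescales the covariance by the two standard deviations; a Python sketch continuing the same table:

```python
import math

# Joint pmf of deductibles (x, y) in dollars, from the table.
f_xy = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

def expect(g):
    """E[g(X, Y)] = sum of g(x, y)·f_XY(x, y) over the table."""
    return sum(g(x, y) * p for (x, y), p in f_xy.items())

mu_x, mu_y = expect(lambda x, y: x), expect(lambda x, y: y)
var_x = expect(lambda x, y: x**2) - mu_x**2        # 5625, so sigma_X = 75
var_y = expect(lambda x, y: y**2) - mu_y**2        # 6875
cov = expect(lambda x, y: x * y) - mu_x * mu_y     # 1875

rho = cov / math.sqrt(var_x * var_y)
print(round(rho, 4))  # 0.3015
```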



QUIZ

Suppose that two batteries are randomly chosen without replacement from the
following group of 12 batteries:
3 new
4 used (working)
5 defective
Let X denote the number of new batteries chosen.
Let Y denote the number of used batteries chosen.

a. Find f_XY(x, y), the joint probability mass function.
b. Find the mean of X and the mean of Y.
c. Find the conditional mean and variance of X given Y = 0.
CE 29 Lecture No: 2D
February 4, 2019

JOINT
PROBABILITY
DISTRIBUTION
Asst. Prof. Maxell P. Lumbera
Institute of Civil Engineering
University of the Philippines Diliman

LESSON OBJECTIVES

At the end of the lecture, the student should be able to:


1. Calculate means and variances for linear combinations of random
variables.
2. Understand the significance of Chebyshev's Theorem and the Central Limit
Theorem in the study of probability.


NOTES ABOUT THE EXPECTED VALUE AS


A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. E(c) = c
2. E(cX)=cE(X)
3. E(c + X)=c + E(X)
4. E(X+Y)= E(X) + E(Y)


NOTES ABOUT THE EXPECTED VALUE AS


A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…
1. E(c) = c
2. E(cX) = cE(X)
3. E(c + X) = c + E(X)
4. E(X + Y) = E(X) + E(Y)

Example: If you cash in soda cans in a junk shop, you always get 5 cents per
can. Therefore, there's no randomness. You always expect to (and do) get
5 cents.


NOTES ABOUT THE EXPECTED VALUE AS


A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. E(c) = c
2. E(cX) = cE(X)
3. E(c + X) = c + E(X)
4. E(X + Y) = E(X) + E(Y)

Example: If the casino charges $10 per game instead of $1, then the casino
expects to make 10 times as much on average from the game.

NOTES ABOUT THE EXPECTED VALUE AS


A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. E(c) = c
2. E(cX) = cE(X)
3. E(c + X) = c + E(X)
4. E(X + Y) = E(X) + E(Y)

Example: If the casino throws in a free drink worth exactly $5.00 every time
you play a game, you always expect to (and do) gain an extra $5.00 regardless
of the outcome of the game.

NOTES ABOUT THE EXPECTED VALUE AS


A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. E(c) = c
2. E(cX) = cE(X)
3. E(c + X) = c + E(X)
4. E(X + Y) = E(X) + E(Y)

Example: If you play the lottery twice, you expect to lose: −$0.86 + −$0.86.
NOTE: This works even if X and Y are dependent! It does not require
independence!

NOTES ABOUT THE VARIANCE AS A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. V(c) = 0
2. V(c + X) = V(X)
3. V(cX) = c² V(X)
4. V(X + Y) = V(X) + V(Y)


NOTES ABOUT THE VARIANCE AS A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. V(c) = 0   (Constants don't vary!)
2. V(c + X) = V(X)
3. V(cX) = c² V(X)
4. V(X + Y) = V(X) + V(Y)


NOTES ABOUT THE VARIANCE AS A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. V(c) = 0
2. V(c + X) = V(X)
3. V(cX) = c² V(X)
4. V(X + Y) = V(X) + V(Y)

Adding a constant to every instance of a random variable doesn't change the
variability; it just shifts the whole distribution by c. If everybody grew
5 inches suddenly, the variability in the population would still be the same.


NOTES ABOUT THE VARIANCE AS A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. V(c) = 0
2. V(c + X) = V(X)
3. V(cX) = c² V(X)
4. V(X + Y) = V(X) + V(Y)

Multiplying each instance of the random variable by c makes the distribution
c times as wide, which corresponds to c² as much variance (deviation
squared). For example, if everyone suddenly became twice as tall, there'd be
twice the deviation and 4 times the variance in heights in the population.


NOTES ABOUT THE VARIANCE AS A MATHEMATICAL OPERATOR

If c = a constant number (i.e., not a variable) and X and Y are any random
variables…

1. V(c) = 0
2. V(c + X) = V(X)
3. V(cX) = c² V(X)
4. V(X + Y) = V(X) + V(Y)   (APPLICABLE ONLY IF X and Y are INDEPENDENT!)

With two random variables, you have more opportunity for variation, unless
they vary together (are dependent, or have covariance):
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)
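The general identity Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y) can be verified numerically on the insurance-deductible pmf from Example 5; a Python sketch:

```python
# Joint pmf of deductibles (x, y) in dollars, from the insurance example.
f_xy = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

def expect(g):
    """E[g(X, Y)] = sum of g(x, y)·f_XY(x, y) over the table."""
    return sum(g(x, y) * p for (x, y), p in f_xy.items())

mu_x, mu_y = expect(lambda x, y: x), expect(lambda x, y: y)
var_x = expect(lambda x, y: x**2) - mu_x**2
var_y = expect(lambda x, y: y**2) - mu_y**2
cov = expect(lambda x, y: x * y) - mu_x * mu_y

# Variance of the sum S = X + Y computed directly from the joint pmf...
var_s = expect(lambda x, y: (x + y)**2) - expect(lambda x, y: x + y)**2

# ...matches Var(X) + Var(Y) + 2·Cov(X, Y).
assert abs(var_s - (var_x + var_y + 2 * cov)) < 1e-6
print(round(var_s, 2))  # 16250.0
```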


LINEAR FUNCTIONS OF RANDOM


VARIABLES

• A function of random variables is itself a random variable.

• A function of random variables can be formed by either linear or
nonlinear relationships. We limit our discussion to linear functions.

• Given random variables X1, X2, …, Xp and constants c1, c2, …, cp,

Y = c1·X1 + c2·X2 + … + cp·Xp

is a linear combination of X1, X2, …, Xp.


MEAN OF A LINEAR FUNCTION

If X1, X2, …, Xp are random variables, c1, c2, …, cp are constants, and
Y = c1X1 + c2X2 + … + cpXp, then

E(Y) = μ_{c1X1 + c2X2 + … + cpXp} = c1·μ_{X1} + c2·μ_{X2} + … + cp·μ_{Xp}

Recall:
E(c) = c
E(cX) = cE(X)
E(c + X) = c + E(X)
E(X + Y) = E(X) + E(Y)
μ_{aX+bY} = a·μ_X + b·μ_Y

• Note: The random variables X1, X2, …, Xp are not necessarily independent.


VARIANCE OF A LINEAR FUNCTION

If !" , !$ , . . , !& are random variables, and '1, '2 … '+ are constants, and , = '1!1 +
'2!2 + … + '+!+ then
/ , = 01$232415354⋯41737 = '"$ 03$2 + '$$ 03$5 + ⋯ + '&$ 03$7 + 2 8 8 '9 '; '<=(!9 , !; )
9:;

If !" , !$ , . . , !& are independent, then '<=(!9 , !; )=0

/ , = 01$232415354⋯41737 = '"$ 03$2 + '$$ 03$5 + ⋯ + '&$ 03$7 Recall:


/(') = 0
σ 2aX+bY = a 2σ 2X + b 2σ Y2 /(' + !) = /(!)
/('!) = ' $/(!)
/(! + ,) = /(!) + /(,)
σ 2X +Y = σ 2X + σ Y2

MEAN OF A SAMPLE MEAN

The sample mean is also a random variable derived from the values of the
samples (X1, X2, …, Xn are random variables which are independent and
identically distributed).
It is also called a 'statistic': any function of the random variables
constituting a random sample.
It is also an estimator of the population mean.

X̄ = (1/n)(X1 + X2 + … + Xn)

E(X̄) = μ_X̄ = (1/n)(μ_{X1} + μ_{X2} + … + μ_{Xn}) = (1/n)(μ + μ + … + μ) = μ

E(X̄) = μ

Recall:
E(c) = c
E(cX) = cE(X)
E(c + X) = c + E(X)
E(X + Y) = E(X) + E(Y)
E(Y) = μ_{c1X1+…+cnXn} = c1·μ_{X1} + c2·μ_{X2} + … + cn·μ_{Xn}

VARIANCE OF A SAMPLE MEAN

The sample mean is also a random variable derived from the values of the
samples (X1, X2, …, Xn are random variables which are independent and
identically distributed).

X̄ = (1/n)(X1 + X2 + … + Xn)

σ²_X̄ = (1/n)²·σ²_{X1} + (1/n)²·σ²_{X2} + … + (1/n)²·σ²_{Xn}
     = (1/n)²·(σ² + σ² + … + σ²) = σ²/n

σ²_X̄ = σ²/n
σ_X̄ = σ/√n

Recall:
• V(c) = 0
• V(c + X) = V(X)
• V(cX) = c²·V(X)
• V(X + Y) = V(X) + V(Y)

EXAMPLE 1

Suppose that the random variable X represents the length of a punched part
in centimeters. Let Y be the length of the part in millimeters. If E(X) = 5
and V(X) = 0.25, what are the mean and variance of Y?
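The unit conversion is a linear function Y = 10X, so the constant rules apply directly; a Python sketch:

```python
# Y = 10·X converts centimeters to millimeters (c = 10 is a constant).
c = 10
e_x, v_x = 5, 0.25                 # E(X) in cm, V(X) in cm^2
e_y = c * e_x                      # E(cX) = c·E(X) -> 50 mm
v_y = c**2 * v_x                   # V(cX) = c^2·V(X) -> 25 mm^2
print(e_y, v_y)  # 50 25.0
```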


EXAMPLE 2

A semiconductor product consists of three layers. Suppose that the variances
in thickness of the first, second, and third layers are 25, 40, and 30
square nanometers, respectively, and that the layer thicknesses are
independent. What is the variance of the thickness of the final product?
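Since the layers are independent, the covariance terms vanish and the variances simply add; a one-line Python check:

```python
# Independent layer thicknesses X1, X2, X3, so for T = X1 + X2 + X3
# the variances add: V(T) = V(X1) + V(X2) + V(X3).
variances = [25, 40, 30]           # square nanometers
v_total = sum(variances)
print(v_total)  # 95
```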


EXAMPLE 3

Consider the perimeter of a part. Let X1 and X2 denote the length and
width of a part with standard deviations 0.1 and 0.2 centimeters,
respectively. Suppose that the covariance between X1 and X2 is 0.02.
Determine the variance of the perimeter of a part.
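The perimeter is the linear combination P = 2X1 + 2X2, so the variance formula with the covariance term applies; a Python sketch:

```python
# Perimeter P = 2·X1 + 2·X2, so
# V(P) = 2^2·V(X1) + 2^2·V(X2) + 2·(2)(2)·Cov(X1, X2).
s1, s2, cov = 0.1, 0.2, 0.02       # std devs in cm, covariance in cm^2
v_p = 2**2 * s1**2 + 2**2 * s2**2 + 2 * 2 * 2 * cov
print(round(v_p, 2))  # 0.04 + 0.16 + 0.16 = 0.36
```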


EXAMPLE 4

An investor has $200 to invest. He will invest $100 in each of two
investments. Let X and Y denote the returns on the two investments. Assume
that the mean of each return is $5, the standard deviation of each return
is $2, and the correlation between the returns is 0.5. Find the mean and
standard deviation of the total return from the two investments.
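The total return T = X + Y combines the mean and variance rules for a sum of correlated variables; a Python sketch:

```python
import math

# Total return T = X + Y; each return has mean $5, std dev $2, rho = 0.5.
mu, sd, rho = 5, 2, 0.5
e_t = mu + mu                                  # E(T) = E(X) + E(Y) = 10
cov = rho * sd * sd                            # Cov(X, Y) = rho·sd_X·sd_Y = 2
v_t = sd**2 + sd**2 + 2 * cov                  # V(T) = 4 + 4 + 4 = 12
print(e_t, round(math.sqrt(v_t), 4))  # 10 3.4641
```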


CHEBYSHEV’S THEOREM

The probability that any random variable X will assume a value within k
standard deviations of the mean is at least 1 − 1/k². That is,

P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²

This is a lower bound only; if the probability distribution is known, it is
better to compute the actual value of the probability.


EXAMPLE 5

The length of a rivet manufactured by a certain process has a mean of
50 mm and a standard deviation of 0.45 mm. What is the least possible
value for the probability that the length of the rivet is within
49.1 – 50.9 mm?


EXAMPLE 6

A random variable X has a mean μ = 8, a variance σ² = 9, and an unknown
probability distribution. Find
a. P(−4 < X < 20)
b. P(|X − 8| < 6)
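Both parts follow from Chebyshev's bound with σ = 3: the interval (−4, 20) is μ ± 4σ and |X − 8| < 6 is μ ± 2σ. A small Python sketch:

```python
mu, sigma = 8, 3

def chebyshev_lower_bound(k):
    """P(mu - k·sigma < X < mu + k·sigma) >= 1 - 1/k^2, any distribution."""
    return 1 - 1 / k**2

k_a = (20 - mu) / sigma            # (-4, 20) = mu ± 4·sigma, so k = 4
k_b = 6 / sigma                    # |X - 8| < 6 means k = 2
print(chebyshev_lower_bound(k_a))  # 0.9375
print(chebyshev_lower_bound(k_b))  # 0.75
```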


CENTRAL LIMIT THEOREM

X̄ ~ N(μ, σ²/n)

X1 + X2 + … + Xn ~ N(nμ, nσ²)

For most populations, if the sample size is greater than 30, the Central
Limit Theorem approximation is good.
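The theorem can be illustrated by simulation: sample repeatedly from a decidedly non-normal population and watch the sample means concentrate around μ with variance σ²/n. A Python sketch (sample size and repetition count are arbitrary choices):

```python
import random
import statistics

random.seed(1)
n, reps = 36, 2000   # sample size above 30; many repeated samples

# Draw repeated samples from a non-normal Uniform(0, 1) population
# and record each sample mean.
sample_means = [
    statistics.fmean(random.random() for _ in range(n)) for _ in range(reps)
]

# CLT: E(X_bar) ≈ mu = 0.5 and Var(X_bar) ≈ sigma^2/n = (1/12)/36 ≈ 0.0023.
print(round(statistics.fmean(sample_means), 2))      # ≈ 0.5
print(round(statistics.variance(sample_means), 3))   # ≈ 0.002
```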

CENTRAL LIMIT THEOREM:


MATHEMATICAL PROOF

If X is a random variable from any distribution with known mean E(X) and
variance Var(X), then the expected value and variance of the average of n
independent observations of X are:

E(X̄_n) = E((x1 + x2 + … + xn)/n) = (E(x) + E(x) + … + E(x))/n
        = n·E(x)/n = E(x)

Var(X̄_n) = Var((x1 + x2 + … + xn)/n) = (Var(x) + Var(x) + … + Var(x))/n²
          = n·Var(x)/n² = Var(x)/n
n
THANKS!
Any questions?
You can send a query at
mplumbera@up.edu.ph

