February 4, 2019
JOINT PROBABILITY DISTRIBUTION
Asst. Prof. Maxell P. Lumbera
Institute of Civil Engineering
University of the Philippines Diliman
CE 29 Lec 2C Joint Probability Distribution
LESSON OBJECTIVES
INTRODUCTION
If X and Y are two discrete random variables, the probability distribution for
their simultaneous occurrence can be represented by a function with
values f(x,y) for any pair of values (x, y) within the range of the random
variables X and Y.
The function $f_{XY}(x, y)$ is a joint probability mass function of the discrete random variables X and Y if
1. $f_{XY}(x, y) \ge 0$
2. $\sum_x \sum_y f_{XY}(x, y) = 1$
3. $P(X = x, Y = y) = f_{XY}(x, y)$
4. $P((X, Y) \in A) = \sum_{(x, y) \in A} f_{XY}(x, y)$ for any region A in the range of X and Y
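Properties 1 and 2 can be checked mechanically. A minimal Python sketch, using a hypothetical joint pmf stored as a dictionary (all values are illustrative assumptions, not from the slides):

```python
# Hypothetical joint pmf of two discrete random variables X and Y,
# stored as {(x, y): probability}. Values are illustrative only.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def is_valid_joint_pmf(pmf, tol=1e-9):
    """Check the defining properties: f(x, y) >= 0 and the total sums to 1."""
    nonnegative = all(p >= 0 for p in pmf.values())
    sums_to_one = abs(sum(pmf.values()) - 1.0) < tol
    return nonnegative and sums_to_one

print(is_valid_joint_pmf(joint_pmf))  # True
```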
EXAMPLE 1
A large insurance agency services a number of customers who have purchased both a homeowner’s
policy and an automobile policy from the agency. For each type of policy, a deductible amount must
be specified. For an automobile policy, the choices are $100 and $250, whereas for a homeowner’s
policy, the choices are 0, $100, and $200. Suppose an individual with both types of policy is selected
at random from the agency’s files. Let X = the deductible amount on the auto policy and Y = the
deductible amount on the homeowner’s policy. Suppose the joint pmf is given in the accompanying
joint probability table
[Joint probability table P(X, Y): rows X = $100, $250; columns Y = $0, $100, $200; the probability entries were not recovered from the source.]
The marginal probability mass functions are obtained by summing over the other variable:
$f_X(x) = \sum_y f_{XY}(x, y)$
$f_Y(y) = \sum_x f_{XY}(x, y)$
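Computing marginals is just summing the joint pmf over the other variable. A sketch in Python; since the table's entries were not reproduced above, the probabilities below are illustrative assumptions:

```python
from collections import defaultdict

# Illustrative joint pmf over (X, Y) = (auto deductible, homeowner deductible).
# The numeric probabilities are assumptions for demonstration only.
joint_pmf = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

def marginal(pmf, axis):
    """Sum the joint pmf over the other variable (axis 0 -> f_X, axis 1 -> f_Y)."""
    out = defaultdict(float)
    for pair, p in pmf.items():
        out[pair[axis]] += p
    return dict(out)

f_X = marginal(joint_pmf, 0)  # e.g. {100: 0.50, 250: 0.50}
f_Y = marginal(joint_pmf, 1)  # e.g. {0: 0.25, 100: 0.25, 200: 0.50}
```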
EXAMPLE 2
E(X) and V(X) can be obtained by first calculating the marginal probability
distribution of X and then determining E(X) and V(X) by the usual method.
$E(X) = \sum_{R} x \cdot f_X(x)$
$V(X) = \sum_{R} x^2 \cdot f_X(x) - \mu_X^2$
$E(Y) = \sum_{R} y \cdot f_Y(y)$
$V(Y) = \sum_{R} y^2 \cdot f_Y(y) - \mu_Y^2$
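These formulas translate directly to code. A sketch using an assumed marginal pmf (values hypothetical):

```python
# E(X) and V(X) from a marginal pmf, following the formulas above.
# The pmf values are illustrative assumptions, not from the slides.
f_X = {100: 0.5, 250: 0.5}

def mean(pmf):
    # E(X) = sum over x of x * f_X(x)
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    # V(X) = sum over x of x^2 * f_X(x), minus mu_X^2
    mu = mean(pmf)
    return sum(x**2 * p for x, p in pmf.items()) - mu**2

mu_X = mean(f_X)       # 175.0
var_X = variance(f_X)  # 5625.0
```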
EXAMPLE 3
CONDITIONAL PROBABILITY
Given random variables X and Y with joint probability function $f_{XY}(x, y)$, the conditional probability distribution of Y given X = x is
$f_{Y|x}(y) = \dfrac{f_{XY}(x, y)}{f_X(x)}$
Recall: $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$
In the discrete case, $P(Y = y \mid X = x) = \dfrac{P(X = x, Y = y)}{P(X = x)}$
EXAMPLE 4
For the discrete random variables given, what is the conditional mean of Y
given X=1?
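The conditional mean is computed by restricting the joint pmf to X = 1, renormalizing by f_X(1), and averaging y. A minimal sketch with assumed pmf values (the table from the example is not reproduced here):

```python
# Conditional mean E(Y | X = x) from a joint pmf.
# The pmf values below are assumptions chosen for illustration.
joint_pmf = {
    (1, 0): 0.1, (1, 1): 0.2, (1, 2): 0.1,
    (2, 0): 0.2, (2, 1): 0.2, (2, 2): 0.2,
}

def conditional_mean_Y_given_x(pmf, x):
    # f_X(x) = sum over y of f(x, y); E(Y|x) = sum over y of y * f(x, y) / f_X(x)
    fx = sum(p for (xi, _), p in pmf.items() if xi == x)
    return sum(y * p for (xi, y), p in pmf.items() if xi == x) / fx

cond_mean = conditional_mean_Y_given_x(joint_pmf, 1)  # ≈ 1.0 for these values
```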
INDEPENDENCE
For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent:
1. $f_{XY}(x, y) = f_X(x)\,f_Y(y)$ for all x and y
2. $f_{Y|x}(y) = f_Y(y)$ for all x and y with $f_X(x) > 0$
3. $f_{X|y}(x) = f_X(x)$ for all x and y with $f_Y(y) > 0$
4. $P(X \in A, Y \in B) = P(X \in A)\,P(Y \in B)$ for any sets A and B
Recall: for independent events, $P(A \cap B) = P(A)\,P(B)$.
COVARIANCE
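Covariance measures how two random variables vary together: $\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - E(X)E(Y)$. A minimal Python sketch computing it from an assumed joint pmf:

```python
# Covariance from a joint pmf via Cov(X, Y) = E(XY) - E(X)E(Y).
# The pmf below is an illustrative assumption.
joint_pmf = {
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

def covariance(pmf):
    ex = sum(x * p for (x, y), p in pmf.items())      # E(X)
    ey = sum(y * p for (x, y), p in pmf.items())      # E(Y)
    exy = sum(x * y * p for (x, y), p in pmf.items()) # E(XY)
    return exy - ex * ey

cov_XY = covariance(joint_pmf)  # ≈ 0.1 for these values
```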
EXAMPLE 5
A large insurance agency services a number of customers who have purchased both a
homeowner’s policy and an automobile policy from the agency. For each type of policy, a
deductible amount must be specified. For an automobile policy, the choices are $100 and $250,
whereas for a homeowner’s policy, the choices are 0, $100, and $200. Suppose an individual
with both types of policy is selected at random from the agency’s files. Let X = the deductible
amount on the auto policy and Y = the deductible amount on the homeowner’s policy. Suppose
the joint pmf is given in the accompanying joint probability table:
[Joint probability table P(X, Y): rows X = $100, $250; columns Y = $0, $100, $200; the probability entries were not recovered from the source.]
CORRELATION
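Correlation rescales covariance to the interval [−1, 1]: $\rho_{XY} = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$. A minimal sketch, again over an assumed joint pmf:

```python
import math

# Correlation rho = Cov(X, Y) / (sigma_X * sigma_Y) from a joint pmf.
# The pmf values are illustrative assumptions.
joint_pmf = {
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

def moments(pmf, axis):
    """Return (mean, variance) of the variable at the given axis of the pair."""
    vals = [(pair[axis], p) for pair, p in pmf.items()]
    m = sum(v * p for v, p in vals)
    var = sum(v**2 * p for v, p in vals) - m**2
    return m, var

def correlation(pmf):
    mx, vx = moments(pmf, 0)
    my, vy = moments(pmf, 1)
    exy = sum(x * y * p for (x, y), p in pmf.items())
    return (exy - mx * my) / math.sqrt(vx * vy)

rho = correlation(joint_pmf)  # always between -1 and 1
```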
EXAMPLE 6
A large insurance agency services a number of customers who have purchased both a
homeowner’s policy and an automobile policy from the agency. For each type of policy, a
deductible amount must be specified. For an automobile policy, the choices are $100 and $250,
whereas for a homeowner’s policy, the choices are 0, $100, and $200. Suppose an individual
with both types of policy is selected at random from the agency’s files. Let X = the deductible
amount on the auto policy and Y = the deductible amount on the homeowner’s policy. Suppose
the joint pmf is given in the accompanying joint probability table:
[Joint probability table P(X, Y): rows X = $100, $250; columns Y = $0, $100, $200; the probability entries were not recovered from the source.]
QUIZ
Suppose that two batteries are randomly chosen without replacement from the following group of 12 batteries:
3 new
4 used (working)
5 defective
Let X denote the number of new batteries chosen
Let Y denote the number of used batteries chosen
LESSON OBJECTIVES
If c is a constant (i.e., not a variable) and X and Y are any random variables:
1. E(c) = c
2. E(cX) = cE(X)
3. E(c + X) = c + E(X)
4. E(X + Y) = E(X) + E(Y)
Example for E(c) = c: if you cash in soda cans at a junk shop, you always get 5 cents per can. There is no randomness; you always expect to (and do) get 5 cents.
Example for E(cX) = cE(X): if the casino charges $10 per game instead of $1, then the casino expects to make 10 times as much on average from the game.
Example for E(c + X) = c + E(X): if the casino throws in a free drink worth exactly $5.00 every time you play a game, you always expect to (and do) gain an extra $5.00 regardless of the outcome of the game.
Example for E(X + Y) = E(X) + E(Y): if you play the lottery twice, you expect to lose $0.86 + $0.86.
NOTE: this rule works even if X and Y are dependent! It does not require independence.
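The four rules can be verified numerically on small pmfs, including the claim that E(X + Y) = E(X) + E(Y) holds for dependent variables. A sketch (all pmf values are assumptions):

```python
# Numerical check of the expectation rules on small pmfs.
c = 3
pmf_X = {0: 0.5, 10: 0.5}                              # E(X) = 5

def E(pmf):
    return sum(v * p for v, p in pmf.items())

EX = E(pmf_X)
E_cX = E({c * v: p for v, p in pmf_X.items()})         # should equal c * E(X)
E_c_plus_X = E({c + v: p for v, p in pmf_X.items()})   # should equal c + E(X)

# E(X + Y) = E(X) + E(Y) even for a perfectly dependent pair:
joint = {(0, 0): 0.5, (10, 10): 0.5}
E_sum = sum((x + y) * p for (x, y), p in joint.items())
EX_j = sum(x * p for (x, y), p in joint.items())
EY_j = sum(y * p for (x, y), p in joint.items())
```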
If c is a constant (i.e., not a variable) and X and Y are any random variables:
1. V(c) = 0
2. V(c + X) = V(X)
3. V(cX) = c²V(X)
4. V(X + Y) = V(X) + V(Y), provided X and Y are independent
For V(c) = 0: constants don't vary!
For V(c + X) = V(X): adding a constant to every instance of a random variable doesn't change the variability; it just shifts the whole distribution by c. If everybody grew 5 inches suddenly, the variability in the population would still be the same.
For V(X + Y) = V(X) + V(Y): APPLICABLE ONLY IF X AND Y ARE INDEPENDENT! With two random variables, you have more opportunity for variation, unless they vary together (are dependent, i.e., have nonzero covariance). In general,
V(X + Y) = V(X) + V(Y) + 2Cov(X, Y)
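The general identity with the covariance term can be verified on a deliberately dependent pair. A sketch (joint pmf values assumed):

```python
# Check V(X + Y) = V(X) + V(Y) + 2*Cov(X, Y) on a dependent pair.
joint = {(0, 0): 0.5, (1, 1): 0.5}   # X and Y are perfectly dependent

def ev(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

vx = ev(lambda x, y: x**2) - ev(lambda x, y: x)**2
vy = ev(lambda x, y: y**2) - ev(lambda x, y: y)**2
cov = ev(lambda x, y: x * y) - ev(lambda x, y: x) * ev(lambda x, y: y)
v_sum = ev(lambda x, y: (x + y)**2) - ev(lambda x, y: x + y)**2

print(abs(v_sum - (vx + vy + 2 * cov)) < 1e-9)  # True
```

Note that here V(X + Y) = 1.0, which is larger than V(X) + V(Y) = 0.5: ignoring the covariance term would understate the variability.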
If $X_1, X_2, \ldots, X_n$ are random variables, $c_1, c_2, \ldots, c_n$ are constants, and $Y = c_1 X_1 + c_2 X_2 + \cdots + c_n X_n$, then
$E(Y) = c_1 E(X_1) + c_2 E(X_2) + \cdots + c_n E(X_n)$
• Note: the random variables $X_1, X_2, \ldots, X_n$ are not necessarily independent.
If $X_1, X_2, \ldots, X_n$ are random variables, $c_1, c_2, \ldots, c_n$ are constants, and $Y = c_1 X_1 + c_2 X_2 + \cdots + c_n X_n$, then
$V(Y) = \sigma^2_{c_1 X_1 + c_2 X_2 + \cdots + c_n X_n} = c_1^2 \sigma^2_{X_1} + c_2^2 \sigma^2_{X_2} + \cdots + c_n^2 \sigma^2_{X_n} + 2 \sum\sum_{i<j} c_i c_j \,\mathrm{Cov}(X_i, X_j)$
The sample mean is also a random variable derived from the values of the samples ($X_1, X_2, \ldots, X_n$ are random variables which are independent and identically distributed).
It is also called a 'statistic': any function of the random variables constituting a random sample.
It is also an estimator of the population mean.
$\bar{X} = \frac{1}{n}(X_1 + X_2 + \cdots + X_n)$
$E(\bar{X}) = \mu_{\bar{X}} = \frac{1}{n}\left(\mu_{X_1} + \mu_{X_2} + \cdots + \mu_{X_n}\right) = \frac{1}{n}(\mu + \mu + \cdots + \mu) = \mu$
Recall: E(c) = c, E(cX) = cE(X), E(c + X) = c + E(X), E(X + Y) = E(X) + E(Y), and $E(Y) = c_1\mu_{X_1} + c_2\mu_{X_2} + \cdots + c_n\mu_{X_n}$.
The variance of the sample mean follows from the same linear-combination rules:
$\bar{X} = \frac{1}{n}(X_1 + X_2 + \cdots + X_n)$
$V(\bar{X}) = \sigma^2_{\bar{X}} = \left(\frac{1}{n}\right)^2 \sigma^2_{X_1} + \left(\frac{1}{n}\right)^2 \sigma^2_{X_2} + \cdots + \left(\frac{1}{n}\right)^2 \sigma^2_{X_n} = \left(\frac{1}{n}\right)^2 (\sigma^2 + \sigma^2 + \cdots + \sigma^2) = \frac{\sigma^2}{n}$
$\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}}$
Recall: V(c) = 0, V(c + X) = V(X), V(cX) = c²V(X), and V(X + Y) = V(X) + V(Y) for independent X and Y.
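These two results, E(X̄) = μ and V(X̄) = σ²/n, can be checked by simulation. A sketch using Python's standard library; the normal distribution and the sample sizes are arbitrary choices:

```python
import random
import statistics

# Simulate many sample means of n iid draws and compare their empirical
# mean and variance with mu and sigma^2/n.
random.seed(0)
mu, sigma, n, trials = 10.0, 2.0, 25, 20000

means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(trials)]

emp_mean = statistics.fmean(means)      # close to mu = 10.0
emp_var = statistics.variance(means)    # close to sigma**2 / n = 0.16
```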
EXAMPLE 1
EXAMPLE 2
EXAMPLE 3
Consider the perimeter of a part. Let X1 and X2 denote the length and
width of a part with standard deviations 0.1 and 0.2 centimeters,
respectively. Suppose that the covariance between X1 and X2 is 0.02.
Determine the variance of the perimeter of a part.
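Since the perimeter is P = 2X₁ + 2X₂, the linear-combination variance formula gives the answer directly. A quick check in Python:

```python
# Example 3 worked with the linear-combination variance formula:
# P = 2*X1 + 2*X2, so V(P) = 2^2*V(X1) + 2^2*V(X2) + 2*(2)(2)*Cov(X1, X2).
sd1, sd2, cov = 0.1, 0.2, 0.02

var_perimeter = 2**2 * sd1**2 + 2**2 * sd2**2 + 2 * 2 * 2 * cov
print(round(var_perimeter, 2))  # 0.36 square centimeters
```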
EXAMPLE 4
An investor has $200 to invest and will put $100 into each of two investments. Let X and Y denote the returns on the two investments. Assume that the mean of each return is $5, the standard deviation of each return is $2, and the correlation between the returns is 0.5. Find the mean and standard deviation of the total return from the two investments.
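With total return T = X + Y, the mean and variance follow from the rules above, using Cov(X, Y) = ρσ_Xσ_Y. A quick check:

```python
import math

# Example 4 worked: total return T = X + Y with correlated returns.
mu, sd, rho = 5.0, 2.0, 0.5

mean_T = mu + mu                    # E(T) = E(X) + E(Y) = 10
cov = rho * sd * sd                 # Cov(X, Y) = rho * sd_X * sd_Y = 2.0
var_T = sd**2 + sd**2 + 2 * cov     # V(T) = V(X) + V(Y) + 2*Cov = 12.0
sd_T = math.sqrt(var_T)             # about 3.46
```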
CHEBYSHEV’S THEOREM
$P(\mu - k\sigma < X < \mu + k\sigma) \ge 1 - \dfrac{1}{k^2}$
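The bound can be checked empirically against any distribution with finite variance. A sketch using an exponential distribution (an arbitrary choice; Exp(1) has mean 1 and standard deviation 1):

```python
import random

# Empirical check of Chebyshev's bound: the fraction of draws within
# k standard deviations of the mean must be at least 1 - 1/k^2.
random.seed(1)
mu = sigma = 1.0          # Exp(1): mean 1, standard deviation 1
k = 2.0
draws = [random.expovariate(1.0) for _ in range(100000)]

frac_within = sum(abs(x - mu) < k * sigma for x in draws) / len(draws)
bound = 1 - 1 / k**2      # 0.75

print(frac_within >= bound)  # True
```

For this distribution the actual probability (about 0.95) is well above the guaranteed 0.75: Chebyshev is a worst-case bound over all distributions.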
EXAMPLE 5
EXAMPLE 6
$\bar{X} \sim N\!\left(\mu, \dfrac{\sigma^2}{n}\right)$
$\sum X_i \sim N(n\mu, n\sigma^2)$
If X is a random variable from any distribution with known mean, E(X), and
variance, Var(X), then the expected value and variance of the average of n
observations of X is:
$E(\bar{X}_n) = E\!\left(\dfrac{\sum_{i=1}^{n} x_i}{n}\right) = \dfrac{\sum_{i=1}^{n} E(x)}{n} = \dfrac{nE(x)}{n} = E(x)$