
Math 2274 Lecture 17- Joint distributions of several random variables

Chapter 4, Lecture notes - X, Y are Discrete

Joint pf f is f(x,y) = P(X=x AND Y=y) = P(X=x, Y=y)


Joint df F is F(x,y) = P(X≤x AND Y≤y) = P(X≤x, Y≤y)
Mathematical properties:
a) i) Σ_x Σ_y f(x,y) = 1    ii) f(x,y) ≥ 0
b) i) 0 ≤ F(x,y) ≤ 1    ii) F(x,y) is non-decreasing as a function of x and y.

Example: Suppose the range of X is {0,1,2,3} and the range of Y is the same as that of X. State the
range of (X,Y) and calculate f(1,2), F(1,2) and F(2,3), where the joint pf f is given below (a computational check follows the table).

f(x,y)      x = 0   x = 1   x = 2   x = 3
y = 3        1/32    2/32    2/32    1/32
y = 2        3/32    2/32    2/32    3/32
y = 1        2/32    3/32    3/32    2/32
y = 0        1/32    1/32    2/32    2/32
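The range of (X,Y) here is the set of all 16 pairs {0,1,2,3} × {0,1,2,3}. These values can be verified numerically; the following is a minimal Python sketch (the dictionary layout and the helper F are illustrative choices, not the notes' notation) that encodes the table and evaluates f and F straight from the definitions above.

from fractions import Fraction

# Joint pf: rows of the table are indexed by y, columns by x = 0, 1, 2, 3,
# and the entries are in 32nds.
table = {3: [1, 2, 2, 1],
         2: [3, 2, 2, 3],
         1: [2, 3, 3, 2],
         0: [1, 1, 2, 2]}
f = {(x, y): Fraction(n, 32) for y, row in table.items() for x, n in enumerate(row)}

def F(x0, y0):
    """Joint df: F(x0, y0) = P(X <= x0, Y <= y0)."""
    return sum(p for (x, y), p in f.items() if x <= x0 and y <= y0)

print(sum(f.values()))  # sanity check: total probability is 1
print(f[(1, 2)])        # f(1,2) = 2/32 = 1/16
print(F(1, 2))          # F(1,2) = 12/32 = 3/8
print(F(2, 3))          # F(2,3) = 24/32 = 3/4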

Marginal Distribution of X (or of Y) = Distribution of X alone (or of Y alone).

The marginal pf. g of X is defined to be:

g(x) = P(X=x) for all x in the range of X.

= Σ_y f(x, y)

The marginal pf. h of Y is defined to be:

h(y) = P(Y=y) for all y in the range of Y.

= Σ_x f(x, y)

Example: Find the marginal pfs of X and Y respectively in the previous example.
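The marginals of the table above can be read off by summing rows and columns; the short Python sketch below (again an illustrative layout, not the notes' own working) does exactly that.

from fractions import Fraction

# Same joint pf as before, in 32nds; rows indexed by y, columns by x = 0..3.
table = {3: [1, 2, 2, 1], 2: [3, 2, 2, 3], 1: [2, 3, 3, 2], 0: [1, 1, 2, 2]}
f = {(x, y): Fraction(n, 32) for y, row in table.items() for x, n in enumerate(row)}

g = {x: sum(f[(x, y)] for y in range(4)) for x in range(4)}  # g(x) = sum_y f(x, y)
h = {y: sum(f[(x, y)] for x in range(4)) for y in range(4)}  # h(y) = sum_x f(x, y)

for x in range(4):
    print("g(%d) = %s" % (x, g[x]))  # 7/32, 1/4, 9/32, 1/4
for y in range(4):
    print("h(%d) = %s" % (y, h[y]))  # 3/16, 5/16, 5/16, 3/16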

Note: See other properties of joint distributions on pgs. 98-99, Lecture notes.

Properties of Expectation:
i) E(c) = c where c is a constant
ii) E[cg(X,Y)] = cE[g(X,Y)]
iii) E [c1g1(X,Y) + c2g2(X,Y)] = c1E[g1(X,Y)] + c2E[g2(X,Y)]
where g is a real-valued function of x and y and E[g(X,Y)] = Σ_(x,y) g(x,y) f(x,y)

Example: Calculate E(X²Y) for the previous example.
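As a rough check of the expectation formula (a sketch of my own, reusing the same table), E(X²Y) is obtained by summing x²·y·f(x,y) over the range of (X,Y):

from fractions import Fraction

table = {3: [1, 2, 2, 1], 2: [3, 2, 2, 3], 1: [2, 3, 3, 2], 0: [1, 1, 2, 2]}
f = {(x, y): Fraction(n, 32) for y, row in table.items() for x, n in enumerate(row)}

# E[g(X, Y)] with g(x, y) = x^2 * y
E_X2Y = sum(x**2 * y * p for (x, y), p in f.items())
print(E_X2Y)  # 41/8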


Independence

X and Y are independent iff f(x,y) = g(x)h(y) for all (x,y).


Theorem 1: If X and Y are independent then E(XY) = E(X)E(Y). Prove! See pg. 100.
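To illustrate the factorisation criterion and Theorem 1, here is a toy sketch with marginals chosen purely for illustration: the joint pf is defined as the product g(x)h(y), and E(XY) then equals E(X)E(Y).

from fractions import Fraction

# Hypothetical marginals, chosen only for illustration.
g = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}  # pf of X
h = {0: Fraction(1, 3), 1: Fraction(2, 3)}                     # pf of Y

# Independence: f(x, y) = g(x) h(y) for all (x, y).
f = {(x, y): g[x] * h[y] for x in g for y in h}

E_X = sum(x * p for x, p in g.items())
E_Y = sum(y * p for y, p in h.items())
E_XY = sum(x * y * p for (x, y), p in f.items())
print(E_XY == E_X * E_Y)  # True, as Theorem 1 asserts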

Covariance is a measure of the strength of the linear relationship between rvs. It depends on the
units of measurement used.

Let X and Y be 2 rvs. with joint pf. f(x,y). Then, the covariance between X and Y is
Cov(X,Y) = E[(X − μ_X)(Y − μ_Y)]
Note: Cov(X,X) = E[(X − μ_X)²] = Var(X), i.e. variance is a special case of covariance when
the rvs. are identical.
Properties of Covariance:
i) Cov(X+a, Y+b) = Cov (X,Y)
ii) Cov(aX,bY) = abCov(X,Y)
Theorem 2: Cov(X,Y) = E(XY) − E(X)E(Y). Proof: See pg. 114.
Corollary 1: If X and Y are independent then Cov(X,Y) = 0. Proof: See pg. 114.
Note: the converse is NOT true.
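The failure of the converse can be seen with a standard counterexample (not taken from these tables): X uniform on {−1, 0, 1} and Y = X² have Cov(X,Y) = 0, yet they are clearly dependent. A quick sketch:

from fractions import Fraction

# X uniform on {-1, 0, 1} and Y = X^2, written as a joint pf.
f = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

E_X = sum(x * p for (x, y), p in f.items())
E_Y = sum(y * p for (x, y), p in f.items())
E_XY = sum(x * y * p for (x, y), p in f.items())

print(E_XY - E_X * E_Y)  # Cov(X, Y) = 0 by Theorem 2
# Not independent: f(0, 0) = 1/3, but P(X = 0) P(Y = 0) = (1/3)(1/3) = 1/9.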

Correlation is a measure of the strength of the linear relationship between two rvs. It is invariant
with respect to changes of scale or location.

Let X and Y be two rvs. with finite means and variances. The correlation between X and
Y is ρ(X,Y) = Cov(X,Y)/(σ_X σ_Y) = Cov(X,Y)/√(Var(X)Var(Y)).

Properties of Correlation:
i) ρ(X + a, Y + b) = ρ(X, Y)
ii) ρ(aX, bY) = ρ(X, Y) for a, b > 0
Lemma 1: Let U and V be two rvs. Then [E(UV)]² ≤ E(U²)E(V²). Proof: Pg. 116.
Theorem 3: −1 ≤ ρ(X,Y) ≤ 1. Proof: Pg. 116.
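As a numerical check of the definition and of Theorem 3 (a sketch of my own, reusing the first example's table), the code below computes ρ(X,Y) and confirms it lies in [−1, 1]:

from fractions import Fraction
from math import sqrt

table = {3: [1, 2, 2, 1], 2: [3, 2, 2, 3], 1: [2, 3, 3, 2], 0: [1, 1, 2, 2]}
f = {(x, y): Fraction(n, 32) for y, row in table.items() for x, n in enumerate(row)}

def E(func):
    """E[func(X, Y)] = sum over (x, y) of func(x, y) f(x, y)."""
    return sum(func(x, y) * p for (x, y), p in f.items())

E_X, E_Y = E(lambda x, y: x), E(lambda x, y: y)
var_X = E(lambda x, y: x**2) - E_X**2
var_Y = E(lambda x, y: y**2) - E_Y**2
cov = E(lambda x, y: x * y) - E_X * E_Y

rho = float(cov) / sqrt(float(var_X * var_Y))
print(cov)             # Cov(X, Y) = -3/32
print(round(rho, 3))   # approximately -0.086
print(-1 <= rho <= 1)  # True, consistent with Theorem 3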

Example: X and Y are two discrete rvs. Using the table below, i) find the marginal pf of X, ii)
determine whether X and Y are independent, iii) find Cov(X,Y). (A computational sketch follows the table.)

f(x,y)      x = 0   x = 1   x = 2   x = 3
y = 3        2/24    3/24    3/24    3/24
y = 2        1/24    1/24    1/24    1/24
y = 1        1/24    2/24    5/24    1/24
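A tentative computational solution (structured the same way as the earlier sketches, not the official answer key) finds the marginal pf of X, tests the factorisation f(x,y) = g(x)h(y), and evaluates Cov(X,Y) via Theorem 2.

from fractions import Fraction

# Joint pf in 24ths; rows indexed by y = 1, 2, 3, columns by x = 0, 1, 2, 3.
table = {3: [2, 3, 3, 3], 2: [1, 1, 1, 1], 1: [1, 2, 5, 1]}
f = {(x, y): Fraction(n, 24) for y, row in table.items() for x, n in enumerate(row)}

# i) Marginal pfs.
g = {x: sum(f[(x, y)] for y in (1, 2, 3)) for x in range(4)}  # marginal of X
h = {y: sum(f[(x, y)] for x in range(4)) for y in (1, 2, 3)}  # marginal of Y
for x in range(4):
    print("g(%d) = %s" % (x, g[x]))  # 1/6, 1/4, 3/8, 5/24

# ii) X and Y are independent iff f(x, y) = g(x) h(y) for every (x, y).
print(all(f[(x, y)] == g[x] * h[y] for x in range(4) for y in (1, 2, 3)))  # False

# iii) Cov(X, Y) = E(XY) - E(X)E(Y)  (Theorem 2).
E_X = sum(x * p for x, p in g.items())
E_Y = sum(y * p for y, p in h.items())
E_XY = sum(x * y * p for (x, y), p in f.items())
print(E_XY - E_X * E_Y)  # -1/96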

Note: Please read Chapter 8 of the Lecture notes, on Multiple Integration, which will be used to
calculate joint distributions for two or more continuous rvs.
