Introduction
Earlier, we analysed experiments in which an outcome is a single number. Next, we analyse experiments in which an outcome is a collection of numbers. Each number is a sample value of a random variable. The probability model for such an experiment contains the properties of the individual random variables and also the relationships among the random variables. We will go through both cases, with discrete and with continuous random variables. Pairs of random variables appear in a wide range of practical situations. As an example of two random variables that we encounter all the time in practice, consider a signal X emitted by a radio transmitter and the corresponding signal Y that eventually arrives at a receiver. In practice, we observe Y but we really want to know X. Since noise and distortion prevent us from observing X directly, we use the probability model f_{X,Y}(x, y) to study and estimate X.
In an experiment that produces one random variable, events are points or intervals on a line. In an experiment that leads to two random variables X and Y, each outcome (x, y) is a point in a plane and events are points or areas in the plane.

Definition 2.1 Joint cdf. The joint cdf of two random variables X and Y is defined as

    F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y].
Definition 3.1 Joint pmf. The joint pmf of two discrete random variables X and Y is defined by

    P_{X,Y}(x, y) = P[X = x, Y = y],  x ∈ S_X, y ∈ S_Y.
Mathematics 290
1st Semester
P(X = x, Y = y) is the probability that X and Y simultaneously take on the values x and y, respectively. Thus, P(X = x, Y = y) is the probability of the intersection of the events {X = x} and {Y = y}. Given the joint probability mass function P(x, y), it is possible to obtain the probability distribution of X alone and that of Y alone, referred to as the marginal distributions of X and Y, respectively. The marginal distribution of X, P_X(x), is obtained by summing P(x, y) over the values of Y, while the marginal distribution of Y, P_Y(y), is obtained by summing P(x, y) over the values of X.

Example 3.1 Suppose a die is rolled twice. Define

    X = absolute difference of the faces up,
    Y = largest of the two faces up.

For example, if the roll of the die produces a 4 and a 6, (4, 6), then X = |4 − 6| = 2 and Y = 6. If (5, 1) is obtained, then X = |5 − 1| = 4 and Y = 5. What is the joint pmf of X and Y? Find the probability mass function of X. Find the probability mass function of Y. Find P(X < 4, Y < 3).

Definition 3.2 Conditional pmf and conditional expectation. The conditional pmf of Y given X = x is P_Y(y | x) = P[Y = y | X = x], which gives

    P_Y(y | x) = P_{X,Y}(x, y) / P_X(x),  y ∈ S_Y.

The conditional expectation of Y given X = x is

    E[Y | x] = Σ_{y ∈ S_Y} y P_Y(y | x).
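As a concrete check, the joint pmf, the marginals, and the conditional quantities of Definition 3.2 can all be computed exactly for the dice of Example 3.1 by enumerating the 36 equally likely outcomes:

```python
from fractions import Fraction
from collections import defaultdict

# Exact computation for the dice of Example 3.1: build the joint pmf of
# X = |d1 - d2| and Y = max(d1, d2) from the 36 equally likely outcomes,
# then apply Definition 3.2 to get P_Y(y | x) and E[Y | x].
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(abs(d1 - d2), max(d1, d2))] += Fraction(1, 36)

# Marginal pmf of X: sum the joint pmf over y.
pmf_x = defaultdict(Fraction)
for (x, y), p in joint.items():
    pmf_x[x] += p

def cond_pmf_y(y, x):
    # P_Y(y | x) = P_{X,Y}(x, y) / P_X(x)
    return joint[(x, y)] / pmf_x[x]

def cond_exp_y(x):
    # E[Y | x] = sum over y of y * P_Y(y | x)
    return sum(y * cond_pmf_y(y, x) for y in range(1, 7))

p_event = sum(p for (x, y), p in joint.items() if x < 4 and y < 3)
print(p_event, cond_exp_y(2))  # P(X < 4, Y < 3) and E[Y | X = 2]
```

Only the four outcomes (1,1), (1,2), (2,1), (2,2) satisfy Y < 3, and each has X < 4, so P(X < 4, Y < 3) = 4/36 = 1/9.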
Since g(x) = E[Y | x] is a function of x, g(X) is a function of X, denoted by E[Y | X]. It can be shown that E[Y] = E[E[Y | X]]. The above remains valid with Y replaced by a function of Y. In particular, E[Y^k] = E[E[Y^k | X]].

Law of total probability:

    P[Y ∈ A] = Σ_{x_k ∈ S_X} P[Y ∈ A | X = x_k] P_X(x_k).    (1)
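The tower rule E[Y] = E[E[Y | X]] can be verified exactly on a toy model. Here we assume, purely for illustration, that X is uniform on {1,...,6} and that, given X = x, Y is uniform on {1,...,x}:

```python
from fractions import Fraction

# Check of E[Y] = E[E[Y | X]] on a toy model: X uniform on {1,...,6},
# and given X = x, Y uniform on {1,...,x} (an illustrative assumption).
e_y_direct = Fraction(0)   # E[Y] computed from the joint pmf
e_of_cond = Fraction(0)    # E[E[Y | X]] computed via conditional means
for x in range(1, 7):
    p_x = Fraction(1, 6)
    # E[Y | X = x] for a uniform on {1,...,x} is (x + 1)/2.
    e_of_cond += p_x * Fraction(x + 1, 2)
    for y in range(1, x + 1):
        e_y_direct += p_x * Fraction(1, x) * y

print(e_y_direct, e_of_cond)  # both equal 9/4
```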
Definition 3.3 Independence. We generalize the earlier idea of independent events to random variables. We say that X and Y are independent random variables if and only if the events {X = x} and {Y = y} are independent for all x ∈ S_X and y ∈ S_Y. Thus,

    P[X ∈ A, Y ∈ B] = P[X ∈ A] P[Y ∈ B]  for all A ⊆ S_X and B ⊆ S_Y.

A necessary and sufficient condition for the independence of X and Y is

    P_{X,Y}(x, y) = P_X(x) P_Y(y).

Example 3.2 Chip defects. The total number of defects X on a chip is a Poisson random variable with mean α. Suppose that each defect has a probability p of falling in a region R and that the location of each defect is independent of the locations of all other defects. Find the pmf of the number of defects Y that fall in the region R.

Solution: Note that given X = k, Y is binomial with parameters k and p, Y ~ Binomial(k, p), i.e.,

    P[Y = j | X = k] = C(k, j) p^j (1 − p)^{k−j},  0 ≤ j ≤ k,
    P[Y = j | X = k] = 0,  j > k,

where C(k, j) = k!/(j!(k − j)!) is the binomial coefficient.
Mathematics 290
1st Semester
By the law of total probability,

    P[Y = j] = Σ_{k=j}^∞ P[Y = j | X = k] P_X(k)
             = Σ_{k=j}^∞ C(k, j) p^j (1 − p)^{k−j} (α^k / k!) e^{−α}
             = ((αp)^j / j!) e^{−α} Σ_{k=j}^∞ (α(1 − p))^{k−j} / (k − j)!
             = ((αp)^j / j!) e^{−α} e^{α(1−p)}
             = ((αp)^j / j!) e^{−αp}.

Thus Y is a Poisson random variable with mean αp.
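This "Poisson thinning" result can be checked by simulation. The values of α and p below are illustrative choices, not part of the example:

```python
import random
import math

# Monte Carlo check of Example 3.2 (Poisson thinning): if X ~ Poisson(alpha)
# and each of the X defects independently lands in R with probability p,
# then Y ~ Poisson(alpha * p), so E[Y] should be close to alpha * p.
random.seed(0)
alpha, p, trials = 4.0, 0.3, 200_000

def poisson(lam):
    # Knuth's method: count how many uniforms we multiply in before the
    # running product drops below e^{-lam}.
    k, prod, limit = 0, 1.0, math.exp(-lam)
    while True:
        prod *= random.random()
        if prod < limit:
            return k
        k += 1

counts = 0
for _ in range(trials):
    x = poisson(alpha)
    y = sum(1 for _ in range(x) if random.random() < p)
    counts += y

print(counts / trials)  # should be close to alpha * p = 1.2
```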
Example 3.3 Customers at a service station. The number of customers that arrive at a service station during a time t is a Poisson random variable with parameter βt. The time T required to service each customer is an exponential random variable with parameter α. Find the pmf of the number of customers N that arrive during the service time of a specific customer.

Solution: Given T = t, we have N ~ Poisson(βt) and hence

    P[N = k | T = t] = ((βt)^k / k!) e^{−βt}.

By the law of total probability, we obtain

    P[N = k] = ∫_0^∞ P[N = k | T = t] f_T(t) dt
             = ∫_0^∞ P[N = k | T = t] α e^{−αt} dt
             = ∫_0^∞ ((βt)^k / k!) e^{−βt} α e^{−αt} dt
             = (α β^k / k!) ∫_0^∞ t^k e^{−(α+β)t} dt
             = (α β^k / k!) · k! / (α + β)^{k+1}    (use integration by parts)
             = (α / (α + β)) (β / (α + β))^k.

Thus N is a geometric random variable (starting at 0) with success probability α/(α + β).

Example 3.4 Find the mean and variance of the number of customer arrivals N during the service time T of a specific customer in Example 3.3.
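The closed form for P[N = k] can be checked by evaluating the integral numerically. The rates α and β below are illustrative choices:

```python
import math

# Numerical check of the integral in Example 3.3: for a few values of k,
# integrate P[N = k | T = t] * f_T(t) over t with the trapezoidal rule and
# compare with the closed form (alpha/(alpha+beta)) * (beta/(alpha+beta))**k.
alpha, beta = 2.0, 3.0

def integrand(t, k):
    return (beta * t) ** k / math.factorial(k) * math.exp(-beta * t) \
        * alpha * math.exp(-alpha * t)

def p_n(k, upper=40.0, steps=100_000):
    # Trapezoidal rule on [0, upper]; the integrand is negligible beyond.
    h = upper / steps
    s = 0.5 * (integrand(0.0, k) + integrand(upper, k))
    s += sum(integrand(i * h, k) for i in range(1, steps))
    return s * h

for k in range(4):
    closed = (alpha / (alpha + beta)) * (beta / (alpha + beta)) ** k
    print(k, p_n(k), closed)
```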
Solution:

    E[N] = E[E[N | T]] = E[βT] = β/α.

    Var[N] = E[E[(N − E[N])^2 | T]]
           = E[ E[N^2 | T] − 2 E[N] E[N | T] + (E[N])^2 ]
           = E[ (βT + β^2 T^2) − 2β^2 T/α + (β/α)^2 ]
           = β E[T] + β^2 E[T^2] − 2β^2 E[T]/α + (β/α)^2
           = β/α + (β/α)^2.

Here we used E[N | T] = βT and E[N^2 | T] = βT + β^2 T^2 (mean and second moment of a Poisson(βT) variable), together with E[T] = 1/α and E[T^2] = 2/α^2 for the exponential service time.
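These moments can also be checked by simulating the two-stage experiment directly, again with illustrative rates α and β:

```python
import random
import math

# Monte Carlo check of Example 3.4: draw T ~ Exponential(alpha), then
# N | T ~ Poisson(beta * T), and compare the sample mean and variance of N
# with beta/alpha and beta/alpha + (beta/alpha)**2.
random.seed(1)
alpha, beta, trials = 2.0, 3.0, 200_000

def poisson(lam):
    # Knuth's method for Poisson sampling.
    k, prod, limit = 0, 1.0, math.exp(-lam)
    while True:
        prod *= random.random()
        if prod < limit:
            return k
        k += 1

samples = []
for _ in range(trials):
    t = random.expovariate(alpha)   # service time with rate alpha
    samples.append(poisson(beta * t))

mean = sum(samples) / trials
var = sum((n - mean) ** 2 for n in samples) / trials
print(mean, var)  # theory: beta/alpha = 1.5 and 1.5 + 1.5**2 = 3.75
```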
Definition 4.1 Joint pdf. The joint pdf of two continuous random variables X and Y is

    f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y.

Thus,

    P[(X, Y) ∈ A] = ∫∫_A f_{X,Y}(x, y) dx dy.
The marginal cdf F_X(x) can be obtained from F_{X,Y}(x, y) by taking the limit y → ∞. Similarly, the marginal cdf F_Y(y) can be obtained from F_{X,Y}(x, y) by taking the limit x → ∞.

Example 4.1 The joint cdf of X and Y is given by

    F_{X,Y}(x, y) = (1 − e^{−αx})(1 − e^{−βy}),  x ≥ 0, y ≥ 0,
    F_{X,Y}(x, y) = 0,  elsewhere.
Find the marginal cdfs, i.e., the cdf of X and the cdf of Y.

Solution: We have

    F_X(x) = lim_{y→∞} F_{X,Y}(x, y) = 1 − e^{−αx},  x ≥ 0,

and similarly F_Y(y) = lim_{x→∞} F_{X,Y}(x, y) = 1 − e^{−βy}, y ≥ 0.
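The limiting behaviour is easy to see numerically. With illustrative rates α and β, the joint cdf at a fixed x approaches the marginal cdf of X as y grows:

```python
import math

# Numeric illustration of Example 4.1: as y grows, F_{X,Y}(x, y) approaches
# the marginal cdf F_X(x) = 1 - exp(-alpha * x).  alpha, beta are
# illustrative choices.
alpha, beta = 1.0, 2.0

def joint_cdf(x, y):
    if x < 0 or y < 0:
        return 0.0
    return (1 - math.exp(-alpha * x)) * (1 - math.exp(-beta * y))

x = 0.7
marginal = 1 - math.exp(-alpha * x)
for y in (1.0, 5.0, 20.0):
    print(y, joint_cdf(x, y), marginal)  # converges to the marginal
```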
Definition 4.2 Conditional pdf. The conditional pdf of Y given X = x is

    f_Y(y | x) = f_{X,Y}(x, y) / f_X(x).

There holds

    P[Y ∈ A | X = x] = ∫_A f_Y(y | x) dy.
Definition 4.3 Independence of two random variables. Similar to the discrete case, a necessary and sufficient condition for the independence of X and Y is

    F_{X,Y}(x, y) = F_X(x) F_Y(y),

which is equivalent to

    f_{X,Y}(x, y) = f_X(x) f_Y(y).

Example 4.2 Let X and Y be random variables with joint pdf

    f_{X,Y}(x, y) = 6y,  0 ≤ y ≤ x ≤ 1,
    f_{X,Y}(x, y) = 0,  otherwise.

(a) Find the marginal pdf f_X(x). (b) Find the conditional pdf f_Y(y | x). (c) Find the conditional expected value E[Y | x].
Solution: (a) Integrating the joint pdf over y gives

    f_X(x) = ∫_0^x 6y dy = 3x²,  0 ≤ x ≤ 1,
    f_X(x) = 0,  otherwise.

(b) For 0 < x ≤ 1,

    f_Y(y | x) = f_{X,Y}(x, y) / f_X(x) = 6y / (3x²) = 2y/x²,  0 ≤ y ≤ x.

(c) E[Y | x] = ∫_0^x y (2y/x²) dy = (2/x²)(x³/3) = 2x/3.
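The conditional mean E[Y | x] = 2x/3 can be checked by simulation. Sampling uses inverse transforms: the marginal cdf is F_X(x) = x³, so X = U^{1/3}, and the conditional cdf is F_{Y|X=x}(y) = (y/x)², so Y = x√V:

```python
import random

# Monte Carlo check of Example 4.2: sample (X, Y) from f(x, y) = 6y on
# 0 <= y <= x <= 1 by inverse transform, then compare the sample mean of Y
# among draws with X near 0.8 against E[Y | x] = 2x/3 at x = 0.8.
random.seed(2)
trials = 200_000
total_y, count = 0.0, 0
for _ in range(trials):
    x = random.random() ** (1 / 3)   # F_X(x) = x**3
    y = x * random.random() ** 0.5   # F_{Y|X=x}(y) = (y/x)**2
    if 0.79 < x < 0.81:              # condition on X in a narrow window
        total_y += y
        count += 1

print(total_y / count)  # should be close to 2 * 0.8 / 3 ≈ 0.533
```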
Example 4.3 Random variables X and Y are independent and identically distributed with probability density function

    f_X(x) = 1 − x/2,  0 ≤ x ≤ 2,
    f_X(x) = 0,  otherwise.

Find the joint pdf f_{X,Y}(x, y).

Solution: We have

    f_{X,Y}(x, y) = f_X(x) f_Y(y) = (1 − x/2)(1 − y/2),  0 ≤ x, y ≤ 2,
    f_{X,Y}(x, y) = 0,  otherwise.
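As a sanity check, the product pdf must integrate to 1 over its support, which a simple midpoint rule confirms:

```python
# Sanity check for Example 4.3: the joint pdf (1 - x/2)(1 - y/2) on
# [0, 2] x [0, 2] integrates to 1, as any joint pdf must.
n = 400
h = 2.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h          # midpoint of cell i in the x direction
    for j in range(n):
        y = (j + 0.5) * h      # midpoint of cell j in the y direction
        total += (1 - x / 2) * (1 - y / 2) * h * h

print(total)  # close to 1
```

Since the integrand is linear in x and in y separately, the midpoint rule is exact here up to floating-point rounding.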
There are many situations in which we observe two random variables and use their values to compute a new random variable. For example, we can describe the amplitude of the signal transmitted by a radio station as a random variable X, and the attenuation of the signal as it travels to the antenna of a moving car as another random variable Y. The amplitude of the signal at the radio receiver in the car is then the random variable W = X/Y.

Consider a random variable W = g(X, Y). When X and Y are discrete random variables, the range of W, S_W, is a countable set corresponding to all possible values of g(X, Y). Observe that {W = w} is the event g(X, Y) = w. Thus, we can obtain P_W(w) by adding the values of P_{X,Y}(x, y) over the (x, y) pairs for which g(x, y) = w.
Definition 5.1 For discrete random variables X and Y, the random variable W = g(X, Y) has pmf

    P_W(w) = Σ_{(x,y): g(x,y)=w} P_{X,Y}(x, y).
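For instance, with two fair dice (independent, so the joint pmf is 1/36 at every pair), the pmf of W = X + Y follows directly from Definition 5.1:

```python
from fractions import Fraction
from collections import defaultdict

# Illustration of Definition 5.1: for two fair dice the joint pmf is 1/36
# at each pair (x, y); the pmf of W = g(X, Y) = X + Y is obtained by
# summing the joint pmf over the pairs with x + y = w.
pmf_w = defaultdict(Fraction)
for x in range(1, 7):
    for y in range(1, 7):
        pmf_w[x + y] += Fraction(1, 36)

print(dict(pmf_w))  # e.g. P[W = 7] = 6/36 = 1/6
```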
When X and Y are continuous random variables and g(x, y) is a continuous function, W = g(X, Y) is a continuous random variable. To find the pdf f_W(w), it is usually helpful to first find the cdf F_W(w) and then calculate the derivative.

Definition 5.2

    F_W(w) = P[W ≤ w] = ∫∫_{(x,y): g(x,y)≤w} f_{X,Y}(x, y) dx dy.
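A standard illustration of the cdf-first method, assuming X and Y independent Uniform(0, 1) and W = max(X, Y): F_W(w) = P[X ≤ w, Y ≤ w] = w², so f_W(w) = 2w on (0, 1). A quick Monte Carlo check of the cdf:

```python
import random

# Check of the cdf-first method for W = max(X, Y) with X, Y independent
# Uniform(0, 1): F_W(w) = w**2, so the empirical frequency of W <= 0.6
# should be close to 0.36.
random.seed(3)
trials = 200_000
hits = sum(1 for _ in range(trials)
           if max(random.random(), random.random()) <= 0.6)
print(hits / trials)  # close to 0.6**2 = 0.36
```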
Definition: E[XY] is called the correlation of X and Y. The covariance of X and Y is defined by

    cov(X, Y) = E[(X − E[X])(Y − E[Y])],

and the correlation coefficient of X and Y is defined by

    ρ_{X,Y} = cov(X, Y) / (σ_X σ_Y).

It can be shown that ρ_{X,Y} obeys −1 ≤ ρ_{X,Y} ≤ 1.

Definition: If E[XY] = 0, X and Y are said to be orthogonal. If cov(X, Y) = 0, X and Y are said to be uncorrelated. One can show that X and Y are uncorrelated if and only if E[XY] = E[X]E[Y].

Example 6.1 Let Θ be uniformly distributed in the interval (0, 2π). Let X = cos Θ and Y = sin Θ. Show that X and Y are orthogonal and uncorrelated.
Solution: We have

    E[XY] = E[sin Θ cos Θ]
          = (1/2π) ∫_0^{2π} sin θ cos θ dθ
          = (1/2π) ∫_0^{2π} (1/2) sin(2θ) dθ
          = (1/4π) [−cos(2θ)/2]_0^{2π} = 0.

Thus, X and Y are orthogonal. Also,

    E[X] = (1/2π) ∫_0^{2π} cos θ dθ = 0.

Similarly, E[Y] = 0. Thus, E[XY] = 0 = E[X]E[Y], and as such X and Y are uncorrelated.
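The conclusion is easy to confirm empirically: sampling Θ uniformly on (0, 2π), the sample means of XY, X and Y should all be near zero:

```python
import random
import math

# Monte Carlo check of Example 6.1: with Theta uniform on (0, 2*pi),
# X = cos(Theta) and Y = sin(Theta), the sample means of XY, X and Y
# should all be close to 0, so X and Y are orthogonal and uncorrelated.
random.seed(4)
trials = 200_000
sum_xy = sum_x = sum_y = 0.0
for _ in range(trials):
    theta = random.uniform(0.0, 2.0 * math.pi)
    x, y = math.cos(theta), math.sin(theta)
    sum_xy += x * y
    sum_x += x
    sum_y += y

print(sum_xy / trials, sum_x / trials, sum_y / trials)  # all near 0
```

Note that X and Y are uncorrelated yet clearly dependent (X² + Y² = 1), so uncorrelated does not imply independent.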