Quiz I Review
Massachusetts Institute of Technology
October 7, 2010
Quiz Information
sheet allowed
A Probabilistic Model
A probabilistic model consists of a sample space $\Omega$ of possible outcomes and a probability law that assigns a probability $P(A)$ to each event $A$.
[Figure: a sample space $\Omega$ containing events $A$ and $B$, with probabilities $P(A)$ and $P(B)$]
Probability Axioms
Given a sample space $\Omega$:
1. Nonnegativity: $P(A) \geq 0$ for each event $A$
2. Additivity: If $A$ and $B$ are disjoint events, then $P(A \cup B) = P(A) + P(B)$. If $A_1, A_2, \ldots$ is a sequence of disjoint events, $P(A_1 \cup A_2 \cup \cdots) = P(A_1) + P(A_2) + \cdots$
3. Normalization: $P(\Omega) = 1$
Properties of Probability Laws
Given events $A$, $B$ and $C$:
1. If $A \subseteq B$, then $P(A) \leq P(B)$
2. $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
3. $P(A \cup B) \leq P(A) + P(B)$
4. $P(A \cup B \cup C) = P(A) + P(A^c \cap B) + P(A^c \cap B^c \cap C)$
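These properties can be checked numerically on a small uniform sample space; a minimal sketch (the die and the events below are illustrative choices, not from the slides):

```python
from fractions import Fraction

# One roll of a fair die as a uniform sample space.
omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))

A = {2, 4, 6}   # even roll
B = {4, 5, 6}   # roll of at least 4
C = {1, 2}      # roll of at most 2
Ac, Bc = omega - A, omega - B   # complements

assert P({6}) <= P(B)                                  # monotonicity: {6} is a subset of B
assert P(A | B) == P(A) + P(B) - P(A & B)              # inclusion-exclusion
assert P(A | B) <= P(A) + P(B)                         # union bound
assert P(A | B | C) == P(A) + P(Ac & B) + P(Ac & Bc & C)  # property 4
```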
Discrete Models
If an event $A = \{s_1, s_2, \ldots, s_n\}$ consists of finitely many outcomes $s_i$, then the probability of the event $A$ is given as
$$P(A) = P(s_1) + P(s_2) + \cdots + P(s_n)$$
Discrete Uniform Probability Law: If all outcomes are equally likely, then
$$P(A) = \frac{|A|}{|\Omega|}$$
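The uniform law reduces probability to counting; a small sketch (the two-dice experiment is an illustrative choice):

```python
from fractions import Fraction
from itertools import product

# Two fair dice: 36 equally likely outcomes, so P(A) = |A| / |Omega|.
omega = list(product(range(1, 7), repeat=2))
A = [(i, j) for (i, j) in omega if i + j == 7]   # event: the sum is 7

p = Fraction(len(A), len(omega))
assert p == Fraction(1, 6)
```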
Conditional Probability
Given an event $B$ with $P(B) > 0$, the conditional probability of an event $A$ given $B$ is
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$
$P(A \mid B)$ is a valid probability law on $\Omega$, satisfying:
1. $P(A \mid B) \geq 0$
2. $P(\Omega \mid B) = 1$
3. $P(A_1 \cup A_2 \cup \cdots \mid B) = P(A_1 \mid B) + P(A_2 \mid B) + \cdots$, where $\{A_i\}_i$ is a set of disjoint events
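A worked check of the definition on the two-dice sample space (the events are illustrative choices):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))     # two fair dice
P = lambda ev: Fraction(len(ev), len(omega))

B = [w for w in omega if w[0] + w[1] == 8]       # the sum is 8
A = [w for w in omega if w[0] == 6]              # the first die shows 6

AB = [w for w in A if w in B]                    # A intersect B
cond = P(AB) / P(B)                              # P(A|B) = P(A ∩ B) / P(B)
assert cond == Fraction(1, 5)                    # 1 of the 5 sum-8 outcomes
```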
Multiplication Rule
Let $A_1, \ldots, A_n$ be a set of events such that $P\Bigl(\bigcap_{i=1}^{n-1} A_i\Bigr) > 0$. Then the joint probability of all events is
$$P\Bigl(\bigcap_{i=1}^{n} A_i\Bigr) = P(A_1)\,P(A_2 \mid A_1)\,P(A_3 \mid A_1 \cap A_2) \cdots P\Bigl(A_n \,\Bigm|\, \bigcap_{i=1}^{n-1} A_i\Bigr)$$
[Figure: tree diagram branching through $A_1, A_2, A_3$, with branch probability $P(A_1)$ on the first edge]
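A standard application of the rule is sequential drawing without replacement; a sketch (the three-aces example is an illustrative choice), with a brute-force check over all ordered draws:

```python
from fractions import Fraction
from itertools import permutations

# P(three aces in a row from a 52-card deck, without replacement)
# = P(A1) P(A2 | A1) P(A3 | A1 ∩ A2)
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
assert p == Fraction(1, 5525)

# Brute-force check over all ordered draws of 3 distinct cards:
deck = [1] * 4 + [0] * 48          # 1 marks an ace
favorable = sum(1 for draw in permutations(range(52), 3)
                if all(deck[i] for i in draw))
assert Fraction(favorable, 52 * 51 * 50) == p
```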
Total Probability Theorem
Given a finite partition $A_1, \ldots, A_n$ of $\Omega$ with $P(A_i) > 0$, then for any event $B$:
$$P(B) = \sum_{i=1}^{n} P(B \cap A_i) = \sum_{i=1}^{n} P(B \mid A_i)\,P(A_i)$$
[Figure: the partition $A_1, A_2, A_3$ of $\Omega$ overlapping $B$, and the corresponding tree diagram with branches $B$ and $B^c$]
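The theorem can be checked on a small hypothetical partition (all the $P(A_i)$ and $P(B \mid A_i)$ values below are made-up illustrative numbers):

```python
from fractions import Fraction

# A partition A1, A2, A3 (e.g. which of three urns is chosen), and
# B = "a red ball is drawn".  All values are illustrative.
P_A = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]          # P(Ai)
P_B_given_A = [Fraction(1, 3), Fraction(1, 2), Fraction(1, 4)]  # P(B|Ai)

assert sum(P_A) == 1                     # the Ai form a partition

# Total probability: P(B) = sum_i P(B|Ai) P(Ai)
P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))
assert P_B == Fraction(17, 48)
```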
Bayes Rule
Given a finite partition $A_1, \ldots, A_n$ of $\Omega$ with $P(A_i) > 0$, then for each event $B$ with $P(B) > 0$:
$$P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{P(B)} = \frac{P(B \mid A_i)\,P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j)\,P(A_j)}$$
[Figure: the partition $A_1, A_2, A_3$ of $\Omega$ overlapping $B$, and the corresponding tree diagram with branches $B$ and $B^c$]
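A sketch of the rule on a two-set partition (the prior and likelihood values are illustrative, not from the slides):

```python
from fractions import Fraction

# Partition: A1 = "has the condition", A2 = its complement.
P_A = [Fraction(1, 100), Fraction(99, 100)]            # priors P(Ai)
P_B_given_A = [Fraction(95, 100), Fraction(5, 100)]    # P(B|Ai), B = "test positive"

# Denominator via the total probability theorem:
P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))

# Bayes rule for each Ai:
posteriors = [pb * pa / P_B for pb, pa in zip(P_B_given_A, P_A)]
assert posteriors[0] == Fraction(19, 118)   # P(A1|B) is small despite the accurate test
assert sum(posteriors) == 1                 # posteriors form a valid PMF over the partition
```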
Independence of Events
Events $A$ and $B$ are independent if and only if
$$P(A \cap B) = P(A)\,P(B), \quad \text{or} \quad P(A \mid B) = P(A) \text{ if } P(B) > 0$$
Events $A$ and $B$ are conditionally independent given an event $C$ if and only if
$$P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C), \quad \text{or} \quad P(A \mid B \cap C) = P(A \mid C) \text{ if } P(B \cap C) > 0$$
Independence does not imply conditional independence, and vice versa.
Events $A_1, \ldots, A_n$ are independent if
$$P\Bigl(\bigcap_{i \in S} A_i\Bigr) = \prod_{i \in S} P(A_i) \quad \text{for every subset } S \subseteq \{1, 2, \ldots, n\}$$
They are pairwise independent if $P(A_i \cap A_j) = P(A_i)\,P(A_j)$ for each $i \neq j$; pairwise independence does not imply independence.
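The gap between pairwise and full independence can be seen in the classic two-coin example, sketched here:

```python
from fractions import Fraction
from itertools import product

# Two fair coin flips.  A1 = first is H, A2 = second is H,
# A3 = the two flips agree.  Pairwise independent, but not independent.
omega = list(product('HT', repeat=2))
P = lambda ev: Fraction(len(ev), len(omega))
inter = lambda X, Y: [w for w in X if w in Y]

A1 = [w for w in omega if w[0] == 'H']
A2 = [w for w in omega if w[1] == 'H']
A3 = [w for w in omega if w[0] == w[1]]

# Every pair factorizes ...
assert P(inter(A1, A2)) == P(A1) * P(A2)
assert P(inter(A1, A3)) == P(A1) * P(A3)
assert P(inter(A2, A3)) == P(A2) * P(A3)
# ... but the triple intersection does not (S = {1, 2, 3} fails):
assert P(inter(inter(A1, A2), A3)) != P(A1) * P(A2) * P(A3)
```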
Counting Techniques
Counting Techniques-contd
Partitions: The number of ways to partition an $n$-element set into $r$ disjoint subsets, with $n_k$ elements in the $k$th subset, is the multinomial coefficient
$$\binom{n}{n_1, n_2, \ldots, n_r} = \binom{n}{n_1}\binom{n - n_1}{n_2} \cdots \binom{n - n_1 - \cdots - n_{r-1}}{n_r} = \frac{n!}{n_1!\,n_2! \cdots n_r!}$$
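Both forms of the multinomial coefficient can be compared directly; a sketch (the sizes 5, 3, 2 are an arbitrary illustrative choice):

```python
from math import comb, factorial

# Partition a 10-element set into subsets of sizes 5, 3, 2.
n, sizes = 10, [5, 3, 2]

# Product-of-binomials form: C(n, n1) C(n - n1, n2) ...
ways, remaining = 1, n
for k in sizes:
    ways *= comb(remaining, k)
    remaining -= k

# Factorial form: n! / (n1! n2! ... nr!)
denom = 1
for k in sizes:
    denom *= factorial(k)

assert ways == factorial(n) // denom == 2520
```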
Random Variables
A random variable $X$ on a sample space $\Omega$ defines, for each value $x$, the event
$$\{X = x\} = \{\omega \in \Omega \mid X(\omega) = x\}$$
The probability mass function (PMF) for the random variable $X$ assigns a probability to each event $\{X = x\}$:
$$p_X(x) = P(\{X = x\}) = P(\{\omega \in \Omega \mid X(\omega) = x\})$$
PMF Properties
Let $X$ be a random variable and $S$ a countable set of real numbers; then
$$P(X \in S) = \sum_{x \in S} p_X(x)$$
A function $g$ of the random variable $X$ defines a derived random variable $Y(\omega) = g(X(\omega))$, with PMF
$$p_Y(y) = \sum_{x \mid g(x) = y} p_X(x)$$
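The derived-PMF formula groups the mass of all $x$ mapping to the same $y$; a sketch (the uniform $X$ and $g(x) = x^2$ are illustrative choices):

```python
from fractions import Fraction

# X uniform on {-2, -1, 0, 1, 2}; derived random variable Y = X**2.
p_X = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}

p_Y = {}
for x, p in p_X.items():
    y = x * x
    p_Y[y] = p_Y.get(y, 0) + p      # p_Y(y) = sum over x with g(x) = y

assert p_Y == {0: Fraction(1, 5), 1: Fraction(2, 5), 4: Fraction(2, 5)}
assert sum(p_Y.values()) == 1       # p_Y is a valid PMF
```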
Expectation
Given a random variable $X$ with PMF $p_X(x)$:
$$E[X] = \sum_x x\,p_X(x)$$
Given a derived random variable $Y = g(X)$:
$$E[g(X)] = \sum_x g(x)\,p_X(x) = \sum_y y\,p_Y(y) = E[Y]$$
The $n$th moment of $X$:
$$E[X^n] = \sum_x x^n\,p_X(x)$$
Variance
The variance of X is calculated as 2 var (X ) = E[(X E[X ])2 ] = x (x E[X ]) pX (x) var (X ) = E[X 2 ] E[X ]2 var (aX + b) = a2 var (X ) Note that var (x) 0
(Massachusetts Institute of Technology) Quiz I Review October 7, 2010 18 / 26
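The expectation and variance formulas above can be verified exactly on a small PMF; a sketch using a fair die (an illustrative choice):

```python
from fractions import Fraction

# A fair die: p_X(x) = 1/6 for x = 1..6.
p_X = {x: Fraction(1, 6) for x in range(1, 7)}

E = sum(x * p for x, p in p_X.items())               # E[X]
E2 = sum(x**2 * p for x, p in p_X.items())           # second moment E[X^2]
var = sum((x - E)**2 * p for x, p in p_X.items())    # definition of var(X)

assert E == Fraction(7, 2)
assert var == E2 - E**2 == Fraction(35, 12)          # var(X) = E[X^2] - (E[X])^2

# var(aX + b) = a^2 var(X), checked for a = 3, b = 1:
p_Z = {3 * x + 1: p for x, p in p_X.items()}
EZ = sum(z * p for z, p in p_Z.items())
varZ = sum((z - EZ)**2 * p for z, p in p_Z.items())
assert varZ == 9 * var
```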
Joint PMFs
The joint PMF of random variables $X$ and $Y$ is
$$p_{X,Y}(x, y) = P(\{X = x\} \cap \{Y = y\})$$
The marginal PMFs of $X$ and $Y$ are given respectively as
$$p_X(x) = \sum_y p_{X,Y}(x, y), \qquad p_Y(y) = \sum_x p_{X,Y}(x, y)$$
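Marginalization is just summing out the other variable; a sketch (the joint table below is a made-up illustrative PMF):

```python
from fractions import Fraction

# A hypothetical joint PMF on {0, 1} x {0, 1}:
p_XY = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
        (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8)}

p_X, p_Y = {}, {}
for (x, y), p in p_XY.items():
    p_X[x] = p_X.get(x, 0) + p      # sum over y
    p_Y[y] = p_Y.get(y, 0) + p      # sum over x

assert p_X == {0: Fraction(1, 2), 1: Fraction(1, 2)}
assert p_Y == {0: Fraction(3, 8), 1: Fraction(5, 8)}
```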
Functions of Multiple Random Variables
Given $Z = g(X, Y)$, its PMF is
$$p_Z(z) = \sum_{(x, y) \mid g(x, y) = z} p_{X,Y}(x, y)$$
Expectation:
$$E[Z] = \sum_{x, y} g(x, y)\,p_{X,Y}(x, y)$$
Linearity: Suppose $g(X, Y) = aX + bY + c$; then
$$E[aX + bY + c] = a\,E[X] + b\,E[Y] + c$$
Conditioned RV
Conditioning $X$ on an event $A$ with $P(A) > 0$ results in the PMF:
$$p_{X \mid A}(x) = P(\{X = x\} \mid A)$$
Conditioning on $\{Y = y\}$ with $p_Y(y) > 0$ results in the PMF:
$$p_{X \mid Y}(x \mid y) = \frac{p_{X,Y}(x, y)}{p_Y(y)}$$
Conditioned RV - contd
Multiplication Rule:
$$p_{X,Y}(x, y) = p_Y(y)\,p_{X \mid Y}(x \mid y) = p_X(x)\,p_{Y \mid X}(y \mid x)$$
Total probability: given a partition $A_1, \ldots, A_n$ of $\Omega$,
$$p_X(x) = \sum_{i=1}^{n} P(A_i)\,p_{X \mid A_i}(x), \qquad p_X(x) = \sum_y p_Y(y)\,p_{X \mid Y}(x \mid y)$$
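Conditioning, the multiplication rule, and total probability fit together on one joint table; a sketch (the joint PMF is a made-up illustrative example):

```python
from fractions import Fraction

p_XY = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
        (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8)}
p_Y = {0: Fraction(3, 8), 1: Fraction(5, 8)}            # marginal of Y

# Conditional PMF p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y):
p_X_given_Y = {(x, y): p / p_Y[y] for (x, y), p in p_XY.items()}

# Multiplication rule recovers the joint:
for (x, y), p in p_XY.items():
    assert p == p_Y[y] * p_X_given_Y[(x, y)]

# Total probability recovers the marginal of X:
p_X0 = sum(p_Y[y] * p_X_given_Y[(0, y)] for y in (0, 1))
assert p_X0 == Fraction(1, 2)
```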
Conditional Expectation
Let $X$ and $Y$ be random variables on a sample space $\Omega$. Given an event $A$ with $P(A) > 0$:
$$E[X \mid A] = \sum_x x\,p_{X \mid A}(x)$$
If $p_Y(y) > 0$, then
$$E[X \mid Y = y] = \sum_x x\,p_{X \mid Y}(x \mid y)$$
Independence
Let $X$ and $Y$ be random variables defined on $\Omega$ and let $A$ be an event with $P(A) > 0$.
$X$ is independent of $A$ if either of the following equivalent conditions holds:
$P(\{X = x\} \cap A) = p_X(x)\,P(A)$ for all $x$
$p_{X \mid A}(x) = p_X(x)$ for all $x$
$X$ and $Y$ are independent if $p_{X,Y}(x, y) = p_X(x)\,p_Y(y)$ for all $x$ and $y$.
Independence
If $X$ and $Y$ are independent, then the following hold:
If $g$ and $h$ are real-valued functions, then $g(X)$ and $h(Y)$ are independent.
$E[XY] = E[X]\,E[Y]$ (the converse is not true)
$\mathrm{var}(X + Y) = \mathrm{var}(X) + \mathrm{var}(Y)$
Given independent random variables $X_1, \ldots, X_n$,
$$\mathrm{var}(X_1 + X_2 + \cdots + X_n) = \mathrm{var}(X_1) + \mathrm{var}(X_2) + \cdots + \mathrm{var}(X_n)$$
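Both consequences can be verified exactly for a product-form joint PMF; a sketch using two independent fair dice (an illustrative choice):

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice X and Y: the joint PMF is the product of marginals.
p = Fraction(1, 6)
p_XY = {(x, y): p * p for x, y in product(range(1, 7), repeat=2)}

E = lambda f: sum(f(x, y) * q for (x, y), q in p_XY.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
assert E(lambda x, y: x * y) == EX * EY                 # E[XY] = E[X]E[Y]

varX = E(lambda x, y: (x - EX)**2)
varY = E(lambda x, y: (y - EY)**2)
var_sum = E(lambda x, y: (x + y - EX - EY)**2)          # var(X + Y)
assert var_sum == varX + varY == Fraction(35, 6)        # variances add
```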
Common Discrete Random Variables

Distribution | $X$ | $p_X(k)$ | $E[X]$
Bernoulli | $1$ = success, $0$ = failure | $p_X(1) = p$, $p_X(0) = 1 - p$ | $p$
Binomial | number of successes in $n$ Bernoulli trials | $\binom{n}{k} p^k (1-p)^{n-k}$, $k = 0, 1, \ldots, n$ | $np$
Geometric | number of trials until first success | $(1-p)^{k-1}\,p$, $k = 1, 2, \ldots$ | $1/p$
Uniform | an integer in the interval $[a, b]$ | $\frac{1}{b-a+1}$ for $k = a, \ldots, b$, $0$ otherwise | $\frac{a+b}{2}$
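The PMF and mean columns of the table can be cross-checked exactly for the finite-support rows; a sketch checking the binomial and discrete-uniform entries (the parameter values are arbitrary illustrative choices):

```python
from fractions import Fraction
from math import comb

# Binomial(n, p): p_X(k) = C(n, k) p^k (1-p)^(n-k), mean n p.
p, n = Fraction(1, 3), 5
binom = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
assert sum(binom.values()) == 1                       # valid PMF
assert sum(k * q for k, q in binom.items()) == n * p  # E[X] = np

# Discrete uniform on the integers [a, b]: mean (a + b) / 2.
a, b = 2, 7
unif = {k: Fraction(1, b - a + 1) for k in range(a, b + 1)}
assert sum(unif.values()) == 1
assert sum(k * q for k, q in unif.items()) == Fraction(a + b, 2)
```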
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms .