
Chapter 2

Theory of probability
Sample space, sample points, events
The sample space Ω is the set of all possible sample points ω
Example 0. Tossing a coin: Ω = {H, T}
Example 1. Rolling a die: Ω = {1, 2, 3, 4, 5, 6}
Example 2. Number of customers in a queue: Ω = {0, 1, 2, ...}
Example 3. Call holding time: Ω = {x ∈ ℝ | x > 0} = (0, ∞)
Events A, B, C, ... are (measurable) subsets of the sample space Ω
Example 1. Even numbers of a die: A = {2, 4, 6}
Example 2. No customers in the queue: A = {0}
Example 3. Call holding time greater than 3.0 (min): A = {x ∈ Ω | x > 3.0} = (3.0, ∞)
Denote the set of all events A by ℱ
Sure event: the sample space Ω itself
Impossible event: the empty set ∅
Combination of events

Union "A or B": A ∪ B = {ω ∈ Ω | ω ∈ A or ω ∈ B}
Intersection "A and B": A ∩ B = {ω ∈ Ω | ω ∈ A and ω ∈ B}
Complement "not A": A^c = {ω ∈ Ω | ω ∉ A}
Events A and B are disjoint if A ∩ B = ∅

A set of events {B_1, B_2, ...} is a partition of event A if
i. B_i ∩ B_j = ∅ for all i ≠ j
ii. ∪_i B_i = A

[Figure: event A partitioned into disjoint events B_1, B_2, B_3]
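As a quick illustration (not part of the original slides), these set operations can be mirrored with Python's built-in set type; the sample space and the events A, B, B_1, B_2, B_3 below are illustrative choices for the die example.

# A minimal sketch of event operations using Python sets (die-rolling example).
Omega = {1, 2, 3, 4, 5, 6}        # sample space
A = {2, 4, 6}                     # event "even number"
B = {1, 2, 3}                     # event "at most three"

union = A | B                     # A or B
intersection = A & B              # A and B
complement = Omega - A            # not A
disjoint = (A & B == set())       # True if A and B are disjoint

# {B1, B2, B3} is a partition of Omega: pairwise disjoint, union equals Omega
B1, B2, B3 = {1, 2}, {3, 4}, {5, 6}
parts = [B1, B2, B3]
pairwise_disjoint = all(parts[i] & parts[j] == set()
                        for i in range(len(parts))
                        for j in range(i + 1, len(parts)))
covers = (B1 | B2 | B3 == Omega)
print(union, intersection, complement, disjoint, pairwise_disjoint and covers)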
Probability
The probability of event A is denoted by P(A), P(A) ∈ [0, 1]
The probability measure P is thus
 a real-valued set function defined on the set of events ℱ,
 P: ℱ → [0, 1]
Properties
i. 0 ≤ P(A) ≤ 1
ii. P(∅) = 0
iii. P(Ω) = 1
iv. P(A^c) = 1 − P(A)
v. P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
vi. A and B disjoint ⇒ P(A ∪ B) = P(A) + P(B)
vii. {B_i} is a partition of A ⇒ P(A) = Σ_i P(B_i)
viii. A ⊂ B ⇒ P(A) ≤ P(B)
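A minimal sketch, assuming equally likely outcomes so that P(A) = |A| / |Ω|, which numerically checks property v (inclusion-exclusion) for two illustrative events of the die example; the helper P below is hypothetical, not from the slides.

# Probabilities for equally likely outcomes, used to check property v.
Omega = {1, 2, 3, 4, 5, 6}

def P(event):
    # events are subsets of Omega, so P(A) = |A| / |Omega|
    return len(event) / len(Omega)

A = {2, 4, 6}                 # even number
B = {1, 2, 3}                 # at most three
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs)               # both 5/6, as property v requires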
Conditional probability
Assume that P(B) > 0
Definition: The conditional probability of event A given
that event B occurred is defined as

P(A | B) := P(A ∩ B) / P(B)

It follows that

P(A ∩ B) = P(B) P(A | B) = P(A) P(B | A)
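The following sketch evaluates the definition for the die example with two illustrative events (again assuming equally likely outcomes) and confirms the identity P(A ∩ B) = P(B) P(A | B); none of the particular events come from the slides.

# Conditional probability P(A|B) = P(A ∩ B) / P(B) for a fair die.
Omega = {1, 2, 3, 4, 5, 6}

def P(event):
    return len(event) / len(Omega)

A = {2, 4, 6}                 # even number
B = {4, 5, 6}                 # greater than three
P_A_given_B = P(A & B) / P(B)                     # = (2/6) / (3/6) = 2/3
print(P_A_given_B, P(B) * P_A_given_B, P(A & B))  # last two agree: P(A ∩ B) = P(B) P(A|B)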
Bayes theorem
Let {B_i} be a partition of the sample space Ω
Assume that P(A) > 0 and P(B_i) > 0 for all i. Then (by slide 6)

P(B_i | A) = P(A ∩ B_i) / P(A) = P(B_i) P(A | B_i) / P(A)

Furthermore, by the theorem of total probability (slide 7),
we get

P(B_i | A) = P(B_i) P(A | B_i) / Σ_j P(B_j) P(A | B_j)

This is Bayes' theorem
The probabilities P(B_i) are called the a priori probabilities of the events B_i
The probabilities P(B_i | A) are called the a posteriori probabilities
of the events B_i (given that the event A occurred)
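A numerical sketch of the two formulas above with a made-up two-event partition; the prior and likelihood values are purely illustrative assumptions, not from the slides.

# Bayes' theorem for a partition {B1, B2} of the sample space.
priors = [0.7, 0.3]                    # a priori probabilities P(B1), P(B2)
likelihoods = [0.1, 0.8]               # P(A|B1), P(A|B2)

# Theorem of total probability: P(A) = sum_j P(B_j) P(A|B_j)
P_A = sum(p * l for p, l in zip(priors, likelihoods))

# Bayes' theorem: P(B_i|A) = P(B_i) P(A|B_i) / P(A)
posteriors = [p * l / P_A for p, l in zip(priors, likelihoods)]
print(P_A, posteriors)                 # the a posteriori probabilities sum to 1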
Statistical independence of events

Definition: Events A and B are independent if

P(A ∩ B) = P(A) P(B)

It follows that

P(A | B) = P(A ∩ B) / P(B) = P(A) P(B) / P(B) = P(A)

Correspondingly:

P(B | A) = P(A ∩ B) / P(A) = P(A) P(B) / P(A) = P(B)
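A short sketch, again assuming a fair die, showing one pair of events that satisfies the definition and one that does not; the chosen events are illustrative.

# Checking P(A ∩ B) = P(A) P(B) for concrete events of the die example.
Omega = {1, 2, 3, 4, 5, 6}

def P(event):
    return len(event) / len(Omega)

A = {2, 4, 6}                 # even number
B = {1, 2, 3, 4}              # at most four
print(P(A & B), P(A) * P(B))  # 1/3 = 1/2 * 2/3, so A and B are independent

C = {1, 2, 3}                 # at most three
print(P(A & C), P(A) * P(C))  # 1/6 != 1/4, so A and C are not independent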
Random Variables
Definition: A real-valued random variable X is a real-valued
and measurable function defined on the sample space Ω,
X: Ω → ℝ
Each sample point ω ∈ Ω is associated with a real
number X(ω)
Measurability means that all sets of the type

{X ≤ x} := {ω ∈ Ω | X(ω) ≤ x} ⊂ Ω

belong to the set of events ℱ, that is

{X ≤ x} ∈ ℱ

The probability of such an event is denoted by P{X ≤ x}
Example
A coin is tossed three times
Sample space:

Ω = {(ω_1, ω_2, ω_3) | ω_i ∈ {H, T}, i = 1, 2, 3}

Let X be the random variable that tells the
total number of tails in these three
experiments:

ω:     HHH HHT HTH THH HTT THT TTH TTT
X(ω):   0   1   1   1   2   2   2   3
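The table above can be reproduced with a short Python sketch (not part of the original slides) that enumerates Ω and evaluates X(ω) for every sample point.

# Enumerate the sample space of three coin tosses and compute X(omega).
from itertools import product

Omega = list(product("HT", repeat=3))             # all 8 outcomes (H/T triples)
X = {omega: omega.count("T") for omega in Omega}  # X(omega) = total number of tails

for omega, value in X.items():
    print("".join(omega), value)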
Indicators of events
Let A ∈ ℱ be an arbitrary event
Definition: The indicator of event A is a random
variable defined as follows:

1_A(ω) = 1, if ω ∈ A
1_A(ω) = 0, if ω ∉ A

Clearly:

P{1_A = 1} = P(A)
P{1_A = 0} = P(A^c) = 1 − P(A)
Cumulative distribution
Definition: The cumulative distribution function (cdf) of a
random variable X is a function F_X: ℝ → [0, 1] defined as follows:

F_X(x) := P{X ≤ x}

The cdf determines the distribution of the random variable,
that is, the probabilities P{X ∈ B}, where B ⊂ ℝ and {X ∈ B} ∈ ℱ
Properties
i. F_X is non-decreasing
ii. F_X is continuous from the right
iii. F_X(−∞) = 0
iv. F_X(∞) = 1

[Figure: a cdf F_X(x) increasing from 0 to 1 as x grows]
Statistical independence of random variables
Definition: Random variables X and Y are independent if for
all x and y

P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}

Definition: Random variables X_1, ..., X_n are (totally)
independent if for all i and x_i

P{X_1 ≤ x_1, ..., X_n ≤ x_n} = P{X_1 ≤ x_1} ⋯ P{X_n ≤ x_n}
Maximum and minimum of independent random variables
Let the random variables X_1, ..., X_n be independent
Denote X_max := max{X_1, ..., X_n}. Then

P{X_max ≤ x} = P{X_1 ≤ x, ..., X_n ≤ x}
             = P{X_1 ≤ x} ⋯ P{X_n ≤ x}

Denote X_min := min{X_1, ..., X_n}. Then

P{X_min > x} = P{X_1 > x, ..., X_n > x}
             = P{X_1 > x} ⋯ P{X_n > x}
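A Monte Carlo sketch of the two identities; independent Uniform(0, 1) variables are an illustrative assumption, since then P{X_i ≤ x} = x and P{X_i > x} = 1 − x.

# Empirical check of the cdf of the maximum and the tail of the minimum.
import random

n, x, runs = 3, 0.6, 100_000
count_max = count_min = 0
for _ in range(runs):
    sample = [random.random() for _ in range(n)]   # n independent Uniform(0,1) draws
    count_max += max(sample) <= x
    count_min += min(sample) > x

print(count_max / runs, x ** n)          # P{X_max <= x} = prod_i P{X_i <= x} = x^n
print(count_min / runs, (1 - x) ** n)    # P{X_min > x}  = prod_i P{X_i > x}  = (1-x)^n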
Discrete random variables

Definition: A set A ⊂ ℝ is called discrete if it is
 finite, A = {x_1, ..., x_n}, or
 countably infinite, A = {x_1, x_2, ...}
Definition: A random variable X is discrete if there is a discrete
set S_X ⊂ ℝ such that

P{X ∈ S_X} = 1

It follows that

P{X = x} ≥ 0 for all x ∈ S_X
P{X = x} = 0 for all x ∉ S_X

The set S_X is called the value set
Point probabilities
Let X be a discrete random variable
The distribution of X is determined by the point
probabilities p_i,

p_i := P{X = x_i}, x_i ∈ S_X

Definition: The probability mass function (pmf) of X is a
function p_X: ℝ → [0, 1] defined as

p_X(x) := P{X = x} = p_i, if x = x_i ∈ S_X
                     0,   if x ∉ S_X

The cdf is in this case a step function:

F_X(x) := P{X ≤ x} = Σ_{i: x_i ≤ x} p_i

[Figure: example with S_X = {x_1, x_2, x_3, x_4}; the probability mass function
p_X(x) shown as point masses at x_1, ..., x_4 and the cumulative distribution
function F_X(x) as the corresponding step function]
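A small sketch of these definitions for an illustrative value set S_X = {1, 2, 3, 4} with made-up point probabilities; the helper functions pmf and cdf below are hypothetical, not part of the slides.

# Point probabilities, pmf and step-function cdf of a discrete random variable.
values = [1.0, 2.0, 3.0, 4.0]            # S_X = {x1, x2, x3, x4}
probs  = [0.1, 0.4, 0.3, 0.2]            # p_i = P{X = x_i}, summing to 1

def pmf(x):
    return probs[values.index(x)] if x in values else 0.0

def cdf(x):
    return sum(p for xi, p in zip(values, probs) if xi <= x)

print(pmf(2.0), pmf(2.5))                # 0.4, 0.0
print(cdf(2.5), cdf(10.0))               # 0.5, 1.0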
Independence of discrete random variables
Discrete random variables X and Y are independent if and
only if for all x_i ∈ S_X and y_j ∈ S_Y

P{X = x_i, Y = y_j} = P{X = x_i} P{Y = y_j}
Expectation
Definition: The expectation (mean value) of X is defined by

E[X] := Σ_{x_i ∈ S_X} P{X = x_i} · x_i = Σ_i p_i x_i

Note 1: The expectation exists only if Σ_i p_i |x_i| < ∞
Note 2: If Σ_i p_i x_i = ∞, then we may denote E[X] = ∞
Properties:
i. c ∈ ℝ ⇒ E[cX] = c E[X]
ii. E[X + Y] = E[X] + E[Y]
iii. X and Y independent ⇒ E[XY] = E[X] E[Y]
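A one-line numerical sketch of E[X] = Σ_i p_i x_i, reusing the illustrative point probabilities from the previous example.

# Expectation of a discrete random variable from its point probabilities.
values = [1.0, 2.0, 3.0, 4.0]
probs  = [0.1, 0.4, 0.3, 0.2]

E_X = sum(p * x for p, x in zip(probs, values))
print(E_X)                               # 2.6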
Variance
Definition: The variance of X is defined by

σ_X^2 := D^2[X] := Var[X] := E[(X − E[X])^2]

Useful formula (prove!):

D^2[X] = E[X^2] − E[X]^2

Properties:
i. c ∈ ℝ ⇒ D^2[cX] = c^2 D^2[X]
ii. X and Y independent ⇒ D^2[X + Y] = D^2[X] + D^2[Y]
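A sketch verifying the useful formula D^2[X] = E[X^2] − E[X]^2 numerically on the same illustrative distribution used above.

# Variance computed directly and via E[X^2] - E[X]^2.
values = [1.0, 2.0, 3.0, 4.0]
probs  = [0.1, 0.4, 0.3, 0.2]

E_X  = sum(p * x for p, x in zip(probs, values))
E_X2 = sum(p * x ** 2 for p, x in zip(probs, values))
var_direct  = sum(p * (x - E_X) ** 2 for p, x in zip(probs, values))
var_formula = E_X2 - E_X ** 2
print(var_direct, var_formula)           # both 0.84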
Covariance
Definition: The covariance between X and Y is defined by

σ_XY := Cov[X, Y] := E[(X − E[X])(Y − E[Y])]

Useful formula (prove!):

Cov[X, Y] = E[XY] − E[X] E[Y]

Properties:
i. Cov[X, X] = Var[X]
ii. Cov[X, Y] = Cov[Y, X]
iii. Cov[X + Y, Z] = Cov[X, Z] + Cov[Y, Z]
iv. X and Y independent ⇒ Cov[X, Y] = 0
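A sketch checking Cov[X, Y] = E[XY] − E[X] E[Y] for a small made-up joint distribution of two discrete variables; the values and joint probabilities are illustrative assumptions.

# Covariance computed directly and via E[XY] - E[X]E[Y].
joint = {                                # P{X = x, Y = y}
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

E_X  = sum(p * x for (x, y), p in joint.items())
E_Y  = sum(p * y for (x, y), p in joint.items())
E_XY = sum(p * x * y for (x, y), p in joint.items())
cov_formula = E_XY - E_X * E_Y
cov_direct  = sum(p * (x - E_X) * (y - E_Y) for (x, y), p in joint.items())
print(cov_direct, cov_formula)           # both 0.1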
Other distribution-related parameters
Definition: The standard deviation of X is defined by

σ_X := D[X] := sqrt(D^2[X]) = sqrt(Var[X])

Definition: The coefficient of variation of X is defined by

c_X := C[X] := D[X] / E[X]

Definition: The k-th moment of X is defined by

μ_X^(k) := E[X^k]
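A closing sketch computing the standard deviation, the coefficient of variation and the k-th moment for the same illustrative point probabilities used in the earlier examples.

# Standard deviation, coefficient of variation and k-th moment of a discrete X.
import math

values = [1.0, 2.0, 3.0, 4.0]
probs  = [0.1, 0.4, 0.3, 0.2]

def moment(k):                           # mu_X^(k) = E[X^k]
    return sum(p * x ** k for p, x in zip(probs, values))

E_X   = moment(1)
var_X = moment(2) - E_X ** 2
std_X = math.sqrt(var_X)                 # D[X]
c_X   = std_X / E_X                      # C[X] = D[X] / E[X]
print(std_X, c_X, moment(3))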

THANK YOU.
