
Random Variables
Probability Distributions for Discrete Random Variables
    Probability Distribution or Probability Mass Function (pmf) (f(x) or p(x))
    Cumulative Distribution Function (P(x) or F(x))
    The Expected Value of X
        Rules of Expected Value
    The Variance of X
        Rules of Variance
    Discrete Probability Distributions
        The Binomial Probability Distribution
        The Negative Binomial Distribution
        Geometric Probability Distribution
        Poisson Probability Distribution
    Find the expected value and variance from Moment Generating Function
Cumulative Distribution function P(y) or F(y)
    Properties of a Cumulative Distribution Function
Probability density function of a Continuous random variable
    Properties of a Density Function
Probability density function is Derivative of Cumulative distribution function
Expected Value
    Expected Value Properties
Continuous Distributions
    Uniform
    The Normal Probability Distribution
        Transforming normal random Variable to standard Normal
    Gamma Probability Distribution
    Relating Chi-Squared and Gamma
    Chi-Squared Distribution
    Exponential Distribution (Gamma Density function when α = 1)
Two Discrete Random Variables
    Joint Probability Mass function (p(x,y) or f(x,y))
    Joint Distribution Function F(y₁, y₂)
Two Continuous Random Variables
    Joint Probability Density function (p(x,y) or f(x,y))
    Joint Distribution Function P(X,Y) or F(X,Y)
        Properties of Joint Distribution Function
Marginal Probability functions
Independent Random Variables
Conditional Distributions
    Discrete Conditional Distributions
    Continuous Conditional Distributions
Expected Values, Covariance, and Correlation
    Independent Expected Value functions
Conditional Expectations
Conditional Variances
Finding Distributions of Functions of RV
    (1) The CDF technique
Transformation Methods
    One-to-One Transformations
    LogNormal Distribution
    Joint Transformations
    Moment Generating Method
Sampling Distributions Related to the Normal Distribution
Chi-Squared Distribution
    Notes: Sample Variance S² probability distribution
Student's T distribution
F Distribution
The Central Limit Theorem
The Normal Approximation to the Binomial Distribution
The Bias and Mean Square Error of Point Estimators
Relative Efficiency
The Method of Moments
Method of Maximum Likelihood (Better unbiased estimators)
Large-Sample Confidence Intervals
Small Sample Confidence Intervals for μ and μ₁ − μ₂
    Comparing the means of two normal populations
    Confidence Intervals for σ²
z-Tests and Confidence Intervals for a Difference Between Two Population Means
    Basic Assumptions
    Expected Value of X̄ − Ȳ
    Standard Deviation of X̄ − Ȳ
Null Hypothesis and the Alternative Hypothesis
Simple vs Composite Hypothesis
One- and Two-sided Alternatives
The Rejection Region
Type I and Type II Errors
    Relationship between α and β
Common Large-Sample Tests
Calculating Type II Error Probabilities and Finding the Sample Size for Z Tests
Relationships between Hypothesis-Testing Procedures and Confidence Intervals
Significance Levels or p-Values
    P-value
    Calculating P Values
Small-Sample Hypothesis Testing for μ and μ₁ − μ₂
    Small Sample test for μ₁ − μ₂
    Testing Hypotheses Concerning Variances
    Test of the Hypothesis σ₁² = σ₂²
Power of Tests
Most Powerful Test
    The Neyman-Pearson Lemma

Discrete Random Variables and Probability Distributions

Random Variables

Definition: For a given sample space S of some experiment, a random variable (rv) is any rule that associates a number with each outcome in S. In mathematical language, a random variable is a function whose domain is the sample space and whose range is the set of real numbers.

Probability Distributions for Discrete Random Variables

Probability Distribution or Probability Mass Function (pmf) (f(x) or p(x))

The probability distribution or probability mass function (pmf) of a discrete rv X is defined for every number x by

p(x) = P(X = x) = P(all s ∈ S : X(s) = x)

Properties:
1. 0 ≤ p(x) ≤ 1 for all x,
2. Σ_x p(x) = 1,
3. P(X ∈ A) = Σ_{x ∈ A} p(x) for any set A of x values.

Cumulative Distribution Function (P(x) or F(x))

The cumulative distribution function (cdf) F(x) of a discrete rv X with pmf p(x) is defined for every number x by

F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y)

For any number x, F(x) is the probability that the observed value of X will be at most x.

The Expected Value of X

Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X) or μ_X, is

E(X) = μ_X = Σ_{x ∈ D} x · p(x)

Rules of Expected Value

E(aX + b) = a·E(X) + b

The Variance of X

Let X have pmf p(x) and expected value μ. Then the variance of X, denoted by V(X), Var(X), σ_X², or just σ², is

V(X) = Σ_D (x − μ)² · p(x) = E[(X − μ)²]

Shortcut: V(X) = E(X²) − [E(X)]²

The standard deviation (SD) of X is

σ_X = √(σ_X²)

Rules of Variance

V(aX + b) = σ²_{aX+b} = a² · σ_X²    and    σ_{aX+b} = |a| · σ_X

So, in particular,

σ_{aX} = |a| · σ_X,    σ_{X+b} = σ_X

Discrete Probability Distributions

The Binomial Probability Distribution

1. The experiment consists of a sequence of n smaller experiments called trials, where n is fixed in advance of the experiment.
2. Each trial can result in one of the same two possible outcomes (dichotomous trials), which we generically denote by success (S) and failure (F).
3. The trials are independent, so that the outcome on any particular trial does not influence the outcome on any other trial.
4. The probability of success P(S) is constant from trial to trial; we denote this probability by p.

The Binomial Random Variable and Distribution

The binomial random variable X associated with a binomial experiment consisting of n trials is defined as

X = the number of S's among the n trials

The binomial distribution is the approximate probability model for sampling without replacement from a finite dichotomous (S–F) population, provided the sample size n is small relative to the population size N.

Because the pmf of a binomial rv X depends on the two parameters n and p, we denote the pmf by b(x; n, p):

b(x; n, p) = C(n, x) · p^x · (1 − p)^(n − x),    x = 0, 1, 2, ..., n

E(X) = np
V(X) = np(1 − p) = npq
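As a quick numerical sanity check of the pmf and the formulas E(X) = np and V(X) = np(1 − p) (a sketch, not part of the original notes; n = 10 and p = 0.3 are arbitrary illustration values):

```python
from math import comb

def binom_pmf(x, n, p):
    # b(x; n, p) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3  # illustrative values, not from the notes
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

total = sum(pmf)                                            # should be 1
mean = sum(x * q for x, q in enumerate(pmf))                # should equal n*p = 3.0
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))   # n*p*(1-p) = 2.1
```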

'

The Negative Binomial Distribution

The negative binomial rv and distribution are based on an experiment satisfying the following conditions:

1. The experiment consists of a sequence of independent trials.
2. Each trial can result in either a success (S) or a failure (F).
3. The probability of success is constant from trial to trial, so P(S on trial i) = p for i = 1, 2, 3, ....
4. The experiment continues (trials are performed) until a total of r successes has been observed, where r is a specified positive integer.

The random variable of interest is X = the number of the trial on which the rth success occurs; X is called a negative binomial random variable because, in contrast to the binomial rv, the number of successes is fixed and the number of trials is random.

The pmf of the negative binomial rv X with parameters r = number of S's and p = P(S) is

nb(x; r, p) = C(x − 1, r − 1) · p^r · (1 − p)^(x − r),    x = r, r + 1, r + 2, ...

E(X) = r/p
V(X) = r(1 − p)/p²
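The same kind of check works here; summing the pmf over a truncated support recovers the stated mean and variance (a sketch with arbitrary illustration values r = 3, p = 0.4):

```python
from math import comb

def nb_pmf(x, r, p):
    # nb(x; r, p) = C(x-1, r-1) * p^r * (1-p)^(x-r), for x = r, r+1, ...
    return comb(x - 1, r - 1) * p**r * (1 - p) ** (x - r)

r, p = 3, 0.4          # illustrative values
xs = range(r, 400)     # truncate the infinite support; the tail mass is negligible
pmf = [nb_pmf(x, r, p) for x in xs]

total = sum(pmf)                                          # ≈ 1
mean = sum(x * q for x, q in zip(xs, pmf))                # ≈ r/p = 7.5
var = sum(x * x * q for x, q in zip(xs, pmf)) - mean**2   # ≈ r(1-p)/p² = 11.25
```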

Geometric Probability Distribution

1. Involves identical and independent trials.
2. Each trial can result in only 1 of 2 outcomes: success or failure.
3. The probability of success is equal to p and is constant from trial to trial.
4. The geometric random variable X is the number of the trial on which the first success occurs.

The pmf:

p(x) = (1 − p)^(x − 1) · p,    x = 1, 2, 3, ...,    0 ≤ p ≤ 1

E(X) = 1/p
V(X) = (1 − p)/p²

Notes: often used to model distributions of lengths of waiting times.
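A minimal numeric check of E(X) = 1/p and V(X) = (1 − p)/p² over a truncated support (p = 0.25 is an arbitrary illustration value):

```python
p = 0.25                     # illustrative success probability
xs = range(1, 2000)          # truncate the infinite support; the tail is negligible
pmf = [(1 - p) ** (x - 1) * p for x in xs]

total = sum(pmf)                                          # ≈ 1
mean = sum(x * q for x, q in zip(xs, pmf))                # ≈ 1/p = 4
var = sum(x * x * q for x, q in zip(xs, pmf)) - mean**2   # ≈ (1-p)/p² = 12
```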

Poisson Probability Distribution

Notes: The Poisson probability distribution often provides a good model for the probability distribution of the number of rare events that occur in space, time, volume, or any other dimension, where λ is the average value of Y. It provides a good model for the probability distribution of the number Y of automobile accidents, industrial accidents, or other types of accidents in a given unit of time.

Examples:
- number of telephone calls handled by a switchboard in a time interval
- number of radioactive particles that decay in a particular time period
- number of errors a typist makes in typing a page
- number of automobiles using a freeway access ramp in a ten-minute interval

The pmf:

p(x) = (λ^x · e^(−λ)) / x!,    x = 0, 1, 2, 3, ...,    λ > 0

E(X) = λ
V(X) = λ
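The defining property that mean and variance are both λ can be confirmed numerically (a sketch; λ = 2.5 is an arbitrary illustration value):

```python
from math import exp, factorial

lam = 2.5        # illustrative rate
xs = range(60)   # truncate; Poisson(2.5) mass beyond x = 60 is negligible
pmf = [lam**x * exp(-lam) / factorial(x) for x in xs]

total = sum(pmf)                                          # ≈ 1
mean = sum(x * q for x, q in zip(xs, pmf))                # ≈ λ = 2.5
var = sum(x * x * q for x, q in zip(xs, pmf)) - mean**2   # ≈ λ = 2.5
```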

Moments and Moment-Generating Functions

Notes: Used to determine a unique p(y) or pmf (probability distribution) or f(x).

Definition: the kth moment of a random variable Y taken about the origin is defined to be E(Y^k) and is denoted by μ′_k.

Definition: The moment-generating function m(t) for a random variable Y is defined to be m(t) = E(e^(tY)). We say that a moment-generating function for Y exists if there exists a positive constant b such that m(t) is finite for |t| ≤ b.

Since e^(ty) = 1 + ty + (ty)²/2! + (ty)³/3! + ⋯,

m(t) = E(e^(tY)) = Σ_y e^(ty) p(y) = Σ_y [1 + ty + (ty)²/2! + ⋯] p(y)

and therefore

m(t) = 1 + μ′₁·t + μ′₂·t²/2! + μ′₃·t³/3! + ⋯
Example:

Find the moment-generating function for a Bernoulli rv Y, where P(Y = 1) = p and P(Y = 0) = 1 − p:

m(t) = E(e^(tY)) = Σ_{y=0}^{1} e^(ty) · p^y (1 − p)^(1−y) = (1 − p) + p·e^t


Find the expected value and variance from Moment Generating Function

Given m(t), the derivatives at t = 0 recover the moments:

E(X) = m′(0)

Var(X) = E(X²) − [E(X)]² = m″(0) − [m′(0)]²
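Tying this to the Bernoulli MGF m(t) = (1 − p) + p·e^t derived earlier, the derivatives at 0 can be approximated by finite differences and should recover E(X) = p and Var(X) = p(1 − p) (a sketch; p = 0.4 and the step h are illustrative choices):

```python
from math import exp

p = 0.4  # illustrative Bernoulli parameter

def m(t):
    # Bernoulli moment-generating function: m(t) = (1 - p) + p * e^t
    return (1 - p) + p * exp(t)

h = 1e-5
m1 = (m(h) - m(-h)) / (2 * h)           # ≈ m'(0)  = E(X)  = p
m2 = (m(h) - 2 * m(0) + m(-h)) / h**2   # ≈ m''(0) = E(X²) = p
var = m2 - m1**2                        # ≈ p(1 - p) = 0.24
```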

Continuous Random Variables and Their Probability Distributions

Cumulative Distribution function P(y) or F(y)

The distribution function of Y, denoted F(y), is such that F(y) = P(Y ≤ y) for −∞ < y < ∞:

F(y) = ∫_{−∞}^{y} f(t) dt

Properties of a Cumulative Distribution Function

1. F(−∞) = lim_{y→−∞} F(y) = 0
2. F(∞) = lim_{y→∞} F(y) = 1
3. F(y) is a non-decreasing function of y. [If y₁ and y₂ are any values such that y₁ < y₂, then F(y₁) ≤ F(y₂).]

Probability density function of a Continuous random variable

Properties of a Density Function

1. f(y) ≥ 0 for all y, −∞ < y < ∞
2. ∫_{−∞}^{∞} f(y) dy = 1

If the random variable Y has density function f(y) and a < b, then the probability that Y falls in the interval [a, b] is

P(a ≤ Y ≤ b) = ∫_a^b f(y) dy

Probability density function is Derivative of Cumulative distribution function

f(y) = dF(y)/dy = F′(y)

Expected Value

The expected value of a continuous random variable Y is

E(Y) = ∫_{−∞}^{∞} y · f(y) dy

Let g(Y) be a function of Y; then the expected value of g(Y) is given by

E[g(Y)] = ∫_{−∞}^{∞} g(y) · f(y) dy
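Both integrals can be checked numerically with a midpoint rule. A minimal sketch (the Uniform(0, 2) density and all numbers are illustrative assumptions, not from the notes):

```python
# E(g(Y)) = ∫ g(y) f(y) dy, approximated by the midpoint rule,
# for Y ~ Uniform(0, 2): f(y) = 1/2 on [0, 2]  (illustrative choice)
a, b = 0.0, 2.0
f = lambda y: 1 / (b - a)

n = 10_000
h = (b - a) / n
mids = [a + (i + 0.5) * h for i in range(n)]

mean = sum(y * f(y) * h for y in mids)       # ≈ (a + b)/2 = 1.0
ey2 = sum(y * y * f(y) * h for y in mids)    # ≈ E(Y²) = 4/3
var = ey2 - mean**2                          # ≈ (b - a)²/12 ≈ 0.3333
```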

'

Expected Value Properties

1. E(c) = c for any constant c
2. E[c · g(Y)] = c · E[g(Y)]
3. E[g₁(Y) + g₂(Y) + ⋯ + g_k(Y)] = E[g₁(Y)] + E[g₂(Y)] + ⋯ + E[g_k(Y)]

Continuous Distributions

Uniform

If θ₁ < θ₂, a random variable Y is said to have a continuous uniform probability distribution on the interval (θ₁, θ₂) if and only if the density function of Y is

f(y) = 1/(θ₂ − θ₁),    θ₁ ≤ y ≤ θ₂    (and 0 elsewhere)

E(Y) = (θ₁ + θ₂)/2,    V(Y) = (θ₂ − θ₁)²/12

The Normal Probability Distribution

f(y) = (1/(σ√(2π))) · e^(−(y − μ)²/(2σ²)),    −∞ < y < ∞

E(Y) = μ,    V(Y) = σ²

Transforming normal random Variable to standard Normal

If Y ~ N(μ, σ²), then Z = (Y − μ)/σ has the standard normal distribution N(0, 1), so that

P(a ≤ Y ≤ b) = P((a − μ)/σ ≤ Z ≤ (b − μ)/σ)
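The standardization can be sketched with Python's standard-library normal distribution; computing the probability on the standardized scale matches computing it on the original scale (μ = 100, σ = 15, and x = 130 are assumed illustration values):

```python
from statistics import NormalDist

mu, sigma = 100, 15  # illustrative normal parameters
x = 130

z = (x - mu) / sigma                  # standardized value, here 2.0
p_std = NormalDist().cdf(z)           # P(Z <= z) under N(0, 1)
p_raw = NormalDist(mu, sigma).cdf(x)  # P(Y <= x) under N(mu, sigma²)
# p_std and p_raw agree: standardizing does not change the probability
```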

Gamma Probability Distribution

Random variables that are always nonnegative and that, for various reasons, yield distributions of data skewed to the right. The density function drops gradually as y increases.

Examples:
- lengths of time between malfunctions for aircraft engines possess a skewed frequency distribution
- lengths of time between arrivals at a supermarket checkout queue
- lengths of time

α > 0 is the shape parameter.
β > 0 is the scale parameter.
Γ(α) is the gamma function.

f(y) = (y^(α−1) · e^(−y/β)) / (β^α · Γ(α)),    0 ≤ y < ∞

E(Y) = αβ,    V(Y) = αβ²

Relating Chi-Squared and Gamma

A chi-squared distribution with ν degrees of freedom is the gamma distribution with α = ν/2 and β = 2.

Chi-Squared Distribution

If Y has a chi-squared distribution with ν degrees of freedom, then E(Y) = ν and V(Y) = 2ν.

Exponential Distribution (Gamma Density function when α = 1)

f(y) = (1/β) · e^(−y/β),    0 ≤ y < ∞,    β > 0

E(Y) = β,    V(Y) = β²

Notes: useful for modeling the length of life of electronic components. Suppose that the length of time a component already has operated does not affect its chance of operating for at least b additional time units. That is, the probability that the component will operate for more than a + b time units, given that it has already operated for at least a time units, is the same as the probability that a new component will operate for at least b time units if the new component is put into service at time 0. This is the memoryless property:

P(Y > a + b | Y > a) = P(Y > b)
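The memoryless property follows directly from the exponential survival function P(Y > t) = e^(−t/β), and can be verified numerically (β, a, and b below are arbitrary illustration values):

```python
from math import exp

beta = 2.0  # illustrative mean lifetime

def surv(t):
    # P(Y > t) = e^(-t/beta) for an exponential rv with mean beta
    return exp(-t / beta)

a, b = 1.3, 0.7                # illustrative times
cond = surv(a + b) / surv(a)   # P(Y > a+b | Y > a)
# memorylessness: cond equals surv(b) = P(Y > b), up to float rounding
```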

Jointly Distributed Random Variables

Two Discrete Random Variables

The probability mass function (pmf) of a single discrete rv X specifies how much probability mass is placed on each possible X value. The joint pmf of two discrete rvs X and Y describes how much probability mass is placed on each possible pair of values (x, y).

Joint Probability Mass function (p(x,y) or f(x,y))

Let X and Y be two discrete rvs defined on the sample space S of an experiment. The joint probability mass function p(x, y) is defined for each pair of numbers (x, y) by

p(x, y) = P(X = x and Y = y)

It must be the case that p(x, y) ≥ 0 and Σ_x Σ_y p(x, y) = 1.

Now let A be any set consisting of pairs of (x, y) values (e.g., A = {(x, y): x + y = 5} or {(x, y): max(x, y) ≤ 3}). Then the probability that (X, Y) lies in A is obtained by summing the joint pmf over pairs in A:

P[(X, Y) ∈ A] = Σ_{(x,y) ∈ A} p(x, y)

Joint Distribution Function F(y₁, y₂)

F(y₁, y₂) = P(Y₁ ≤ y₁, Y₂ ≤ y₂),    −∞ < y₁ < ∞, −∞ < y₂ < ∞

The marginal probability mass function of X, denoted by p_X(x), is given by

p_X(x) = Σ_y p(x, y)    for each possible value x

Similarly, the marginal probability mass function of Y is

p_Y(y) = Σ_x p(x, y)    for each possible value y
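Summing a joint pmf over one variable to get the marginals can be sketched directly (the joint table below is a hypothetical illustration, not from the notes):

```python
# hypothetical joint pmf of (X, Y); the numbers are made up for illustration
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}

p_x, p_y = {}, {}
for (x, y), q in joint.items():
    p_x[x] = p_x.get(x, 0.0) + q   # sum over y for each fixed x
    p_y[y] = p_y.get(y, 0.0) + q   # sum over x for each fixed y
# p_x ≈ {0: 0.3, 1: 0.7}, p_y ≈ {0: 0.4, 1: 0.6}
```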

Two Continuous Random Variables

Joint Probability Density function (p(x,y) or f(x,y))

Let X and Y be two continuous rvs. The joint probability density function f(x, y) for these two variables is a function satisfying f(x, y) ≥ 0 and

∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1

Then for any two-dimensional set A,

P[(X, Y) ∈ A] = ∬_A f(x, y) dx dy

In particular, if A is the two-dimensional rectangle {(x, y): a ≤ x ≤ b, c ≤ y ≤ d}, then

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx

Joint Distribution Function P(X,Y) or F(X,Y)

F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(s, t) dt ds

Properties of Joint Distribution Function

The marginal probability density function of X, denoted by f_X(x), is given by

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy,    −∞ < x < ∞

Similarly, the marginal probability density function of Y is

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx,    −∞ < y < ∞

Marginal Probability functions

The marginal pmf or pdf of one variable is obtained from the joint pmf or pdf by summing (discrete case) or integrating (continuous case) over all values of the other variable.
Independent Random Variables

The definition says that two variables are independent if their joint pmf or pdf is the product of the two marginal pmfs or pdfs. Intuitively, independence says that knowing the value of one of the variables does not provide additional information about what the value of the other variable might be.

Two random variables X and Y are said to be independent if for every pair of x and y values

p(x, y) = p_X(x) · p_Y(y)    when X and Y are discrete

or

f(x, y) = f_X(x) · f_Y(y)    when X and Y are continuous
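A small sketch of this product criterion over finite joint pmf tables (both tables are made-up illustrations, and `is_independent` is a hypothetical helper, not a library function):

```python
def is_independent(joint, tol=1e-9):
    # joint: dict mapping (x, y) -> p(x, y); pairs with zero mass may be omitted
    p_x, p_y = {}, {}
    for (x, y), q in joint.items():
        p_x[x] = p_x.get(x, 0.0) + q
        p_y[y] = p_y.get(y, 0.0) + q
    # independence: p(x, y) == p_X(x) * p_Y(y) for every listed pair
    return all(abs(q - p_x[x] * p_y[y]) < tol for (x, y), q in joint.items())

indep = {(0, 0): 0.15, (0, 1): 0.15, (1, 0): 0.35, (1, 1): 0.35}  # product form
dep = {(0, 0): 0.5, (1, 1): 0.5}  # X always equals Y: clearly dependent
```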

Conditional Distributions

Let X and Y be two continuous rvs with joint pdf f(x, y) and marginal X pdf f_X(x). Then for any X value x for which f_X(x) > 0, the conditional probability density function of Y given that X = x is

f_{Y|X}(y | x) = f(x, y) / f_X(x)

If X and Y are discrete, replacing pdfs by pmfs in this definition gives the conditional probability mass function of Y when X = x.

Discrete Conditional Distributions

p_{Y|X}(y | x) = p(x, y) / p_X(x),    provided p_X(x) > 0

Continuous Conditional Distributions

f_{Y|X}(y | x) = f(x, y) / f_X(x),    provided f_X(x) > 0

Expected Values, Covariance, and Correlation

Let X and Y be jointly distributed rvs with pmf p(x, y) or pdf f(x, y) according to whether the variables are discrete or continuous. Then the expected value of a function h(X, Y), denoted by E[h(X, Y)] or μ_{h(X,Y)}, is given by

E[h(X, Y)] = Σ_x Σ_y h(x, y) · p(x, y)    if X and Y are discrete

E[h(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) · f(x, y) dx dy    if X and Y are continuous


Independent Expected Value functions

If X and Y are independent, then E[g(X) · h(Y)] = E[g(X)] · E[h(Y)].

The covariance between two rvs X and Y is

Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)]
          = Σ_x Σ_y (x − μ_X)(y − μ_Y) · p(x, y)    if X and Y are discrete
          = ∫∫ (x − μ_X)(y − μ_Y) · f(x, y) dx dy    if X and Y are continuous

Shortcut:

Cov(X, Y) = E(XY) − μ_X · μ_Y

The larger the absolute value of the covariance of Y₁ and Y₂, the greater the linear dependence between Y₁ and Y₂. Positive values indicate that Y₁ increases as Y₂ increases; negative values indicate that Y₁ decreases as Y₂ increases. A zero value of the covariance indicates that the variables are uncorrelated and that there is no linear dependence between Y₁ and Y₂.

The correlation coefficient of X and Y, denoted by Corr(X, Y), ρ_{X,Y}, or just ρ, is defined by

ρ_{X,Y} = Cov(X, Y) / (σ_X · σ_Y)

1. If a and c are either both positive or both negative, Corr(aX + b, cY + d) = Corr(X, Y).
2. For any two rvs X and Y, −1 ≤ Corr(X, Y) ≤ 1.
3. If X and Y are independent, then ρ = 0, but ρ = 0 does not imply independence.
4. ρ = 1 or −1 iff Y = aX + b for some numbers a and b with a ≠ 0.
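The shortcut formula and the correlation definition can be exercised on a small hypothetical joint pmf (the table and its values are an illustration chosen to make X and Y positively related):

```python
from math import sqrt

# hypothetical joint pmf of (X, Y)
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex = sum(x * q for (x, y), q in joint.items())       # E(X)  = 0.5
ey = sum(y * q for (x, y), q in joint.items())       # E(Y)  = 0.5
exy = sum(x * y * q for (x, y), q in joint.items())  # E(XY) = 0.4

cov = exy - ex * ey                                  # shortcut formula: 0.15
vx = sum(x * x * q for (x, y), q in joint.items()) - ex**2   # V(X) = 0.25
vy = sum(y * y * q for (x, y), q in joint.items()) - ey**2   # V(Y) = 0.25
corr = cov / (sqrt(vx) * sqrt(vy))                   # ρ = 0.6
```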

Conditional Expectations

E(Y | X = x) = Σ_y y · p_{Y|X}(y | x)    (discrete)
E(Y | X = x) = ∫_{−∞}^{∞} y · f_{Y|X}(y | x) dy    (continuous)

E[E(Y | X)] = E(Y)

Conditional Variances

V(Y | X = x) = E(Y² | X = x) − [E(Y | X = x)]²

V(Y) = E[V(Y | X)] + V[E(Y | X)]

Distribution of Functions of Random Variables

Any function of a random variable X is itself a random variable, and the probability distribution of a function of X is determined by the probability distribution of X.

Goal: To discuss several methods of determining the distribution of a function of a random variable X.

Often it is useful to express the pdf or CDF of a function of a random variable in terms of the pdf or CDF of the original variable. In such cases, the pdfs and CDFs are referred to as derived distributions.

Finding Distributions of Functions of RVs
Consider random variables Y1,Y2,...,Yn and a function U(Y1,Y2,...,Yn), denoted simply as U .
Then three of the methods for finding the probability distribution of U are as follows:
!

1. The method of distribution functions: This method is typically used when the Ys have
continuous distributions.

First, find the distribution function for U, F_U(u) = P(U ≤ u), by using the methods that we discussed in Chapter 5.
To do so, we must find the region in the y1, y2, ..., yn space for which U ≤ u and then find P(U ≤ u) by integrating f(y1, y2, ..., yn) over this region.
The density function for U is then obtained by differentiating the distribution function, F_U(u).
!

2. The method of transformations: If we are given the density function of a random variable Y ,
the method of transformations results in a general expression for the density of U = h(Y ) for
an increasing or decreasing function h(y). Then if Y1 and Y2 have a bivariate distribution, we
can use the univariate result explained earlier to find the joint density of Y1 and U = h(Y1,
Y2). By integrating over y1, we find the marginal probability density function of U , which is
our objective. This method will be illustrated in Section 6.4.
3. The method of moment-generating functions: This method is based on a uniqueness theorem,
Theorem 6.1, which states that, if two random variables have identical moment-generating
functions, the two random variables possess the same probability distributions. To use this
method,

we must find the moment-generating function for U and compare it with the moment-generating functions for the common discrete and continuous random variables derived in Chapters 3 and 4.
If it is identical to one of these moment-generating functions, the probability distribution of U can be identified because of the uniqueness theorem.

'
(1) The CDF Technique
!

Assume: A random variable X has CDF F_X(x) and we are interested in some function of X, say Y = u(X).

The CDF technique finds the CDF of Y by expressing the distribution of Y in terms of the CDF of X.

First, for each y, we will define a set A_y = {x | u(x) ≤ y}.

Therefore, [Y ≤ y] and [X ∈ A_y] are equivalent events. This implies that F_Y(y) = P[u(X) ≤ y] = P[X ∈ A_y].

For many situations, it is possible to express [u(X) ≤ y] in terms of an equivalent event [x₁ ≤ X ≤ x₂]. For such cases: F_Y(y) = P[u(X) ≤ y] = P[x₁ ≤ X ≤ x₂]
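As an illustrative sketch (example mine, not from the notes): take X ~ Uniform(0, 1) and Y = u(X) = X². Here A_y = {x | x² ≤ y} = [0, √y], so F_Y(y) = P(X ≤ √y) = √y for 0 ≤ y ≤ 1. A quick seeded Monte Carlo check:

```python
import math
import random

random.seed(0)

# X ~ Uniform(0, 1), Y = X^2; the CDF technique gives F_Y(y) = sqrt(y) on [0, 1]
def cdf_y(y):
    return math.sqrt(y)

n = 100_000
ys = [random.random() ** 2 for _ in range(n)]   # simulated values of Y

for y in (0.1, 0.25, 0.5, 0.9):
    empirical = sum(v <= y for v in ys) / n      # empirical P(Y <= y)
    assert abs(empirical - cdf_y(y)) < 0.01      # matches the derived CDF
```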

Example:!

!
!

Transformation Methods

One-to-One Transformations

Assume: A random variable X has CDF F_X(x) and pdf f_X(x), and we are interested in some function of X, say Y = u(X).

If the equation y = u(x) can be solved uniquely, say x = w(y) (or x = u⁻¹(y)), then we say the transformation is one-to-one.

!
!

Theorem 6.3.1 (Discrete Case): Let X be a discrete random variable with pdf f_X(x) and let Y = u(X) define a one-to-one transformation. (That is, y = u(x) can be solved uniquely, say x = w(y).) Then the pdf of Y is

f_Y(y) = f_X(w(y))   for y ∈ B = {y | f_Y(y) > 0}, the support of Y.

Theorem 6.3.2 (Continuous Case): Let X be a continuous random variable with pdf f_X(x), and suppose Y = u(X) defines a one-to-one transformation from (the support of X) A = {x | f_X(x) > 0} onto (the support of Y) B = {y | f_Y(y) > 0} with inverse transformation x = w(y). If the derivative (d/dy)w(y) is continuous and nonzero on B, then the pdf of Y is

f_Y(y) = f_X(w(y))·|dw(y)/dy|   for y ∈ B

The derivative of w(y), denoted J = (d/dy)w(y), is referred to as the Jacobian of the transformation.

!
!

Transforming a continuous random variable is equivalent to making a change of variables in an integral.
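For instance (a sketch, not from the notes): if X is standard normal and Y = u(X) = eˣ, then w(y) = ln y and |dw/dy| = 1/y, so Theorem 6.3.2 gives f_Y(y) = f_X(ln y)/y, which is the log-normal density:

```python
import math

def f_x(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_y(y):
    """Density of Y = exp(X) via Theorem 6.3.2: f_X(w(y)) * |dw/dy|, w(y) = ln y."""
    return f_x(math.log(y)) / y   # |d/dy ln y| = 1/y for y > 0

# Sanity checks: f_Y(1) = f_X(0), and the density integrates to ~1
assert abs(f_y(1.0) - f_x(0.0)) < 1e-15

total = sum(f_y(0.001 + i * 0.001) * 0.001 for i in range(100_000))  # crude Riemann sum over (0, 100]
assert abs(total - 1.0) < 0.01
```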

Log-Normal Distribution

Joint Transformations

Moment-Generating Method

Note: The method of moment-generating functions is often very useful for finding the distributions of sums of independent random variables.
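A sketch of the idea (example mine, not from the notes): if X ~ Poisson(λ₁) and Y ~ Poisson(λ₂) are independent, the mgf of X + Y is the product of the individual mgfs, exp((λ₁ + λ₂)(eᵗ − 1)), which is the Poisson(λ₁ + λ₂) mgf; by the uniqueness theorem, X + Y ~ Poisson(λ₁ + λ₂). Checking numerically:

```python
import math

def pois_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2 = 2.0, 3.0

def sum_pmf(k):
    """pmf of X + Y by convolution of the two Poisson pmfs."""
    return sum(pois_pmf(lam1, j) * pois_pmf(lam2, k - j) for j in range(k + 1))

def mgf(pmf, t, kmax=80):
    """M(t) = E[e^(tU)] = sum_k e^(tk) * pmf(k), truncated at kmax."""
    return sum(math.exp(t * k) * pmf(k) for k in range(kmax))

for t in (0.0, 0.2, 0.5):
    m_conv = mgf(sum_pmf, t)                                  # mgf of X + Y
    m_pois = math.exp((lam1 + lam2) * (math.exp(t) - 1))      # Poisson(lam1 + lam2) mgf
    assert abs(m_conv - m_pois) < 1e-8                        # identical mgfs
```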

!
The moment-generating function of X̄ = (1/n)·ΣXᵢ, for X₁, ..., Xₙ iid with mgf M_X(t), is

M_X̄(t) = [M_X(t/n)]ⁿ

Sampling Distributions and the Central Limit Theorem

Sampling Distributions Related to the Normal Distribution
!

!
Example:!


'

Chi-Squared Distribution

Definition: The sampling distribution of the sum of the squares of independent, standard normal random variables: if Z₁, ..., Z_ν are independent standard normal rvs, then χ² = Z₁² + ... + Z_ν² has a chi-squared distribution with ν degrees of freedom, with E(χ²) = ν and V(χ²) = 2ν.
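A seeded simulation sketch of this definition (parameters mine): summing ν squared standard normals should produce draws with mean ≈ ν and variance ≈ 2ν.

```python
import random
import statistics

random.seed(42)

nu = 5          # degrees of freedom
reps = 20_000

# Each draw: sum of nu squared independent standard normals
draws = [sum(random.gauss(0, 1) ** 2 for _ in range(nu)) for _ in range(reps)]

mean = statistics.fmean(draws)
var = statistics.variance(draws)

assert abs(mean - nu) < 0.15        # E(chi^2_nu) = nu
assert abs(var - 2 * nu) < 0.8      # V(chi^2_nu) = 2 * nu
```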
Example:!


'

Notes: Sample Variance S² probability distribution. If X₁, ..., Xₙ is a random sample from a normal distribution, then (n − 1)S²/σ² has a chi-squared distribution with n − 1 degrees of freedom.

!
Example:!
'
!

Student's t Distribution

Definition: If Z ~ N(0, 1) and W ~ χ²(ν) are independent, then

T = Z / √(W/ν)

has a Student's t distribution with ν degrees of freedom. In particular, for a random sample from a normal population, T = (X̄ − μ)/(S/√n) has a t distribution with n − 1 degrees of freedom.
!
Examples:!

!
!
!
!

'

F Distribution

Definition: If W₁ ~ χ²(ν₁) and W₂ ~ χ²(ν₂) are independent, then F = (W₁/ν₁)/(W₂/ν₂) has an F distribution with ν₁ numerator and ν₂ denominator degrees of freedom.

Notes:''

If S₁² and S₂² are the sample variances from independent random samples of sizes n₁ and n₂ drawn from normal populations with variances σ₁² and σ₂², then F = (S₁²/σ₁²)/(S₂²/σ₂²) has an F distribution with ν₁ = n₁ − 1 and ν₂ = n₂ − 1 degrees of freedom.

Examples:!
!
!

!
!

'

The Central Limit Theorem
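The theorem can be illustrated numerically (a seeded sketch with parameters chosen by me): sample means of n = 30 Exponential(1) draws should be approximately normal with mean 1 and standard deviation 1/√30.

```python
import math
import random
import statistics

random.seed(7)

n, reps = 30, 5_000

# Distribution of the sample mean of n Exponential(1) rvs (population mean 1, sd 1)
means = [statistics.fmean(random.expovariate(1.0) for _ in range(n)) for _ in range(reps)]

assert abs(statistics.fmean(means) - 1.0) < 0.02                 # E(Xbar) = mu
assert abs(statistics.stdev(means) - 1 / math.sqrt(n)) < 0.02    # SD(Xbar) = sigma / sqrt(n)

# The standardized means are roughly standard normal: ~95% fall within +/- 1.96
z = [(m - 1.0) * math.sqrt(n) for m in means]
frac = sum(abs(v) < 1.96 for v in z) / reps
assert abs(frac - 0.95) < 0.025
```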

!
Example:!

!
!

'

The Normal Approximation to the Binomial Distribution
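A sketch of the approximation (parameters assumed for illustration): for X ~ Binomial(n, p) with np and n(1 − p) reasonably large, P(X ≤ k) ≈ Φ((k + 0.5 − np)/√(np(1 − p))), using the continuity correction.

```python
import math
from statistics import NormalDist

n, p = 20, 0.4
mu = n * p
sigma = math.sqrt(n * p * (1 - p))

def binom_cdf(k):
    """Exact P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_approx(k):
    """Normal approximation with continuity correction."""
    return NormalDist(mu, sigma).cdf(k + 0.5)

for k in (5, 8, 10, 12):
    assert abs(binom_cdf(k) - normal_approx(k)) < 0.02   # approximation error is small
```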

Example:!

Estimation
The Bias and Mean Square Error of Point Estimators

!
!

Relative Efficiency
If θ̂₁ and θ̂₂ denote two unbiased estimators for the same parameter θ, we prefer to use the
estimator with the smaller variance.
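A seeded simulation sketch (estimators chosen by me for illustration): for a normal population, both the sample mean and the sample median are unbiased for μ, but the median has the larger variance (≈ πσ²/2n versus σ²/n), so the mean is the preferred estimator.

```python
import random
import statistics

random.seed(3)

n, reps = 25, 4_000
means, medians = [], []

for _ in range(reps):
    sample = [random.gauss(0, 1) for _ in range(n)]
    means.append(statistics.fmean(sample))
    medians.append(statistics.median(sample))

v_mean = statistics.variance(means)       # ~ sigma^2 / n = 0.04
v_median = statistics.variance(medians)   # ~ pi * sigma^2 / (2n), roughly 0.063

assert v_mean < v_median                  # the mean is the more efficient estimator
assert 0.5 < v_mean / v_median < 0.8      # efficiency near 2/pi, roughly 0.64
```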

The Method of Moments

Steps:

1. Find the population moments μ₁′, μ₂′, ..., μₖ′. (# of moments to use = # of unknown parameters)
   - μ₁′ = E(X) = M′(0)
   - μ₂′ = E(X²) = M″(0)
2. For X₁, ..., Xₙ ~ iid f_θ(x), compute the sample moments:
   - m₁′ = (1/n)·Σ Xᵢ = X̄
   - m₂′ = (1/n)·Σ Xᵢ²
3. Set the population moments equal to the sample moments and solve the resulting system of equations for θ̂.
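A small sketch of the steps (data invented): for X ~ Exponential with rate λ, the first population moment is μ₁′ = 1/λ; setting 1/λ equal to the first sample moment x̄ gives the method-of-moments estimator λ̂ = 1/x̄.

```python
import statistics

# Hypothetical observed sample, assumed ~ Exponential(rate = lambda)
data = [0.5, 1.2, 0.8, 2.1, 0.4]

m1 = statistics.fmean(data)   # first sample moment, xbar (here 1.0)
lam_hat = 1 / m1              # solve mu_1' = 1/lambda = m1 for lambda
```

With this sample, x̄ = 1.0, so λ̂ = 1.0.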

!
Example:!
!

Method of Maximum Likelihood (Better Unbiased Estimators)

Given observed data x₁, ..., xₙ from a distribution with pdf or pmf f(x; θ), the likelihood function is L(θ) = Π f(xᵢ; θ). The maximum likelihood estimator θ̂ is the value of θ that maximizes L(θ), usually found by solving (d/dθ) ln L(θ) = 0.

Confidence Intervals
Large-Sample Confidence Intervals

When θ̂ is (approximately) normally distributed with standard error σ_θ̂, a large-sample 100(1 − α)% confidence interval for θ is θ̂ ± z_(α/2)·σ_θ̂.
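A sketch with invented summary statistics: a large-sample 95% confidence interval for μ is x̄ ± z₀.₀₂₅·s/√n.

```python
import math
from statistics import NormalDist

# Hypothetical summary statistics from a large sample
n, xbar, s = 100, 52.3, 8.0

alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)   # z_{alpha/2}, about 1.96
half_width = z * s / math.sqrt(n)

ci = (xbar - half_width, xbar + half_width)
print(ci)   # roughly (50.73, 53.87)
```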
Examples:!


!
Example 2:

!
!
!
!

!
Example:!

!
!
!

'

Small-Sample Confidence Intervals for μ and μ₁ − μ₂

For a random sample of size n from a normal population with unknown σ, a 100(1 − α)% confidence interval for μ is X̄ ± t_(α/2, n−1)·S/√n.

Comparing the Means of Two Normal Populations


Confidence Intervals for σ²
!

!
!

!
Note: If sample measurements have been selected from a normal distribution, a confidence interval for σ² can be developed through use of the χ² distribution. These intervals are very sensitive to the assumption that the underlying population is normally distributed. Consequently, the actual confidence coefficient associated with the interval estimation procedure can differ markedly from the nominal value if the underlying population is not normally distributed.
!

Inferences Based on Two Samples
z-Tests and Confidence Intervals for a Difference Between Two Population Means
!

Basic Assumptions

1. X₁, X₂, ..., Xₘ is a random sample from a distribution with mean μ₁ and variance σ₁²
2. Y₁, Y₂, ..., Yₙ is a random sample from a distribution with mean μ₂ and variance σ₂²
3. The X and Y samples are independent of one another

Expected Value of X̄ − Ȳ

E(X̄ − Ȳ) = μ₁ − μ₂, so X̄ − Ȳ is an unbiased estimator of μ₁ − μ₂.
!

Standard Deviation of X̄ − Ȳ

σ_(X̄−Ȳ) = √(σ₁²/m + σ₂²/n)
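Combining the two results above (numbers invented for illustration): the large-sample z statistic for H₀: μ₁ − μ₂ = Δ₀ standardizes x̄ − ȳ by this standard deviation, with sample variances substituted for σ₁² and σ₂².

```python
import math

# Hypothetical summary data from two independent samples
m, xbar, s1 = 45, 29.8, 4.0
n, ybar, s2 = 45, 34.7, 5.0

se = math.sqrt(s1**2 / m + s2**2 / n)   # estimated SD of Xbar - Ybar
z = (xbar - ybar - 0) / se              # test statistic for H0: mu1 - mu2 = 0

print(se, z)   # se ~= 0.95, z ~= -5.13 (strong evidence of a difference)
```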

Hypothesis Testing
Null Hypothesis and the Alternative Hypothesis

Notes:''

!
!

Simple vs. Composite Hypotheses

If a hypothesis completely specifies the distribution to which it refers (i.e., if it specifies the pdf family to which the distribution belongs as well as the values of all parameters required for that family), it is called simple. Both hypotheses in the previous example are simple; this would not be the case if their common variance were unknown. If the distribution related to a hypothesis is not completely specified, the hypothesis is called composite.
!

One- and Two-Sided Alternatives

Often, hypotheses specify the value of a parameter or a range of values for a parameter. When an alternative hypothesis is composite (that is, when it specifies a parameter range as opposed to a specific value), it can be one of two types: one-sided or two-sided. For instance, in the heights example we might be interested either in the alternative hypothesis Ha: μ > 175 or in the alternative hypothesis Ha: μ ≠ 175; these alternatives are examples of one-sided and two-sided alternative hypotheses, respectively. As we will see, whether a hypothesis is one- or two-sided plays an important role in the testing of hypotheses.

The Rejection Region
The rejection region, which will henceforth be denoted by RR, specifies the values of the test
statistic for which the null hypothesis is to be rejected in favor of the alternative hypothesis. If
for a particular sample, the computed value of the test statistic falls in the rejection region RR,
we reject the null hypothesis H0 and accept the alternative hypothesis Ha . If the value of the
test statistic does not fall into the RR, we accept H0.
!

Type I and Type II Errors

A Type I error occurs if we reject H₀ when H₀ is true; its probability is denoted α. A Type II error occurs if we accept H₀ when Ha is true; its probability is denoted β.
Notes:!

For any statistical test, the probability of a type I error depends on the value of the parameter
specified in the null hypothesis. This probability can be calculated, at least approximately, for
each of the testing procedures discussed in this text. For the procedures discussed thus far, the
probability of a type II error can be calculated only after a specific value of the parameter of
interest has been singled out for consideration.
!

'
Example:
!

!
!

Example:

'

Relationship between α and β

For a fixed sample size, shrinking the rejection region to reduce α increases β, and vice versa; the only way to reduce both error probabilities is to increase the sample size n.

Common Large-Sample Tests
!

!
Example:!

!
!

Calculating Type II Error Probabilities and Finding the Sample Size for Z Tests
!

Relationships between Hypothesis-Testing Procedures and Confidence Intervals

!
!

Significance Levels or p-Values

Definition: The probability α of a Type I error is often called the significance level.

'

p-Value

!
'

!
'

Notes:'

The smaller the p-value becomes, the more compelling is the evidence that the null
hypothesis should be rejected. The p-value is the smallest value of α for which the
null hypothesis can be rejected. Thus, if the desired value of α is greater than or
equal to the p-value, the null hypothesis is rejected for that value of α.
Indeed, the null hypothesis should be rejected for any value of α down to and
including the p-value.
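A sketch of a two-tailed large-sample p-value computation (numbers invented): p = 2(1 − Φ(|z|)), where z is the observed test statistic.

```python
from statistics import NormalDist

def two_tailed_p(z):
    """p-value for an observed z statistic in a two-tailed z test."""
    return 2 * (1 - NormalDist().cdf(abs(z)))

p = two_tailed_p(2.17)
print(p)   # ~0.030: reject H0 at alpha = 0.05, but not at alpha = 0.01
```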
'
!
!

!
!

'

Calculating p-Values

!
!

!
!

Small-Sample Hypothesis Testing for μ and μ₁ − μ₂
!

!
Example:!

Small-Sample Test for μ₁ − μ₂

'

Example:'


Testing Hypotheses Concerning Variances

Test of the Hypothesis σ₁² = σ₂²

Example:'
!

!
!

Power of Tests

Most Powerful Test

The Neyman-Pearson Lemma
!

Example'

Example'

!
!

Example'
