
Session 12 References

Probability and Statistics Course.

Instructor: Dr.Ing.(c) Sergio A. Abreo C.

Escuela de Ingenierías Eléctrica, Electrónica y de Telecomunicaciones

Universidad Industrial de Santander

May 27, 2017

Connectivity and Signal Processing Research Group.


info@cps.uis.edu.co https://cpsuis.wordpress.com

Agenda

1 Session 12
Point Estimation of Parameters

2 References

Introduction

Statistical Inference
The field of statistical inference consists of the methods used to make decisions or to draw conclusions about a population.
These methods use the information contained in a sample from the population to draw those conclusions.
Statistical inference may be divided into two major areas:
Parameter estimation.
Hypothesis testing.
We will see that it is possible to establish the precision of an estimate.
Since a statistic is a random variable, it has a probability distribution. We call the probability distribution of a statistic a sampling distribution.
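The sampling-distribution idea can be illustrated with a short simulation (a sketch, not part of the course material; it assumes NumPy, and the N(5, 2²) population, sample size, and repetition count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw many samples from the same N(5, 2^2) population and compute the
# sample mean of each one; the collected means form the sampling
# distribution of the statistic X-bar.
means = np.array([rng.normal(loc=5.0, scale=2.0, size=30).mean()
                  for _ in range(10_000)])

# The sampling distribution is centered near the population mean (5.0)
# and is much narrower than the population (std near 2/sqrt(30) ~ 0.365).
print(means.mean())
print(means.std())
```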
Introduction

Statistical Inference
The parameters will be represented by θ.
The objective of point estimation is to select a single number, based on sample data, that is the most plausible value for θ.
If X is a random variable with probability distribution f(x), characterized by the unknown parameter θ, and if X1, X2, ..., Xn is a random sample of size n from X, the statistic Θ̂ = h(X1, X2, ..., Xn) is called a point estimator of θ.
Note that Θ̂ is a random variable because it is a function of random variables.
After the sample has been selected, Θ̂ takes on a particular numerical value θ̂ called the point estimate of θ.
Point estimate

Definition
A point estimate of some population parameter θ is a single numerical value θ̂ of a statistic Θ̂. The statistic Θ̂ is called the point estimator. For example:
Some population parameter, e.g. μ or σ².
The statistic, e.g. X̄ or S².
Single numerical value, e.g. the computed values x̄ or s².
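As a quick illustration of the definition (a sketch with made-up data, assuming NumPy), the computed values x̄ and s² are the point estimates of μ and σ²:

```python
import numpy as np

# Hypothetical measurements of a quantity with unknown mean mu
# and unknown variance sigma^2.
sample = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.3])

x_bar = sample.mean()    # point estimate of mu
s2 = sample.var(ddof=1)  # point estimate of sigma^2 (n-1 divisor)

print(x_bar, s2)
```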
Point estimate

Choices
We may have several different choices for the point estimator
of a parameter.
If we wish to estimate the mean of a population, we might
consider the sample mean, the sample median, or perhaps the
average of the smallest and largest observations in the sample
as point estimators.
In order to decide which point estimator of a particular
parameter is the best one to use, we need to examine their
statistical properties and develop some criteria for comparing
estimators.
Concepts of point estimation

Estimators
An estimator should be "close" in some sense to the true value of the unknown parameter.
Formally, we say that Θ̂ is an unbiased estimator of θ if the expected value of Θ̂ is equal to θ, i.e. E(Θ̂) = θ.
If the estimator is not unbiased, then the difference E(Θ̂) − θ is called the bias of the estimator Θ̂.
In the book it is demonstrated that E(X̄) = μ and E(S²) = σ², but E(S) ≠ σ. For large samples, the bias is very small.
However, there are good reasons for using S as an estimator of σ in samples from normal distributions.
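The statements E(S²) = σ² but E(S) ≠ σ can be checked by simulation (an illustrative sketch, assuming NumPy; the N(0, 3²) population and the small sample size are arbitrary choices that make the bias of S visible):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 3.0
n = 5  # a small sample size makes the bias of S noticeable

s2_vals, s_vals = [], []
for _ in range(20_000):
    x = rng.normal(0.0, sigma, n)
    s2_vals.append(x.var(ddof=1))  # S^2: unbiased for sigma^2
    s_vals.append(x.std(ddof=1))   # S: slightly biased for sigma

print(np.mean(s2_vals))  # close to sigma^2 = 9
print(np.mean(s_vals))   # noticeably below sigma = 3
```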
Variance of Point Estimator


Definition
Suppose that Θ̂1 and Θ̂2 are unbiased estimators of θ.
This indicates that the distribution of each estimator is centered at the true value of θ.
However, the variances of these distributions may be different.

Figure: The sampling distributions of two unbiased estimators. Taken from [Montgomery and Runger, 2010]
Variance of Point Estimator

Definition
Since Θ̂1 has a smaller variance than Θ̂2, the estimator Θ̂1 is more likely to produce an estimate close to the true value θ.
A logical principle of estimation, when selecting among several estimators, is to choose the estimator that has minimum variance.
If we consider all unbiased estimators of θ, the one with the smallest variance is called the minimum variance unbiased estimator (MVUE).
If X1, X2, ..., Xn is a random sample of size n from a normal distribution with mean μ and variance σ², the sample mean X̄ is the MVUE for μ.
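A small simulation (an illustrative sketch, assuming NumPy; population and sample size are arbitrary) shows why the sample mean is preferred over another unbiased location estimator, such as the sample median, for normal data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 25, 20_000

# Each row is one random sample from N(10, 4^2).
samples = rng.normal(10.0, 4.0, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# Both estimators are centered at mu = 10, but the sample mean has the
# smaller sampling variance, which is why it is the preferred choice.
print(means.var(), medians.var())
```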
Standard Error

Definition
When the numerical value or point estimate of a parameter is reported, it is usually desirable to give some idea of the precision of estimation.
The measure of precision usually employed is the standard error of the estimator that has been used.
The standard error of an estimator Θ̂ is its standard deviation, given by σ_Θ̂ = √V(Θ̂).
Suppose we are sampling from a normal distribution with mean μ and variance σ².
The distribution of X̄ is normal with mean μ and variance σ²/n, so the standard error of X̄ is σ_X̄ = σ/√n.
If we did not know σ but substituted the sample standard deviation S into the above equation, the estimated standard error would be σ̂_X̄ = S/√n.
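A sketch of the estimated standard error S/√n (hypothetical measurement data, assuming NumPy):

```python
import numpy as np

# Hypothetical repeated measurements of a quantity.
sample = np.array([41.60, 41.48, 42.34, 41.95, 41.86,
                   42.18, 41.72, 42.26, 41.81, 42.04])
n = len(sample)

x_bar = sample.mean()
s = sample.std(ddof=1)
se = s / np.sqrt(n)  # estimated standard error of X-bar: S/sqrt(n)

print(x_bar, se)
```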
Standard Error

Example
Generate a population of 250000 normally distributed elements (use randn).
Choose a sample of 50000. Use datasample(data,k).
Compute the sample mean X̄.
Compute the sample standard deviation S.
Compute the standard error of X̄.
Compute the standard error as a percentage with respect to X̄.
What does it mean?
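The steps above are stated for MATLAB (randn, datasample); an equivalent sketch in Python, assuming NumPy and using rng.choice as a stand-in for datasample, might be:

```python
import numpy as np

rng = np.random.default_rng(3)

# 1. Population of 250000 normally distributed elements (randn analogue).
population = rng.standard_normal(250_000)

# 2. Sample of 50000 drawn without replacement (datasample analogue).
sample = rng.choice(population, size=50_000, replace=False)

# 3-5. Sample mean, sample standard deviation, standard error of X-bar.
x_bar = sample.mean()
s = sample.std(ddof=1)
se = s / np.sqrt(sample.size)

# A small standard error relative to the spread of the data says the
# sample mean pins down the population mean quite precisely.
print(x_bar, s, se)
```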
Mean Square Error


Definition
Sometimes it is necessary to use a biased estimator.
In such cases, the mean square error of the estimator can be important.
The mean square error of an estimator Θ̂ of the parameter θ is defined as MSE(Θ̂) = E(Θ̂ − θ)² = V(Θ̂) + (bias)².

Figure: A biased estimator Θ̂1 that has smaller variance than the unbiased estimator Θ̂2. Taken from [Montgomery and Runger, 2010]
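The decomposition MSE(Θ̂) = V(Θ̂) + (bias)² can be verified numerically, e.g. for the deliberately biased variance estimator that divides by n instead of n − 1 (an illustrative sketch, assuming NumPy; population and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
sigma2, n, reps = 4.0, 8, 40_000

# A deliberately biased estimator of sigma^2: divisor n instead of n - 1.
est = np.array([rng.normal(0.0, 2.0, n).var(ddof=0) for _ in range(reps)])

mse = np.mean((est - sigma2) ** 2)  # E[(theta_hat - theta)^2]
var = est.var()                     # V(theta_hat)
bias = est.mean() - sigma2          # E(theta_hat) - theta, here -sigma2/n

# The two sides of MSE = V + bias^2 agree up to floating-point error.
print(mse, var + bias ** 2)
```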
Mean Square Error


Definition
If Θ̂ is an unbiased estimator of θ, the mean square error of Θ̂ is equal to the variance of Θ̂.
The mean square error is an important criterion for comparing two estimators.
Let Θ̂1 and Θ̂2 be two estimators of the parameter θ, and let MSE(Θ̂1) and MSE(Θ̂2) be the mean square errors of Θ̂1 and Θ̂2.
Then the relative efficiency of Θ̂2 to Θ̂1 is defined as MSE(Θ̂1)/MSE(Θ̂2).
If this relative efficiency is less than 1, we would conclude that Θ̂1 is a more efficient estimator of θ than Θ̂2, in the sense that it has a smaller mean square error.
Sometimes we find that biased estimators are preferable to unbiased estimators because they have smaller MSE.
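As a sketch of relative efficiency (assuming NumPy; the normal population and sample size are arbitrary), compare the unbiased S² with the biased divisor-n variance estimator; for normal data the biased one turns out to have the smaller MSE:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma2, n, reps = 4.0, 8, 40_000

samples = rng.normal(0.0, 2.0, size=(reps, n))
est1 = samples.var(axis=1, ddof=1)  # unbiased S^2
est2 = samples.var(axis=1, ddof=0)  # biased divisor-n estimator

mse1 = np.mean((est1 - sigma2) ** 2)
mse2 = np.mean((est2 - sigma2) ** 2)

# Relative efficiency of est2 to est1: MSE(est1)/MSE(est2).
# A value above 1 means the biased est2 has the smaller MSE.
print(mse1 / mse2)
```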
Mean Square Error

Exercise
Suppose we have a random sample of size 2n from a population denoted by X, with E(X) = μ and V(X) = σ².
Let X̄1 = (1/(2n)) Σ_{i=1}^{2n} Xi and X̄2 = (1/n) Σ_{i=1}^{n} Xi be two estimators of μ.
Which is the better estimator of μ? Explain your choice.
Both estimators are unbiased, because E(X̄1) = E(X̄2) = μ.
However, in the first case the variance is V(X̄1) = σ²/(2n),
while in the second case the variance is V(X̄2) = σ²/n.
This means that MSE(X̄1)/MSE(X̄2) = (σ²/(2n))/(σ²/n) = n/(2n) = 1/2.
This indicates that X̄1 is the better estimator, with the smaller variance.
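The conclusion V(X̄1)/V(X̄2) = 1/2 can be confirmed by simulation (an illustrative sketch, assuming NumPy; μ = 0, σ = 2, and n = 10 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 10, 20_000

x1_vals, x2_vals = [], []
for _ in range(reps):
    x = rng.normal(0.0, 2.0, 2 * n)  # one random sample of size 2n
    x1_vals.append(x.mean())         # X1-bar: averages all 2n values
    x2_vals.append(x[:n].mean())     # X2-bar: averages only n of them

# V(X1-bar) = sigma^2/(2n) = 0.2 is half of V(X2-bar) = sigma^2/n = 0.4.
print(np.var(x1_vals), np.var(x2_vals))
```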
Session 12 References

Mean Square Error

Exercise
Suppose we have a random sample of size 2n from a
population denoted by X , and E (X ) = and V (X ) = 2 .
2n n
1 X 1X
Let X1 = Xi and X2 = Xi be two estimators of .
2n n
i=1 i=1
Which is the better estimator of ? Explain your choice.
Both estimators are unbiased because E (X ) = .
However in the first case the variance is V (X1 ) = 2 /2n
While in the second case the variance is V (X2 ) = 2 /n.
MSE (1 ) 2 /2n n 1
Which means that MSE (2 )
= 2 /n
= 2n = 2

This indicates that X1 is the better estimator with the smaller


variance.
Session 12 References

Mean Square Error

Exercise
Suppose we have a random sample of size 2n from a
population denoted by X , and E (X ) = and V (X ) = 2 .
2n n
1 X 1X
Let X1 = Xi and X2 = Xi be two estimators of .
2n n
i=1 i=1
Which is the better estimator of ? Explain your choice.
Both estimators are unbiased because E (X ) = .
However in the first case the variance is V (X1 ) = 2 /2n
While in the second case the variance is V (X2 ) = 2 /n.
MSE (1 ) 2 /2n n 1
Which means that MSE (2 )
= 2 /n
= 2n = 2

This indicates that X1 is the better estimator with the smaller


variance.
Session 12 References

Mean Square Error

Exercise
Suppose we have a random sample of size 2n from a
population denoted by X , and E (X ) = and V (X ) = 2 .
2n n
1 X 1X
Let X1 = Xi and X2 = Xi be two estimators of .
2n n
i=1 i=1
Which is the better estimator of ? Explain your choice.
Both estimators are unbiased because E (X ) = .
However in the first case the variance is V (X1 ) = 2 /2n
While in the second case the variance is V (X2 ) = 2 /n.
MSE (1 ) 2 /2n n 1
Which means that MSE (2 )
= 2 /n
= 2n = 2

This indicates that X1 is the better estimator with the smaller


variance.
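The comparison can be checked empirically. The following Monte Carlo sketch is illustrative only and is not part of the slides: the normal population, the parameter values (μ = 5, σ = 2, n = 10), the number of trials, and the seed are all assumptions chosen for the demonstration.

```python
import random

# Draw many samples of size 2n and compare the two estimators:
# x1bar uses all 2n observations, x2bar uses only the first n.
# (Population, parameter values, and seed are illustrative assumptions.)
random.seed(0)
n, trials = 10, 20000
mu, sigma = 5.0, 2.0

x1bar, x2bar = [], []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(2 * n)]
    x1bar.append(sum(sample) / (2 * n))   # X-bar-1: average of all 2n values
    x2bar.append(sum(sample[:n]) / n)     # X-bar-2: average of only n values

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Both estimator averages should be close to mu (unbiasedness), while
# var(x1bar) should be about half of var(x2bar), matching
# V(X-bar-1) = sigma^2/(2n) and V(X-bar-2) = sigma^2/n.
print(mean(x1bar), mean(x2bar))
print(var(x1bar), var(x2bar))
```

With these values the two empirical variances come out near σ²/(2n) = 0.2 and σ²/n = 0.4, the 2:1 ratio derived above.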
Mean Square Error

Reproductive property of the normal distribution
If X1, X2, ..., Xp are independent normal random variables with E(Xi) = μi and V(Xi) = σi², for i = 1, 2, ..., p, then

    Y = c1 X1 + c2 X2 + ... + cp Xp

is a normal random variable with

    E(Y) = c1 μ1 + c2 μ2 + ... + cp μp

and

    V(Y) = c1² σ1² + c2² σ2² + ... + cp² σp²
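A quick simulation illustrates the property. The specific linear combination below (Y = 3·X1 - 2·X2 with X1 ~ N(1, 2²), X2 ~ N(4, 1²)) is an assumed example, not one from the slides; by the formulas above it gives E(Y) = 3·1 - 2·4 = -5 and V(Y) = 9·4 + 4·1 = 40.

```python
import random

# Assumed example: Y = 3*X1 - 2*X2 with X1 ~ N(1, 2^2), X2 ~ N(4, 1^2),
# independent. Reproductive property predicts E(Y) = -5 and V(Y) = 40.
random.seed(1)
c1, c2 = 3.0, -2.0
mu1, s1 = 1.0, 2.0
mu2, s2 = 4.0, 1.0

y = [c1 * random.gauss(mu1, s1) + c2 * random.gauss(mu2, s2)
     for _ in range(50000)]

m = sum(y) / len(y)
v = sum((t - m) ** 2 for t in y) / len(y)
print(m, v)  # should land close to -5 and 40
```

The empirical mean and variance of the simulated Y match c1·μ1 + c2·μ2 and c1²·σ1² + c2²·σ2², as the property predicts.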
Mean Square Error

Exercise
Let X1, X2, ..., X7 denote a random sample from a population having mean μ and variance σ². Consider the following estimators of μ:

    Θ̂1 = (X1 + X2 + ... + X7) / 7

    Θ̂2 = (2X1 - X6 + X4) / 2

Is either estimator unbiased?
Which estimator is best?
In what sense is it best?
Mean Square Error

Exercise
    E(Θ̂1) = (1/7)[E(X1) + E(X2) + ... + E(X7)] = (1/7)(7μ) = μ

    E(Θ̂2) = (1/2)[E(2X1) - E(X6) + E(X4)] = (1/2)(2μ - μ + μ) = μ

Both Θ̂1 and Θ̂2 are unbiased estimators of μ, since the expected values of these statistics equal the true mean μ.

    V(Θ̂1) = V[(X1 + X2 + ... + X7)/7] = (1/49)[V(X1) + V(X2) + ... + V(X7)] = (1/49)(7σ²) = σ²/7
Mean Square Error

Exercise
    V(Θ̂2) = V[(2X1 - X6 + X4)/2] = (1/4)[V(2X1) + V(X6) + V(X4)]
           = (1/4)[4V(X1) + V(X6) + V(X4)] = (1/4)(4σ² + σ² + σ²)
           = (1/4)(6σ²) = (3/2)σ²

Since both estimators are unbiased, their variances can be compared to decide which one is better. Because the variance of Θ̂1 is smaller than that of Θ̂2, Θ̂1 is the better estimator.