
MECHANICAL ENGINEERING SYSTEMS LABORATORY

Group 02

Asst. Prof. Dr. E. İlhan KONUKSEVEN


TYPICAL FREQUENCY CURVES
1. BINOMIAL DISTRIBUTION

IF AN EVENT CAN OCCUR IN 2 WAYS (A) OR (B) AND IF


(p) IS THE PROBABILITY OF OCCURRENCE OF (A) AND
(q) IS THE PROBABILITY OF OCCURRENCE OF (B),

THE PROBABILITY (P) OF (A) OCCURRING x TIMES OUT


OF n TRIALS IS :

P(x) = [n! / (x!(n−x)!)] · p^x · q^(n−x) = [n! / (x!(n−x)!)] · p^x · (1−p)^(n−x)
TYPICAL FREQUENCY CURVES
1. BINOMIAL DISTRIBUTION
P(x) = [n! / (x!(n−x)!)] · p^x · q^(n−x) = [n! / (x!(n−x)!)] · p^x · (1−p)^(n−x)

P(x) is the probability of obtaining “x successes” in “n independent trials” of a
“discrete event” for which “p” and “1−p” are the respective probabilities of
“success” and “no success” in a single trial.

Mean = n·p
S.D. = √(n·p·(1−p))

When “p” in the binomial distribution is very small and “n” is very large, the
calculation of the binomial probabilities becomes quite tedious, and the
distribution approaches the “Poisson” distribution.
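The binomial formula above is easy to evaluate numerically; a minimal sketch (the function name and the example numbers are illustrative, not from the slides):

```python
from math import comb, sqrt

def binomial_pmf(x, n, p):
    """P(x) = n!/(x!(n-x)!) * p^x * (1-p)^(n-x)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Illustrative case: probability of 3 successes in 10 trials with p = 0.5.
n, p = 10, 0.5
print(binomial_pmf(3, n, p))        # 0.1171875

mean = n * p                        # np
sd = sqrt(n * p * (1 - p))          # sqrt(np(1-p))
print(mean, sd)
```

Summing the pmf over x = 0..n returns 1, which is a quick sanity check on the formula.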
TYPICAL FREQUENCY CURVES

2. POISSON DISTRIBUTION

IF EVENTS ARE OCCURRING RANDOMLY


INDEPENDENT OF EACH OTHER THEN THEY
WILL MAKE UP A POISSON DISTRIBUTION

e   . x
P( x) 
x! x 
2. POISSON DISTRIBUTION
e   . x
P (x) 
x! x 

P(x) is the probability of obtaining “x successes” of a “discrete event” during
an interval of time T, where “λ” is the “average number of successes” in the
same interval T.

Mean = λ
S.D. = √λ

When n becomes large in the binomial distribution, or λ becomes large in the
Poisson distribution, the envelope f(x) of the resulting continuous
distribution of a continuous variable x is called the “Gaussian/Normal
distribution” or “Gaussian Probability Density Function”.
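The binomial-to-Poisson approximation can be seen numerically; a small sketch comparing the two pmfs for small p and large n (the numbers are illustrative, not from the slides):

```python
from math import comb, exp, factorial

def binomial_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    """P(x) = e^(-lam) * lam^x / x!"""
    return exp(-lam) * lam ** x / factorial(x)

# Small p, large n: the binomial pmf is well approximated by the
# Poisson pmf with lam = n*p.
n, p = 1000, 0.002
lam = n * p
for x in range(5):
    print(x, binomial_pmf(x, n, p), poisson_pmf(x, lam))
```

The two columns agree to about three decimal places, which is the convergence the slide describes.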
TYPICAL FREQUENCY CURVES
3. GAUSSIAN / NORMAL DISTRIBUTION or
GAUSSIAN PROBABILITY DENSITY FUNCTION

WHEN THE PROBABILITY OF (A) OR (B) OCCURRING
IS THE SAME AND THE NUMBER OF TRIALS IS
LARGE, THEN THIS DISTRIBUTION IS CALLED THE
NORMAL DISTRIBUTION

THIS IS TYPICAL OR UNBIASED RANDOM ERROR


DISTRIBUTION
TYPICAL FREQUENCY CURVES
3. GAUSSIAN / NORMAL DISTRIBUTION or
GAUSSIAN PROBABILITY DENSITY FUNCTION
1  x  2
1   
2  
f ( x ) e
 2 x 
where Mean  
0.4
f m ax  1 2 
S.D. =  =0.3989
0.3

x
2
1  z2
f ( z ) e z
2 with  f(z) 0.2

0.1
where Mean = 0
S.D. = 1
0.0
-  -3 -2 -1 0 1 2 3  +
Its “bell-shaped” curve for the
z
standardized normal distribution
looks like as follows:
x
3. GAUSSIAN DISTRIBUTION
Properties of Gaussian Distribution:

1. f(x) > 0 for all finite values of x and f(x) → 0 as x → ±∞

2. f(x) is symmetrical about its mean

3. Mean (μ) determines its location and S.D. (σ) determines its amount of
dispersion

[Figure: two normal curves f₁(x) and f₂(x) with equal means (x̄₁ = x̄₂) but
different spreads (σ₁ < σ₂).]

4. ∫_{−∞}^{+∞} f(x)dx = ∫_{−∞}^{+∞} f(z)dz = 1
3. GAUSSIAN DISTRIBUTION
Properties of Gaussian Distribution:
5. P(z₁ ≤ z ≤ z₂) = ∫_{z₁}^{z₂} f(z)dz, or geometrically:

[Figure: the shaded area under f(z) between z₁ and z₂ is P(z₁ ≤ z ≤ z₂).]

where P(z₁ ≤ z ≤ z₂) is the “probability” of having z between z₁ and z₂.
3. GAUSSIAN DISTRIBUTION
(NORMALIZED)

∫_{z₁}^{z₂} P(z)dz = ∫_{0}^{z₂} P(z)dz − ∫_{0}^{z₁} P(z)dz = I(z₂) − I(z₁)

where I(z) = [1/√(2π)] · ∫_{0}^{z} e^(−z²/2) dz is the tabulated area under
f(z) between 0 and z.
z 0.00 0.01 0.02 0.03 0.04 …..


0.0 0.0000 0.0040 0.0080 0.0120 0.0160
0.1 0.0398 0.0438 0.0478 0.0517 0.0557
0.2 0.0793 0.0832 0.0871 0.0910 0.0948
0.3 0.1179 0.1217 0.1255 0.1293 0.1331
.
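The tabulated function I(z) is available in closed form through the error function, so the table entries can be reproduced directly; a short sketch (the function name is mine):

```python
from math import erf, sqrt

def I(z):
    """Area under the standardized normal pdf from 0 to z (the tabulated I(z))."""
    return 0.5 * erf(z / sqrt(2.0))

# Reproduce a few table entries.
for z in (0.1, 1.0, 2.0, 3.0):
    print(z, round(I(z), 4))
```

This is convenient for checking intermediate values that fall between the table's 0.01 steps.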
INTERPRETATION OF FREQUENCY CURVES

EXPECTED VALUE

BECAUSE OF THE UNCERTAINTY INVOLVED IN MEASUREMENT


THE TRUE VALUE WILL NEVER BE KNOWN

THE VALUE OF THE QUANTITY TO BE MEASURED CAN ONLY BE


ESTIMATED

THE PROBLEM IS TO BE AS SURE AS POSSIBLE ABOUT THE


ESTIMATION

IF THE SAME VALUE IS MEASURED AGAIN AND AGAIN WITH


THE SAME INSTRUMENT THEN THE EXPECTED VALUE WILL BE
THE MEAN OF ALL THE MEASUREMENTS
z    0.00    0.01    0.02    0.03    0.04    0.05    0.06    0.07    0.08
0.0  0.0000  0.0040  0.0080  0.0120  0.0160  0.0199  0.0239  0.0279  0.0319
0.1  0.0398  0.0438  0.0478  0.0517  0.0557  0.0596  0.0636  0.0675  0.0714
0.2  0.0793  0.0832  0.0871  0.0910  0.0948  0.0987  0.1026  0.1064  0.1103
0.3  0.1179  0.1217  0.1255  0.1293  0.1331  0.1368  0.1406  0.1443  0.1480
0.4  0.1554  0.1591  0.1628  0.1664  0.1700  0.1736  0.1772  0.1808  0.1844
0.5  0.1915  0.1950  0.1985  0.2019  0.2054  0.2088  0.2123  0.2157  0.2190
0.6  0.2257  0.2291  0.2324  0.2357  0.2389  0.2422  0.2454  0.2486  0.2517
0.7  0.2580  0.2611  0.2642  0.2673  0.2704  0.2734  0.2764  0.2794  0.2823
0.8  0.2881  0.2910  0.2939  0.2967  0.2995  0.3023  0.3051  0.3078  0.3106
0.9  0.3159  0.3186  0.3212  0.3238  0.3264  0.3289  0.3315  0.3340  0.3365
1.0  0.3413  0.3438  0.3461  0.3485  0.3508  0.3531  0.3554  0.3577  0.3599
1.1  0.3643  0.3665  0.3686  0.3708  0.3729  0.3749  0.3770  0.3790  0.3810
1.2  0.3849  0.3869  0.3888  0.3907  0.3925  0.3944  0.3962  0.3980  0.3997
1.3  0.4032  0.4049  0.4066  0.4082  0.4099  0.4115  0.4131  0.4147  0.4162
1.4  0.4192  0.4207  0.4222  0.4236  0.4251  0.4265  0.4279  0.4292  0.4306
1.5  0.4332  0.4345  0.4357  0.4370  0.4382  0.4394  0.4406  0.4418  0.4429
1.6  0.4452  0.4463  0.4474  0.4484  0.4495  0.4505  0.4515  0.4525  0.4535
1.7  0.4554  0.4564  0.4573  0.4582  0.4591  0.4599  0.4608  0.4616  0.4625
1.8  0.4641  0.4649  0.4656  0.4664  0.4671  0.4678  0.4686  0.4693  0.4699
1.9  0.4713  0.4719  0.4726  0.4732  0.4738  0.4744  0.4750  0.4756  0.4761
2.0  0.4772  0.4778  0.4783  0.4788  0.4793  0.4798  0.4803  0.4808  0.4812
2.1  0.4821  0.4826  0.4830  0.4834  0.4838  0.4842  0.4846  0.4850  0.4854
2.2  0.4861  0.4864  0.4868  0.4871  0.4875  0.4878  0.4881  0.4884  0.4887
2.3  0.4893  0.4896  0.4898  0.4901  0.4904  0.4906  0.4909  0.4911  0.4913
2.4  0.4918  0.4920  0.4922  0.4925  0.4927  0.4929  0.4931  0.4932  0.4934
2.5  0.4938  0.4940  0.4941  0.4943  0.4945  0.4946  0.4948  0.4949  0.4951
2.6  0.4953  0.4955  0.4956  0.4957  0.4959  0.4960  0.4961  0.4962  0.4963
2.7  0.4965  0.4966  0.4967  0.4968  0.4969  0.4970  0.4971  0.4972  0.4973
2.8  0.4974  0.4975  0.4976  0.4977  0.4977  0.4978  0.4979  0.4979  0.4980
2.9  0.4981  0.4982  0.4982  0.4983  0.4984  0.4984  0.4985  0.4985  0.4986
3.0  0.4987  0.4987  0.4987  0.4988  0.4988  0.4989  0.4989  0.4989  0.4990
3.1  0.4990  0.4991  0.4991  0.4991  0.4992  0.4992  0.4992  0.4992  0.4993
3.2  0.4993  0.4993  0.4994  0.4994  0.4994  0.4994  0.4994  0.4995  0.4995
3.3  0.4995  0.4995  0.4995  0.4996  0.4996  0.4996  0.4996  0.4996  0.4996
3.4  0.4997  0.4997  0.4997  0.4997  0.4997  0.4997  0.4997  0.4997  0.4997
3.5  0.4998  0.4998  0.4998  0.4998  0.4998  0.4998  0.4998  0.4998  0.4998
4.0  0.4999683
4.5  0.4999966
5.0  0.4999997
Normal Distribution Based Probability Estimates of Error
(Confidence Intervals):

If a data population abides by Gaussian theory, the probability
that a single data item will fall into an interval, expressed in terms of the
standard deviation of this population, can be obtained by using
the tabulated integral given on the previous page.

Example 1:
P(μ+σ ≤ x ≤ μ+1.2σ) = P(μ ≤ x ≤ μ+1.2σ) − P(μ ≤ x ≤ μ+σ)
= P(0 ≤ z ≤ 1.2) − P(0 ≤ z ≤ 1.0)
= 0.3849 − 0.3413 = 0.0436 = 4.36 %

Implying that the probability of a measurement
being in the interval [μ+σ, μ+1.2σ] is 4.36 %.
INTERPRETATION OF FREQUENCY CURVES
2. STANDARD DEVIATION

1
P
 P ( z ). dz
1
 0 . 6826
0.3980

THE PROBABILITY THAT 0.2420 x


x    z 
IS 68.3 % 
P
 


 

P ( x ). dx  0 . 6826 -1 +1 z

- +
 x
3. GAUSSIAN DISTRIBUTION
(NORMALIZED)
Example:
P(μ−2σ ≤ x ≤ μ+2σ) = 2·P(μ ≤ x ≤ μ+2σ)
= 2·P(0 ≤ z ≤ 2)
= 2·0.4772 = 0.9544 = 95.44 %
Implying that the x value will be as close to μ as ±2σ with a probability
(certainty) greater than 95 %; i.e., with an uncertainty less than 5 %.
This information may be shown as: x = μ ± 2σ (95.44 %)
or, if unbiased estimates are used: x = x̄ ± 2s (95.44 %)

P = ∫_{−2}^{+2} P(z)dz = 0.9544

[Figure: the shaded area between z = −2 and z = +2.]
3. GAUSSIAN DISTRIBUTION
(NORMALIZED)
Example:
P(x ≤ μ−3σ) = P(z ≤ −3)
= P(0 ≤ z ≤ ∞) − P(0 ≤ z ≤ 3)
= 0.5 − 0.4987 = 0.0013 = 1.3 ‰
Implying that x values 3σ or more below μ are likely
only with a probability of 1.3 ‰.

P = ∫_{−3}^{+3} P(z)dz = 0.9974

[Figure: the shaded area between z = −3 and z = +3.]
INTERPRETATION OF FREQUENCY CURVES
EXPECTED VALUE

ONLY μ HAS UNIQUE PROPERTIES

[Figure: a frequency curve with two candidate values z₁ ≠ z₂ marked.]

A REASON FOR CHOOSING μ AS THE EXPECTED VALUE IS THAT IT
COULD NOT VERY WELL BE ANY OTHER VALUE

FOR EXAMPLE, IF Z1 WERE CHOSEN, THEN BASED SOLELY ON THE
EVIDENCE OF THE FREQUENCY CURVE Z2 COULD ALSO BE CHOSEN

A. IT IS THE MEAN OF ALL THE MEASUREMENTS
B. IT IS THE MODE (THE VALUE WITH THE GREATEST FREQUENCY)
C. IT IS THE MEDIAN: MEASUREMENTS ABOVE AND BELOW μ OCCUR
EQUALLY FREQUENTLY
THE MEAN IS CALCULATED AS FOLLOWS :

A. DISCRETE VALUE DISTRIBUTION

μ = (Σᵢ Xᵢ)/N = Σⱼ Xⱼ·F(Xⱼ)
where i = 1 to N
j = 1 to M
THE MEAN IS CALCULATED AS FOLLOWS :

B. HISTOGRAM

MULTIPLY THE MID-VALUE OF THE


MEASUREMENT IN EACH INTERVAL BY THE
RELATIVE FREQUENCY FOR THAT INTERVAL
AND SUM OVER ALL POSSIBLE VALUES

μ = Σⱼ ½(Xⱼ + Xⱼ₊₁)·F(Xⱼ)
THE MEAN IS CALCULATED AS FOLLOWS :

C. CONTINUOUS DISTRIBUTIONS

μ = ∫_{−∞}^{+∞} X·P(X)dX

σ² = VARIANCE
σ = STANDARD DEVIATION
FOR FINITE NUMBER OF MEASUREMENTS

s² = (1/N)·Σᵢ₌₁ᴺ (xᵢ − x̄)² = Σⱼ₌₁ᴹ (xⱼ − x̄)²·F(xⱼ)

s = √[(1/N)·Σᵢ₌₁ᴺ (xᵢ − x̄)²]

THIS IS THE ACTUAL STANDARD DEVIATION


FOR THE FINITE NUMBER OF MEASUREMENTS
TERMINOLOGY

 = THE BEST ESTIMATE OF THE


TRUE VALUE FOR INFINITE
NUMBER OF MEASUREMENTS
x = THE BEST ESTIMATE OF THE
TRUE VALUE FOR FINITE
NUMBER OF MEASUREMENTS
TERMINOLOGY

σ = THE STANDARD DEVIATION


FOR INFINITE NUMBER OF
MEASUREMENTS
s = THE STANDARD DEVIATION
FOR FINITE NUMBER OF
MEASUREMENTS
IF WE WANT TO ESTIMATE THE TRUE VALUE μ
AS A RESULT OF A FINITE NUMBER OF
MEASUREMENTS,

WE WILL ASSUME THAT μ ≈ x̄

THEN σ ≈ √(n/(n−1)) · s
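The correction factor √(n/(n−1)) simply converts the 1/N form of s into the familiar 1/(n−1) sample estimate of σ; a quick sketch with made-up readings:

```python
from math import sqrt

def sample_sd(xs):
    """s with the 1/N definition used above (biased low for small n)."""
    n = len(xs)
    m = sum(xs) / n
    return sqrt(sum((x - m) ** 2 for x in xs) / n)

def sigma_estimate(xs):
    """sigma ~ sqrt(n/(n-1)) * s, i.e. the usual 1/(n-1) estimate."""
    n = len(xs)
    return sqrt(n / (n - 1)) * sample_sd(xs)

data = [5.1, 4.9, 5.3, 5.0, 4.7]   # illustrative readings
print(sample_sd(data), sigma_estimate(data))
```

For these five readings s = 0.200 while the corrected estimate is about 0.224; the two converge as n grows.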
IN ORDER TO DETERMINE
HOW ACCURATELY x̄ IS
ESTIMATING μ:

σ(x̄) = σ/√n

σ(x̄) = STANDARD DEVIATION
OF x̄ FROM μ

THEREFORE
μ = x̄ ± σ(x̄)

THIS CORRESPONDS TO A 68.3 %
CONFIDENCE LEVEL FOR THE
INTERVAL
[x̄ − σ(x̄), x̄ + σ(x̄)]
TO CONTAIN THE TRUE VALUE.
IF WE WANT TO EXPRESS μ WITH A CONFIDENCE
LEVEL OF 95 %, THEN

σ'(x̄) = 1.96·σ/√n  →  μ = x̄ ± 1.96·σ(x̄)

AND FOR A 99 % CONFIDENCE LEVEL:

σ'(x̄) = 2.58·σ/√n  →  μ = x̄ ± 2.58·σ(x̄)

P = ∫_{−2.58}^{+2.58} P(z)dz = 0.99

[Figure: the shaded area between z = −2.58 and z = +2.58.]
Example:
Let the result of measurements to determine the spring constants of a sample
drawn from a very large number of valve springs manufactured be obtained as:
n = 40, x = 152.5 N/cm, s = 0.889 N/cm

a) Determine the range of spring constants with a “confidence level” of 95 %.

0.95 confidence level → P(x̄ − zs ≤ x ≤ x̄ + zs) = 0.95
→ P(x̄ ≤ x ≤ x̄ + zs) = 0.475
→ z = 1.96 (from table)
x = 152.5 ± 1.96·0.889 = 152.5 ± 1.74 N/cm (95 %)

b) Determine the confidence interval of the mean value with a “confidence level”
of 95 %.
Standard deviation of the mean: s(x̄) = s/√n = 0.889/√40 = 0.141 N/cm
μ = 152.5 ± 1.96·0.141 = 152.5 ± 0.28 N/cm (95 %)

c) Determine the % probability of having springs with spring constants greater
than 154 N/cm.

z = (x − x̄)/s = (154 − 152.5)/0.889 = 1.69
P(x > 154) = P(z > 1.69)
= P(0 ≤ z < ∞) − P(0 ≤ z ≤ 1.69)
= 0.5 − 0.4545 = 0.0455 = 4.6 %
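All three results can be reproduced numerically; a sketch in which math.erf stands in for the table lookup (so part c comes out slightly more precise than the table value):

```python
from math import erf, sqrt

def I(z):
    # One-sided standard normal integral from 0 to z (the tabulated function).
    return 0.5 * erf(z / sqrt(2.0))

n, xbar, s = 40, 152.5, 0.889   # data from the example

# a) 95 % range for a single spring constant (z = 1.96)
half = 1.96 * s
# b) 95 % confidence interval of the mean
s_mean = s / sqrt(n)
half_mean = 1.96 * s_mean
# c) probability of a spring constant greater than 154 N/cm
z = (154 - xbar) / s
p_above = 0.5 - I(z)

print(f"a) {xbar} ± {half:.2f} N/cm")
print(f"b) {xbar} ± {half_mean:.2f} N/cm")
print(f"c) {100 * p_above:.1f} %")
```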
[Standard normal distribution table repeated; see the table given earlier.
The highlighted entry is I(2.58) = 0.4951, used for the 99 % confidence level.]
Some commonly used confidence levels are:

Common Name          Confidence Level   Certainty %   P(|x − μ| > zσ)
Probable Error       ±0.6754 σ          50            1 in 2
Standard Deviation   ±σ                 68.26         1 in 3
90 % Error           ±1.65 σ            90            1 in 10
95 % Error           ±1.96 σ            95            1 in 20
Two Sigma Error      ±2 σ               95.44         1 in 22
99 % Error           ±2.58 σ            99            1 in 100
Three Sigma Error    ±3 σ               99.74         1 in 369
Maximum Error        ±3.29 σ            99.9+         1 in 1000
STUDENT’S t-DISTRIBUTION

WHEN THE SAMPLE SIZE IS SMALL (n < 20–30), s
BECOMES A BIASED ESTIMATE OF σ, AND THE
STANDARDIZED VARIABLE

t = (x̄ − μ)/(s/√n)

is used to determine confidence levels of the estimation of
the mean, based on the so-called “Student’s t-Distribution”
rather than the “Normal Distribution”. Therefore, the
estimation results of the mean must be presented as

μ = x̄ ± t·s/√n
The “t value” corresponds to the limits of the
integral

∫_{−t}^{+t} f(t,ν)dt = Desired Certainty (Confidence)

where f(t,ν) is the Student’s t-distribution defined as

f(t,ν) = [Γ((ν+1)/2) / (√(νπ)·Γ(ν/2))] · (1 + t²/ν)^(−(ν+1)/2)

which is independent of μ and σ; its shape is determined only by the
degrees of freedom ν = n−1, therefore by the number of data points, n.
It has almost the same characteristics as the normal distribution, except
that it has more probability concentrated in the tails and less in the
central part:

Mean = 0 & S.D. = √(ν/(ν−2))  (for ν > 2)

It converges to the normal distribution very fast as n gets large.


[Figure: the standardized normal (Gaussian) and Student t density curves;
the t curve is lower at the center and heavier in the tails.]

THE t-DISTRIBUTION HAS MORE
PROBABILITY CONCENTRATED AT
THE TAILS AND LESS IN THE CENTRAL
PART, WHERE

t = (x̄ − μ)/σ(x̄)

THEREFORE THE BEST ESTIMATE
OF THE TRUE VALUE BASED ON A
FINITE NUMBER (n < 20–30) OF
MEASUREMENTS AND A GIVEN
CONFIDENCE LEVEL CAN BE
EXPRESSED AS:

μ = x̄ ± t·σ(x̄)

WHERE t IS A FUNCTION OF (n) AND
THE CONFIDENCE LEVEL
n      CONFIDENCE LEVEL
       0.5    0.6    0.7    0.8    0.9    0.95    0.98    0.99    0.999
2      1.000  1.376  1.963  3.078  6.314  12.706  31.821  63.656  636.578
3      0.816  1.061  1.386  1.886  2.920  4.303   6.965   9.925   31.600
4      0.765  0.978  1.250  1.638  2.353  3.182   4.541   5.841   12.924
5      0.741  0.941  1.190  1.533  2.132  2.776   3.747   4.604   8.610
6      0.727  0.920  1.156  1.476  2.015  2.571   3.365   4.032   6.869
7      0.718  0.906  1.134  1.440  1.943  2.447   3.143   3.707   5.959
8      0.711  0.896  1.119  1.415  1.895  2.365   2.998   3.499   5.408
9      0.706  0.889  1.108  1.397  1.860  2.306   2.896   3.355   5.041
10     0.703  0.883  1.100  1.383  1.833  2.262   2.821   3.250   4.781
11     0.700  0.879  1.093  1.372  1.812  2.228   2.764   3.169   4.587
12     0.697  0.876  1.088  1.363  1.796  2.201   2.718   3.106   4.437
13     0.695  0.873  1.083  1.356  1.782  2.179   2.681   3.055   4.318
14     0.694  0.870  1.079  1.350  1.771  2.160   2.650   3.012   4.221
15     0.692  0.868  1.076  1.345  1.761  2.145   2.624   2.977   4.140
16     0.691  0.866  1.074  1.341  1.753  2.131   2.602   2.947   4.073
17     0.690  0.865  1.071  1.337  1.746  2.120   2.583   2.921   4.015
18     0.689  0.863  1.069  1.333  1.740  2.110   2.567   2.898   3.965
19     0.688  0.862  1.067  1.330  1.734  2.101   2.552   2.878   3.922
20     0.688  0.861  1.066  1.328  1.729  2.093   2.539   2.861   3.883
21     0.687  0.860  1.064  1.325  1.725  2.086   2.528   2.845   3.850
22     0.686  0.859  1.063  1.323  1.721  2.080   2.518   2.831   3.819
23     0.686  0.858  1.061  1.321  1.717  2.074   2.508   2.819   3.792
24     0.685  0.858  1.060  1.319  1.714  2.069   2.500   2.807   3.768
25     0.685  0.857  1.059  1.318  1.711  2.064   2.492   2.797   3.745
26     0.684  0.856  1.058  1.316  1.708  2.060   2.485   2.787   3.725
27     0.684  0.856  1.058  1.315  1.706  2.056   2.479   2.779   3.707
28     0.684  0.855  1.057  1.314  1.703  2.052   2.473   2.771   3.689
29     0.683  0.855  1.056  1.313  1.701  2.048   2.467   2.763   3.674
30     0.683  0.854  1.055  1.311  1.699  2.045   2.462   2.756   3.660
40     0.681  0.851  1.050  1.304  1.685  2.023   2.426   2.708   3.558
50     0.680  0.849  1.048  1.299  1.677  2.010   2.405   2.680   3.500
100    0.677  0.845  1.042  1.290  1.660  1.984   2.365   2.626   3.391
250    0.675  0.843  1.039  1.285  1.651  1.970   2.341   2.596   3.330
500    0.675  0.842  1.038  1.283  1.648  1.965   2.334   2.586   3.310
∞      0.674  0.842  1.036  1.282  1.645  1.960   2.326   2.576   3.290
μ = x̄ ± t·σ(x̄)

IS SIMILAR TO

μ = x̄ ± z·σ(x̄)

AND

t → z as n → ∞
Example
Let the result of measurements to determine the spring constants of a sample
drawn from a very large number of valve springs manufactured be obtained as:
n = 10, x̄ = 152.5 N/cm, s = 0.889 N/cm
Determine the confidence interval of the mean value with a “confidence level”
of 95 %.
Standard deviation of the mean:
s(x̄) = s/√n = 0.889/√10 = 0.281 N/cm

i) If the normal distribution is used, z = 1.96:
μ = 152.5 ± 1.96·0.281 = 152.5 ± 0.55 N/cm
ii) If the t-distribution is used, t = 2.262 (from table):
μ = 152.5 ± 2.262·0.281 = 152.5 ± 0.64 N/cm

If the data were:
n = 5, x̄ = 152.5 N/cm, s = 0.889 N/cm
Standard deviation of the mean:
s(x̄) = s/√n = 0.889/√5 = 0.398 N/cm

i) If the normal distribution is used, z = 1.96:
μ = 152.5 ± 1.96·0.398 = 152.5 ± 0.78 N/cm
ii) If the t-distribution is used, t = 2.776 (from table):
μ = 152.5 ± 2.776·0.398 = 152.5 ± 1.10 N/cm
REJECTION OF BAD DATA
CHAUVENET’S CRITERION
IF A MEASUREMENT IS THOUGHT TO BE WRONG
(NOT PRECISE OR INACCURATE) PERHAPS DUE TO
A FAULTY READING OR SO, THEN THE
CHAUVENET’S CRITERION IS USED TO CONFIRM
SUCH A SUSPICION.

The question here is: “Should a loner or outlier which is
thought to be faulty be rejected or eliminated?”

If one considers n measurements, then Chauvenet’s criterion
states that:
“A reading may be rejected if the probability
of obtaining its deviation from the mean is
less than 1/2n”
CHAUVENET’S CRITERION

CHAUVENET’S CRITERION IS USED


FOR REJECTING OBSERVATIONS
WHOSE ERRORS ARE “TOO GREAT”
THE LIMITING VALUE z_max, BEYOND WHICH ERRORS ARE
CONSIDERED TO BE “TOO GREAT”, SATISFIES:

1 − 2·I(z_max) = 1/(2n)

[Figure: the two tail areas beyond ±z_max under the standardized normal
curve.]
Example:
Let the result of measurements to determine the spring constants
of a sample drawn from a very large number of valve springs
manufactured be obtained as:
n = 40, x = 152.5 N/cm, s = 0.889 N/cm

It is desired to determine the range of the spring constant value to
be used to eliminate a measurement (out of 40 measurements) if it
happens to be located out of this range.

P(|x − x̄| ≥ zs) < 1/(2n) = 1/(2·40) = 0.0125
P(x − x̄ ≥ zs) < (1/(2n))/2 = 0.0125/2 = 0.00625
0.5 − P(0 ≤ x − x̄ ≤ zs) < 0.00625
P(0 ≤ x − x̄ ≤ zs) > 0.5 − 0.00625 = 0.49375
→ z ≈ 2.50 (from table)

Therefore, a measurement may be rejected if it lies outside the
range: x = x̄ ± zs = 152.5 ± 2.5·0.889 = 152.5 ± 2.22 N/cm
[Standard normal distribution table repeated; see the table given earlier.]
The following table lists values of maximum acceptable normalized
deviations (z) for various values of number of data points (n)
according to this criterion:
n zmax
3 1.38
4 1.54
5 1.65
6 1.73
7 1.80
10 1.96
15 2.13
25 2.33
50 2.57
100 2.81
300 3.14
500 3.29
1000 3.48

Note that when applying Chauvenet’s criterion:

1. n should be large.
2. End points of a curve must not be eliminated.
3. If a data point is rejected, then x̄ and s must be recomputed.
4. The criterion must not be applied successively (more than once).
Example:

The following readings (n=10) are taken of a certain physical length in


cm: 5.30, 5.73, 6.77, 5.26, 4.33, 5.45, 6.09, 5.64, 5.81, 5.75

Then the best estimates of the mean and standard deviation of the
true length can be computed as:

x = 5.613 cm and s = 0.627 cm

Now, let us test our data points for a possible inconsistency in


readings using Chauvenet’s criterion:
i xi zi
1 5.30 0.499
2 5.73 0.187
3 6.77 1.845
4 5.26 0.563
5 4.33 2.046
6 5.45 0.260
7 6.09 0.761
8 5.64 0.043
9 5.81 0.314
10 5.75 0.219
In accordance with the Table given in previous page, for n=10, a data point
with zi > 1.96 may be eliminated. Hence, let us decide to reject the data point
number 5 with z5 = 2.046 > 1.96.
P(0 ≤ x − x̄ ≤ zs) > 0.5 − (1/(2·10))/2 = 0.475
→ z = 1.96 (from table)

Therefore, a measurement may be rejected if it lies
outside the range: x = x̄ ± zs = 5.613 ± 1.96·0.627 = 5.613 ± 1.23 cm

When this point is eliminated, the new estimates of the mean and the standard
deviation of the true length become:

x = 5.756 cm and s = 0.462 cm


Note that the elimination of this data point has resulted in a 26.5 % reduction in
s (from 0.627 to 0.462).
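Chauvenet's criterion for this data set can be automated by solving 1 − 2I(z) = 1/(2n) numerically; a sketch in which bisection on math.erf stands in for the table lookup (s here uses the 1/(n−1) form, which reproduces the slide's s = 0.627 cm):

```python
from math import erf, sqrt

def z_chauvenet(n, tol=1e-10):
    """Largest acceptable |z|: solves P(|z| > z_max) = 1/(2n) by bisection."""
    target = 1.0 / (2 * n)
    lo, hi = 0.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        tail = 1.0 - erf(mid / sqrt(2.0))   # two-sided tail P(|z| > mid)
        if tail > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

readings = [5.30, 5.73, 6.77, 5.26, 4.33, 5.45, 6.09, 5.64, 5.81, 5.75]
n = len(readings)
mean = sum(readings) / n
s = sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
zmax = z_chauvenet(n)
rejected = [x for x in readings if abs(x - mean) / s > zmax]
print(f"mean={mean:.3f} cm, s={s:.3f} cm, zmax={zmax:.2f}, rejected={rejected}")
```

For n = 10 the computed z_max is 1.96, and only the reading 4.33 cm (z = 2.05) falls outside the acceptable band, matching the slides.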
 2
(CHI  SQUARE ) TEST

IN GENERAL THIS TEST CAN BE


APPLIED TO REJECT A HYPOTHESIS

FOR ERROR ANALYSIS IT IS USED TO


TEST IF THE ERROR IS RANDOM OR
NOT

THAT IS, CAN THE DATA BE ASSUMED


TO HAVE A NORMAL (UNBIASED)
DISTRIBUTION
Randomness test: χ² (CHI-SQUARE) TEST

If we want to determine how well a given set of data fit to an


assumed distribution, chi-square test is used. In this test the
quantity chi-square is defined as:

χ² = Σᵢ₌₁ᴺ [(n_o)ᵢ − (n_e)ᵢ]² / (n_e)ᵢ
where
N : the number of cells or groups of observations
(no)i : number of observed occurrences in group i
(ne)i : number of expected occurrences in group i
(in other words, the value which would be
obtained if the measurements matched the
expected distribution perfectly)
Chi-Square Function

F = N − k = DEGREES OF FREEDOM
k = NUMBER OF CONSTRAINTS

[Figure: χ² value versus number of degrees of freedom F, plotted for
probability levels P = 0.001, 0.01, 0.05, 0.50, 0.95, and 0.99.]
The following is tabulated data of this plot:

     Probability
F    0.99    0.95    0.50    0.05    0.01    0.001
1    0.0002  0.004   0.45    3.84    6.63    10.83
2    0.02    0.10    1.39    5.99    9.21    13.82
3    0.11    0.35    2.37    7.81    11.34   16.27
4    0.30    0.71    3.36    9.49    13.28   18.47
5    0.55    1.15    4.35    11.07   15.09   20.51
6    0.87    1.64    5.35    12.59   16.81   22.46
7    1.24    2.17    6.35    14.07   18.48   24.32
8    1.65    2.73    7.34    15.51   20.09   26.12
9    2.09    3.33    8.34    16.92   21.67   27.88
10   2.56    3.94    9.34    18.31   23.21   29.59
11   3.05    4.57    10.34   19.68   24.73   31.26
12   3.57    5.23    11.34   21.03   26.22   32.91
13   4.11    5.89    12.34   22.36   27.69   34.53
14   4.66    6.57    13.34   23.68   29.14   36.12
15   5.23    7.26    14.34   25.00   30.58   37.70
16   5.81    7.96    15.34   26.30   32.00   39.25
17   6.41    8.67    16.34   27.59   33.41   40.79
18   7.01    9.39    17.34   28.87   34.81   42.31
19   7.63    10.12   18.34   30.14   36.19   43.82
20   8.26    10.85   19.34   31.41   37.57   45.31
21   8.90    11.59   20.34   32.67   38.93   46.80
22   9.54    12.34   21.34   33.92   40.29   48.27
23   10.20   13.09   22.34   35.17   41.64   49.73
24   10.86   13.85   23.34   36.42   42.98   51.18
25   11.52   14.61   24.34   37.65   44.31   52.62
26   12.20   15.38   25.34   38.89   45.64   54.05
27   12.88   16.15   26.34   40.11   46.96   55.48
28   13.56   16.93   27.34   41.34   48.28   56.89
29   14.26   17.71   28.34   42.56   49.59   58.30
30   14.95   18.49   29.34   43.77   50.89   59.70
31   15.66   19.28   30.34   44.99   52.19   61.10
32   16.36   20.07   31.34   46.19   53.49   62.49
33   17.07   20.87   32.34   47.40   54.78   63.87
34   17.79   21.66   33.34   48.60   56.06   65.25
35   18.51   22.47   34.34   49.80   57.34   66.62


In this plot and table, F represents the number of degrees of
freedom in the measurements and is given by

F=N-k

where N is the number of cells and k is the number of imposed


conditions on the expected distribution.

Obviously, the smaller the χ²-value, the better the
agreement between the assumed distribution and the observed
values, because it corresponds to a larger probability value for
this match.
Example:
An equal number of motors are purchased from a company A and
company B. At the end of a year, records show 5 failures for A-type
motors and 9 failures for B-type motors. Does this result imply that A-
type motors are more reliable?

In this problem, we have two cases (N=2): A-type failures and B-type
failures; that is, (no)A = 5 & (no)B = 9.
If we make the hypothesis that failures are random, then for a total of 14
failures one would expect 7 failures for A and 7 failures for B. So, (ne)A =
(ne)B = 7,
giving χ² = (5 − 7)²/7 + (9 − 7)²/7 = 1.14

The only imposed condition in this problem is

(n_o)A + (n_o)B = 14, hence k = 1 → F = 2 − 1 = 1

If these values are used, it is found that the probability of the difference
in failure rates being coincidental is 0.286 (≈ 30 %).
This is a reasonably high probability and does not allow us to say
directly that A-types are more reliable than B-types.
If the same failure rates continue and at the end of a longer period we
end up with 50 failures for A and 90 failures for B, χ² becomes 11.4, giving a
probability value of 0.000734 (< 0.1 %) for the difference to be
coincidental. In this case, we can say without a doubt that A-type motors
are more reliable than B-type motors.

Usually, a probability of less than 5 % allows rejection of a random


difference hypothesis.
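For one degree of freedom the χ² tail probability has a closed form, P = erfc(√(χ²/2)), so the two p-values in the motor example can be checked directly; a short sketch:

```python
from math import erfc, sqrt

def chi2(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def p_value_1dof(x):
    """P(chi-square with 1 degree of freedom > x) = erfc(sqrt(x/2))."""
    return erfc(sqrt(x / 2.0))

for obs in ([5, 9], [50, 90]):
    half = sum(obs) / 2.0            # random-failure hypothesis: 50/50 split
    x2 = chi2(obs, [half, half])
    print(f"observed={obs}: chi2={x2:.2f}, p={p_value_1dof(x2):.4f}")
```

The first case gives p ≈ 0.29 (difference plausibly coincidental); the second gives p well below 0.001, so the random-difference hypothesis is rejected.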

If we have a series of measurements and the goodness of the fit of the
measurement data to a normal distribution is in question, then the
measurement range is divided somewhat arbitrarily into sampling
intervals such that each interval contains at least 5 data points. Then,
using the number of occurrences in each interval, the hypothesis of the
data distribution is tested by using the χ² method. Note that k = 3 for this
case, since:
A fixed number of data points is used (1 constraint)
To estimate the expected occurrences, the mean and standard
deviation of the sample are used (2 more constraints)
Example
Suppose an investigator wishes to see if 200 boys and girls respond differently to an
attitudinal question regarding the educational value of extracurricular activities and
observed the following (A = very valuable, U = uncertain, and D = little value).
Boys: A = 60, U = 20, D = 20
Girls: A = 40, U = 0, D = 60
Expected frequencies (Fe) for each cell are determined by the following formula:

Fe = (row subtotal × column subtotal) / total number of observations

Example - For the cell "Boys - A", the corresponding row subtotal = 100, the
corresponding column subtotal = 100, and the total number of observations = 200,
so Fe = (100 × 100)/200 = 50.
NOTE: Row subtotals and column subtotals must have equal sums, and total
expected frequencies must equal total observed frequencies.

http://www.okstate.edu/ag/agedcm4h/academic/aged5980a/5980/newpage28.htm
           A         U         D         Row Subtotals
Boys       60 (50)   20 (10)   20 (40)   100
Girls      40 (50)   0 (10)    60 (40)   100
Column
Subtotals  100       20        80        200

(Expected frequencies Fe are shown in parentheses.)

Degrees of Freedom = (Rows - 1)(Columns - 1) = (2 - 1)(3 - 1) = 2

χ² = Σ(Fo − Fe)²/Fe
= (60−50)²/50 + (20−10)²/10 + (20−40)²/40 + (40−50)²/50 + (0−10)²/10 + (60−40)²/40
= 44

Table value of χ²(0.05) with 2 degrees of freedom = 5.991
Since 44 > 5.991, reject the null hypothesis.
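The expected frequencies and the χ² statistic for a contingency table like this can be computed mechanically; a sketch (the resulting statistic is what gets compared against the 5.991 table value):

```python
def expected_counts(table):
    """Fe = (row subtotal * column subtotal) / total observations."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    return [[r * c / total for c in cols] for r in rows]

def chi2_stat(table):
    exp = expected_counts(table)
    return sum((o - e) ** 2 / e
               for row_o, row_e in zip(table, exp)
               for o, e in zip(row_o, row_e))

table = [[60, 20, 20],   # boys: A, U, D
         [40, 0, 60]]    # girls: A, U, D
x2 = chi2_stat(table)
df = (len(table) - 1) * (len(table[0]) - 1)
print(f"chi2={x2:.1f}, df={df}")   # chi2=44.0, df=2
```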
Degrees of Freedom
A value of χ² cannot be evaluated unless the number of degrees of freedom associated with
it is known. The number of degrees of freedom associated with any χ² may be easily
computed.
If there is one independent variable, df = r - 1, where r is the number of levels of the
independent variable.
If there are two independent variables, df = (r - 1)(s - 1), where r and s are the number of
levels of the first and second independent variables, respectively.
If there are three independent variables, df = (r - 1)(s - 1)(t - 1), where r, s, and t are the
number of levels of the first, second, and third independent variables, respectively.
