
STANDARD DEVIATION AND ITS APPLICATIONS

INTRODUCTION:
In statistics, the standard deviation (SD, also represented by the Greek letter sigma σ or the Latin
letter s) is a measure that is used to quantify the amount of variation or dispersion of a set of data
values.[1] A low standard deviation indicates that the data points tend to be close to
the mean (also called the expected value) of the set, while a high standard deviation indicates
that the data points are spread out over a wider range of values.
Standard deviation is a measure of how spread out a data set is. It's used in a huge number of
applications. In finance, standard deviations of price data are frequently used as a measure of
volatility. In opinion polling, standard deviations are a key part of calculating margins of error.

Standard deviation is a measurement, used in statistics, of how much a number varies from the
average of a series of numbers. The standard deviation tells those interpreting the data
how reliable the data is, or how much difference there is between the pieces of data, by showing
how close to the average all of the data lies.
• A low standard deviation means that the data is very closely clustered around the average, and thus
very reliable.
• A high standard deviation means that there is a large variance between the data and the
statistical average, and thus not as reliable.
Calculating Standard Deviation
The standard deviation is determined by finding the square root of what is called the variance.
The variance is found by averaging the squared differences from the mean.

IN ORDER TO DETERMINE STANDARD DEVIATION:


1. Determine the mean (the average of all the numbers) by adding up all the data pieces and
dividing by the number of pieces of data.
2. Subtract the mean from each piece of data and then square the result.
3. Determine the average of all of those squared numbers calculated in #2 to find the variance.
4. Find the square root of the number in #3; that is the standard deviation.
Calculators are available online to quickly determine a standard deviation. For example,
at Calculator.net and MathPortal.org.
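
For readers who prefer code to a calculator, the following Python sketch (using a made-up data set chosen only for illustration) walks through the four steps above; Python's built-in statistics.pstdev function gives the same result.

# A made-up data set, used only for illustration.
data = [2, 4, 4, 4, 5, 5, 7, 9]

# Step 1: the mean (sum of the data divided by the number of pieces of data).
mean = sum(data) / len(data)

# Step 2: square the difference of each piece of data from the mean.
squared_diffs = [(x - mean) ** 2 for x in data]

# Step 3: the variance is the average of those squared differences.
variance = sum(squared_diffs) / len(squared_diffs)

# Step 4: the standard deviation is the square root of the variance.
std_dev = variance ** 0.5

print(mean, variance, std_dev)   # 5.0 4.0 2.0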
USES FOR STANDARD DEVIATION
Some examples of situations in which standard deviation might help to understand the value of
the data:
• A class of students took a math test. Their teacher found that the mean score on the test was
85%. She then calculated the standard deviation of the test scores and found a very small
standard deviation, which suggested that most students scored very close to 85%.
• A dog walker wants to determine whether the dogs on his route are close in weight or not. He
takes the average of the weights of all ten dogs, then calculates the variance and the standard
deviation. His standard deviation is extremely high. This suggests that the dogs' weights vary
widely, or that he has a few dogs whose weights are outliers that are skewing the data.
• A market researcher is analyzing the results of a recent customer survey. He wants some
measure of the reliability of the answers received in the survey in order to predict how a larger
group of people might answer the same questions. A low standard deviation shows that the
answers can reasonably be projected to a larger group of people.
• A weather reporter is analyzing the high temperature forecast for a series of dates versus the
actual high temperature recorded on each date. A low standard deviation would indicate a
reliable weather forecast.
• A class of students took a test in Language Arts. The teacher determines that the mean grade
on the exam is 65%. She is concerned that this is very low, so she calculates the standard
deviation to see whether most students scored close to the mean or not. The teacher finds that
the standard deviation is high. After closely examining all of the tests, she is able to determine
that several students with very low scores were the outliers that pulled down the mean of the
entire class's scores.
• An employer wants to determine whether the salaries in one department seem fair for all
employees, or whether there is a great disparity. He finds the average of the salaries in that
department and then calculates the variance and the standard deviation. The employer finds
that the standard deviation is slightly higher than he expected, so he examines the data further
and finds that while most employees fall within a similar pay bracket, three loyal employees
who have been in the department for 20 years or more, far longer than the others, are making
far more due to their longevity with the company. Doing the analysis helped the employer
understand the range of salaries of the people in the department.

APPLICATIONS OF STANDARD DEVIATION

1. Standard score

Standard scores are used to compare students’ performances in different tests.


For a distribution of marks with mean x̄ and standard deviation σ, the standard score z of a mark x is

z = (x − x̄) / σ
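
As a rough illustration with invented marks and class statistics, the Python sketch below computes standard scores for results from two different tests so that they can be compared on a common scale.

# Hypothetical marks, means and standard deviations, for illustration only.
def standard_score(mark, mean, sd):
    # z = (mark - mean) / sd
    return (mark - mean) / sd

z_maths = standard_score(72, 65, 10)    # 0.7 standard deviations above the mean
z_science = standard_score(60, 50, 8)   # 1.25 standard deviations above the mean
print(z_maths, z_science)               # the science result is relatively stronger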

2. Percentages of Normal data lying within a certain number of standard deviations from the
mean

A distribution curve for a set of data is basically a frequency or relative frequency curve of the
data. It is found that the distribution curves for a lot of commonly occurring data sets follow a
certain pattern that came to be known as normal distributions.

A normal distribution has a bell-shaped curve as shown.


[Figure: a bell-shaped normal curve, symmetric about the mean, with 50% of the data on each
side of the mean. About 34% of the data lie in each interval between the mean and one standard
deviation on either side, 13.5% between one and two standard deviations, and 2.35% between two
and three standard deviations, so that roughly 68% of the data lie within 1 standard deviation of
the mean, 95% within 2, and 99.7% within 3.]

A normal curve has the following characteristics:

1. It is symmetrical about the mean.

2. Mean = mode = median; they all lie at the centre of the curve.

3. There are fewer data values the further one moves from the mean:

a) about 68% of the data lie within 1 standard deviation from the mean.
b) about 95% of the data lie within 2 standard deviations from the mean.
c) about 99.7% of the data lie within 3 standard deviations from the mean.
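
A quick numerical check of these percentages, assuming NumPy is available, is to simulate a large normally distributed data set and count how much of it falls within 1, 2 and 3 standard deviations of the mean:

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=100_000)   # simulated normal data
mean, sd = data.mean(), data.std()

for k in (1, 2, 3):
    fraction = np.mean(np.abs(data - mean) <= k * sd)
    print(f"within {k} standard deviation(s): {fraction:.1%}")
# prints roughly 68%, 95% and 99.7%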
ESTIMATION

One can find the standard deviation of an entire population in cases (such as standardized
testing) where every member of a population is sampled. In cases where that cannot be done, the
standard deviation σ is estimated by examining a random sample taken from the population and
computing a statistic of the sample, which is used as an estimate of the population standard
deviation. Such a statistic is called an estimator, and the estimator (or the value of the estimator,
namely the estimate) is called a sample standard deviation, and is denoted by s (possibly with
modifiers). However, unlike in the case of estimating the population mean, for which the sample
mean is a simple estimator with many desirable properties (unbiased, efficient, maximum
likelihood), there is no single estimator for the standard deviation with all these properties,
and unbiased estimation of standard deviation is a very technically involved problem. Most
often, the standard deviation is estimated using the corrected sample standard
deviation (using N − 1), defined below, and this is often referred to as the "sample standard
deviation", without qualifiers. However, other estimators are better in other respects: the
uncorrected estimator (using N) yields lower mean squared error, while using N − 1.5 (for the
normal distribution) almost completely eliminates bias.
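
The trade-offs just described can be seen in a small simulation. The sketch below (assuming NumPy, with an arbitrarily chosen sample size) averages each estimator over many samples drawn from a normal population whose true standard deviation is 1:

import numpy as np

rng = np.random.default_rng(1)
n, trials = 10, 200_000
samples = rng.normal(0.0, 1.0, size=(trials, n))   # true sigma = 1

# Sum of squared deviations of each sample from its own mean.
ss = np.sum((samples - samples.mean(axis=1, keepdims=True)) ** 2, axis=1)

for label, denom in [("N", n), ("N - 1", n - 1), ("N - 1.5", n - 1.5)]:
    estimates = np.sqrt(ss / denom)
    print(f"denominator {label}: average estimate {estimates.mean():.4f}")
# The N - 1.5 denominator comes closest to 1 on average (nearly unbiased),
# while the N and N - 1 denominators underestimate it slightly.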

UNCORRECTED SAMPLE STANDARD DEVIATION:

Firstly, the formula for the population standard deviation (of a finite population) can be applied
to the sample, using the size of the sample as the size of the population (though the actual
population size from which the sample is drawn may be much larger). This estimator, denoted
by sN, is known as the uncorrected sample standard deviation, or sometimes the standard
deviation of the sample (considered as the entire population), and is defined as follows:

s_N = sqrt( (1/N) Σ (x_i − x̄)² ),  where the sum runs over i = 1, …, N.

Here x_1, x_2, …, x_N are the observed values of the sample items and x̄ is the mean value of these
observations, while the denominator N stands for the size of the sample: this is the square root of
the sample variance, which is the average of the squared deviations about the sample mean.

This is a consistent estimator (it converges in probability to the population value as the number
of samples goes to infinity), and is the maximum-likelihood estimate when the population is
normally distributed. However, it is a biased estimator, as the estimates are generally too low.
The bias decreases as the sample size grows, dropping off as 1/N, and is therefore most
significant for small or moderate sample sizes; for sufficiently large samples the bias falls below
1%. Thus for very large sample sizes, the uncorrected sample standard deviation is generally acceptable. This
estimator also has a uniformly smaller mean squared error than the corrected sample standard
deviation.
CORRECTED SAMPLE STANDARD DEVIATION:

If the biased sample variance (the second central moment of the sample, which is a downward-
biased estimate of the population variance) is used to compute an estimate of the population's
standard deviation, the result is

s_N = sqrt( (1/N) Σ (x_i − x̄)² )

Here taking the square root introduces further downward bias, by Jensen's inequality, due to
the square root being a concave function. The bias in the variance is easily corrected, but the bias
from the square root is more difficult to correct, and depends on the distribution in question. An
unbiased estimator for the variance is given by applying Bessel's correction, using N − 1 instead
of N to yield the unbiased sample variance, denoted s²:

s² = (1/(N − 1)) Σ (x_i − x̄)²

This estimator is unbiased if the variance exists and the sample values are drawn independently
with replacement. N − 1 corresponds to the number of degrees of freedom in the vector of
deviations from the mean, (x_1 − x̄, …, x_N − x̄).

Taking square roots reintroduces bias (because the square root is a nonlinear function, which
does not commute with the expectation), yielding the corrected sample standard
deviation, denoted by s:

s = sqrt( (1/(N − 1)) Σ (x_i − x̄)² )

As explained above, while s² is an unbiased estimator for the population variance, s is still a
biased estimator for the population standard deviation, though markedly less biased than the
uncorrected sample standard deviation. The bias is still significant for small samples (N less than
10), and also drops off as 1/N as sample size increases. This estimator is commonly used and
generally known simply as the "sample standard deviation".
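
As an illustration (assuming NumPy), the two estimators differ only in the denominator, which NumPy exposes through its ddof ("delta degrees of freedom") parameter:

import numpy as np

sample = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])   # made-up sample

s_n = sample.std(ddof=0)   # uncorrected: divides the sum of squares by N
s = sample.std(ddof=1)     # corrected: divides by N - 1 (Bessel's correction)

print(s_n, s)   # 2.0 and roughly 2.14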

EXAMPLE OF STANDARD DEVIATION

First, let's look at what a standard deviation is measuring. Consider two small businesses with
four employees each. In one business (call it company A), two employees make $19 per hour and
the other two make $21 per hour. In the second business (company B), two employees make $15
per hour, one makes $24, and the last makes $26.
In both companies, the average wage is $20 per
hour, but the distribution of hourly wages is clearly different. In company A, all four employees'
wages are tightly bunched around that average, while at company B, there's a big spread between
the two employees making $15 and the other two employees.

Standard deviation is a measure of how far away individual measurements tend to be from the
mean value of a data set. The standard deviation of company A's wages is $1, while the
standard deviation of company B's wages is about $5. In general, the larger the standard deviation
of a data set, the more spread out the individual points are in that set.
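
These two figures are easy to verify with Python's standard library (using the population standard deviation, since all four employees in each company are included):

from statistics import pstdev   # population standard deviation

company_a = [19, 19, 21, 21]
company_b = [15, 15, 24, 26]

print(pstdev(company_a))   # 1.0
print(pstdev(company_b))   # about 5.05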

Technically, It's More Complicated

The technical definition of standard deviation is somewhat complicated. First, for each data
value, find out how far the value is from the mean by taking the difference of the value and the
mean. Then, square all of those differences. Then, take the average of those squared differences.
Finally, take the square root of that average.

The reason we go through such a complicated process to define standard deviation is that this
measure appears as a parameter in a number of statistical and probabilistic formulas, most
notably the normal distribution.

The normal distribution is an extremely important tool in statistics. The shape of a normal
distribution is a bell-shaped curve, like the one shown earlier. That curve shows, roughly
speaking, how likely a random process following a normal distribution is to take on a particular
value along the horizontal axis. Values near the peak, where the curve is highest, are more likely
than values further away, where the curve is closer to the horizontal axis. Normal distributions
appear in situations where a large number of independent but similar random events occur;
things like the heights of people in a particular population tend to roughly follow a normal
distribution.

Standard deviations are important here because the shape of a normal curve is determined by its
mean and standard deviation. The mean tells you where the middle, highest part of the curve
should go, and the standard deviation tells you how narrow or wide the curve will be. If you
know these two numbers, you know everything you need to know about the shape of your curve.

Flipping this idea around, normal distributions also give us a good way to interpret standard
deviations. In any normal distribution, there are fixed probabilities for intervals around the mean
based on multiples of the standard deviation of the distribution. In particular, about 2/3 of
measurements of a normally distributed quantity should fall within one standard deviation of the
mean, 95% within two standard deviations, and 99.7% within three standard deviations, as the
normal-curve figure earlier in this section illustrates.

Suppose there's a standardized test that hundreds of thousands of students take. If the test's
questions are well designed, the students'
scores should be roughly normally distributed. Say the mean score on the test is 100, with a
standard deviation of 10 points. The rule mentioned above means that about 2/3 of the students
should have scores between 90 and 110, 95% of students should be between 80 and 120, and
nearly all the students - 99.7% - should have scores within three standard deviations of the mean,
that is, between 70 and 130.
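
Assuming the scores really are normally distributed with mean 100 and standard deviation 10, these percentages can be checked with Python's standard library:

from statistics import NormalDist

scores = NormalDist(mu=100, sigma=10)

for k in (1, 2, 3):
    low, high = 100 - 10 * k, 100 + 10 * k
    p = scores.cdf(high) - scores.cdf(low)
    print(f"between {low} and {high}: {p:.1%}")
# prints roughly 68.3%, 95.4% and 99.7%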

REFERENCE:

http://examples.yourdictionary.com/examples-of-standard-deviation.html

https://en.wikipedia.org/wiki/Standard_deviation
KG COLLEGE OF ARTS AND SCIENCE

ASSIGNMENT-I

NAME : V.SUSHMITHA

ROLL.NO :152AAB52

CLASS : II-B

DEPARTMENT : COMMERCE

DUE DATE : 30.12.2016

SUBJECT : STATISTICS FOR BUSINESS

ASSIGNMENT TOPIC : STANDARD DEVIATION AND ITS APPLICATIONS

FACULTY NAME : MS. NIRMALA

MARKS FACULTY SIGNATURE
