Standard Deviation Definition
Standard deviation is a statistical term used to measure the amount of variability, or dispersion, around an average. Technically, it is a measure of volatility. Dispersion is the difference between the actual value and the average value: the larger this dispersion or variability, the higher the standard deviation.
Standard Deviation
The Standard Deviation is a measure of how spread out numbers
are.
Its symbol is σ (the Greek letter sigma).
The standard deviation is a measure of the amount of variation or dispersion of a set of values. A low standard deviation indicates that the values tend to be close to the mean (also called the expected value) of the set, while a high standard deviation indicates that the values are spread out over a wider range.
Standard deviation may be abbreviated SD, and is most commonly represented in mathematical texts and equations by the lowercase Greek letter sigma (σ) for the population standard deviation, or the Latin letter s for the sample standard deviation.[2]
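For reference, the two quantities behind these symbols can be written out explicitly; this is the standard textbook form, with x_1, …, x_N denoting the data, μ the population mean, and x̄ the sample mean:
\[
\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu\right)^2},
\qquad
s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2}
\]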
The standard deviation is a statistic that measures the dispersion of a
dataset relative to its mean and is calculated as the square root of
the variance. The standard deviation is calculated as the square root of
variance by determining each data point's deviation relative to the mean. If
the data points are further from the mean, there is a higher deviation within
the data set; thus, the more spread out the data, the higher the standard
deviation.
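As a quick illustration of that calculation, the short Python sketch below (the function and variable names are chosen here purely for illustration) computes the population standard deviation step by step and compares a tightly clustered data set with a widely spread one:

import math

def population_std_dev(values):
    # Mean of the data set
    mean = sum(values) / len(values)
    # Each data point's squared deviation relative to the mean
    squared_deviations = [(x - mean) ** 2 for x in values]
    # Variance is the average squared deviation; SD is its square root
    variance = sum(squared_deviations) / len(values)
    return math.sqrt(variance)

clustered = [48, 49, 50, 51, 52]   # values close to the mean -> low SD
spread = [10, 30, 50, 70, 90]      # values far from the mean -> high SD

print(population_std_dev(clustered))  # about 1.41
print(population_std_dev(spread))     # about 28.28

The same results can be checked against the statistics module in the Python standard library (statistics.pstdev for the population version, statistics.stdev for the sample version).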
Standard Deviation
Advantages:
- Shows how much data is clustered around a mean value
- It gives a more accurate idea of how the data is distributed
- Not as affected by extreme values
Disadvantages:
- It doesn't give you the full range of the data
- It can be hard to calculate
- Only used with data where an independent variable is plotted against the frequency of it
- Assumes a normal distribution pattern
The standard deviation (SD) is one measure of variation of a data set. Some of the advantages of SD are:
1. It is calculated based on all the data points in a data set.
2. It is a good estimate of the variation of a data set if the distribution is normal.
Some of the disadvantages of SD are:
1. Standard deviation is complex to compute and difficult to understand as compared to other measures of dispersion.
2. Standard deviation is highly affected by the extreme values in the series.
3. Standard deviation cannot be obtained for an open-end class frequency distribution.
Advantages/Merits Of Standard Deviation
1. Rigidly Defined
Standard deviation is a rigidly defined measure and its value is always fixed.
2. Best Measure
Standard deviation is based on all the items in the series. So, it is the best measure of
dispersion.
3. Less Affected
Standard deviation is less affected by sampling fluctuations than other measures (mean deviation and quartile deviation).
4. Suitable For Algebraic Operation
Standard deviation can be used for mathematical operations and algebraic treatments.
It is also applicable in statistical analysis.
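For instance, one standard algebraic property that illustrates this suitability (stated here only as a reference, with a and b denoting arbitrary constants and X a data series or random variable):
\[
\operatorname{SD}(aX + b) = |a|\,\operatorname{SD}(X),
\qquad
\operatorname{Var}(aX + b) = a^{2}\,\operatorname{Var}(X)
\]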
Disadvantages
1. Complex Method
Standard deviation is complex to compute and difficult to understand as compared to
other measures of dispersion.
2. High Effect
Standard deviation is highly affected by the extreme values in the series.
3. Standard deviation cannot be obtained for an open-end class frequency distribution.
The standard deviation, like any other statistical device, has certain merits and demerits. These are outlined below:
Merits
It is rigidly defined and free from any ambiguity.
Its calculation is based on all the observations of a series, and it cannot be correctly calculated if any item of the series is ignored.
It strictly follows the algebraic principles, and it never ignores the + and – signs the way the mean deviation does.
It is capable of further algebraic treatment as it has a lot of algebraic properties.
It is used as a formidable instrument in higher statistical analysis, viz. correlation, skewness, regression, and sample studies.
It is not much affected by fluctuations in sampling, for which reason it is widely used in testing hypotheses and in conducting different tests of significance, viz. the t-test, etc.
In a normal distribution, X̄ ± 1σ covers 68.27% of the values, for which reason it is called a standard measure of dispersion.
It exhibits the scatter or dispersion of the various items of a series from its arithmetic mean, and thereby justifies its name as a measure of dispersion.
It enables us to make a comparative study of two or more series, and to comment upon their consistency or stability, through calculation of important factors such as the coefficient of variation and the variance.
It enables us to determine the reliability of the means of two or more series when they show identical means.
It can be calculated through a good number of methods yielding the same results.
It maintains an empirical relation with other measures of dispersion, as under:
Range ≈ 6σ, QD ≈ (2/3)σ, and MD ≈ (4/5)σ
It has a good number of algebraic properties, which make it possible to determine many connected measures, such as the combined standard deviation of two or more series (the relevant formulas are sketched after this list).
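As a reference for the factors mentioned above, the usual textbook formulas for the coefficient of variation and for the combined standard deviation of two series can be sketched as follows (the symbols n₁, n₂, σ₁, σ₂, x̄₁, x̄₂, d₁, d₂ are introduced here only for this illustration):
\[
\mathrm{CV} = \frac{\sigma}{\bar{x}} \times 100\%,
\qquad
\sigma_{12} = \sqrt{\frac{n_1\left(\sigma_1^{2} + d_1^{2}\right) + n_2\left(\sigma_2^{2} + d_2^{2}\right)}{n_1 + n_2}},
\]
where \(\bar{x}_{12} = \dfrac{n_1\bar{x}_1 + n_2\bar{x}_2}{n_1 + n_2}\) is the combined mean and \(d_i = \bar{x}_i - \bar{x}_{12}\).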
Demerits
It is not easily understood by a layperson.
Its calculation is difficult as it involves many mathematical models and processes.
It is affected very much by the extreme values of a series, inasmuch as the squares of the deviations of large items are proportionately bigger than the squares of the smaller items.
It cannot be used for comparing the dispersion of two or more series given in different units.