Measures of Variation
Range
The range is the simplest measure of variation to find. It is simply the highest value minus the lowest value.

RANGE = MAXIMUM - MINIMUM

Since the range only uses the largest and smallest values, it is greatly affected by extreme values; that is, it is not resistant.
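As a minimal sketch, here is the range computed in Python, using the same sample data (4, 5, 3, 6, 5) as the worked example later in this section:

```python
# Range = maximum - minimum, using the sample from the worked example
data = [4, 5, 3, 6, 5]
data_range = max(data) - min(data)
print(data_range)  # 3
```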
Variance
"Average Deviation"
The range only involves the smallest and largest numbers, and it would be desirable to have a statistic which involves all of the data values.

The first attempt one might make at this is something they might call the average deviation from the mean and define it as:

Average deviation = Σ(x - x̄) / n
The problem is that this summation is always zero, because the positive and negative deviations from the mean cancel exactly. So the average deviation will always be zero. That is why the average deviation is never used.
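The cancellation is easy to verify; a quick Python check using the same sample data as the worked example later in this section:

```python
data = [4, 5, 3, 6, 5]
mean = sum(data) / len(data)                   # 4.6
total_deviation = sum(x - mean for x in data)  # deviations: -0.6, 0.4, -1.6, 1.4, 0.4
print(abs(total_deviation) < 1e-9)             # True -- the deviations sum to zero
```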
Variation
So, to keep it from being zero, the deviation from the mean is squared and called the "squared deviation from the mean". The sum of the squared deviations from the mean is called the variation. The problem with the variation is that it does not take into account how many data values were used to obtain the sum.

Population Variance
If we divide the variation by the number of values in the population, we get something called the population variance. This variance is the "average squared deviation from the mean".

σ^2 = Σ(x - μ)^2 / N

Unbiased Estimate of the Population Variance
One would expect the sample variance to simply be the population variance with the population mean replaced by the sample mean. However, one of the major uses of statistics is to estimate the corresponding parameter, and this formula has the problem that the estimated value isn't, on average, the same as the parameter: dividing by the sample size systematically underestimates the population variance. To counteract this, the sum of the squares of the deviations is divided by one less than the sample size.

s^2 = Σ(x - x̄)^2 / (n - 1)

Standard Deviation
There is a problem with variances. Recall that the deviations were squared. That means that the units were also squared. To get the units back to the same as the original data values, the square root must be taken.

The sample standard deviation is not an unbiased estimator for the population standard deviation.
The calculator does not have a variance key on it. It does have a standard deviation key. You will have to square the standard deviation to find the variance.
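As a concrete sketch, Python's standard-library statistics module exposes both the population and sample versions of these quantities (illustrated here with the sample data from the worked example later in this section):

```python
import statistics

data = [4, 5, 3, 6, 5]

pop_variance = statistics.pvariance(data)   # divides by n (population variance)
samp_variance = statistics.variance(data)   # divides by n - 1 (unbiased estimate)
samp_stdev = statistics.stdev(data)         # square root of the sample variance

print(pop_variance, samp_variance, samp_stdev)
```

Squaring the standard deviation recovers the variance, which is exactly the trick described in the calculator note above.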
Sum of Squares (shortcuts)
The sum of the squares of the deviations from the mean is given a shortcut notation and several alternative formulas.

SS(x) = Σ(x - x̄)^2

A little algebraic simplification returns:

SS(x) = Σx^2 - (Σx)^2 / n
What's wrong with the first formula, you ask? Consider the following example; the last row contains the totals for the columns.
- Total the data values: 23
- Divide by the number of values to get the mean: 23/5 = 4.6
- Subtract the mean from each value to get the numbers in the second column.
- Square each number in the second column to get the values in the third column.
- Total the numbers in the third column: 5.2
- Divide this total by one less than the sample size to get the variance: 5.2 / 4 = 1.3
x | x - x̄ | (x - x̄)^2
--- | --- | ---
4 | 4 - 4.6 = -0.6 | (-0.6)^2 = 0.36
5 | 5 - 4.6 = 0.4 | (0.4)^2 = 0.16
3 | 3 - 4.6 = -1.6 | (-1.6)^2 = 2.56
6 | 6 - 4.6 = 1.4 | (1.4)^2 = 1.96
5 | 5 - 4.6 = 0.4 | (0.4)^2 = 0.16
23 | 0.00 (always) | 5.2
Now, let's consider the shortcut formula. The only things that you need to find are the sum of the values and the sum of the values squared. There is no subtraction and no decimals or fractions until the end. The last row contains the sums of the columns, just like before.
- Record each number in the first column and the square of each number in the second column.
- Total the first column: 23
- Total the second column: 111
- Compute the sum of squares: 111 - 23*23/5 = 111 - 105.8 = 5.2
- Divide the sum of squares by one less than the sample size to get the variance = 5.2 / 4 = 1.3
x | x^2
--- | ---
4 | 16
5 | 25
3 | 9
6 | 36
5 | 25
23 | 111
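Both computations above can be sketched in a few lines of Python; the definitional and shortcut formulas agree (up to floating-point rounding):

```python
data = [4, 5, 3, 6, 5]
n = len(data)

# Definitional formula: total the squared deviations from the mean
mean = sum(data) / n                                  # 23 / 5 = 4.6
ss_definitional = sum((x - mean) ** 2 for x in data)  # 5.2

# Shortcut formula: SS = sum(x^2) - (sum(x))^2 / n -- no mean required
ss_shortcut = sum(x ** 2 for x in data) - sum(data) ** 2 / n  # 111 - 105.8 = 5.2

variance = ss_shortcut / (n - 1)  # 5.2 / 4 = 1.3
print(ss_definitional, ss_shortcut, variance)
```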
Chebyshev's Theorem
The proportion of the values that fall within k standard deviations of the mean will be at least 1 - 1/k^2, where k is a number greater than 1. "Within k standard deviations" means the interval from x̄ - ks to x̄ + ks.
Chebyshev's Theorem is true for any data set, no matter what the distribution.
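A minimal check of the bound, using the sample data from the earlier tables (with k = 2, the theorem guarantees at least 1 - 1/4 = 75% of the values in the interval):

```python
data = [4, 5, 3, 6, 5]
n = len(data)
mean = sum(data) / n
s = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5  # sample standard deviation

k = 2
low, high = mean - k * s, mean + k * s            # the interval x̄ - ks to x̄ + ks
proportion = sum(low <= x <= high for x in data) / n
chebyshev_bound = 1 - 1 / k ** 2                  # 0.75

print(proportion, chebyshev_bound, proportion >= chebyshev_bound)
```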
Empirical Rule
The empirical rule is only valid for bell-shaped (normal) distributions. The following statements are true.

- Approximately 68% of the data values fall within one standard deviation of the mean.
- Approximately 95% of the data values fall within two standard deviations of the mean.
- Approximately 99.7% of the data values fall within three standard deviations of the mean.
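These three percentages come from the normal distribution itself; they can be reproduced with the standard normal CDF via math.erf:

```python
import math

# Exact proportion of a normal distribution within k standard deviations:
# P(|Z| <= k) = erf(k / sqrt(2))
for k in (1, 2, 3):
    p = math.erf(k / math.sqrt(2))
    print(k, round(p, 4))
# prints: 1 0.6827, then 2 0.9545, then 3 0.9973
```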