MTH302 Business Mathematics & Statistics GDB 1 Solution Spring 2014

Since both variance and standard deviation are used to measure variability in a data set, discuss which one is more reliable to apply. Justify your answer with at least two appropriate examples.


The variance of a data set measures the dispersion of the data relative to the mean. Although this value is theoretically correct, it is difficult to interpret in a real-world sense because the deviations used to calculate it are squared, so the variance is expressed in squared units. The standard deviation, as the square root of the variance, gives a value in the same units as the original data, which makes it much easier to work with and to interpret, for example in conjunction with the normal curve.
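
As a minimal illustration of this point, the short Python sketch below computes both measures for a small, hypothetical set of monthly sales figures (in dollars); the data values and variable names are assumptions added here, not part of the original solution.

import statistics

sales = [120, 135, 150, 110, 145]        # hypothetical monthly sales, in dollars

variance = statistics.pvariance(sales)   # population variance, in dollars squared
std_dev = statistics.pstdev(sales)       # population standard deviation, in dollars

print(f"Variance:           {variance:.2f} (dollars squared)")
print(f"Standard deviation: {std_dev:.2f} (dollars)")
print(f"Square root of variance equals standard deviation: {abs(variance ** 0.5 - std_dev) < 1e-9}")

Because the standard deviation comes out in dollars rather than dollars squared, it can be compared directly with the original sales figures.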

Standard deviation is a statistical measure of how much the values in a series vary from their average. It tells those interpreting the data how reliable the data is, or how much difference there is between the individual values, by showing how closely all of the data clusters around the average.

Two examples (illustrated by the short sketch below):
1 : A low standard deviation means that the data lies very close to the average, and is thus very reliable.
2 : A high standard deviation means that there is a large spread between the data values and the average, and the average is thus not as reliable a summary.
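
As a minimal sketch of these two cases, the hypothetical data sets below share the same mean (50) but have very different spreads; the values are assumptions chosen only for illustration.

import statistics

consistent = [48, 50, 52, 49, 51]   # clusters tightly around the mean
erratic = [20, 80, 35, 95, 20]      # scattered far from the mean

for name, data in (("consistent", consistent), ("erratic", erratic)):
    mean = statistics.mean(data)
    sd = statistics.pstdev(data)
    print(f"{name:>10}: mean = {mean:.1f}, standard deviation = {sd:.2f}")

# The consistent set has a low standard deviation (about 1.41), so its mean
# describes the data well; the erratic set has a high standard deviation
# (about 31.46), so its mean is a much less reliable summary.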

