I’m currently attending the *An Introduction to Operations Management* course on Coursera.org. At some point in the course, the professor started to deal with variation in operations’ time. The measurement he uses is the Coefficient of Variation, the ratio between the standard deviation and the mean:

$c_v = \frac{\sigma}{\mu}$

Why would this measurement be used? What are the advantages and disadvantages of working with the CV rather than working with, say, the standard deviation? What is the intuition behind this measurement?

**Answer**

I think of it as a relative measure of spread or variability in the data. The statement “The standard deviation is 2.4” really tells you nothing without reference to the mean (and thus the unit of measure, I suppose). If the mean is 104, a standard deviation of 2.4 communicates quite a different picture of spread than if the mean were 25,452 with the same standard deviation of 2.4.
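A minimal sketch of this point, using hypothetical data made up for illustration: two samples share the same standard deviation, but the one with the much larger mean has a far smaller coefficient of variation.

```python
import statistics

def coefficient_of_variation(data):
    """Ratio of the (population) standard deviation to the mean."""
    return statistics.pstdev(data) / statistics.fmean(data)

# Hypothetical samples with identical spread but very different means:
small_mean = [101, 104, 107]           # mean 104
large_mean = [25449, 25452, 25455]     # mean 25452

print(coefficient_of_variation(small_mean))  # noticeably larger CV
print(coefficient_of_variation(large_mean))  # CV close to zero
```

Both samples have a standard deviation of about 2.45, yet the CVs differ by two orders of magnitude, which matches the intuition that 2.4 units of spread matters much more around 104 than around 25,452.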

For the same reason you normalize data (subtract the mean and divide by the standard deviation) to place data expressed in different units on a comparable footing, this measure of variability is normalized to aid in comparisons.
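The normalization mentioned here can be sketched as follows; the variable names and data are assumptions for illustration.

```python
import statistics

def standardize(data):
    """Z-score normalization: subtract the mean, divide by the standard deviation."""
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return [(x - mu) / sigma for x in data]

# Quantities in different units, placed on an equal footing:
heights_cm = [160.0, 170.0, 180.0]
weights_kg = [60.0, 70.0, 95.0]

print(standardize(heights_cm))
print(standardize(weights_kg))
```

After standardizing, each series has mean 0 and standard deviation 1, so values from the two series can be compared directly despite their different original units.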

**Attribution**
*Source : Link , Question Author : Lucas Reis , Answer Author : gung – Reinstate Monica*