Why is Covariance Useful?

There are a number of topics related to covariance on this site. What I am having trouble grasping: why is covariance a useful thing to calculate?

As far as I can see, covariance is not a helpful statistic on its own: it is hard to interpret and, unlike correlation, it is not standardized. Since it can be calculated on two variables measured in totally different units, its magnitude carries little meaning by itself.

Does anyone have an example that could help elucidate the necessity of calculating covariance? Is it simply a means to an end in calculating parameters for regression?


A covariance matrix contains more information than a correlation matrix:

  • You can derive a correlation matrix from a covariance matrix.
  • But you cannot derive a covariance matrix using only a correlation matrix! (You also would need the standard deviations.)

Covariance matrices contain all the information of: (i) a correlation matrix plus (ii) a standard deviation vector. In some sense, covariance matrices are the more compact, mathematically convenient object to work with.
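The relationship is easy to verify numerically. Here is a minimal sketch (the covariance matrix is made up for illustration) showing that the correlation matrix follows from the covariance matrix alone, while going the other way also requires the standard deviation vector:

```python
import numpy as np

# Hypothetical 3x3 covariance matrix (invented for illustration)
cov = np.array([
    [4.0,  1.2,  0.6],
    [1.2,  9.0, -1.5],
    [0.6, -1.5,  1.0],
])

# Standard deviations are the square roots of the diagonal entries.
sd = np.sqrt(np.diag(cov))

# Correlation matrix: divide each entry by the product of the
# corresponding standard deviations. No extra information needed.
corr = cov / np.outer(sd, sd)

# Reconstructing the covariance matrix requires BOTH the correlation
# matrix and the standard deviation vector.
cov_reconstructed = corr * np.outer(sd, sd)

assert np.allclose(cov, cov_reconstructed)
```

Dropping `sd` and keeping only `corr` loses the scale of each variable, which is exactly the information a correlation matrix discards.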

Another example using covariance:

I’ll bring up a simple finance example that doesn’t obviously involve regression:

  • Let there be n possible investment assets.
  • Let Σ be the covariance matrix for the n assets.
  • Let w be a vector denoting portfolio weights on the n assets.

Then the portfolio variance is given by the matrix equation:

$$\sigma_p^2 = w^\top \Sigma w$$

You can’t write this formula this succinctly using a correlation matrix.
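The portfolio-variance formula can be checked in a couple of lines. This sketch uses a made-up covariance matrix of returns for three assets:

```python
import numpy as np

# Hypothetical covariance matrix of returns for n = 3 assets
Sigma = np.array([
    [0.040, 0.006, 0.002],
    [0.006, 0.090, 0.012],
    [0.002, 0.012, 0.010],
])

# Portfolio weights (sum to 1)
w = np.array([0.5, 0.3, 0.2])

# Portfolio variance: w' Sigma w
port_var = w @ Sigma @ w
port_sd = np.sqrt(port_var)
```

Note that the off-diagonal covariances enter the variance directly: if the assets were uncorrelated, `port_var` would just be the weighted sum of the individual variances.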

A portfolio that minimizes variance would be a solution to:

$$\min_{w} \; w^\top \Sigma w \quad \text{subject to} \quad w^\top \mathbf{1} = 1$$

Note this would be the same as minimizing the standard deviation of portfolio returns.
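When short sales are allowed, this constrained problem has the well-known closed-form solution $w^* = \Sigma^{-1}\mathbf{1} / (\mathbf{1}^\top \Sigma^{-1} \mathbf{1})$. A minimal sketch, reusing a made-up covariance matrix:

```python
import numpy as np

# Hypothetical covariance matrix of returns for n = 3 assets
Sigma = np.array([
    [0.040, 0.006, 0.002],
    [0.006, 0.090, 0.012],
    [0.002, 0.012, 0.010],
])

ones = np.ones(3)

# Closed-form minimum-variance weights (short sales allowed):
# w* = Sigma^{-1} 1 / (1' Sigma^{-1} 1)
w_star = np.linalg.solve(Sigma, ones)
w_star /= w_star.sum()

min_var = w_star @ Sigma @ w_star
```

Any other fully-invested portfolio, such as equal weights, has variance at least `min_var`, which is easy to check numerically.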

Covariance turns out to be a rather ubiquitous concept for any problem involving two or more random variables. It comes up all over the place. Better start getting used to it!

Source: Link, Question Author: ST21, Answer Author: Matthew Gunn
