What is the difference between $R^2$ and variance score in Scikit-learn?

I was reading about regression metrics in the Python scikit-learn manual, and even though each metric has its own formula, I cannot tell intuitively what the difference is between $R^2$ and the explained variance score, and therefore when to use one or the other to evaluate my models.


  1. $R^2 = 1 - \frac{SSE}{TSS}$, where $SSE$ is the sum of squared errors and $TSS$ is the total sum of squares.

  2. $\text{explained variance score} = 1 - \mathrm{Var}[\hat{y} - y]\, /\, \mathrm{Var}[y]$, where $\mathrm{Var}$ is the biased variance, i.e. $\mathrm{Var}[\hat{y} - y] = \frac{1}{n}\sum(\text{error} - \text{mean}(\text{error}))^2$. Compared with $R^2$, the only difference comes from $\text{mean}(\text{error})$: if $\text{mean}(\text{error}) = 0$, then $R^2$ and the explained variance score are identical.

  3. Also note that adjusted $R^2$ uses the unbiased variance estimate.
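To make point 2 concrete, here is a minimal NumPy sketch of both formulas (the helper functions below are my own reimplementations, mirroring what scikit-learn's `r2_score` and `explained_variance_score` compute). Predictions with a constant bias leave $\mathrm{Var}[\hat{y} - y] = 0$ but a nonzero mean error, so the two scores diverge:

```python
import numpy as np

def r2(y_true, y_pred):
    # R^2 = 1 - SSE / TSS
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    sse = np.sum((y_true - y_pred) ** 2)
    tss = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - sse / tss

def explained_variance(y_true, y_pred):
    # 1 - Var[y_hat - y] / Var[y], with the biased (divide-by-n) variance
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    error = y_pred - y_true
    return 1.0 - np.var(error) / np.var(y_true)

# Predictions shifted by a constant +1: every residual is -1, so
# Var[error] = 0 but mean(error) != 0, and the scores disagree.
y_true = [1, 2, 3, 4]
y_pred = [2, 3, 4, 5]

print(r2(y_true, y_pred))                  # 0.2
print(explained_variance(y_true, y_pred))  # 1.0
```

An explained variance of 1.0 alongside an $R^2$ of 0.2 shows that the explained variance score forgives a systematic bias in the predictions, while $R^2$ penalizes it.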

Source: Link, Question Author: hipoglucido, Answer Author: SultanOrazbayev