I was reading about regression metrics in the Python scikit-learn manual, and even though each one has its own formula, I cannot tell intuitively what the difference is between $R^2$ and the explained variance score, and therefore when to use one or the other to evaluate my models.
Answer

$R^2 = 1 - \frac{SSE}{TSS}$

$\text{explained variance score} = 1 - \frac{\mathrm{Var}[\hat{y} - y]}{\mathrm{Var}[y]}$, where $\mathrm{Var}$ is the biased variance, i.e. $\mathrm{Var}[\hat{y} - y] = \frac{1}{n}\sum\bigl(\mathrm{error} - \mathrm{mean}(\mathrm{error})\bigr)^2$. Compared with $R^2$, the only difference comes from the $\mathrm{mean}(\mathrm{error})$ term: if $\mathrm{mean}(\mathrm{error}) = 0$, then $R^2$ equals the explained variance score.
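To make the difference concrete, here is a minimal sketch (not the scikit-learn source) that computes both metrics by hand. With a systematically biased predictor, the explained variance score absorbs the offset into $\mathrm{mean}(\mathrm{error})$ while $R^2$ penalises it; the numbers and helper names below are illustrative only.

```python
def r2_score(y, y_pred):
    """R^2 = 1 - SSE / TSS."""
    mean_y = sum(y) / len(y)
    sse = sum((yt - yp) ** 2 for yt, yp in zip(y, y_pred))
    tss = sum((yt - mean_y) ** 2 for yt in y)
    return 1 - sse / tss

def explained_variance(y, y_pred):
    """1 - Var[y_pred - y] / Var[y], using the biased (1/n) variance."""
    errors = [yp - yt for yt, yp in zip(y, y_pred)]
    mean_e = sum(errors) / len(errors)
    var_e = sum((e - mean_e) ** 2 for e in errors) / len(errors)
    mean_y = sum(y) / len(y)
    var_y = sum((yt - mean_y) ** 2 for yt in y) / len(y)
    return 1 - var_e / var_y

y = [1.0, 2.0, 3.0, 4.0]
biased = [1.5, 2.5, 3.5, 4.5]  # every prediction off by +0.5, so mean(error) != 0

print(r2_score(y, biased))            # 0.8: the systematic offset is penalised
print(explained_variance(y, biased))  # 1.0: the offset is absorbed by mean(error)
```

With predictions whose residuals average to zero, the two functions return the same value, matching the equality stated above.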

Also note that the adjusted $R^2$ uses an unbiased variance estimate.
Attribution
Source: Link, Question Author: hipoglucido, Answer Author: SultanOrazbayev