# Should partial $R^2$ add up to total $R^2$ in multiple regression?

Following is a model created from mtcars dataset:

```r
> mod <- ols(mpg ~ wt + am + qsec, data = mtcars)
> mod

Linear Regression Model

ols(formula = mpg ~ wt + am + qsec, data = mtcars)

                Model Likelihood     Discrimination
                   Ratio Test           Indexes
Obs       32    LR chi2     60.64    R2       0.850
sigma 2.4588    d.f.            3    R2 adj   0.834
d.f.      28    Pr(> chi2) 0.0000    g        6.456

Residuals

    Min      1Q  Median      3Q     Max
-3.4811 -1.5555 -0.7257  1.4110  4.6610

          Coef    S.E.   t     Pr(>|t|)
Intercept  9.6178 6.9596  1.38 0.1779
wt        -3.9165 0.7112 -5.51 <0.0001
am         2.9358 1.4109  2.08 0.0467
qsec       1.2259 0.2887  4.25 0.0002
```


The model seems good, with a total $R^2$ of 0.85. However, the partial $R^2$ values shown in the following plot do not add up to this value; they sum to approximately 0.28.

```r
> plot(anova(mod), what = 'partial R2')
```


Is there any relation between the sum of all partial $R^2$ values and the total $R^2$? The analysis was done with the rms package.

## No.

One way to understand partial $R^2$ for a given predictor is that it equals the $R^2$ you would get if you first regressed the dependent variable on all other predictors, took the residuals, and regressed those on the remaining predictor (itself residualized on the other predictors, per the Frisch–Waugh–Lovell theorem).
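This equivalence can be checked numerically. The sketch below uses synthetic data and NumPy (not the mtcars data or the rms package; the helper names `r2` and `resid` are just illustrative) and compares the sums-of-squares definition of partial $R^2$ with the $R^2$ of regressing the residualized response on the residualized predictor:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# two correlated predictors plus noise (synthetic stand-in for wt, qsec, ...)
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

def resid(y, X):
    """Residuals of an OLS fit of y on X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

def r2(y, X):
    """R^2 of an OLS fit of y on X (intercept added)."""
    e = resid(y, X)
    return 1 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))

# Partial R^2 of x2 via sums of squared errors:
sse_full = np.sum(resid(y, np.column_stack([x1, x2])) ** 2)
sse_red = np.sum(resid(y, x1) ** 2)
partial_sse = (sse_red - sse_full) / sse_red

# Same quantity via double residualization (Frisch-Waugh-Lovell):
partial_fwl = r2(resid(y, x1), resid(x2, x1))

print(partial_sse, partial_fwl)  # the two values agree
```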

So if, for example, all predictors are perfectly identical (collinear), one can have a decent $R^2$, but the partial $R^2$ of every predictor will be exactly zero, because any single predictor has zero additional explanatory power.
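A quick numerical check of this degenerate case, again with hypothetical synthetic data (a predictor duplicated verbatim):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1.copy()                     # perfectly collinear duplicate of x1
y = x1 + 0.1 * rng.normal(size=n)  # y is almost fully explained by x1

def sse(y, X):
    """Sum of squared residuals of an OLS fit of y on X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return e @ e

sst = np.sum((y - y.mean()) ** 2)
full = sse(y, np.column_stack([x1, x2]))
total_r2 = 1 - full / sst                 # close to 1
partial_x2 = (sse(y, x1) - full) / sse(y, x1)  # zero: x2 adds nothing

print(total_r2, partial_x2)
```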

On the other hand, if all predictors together explain the dependent variable perfectly, i.e. $R^2 = 1$, then the partial $R^2$ for each predictor will be $1$ too, because whatever is unexplained by all other predictors can be perfectly explained by the remaining one.
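The noise-free case can likewise be verified with a small synthetic sketch, where $R^2 = 1$ and each predictor's partial $R^2$ comes out as $1$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = x1 + x2  # no noise: the two predictors explain y perfectly, R^2 = 1

def sse(y, X):
    """Sum of squared residuals of an OLS fit of y on X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return e @ e

full = sse(y, np.column_stack([x1, x2]))  # essentially zero
partial_x1 = (sse(y, x2) - full) / sse(y, x2)
partial_x2 = (sse(y, x1) - full) / sse(y, x1)

print(partial_x1, partial_x2)  # both equal 1 (up to floating-point error)
```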

So the sum of all partial $R^2$ values can easily fall below or above the total $R^2$; they do not have to coincide even if all predictors are orthogonal. Partial $R^2$ is a bit of a weird measure.
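Even with (nearly) orthogonal predictors the sum of the partial $R^2$ values can exceed the total $R^2$, as this synthetic sketch illustrates: each partial $R^2$ is computed against only the variance the *other* predictor leaves unexplained, so its denominator shrinks.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)  # independent of x1, so (nearly) orthogonal
y = x1 + x2 + rng.normal(size=n)

def sse(y, X):
    """Sum of squared residuals of an OLS fit of y on X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return e @ e

sst = np.sum((y - y.mean()) ** 2)
full = sse(y, np.column_stack([x1, x2]))
total_r2 = 1 - full / sst
partials = [(sse(y, other) - full) / sse(y, other) for other in (x2, x1)]

# total R^2 is about 2/3 here, while the partial R^2 values sum to about 1
print(total_r2, sum(partials))
```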

See this long thread for many more details: Importance of predictors in multiple regression: Partial $R^2$ vs. standardized coefficients.