Interpreting logistic regression coefficients with a regularization term

I understand that the coefficients of a logistic regression can be interpreted as odds ratios. If a regularization term is added to control over-fitting, how does this change the interpretation of the coefficients?

Answer

The coefficients returned by a standard logistic regression fit are not odds ratios. Each represents the change in the log odds of 'success' associated with a one-unit change in its respective variable, all else held equal. If you exponentiate a coefficient, you can interpret the result as an odds ratio (this is not true of the intercept, of course). More on this can be found in my answer here: Interpretation of simple predictions to odds ratios in logistic regression.
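To make this concrete, here is a minimal sketch (not from the original answer; the intercept and slope values are hypothetical) showing that the ratio of the odds at $x_1 + 1$ versus $x_1$ equals $\exp(\hat\beta_1)$, no matter what value of $x_1$ you start from:

```python
import numpy as np

# Hypothetical fitted model: log odds of success = -1.0 + 0.7 * x1
beta0, beta1 = -1.0, 0.7

def odds(x1):
    """Odds of success at a given value of x1 (exp of the log odds)."""
    return np.exp(beta0 + beta1 * x1)

# A 1-unit increase in x1 multiplies the odds by exp(beta1),
# regardless of the starting value of x1:
print(odds(3.0) / odds(2.0))  # equals exp(0.7)
print(np.exp(beta1))          # the odds ratio itself
```

The coefficient $\beta_1 = 0.7$ is the change in log odds; only after exponentiating does it become an odds ratio (here, roughly a doubling of the odds per unit of $x_1$).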

Adding a penalty to the model fit will (potentially) change the fitted values of the estimated coefficients, but it does not change the interpretation of the coefficients in the sense discussed in your question / above.*

* (I wonder if confusion about this statement is the origin of the recent downvote.) To be clearer: the fitted coefficient on $X_1$, $\hat\beta_1$, represents the change in the log odds of success associated with a 1-unit change in $X_1$, whether or not a penalty term was used to fit the model. In neither case is it an odds ratio. However, $\exp(\hat\beta_1)$ is the odds ratio associated with a 1-unit change in $X_1$, again irrespective of whether a penalty term was used. A model fitted with a penalty term can be interpreted within a Bayesian framework, but it doesn't have to be. Moreover, even if it is, $\hat\beta_1$ still represents the change in the log odds of success associated with a 1-unit change in $X_1$, not an odds ratio.

Attribution
Source : Link , Question Author : Jim T , Answer Author : Community
