# Interpreting logistic regression coefficients with a regularization term

I understand that the coefficients of a logistic regression can be interpreted as odds ratios (after exponentiation). If a regularization term is added to control over-fitting, how does this change the interpretation of the coefficients?

* (I wonder if confusion about this statement is the origin of the recent downvote.) To be clearer: the fitted coefficient on $X_1$, $\hat\beta_1$, represents the change in the log odds of success associated with a 1-unit change in $X_1$, whether or not a penalty term is used in fitting the model. In neither case is it the odds ratio. However, $\exp(\hat\beta_1)$ is the odds ratio associated with a 1-unit change in $X_1$, again irrespective of whether a penalty term was used. A model fitted with a penalty term can be interpreted within a Bayesian framework, but it doesn't have to be. And even under that interpretation, $\hat\beta_1$ still represents the change in the log odds of success associated with a 1-unit change in $X_1$, not an odds ratio.
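A small numerical sketch may make the point concrete (pure Python; the function name `fit_logistic` and the penalty strength are my own illustrative choices, not from any particular library). An L2 penalty shrinks $\hat\beta_1$ toward 0, and hence the odds ratio $\exp(\hat\beta_1)$ toward 1, but in both the penalized and unpenalized fits the coefficient is read the same way: as a change in log odds, with $\exp(\hat\beta_1)$ as the odds ratio.

```python
import math
import random

def fit_logistic(xs, ys, l2=0.0, lr=0.1, steps=5000):
    """Fit an intercept and one slope by gradient descent on the
    (optionally L2-penalized) negative log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y
            g1 += (p - y) * x
        # Penalize the slope only; intercepts are conventionally unpenalized.
        g1 += l2 * b1
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(500)]
# Simulate with true log odds -0.5 + 1.0 * x, so the true slope is 1.0
# and the true odds ratio for a 1-unit change is e^1 ≈ 2.72.
ys = [1 if random.random() < 1.0 / (1.0 + math.exp(-(-0.5 + 1.0 * x))) else 0
      for x in xs]

_, b1_mle = fit_logistic(xs, ys, l2=0.0)   # ordinary maximum likelihood
_, b1_pen = fit_logistic(xs, ys, l2=50.0)  # heavily penalized fit

print(f"unpenalized: beta1 = {b1_mle:.3f}, odds ratio = {math.exp(b1_mle):.3f}")
print(f"penalized:   beta1 = {b1_pen:.3f}, odds ratio = {math.exp(b1_pen):.3f}")
```

The penalized slope is smaller in magnitude (its odds ratio is pulled toward 1), reflecting the bias the penalty deliberately introduces, but the *interpretation* of each estimate is unchanged.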