# Back-transformation of regression coefficients

I’m fitting a linear regression with a transformed dependent variable. The transformation was chosen so that the assumption of normally distributed residuals would hold: the untransformed dependent variable was negatively skewed, and the following transform brought it close to normal:

where $Y_{orig}$ is the dependent variable on the original scale.
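The exact transform didn’t survive in the post, but the idea can be sketched with a common choice for negative skew, a reflect-then-square-root transform. Everything here (the simulated outcome, the reflection constant) is my own illustration, not the questioner’s data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical negatively skewed outcome: a reflected exponential
y_orig = 10.0 - rng.exponential(1.0, 100_000)

# Reflect-then-sqrt transform (one common fix for negative skew;
# the actual transform used in the question is assumed, not given)
y = np.sqrt(y_orig.max() + 1.0 - y_orig)

def skew(v):
    """Sample skewness: third standardized central moment."""
    d = v - v.mean()
    return (d**3).mean() / d.std()**3

print(skew(y_orig), skew(y))  # skewness shrinks markedly after the transform
```

Reflecting first turns the negative skew into positive skew, which the concave square root then pulls in toward symmetry.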

I think it makes sense to apply some transformation to the $\beta$ coefficients to work our way back to the original scale. Using the following regression equation,

and by fixing $X=0$, we have

And finally,

Using the same logic, I found

This works very well for a model with one or two predictors: the back-transformed coefficients resemble the original ones, and now I can trust the standard errors. The problem comes when I include an interaction term, such as

Then the back-transformed $\beta$s are not so close to the ones from the original scale, and I’m not sure why that happens. I’m also unsure whether the formula I found for back-transforming a coefficient is usable as is for the third $\beta$ (the one for the interaction term). Before going into crazy algebra, I thought I’d ask for advice…
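To make the discrepancy concrete, here is a minimal simulation. The transform (a reflected square root, $Y=\sqrt{C-Y_{orig}}$), the coefficient values, and the constant $C$ are all assumptions for illustration, since the actual transform isn’t shown above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Assumed setup: the model is linear with normal errors on the
# TRANSFORMED scale,
#   Y = a + b*X + eps,  with  Y = sqrt(C - Y_orig)  <=>  Y_orig = C - Y**2
a, b, sigma, C = 3.0, 0.5, 1.0, 50.0
X = rng.normal(0.0, 1.0, n)
eps = rng.normal(0.0, sigma, n)
Y = a + b * X + eps
Y_orig = C - Y**2

# Naive algebraic back-transform of the intercept: set X = 0 so that
# Y "equals" a, giving C - a**2 on the original scale.
naive = C - a**2

# What the data actually say at X ~ 0: E[Y_orig | X=0] = C - a**2 - sigma**2,
# because E[(a + eps)**2] = a**2 + sigma**2. The error term shifts the mean.
empirical = Y_orig[np.abs(X) < 0.02].mean()

print(naive, empirical)  # they differ by roughly sigma**2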

Your derivation treats the regression equation as a simple deterministic (i.e. non-random) model. In that case you could indeed back-transform the coefficients to the original scale, since it’s just a matter of simple algebra. But in an actual regression you only have $E(Y \mid X) = \alpha + \beta X$; you’ve left the error term out of your model. If the transformation from $Y$ back to $Y_{orig}$ is non-linear, you may have a problem, since in general $E\big(f(X)\big) \neq f\big(E(X)\big)$ (for a convex or concave $f$ this is Jensen’s inequality). I think that may explain the discrepancy you’re seeing.
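The point that $E\big(f(X)\big) \neq f\big(E(X)\big)$ is easy to check numerically. A quick sketch with $f(t)=t^2$ (the choice of $f$, $\mu$, and $\sigma$ is mine, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Z ~ N(mu, sigma): for the nonlinear f(t) = t**2,
#   E[f(Z)] = mu**2 + sigma**2,  while  f(E[Z]) = mu**2
mu, sigma = 2.0, 1.0
z = rng.normal(mu, sigma, 1_000_000)

E_f_z = (z**2).mean()   # close to mu**2 + sigma**2 = 5
f_E_z = z.mean()**2     # close to mu**2 = 4
gap = E_f_z - f_E_z     # about sigma**2; it does not vanish

print(E_f_z, f_E_z, gap)
```

The gap is exactly the variance of $Z$ here, which is why back-transforming coefficients algebraically (i.e. pretending the model is deterministic) leaves a bias that depends on the error variance.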