An unbiased estimator of the ratio of two regression coefficients?

Suppose you fit a linear/logistic regression $g(y) = a_0 + a_1\cdot x_1 + a_2\cdot x_2$, with the aim of obtaining an unbiased estimate of $\frac{a_1}{a_2}$. You are very confident that both $a_1$ and $a_2$ are well separated from zero relative to the noise in their estimates.

If you have the joint covariance of $a_1, a_2$, you could calculate, or at least simulate, the answer. Are there any better ways? And in real-life problems with a lot of data, how much trouble do you get into by taking the ratio of the estimates, or by taking the half-step of assuming the coefficients are independent?
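A minimal sketch of the simulation route mentioned above, on synthetic data (the dataset, coefficient values, and sample sizes here are placeholders, not from the original post): fit the regression, pull out the joint covariance of $(a_1, a_2)$, and draw from the approximate sampling distribution to see how the ratio behaves.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic data: both coefficients well separated from zero.
n = 5000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 4.0 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()

a1_hat, a2_hat = fit.params[1], fit.params[2]
cov_12 = fit.cov_params()[1:3, 1:3]  # joint covariance of (a1, a2)

# Draw from the approximate sampling distribution of (a1, a2) and form ratios.
draws = rng.multivariate_normal([a1_hat, a2_hat], cov_12, size=100_000)
ratios = draws[:, 0] / draws[:, 1]

print("plug-in ratio:", a1_hat / a2_hat)
print("simulated mean ratio:", ratios.mean())
print("95% interval:", np.percentile(ratios, [2.5, 97.5]))
```

With both coefficients far from zero the simulated distribution of the ratio is close to symmetric and the plug-in ratio and simulated mean nearly coincide; the bias question only becomes painful when the denominator's estimate has appreciable mass near zero.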

Answer

I would suggest doing error propagation on the ratio itself and minimizing either its error or its relative error. For example, from Strategies for Variance Estimation or Wikipedia:

$f = \frac{A}{B}\,$
$\sigma_f^2 \approx f^2 \left[\left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 - 2\frac{\sigma_{AB}}{AB} \right]$

$\sigma_f \approx \left| f \right| \sqrt{ \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 - 2\frac{\sigma_{AB}}{AB} }$
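A minimal numerical sketch of the formula above, with hypothetical values standing in for the coefficient estimates and their covariance:

```python
import numpy as np

# Hypothetical estimates of a1, a2 and their (co)variances.
A, B = 2.01, 3.97
var_A, var_B = 0.04**2, 0.05**2
cov_AB = 0.0008  # off-diagonal term of the coefficient covariance matrix

f = A / B
rel_var = var_A / A**2 + var_B / B**2 - 2.0 * cov_AB / (A * B)
sigma_f = abs(f) * np.sqrt(rel_var)

print("ratio:", f)
print("approximate standard error:", sigma_f)
```

Note that $\sigma_A^2$, $\sigma_B^2$, and $\sigma_{AB}$ come straight from the fitted model's coefficient covariance matrix; ignoring $\sigma_{AB}$ is exactly the "assume the coefficients are independent" half-step from the question.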

As a guess, you probably want to minimize $\left(\frac{\sigma_f}{f}\right)^2$. It is important to understand that when one does regression to find a best value of a derived parameter, one has forsaken goodness of fit in the usual sense: a fit process that finds the best $\frac{A}{B}$ is not, in general, the same as minimizing residuals. This has been done before by taking logarithms of a non-linear fit equation, to which multiple linear regression was then applied with a different parameter target and Tikhonov regularization. A sketch of one way to make the ratio a direct fit parameter follows.
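One concrete way to "ask the data for the ratio directly" is to reparameterize the model so that $r = \frac{a_1}{a_2}$ is itself a fit parameter. The sketch below uses scipy's `curve_fit` on synthetic data; it is an illustration of the reparameterization idea, not the log-transform / Tikhonov approach the answer alludes to.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
n = 5000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 4.0 * x2 + rng.normal(size=n)

def model(X, a0, a2, r):
    # a1 is replaced by r * a2, so r = a1 / a2 is estimated directly.
    x1, x2 = X
    return a0 + a2 * (r * x1 + x2)

popt, pcov = curve_fit(model, (x1, x2), y, p0=[0.0, 1.0, 1.0])
r_hat, r_se = popt[2], np.sqrt(pcov[2, 2])
print("estimated ratio r:", r_hat, "s.e.:", r_se)
```

Because $r$ is now an explicit parameter, its standard error falls out of the fit's covariance matrix, and one could go further and choose the loss so that the relative error of $r$ is what gets minimized.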

The moral of this story is that unless one asks the data to yield the answer that one desires, one will not obtain that answer. A regression that does not specify the desired answer as its minimization target will not answer the question.

Attribution
Source: Link, Question Author: quasi, Answer Author: Carl
