I have a linear regression with two coefficient estimates, say A and B, along with their standard errors. I need to find the standard error of the ratio A/B [or A/(1−B)].

I guess the main problem is that I don't know the correlation between A and B, and I suspect that makes the problem unsolvable, since the correlation plays a major role. Is this correct?

If I knew it, what would be the way to calculate the standard error?

**Answer**

1) By the delta method (a first-order Taylor expansion), the variance of a ratio is approximately:

$\operatorname{Var}\!\left(\frac{x}{y}\right) \approx \left(\frac{E(x)}{E(y)}\right)^2 \left(\frac{\operatorname{Var}(x)}{E(x)^2} + \frac{\operatorname{Var}(y)}{E(y)^2} - 2\,\frac{\operatorname{Cov}(x,y)}{E(x)\,E(y)}\right)$
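For the second form in your question, A/(1−B), the same expansion applies with the denominator taken as 1−B; since $\operatorname{Var}(1-B) = \operatorname{Var}(B)$ and $\operatorname{Cov}(A, 1-B) = -\operatorname{Cov}(A, B)$, the covariance term flips sign:

$\operatorname{Var}\!\left(\frac{A}{1-B}\right) \approx \left(\frac{E(A)}{1-E(B)}\right)^2 \left(\frac{\operatorname{Var}(A)}{E(A)^2} + \frac{\operatorname{Var}(B)}{(1-E(B))^2} + 2\,\frac{\operatorname{Cov}(A,B)}{E(A)\,(1-E(B))}\right)$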

You might want to look at the answers to this question for more information.

Most regression packages provide at least the option to print the estimated covariance matrix of the parameter estimates, so you should be able to obtain the covariance term from there.
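Once you have the covariance matrix, the delta-method standard error is a short computation. Here is a minimal sketch with made-up numbers for the estimates and covariance matrix (purely illustrative); it uses the general gradient form, which reduces to the formula above:

```python
import numpy as np

# Hypothetical estimates and their covariance matrix (illustrative only).
A, B = 2.0, 5.0
cov = np.array([[0.04, 0.01],   # Var(A),    Cov(A, B)
                [0.01, 0.09]])  # Cov(A, B), Var(B)

# Delta method: for g(a, b) = a/b the gradient is (1/b, -a/b^2),
# and Var(g) ~ grad' * Cov * grad.
grad = np.array([1.0 / B, -A / B**2])
var_ratio = grad @ cov @ grad
se_ratio = np.sqrt(var_ratio)

# Sanity check against the expanded formula from the answer.
var_formula = (A / B)**2 * (cov[0, 0] / A**2 + cov[1, 1] / B**2
                            - 2 * cov[0, 1] / (A * B))
```

For A/(1−B) you would use the gradient (1/(1−B), A/(1−B)²) instead.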

2) However, the bootstrap may give you more accurate confidence intervals, especially if your denominator variable is not many standard errors away from zero.
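A case (pairs) bootstrap resamples rows of the data, refits the regression, and recomputes the ratio each time; the percentiles of the bootstrap distribution then give a confidence interval that does not rely on the Taylor approximation. A minimal sketch on simulated data (the data-generating process and sample size are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y = 2*x1 + 5*x2 + noise, so the true ratio A/B = 0.4.
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2.0 * x1 + 5.0 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2])

def ratio(X, y):
    # OLS fit via least squares, then the ratio of the two coefficients.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0] / coef[1]

# Case bootstrap: resample rows with replacement, refit, recompute the ratio.
boot = np.array([ratio(X[idx], y[idx])
                 for idx in (rng.integers(0, n, n) for _ in range(2000))])

# Percentile confidence interval for A/B.
lo, hi = np.percentile(boot, [2.5, 97.5])
```

With 2000 resamples this runs in well under a second for a problem of this size; for a real model you would replace the `lstsq` call with your own fitting routine.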

**Attribution**
*Source: Link, Question Author: Ondrej, Answer Author: Community*