It is easy to produce a random variable with a Dirichlet distribution from independent Gamma variables that share the same rate parameter. If:

$ X_i \sim \text{Gamma}(\alpha_i, \beta) $

Then:

$ \left(\frac{X_1}{\sum_j X_j},\; \ldots\; , \frac{X_n}{\sum_j X_j}\right) \sim \text{Dirichlet}(\alpha_1,\;\ldots\;,\alpha_n) $
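This construction is easy to check empirically. A minimal sketch in NumPy (note that `numpy.random.Generator.gamma` is parameterized by shape and *scale*, so the scale argument is $1/\beta$):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([2.0, 3.0, 5.0])
beta = 1.5  # common rate parameter

# Draw independent Gamma(alpha_i, rate=beta) samples, one column per component.
x = rng.gamma(shape=alpha, scale=1.0 / beta, size=(100_000, 3))

# Normalizing each row should give Dirichlet(alpha) samples.
d = x / x.sum(axis=1, keepdims=True)

# The Dirichlet mean is alpha / alpha.sum() = [0.2, 0.3, 0.5]
print(d.mean(axis=0))
```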

**Problem**

What happens if the rate parameters are not equal?

$ X_i \sim \text{Gamma}(\alpha_i, \beta_i) $

Then what is the distribution of this variable?

$ \left(\frac{X_1}{\sum_j X_j},\; \ldots\; , \frac{X_n}{\sum_j X_j}\right) \sim \; ? $

For me it would be sufficient to know the expected value of this distribution.

I need an approximate closed-form algebraic formula that can be evaluated very quickly by a computer.

Let’s say an approximation with an accuracy of 0.01 is sufficient.

You can assume that:

$ \alpha_i, \beta_i \in \mathbb{N} $

**Note**

In short, the task is to find an approximation of this integral:

$ f(\vec{\alpha}, \vec{\beta}) = \int_{\mathbb{R}^n_+} \;\frac{x_1}{\sum_j x_j} \cdot \prod_j \frac{\beta_j^{\alpha_j}}{\Gamma(\alpha_j)} x_j^{\alpha_j - 1} e^{-\beta_j x_j} \;\; dx_1\ldots dx_n$
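Any candidate formula can be benchmarked against a brute-force Monte Carlo estimate of this integral. A simple reference sketch (the function name and defaults are my own; this is slow, not the fast formula being asked for):

```python
import numpy as np

def mc_expected_fraction(alpha, beta, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[X_1 / sum_j X_j] for independent
    X_j ~ Gamma(alpha_j, rate=beta_j)."""
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    # NumPy parameterizes Gamma by shape and scale = 1/rate.
    x = rng.gamma(shape=alpha, scale=1.0 / beta, size=(n_samples, len(alpha)))
    return (x[:, 0] / x.sum(axis=1)).mean()

# Equal rates reduce to the Dirichlet case, where the exact answer is 2/5.
print(mc_expected_fraction([2, 3], [1, 1]))
```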

**Answer**

Just an initial remark: if you want computational speed, you usually have to sacrifice accuracy. "More accuracy" means "more time" in general. Anyway, here is a second-order approximation, which should improve on the "crude" approximation you suggested in your comment above:

$$E\Bigg(\frac{X_{j}}{\sum_{i}X_{i}}\Bigg)\approx \frac{E[X_{j}]}{E[\sum_{i}X_{i}]} - \frac{\text{cov}[\sum_{i}X_{i},X_{j}]}{E[\sum_{i}X_{i}]^2} + \frac{E[X_{j}]}{E[\sum_{i}X_{i}]^3}\text{Var}[\sum_{i}X_{i}]$$

$$= \frac{\alpha_{j}}{\sum_{i} \frac{\beta_{j}}{\beta_{i}}\alpha_{i}}\times\Bigg[1 - \frac{1}{\sum_{i} \frac{\beta_{j}}{\beta_{i}}\alpha_{i}} + \frac{\sum_{i} \frac{\alpha_{i}}{\beta_{i}^{2}}}{\Big(\sum_{i} \frac{\alpha_{i}}{\beta_{i}}\Big)^{2}}\Bigg]$$
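The approximation above takes only a handful of vectorized operations to evaluate. A sketch in NumPy (the function name is mine; `m`, `v`, `S`, `V` denote the component means and variances and their sums):

```python
import numpy as np

def approx_mean_fraction(alpha, beta):
    """Second-order approximation of E[X_j / sum_i X_i] for independent
    X_i ~ Gamma(alpha_i, rate=beta_i), following the expansion above."""
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    m = alpha / beta       # E[X_i]
    v = alpha / beta**2    # Var[X_i]
    S = m.sum()            # E[sum_i X_i]
    V = v.sum()            # Var[sum_i X_i]
    # E[X_j]/S - cov(sum_i X_i, X_j)/S^2 + E[X_j] Var(sum_i X_i)/S^3,
    # where cov(sum_i X_i, X_j) = Var(X_j) by independence.
    return m / S - v / S**2 + m * V / S**3

print(approx_mean_fraction([2, 3], [1, 1]))  # equal rates: exactly [0.4, 0.6]
```

In the equal-rate case the two correction terms cancel exactly, recovering the exact Dirichlet mean $\alpha_j/\sum_i \alpha_i$; the components also always sum to 1, since both correction terms sum to $V/S^2$.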

**EDIT** An explanation for the above expansion was requested. The short answer is Wikipedia; the long answer is given below.

Write $f(x,y)=\frac{x}{y}$ and expand it around the point $(\mu_x, \mu_y) = (E[X], E[Y])$. We need all the second-order derivatives of $f$. The first-order terms "cancel" because they involve multiples of $X-E(X)$ and $Y-E(Y)$, which are both zero in expectation.

$$\frac{\partial^2 f}{\partial x^2}=0$$

$$\frac{\partial^2 f}{\partial x \partial y}=-\frac{1}{y^2}$$

$$\frac{\partial^2 f}{\partial y^2}=2\frac{x}{y^3}$$

And so the Taylor series up to second order (dropping the first-order terms, which vanish in expectation) is given by:

$$\frac{x}{y} \approx \frac{\mu_x}{\mu_y}+\frac{1}{2}\Bigg(-\frac{1}{\mu_y^2}2(x-\mu_x)(y-\mu_y) + 2\frac{\mu_x}{\mu_y^3}(y-\mu_y)^2 \Bigg)$$

Taking expectations yields:

$$E\Big[\frac{X}{Y}\Big] \approx \frac{\mu_x}{\mu_y}-\frac{1}{\mu_y^2}E\Big[(X-\mu_x)(Y-\mu_y)\Big] + \frac{\mu_x}{\mu_y^3}E\Big[(Y-\mu_y)^2\Big]$$

Which is the answer I gave (although I initially forgot the minus sign in the second term).
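To see how far the second-order expansion gets you on an unequal-rate example, it can be compared directly against a Monte Carlo estimate (the test parameters here are arbitrary choices of mine, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = np.array([2.0, 3.0, 4.0])
beta = np.array([1.0, 2.0, 3.0])

# Second-order approximation from the answer above.
m, v = alpha / beta, alpha / beta**2
S, V = m.sum(), v.sum()
approx = m / S - v / S**2 + m * V / S**3

# Brute-force Monte Carlo estimate of the same expectation.
x = rng.gamma(shape=alpha, scale=1.0 / beta, size=(500_000, 3))
mc = (x / x.sum(axis=1, keepdims=True)).mean(axis=0)

print(approx)  # approximation
print(mc)      # simulation; should agree to about two decimal places
```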

**Attribution**

*Source: Link, Question Author: Łukasz Lew, Answer Author: probabilityislogic*