# Suppose $Y_1, \dots, Y_n \overset{\text{iid}}{\sim} \text{Exp}(1)$. Show $\sum_{i=1}^{n}(Y_i - Y_{(1)}) \sim \text{Gamma}(n-1, 1)$

What is the easiest way to see that the following statement is true?

Suppose $Y_1, \dots, Y_n \overset{\text{iid}}{\sim} \text{Exp}(1)$.
Show $\sum_{i=1}^{n}(Y_i - Y_{(1)}) \sim \text{Gamma}(n-1, 1)$.

Note that $Y_{(1)} = \min\limits_{1 \leq i \leq n}Y_i$.

Here $X \sim \text{Exp}(\beta)$ means that $f_{X}(x) = \dfrac{1}{\beta}e^{-x/\beta} \cdot \mathbf{1}_{\{x > 0\}}$, i.e. $\beta$ is the mean.

It is easy to see that $Y_{(1)} \sim \text{Exponential}(1/n)$. Furthermore, we also have that $\sum_{i=1}^{n}Y_i \sim \text{Gamma}(\alpha = n, \beta = 1)$ under the same mean-$\beta$ parametrization.
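A quick Monte Carlo sanity check of these two facts (this is only a numerical illustration, not part of the argument; it assumes the mean parametrisation of `numpy`'s `exponential`):

```python
import numpy as np

# Sanity check: with Exp(1) draws, Y_(1) should have mean 1/n,
# and Sum Y_i ~ Gamma(n, 1) should have mean n and variance n.
rng = np.random.default_rng(0)
n, reps = 5, 200_000
Y = rng.exponential(scale=1.0, size=(reps, n))  # Exp(1) samples, mean = scale

min_mean = Y.min(axis=1).mean()   # should be close to 1/n = 0.2
sum_mean = Y.sum(axis=1).mean()   # should be close to n = 5
sum_var = Y.sum(axis=1).var()     # should be close to n = 5

print(min_mean, sum_mean, sum_var)
```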

**Solution, given Xi'an's answer.** Using the notation in the original question, first rewrite the sum in terms of the spacings of the order statistics:
$$\sum_{i=1}^{n}(Y_i - Y_{(1)}) = \sum_{i=1}^{n}(Y_{(i)} - Y_{(1)}) = \sum_{i=2}^{n}(n-i+1)\,[Y_{(i)}-Y_{(i-1)}],$$
since the spacing $Y_{(i)}-Y_{(i-1)}$ appears in exactly $n-i+1$ of the terms $Y_{(j)} - Y_{(1)}$ with $j \geq i$. By Sukhatme's theorem below, the normalised spacings $(n-i+1)[Y_{(i)}-Y_{(i-1)}]$, $i=2,\dots,n$, are i.i.d. $\text{Exp}(1)$, and summing $n-1$ of them gives $\sum_{i=2}^{n}(n-i+1)[Y_{(i)}-Y_{(i-1)}] \sim \text{Gamma}(n-1, 1)$.
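The equality between the original sum and the weighted spacing sum can be verified numerically (a deterministic check on one random sample; nothing here is part of the proof):

```python
import numpy as np

# Check: sum_i (Y_i - Y_(1)) == sum_{i=2}^n (n-i+1) * (Y_(i) - Y_(i-1)).
rng = np.random.default_rng(1)
n = 7
y = rng.exponential(size=n)
ys = np.sort(y)                    # order statistics Y_(1) <= ... <= Y_(n)

lhs = np.sum(y - ys[0])
weights = n - np.arange(2, n + 1) + 1   # n-i+1 for i = 2..n
rhs = np.sum(weights * np.diff(ys))     # spacings Y_(i) - Y_(i-1)

assert np.isclose(lhs, rhs)
```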

The proof is given in the Mother of All Random Generation Books, Devroye’s Non-uniform Random Variate Generation, on p.211 (and it is a very elegant one!):

**Theorem 2.3 (Sukhatme, 1937).** If we define $E_{(0)}=0$, then the normalised exponential spacings
$$(n-i+1)\,[E_{(i)}-E_{(i-1)}], \qquad 1 \leq i \leq n,$$
derived from the order statistics $E_{(1)}\le\ldots\le E_{(n)}$ of an i.i.d. exponential sample of size $n$
are themselves i.i.d. exponential variables.

Proof. Since the joint density of the i.i.d. sample $(E_1,\ldots,E_n)$ is $\exp\!\left(-\sum_{i=1}^{n}e_i\right)$ on $(0,\infty)^n$,

the joint density of the order statistic $(E_{(1)},\ldots,E_{(n)})$ is
$$n!\,\exp\!\left(-\sum_{i=1}^{n}e_{(i)}\right)\mathbf{1}_{\{0<e_{(1)}<\cdots<e_{(n)}\}}.$$
Setting $Y_i=(n-i+1)\,[E_{(i)}-E_{(i-1)}]$, the change of variables from $(E_{(1)},\ldots,E_{(n)})$ to $(Y_1,\ldots,Y_n)$ is linear with a constant Jacobian [incidentally equal to $1/n!$, but this does not need to be computed], and since $\sum_{i=1}^{n} e_{(i)} = \sum_{i=1}^{n} y_i$, the density of $(Y_1,\ldots,Y_n)$ is proportional to
$$\exp\!\left(-\sum_{i=1}^{n}y_i\right) \quad \text{on } (0,\infty)^n,$$
which establishes the result. Q.E.D.
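Sukhatme's theorem itself is easy to probe by simulation: each normalised spacing should behave like an $\text{Exp}(1)$ variable, so every column of the array below should have mean and variance near 1. (A moment check only, not a full test of independence.)

```python
import numpy as np

# Simulation check of Sukhatme's theorem: the normalised spacings
# (n-i+1)(E_(i) - E_(i-1)), with E_(0) = 0, should each be Exp(1).
rng = np.random.default_rng(2)
n, reps = 6, 100_000
E = np.sort(rng.exponential(size=(reps, n)), axis=1)
spacings = np.diff(E, axis=1, prepend=0.0)   # E_(i) - E_(i-1), with E_(0) = 0
normalised = spacings * (n - np.arange(n))   # column i-1 gets weight n-i+1

# Every column should have mean ~1 and variance ~1 if Exp(1).
print(normalised.mean(axis=0), normalised.var(axis=0))
```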

An alternative suggested to me by Gérard Letac is to check that $\sum_{i=1}^{n}(Y_i - Y_{(1)})$ has the same distribution as $\sum_{i=1}^{n-1}Y_i$ (by virtue of the memoryless property: conditionally on which index achieves the minimum, the $n-1$ nonzero excesses $Y_i - Y_{(1)}$ are i.i.d. $\text{Exp}(1)$), which makes the derivation of the $\text{Gamma}(n-1,1)$ distribution straightforward.
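Letac's shortcut is also simple to check numerically: the excess sum should match the moments of $\text{Gamma}(n-1,1)$, i.e. mean and variance both equal to $n-1$ (a Monte Carlo illustration, not a proof):

```python
import numpy as np

# sum_i (Y_i - Y_(1)) should be distributed as a sum of n-1 fresh
# Exp(1) draws, i.e. Gamma(n-1, 1), with mean n-1 and variance n-1.
rng = np.random.default_rng(3)
n, reps = 5, 200_000
Y = rng.exponential(size=(reps, n))
excess = (Y - Y.min(axis=1, keepdims=True)).sum(axis=1)

print(excess.mean(), excess.var())   # both should be close to n-1 = 4
```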