# Distribution of $$\frac{\sum_{i=1}^n X_iY_i}{\sum_{i=1}^n X_i^2}$$ where $$X_i,Y_i$$ are i.i.d. Normal variables

Suppose $$X_1,\ldots,X_n,Y_1,\ldots,Y_n$$ are i.i.d. $$\mathcal N(0,1)$$ random variables.

I am interested in the distribution of $$U=\frac{\sum_{i=1}^n X_iY_i}{\sum_{i=1}^n X_i^2}.$$

I define $$Z=\frac{\sum_{i=1}^n X_iY_i}{\sqrt{\sum_{i=1}^n X_i^2}}.$$

Then, $$Z\mid (X_1=x_1,\ldots,X_n=x_n)=\frac{\sum_{i=1}^n x_iY_i}{\sqrt{\sum_{i=1}^n x_i^2}}\sim \mathcal N(0,1).$$

As this conditional distribution does not depend on $$X_1,\ldots,X_n$$, the unconditional distribution must be the same. That is, I can say that $$Z\sim \mathcal N(0,1)$$.
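This claim is easy to check by simulation; here is a minimal sketch in Python (the values of $$n$$ and the sample size are arbitrary choices for the check):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 100_000                      # arbitrary choices for the check
X = rng.standard_normal((reps, n))
Y = rng.standard_normal((reps, n))

# Z = sum_i X_i Y_i / sqrt(sum_i X_i^2), computed row-wise
Z = (X * Y).sum(axis=1) / np.sqrt((X**2).sum(axis=1))

# If Z ~ N(0,1), its sample mean and standard deviation should be near 0 and 1
print(Z.mean(), Z.std())
```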

Relating $$U$$ and $$Z$$, I have $$U=\frac{Z}{\sqrt{\sum_{i=1}^n X_i^2}}.$$

Now, since I saw that $$Z\mid (X_1,\ldots,X_n)\stackrel{d}{=}Z$$, I can say that $$Z$$ is independent of $$X_1,\ldots,X_n$$.

So I have

$$U=\frac{1}{\sqrt n}\,\frac{Z}{\sqrt{\frac{1}{n}\sum_{i=1}^n X_i^2}}=\frac{T}{\sqrt n},$$ where $$T$$ follows a $$t$$ distribution with $$n$$ degrees of freedom, since $$Z\sim\mathcal N(0,1)$$ is independent of $$\sum_{i=1}^n X_i^2\sim\chi^2_n$$.
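The conclusion $$\sqrt n\,U\sim t_n$$ can likewise be checked by Monte Carlo; a minimal sketch (the choice $$n=4$$ and the sample size are arbitrary) comparing empirical quantiles of $$\sqrt n\,U$$ with $$t_n$$ quantiles:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps = 4, 200_000                      # arbitrary choices for the check
X = rng.standard_normal((reps, n))
Y = rng.standard_normal((reps, n))

U = (X * Y).sum(axis=1) / (X**2).sum(axis=1)
T = np.sqrt(n) * U                        # should follow Student's t with n d.o.f.

# Compare a few empirical quantiles of sqrt(n) * U with t_n quantiles
qs = np.array([0.10, 0.25, 0.50, 0.75, 0.90])
emp = np.quantile(T, qs)
theo = stats.t.ppf(qs, df=n)
print(np.round(emp, 3))
print(np.round(theo, 3))
```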

I think conditioning is the easiest way to see the result here. But is this a perfectly rigorous argument, and is there any direct/alternative way of finding the distributions of such functions of linear combinations of i.i.d. Normal variables?

Although this is a conditional argument as well, using the characteristic function is faster:
$$\begin{align*} \mathbb E\left[\exp\left\{ \iota t\sum_i Y_i X_i\Big/{\textstyle\sum_j X_j^2}\right\}\right] &= \mathbb E\left[\,\mathbb E\left[\left.\exp\left\{\iota t\sum_i Y_i X_i\Big/{\textstyle\sum_j X_j^2}\right\}\,\right|\,\mathbf X \right]\right]\\ &=\mathbb E\left[\,\mathbb E\left[\left.\prod_i \exp\left\{\iota t Y_i X_i\Big/{\textstyle\sum_j X_j^2}\right\}\,\right|\,\mathbf X \right]\right]\\ &=\mathbb E\left[\prod_i\mathbb E\left[\left. \exp\left\{ \iota t Y_i X_i\Big/{\textstyle\sum_j X_j^2}\right\}\,\right|\,\mathbf X \right]\right]\\ &=\mathbb E\left[\prod_i \exp\left\{- t^2 X_i^2 \Big/2\left\{\textstyle\sum_j X_j^2\right\}^2\right\}\right]\\ &=\mathbb E\left[ \exp\left\{- t^2 \Big/2{\textstyle\sum_j X_j^2}\right\}\right] \end{align*}$$
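This identity is itself easy to sanity-check by simulation: the empirical characteristic function of $$U$$ should agree with a Monte Carlo estimate of the right-hand side. A minimal sketch (the sample size and the evaluation point $$t$$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 200_000                      # arbitrary choices for the check
X = rng.standard_normal((reps, n))
Y = rng.standard_normal((reps, n))

S = (X**2).sum(axis=1)                    # sum_j X_j^2, a chi^2_n variable
U = (X * Y).sum(axis=1) / S

t = 1.7                                   # arbitrary evaluation point
lhs = np.exp(1j * t * U).mean().real      # empirical characteristic function of U
rhs = np.exp(-t**2 / (2 * S)).mean()      # estimate of E[exp(-t^2 / (2 sum_j X_j^2))]
print(lhs, rhs)
```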
Writing $$\sum_j X_j^2=2\zeta$$ with $$\zeta\sim \text{Gamma}(n/2,1)$$ and invoking Wolfram’s integrator, this expectation is equal to
$$\int_0^\infty \zeta^{n/2 - 1}\, \frac{\exp\left(-\zeta - t^2/(4\zeta)\right)}{\Gamma(n/2)}\ \text{d}\zeta = \frac{2\, (t/2)^{n/2} K_{-n/2}(t)}{\Gamma(n/2)}$$
where $$K_\nu$$ is the modified Bessel function of the second kind. For $$n=1$$ this recovers a Cauchy characteristic function (as it must, since $$Y_1/X_1$$ is Cauchy), but for $$n>1$$ it is not Cauchy; it is instead the characteristic function of Student’s $$t$$ distribution (scaled by $$1/\sqrt n$$).
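The Bessel-type integral identity underlying this closed form, $$\int_0^\infty x^{\nu-1}e^{-x-a^2/x}\,\text{d}x = 2a^{\nu}K_{\nu}(2a)$$ (together with $$K_{-\nu}=K_{\nu}$$), can be verified numerically; a minimal sketch using SciPy, with arbitrary test values of $$\nu$$ and $$a$$:

```python
import numpy as np
from scipy import integrate, special

nu, a = 2.5, 0.65   # arbitrary test values (nu = n/2 corresponds to n = 5)

# Left-hand side: numerical quadrature of the integral
lhs, _ = integrate.quad(lambda x: x**(nu - 1) * np.exp(-x - a**2 / x), 0, np.inf)

# Right-hand side: closed form via the modified Bessel function K_nu
rhs = 2 * a**nu * special.kv(nu, 2 * a)   # K_{-nu} = K_nu
print(lhs, rhs)
```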