# Moment/mgf of cosine of directional vectors?

Can anybody suggest how I can compute the second moment (or the whole moment generating function) of the cosine of two Gaussian random vectors $x,y$, each distributed as $\mathcal N(0,\Sigma)$, independent of each other?

I.e., moments of the following random variable:

$$\cos\theta = \frac{\langle x, y\rangle}{\|x\|\,\|y\|}$$

The closest question is Moment generating function of the inner product of two gaussian random vectors, which derives the MGF of the inner product. There is also this answer from MathOverflow, which links the question to the distribution of eigenvalues of sample covariance matrices, but I don't immediately see how to use either of those to compute the second moment.

I suspect that the second moment is the trace of $\Sigma$ divided by the half-norm of its eigenvalues: I get this result through algebraic manipulation in 2 dimensions, and by guess-and-check in 3 dimensions. For eigenvalues $a,b,c$, the conjectured second moment is

$$E\left[\cos^2\theta\right] = \frac{a+b+c}{\left(\sqrt{a}+\sqrt{b}+\sqrt{c}\right)^2},$$

which equals $1/(\sqrt{a}+\sqrt{b}+\sqrt{c})^2$ when the eigenvalues add up to 1.

Using the following Mathematica code for a numerical check:

```mathematica
val1[a_, b_, c_] := (a + b + c)/(Sqrt[a] + Sqrt[b] + Sqrt[c])^2

val2[a_, b_, c_] := Block[{},
  x := {x1, x2, x3};
  y := {y1, y2, y3};
  normal := MultinormalDistribution[{0, 0, 0},
    {{a, 0, 0}, {0, b, 0}, {0, 0, c}}];
  vars := {x \[Distributed] normal, y \[Distributed] normal};
  NExpectation[(x.y/(Norm[x] Norm[y]))^2, vars]]

val1[1.5, 2.5, 3.5] - val2[1.5, 2.5, 3.5]
```


Checking the formula in 4 dimensions (it agrees within numerical error):

```mathematica
val1[a_, b_, c_, d_] :=
  (a + b + c + d)/(Sqrt[a] + Sqrt[b] + Sqrt[c] + Sqrt[d])^2

val2[a_, b_, c_, d_] := Block[{},
  x := {x1, x2, x3, x4};
  y := {y1, y2, y3, y4};
  normal := MultinormalDistribution[{0, 0, 0, 0},
    {{a, 0, 0, 0}, {0, b, 0, 0}, {0, 0, c, 0}, {0, 0, 0, d}}];
  vars := {x \[Distributed] normal, y \[Distributed] normal};
  NExpectation[(x.y/(Norm[x] Norm[y]))^2, vars]]

val1[0.5, 1.5, 2.5, 3.5] - val2[0.5, 1.5, 2.5, 3.5]
```
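Outside Mathematica, the same check can be run by straightforward Monte Carlo. Here is a NumPy sketch (the function names are mine, and this only verifies the conjecture up to sampling error):

```python
import numpy as np

def conjectured(eigs):
    # Conjectured second moment: trace(Sigma) divided by the half-norm of the
    # eigenvalues, i.e. (sum eigs) / (sum sqrt(eigs))^2.
    eigs = np.asarray(eigs, dtype=float)
    return eigs.sum() / np.sqrt(eigs).sum() ** 2

def mc_second_moment(eigs, n=500_000, seed=0):
    # Monte Carlo estimate of E[(x.y / (|x| |y|))^2] for independent
    # x, y ~ N(0, diag(eigs)).
    rng = np.random.default_rng(seed)
    sd = np.sqrt(np.asarray(eigs, dtype=float))
    x = rng.standard_normal((n, len(sd))) * sd
    y = rng.standard_normal((n, len(sd))) * sd
    cos = (x * y).sum(axis=1) / (np.linalg.norm(x, axis=1) * np.linalg.norm(y, axis=1))
    return float((cos ** 2).mean())
```

For the eigenvalues used above, `mc_second_moment([0.5, 1.5, 2.5, 3.5])` lands within Monte Carlo error of `conjectured([0.5, 1.5, 2.5, 3.5])`.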


---

Hey Yaroslav, you really do not have to hurry accepting my answer on MO, and you are more than welcome to ask for further details :).

Since you reformulated the question in 3 dimensions, I can see exactly what you want to do. From the MO post I thought you only needed to calculate the largest cosine between two random vectors; now the problem seems tougher.

First, we need the normalized Gaussian $\frac{X}{\|X\|}$. This is not a trivial object; it even has its own name, the "projected normal distribution". Rewrite the multivariate normal density of $X$ in polar coordinates $(\|X\|,\frac{X}{\|X\|})=(r,\boldsymbol{\theta})$; the marginal density of $\boldsymbol{\theta}$ is then obtained by integrating out the radial part $r$.
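Concretely, since $\mu=0$ here, the radial integral has a closed form, and integrating out $r$ from the polar factorization gives (a standard computation; in the centered case the projected normal is also known as the angular central Gaussian):

$$f_{\boldsymbol{\theta}}(\boldsymbol{\theta}) = \int_0^\infty \frac{r^{d-1}}{(2\pi)^{d/2}|\Sigma|^{1/2}} \exp\!\left(-\tfrac{1}{2} r^2\, \boldsymbol{\theta}^{\top}\Sigma^{-1}\boldsymbol{\theta}\right) dr = \frac{\Gamma(d/2)}{2\pi^{d/2}|\Sigma|^{1/2}} \left(\boldsymbol{\theta}^{\top}\Sigma^{-1}\boldsymbol{\theta}\right)^{-d/2}, \qquad \|\boldsymbol{\theta}\| = 1.$$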

> An important instance is that in which $x$ has a bivariate normal distribution $N_2(\mu,\Sigma)$, in which $\|x\|^{-1}x$ is said to have a projected normal (or angular Gaussian or offset normal) distribution.

[Mardia&Peter], p. 46

In this step we obtain the projected normal distributions $\mathcal{PN}_{k}$ of $\frac{X}{\|X\|}$ and $\frac{Y}{\|Y\|}$, which are independent, and hence the joint density of $\left(\frac{X}{\|X\|},\frac{Y}{\|Y\|}\right)$ is just their product. For concrete density functions of the projected normal distribution, see [Mardia&Peter] Chap. 10, Equation (4) of [Wang & Gelfand], or [Hernandez-Stumpfhauser et al.]. (Notice that [Hernandez-Stumpfhauser et al.] also assume a special form of the covariance matrix, $\Sigma=\left(\begin{array}{cc} \Gamma & \gamma\\ \gamma' & 1 \end{array}\right)$.)

Second, since we have their joint density, the density of their inner product (and hence of the cosine) can be derived using the change-of-variables formula.

Once we have that density, computing the second moment is only a problem of integration.
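To make that integration step concrete in the simplest case: for $d=2$ with zero mean and $\Sigma=\mathrm{diag}(a,b)$, the angular density is $f(t)\propto(\cos^2 t/a+\sin^2 t/b)^{-1}$, and the second moment of the cosine is a double integral over the two independent angles. A NumPy sketch of that integral (function names are mine; a numerical illustration, not part of the derivation above):

```python
import numpy as np

def angular_density(t, a, b):
    # Zero-mean 2-D angular (central) Gaussian density of the direction of
    # x ~ N(0, diag(a, b)); normalized to integrate to 1 over [0, 2*pi).
    return 1.0 / (2 * np.pi * np.sqrt(a * b) * (np.cos(t) ** 2 / a + np.sin(t) ** 2 / b))

def second_moment_2d(a, b, n=2000):
    # E[cos^2(t1 - t2)] for independent angles t1, t2 with the density above,
    # computed by a uniform grid rule (spectrally accurate for smooth periodic
    # integrands).
    t = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    w = angular_density(t, a, b) * (2 * np.pi / n)   # density times grid weight
    c2 = np.cos(t[:, None] - t[None, :]) ** 2        # cos^2(t1 - t2) on the grid
    return float(w @ c2 @ w)
```

For example, `second_moment_2d(1.5, 2.5)` agrees with the conjectured $(a+b)/(\sqrt{a}+\sqrt{b})^2$ to high accuracy.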

## References

[Mardia&Peter] Mardia, Kanti V., and Peter E. Jupp. *Directional Statistics*. Vol. 494. John Wiley & Sons, 2009.

[Wang & Gelfand] Wang, Fangpo, and Alan E. Gelfand. "Directional data analysis under the general projected normal distribution." *Statistical Methodology* 10.1 (2013): 113-127.

[Hernandez-Stumpfhauser et al.] Hernandez-Stumpfhauser, Daniel, F. Jay Breidt, and Mark J. van der Woerd. "The general projected normal distribution of arbitrary dimension: modeling and Bayesian inference." *Bayesian Analysis* (2016).