# Moments of $Y=X_1 + X_2 X_3 + X_4 X_5 X_6 +\cdots$

The $$X_i$$'s are i.i.d., and $$X$$ denotes a generic random variable with the same distribution. We assume here that $$|E(X)|<1$$ to guarantee convergence. I am interested in particular in the third moment $$E(Y^3)$$. For the first two moments, we have (see here):

$$E(Y) = \frac{E(X)}{1-E(X)},\mbox{ Var}(Y)=\frac{\mbox{Var}(X)}{(1-E^2(X))(1-E(X^2))}.$$
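As a sanity check on these two formulas, here is a small Monte Carlo sketch. The distribution of $$X$$ is an illustrative choice of mine (not from the post): $$X\sim$$ Uniform$$(-0.5, 0.5)$$, so $$E(X)=0$$ and $$E(X^2)=\mbox{Var}(X)=1/12$$, and the formulas predict $$E(Y)=0$$ and $$\mbox{Var}(Y)=(1/12)/(1-1/12)=1/11$$.

```python
import random

def sample_Y(sample_X, K=20):
    """One draw of Y = X1 + X2*X3 + X4*X5*X6 + ..., truncated after K terms."""
    total = 0.0
    for k in range(1, K + 1):
        prod = 1.0
        for _ in range(k):
            prod *= sample_X()
        total += prod
    return total

random.seed(0)
# Illustrative choice (not from the post): X ~ Uniform(-0.5, 0.5),
# so E(X) = 0 and E(X^2) = Var(X) = 1/12.
sample_X = lambda: random.uniform(-0.5, 0.5)
n = 50_000
ys = [sample_Y(sample_X) for _ in range(n)]
mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
# Theory: E(Y) = 0 and Var(Y) = (1/12) / ((1 - 0)(1 - 1/12)) = 1/11.
print(mean, var)
```

The truncation at $$K=20$$ terms is harmless here, since the $$k$$-th term has variance $$(1/12)^k$$.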

The reason for my interest is as follows. Let
$$Z = X_1 + X_1 X_2 + X_1 X_2 X_3 + \cdots .$$

If $$E(X)=0$$, then $$E(Y) = E(Z)$$ and $$E(Y^2)=E(Z^2)$$, see here. My hope is that this is no longer true for higher moments, that is, $$E(Y^3) \neq E(Z^3)$$. Ultimately, that's what I'd like to prove. While the first two moments are not enough to make the model identifiable ($$Y$$ vs. $$Z$$), my hope is that using the first three moments as model parameters is enough to discriminate between $$Y$$ and $$Z$$, and thus make the model identifiable. Note that
$$E(Z^3) =\frac{E(X^3)(1+3E(Z)+3E(Z^2))}{1-E(X^3)}.$$
The result for $$E(Z^3)$$ comes from Section 4.2 in this article.
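The formula for $$E(Z^3)$$ can be checked numerically. The sketch below simulates $$Z$$ for an illustrative two-point distribution of my choosing (not from the post), and compares the empirical third moment with the formula, plugging empirical values of $$E(Z)$$ and $$E(Z^2)$$ into the right-hand side:

```python
import random

random.seed(1)
# Illustrative distribution (an assumption, not from the post):
# P(X = 0.5) = 0.8, P(X = -0.5) = 0.2, so E(X) = 0.3 and E(X^3) = 0.075.
sample_X = lambda: 0.5 if random.random() < 0.8 else -0.5
m3 = 0.075

def sample_Z(K=40):
    """One draw of Z = X1 + X1*X2 + X1*X2*X3 + ..., truncated after K terms."""
    prod, total = 1.0, 0.0
    for _ in range(K):
        prod *= sample_X()
        total += prod
    return total

n = 100_000
zs = [sample_Z() for _ in range(n)]
EZ = sum(zs) / n
EZ2 = sum(z * z for z in zs) / n
EZ3_emp = sum(z ** 3 for z in zs) / n
# E(Z^3) = E(X^3)(1 + 3 E(Z) + 3 E(Z^2)) / (1 - E(X^3))
EZ3_formula = m3 * (1 + 3 * EZ + 3 * EZ2) / (1 - m3)
print(EZ3_formula, EZ3_emp)
```

Here $$E(Z)=0.3/0.7=3/7$$, which the simulation should also reproduce.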

## Update

A possible way to compute $$E(Y^n)$$ is as follows. Define $$Y_k$$ as the sum of the first $$k$$ terms in $$Y=X_1 + X_2 X_3 + X_4 X_5 X_6 +\cdots$$. Then $$Y_{k+1}=Y_k + V_{k+1}$$, with $$V_{k+1}$$ a product of $$k+1$$ i.i.d. random variables with the same distribution as $$X$$. Also, $$Y_k$$ and $$V_{k+1}$$ are independent. Thus

$$E(Y_{k+1}^n) =\sum_{i=0}^n \frac{n!}{i!(n-i)!}E(Y_k^i) (E(X^{n-i}))^{k+1}.$$

We can focus on the case $$n=3$$ to begin with. The above recurrence relation (if correct) could lead to a solution. We are interested in the limit $$k\rightarrow\infty$$, as $$Y_k \rightarrow Y$$ (in distribution). We also have the following recursion:

$$E(Y_{k+1}^n) =E(Y_k \cdot Y_{k+1}^{n-1}) + (E(X))^{k+1}\cdot E(Y_{k+1}^{n-1}).$$
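If the binomial recurrence above is correct, iterating it from $$Y_0=0$$ should reproduce the moments of $$Y$$ in the limit. A sketch, using an illustrative two-point distribution for $$X$$ (my choice, not from the post), compared against a Monte Carlo estimate of $$E(Y^3)$$:

```python
import math, random

# Illustrative distribution (an assumption, not from the post):
# P(X = 0.5) = 0.8, P(X = -0.5) = 0.2, with moments m[j] = E[X^j].
m = [1.0, 0.3, 0.25, 0.075]

# Iterate E[Y_{k+1}^n] = sum_i C(n,i) E[Y_k^i] (E[X^{n-i}])^{k+1}
# for n = 0..3, starting from Y_0 = 0.
mom = [1.0, 0.0, 0.0, 0.0]
for k in range(60):
    mom = [sum(math.comb(n, i) * mom[i] * m[n - i] ** (k + 1)
               for i in range(n + 1)) for n in range(4)]

# Monte Carlo check of E[Y^3], with Y truncated after K terms.
random.seed(2)
def sample_Y(K=20):
    total = 0.0
    for k in range(1, K + 1):
        prod = 1.0
        for _ in range(k):
            prod *= 0.5 if random.random() < 0.8 else -0.5
        total += prod
    return total

n = 50_000
EY3_emp = sum(sample_Y() ** 3 for _ in range(n)) / n
print(mom[3], EY3_emp)
```

As a consistency check, `mom[1]` should converge to $$E(Y)=E(X)/(1-E(X))=3/7$$ for this choice of $$X$$.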

## Update 2

If $$X$$ has the distribution $$P(X=-0.5) = 0.5$$, $$P(X=0.5) = 0.5$$, then $$Y$$ and $$Z$$ both have the same uniform distribution on $$[-1, 1]$$. It is then impossible to discriminate between the models $$Y$$ and $$Z$$.
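This claim is easy to check by simulation: with $$X=\pm 0.5$$, the moments of both $$Y$$ and $$Z$$ should match those of the uniform distribution on $$[-1,1]$$, e.g. $$E[W^2]=1/3$$ and $$E[W^4]=1/5$$. A sketch:

```python
import random

random.seed(3)
sign = lambda: 0.5 if random.random() < 0.5 else -0.5

def sample_Y(K=15):
    """Y = X1 + X2*X3 + ..., terms are independent signs times 2^(-k)."""
    total = 0.0
    for k in range(1, K + 1):
        prod = 1.0
        for _ in range(k):
            prod *= sign()
        total += prod
    return total

def sample_Z(K=15):
    """Z = X1 + X1*X2 + ..., truncated after K terms."""
    prod, total = 1.0, 0.0
    for _ in range(K):
        prod *= sign()
        total += prod
    return total

n = 50_000
ys = [sample_Y() for _ in range(n)]
zs = [sample_Z() for _ in range(n)]
# Uniform on [-1, 1] has E[W^2] = 1/3 and E[W^4] = 1/5.
m2_y = sum(y * y for y in ys) / n
m2_z = sum(z * z for z in zs) / n
m4_y = sum(y ** 4 for y in ys) / n
m4_z = sum(z ** 4 for z in zs) / n
print(m2_y, m2_z, m4_y, m4_z)
```

Intuitively, both $$Y$$ and $$Z$$ reduce to a sum of independent uniform signs times $$2^{-k}$$, i.e. a random binary expansion.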

I also looked at the case $$X\sim$$ Normal$$(0, 1/4)$$. As expected, the variances of $$Y$$ and $$Z$$ are identical, both equal to $$1/3$$, and this is confirmed by empirical evidence. Yet higher moments are different. Below is a chart showing the empirical percentile distribution, comparing $$Y$$ (blue) with $$Z$$ (red), when $$X$$ is Normal$$(0, 1/4)$$. They are clearly different.
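The difference can also be seen without the chart. Since $$X$$ is symmetric, the odd moments of both $$Y$$ and $$Z$$ vanish, so the first place the two models can differ is the fourth moment. A simulation sketch (my own check, under the same Normal$$(0,1/4)$$ assumption):

```python
import random

random.seed(4)
X = lambda: random.gauss(0.0, 0.5)   # Normal with mean 0, variance 1/4

def sample_Y(K=20):
    total = 0.0
    for k in range(1, K + 1):
        prod = 1.0
        for _ in range(k):
            prod *= X()
        total += prod
    return total

def sample_Z(K=20):
    prod, total = 1.0, 0.0
    for _ in range(K):
        prod *= X()
        total += prod
    return total

n = 50_000
ys = [sample_Y() for _ in range(n)]
zs = [sample_Z() for _ in range(n)]
var_y = sum(y * y for y in ys) / n   # both variances close to 1/3
var_z = sum(z * z for z in zs) / n
m4_y = sum(y ** 4 for y in ys) / n   # fourth moments differ markedly
m4_z = sum(z ** 4 for z in zs) / n
print(var_y, var_z)
print(m4_y, m4_z)
```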

Let $$t_1=X_1$$ and $$t_2=X_2X_3$$, etc., so $$Y=\sum_i t_i$$. Then with some help from Mathematica (like this):
\begin{align} E[Y^3] &= E\left[6\sum_{i<j<k}t_it_jt_k + 3\sum_{i\neq j}t_i^2t_j + \sum_i t_i^3\right] \\ &= 6\sum_{i<j<k}E[X]^{i+j+k} + 3\sum_{i\neq j}E[X^2]^iE[X]^j + \sum_i E[X^3]^i \end{align}
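This expansion can be verified mechanically: truncating all sums at $$K$$ terms must agree exactly with $$E[Y_K^3]$$ obtained from the binomial recurrence of the first update, since both describe the same partial sum $$Y_K$$. A sketch with illustrative moment values for $$X$$ (my choice, not from the post):

```python
import math

# Illustrative moments (an assumption, not from the post), corresponding to
# P(X = 0.5) = 0.8, P(X = -0.5) = 0.2: m[j] = E[X^j].
m = [1.0, 0.3, 0.25, 0.075]
K = 60

# Direct evaluation of the expansion, truncated at K terms of Y.
t_triple = 6 * sum(m[1] ** (i + j + k)
                   for i in range(1, K + 1)
                   for j in range(i + 1, K + 1)
                   for k in range(j + 1, K + 1))
t_pair = 3 * sum(m[2] ** i * m[1] ** j
                 for i in range(1, K + 1)
                 for j in range(1, K + 1) if i != j)
t_cube = sum(m[3] ** i for i in range(1, K + 1))
expansion = t_triple + t_pair + t_cube

# The same quantity via the binomial recurrence for E[Y_k^n].
mom = [1.0, 0.0, 0.0, 0.0]
for k in range(K):
    mom = [sum(math.comb(n, i) * mom[i] * m[n - i] ** (k + 1)
               for i in range(n + 1)) for n in range(4)]
print(expansion, mom[3])  # identical up to rounding
```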
For $$Z$$, we can similarly let $$u_1=X_1$$ and $$u_2=X_1X_2$$, etc., so $$Z=\sum_i u_i$$. Then:
\begin{align} E[Z^3] &= E\left[6\sum_{i<j<k}u_iu_ju_k + 3\sum_{i\neq j}u_i^2u_j + \sum_i u_i^3\right] \\ &= 6\sum_{i<j<k}E[X^3]^iE[X^2]^{j-i}E[X]^{k-j} + 3\sum_{i<j}E[X^3]^iE[X]^{j-i} + 3\sum_{i>j}E[X^3]^jE[X^2]^{i-j} + \sum_{i}E[X^3]^i \\ &= s_3(6s_2s_1+ 3s_1 + 3s_2+ 1) \end{align}
where $$s_n=\sum_{i\geq 1} E[X^n]^i=E[X^n]/(1-E[X^n])$$ (a geometric series).
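As a consistency check, this closed form agrees algebraically with the recursion-based formula for $$E(Z^3)$$ quoted in the first section: $$s_3(6s_2s_1+3s_1+3s_2+1)=s_3(1+3s_1+3s_2(1+2s_1))$$, with $$E[Z]=s_1$$ and $$E[Z^2]=s_2(1+2s_1)$$; the latter identity is derived here by the same counting argument and is my addition, not stated in the post. Numerically:

```python
# Illustrative moments of X (an assumption, not from the post), corresponding
# to P(X = 0.5) = 0.8, P(X = -0.5) = 0.2.
m1, m2, m3 = 0.3, 0.25, 0.075
s1, s2, s3 = (m / (1 - m) for m in (m1, m2, m3))

# Closed form from the expansion above.
closed = s3 * (6 * s2 * s1 + 3 * s1 + 3 * s2 + 1)

# Recursion-based formula E(Z^3) = E(X^3)(1 + 3E(Z) + 3E(Z^2)) / (1 - E(X^3)),
# using E[Z] = s_1 and E[Z^2] = s_2 (1 + 2 s_1) (derived here, not in the post).
EZ, EZ2 = s1, s2 * (1 + 2 * s1)
via_recursion = m3 * (1 + 3 * EZ + 3 * EZ2) / (1 - m3)
print(closed, via_recursion)  # agree up to rounding
```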