# Product of two independent random variables

I have a sample of about 1000 values. These data are obtained as the product of two independent random variables, $X = \xi \cdot \psi$. The first random variable has a uniform distribution, $\xi \sim U(0,1)$. The distribution of the second random variable is not known. How can I estimate the distribution of the second random variable, $\psi$?
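For readers who want to experiment, the setup is easy to simulate. The Gamma choice for $\psi$ below is an arbitrary stand-in for the unknown distribution, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical ground truth for psi (unknown in the real problem);
# Gamma(2, 1) is an arbitrary illustrative choice.
psi = rng.gamma(shape=2.0, scale=1.0, size=n)
xi = rng.uniform(0.0, 1.0, size=n)  # xi ~ U(0, 1)
x = xi * psi                        # the observed sample of X = xi * psi

print(x[:5])
```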

Assuming $\psi$ has support on the positive real line, we have
$$X = \xi \cdot \psi,$$
where $X \sim F_n$ and $F_n$ is the empirical distribution of the data. Taking the log of this equation, we get
$$\ln X = \ln \xi + \ln \psi.$$

Characteristic functions determine distributions uniquely, so it suffices to find the characteristic function of $\ln \psi$. By independence of $\xi$ and $\psi$, taking characteristic functions of both sides gives
$$\varphi_{\ln X}(t) = \varphi_{\ln \xi}(t)\,\varphi_{\ln \psi}(t).$$
Now, $\xi \sim \mathrm{Unif}[0,1]$, therefore $-\ln \xi \sim \mathrm{Exp}(1)$. Thus
$$\varphi_{\ln \xi}(t) = \mathrm{E}\!\left[e^{it \ln \xi}\right] = \frac{1}{1+it}.$$
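A quick Monte Carlo sanity check that the characteristic function of $\ln \xi$ is indeed $1/(1+it)$ (an illustrative sketch, not part of the derivation):

```python
import numpy as np

rng = np.random.default_rng(1)
xi = rng.uniform(0.0, 1.0, size=200_000)  # xi ~ U(0, 1)

t = 0.7  # arbitrary test point
empirical = np.mean(np.exp(1j * t * np.log(xi)))  # Monte Carlo E[exp(it ln xi)]
theoretical = 1.0 / (1.0 + 1j * t)                # claimed cf of ln(xi)

print(abs(empirical - theoretical))  # should be small, on the order of 1/sqrt(n)
```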

We estimate $\varphi_{\ln X}$ by the empirical characteristic function
$$\hat{\varphi}_{\ln X}(t) = \frac{1}{n}\sum_{k=1}^{n}\exp\!\left(it \ln X_k\right),$$
with $X_1, \dots, X_n$ ($n = 1000$) the observed sample, so that $\ln X_1, \dots, \ln X_n$ is a random sample from the distribution of $\ln X$.

We can now completely specify (an estimate of) the distribution of $\ln \psi$ through its characteristic function:
$$\varphi_{\ln \psi}(t) = (1+it)\,\varphi_{\ln X}(t) \approx \frac{1+it}{n}\sum_{k=1}^{n}\exp\!\left(it \ln X_k\right).$$
By Lévy's continuity theorem, the empirical characteristic function converges to the true one, so this estimate is consistent.
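In code, the estimator $\hat{\varphi}_{\ln \psi}(t) = (1+it)\,\frac{1}{n}\sum_k \exp(it \ln X_k)$ is a one-liner. The Exp(1) truth for $\psi$ below is an arbitrary choice, only there to have data to run it on:

```python
import numpy as np

def cf_log_psi(t, x):
    """Estimate of the characteristic function of ln(psi) at t:
    phi_ln_psi(t) = (1 + it) * phi_ln_X(t), with the empirical
    characteristic function of ln(X) plugged in for phi_ln_X."""
    ecf = np.mean(np.exp(1j * t * np.log(x)))
    return (1.0 + 1j * t) * ecf

# Illustrative data: psi ~ Exp(1) (arbitrary), xi ~ U(0, 1)
rng = np.random.default_rng(2)
psi = rng.exponential(scale=1.0, size=100_000)
x = rng.uniform(size=100_000) * psi

# Sanity check: any characteristic function equals 1 at t = 0
print(cf_log_psi(0.0, x))
```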

If we assume that the moment generating function of $\ln \psi$ exists, we can write the same identity in terms of moment generating functions. Since $M_{\ln \xi}(t) = \mathrm{E}\!\left[\xi^{t}\right] = \frac{1}{1+t}$ for $t > -1$, we get
$$M_{\ln \psi}(t) = (1+t)\,M_{\ln X}(t).$$

It is then enough to invert the moment generating function (or, equivalently, the characteristic function above) to recover the distribution of $\ln \psi$, and thus that of $\psi$.
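Numerically, it is usually easier to invert the characteristic function than the MGF. A rough sketch using the Gil-Pelaez inversion formula, $F(y) = \tfrac{1}{2} - \tfrac{1}{\pi}\int_0^\infty \mathrm{Im}\!\left[e^{-ity}\varphi(t)\right]/t \,\mathrm{d}t$, with a truncated, discretized integral (the truncation point and grid size are ad hoc choices):

```python
import numpy as np

def cdf_log_psi(y, x, t_max=20.0, m=2000):
    """Gil-Pelaez inversion of the estimated cf of ln(psi):
    F(y) = 1/2 - (1/pi) * integral_0^inf Im[exp(-ity) phi(t)] / t dt,
    with phi(t) = (1 + it) * ecf_lnX(t); the integral is truncated
    at t_max and discretized on m points (ad hoc numerical choices)."""
    log_x = np.log(x)
    t = np.linspace(1e-4, t_max, m)  # start just above 0 to avoid dividing by 0
    ecf = np.array([np.exp(1j * tk * log_x).mean() for tk in t])  # ecf of ln X
    phi = (1.0 + 1j * t) * ecf                                    # est. cf of ln psi
    integrand = np.imag(np.exp(-1j * t * y) * phi) / t
    return 0.5 - np.sum(integrand) * (t[1] - t[0]) / np.pi

# Illustrative check against a known truth: if psi ~ Exp(1),
# then P(ln psi <= 0) = P(psi <= 1) = 1 - exp(-1) ~ 0.632.
rng = np.random.default_rng(3)
psi = rng.exponential(scale=1.0, size=50_000)
x = rng.uniform(size=50_000) * psi
print(cdf_log_psi(0.0, x))
```

With only 1000 observations, as in the question, the estimate will of course be noisier; kernel-smoothed deconvolution estimators are the more polished version of this idea.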