# Can we conclude from $\DeclareMathOperator{\E}{\mathbb{E}}\E g(X)h(Y)=\E g(X) \E h(Y)$ that $X,Y$ are independent?

Well, we cannot; see for example https://en.wikipedia.org/wiki/Subindependence for an interesting counterexample. But the real question is: is there some way to strengthen the condition so that independence follows? For example, is there some set of functions $g_1, \dotsc, g_n$ such that if $\E g_i(X) g_j(Y) = \E g_i(X) \E g_j(Y)$ for all $i,j$, then independence follows? And how large must such a set of functions be? Must it be infinite?
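For a single pair of functions the implication certainly fails, even without the subindependence machinery. A minimal exact illustration (my own, not from the linked article), with $g = h = \mathrm{id}$: take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2.  With g = h = identity,
# E[XY] = E[X^3] = 0 = E[X] E[Y], yet X and Y are clearly dependent.
support = [-1, 0, 1]
p = Fraction(1, 3)                 # P(X = x) for each x in the support

E_XY = sum(p * x * x**2 for x in support)   # E[X Y] = E[X^3]
E_X  = sum(p * x for x in support)
E_Y  = sum(p * x**2 for x in support)

assert E_XY == E_X * E_Y == 0      # the product condition holds

# ...but independence fails: P(X=1, Y=1) != P(X=1) P(Y=1)
P_joint = p                        # (X, Y) = (1, 1) only when x = 1
P_prod  = p * Fraction(2, 3)       # P(X=1) * P(Y=1), since P(Y=1) = 2/3
assert P_joint != P_prod
```

So any strengthening of the condition has to involve more than one pair $(g, h)$.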

In addition, is there a good reference that treats this question?

Let $(\Omega, \mathscr F, P)$ be a probability space. By definition two random variables $X, Y :\Omega \to \mathbb R$ are independent if their $\sigma$-algebras $S_X := \sigma(X)$ and $S_Y := \sigma(Y)$ are independent, i.e. $\forall A \in S_X, B \in S_Y$ we have $P(A \cap B) = P(A)P(B)$.

Let $g_a(x) = I(x \leq a)$ and take $G = \{g_a : a \in \mathbb Q\}$ (thanks to @grand_chat for pointing out that $\mathbb Q$ suffices). Then we have

$$\E g_a(X) g_b(Y) = P(X \leq a,\, Y \leq b)$$

and

$$\E g_a(X) \E g_b(Y) = P(X \leq a)\, P(Y \leq b).$$

If we assume that $\forall a, b \in \mathbb Q$

$$\E g_a(X) g_b(Y) = \E g_a(X) \E g_b(Y),$$

then, since the events $\{X \leq a\}$, $a \in \mathbb Q$, form a $\pi$-system generating $S_X$ (and likewise for $Y$), we can appeal to the $\pi$-$\lambda$ theorem to show that

$$P(A \cap B) = P(A)\, P(B) \quad \forall A \in S_X,\, B \in S_Y,$$

i.e. $X \perp Y$.

So unless I've made a mistake, we have at least a countable collection of such functions, and this applies to any pair of random variables defined on a common probability space.
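As a numerical sanity check (not a proof), one can estimate both sides of the indicator condition by Monte Carlo. A sketch assuming NumPy, with standard normals standing in for $X$ and $Y$ and a few probe points standing in for $\mathbb Q$:

```python
import numpy as np

# For independent X, Y the condition
#   E[g_a(X) g_b(Y)] = E[g_a(X)] E[g_b(Y)]
# should hold up to sampling error at every pair (a, b).
rng = np.random.default_rng(0)
n = 200_000
X = rng.normal(size=n)
Y = rng.normal(size=n)          # independent of X by construction

for a in (-1.0, 0.0, 0.5):      # a few probe points standing in for Q
    for b in (-0.5, 0.0, 1.0):
        lhs = np.mean((X <= a) & (Y <= b))       # E[g_a(X) g_b(Y)]
        rhs = np.mean(X <= a) * np.mean(Y <= b)  # E[g_a(X)] E[g_b(Y)]
        assert abs(lhs - rhs) < 0.01, (a, b, lhs, rhs)

# A dependent pair violates it: with Y = X,
# E[g_0(X) g_0(X)] = P(X <= 0) = 1/2, but the product is 1/4.
lhs_dep = np.mean((X <= 0.0) & (X <= 0.0))
rhs_dep = np.mean(X <= 0.0) ** 2
assert abs(lhs_dep - rhs_dep) > 0.1
```

Of course this only probes finitely many $(a, b)$; the $\pi$-$\lambda$ argument above is what upgrades agreement on the countable grid to full independence.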