Well, we cannot; see for example https://en.wikipedia.org/wiki/Subindependence for an interesting counterexample. But the real question is: is there some way to strengthen the condition so that independence follows? For example, is there some

set of functions $g_1, \dots, g_n$ such that if $E\,g_i(X)g_j(Y) = E\,g_i(X)\,E\,g_j(Y)$ for all $i, j$, then independence follows? And how large must such a set of functions be? Must it be infinite? In addition, is there a good reference that treats this question?

**Answer**

Let $(\Omega, \mathcal F, P)$ be a probability space. By definition, two random variables $X, Y : \Omega \to \mathbb R$ are independent if their $\sigma$-algebras $\mathcal S_X := \sigma(X)$ and $\mathcal S_Y := \sigma(Y)$ are independent, i.e. for all $A \in \mathcal S_X$ and $B \in \mathcal S_Y$ we have $P(A \cap B) = P(A)P(B)$.

Let $g_a(x) = \mathbf 1(x \le a)$ and take $\mathcal G = \{g_a : a \in \mathbb Q\}$ (thanks to @grand_chat for pointing out that $\mathbb Q$ suffices). Then we have

$$E\big(g_a(X)\,g_b(Y)\big) = E\big(\mathbf 1(X \le a)\,\mathbf 1(Y \le b)\big) = E\big(\mathbf 1(X \le a,\ Y \le b)\big) = P(X \le a,\ Y \le b)$$

and

$$E\big(g_a(X)\big)\,E\big(g_b(Y)\big) = P(X \le a)\,P(Y \le b).$$

If we assume that for all $a, b \in \mathbb Q$

$$P(X \le a,\ Y \le b) = P(X \le a)\,P(Y \le b)$$

then we can appeal to the $\pi$-$\lambda$ theorem to show that

$$P(A \cap B) = P(A)P(B) \quad \forall A \in \mathcal S_X,\ B \in \mathcal S_Y,$$

i.e. $X \perp Y$.
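For completeness, here is a sketch of that $\pi$-$\lambda$ step (a standard argument, spelled out beyond what the answer states):

```latex
% The events indexed by rationals form a \pi-system generating \sigma(X):
\mathcal P_X := \{\,\{X \le a\} : a \in \mathbb Q\,\}, \qquad
\{X \le a\} \cap \{X \le a'\} = \{X \le \min(a, a')\}, \qquad
\sigma(\mathcal P_X) = \sigma(X).
% Fix C \in \mathcal P_X and collect the events that factor against it:
\Lambda_C := \{\, B \in \mathcal F : P(C \cap B) = P(C)\,P(B) \,\}.
% \Lambda_C is a \lambda-system containing the \pi-system
% \mathcal P_Y := \{\{Y \le b\} : b \in \mathbb Q\}, so by Dynkin's
% \pi-\lambda theorem \Lambda_C \supseteq \sigma(\mathcal P_Y) = \sigma(Y).
% Repeating the argument in the first coordinate extends the identity
% to all A \in \sigma(X), B \in \sigma(Y).
```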

So unless I’ve made a mistake, we’ve at least got a countable collection of such functions, and this applies to any pair of random variables defined on a common probability space.
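The factorization over $\mathcal G$ can also be probed numerically. The following Monte Carlo sketch (sample sizes, distributions, and grid of rational thresholds are illustrative choices, not part of the answer) estimates the gap $|E[g_a(X)g_b(Y)] - E[g_a(X)]E[g_b(Y)]|$ for an independent pair and for a maximally dependent pair ($Y = X$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent pair: X and Y drawn from separate streams.
x = rng.normal(size=n)
y = rng.normal(size=n)

def factorization_gap(x, y, a, b):
    """Sample estimate of |E[1{X<=a} 1{Y<=b}] - E[1{X<=a}] E[1{Y<=b}]|."""
    gx = (x <= a).astype(float)   # g_a(X)
    gy = (y <= b).astype(float)   # g_b(Y)
    return abs((gx * gy).mean() - gx.mean() * gy.mean())

# For independent X, Y the gap should be ~0 at every threshold pair.
gaps_indep = [factorization_gap(x, y, a, b)
              for a in (-1.0, 0.0, 0.5) for b in (-0.5, 0.0, 1.0)]

# Dependent pair Y = X violates factorization at a = b = 0:
# E[1{X<=0} 1{X<=0}] = 1/2 while E[1{X<=0}]^2 = 1/4, so the gap is ~1/4.
gap_dep = factorization_gap(x, x, 0.0, 0.0)
```

A simulation like this cannot prove independence, of course; it only illustrates that the indicator family $\mathcal G$ separates the two situations.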

**Attribution**
*Source: Link, Question Author: kjetil b halvorsen, Answer Author: jld*