Can we conclude from $\mathbb{E}\, g(X)h(Y) = \mathbb{E}\, g(X)\, \mathbb{E}\, h(Y)$ that $X, Y$ are independent?

Well, we cannot, as the classic counterexamples of uncorrelated but dependent random variables show. But the real question is: is there some way to strengthen the condition so that independence follows? For example, is there some set of functions $g_1, \dots, g_n$ so that if $\mathbb{E}\, g_i(X) g_j(Y) = \mathbb{E}\, g_i(X)\, \mathbb{E}\, g_j(Y)$ for all $i, j$, then independence follows? And how large must such a set of functions be? Must it be infinite?
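As a quick numerical sketch of why the answer is "no" (my own illustration, not from the source): take $X$ uniform on $\{-1, 0, 1\}$ and $Y = \mathbf{1}\{X = 0\}$, with $g = h = \mathrm{id}$. The product condition holds, yet $Y$ is a deterministic function of $X$.

```python
import numpy as np

# X uniform on {-1, 0, 1}, Y = 1{X = 0}. With g = h = identity,
# E[g(X)h(Y)] = E[XY] and E[g(X)] E[h(Y)] = E[X] E[Y].
vals = np.array([-1.0, 0.0, 1.0])
p = np.full(3, 1 / 3)               # uniform distribution on the three values
x = vals
y = (vals == 0).astype(float)

E_xy = np.sum(p * x * y)            # E[XY] = 0
E_x, E_y = np.sum(p * x), np.sum(p * y)
print(E_xy, E_x * E_y)              # both 0.0: the product condition holds

# But X and Y are not independent:
# P(X = 0, Y = 1) = 1/3, while P(X = 0) P(Y = 1) = 1/3 * 1/3 = 1/9.
print(1 / 3, (1 / 3) * (1 / 3))
```

This only exhibits failure for one particular pair $(g, h)$, which is exactly what motivates asking for a richer family of test functions.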

And, in addition, is there some good reference that treats this question?


Let $(\Omega, \mathcal{F}, P)$ be a probability space. By definition, two random variables $X, Y : \Omega \to \mathbb{R}$ are independent if their $\sigma$-algebras $\mathcal{S}_X := \sigma(X)$ and $\mathcal{S}_Y := \sigma(Y)$ are independent, i.e. for all $A \in \mathcal{S}_X$ and $B \in \mathcal{S}_Y$ we have $P(A \cap B) = P(A)P(B)$.

Let $g_a(x) = \mathbf{1}(x \leq a)$ and take $\mathcal{G} = \{g_a : a \in \mathbb{Q}\}$ (thanks to @grand_chat for pointing out that $\mathbb{Q}$ suffices). Then we have
$$\mathbb{E}\, g_a(X) = P(X \leq a).$$

If we assume that for all $a, b \in \mathbb{Q}$
$$\mathbb{E}\, g_a(X) g_b(Y) = \mathbb{E}\, g_a(X)\, \mathbb{E}\, g_b(Y),$$
i.e. $P(X \leq a, Y \leq b) = P(X \leq a)\, P(Y \leq b)$,
then we can appeal to the $\pi$-$\lambda$ theorem to show that
$$P(A \cap B) = P(A)P(B) \quad \text{for all } A \in \sigma(X),\ B \in \sigma(Y),$$
i.e. $X \perp Y$.

So unless I’ve made a mistake, we’ve at least got a countable collection of such functions and this applies to any pair of random variables defined over a common probability space.
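The countable criterion above can be probed by simulation. The following sketch (my construction; the grid of thresholds and sample size are arbitrary choices) compares $\mathbb{E}[g_a(X) g_b(Y)]$ with $\mathbb{E}[g_a(X)]\, \mathbb{E}[g_b(Y)]$ over a few rational thresholds, once for an independent pair and once for a dependent one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
thresholds = [-1.0, -0.5, 0.0, 0.5, 1.0]   # a handful of rational a, b

def max_gap(x, y):
    """Largest |E[1{X<=a} 1{Y<=b}] - E[1{X<=a}] E[1{Y<=b}]| over the grid."""
    gaps = []
    for a in thresholds:
        for b in thresholds:
            joint = np.mean((x <= a) & (y <= b))
            prod = np.mean(x <= a) * np.mean(y <= b)
            gaps.append(abs(joint - prod))
    return max(gaps)

x = rng.standard_normal(n)
y_indep = rng.standard_normal(n)   # drawn independently of x
y_dep = x                          # maximally dependent on x

print(max_gap(x, y_indep))  # small (Monte Carlo noise), consistent with independence
print(max_gap(x, y_dep))    # large: the product condition visibly fails
```

For the dependent pair, at $a = b = 0$ the joint probability is $P(X \leq 0) = 1/2$ while the product is $1/4$, so the gap is about $0.25$, far above the sampling noise of the independent case.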

Source: Link, Question Author: kjetil b halvorsen, Answer Author: jld
