Does Bayes' theorem hold for expectations?

Is it true that for two random variables $A$ and $B$,

$$E(A\mid B)=E(B\mid A)\frac{E(A)}{E(B)}?$$

Answer

$$E[A\mid B] \stackrel{?}= E[B\mid A]\frac{E[A]}{E[B]} \tag 1$$
The conjectured result $(1)$ is trivially true for independent random variables $A$ and $B$, provided that $E[B]\neq 0$.
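Indeed, independence collapses both conditional expectations to constants, which gives the one-line verification
$$E[A\mid B] = E[A] \qquad\text{and}\qquad E[B\mid A]\frac{E[A]}{E[B]} = E[B]\,\frac{E[A]}{E[B]} = E[A],$$
so both sides of $(1)$ equal $E[A]$.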

If $E[B]=0$, then the right side of $(1)$ involves a division by $0$, and so $(1)$ is meaningless; this is so whether or not $A$ and $B$ are independent.

In general, $(1)$ does not hold for dependent random variables, but specific examples of dependent $A$ and $B$ satisfying $(1)$ can be found. Note that we must continue to insist that $E[B]\neq 0$, else the right side of $(1)$ is meaningless. Bear in mind that $E[A\mid B]$ is a random variable that happens to be a function of the random variable $B$, say $g(B)$, while $E[B\mid A]$ is a random variable that is a function of the random variable $A$, say $h(A)$. So, $(1)$ is similar to asking whether

$$g(B)\stackrel{?}= h(A)\frac{E[A]}{E[B]} \tag 2$$
can be a true statement, and obviously the answer is that, in general, $g(B)$ cannot be a fixed multiple of $h(A)$: the left side is a function of $B$ alone, while the right side is a function of $A$ alone.
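
As a concrete illustration, here is a minimal numerical sketch of how $(2)$ fails for a dependent pair; the particular choice $B\sim\mathrm{Uniform}(1,2)$ and $A=B^2$ is mine, not from the original answer. Because $A$ is an invertible function of $B$ on $(1,2)$, both conditional expectations are known in closed form: $E[A\mid B]=B^2$ and $E[B\mid A]=\sqrt{A}=B$.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.uniform(1.0, 2.0, size=1_000_000)  # B ~ Uniform(1, 2)
A = B**2                                   # A is a deterministic function of B

EA, EB = A.mean(), B.mean()  # Monte Carlo estimates of E[A] = 7/3, E[B] = 3/2

lhs = B**2          # the random variable E[A|B] = g(B)
rhs = B * EA / EB   # the random variable E[B|A] * E[A] / E[B]

# If (1) held, lhs and rhs would agree pointwise; here the gap is large.
print("max |lhs - rhs| =", np.abs(lhs - rhs).max())  # about 0.89, not 0
```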

To my knowledge, there are only two special cases where $(1)$ can hold.

  • As noted above, for independent random variables $A$ and $B$, $g(B)$ and $h(A)$ are degenerate random variables (called constants by statistically-illiterate folks) that equal $E[A]$ and $E[B]$
    respectively, and so if $E[B]\neq 0$, we have equality in $(1)$.

  • At the other end of the spectrum from independence, suppose that
    $A=g(B)$ where $g(\cdot)$ is an invertible function and thus $A=g(B)$ and $B=g^{-1}(A)$ are wholly dependent random variables. In this case,
    $$E[A\mid B] = g(B), \quad E[B\mid A] = g^{-1}(A) = g^{-1}(g(B)) = B$$
    and so $(1)$ becomes
    $$g(B)\stackrel{?}= B\frac{E[A]}{E[B]}$$
    which holds exactly when $g(x) = \alpha x$ where $\alpha$ can be
    any nonzero real number. Thus, $(1)$ holds whenever $A$ is a scalar multiple of $B$, and of course $E[B]$ must be nonzero
    (cf. Michael Hardy’s answer). The above development
    shows that $g(x)$ must be a linear function and that
    $(1)$ cannot hold for affine functions $g(x) = \alpha x + \beta$ with $\beta \neq 0$. However, note that Alecos Papadopoulos in
    his answer and his comments thereafter claims that if $B$
    is a normal random variable with nonzero mean, then for specific
    values of $\alpha$ and $\beta\neq 0$ that he provides,
    $A=\alpha B+\beta$
    and $B$ satisfy $(1)$. In my opinion, his example is incorrect. (Both the scalar-multiple and the affine case are checked numerically in the sketch following this list.)
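
Here is a minimal numerical sketch of both halves of the second bullet; the distribution of $B$ and the values of $\alpha$ and $\beta$ below are illustrative choices of mine. For $A=\alpha B$ the two sides of $(1)$ coincide, while for $A=\alpha B+\beta$ with $\beta\neq 0$ the gap works out to $\beta(1-B/E[B])$, which vanishes only at $B=E[B]$.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(2.0, 1.0, size=1_000_000)  # E[B] = 2, nonzero
EB = B.mean()
alpha, beta = 3.0, 5.0

# Scalar multiple: A = alpha*B, so E[A|B] = alpha*B and E[B|A] = B.
A = alpha * B
print("scalar:", np.abs(alpha * B - B * A.mean() / EB).max())  # ~0

# Affine with beta != 0: E[A|B] = alpha*B + beta and E[B|A] = (A - beta)/alpha = B,
# so (1) fails as an identity between random variables.
A = alpha * B + beta
print("affine:", np.abs((alpha * B + beta) - B * A.mean() / EB).max())  # large
```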

In a comment on this answer, Huber has suggested considering the
symmetric conjectured equality
$$E[A\mid B]E[B] \stackrel{?}=E[B\mid A]E[A]\tag{3}$$
which of course always holds for independent random variables regardless of the values of
$E[A]$ and $E[B]$, and also for scalar multiples $A = \alpha B$. More trivially still, $(3)$
holds for any zero-mean random variables $A$ and $B$ (independent or dependent, scalar multiple or not; it does not
matter!): $E[A]=E[B]=0$ is sufficient for equality in $(3)$.
Thus, $(3)$ might not be as interesting as $(1)$ as a topic for discussion.
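
For completeness, a one-line verification of the scalar-multiple case of $(3)$: writing $A=\alpha B$ with $\alpha\neq 0$, we have $E[A\mid B]=\alpha B$ and $E[B\mid A]=A/\alpha=B$, so
$$E[A\mid B]\,E[B] = \alpha B\,E[B] = B\cdot\alpha E[B] = E[B\mid A]\,E[A],$$
and the two sides agree as random variables with no restriction on $E[B]$.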

Attribution
Source: Link, Question Author: tomka, Answer Author: Community
