Can we say anything about the dependence of a random variable and a function of a random variable? For example is $X^2$ dependent on $X$?

**Answer**

Here is a proof of @cardinal’s comment, with a small twist. If $X$ and $f(X)$ are independent, then

$$
\begin{aligned}
P(X \in A \cap f^{-1}(B)) &= P(X \in A, f(X) \in B) \\
&= P(X \in A) P(f(X) \in B) \\
&= P(X \in A) P(X \in f^{-1}(B)).
\end{aligned}
$$

Taking $A = f^{-1}(B)$ yields the equation

$$P(f(X) \in B) = P(f(X) \in B)^2,$$

which has the two solutions $0$ and $1$. Thus $P(f(X) \in B) \in \{0, 1\}$ for every $B$. In complete generality it is not possible to say more: if $X$ and $f(X)$ are independent, then $f(X)$ is a variable that, for any $B$, lies in either $B$ or $B^c$ with probability 1. To say more, one needs additional assumptions, e.g. that singleton sets $\{b\}$ are measurable.

However, measure-theoretic details do not seem to be the main concern of the OP. If $X$ is real-valued and $f$ is a real function (and we use the Borel $\sigma$-algebra, say), then taking $B = (-\infty, b]$ shows that the distribution function of $f(X)$ takes only the values 0 and 1. Hence there is a $b$ at which it jumps from 0 to 1, and $P(f(X) = b) = 1$.
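To make the degenerate case concrete, here is a small numerical sketch (an illustration, not part of the proof): if $f$ is constant, then $f(X)$ is independent of $X$ and $P(f(X) \in B)$ is 0 or 1 for every $B$, consistent with the argument above. The constant value 3 is an arbitrary choice for the illustration.

```python
# Degenerate case: a constant f makes f(X) independent of X, and
# P(f(X) in B) is 0 or 1 for every B, as the argument above requires.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=1_000_000)  # X ~ Uniform(-1, 1)
fx = np.full_like(x, 3.0)                   # constant f(x) = 3 (arbitrary choice)

in_A = x > 0.0    # event {X in A} with A = (0, 1]
in_B = fx > 2.0   # event {f(X) in B}; here P(f(X) in B) = 1

p_joint = np.mean(in_A & in_B)
p_product = np.mean(in_A) * np.mean(in_B)
print(p_joint, p_product)  # equal: the factorization holds for constant f
```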

At the end of the day, the answer to the OP’s question is that $X$ and $f(X)$ are generally dependent and only independent under very special circumstances. Moreover, the Dirac measure $\delta_{f(x)}$ always qualifies as a conditional distribution of $f(X)$ given $X = x$, which is a formal way of saying that if you know $X = x$, then you also know exactly what $f(X)$ is. This special form of dependence, with a degenerate conditional distribution, is characteristic of functions of random variables.
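Returning to the OP's example of $X^2$: the dependence is easy to verify numerically. The sketch below (an illustration, with arbitrarily chosen events $A$ and $B$) checks the factorization $P(X \in A, f(X) \in B) = P(X \in A)\,P(f(X) \in B)$ for $X \sim \mathrm{Uniform}(-1, 1)$ and $f(x) = x^2$, and finds that it fails.

```python
# Empirical check that X and X^2 are dependent for X ~ Uniform(-1, 1):
# the joint probability does not factor into the product of marginals.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000_000)

in_A = x > 0.5        # event {X in A} with A = (0.5, 1]
in_B = x**2 > 0.25    # event {X^2 in B} with B = (0.25, 1], i.e. |X| > 0.5

p_joint = np.mean(in_A & in_B)             # ~ 0.25, since A is contained in f^{-1}(B)
p_product = np.mean(in_A) * np.mean(in_B)  # ~ 0.25 * 0.5 = 0.125

print(p_joint, p_product)  # the two disagree, so X and X^2 are dependent
```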

**Attribution:** *Source: Link, Question Author: Rohit Banga, Answer Author: NRH*