I’ve worked with mutual information for some time, but I recently came across a measure from the “correlation world” that can also be used to measure distributional independence: the so-called “distance correlation” (also termed Brownian correlation): http://en.wikipedia.org/wiki/Brownian_covariance. I checked the papers where this measure is introduced, but found no mention of mutual information.

So, my questions are:

- Do they solve exactly the same problem? If not, how do the problems differ?
- And if the answer to the previous question is yes, what are the advantages of using one measure over the other?

**Answer**

Information / mutual information does not depend on the possible values of the variables; it depends only on their probabilities, and it is therefore less sensitive. Distance correlation is more powerful and simpler to compute. For a comparison, see
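To illustrate the claim that the sample statistic is simple to compute, here is a minimal NumPy sketch of distance correlation using the standard definition (pairwise distance matrices, double centering, then the V-statistic estimator); the function name and test data are my own, not from the answer:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation of two 1-D samples (V-statistic version)."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    # Pairwise absolute-difference (Euclidean) distance matrices.
    a = np.abs(x - x.T)
    b = np.abs(y - y.T)
    # Double-center each distance matrix: subtract row means and column
    # means, add back the grand mean.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    dcov2 = (A * B).mean()          # squared sample distance covariance
    dvar_x = (A * A).mean()         # squared sample distance variances
    dvar_y = (B * B).mean()
    if dvar_x * dvar_y == 0:
        return 0.0
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(distance_correlation(x, 2 * x))                 # near 1: linear dependence
print(distance_correlation(x, x ** 2))                # clearly nonzero: nonlinear dependence
print(distance_correlation(x, rng.normal(size=500)))  # near 0: independence
```

Unlike Pearson correlation, the statistic is zero (in the population) only under independence, which is why it detects the nonlinear `x ** 2` dependence above.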

**Attribution**
*Source: Link, Question Author: dsign, Answer Author: Gábor J. Székely*