I have a very basic doubt. Sorry if this irritates anyone. I know that the mutual information value should be at least 0, but should it be less than 1? Is it bounded by any upper value?

Thanks,

Amit.

**Answer**

Yes, it does have an upper bound, but not 1.

The mutual information (in bits) is 1 when two parties (statistically) share one bit of information. However, they can share arbitrarily many bits. In particular, if they share 2 bits, the mutual information is 2.

The mutual information is bounded from above by the Shannon entropies of the marginal distributions of the single parties, i.e. $I(X;Y) \leq \min \left[ H(X), H(Y) \right]$.
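As a quick numerical check (a minimal sketch, not part of the original answer; the function names are my own), one can compute $I(X;Y)$ from a joint probability table and verify both the "2 shared bits" example and the entropy bound:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint table P(x, y)."""
    px = joint.sum(axis=1)   # marginal of X
    py = joint.sum(axis=0)   # marginal of Y
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Two parties sharing exactly 2 bits: X = Y, uniform over 4 values.
joint = np.eye(4) / 4
i_xy = mutual_information(joint)
hx = entropy(joint.sum(axis=1))
hy = entropy(joint.sum(axis=0))
print(i_xy)                          # 2.0 -- exceeds 1, as claimed
print(i_xy <= min(hx, hy) + 1e-12)   # True -- the entropy bound holds
```

So the bound is attained here: $H(X) = H(Y) = 2$ bits and the variables are perfectly correlated, giving $I(X;Y) = 2$.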

**Attribution**
*Source: Link, Question Author: Amit, Answer Author: Peter Ellis*