The **mutual information** between two random variables X and Y is given by:

I(X,Y) = sum over x,y of ( P(x,y) * log( P(x,y) / (P(x) * P(y)) ) )

where P(X) and P(Y) are the marginal distributions of X and Y, and P(X,Y) is their joint distribution.

If the random variables X and Y are independent, then P(X,Y) = P(X) * P(Y), the log term vanishes, and I(X,Y) = 0.
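The definition above can be computed directly from a joint probability table. Below is a minimal NumPy sketch; the 2x2 table `P_xy` is an assumed example, not taken from the text:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) as a 2x2 table (assumed example).
P_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def mutual_information(P_xy):
    """I(X,Y) = sum over x,y of P(x,y) * log2( P(x,y) / (P(x)P(y)) ), in bits."""
    P_x = P_xy.sum(axis=1, keepdims=True)  # marginal P(X)
    P_y = P_xy.sum(axis=0, keepdims=True)  # marginal P(Y)
    mask = P_xy > 0                        # convention: 0 * log(0) = 0
    ratio = P_xy[mask] / (P_x * P_y)[mask]
    return float(np.sum(P_xy[mask] * np.log2(ratio)))

print(mutual_information(P_xy))  # positive: X and Y are dependent

# For an independent pair, P(x,y) = P(x) * P(y), so every log term is 0.
P_indep = np.outer([0.5, 0.5], [0.3, 0.7])
print(mutual_information(P_indep))  # 0.0
```

Using base-2 logarithms gives the result in bits; natural logarithms would give nats.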

The mutual information can also be expressed in terms of entropy:

I(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

where H(X|Y) is the conditional entropy of X given Y, and H(X) is the entropy of X, given by:

H(X) = -sum ( P(X) * log(P(X)) )
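The entropy identity can be checked numerically against the direct definition. A small sketch, reusing the same assumed 2x2 joint table and the chain rule H(X|Y) = H(X,Y) - H(Y):

```python
import numpy as np

# Same hypothetical joint table as before (assumed example).
P_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """H = -sum( p * log2(p) ); terms with p = 0 contribute 0."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

H_x  = entropy(P_xy.sum(axis=1))  # H(X), from the marginal of X
H_y  = entropy(P_xy.sum(axis=0))  # H(Y), from the marginal of Y
H_xy = entropy(P_xy.ravel())      # joint entropy H(X,Y)

# Chain rule: H(X|Y) = H(X,Y) - H(Y)
H_x_given_y = H_xy - H_y

I = H_x - H_x_given_y
print(round(I, 4))  # → 0.2781, matching the direct sum-over-cells definition
```

The symmetry of the identity also follows here: H(Y) - H(Y|X) reduces to the same expression H(X) + H(Y) - H(X,Y).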