Concept:
The mutual information of two random variables measures how much knowing one variable reduces our uncertainty about the other.
It is mathematically defined as:
I(X1; X2) = H(X1) – H(X1 | X2)
Application:
Since X1 and X2 are independent, knowing X2 gives no information about X1, so the conditional entropy equals the marginal entropy:
H(X1 | X2) = H(X1)
I(X1; X2) = H(X1) – H(X1)
= 0
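The derivation above can be checked numerically. The sketch below (a minimal illustration, not tied to any particular library) builds the joint distribution of two independent fair coins, computes H(X1) and H(X1 | X2) from their definitions, and confirms that the mutual information comes out to zero:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution of two independent fair coins: p(x1, x2) = 0.25 for all pairs.
joint = {(x1, x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

# Marginal distributions.
p_x1 = {x1: sum(p for (a, b), p in joint.items() if a == x1) for x1 in (0, 1)}
p_x2 = {x2: sum(p for (a, b), p in joint.items() if b == x2) for x2 in (0, 1)}

h_x1 = entropy(p_x1.values())  # H(X1) = 1 bit for a fair coin

# H(X1 | X2) = sum over x2 of p(x2) * H(X1 | X2 = x2)
h_x1_given_x2 = sum(
    p_x2[x2] * entropy([joint[(x1, x2)] / p_x2[x2] for x1 in (0, 1)])
    for x2 in (0, 1)
)

mi = h_x1 - h_x1_given_x2  # I(X1; X2) = H(X1) - H(X1 | X2)
print(mi)  # 0.0 — independence makes the mutual information vanish
```

Because the coins are independent, the conditional distribution of X1 given either value of X2 is still (0.5, 0.5), so H(X1 | X2) = H(X1) = 1 bit and the difference is exactly zero.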