Correct Answer - Option 2: (b) and (c)
Mutual information measures the amount of information that one random variable contains about another random variable.
The mutual information between two jointly distributed discrete random variables X and Y is given by:
\(I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}\)
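As a quick numerical illustration, here is a minimal Python sketch that evaluates this definition for a small, made-up 2×2 joint pmf (the values in p_xy are purely illustrative, not part of the original question):

```python
import numpy as np

# Purely illustrative 2x2 joint pmf p(x, y); rows index x, columns index y.
p_xy = np.array([[0.25, 0.25],
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

# I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
# All entries here are nonzero, so the log is well defined.
I_xy = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))
print(f"I(X;Y) = {I_xy:.4f} bits")   # ~0.0731 bits for this pmf
```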
In terms of entropy, the mutual information can be written as:
I(X; Y) = H(X) – H(X|Y) ---(1)
(Option (b) is correct)
Also, by the chain rule for entropy, the joint entropy can be expanded in two ways:
H(X, Y) = H(Y|X) + H(X)
H(X, Y) = H(X|Y) + H(Y)
From the second of these equations, we can write:
H(X|Y) = H(X, Y) – H(Y) ---(2)
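Equation (2) can also be checked numerically. In the sketch below (reusing the same illustrative pmf), H(X|Y) is computed directly from its definition, \(H(X|Y) = -\sum_{x,y} p(x,y)\log p(x|y)\), and compared against H(X, Y) – H(Y):

```python
import numpy as np

# Same illustrative joint pmf as in the first sketch.
p_xy = np.array([[0.25, 0.25],
                 [0.10, 0.40]])
p_y = p_xy.sum(axis=0)                 # marginal p(y)

def H(p):
    """Shannon entropy in bits of a pmf of any shape."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H(X|Y) computed directly: each column of p_xy divided by p(y) gives p(x|y).
p_x_given_y = p_xy / p_y
H_x_given_y = -np.sum(p_xy * np.log2(p_x_given_y))

# Equation (2): H(X|Y) = H(X, Y) - H(Y)
print(np.isclose(H_x_given_y, H(p_xy) - H(p_y)))   # True
```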
Using Equations (1) and (2), we can write:
I(X; Y) = H(X) – [H(X, Y) – H(Y)]
I(X; Y) = H(X) + H(Y) – H(X, Y) ---(3)
(Option (c) is correct)
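Finally, a short sketch (same made-up pmf as above) confirming that the definitional value of I(X; Y) matches both Equation (1) and Equation (3):

```python
import numpy as np

# Same illustrative joint pmf as above.
p_xy = np.array([[0.25, 0.25],
                 [0.10, 0.40]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

def H(p):
    """Shannon entropy in bits of a pmf of any shape."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

I_def = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))  # I(X;Y) from the definition
H_x_given_y = H(p_xy) - H(p_y)                             # via Equation (2)

print(np.isclose(I_def, H(p_x) - H_x_given_y))             # Equation (1): True
print(np.isclose(I_def, H(p_x) + H(p_y) - H(p_xy)))        # Equation (3): True
```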