Correct Answer - Option 2 : S1 is less than S2
Concept:
Entropy (average information) of a source, H(X):
\(H(X) = \sum_{i=1}^{m} p(x_i)\,\log_2\frac{1}{p(x_i)} = \log_2 m\)
[for m equiprobable symbols]
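A minimal Python sketch of this formula (the non-uniform probability values at the end are illustrative assumptions, not part of the question):

```python
import math

def entropy(probs):
    """H(X) = sum of p(x_i) * log2(1/p(x_i)), in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# For m equiprobable symbols the formula reduces to log2(m):
m = 4
print(entropy([1.0 / m] * m))  # 2.0
print(math.log2(m))            # 2.0

# For a non-uniform PMF the full sum is needed:
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
```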
Calculation:
Case-I: For source S1;
Number of symbols: m = 4
H(X) = log2 m
= 2 bits/symbol
Case-II: For source S2;
Number of symbols: m = 16
H(X) = log2 m
= 4 bits/symbol
∴ The entropy of S1 is less than the entropy of S2.
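A quick numeric check of both cases (both sources are equiprobable, so H(X) = log2 m directly):

```python
import math

# Equiprobable sources, so H(X) = log2(m):
print(math.log2(4))   # S1: 2.0 bits/symbol
print(math.log2(16))  # S2: 4.0 bits/symbol
```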
1) Entropy for a continuous random variable is known as differential entropy (a numerical sketch follows below the list):
\(h(X) = \int_{-\infty}^{\infty} f_X(x)\,\log_2\frac{1}{f_X(x)}\,dx\)
fX(x) ⇒ PDF of RV ‘X’
2) Entropy is a measure of uncertainty.
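To illustrate point 1, a minimal sketch that approximates the differential entropy integral by a Riemann sum; the choice of a standard Gaussian PDF (σ = 1) is an illustrative assumption, and its closed-form differential entropy is ½·log2(2πe):

```python
import math

def f(x):
    """Standard Gaussian PDF (sigma = 1; an illustrative assumption)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Riemann-sum approximation of h(X) = ∫ f(x) log2(1/f(x)) dx over [-10, 10]
dx = 0.001
h = sum(f(i * dx - 10.0) * math.log2(1.0 / f(i * dx - 10.0)) * dx
        for i in range(20_000))

print(h)                                      # ≈ 2.047 bits
print(0.5 * math.log2(2 * math.pi * math.e))  # closed form ≈ 2.047 bits
```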