Correct Answer - Option 2 : log2 K
Entropy is the average information content per symbol in a group of symbols.
For a source with M symbols having probabilities P1, P2, …, PM, the entropy (H) is given by:
\(H = -\sum_{i=1}^{M} P_i \log_2\left(P_i\right)\;\text{bits/symbol}\)
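As a quick illustration of this formula, here is a minimal Python sketch (the function name `entropy` and the example distribution are ours, chosen only for demonstration):

```python
import math

def entropy(probs):
    """Entropy H = -sum(P_i * log2(P_i)) in bits/symbol.
    Terms with P_i = 0 contribute nothing (p * log p -> 0 as p -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: a 4-symbol source with unequal probabilities
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
```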
For a discrete memoryless source with K equiprobable symbols, the entropy is maximum and is given by:
Hmax = log2 K bits/symbol
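For instance, with K = 4 equiprobable symbols (Pi = 1/4 each), the general formula reduces to Hmax = log2 4 = 2 bits/symbol. A self-contained sketch verifying this (the value K = 4 is an assumed example):

```python
import math

K = 4
probs = [1 / K] * K  # equiprobable source: each P_i = 1/K
H = -sum(p * math.log2(p) for p in probs)
print(H, math.log2(K))  # both print 2.0: H equals Hmax = log2(K)
```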
Note:
Range of Entropy is 0 ≤ H ≤ log2 M; H = 0 when one symbol occurs with probability 1, and H = log2 M when all M symbols are equiprobable.
If the symbol rate is rs symbols/sec, the source information rate is:
R = rs H bits/sec
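A quick numeric check of the rate formula, using the equiprobable 4-symbol source above and an assumed symbol rate of rs = 1000 symbols/sec:

```python
import math

K = 4
H = math.log2(K)   # 2.0 bits/symbol for an equiprobable 4-symbol source
rs = 1000          # assumed symbol rate in symbols/sec
R = rs * H         # information rate R = rs * H
print(R)           # 2000.0 bits/sec
```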