Concept:
The information associated with an event is “inversely” related to its probability of occurrence: an event with probability \(P_i\) carries \(I_i = \log_2(1/P_i)\) bits, so rarer events carry more information.
Entropy: The average amount of information per symbol is called the “entropy”:
\(H = \sum\limits_i P_i \log_2\!\left(\frac{1}{P_i}\right)\ \text{bits/symbol}\)
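The formula maps directly to code. A minimal sketch in Python (the function name `entropy` is mine, not from the source):

```python
import math

def entropy(probs):
    """Average information in bits/symbol: H = sum of P_i * log2(1/P_i)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)  # skip zero-probability symbols
```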
Calculation:
Given symbols a1, a2, a3, a4 with probabilities 1/2, 1/4, 1/8, 1/8:
\(H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{8}\log_2 8\)
H = 0.5 + 0.5 + 0.375 + 0.375
H = 1.75 bits/symbol
By Shannon's source coding theorem, the minimum achievable average codeword length for this source is 1.75 bits/symbol.
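Because every probability here is a power of 2, a prefix code whose codeword lengths equal \(\log_2(1/P_i)\) meets this bound exactly. A sketch checking this (the codeword assignment below is one valid Huffman-style code, chosen for illustration):

```python
probs = {"a1": 1/2, "a2": 1/4, "a3": 1/8, "a4": 1/8}
code  = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}  # prefix-free: no codeword is a prefix of another

# Average codeword length = sum of P_i * len(codeword_i)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
print(avg_len)  # 1.75 -- matches the entropy exactly
```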