__Concept:__

The entropy of a source, which is also its average information content, is given by:

\(H = \sum\limits_i P\left( {{x_i}} \right){\log _2}\left( {\frac{1}{{P\left( {{x_i}} \right)}}} \right)\)

where \(P(x_i)\) is the probability of the \(i\)-th symbol.

__Application__:

We are given \(P_1 = P_4 = 0.125\).

The probabilities must sum to one: \(P_1 + P_2 + P_3 + P_4 = 1\)

Since \(P_2 = P_3\), the above equation becomes:

\(0.125 + 2{P_2} + 0.125 = 1\)

\(2P_2 = 0.75\)

\(P_2 = P_3 = 0.375\)

Average information will be:

\(H = -P_1\log_2 P_1 - P_2\log_2 P_2 - P_3\log_2 P_3 - P_4\log_2 P_4\)

\(H = -0.125\log_2 0.125 - 0.375\log_2 0.375 - 0.375\log_2 0.375 - 0.125\log_2 0.125\)

\(H = 0.375 + 0.531 + 0.531 + 0.375\)

\(H = 1.812\) bits/symbol
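The entropy calculation above can be checked numerically. A minimal sketch, using the four probabilities derived above (note the exact value is 1.811 bits/symbol; 1.812 comes from rounding each term to three decimals before summing):

```python
import math

# Symbol probabilities: P1 = P4 = 0.125, P2 = P3 = 0.375
probs = [0.125, 0.375, 0.375, 0.125]
assert abs(sum(probs) - 1.0) < 1e-12  # must form a valid distribution

# Entropy H = sum P(x) * log2(1 / P(x))
H = sum(p * math.log2(1 / p) for p in probs)
print(round(H, 3))  # → 1.811
```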

The sampling rate is the Nyquist rate, i.e. twice the 100 Hz message bandwidth:

\(r = 2 \times 100 = 200\) samples/sec

Thus, the information rate will be:

\(R = r \times H = 200 \times 1.812 = 362.4\) bits/sec
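The full chain from probabilities to information rate can be sketched end to end. Assuming the 100 Hz bandwidth implied by the Nyquist calculation above (the unrounded result is ≈362.3 bits/sec; 362.4 follows from carrying 1.812):

```python
import math

f_m = 100        # message bandwidth in Hz (assumed from r = 2 * 100 above)
r = 2 * f_m      # Nyquist sampling rate: 200 samples/sec

# Entropy of the four-symbol source, in bits/symbol
probs = [0.125, 0.375, 0.375, 0.125]
H = sum(p * math.log2(1 / p) for p in probs)

R = r * H        # information rate = samples/sec * bits/symbol
print(round(R, 1))  # → 362.3
```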