Correct Answer - Option 3 : 14400 bps
Concept:
The entropy of a probability distribution is the average amount of information obtained per symbol when drawing from that distribution.
It is calculated as:
\(H=\sum_{i=1}^{n} p_i \log_2\left(\frac{1}{p_i}\right)\ \text{bits/symbol}\)
where \(p_i\) is the probability of occurrence of the i-th symbol.
Also, the information rate of the source is given by:
\(R = H f_s\) bits/sec
where \(f_s\) is the sampling frequency in samples/sec.
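As a quick illustration of both formulas, here is a minimal Python sketch (the helper names `entropy` and `information_rate` are just for illustration, not part of the original solution):

```python
import math

def entropy(probs):
    """Entropy H = sum(p * log2(1/p)) in bits/symbol.

    Zero-probability symbols contribute nothing, since p*log2(1/p) -> 0 as p -> 0.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def information_rate(probs, fs):
    """Information rate R = H * fs in bits/sec, given sampling rate fs."""
    return entropy(probs) * fs
```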
Calculation:
Given: the bandwidth of the analog signal is \(f_m\) = 4 kHz.
Since the signal is sampled at the Nyquist rate, the sampling frequency is:
\(f_s = 2f_m = 2 \times 4\ \text{kHz} = 8\ \text{k samples/sec}\)
The entropy of the source will be:
\(H = \sum_{i=1}^{4} p_i \log_2 \frac{1}{p_i}\)
\(= p_1 \log_2 \frac{1}{p_1} + p_2 \log_2 \frac{1}{p_2} + p_3 \log_2 \frac{1}{p_3} + p_4 \log_2 \frac{1}{p_4}\)
\(= \frac{1}{8}\log_2 8 + \frac{3}{8}\log_2 \frac{8}{3} + \frac{3}{8}\log_2 \frac{8}{3} + \frac{1}{8}\log_2 8\)
= 1.811 bits / sample ≈ 1.8 bits / sample
Therefore, the information rate of the source is:
\(R = H f_s \approx 1.8 \times (8 \times 10^3) = 14400\ \text{bps}\)
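As a standalone numerical cross-check (the exact entropy is 1.811 bits/sample, so the exact rate is about 14.49 kbps; the option value 14400 bps corresponds to rounding H to 1.8):

```python
import math

# Given: symbol probabilities (1/8, 3/8, 3/8, 1/8), bandwidth fm = 4 kHz.
fm = 4e3                        # analog bandwidth, Hz
fs = 2 * fm                     # Nyquist rate: 8000 samples/sec
probs = [1/8, 3/8, 3/8, 1/8]

H = sum(p * math.log2(1 / p) for p in probs)   # entropy, bits/sample
R = H * fs                                     # information rate, bits/sec

print(f"fs = {fs:.0f} samples/sec")
print(f"H  = {H:.3f} bits/sample")   # 1.811
print(f"R  = {R:.0f} bps")           # 14490; ~14400 bps with H rounded to 1.8
```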