Consider a discrete memoryless source with alphabet S = {s0, s1, s2, s3, s4, …} and respective probabilities of occurrence P = {1/2, 1/4, 1/8, 1/16, 1/32, …}. The entropy of the source (in bits) is _______.

1 Answer

Best answer

Concept: The entropy of a source is its average information content per symbol.
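In summation form, for a source with symbol probabilities \(p_i\):

\(H(S) = -\sum_i p_i \log_2 p_i \ \text{bits/symbol}\)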

Calculation: The entropy of the source is:

\(H(S) = -p_0 \log_2 p_0 - p_1 \log_2 p_1 - p_2 \log_2 p_2 - \ldots\)

With \(p_k = \frac{1}{2^{k+1}}\) for \(k = 0, 1, 2, \ldots\), each term is \(-\frac{1}{2^{k+1}} \log_2 \frac{1}{2^{k+1}} = \frac{k+1}{2^{k+1}}\), so:

\(\Rightarrow H(S) = \frac{1}{2} × 1 + \frac{1}{4} × 2 + \frac{1}{8} × 3 + \frac{1}{16} × 4 + \ldots\)

\(\Rightarrow H(S) = \frac{1}{2}\left[1 + 2 × \frac{1}{2} + 3 × \frac{1}{2^2} + 4 × \frac{1}{2^3} + \ldots\right]\)

Now, the expansion of

\((1 - x)^{-2} = 1 + 2x + 3x^2 + 4x^3 + \ldots\)
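This expansion follows from differentiating the geometric series \(\frac{1}{1-x} = 1 + x + x^2 + x^3 + \ldots\) term by term (valid for \(|x| < 1\)):

\(\frac{d}{dx}\left(\frac{1}{1-x}\right) = \frac{1}{(1-x)^2} = 1 + 2x + 3x^2 + 4x^3 + \ldots\)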

Comparing with

\(1 + 2 × \frac{1}{2} + 3 × \frac{1}{2^2} + 4 × \frac{1}{2^3} + \ldots\)

we see \(x = \frac{1}{2}\).

Thus, the bracketed sum is:

\(\left(1 - \frac{1}{2}\right)^{-2} = \left(\frac{1}{2}\right)^{-2} = 2^2 = 4\)

Now,

H(S) = (1/2) × 4 = 2 bits/symbol
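As a quick numerical check (a sketch, not part of the original answer), the series can be summed directly in Python by truncating the infinite tail; with \(p_k = 2^{-k}\) the terms \(k/2^k\) shrink rapidly, so a few dozen terms suffice:

```python
# Numerical check: with p_k = 2^-k for k = 1, 2, 3, ...,
# each term -p_k * log2(p_k) equals k / 2^k.
from math import log2

p = [2.0 ** -k for k in range(1, 60)]   # 1/2, 1/4, 1/8, ... (truncated)
H = -sum(pk * log2(pk) for pk in p)     # entropy: -sum p * log2(p)
print(H)                                # ~2.0 bits/symbol
```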
