For a discrete memoryless source containing K symbols, the upper bound for the entropy is
1. 1/K
2. log2 K
3. log10 K
4. log2 (1 - K)

1 Answer

Best answer
Correct Answer - Option 2: log2 K

Entropy is the average information content per symbol emitted by a source.

For M symbols with probabilities P1, P2, …, PM, the entropy (H) is given by:

\(H = - \sum_{i = 1}^{M} P_i \log_2\left( P_i \right)\;\text{bits/symbol}\)
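
As a quick numeric illustration (a minimal sketch in Python; the 4-symbol distribution below is a made-up example, not part of the question):

import math

def entropy(probs):
    # H = -sum(Pi * log2(Pi)); terms with Pi = 0 contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol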

For a discrete memoryless source containing K symbols with equal probability of occurrence (Pi = 1/K for every i), the entropy is maximum. Substituting Pi = 1/K into the formula above gives:

Hmax = -K · (1/K) log2(1/K) = log2 K bits/symbol
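
A quick check of this maximum (a sketch with an assumed K = 8):

import math

K = 8
uniform = [1 / K] * K
H = -sum(p * math.log2(p) for p in uniform)
print(H, math.log2(K))  # both print 3.0: equiprobable symbols reach Hmax = log2 K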

Note:

The range of entropy is 0 ≤ H ≤ log2 M.

If the symbol rate is rs symbols/sec, the source information rate is:

R = rs H bits/sec
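
For instance (a sketch reusing the same made-up 4-symbol source and an assumed rate rs = 2000 symbols/sec):

import math

probs = [0.5, 0.25, 0.125, 0.125]          # made-up example distribution
H = -sum(p * math.log2(p) for p in probs)  # 1.75 bits/symbol
rs = 2000                                  # assumed symbol rate, symbols/sec
print(rs * H)                              # R = 3500.0 bits/sec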

