Discrete source S1 has 4 equiprobable symbols while discrete source S2 has 16 equiprobable symbols. When the entropy of these two sources is compared, the entropy of:


1. S1 is greater than S2
2. S1 is less than S2
3. S1 is equal to S2
4. Depends on rate of symbols/second

1 Answer

Best answer
Correct Answer - Option 2: S1 is less than S2

Concept:

Entropy H(X) of a discrete source:

\(H(X) = \sum_{i = 1}^{m} p(x_i)\,\log_2 \frac{1}{p(x_i)} = \log_2 m\)

[for a source with ‘m’ equiprobable symbols]

Calculation:

Case-I: For source S1:

Number of symbols: m = 4

H(X) = log2 m = log2 4

= 2 bits/symbol

Case-II: For source S2:

Number of symbols: m = 16

H(X) = log2 m = log2 16

= 4 bits/symbol

Hence, the entropy of S1 is less than the entropy of S2.
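As a quick cross-check, here is a minimal Python sketch (the `entropy` helper is illustrative, not part of the original solution) that evaluates the general formula and confirms it reduces to log2 m for equiprobable symbols:

import math

def entropy(probs):
    # Shannon entropy: H(X) = sum of p(x_i) * log2(1 / p(x_i)), in bits/symbol
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Source S1: m = 4 equiprobable symbols -> H = log2(4) = 2 bits/symbol
print(entropy([1 / 4] * 4))    # 2.0

# Source S2: m = 16 equiprobable symbols -> H = log2(16) = 4 bits/symbol
print(entropy([1 / 16] * 16))  # 4.0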

1) The entropy of a continuous random variable is known as differential entropy (a numerical sketch follows after these notes):

\(H(X) = \int_{-\infty}^{\infty} f_X(x)\,\log_2 \frac{1}{f_X(x)}\,dx\)

where fX(x) is the PDF of the RV ‘X’.

2) Entropy is a measure of uncertainty.
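As an illustrative sketch of differential entropy (the uniform PDF on [0, a] and the step size are assumptions chosen for this example, not from the answer): for a uniform RV on [0, a], fX(x) = 1/a, so the integral evaluates to log2(a), and a crude Riemann sum agrees:

import numpy as np

# Differential entropy of a uniform RV on [0, a]:
# f_X(x) = 1/a on [0, a], so H(X) = integral of f * log2(1/f) dx = log2(a)
a = 8.0
dx = 1e-4
x = np.arange(0.0, a, dx)
f = np.full_like(x, 1.0 / a)           # uniform PDF
h = np.sum(f * np.log2(1.0 / f)) * dx  # Riemann-sum approximation of the integral
print(round(h, 4), np.log2(a))         # both approximately 3.0 bits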
