
Let X1 and X2 be independent random variables. X1 has mean 0 and variance 1, while X2 has mean 1 and variance 4. The mutual information I(X1; X2) between X1 and X2 in bits is _______.

1 Answer

Best answer

Concept:

The mutual information of two random variables measures how much information one random variable conveys about the other.

It is mathematically defined as:

I(X1; X2) = H(X1) − H(X1 | X2)

Application:

Since X1 and X2 are independent, observing X2 gives no information about X1, so the conditional entropy equals the marginal entropy:

H(X1 | X2) = H(X1)

Therefore:

I(X1; X2) = H(X1) − H(X1) = 0 bits

Note that the given means and variances are irrelevant here; independence alone forces the mutual information to be zero.
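As a numerical sanity check, the result can be verified empirically. The sketch below assumes the variables are Gaussian (the question only specifies means and variances, so this is an added assumption) and uses the closed form I(X1; X2) = −½ log2(1 − ρ²) bits, which holds for jointly Gaussian pairs; for independent samples the estimated correlation ρ is near zero, so the estimated mutual information is near zero:

```python
import numpy as np

# Draw independent samples matching the stated moments:
# X1 ~ mean 0, variance 1; X2 ~ mean 1, variance 4 (std dev 2).
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(1.0, 2.0, n)

# For jointly Gaussian variables, I(X1; X2) = -1/2 * log2(1 - rho^2) bits.
# Independence implies rho = 0, hence I = 0 bits; the sample estimate
# differs from 0 only by finite-sample noise of order 1/sqrt(n).
rho = np.corrcoef(x1, x2)[0, 1]
mi_bits = -0.5 * np.log2(1.0 - rho**2)
print(mi_bits)
```

With a million samples the printed estimate is vanishingly small, consistent with the exact answer of 0 bits.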
