Mutual Information from this Gaussian Distribution

Summary
To calculate mutual information from a given Gaussian distribution, the relevant quantities are the joint distribution and the marginal distributions of the two variables. The poster references two sources for the necessary equations; the main difficulty is connecting them to derive the mutual information for the specific Gaussian given. The key step is relating the covariance matrix to the joint and marginal entropies, after which the mutual information follows directly.
Arman777
Homework Statement: Calculating mutual information from a Gaussian
Relevant Equations: Statistics equations
Suppose we are given a Gaussian distribution of the form

$$p(x,y) \propto \exp\left(-\frac{1}{2}x^2 - \frac{1}{2}by^2 - cxy\right).$$

What equations do I need to use to obtain the mutual information?
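One way to see the connection: reading off the quadratic form, the precision (inverse covariance) matrix is ##\begin{pmatrix}1 & c\\ c & b\end{pmatrix}##, and for jointly Gaussian variables ##I(X;Y) = H(X) + H(Y) - H(X,Y) = \tfrac{1}{2}\ln\frac{\Sigma_{xx}\Sigma_{yy}}{\det\Sigma}##. A minimal numerical sketch (the values of ##b## and ##c## are arbitrary illustrative choices, with ##b > c^2## so the density is normalizable):

```python
import numpy as np

# Illustrative coefficients for p(x, y) ∝ exp(-x²/2 - b·y²/2 - c·x·y);
# b > c² is required for the quadratic form to be positive definite.
b, c = 2.0, 0.5

# Precision (inverse covariance) matrix read off from the exponent.
precision = np.array([[1.0, c],
                      [c,   b]])
cov = np.linalg.inv(precision)

# I(X;Y) = H(X) + H(Y) - H(X,Y); each Gaussian entropy is
# (1/2) ln((2πe)^k det Σ), so the (2πe) factors cancel and
# only the covariance determinants remain.
mi = 0.5 * np.log(cov[0, 0] * cov[1, 1] / np.linalg.det(cov))

# Equivalent closed form: ρ² = c²/b, so I = -(1/2) ln(1 - c²/b).
mi_closed = -0.5 * np.log(1.0 - c**2 / b)

print(mi, mi_closed)  # the two values agree
```

This reproduces the standard result that for a bivariate Gaussian the mutual information depends only on the correlation coefficient, ##I = -\tfrac{1}{2}\ln(1-\rho^2)##.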
 