Mutual Information from this Gaussian Distribution

Summary:
To calculate mutual information from a given Gaussian distribution, the relevant equations involve the joint distribution and the marginal distributions of the variables. The main challenge is connecting these equations to derive the mutual information for the specific Gaussian presented. Understanding the relationship between the covariance matrix and the joint entropy is the key step in this calculation.
Arman777 (Insights Author, Gold Member)
Homework Statement
Calculating Mutual Information from Gaussian
Relevant Equations
Statistics Equations
Let us suppose we are given a Gaussian Distribution in the form of

$$p(x,y) \propto \exp\!\left(-\frac{1}{2}x^2 - \frac{1}{2}by^2 - cxy\right)$$ What equations do I need to use to obtain the mutual information?
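One standard route (a sketch, not necessarily the approach the thread settles on): write the exponent as a quadratic form ##-\frac{1}{2}\mathbf{z}^T P \mathbf{z}## with precision matrix ##P = \begin{pmatrix}1 & c\\ c & b\end{pmatrix}## (which requires ##b > c^2## for normalizability), invert it to get the covariance ##\Sigma = P^{-1}##, and use the Gaussian mutual-information formula ##I(X;Y) = \frac{1}{2}\ln\frac{\sigma_x^2 \sigma_y^2}{\det\Sigma} = -\frac{1}{2}\ln(1-\rho^2)##. The function name below is just for illustration:

```python
import numpy as np

def gaussian_mutual_information(b, c):
    """Mutual information of the bivariate Gaussian with exponent
    -1/2 x^2 - 1/2 b y^2 - c x y, assuming b > c^2."""
    P = np.array([[1.0, c], [c, b]])   # precision matrix from the quadratic form
    Sigma = np.linalg.inv(P)           # covariance matrix
    # I(X;Y) = 1/2 * ln( Var(X) * Var(Y) / det(Sigma) )
    return 0.5 * np.log(Sigma[0, 0] * Sigma[1, 1] / np.linalg.det(Sigma))
```

For this particular form one can check analytically that ##\rho^2 = c^2/b##, so the result reduces to ##I(X;Y) = \frac{1}{2}\ln\frac{b}{b - c^2}##; e.g. ##b=2, c=1## gives ##\frac{1}{2}\ln 2##.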
 