Mutual information of a noisy function

In summary, the conversation concerns a random variable Y defined as a linear function of x plus a noise term. It is pointed out that x is not a random variable but an independent (deterministic) variable. The concept of mutual information between x and Y is brought up, but it is argued that no mutual information exists between them because x is not a random variable. The conversation also touches on assigning a joint probability to x and Y, but the conclusion is that this is not possible.
  • #1
joshthekid
Suppose I have a random variable Y that is defined as follows:
$$Y=\alpha x+ \mathcal{N}(\mu,\sigma^2) \text{ where } x \in \mathbb{R}$$
and
$$\mathcal{N}(\mu,\sigma^2)\text{ is an independent, normally distributed random variable with mean }\mu\text{ and variance }\sigma^2.$$
So I know Y is a random variable, but x is not; however, it seems to me that there is a probability measure
$$P(x,Y) \in [0,1].$$
Therefore, the mutual information is
$$I(x;Y)=H(x)-H(x|Y)=H(Y)-H(Y|x)$$
However, it seems that
$$H(x)=-\int p(x)\ln p(x)\,dx$$
is not defined because x is not a random variable. So is there really any mutual information between x and Y? Is p(x,Y) an actual joint probability distribution? Any insight would be awesome, thanks.
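For reference, and only under the added assumption that x is itself modeled as a random variable with some distribution (the point the replies below turn on), the mutual information does become well defined. For instance, taking the hypothetical input distribution ##x \sim \mathcal{N}(0,\sigma_x^2)##, independent of the noise, the standard Gaussian-channel calculation gives
$$I(x;Y)=H(Y)-H(Y|x)=\frac{1}{2}\ln\!\left(1+\frac{\alpha^{2}\sigma_x^{2}}{\sigma^{2}}\right),$$
since ##H(Y|x)=\tfrac{1}{2}\ln(2\pi e\sigma^{2})## does not depend on the particular value of x. Without a distribution on x, only the conditional quantity ##H(Y|x=x_0)## for each fixed ##x_0## is defined.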
 
  • #3
Do you know the value of ##x## (is ##x## a constant?), or do you want to estimate it from ##Y##?
 
  • #4
EngWiPy said:
Do you know the value of ##x## (is ##x## a constant?), or do you want to estimate it from ##Y##?
x is not constant; it is the independent variable in this case. Without the noise term, this would just be the simple linear function $$Y=\alpha x.$$
 
  • #5
Fine, but what does it represent in reality? When you observe ##Y##, do you know what ##x## is?
 
  • #6
EngWiPy said:
Fine, but what does it represent in reality? When you observe ##Y##, do you know what ##x## is?
Yes. Ultimately this is a regression problem. I have observed values of x and observed values of Y, and I want to know the mutual information between them, with the knowledge that Y is a linear function of x. For example, let's say I send a signal x that is received by a receiver that multiplies it by a constant, but some unknown noise is added between transmission and reception. I want to know how much of Y can be explained by x.
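A minimal sketch of that regression viewpoint (an illustration, not part of the thread): if the observed x values are treated as draws from an input distribution and the noise is Gaussian with known variance, the mutual information has the closed form ##\tfrac{1}{2}\ln(1+\alpha^2\mathrm{Var}(x)/\sigma^2)##. The parameter values, sample size, and variable names below are assumptions made for the example.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the thread)
alpha, mu, sigma = 2.0, 0.0, 1.0
rng = np.random.default_rng(0)

# Treat the observed x's as draws from some input distribution
x = rng.normal(0.0, 3.0, size=100_000)               # hypothetical observed inputs
y = alpha * x + rng.normal(mu, sigma, size=x.size)   # Y = alpha*x + Gaussian noise

# Closed form for Gaussian input and Gaussian noise (in nats):
# I(X;Y) = 0.5 * ln(1 + alpha^2 * Var(X) / sigma^2)
mi_nats = 0.5 * np.log(1.0 + alpha**2 * x.var() / sigma**2)

# "How much of Y can be explained by x": fraction of Var(Y) due to the signal term
explained = alpha**2 * x.var() / y.var()

print(f"I(X;Y) ~ {mi_nats:.3f} nats, explained-variance fraction ~ {explained:.3f}")
```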
 
  • #7
I suspect that ##x## is unknown at the time of observing ##Y##, which makes it random. Say you have two signals ##x_1## and ##x_2##, and you transmitted ##x_1##. You receive ##y=\alpha\,x_1+n##, where ##n## is the noise. ##x_1## is a number at the transmitter, but at the receiver it is random (it could be ##x_1## or ##x_2##) because it is unknown.
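A small numerical sketch of that receiver-side picture (illustrative only): with two equiprobable signal values and additive Gaussian noise, ##I(X;Y)## can be estimated by Monte Carlo as the average of ##\log_2\big(p(y|x)/p(y)\big)##. The signal values and noise level below are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
alpha, sigma = 1.0, 1.0      # illustrative assumptions
x1, x2 = -1.0, 1.0           # two equiprobable signal values

n = 200_000
x = rng.choice([x1, x2], size=n)                  # transmitted symbols
y = alpha * x + rng.normal(0.0, sigma, size=n)    # received samples

# I(X;Y) = E[ log2 p(y|x) - log2 p(y) ], with p(y) = 0.5*p(y|x1) + 0.5*p(y|x2)
p_y_given_x = norm.pdf(y, loc=alpha * x, scale=sigma)
p_y = 0.5 * (norm.pdf(y, loc=alpha * x1, scale=sigma)
             + norm.pdf(y, loc=alpha * x2, scale=sigma))
mi_bits = np.mean(np.log2(p_y_given_x / p_y))

print(f"I(X;Y) ~ {mi_bits:.3f} bits (at most 1 bit for a binary input)")
```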
 
  • #8
There is no joint probability function that you can assign to the pair (x, Y). Since x is not a random variable, there is no way to assign joint probabilities.

You could reverse the roles of x and Y and say that ##X = (y - \mathcal{N}(\mu,\,\sigma^{2}))/\alpha##.
In that case, X would be a random variable, y would not be a random variable, and there would still not be a joint probability.
 

1. What is mutual information of a noisy function?

Mutual information of a noisy function is a measure of the amount of information that is shared between two variables, where one variable is a noisy version of the other. It is a quantification of the dependence between these two variables, taking into account both the noise and the underlying relationship between them.

2. How is mutual information of a noisy function calculated?

Mutual information of a noisy function can be calculated using a mathematical formula that takes into account the joint distribution of the two variables and their individual distributions. This formula is often represented as MI(X,Y) = H(X) + H(Y) - H(X,Y), where H(X) and H(Y) are the entropies of the two variables and H(X,Y) is the joint entropy.
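A minimal sketch of that identity using a simple histogram (binned) estimator; the data-generating model, bin count, and use of natural logarithms are illustrative assumptions, and binned estimates are sensitive to the choice of bins:

```python
import numpy as np

def hist_mutual_information(x, y, bins=64):
    """Estimate I(X;Y) = H(X) + H(Y) - H(X,Y) in nats from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()        # joint distribution over bins
    p_x = p_xy.sum(axis=1)            # marginal of X
    p_y = p_xy.sum(axis=0)            # marginal of Y

    def entropy(p):
        p = p[p > 0]                  # skip empty bins (0 * log 0 = 0)
        return -np.sum(p * np.log(p))

    return entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

# Illustrative data: a noisy linear relationship
rng = np.random.default_rng(2)
x = rng.normal(0.0, 3.0, size=50_000)
y = 2.0 * x + rng.normal(0.0, 1.0, size=x.size)

print(f"estimated I(X;Y) ~ {hist_mutual_information(x, y):.2f} nats")
```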

3. What does a high mutual information of a noisy function indicate?

A high mutual information of a noisy function indicates a strong dependence between the two variables: they are closely related or influence each other, even though noise is present. Equivalently, knowing one variable substantially reduces the uncertainty about the other, despite the noise.

4. How is mutual information of a noisy function useful in scientific research?

Mutual information of a noisy function is useful in scientific research because it provides a way to quantify the dependence between two variables, even in the presence of noise. This can help researchers understand the strength and nature of the relationship between variables and can be used in various fields such as neuroscience, signal processing, and machine learning.

5. Are there any limitations to using mutual information of a noisy function?

Yes, there are some limitations to using mutual information of a noisy function. Although it captures non-linear as well as linear dependence (unlike simple correlation), it does not indicate the direction or the functional form of the relationship, only the overall dependence. It can also be difficult to estimate reliably from limited or noisy data, so it may not be suitable for all types of data and may require careful consideration when interpreting the results.
