Mutual information of a noisy function

SUMMARY

The discussion centers on the mutual information between a random variable Y, defined as \(Y=\alpha x+ \mathcal{N}(\mu,\sigma^2)\), and an independent variable x that is not itself a random variable. Participants clarify that although Y depends on x, the lack of randomness in x prevents the construction of a joint probability distribution \(P(x,Y)\). Consequently, the mutual information \(I(x;Y)\) cannot be defined, since the entropy \(H(x)\) is not applicable to a non-random variable. The conclusion is that unless x is treated as a random variable, the mutual information cannot be computed.

PREREQUISITES
  • Understanding of random variables and probability distributions
  • Familiarity with mutual information and entropy concepts
  • Knowledge of regression analysis and linear functions
  • Basic principles of noise in signal processing
NEXT STEPS
  • Explore the concept of mutual information in the context of regression analysis
  • Study the properties of joint probability distributions for random and non-random variables
  • Learn about the implications of noise in signal transmission and its effect on mutual information
  • Investigate alternative methods for estimating relationships between variables when one is not random
USEFUL FOR

Data scientists, statisticians, and researchers in signal processing who are interested in understanding the relationship between observed variables and the implications of noise in regression models.

joshthekid
So suppose I have a random variable Y defined as follows:
$$Y=\alpha x+ \mathcal{N}(\mu,\sigma^2) \text{ where } x \in \mathbb{R}$$
and
$$\mathcal{N}(\mu,\sigma^2)\text{ is an i.i.d. normally distributed random variable with mean }\mu\text{ and variance }\sigma^2.$$
So I know Y is a random variable, but x is not; however, it seems to me that there is a probability measure
$$P(x,Y) \in [0,1].$$
Therefore, the mutual information is
$$I(x;Y)=H(x)-H(x|Y)=H(Y)-H(Y|x).$$
However, it seems that
$$H(x)=-\int_{x}p(x)\ln p(x)\,dx$$
is not defined because x is not a random variable. So is there really any mutual information between x and Y? Is p(x,Y) an actual joint probability distribution? Any insight would be awesome, thanks.
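
A minimal simulation of the setup may help fix ideas. The values below (##\alpha=2##, ##\mu=0##, ##\sigma=1##, ##x=3##) are purely illustrative, not from the thread; the point is that for a fixed, non-random ##x##, the distribution of ##Y## is simply Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative values (assumptions, not from the thread).
alpha, mu, sigma = 2.0, 0.0, 1.0
x = 3.0  # a fixed, non-random input

# For fixed x, Y = alpha*x + N(mu, sigma^2) is Gaussian with
# mean alpha*x + mu and variance sigma^2.
y = alpha * x + rng.normal(mu, sigma, size=100_000)

print(y.mean())  # approximately alpha*x + mu = 6.0
print(y.var())   # approximately sigma^2 = 1.0
```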
 
Do you know the value of ##x## (is ##x## a constant?), or do you want to estimate it from ##Y##?
 
EngWiPy said:
Do you know the value of ##x## (is ##x## a constant?), or do you want to estimate it from ##Y##?
x is not constant; it is the independent variable in this case. Without the noise term this would just be a simple linear function $$Y=\alpha x.$$
 
Fine, but what does it represent in reality? When you observe ##Y##, do you know what ##x## is?
 
EngWiPy said:
Fine, but what does it represent in reality? When you observe ##Y##, do you know what ##x## is?
Yes. Ultimately this is a regression problem. I have observed values of x and observed values of Y, and I want to know the mutual information between them, with the knowledge that Y is a linear function of x. For example, suppose I send a signal x which is received by a receiver that multiplies x by a constant, but some unknown noise is added between transmission and reception. I want to know how much of Y can be explained by x.
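
If the observed x's are themselves modeled as draws from some distribution, the mutual information becomes estimable from data. A minimal sketch of that route, assuming scikit-learn is available and using an arbitrary uniform distribution for ##x## purely for illustration:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)

# Hypothetical data: treating the observed x's as random draws is
# exactly what makes I(x; Y) well-defined.
alpha, sigma = 2.0, 1.0
x = rng.uniform(-3.0, 3.0, size=5_000)
y = alpha * x + rng.normal(0.0, sigma, size=x.size)

# k-nearest-neighbour estimator of mutual information (in nats).
mi = mutual_info_regression(x.reshape(-1, 1), y)
print(mi[0])
```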
 
I suspect that ##x## is unknown at the time of observing ##Y##, which makes it random. Say you have two signals ##x_1## and ##x_2##, and you transmitted ##x_1##. You receive ##y=\alpha\,x_1+n##, where ##n## is the noise. ##x_1## is a number at the transmitter, but at the receiver it is random (it could be ##x_1## or ##x_2##) because it is unknown.
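
For reference, if one does model ##x## as a random variable, say ##x \sim \mathcal{N}(0,\sigma_x^2)## independent of the noise, then ##(x, Y)## are jointly Gaussian and the mutual information takes the standard Gaussian-channel closed form
$$I(x;Y)=\frac{1}{2}\ln\!\left(1+\frac{\alpha^2\sigma_x^2}{\sigma^2}\right),$$
which matches the point above: randomness in ##x## (from the receiver's perspective) is what makes ##I(x;Y)## well-defined.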
 
There is no joint probability function that you can assign to the pair (x, Y). Since x is not a random variable, it prevents any attempt at assigning joint probabilities.

You could reverse the roles of x and Y and say that ##X = (y - \mathcal{N}(\mu,\,\sigma^{2}))/\alpha##.
In that case, X would be a random variable, y would not be, and there would still not be a joint probability.
 
