Mutual information of a noisy function

  • #1
Suppose I have a random variable Y defined as follows:
$$Y=\alpha x+ \mathcal{N}(\mu,\sigma^2) \text{ where } x\in\mathbb{R}$$
and ##\mathcal{N}(\mu,\sigma^2)## is i.i.d. normally distributed noise with mean ##\mu## and variance ##\sigma^2##.
So I know Y is a random variable, but x is not; however, it seems to me that there is a probability measure
$$P(x,Y)\in[0,1]$$
Therefore, the mutual information is
$$I(x;Y)=H(x)-H(x|Y)=H(Y)-H(Y|x)$$
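(Note that, conditional on ##x##, ##Y## is just the noise shifted by the constant ##\alpha x##, and a shift does not change differential entropy, so at least
$$H(Y|x)=\frac{1}{2}\ln(2\pi e\sigma^{2})$$
is well defined.)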
However, it seems that
$$H(x)=-\int_{x}p(x)\ln p(x)\,dx$$
is not defined because x is not a random variable. So is there really any mutual information between x and Y? Is P(x,Y) an actual joint probability distribution? Any insight would be awesome, thanks.
 

Answers and Replies

  • #2
Do you know the value of ##x## (is ##x## a constant?), or do you want to estimate it from ##Y##?
 
  • #3
Do you know the value of ##x## (is ##x## a constant?), or do you want to estimate it from ##Y##?
x is not constant; it is the independent variable in this case. Without the noise term this would just be a simple linear function $$Y=\alpha x$$
 
  • #4
Fine, but what does it represent in reality? When you observe ##Y##, do you know what ##x## is?
 
  • #5
Fine, but what does it represent in reality? When you observe ##Y##, do you know what ##x## is?
Yes. Ultimately this is a regression problem. I have observed values of x and observed values of Y, and I want to know the mutual information between them, with the knowledge that Y is a linear function of x. For example, let's say I send a signal x which is received by a receiver that transforms x by multiplying it by a constant, but there is some unknown source of noise added between transmission and reception. I want to know how much of Y can be explained by x.
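To make this concrete, here is a minimal sketch (assuming NumPy and scikit-learn are available; the parameter values and the standard-normal input distribution for x are illustrative assumptions, not given above) that simulates the channel and estimates the mutual information from samples:

```python
# Simulate Y = alpha*x + N(mu, sigma^2) and estimate I(x; Y) from samples.
# All parameter values here are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
alpha, mu, sigma = 2.0, 0.0, 1.0

x = rng.normal(0.0, 1.0, size=5000)                 # assumed input distribution
y = alpha * x + rng.normal(mu, sigma, size=x.size)  # noisy observations

# mutual_info_regression expects a 2-D feature matrix and returns a
# nonparametric (k-nearest-neighbor) estimate of the mutual information in nats.
mi_hat = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
print(f"estimated I(x;Y) = {mi_hat:.3f} nats")
```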
 
  • #6
I suspect that ##x## is unknown at the time of observing ##Y##, which makes it random. Say you have two signals ##x_1## and ##x_2##, and you transmitted ##x_1##. You receive ##y=\alpha\,x_1+n##, where ##n## is the noise. ##x_1## is a number at the transmitter, but at the receiver it is random (it could be ##x_1## or ##x_2##) because it is unknown.
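If one goes further and models ##x## itself as Gaussian, say ##x\sim\mathcal{N}(0,\sigma_x^2)## (an assumption for illustration, not something stated above), the mutual information takes the classic Gaussian-channel closed form:
$$I(x;Y)=\frac{1}{2}\ln\left(1+\frac{\alpha^2\sigma_x^2}{\sigma^2}\right)$$
With the illustrative values used in the sketch above (##\alpha=2##, ##\sigma_x=\sigma=1##) this gives ##\frac{1}{2}\ln 5\approx 0.80## nats, which the sampled estimate should approach.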
 
  • #7
FactChecker
Science Advisor
Gold Member
There is no joint probability function that you can assign to the pair (x, Y). Since x is not a random variable, there is no way to assign joint probabilities.

You could reverse the roles of x and Y and say that ##X = (y - \mathcal{N}(\mu,\,\sigma^{2}))/\alpha##.
In that case, X would be a random variable, y would not be a random variable, and there would still not be a joint probability.
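As an aside, since the noise is normal, this reversed-role variable has an explicit distribution for each fixed ##y## (assuming ##\alpha\neq 0##):
$$X=\frac{y-\mathcal{N}(\mu,\sigma^{2})}{\alpha}\sim\mathcal{N}\!\left(\frac{y-\mu}{\alpha},\,\frac{\sigma^{2}}{\alpha^{2}}\right)$$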
 
