Entropy (Information Theory Question)

  • Thread starter: Jskota
  • Tags: Entropy, Theory

Homework Help Overview

The problem involves two independent integer-valued random variables, ##X## and ##Y##, where ##X## is uniformly distributed over the set ##\{1,2,\dots,8\}## and ##Y## has the probability mass function ##\text{Pr}\{Y=k\} = 2^{-k}## for positive integers ##k##. The questions ask for the entropies ##H(X)##, ##H(Y)##, and ##H(X+Y,X-Y)##.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to calculate the entropies using properties of independent random variables, and for ##H(X+Y,X-Y)## arrives at an answer that disagrees with the expected result.
  • Some participants question the validity of the log-det formula used in the calculations, suggesting it may not apply to discrete distributions.
  • Others explore the implications of independence on the entropy of combined random variables and discuss the nature of affine relationships between random variables.

Discussion Status

The discussion is ongoing, with participants providing insights and questioning assumptions related to the entropy calculations. There is a recognition of the complexity of the problem, and some participants have expressed a shift in understanding regarding the expected results.

Contextual Notes

Participants note the potential confusion arising from the application of continuous entropy concepts to discrete random variables and the need for clarity on the relationships between the random variables involved.

Jskota

Homework Statement


Let ##X## and ##Y## be two independent integer-valued random variables. Let ##X## be uniformly distributed over ##\left\{1,2,...,8\right\}##, and let ##\text{Pr}\left\{Y=k\right\} =2^{-k},~k=1,2,3,...##
(a) Find ##H(X)##.
(b) Find ##H(Y)##.
(c) Find ##H(X+Y,X-Y)##.

Homework Equations


I am confused about part (c). I have found the answers to (a) and (b); they are 3 bits and 2 bits, respectively. However, the solution I get for (c) does not match the expected answer, which is apparently 5 bits.
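
For (a) and (b) explicitly:

##H(X) = \log_2 8 = 3## bits, since ##X## is uniform over eight values, and
##H(Y) = -\sum_{k=1}^{\infty} 2^{-k}\log_2 2^{-k} = \sum_{k=1}^{\infty} k\,2^{-k} = 2## bits.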

The Attempt at a Solution


Define ##Z=X+Y## and ##W=X-Y##. I then form the vectors ##\mathbf{u} = [Z,W]^T## and ##\mathbf{v}=[X,Y]^T## and write one as a linear transformation of the other:

##\mathbf{u}=\begin{bmatrix}1&1 \\ 1&-1 \end{bmatrix}\mathbf{v}=\mathbf{M}\mathbf{v}##.

Therefore, ##H(\mathbf{u})=H(X+Y,X-Y)=H(\mathbf{v})+\log_2\lvert\text{det}\left(\mathbf{M}\right)\rvert##. I then have

##\log_2\lvert\text{det}\left(\mathbf{M}\right)\rvert=1## bit
##H(\mathbf{v})=H(X)+H(Y|X)=H(X)+H(Y)=5## bits (since ##Y## is independent of ##X##).

This gives me 6 bits for (c).

Edit: Unless the formula I am using with log-det is only for continuous and not discrete distributions.
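
For reference, the two versions of the identity:

##h(\mathbf{M}\mathbf{v}) = h(\mathbf{v}) + \log_2\lvert\det\mathbf{M}\rvert## (differential entropy, continuous ##\mathbf{v}##),
##H(\mathbf{M}\mathbf{v}) = H(\mathbf{v})## for invertible ##\mathbf{M}## (discrete ##\mathbf{v}##, since the map only relabels the support points).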
 
I'm not familiar with this ##\log_2\lvert\det(\mathbf{M})\rvert## formula. Can you post a link?
It feels wrong. If ##Y = 2X##, would ##H(Y)## be different from ##H(X)##?
It's sort of obvious that, since ##X## and ##Y## are independent, ##H(X+Y,X-Y) = H(X,Y) = H(X)+H(Y)##.
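
A minimal numerical sketch of that claim, assuming we truncate ##Y##'s infinite support at some large bound K so the missing tail mass is negligible (the names below are illustrative):

```python
from math import log2

# pmf of X: uniform over {1, ..., 8}
px = {x: 1 / 8 for x in range(1, 9)}

# pmf of Y: Pr{Y = k} = 2^(-k), truncated at K (tail mass 2^(-K) is negligible)
K = 60
py = {k: 2.0 ** -k for k in range(1, K + 1)}

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# joint pmf of (X + Y, X - Y), built from the independence of X and Y
joint = {}
for x, p in px.items():
    for y, q in py.items():
        s, d = x + y, x - y
        joint[(s, d)] = joint.get((s, d), 0.0) + p * q

print(entropy(px))     # ~3.0 bits
print(entropy(py))     # ~2.0 bits
print(entropy(joint))  # ~5.0 bits: (X+Y, X-Y) determines (X, Y)
```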
 
haruspex said:
I'm not familiar with this ##\log_2\lvert\det(\mathbf{M})\rvert## formula. Can you post a link?
It feels wrong. If ##Y = 2X##, would ##H(Y)## be different from ##H(X)##?
It's sort of obvious that, since ##X## and ##Y## are independent, ##H(X+Y,X-Y) = H(X,Y) = H(X)+H(Y)##.

I think it was incorrect usage. It doesn't apply here since these are pmfs and not pdfs. I got it from a differential entropy wiki page.

Anyway, I don't disagree that if ##Y## is a scaled version of ##X##, the uncertainty in the random variable is the same. The probabilities are unchanged regardless of the values they take on the sample space.

I guess I didn't see it as obvious here because ##H(X+Y,X-Y)## seems more complicated than it is. The way I understand it now is that, given the sum ##Z## and the difference ##W##, we can always determine ##X## and ##Y##; so if ##X## and ##Y## are independent, the entropy is just the sum.
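
Explicitly, ##X = (Z+W)/2## and ##Y = (Z-W)/2##, so the map ##(X,Y)\mapsto(Z,W)## is invertible and ##H(Z,W) = H(X,Y) = H(X)+H(Y) = 3+2 = 5## bits.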

Do you know whether, in general, the entropy is the same when there is an affine relationship between RVs? It makes sense conceptually, but I couldn't find any theorems for it in my book (Cover & Thomas).
 
Jskota said:
Do you know whether, in general, the entropy is the same when there is an affine relationship between RVs?
For discrete RVs, I would say they'd be the same given any bidirectional deterministic relationship. If ##Y = f(X)## for a bijection ##f##, then ##P(Y=f(x)) = P(X=x)##.
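Spelled out: for a bijection ##f##, the pmf of ##Y = f(X)## is just a relabeling of the pmf of ##X##, so ##H(Y) = -\sum_x P(X=x)\log_2 P(X=x) = H(X)##.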
 
haruspex said:
For discrete RVs, I would say they'd be the same given any bidirectional deterministic relationship. If ##Y = f(X)## for a bijection ##f##, then ##P(Y=f(x)) = P(X=x)##.
Okay, that is actually kind of what I was getting at last night. I eventually proved it to myself, a bit after I posted this, that 5 bits makes sense. Thank you for the help!
 
