1. Oct 7, 2009

### smk037

1. The problem statement, all variables and given/known data
The entropy of a random variable X is defined as $$-E[\ln f_X(X)]$$
where $$f_X$$ is the pdf of the random variable X.

Show that translating X by a constant (i.e., adding a constant value to X) does not affect the entropy.

3. The attempt at a solution

To be honest, I have no idea where to start this. I tried to look at it as $$\int x \ln f_X(x)\,dx$$, but it did not help. I don't understand how translation by a constant would leave the expected value unchanged: I would expect it to change the value of the function, and therefore the natural log of the function, and therefore the expected value.
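A quick numerical check can address this doubt directly. The sketch below (an illustration, not part of the original thread; it assumes a Gaussian density as the example) estimates $$-E[\ln f_X(X)]$$ by numerical integration for a density and for the same density shifted by a constant, and shows the two values agree:

```python
import numpy as np

def entropy(pdf, grid):
    # Numerical estimate of -E[ln f_X(X)] = -\int f(x) ln f(x) dx
    f = pdf(grid)
    mask = f > 0  # avoid log(0) where the density underflows
    return -np.trapz(f[mask] * np.log(f[mask]), grid[mask])

def normal_pdf(mu, sigma):
    # Gaussian density with mean mu and standard deviation sigma
    return lambda x: np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

grid = np.linspace(-20.0, 20.0, 200001)
h0 = entropy(normal_pdf(0.0, 1.0), grid)  # entropy of X ~ N(0, 1)
h5 = entropy(normal_pdf(5.0, 1.0), grid)  # entropy of X + 5 ~ N(5, 1)
print(h0, h5)
```

Both estimates come out equal (to numerical precision) to the known value $$\tfrac{1}{2}\ln(2\pi e)$$ for a unit-variance Gaussian: shifting the density slides the graph of $$f_X$$ along the axis, but the integrand $$f\ln f$$ is integrated over the whole line, so the total is unchanged.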

2. Oct 7, 2009

### Billy Bob

It would be $$E[\ln f_X(X)]=\int_{-\infty}^{\infty} [\ln f_X(x)] f_X(x)\,dx$$

3. Oct 7, 2009

### LCKurtz

Hint: Show and use

$$f_{X+c}(x) = f_{X}(x-c)$$
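(With that identity in hand, the substitution $$u = x - c$$ in the defining integral sketches the rest:

$$-\int_{-\infty}^{\infty} [\ln f_{X+c}(x)]\, f_{X+c}(x)\,dx = -\int_{-\infty}^{\infty} [\ln f_X(x-c)]\, f_X(x-c)\,dx = -\int_{-\infty}^{\infty} [\ln f_X(u)]\, f_X(u)\,du,$$

which is again $$-E[\ln f_X(X)]$$, the entropy of X.)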