
Homework Help: Question about entropy, probability

  1. Oct 7, 2009 #1
    1. The problem statement, all variables and given/known data
    The entropy of a random variable X is defined as -E[ln(f_X(X))],
    where f_X is the pdf of the random variable X.

    Show that translating X by a constant (i.e., adding a constant c to X) does not affect the entropy.

    3. The attempt at a solution

    To be honest, I have no idea where to start this. I tried to write it out as an integral of ln(f_X(x)) against the density, but it did not help. I don't understand how translation by a constant would not affect the expected value, since I would expect it to change the value of the density, and therefore the natural log of the density, and therefore the expectation.

    Thanks in advance
  3. Oct 7, 2009 #2
    It would be [tex]E[\ln f_X(X)]=\int_{-\infty}^{\infty} [\ln f_X(x)] f_X(x)\,dx[/tex]
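    As a quick numerical sanity check of that equivalence (an editor's illustration, not part of the homework): for a standard normal X, the integral form and the sample-average form of E[ln f_X(X)] should agree, and both should equal -(1/2)ln(2πe).

    ```python
    import math, random

    random.seed(1)

    def pdf(x):
        # standard normal density f_X(x)
        return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

    # Integral form: sum ln(f_X(x)) * f_X(x) * dx over a wide grid
    dx = 0.001
    xs = [-8 + i * dx for i in range(16001)]
    integral = sum(math.log(pdf(x)) * pdf(x) * dx for x in xs)

    # Expectation form: average ln(f_X(X)) over draws of X
    n = 200_000
    mc = sum(math.log(pdf(random.gauss(0, 1))) for _ in range(n)) / n
    ```

    Both estimates land near -1.419, the closed-form value -(1/2)ln(2πe) for the standard normal.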
  4. Oct 7, 2009 #3


    Hint: Show and use

    [tex]f_{X+c}(x) = f_{X}(x-c)[/tex]
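    You can also convince yourself of the claim numerically before proving it (an editor's sketch, assuming X ~ N(mu, 1) so that adding c just shifts the mean): the Monte Carlo estimate of -E[ln f(X)] should come out the same for X and for X + c.

    ```python
    import math, random

    random.seed(0)

    def normal_pdf(x, mu):
        # density of N(mu, 1); translating X by c shifts mu by c
        return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

    def mc_entropy(mu, n=200_000):
        # -E[ln f(X)], estimated by averaging over samples of X ~ N(mu, 1)
        return -sum(math.log(normal_pdf(random.gauss(mu, 1), mu))
                    for _ in range(n)) / n

    h_X = mc_entropy(0.0)    # entropy of X
    h_Xc = mc_entropy(5.0)   # entropy of X + 5
    ```

    The two estimates agree (both near (1/2)ln(2πe)): the shift cancels inside f_{X+c}(x) = f_X(x - c), which is exactly what the hint asks you to show in general via a change of variables in the integral.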