Energy Vs Information

  1. Dec 28, 2006 #1
    Among the uncertainty relations in QM we have x–p and E–t.
    In my view (and mine only), measurement leads to information, and information is correlated with energy. Yet information itself is certain.
    Does this mean that information is a reduction of energy?
     
  3. Dec 28, 2006 #2
    How is information related to energy? Information is related to entropy, if you're using Shannon's entropy.
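    For concreteness, here is a minimal Python sketch of Shannon's measure, [tex]H = -\sum_i p_i \log_2 p_i[/tex]; the function name and the example distributions are purely illustrative:
[code]
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit  -- a fair coin, maximal uncertainty
print(shannon_entropy([1.0]))        # 0.0 bits -- full certainty, nothing left to learn
print(shannon_entropy([0.25] * 4))   # 2.0 bits -- four equally likely outcomes
[/code]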
     
  4. Dec 28, 2006 #3
    It's well known that the ocean is full of energy (being so far above absolute zero), and yet we can't use that energy to power a boat.

    OTOH, Maxwell pointed out that if he gave a demon enough information to know the trajectory of every molecule in that ocean, then the demon could extract every ounce of energy to use for his own purposes.

    If entropy is chaos, or lack of knowledge, then information is a reduction of entropy and a measure of how much energy you can harness.
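    One way to make that last claim quantitative (a hedged sketch, leaning on the Szilard-engine/Landauer bound rather than anything stated above): one bit of information about the molecules lets the demon extract at most [tex]k_B T \ln 2[/tex] of work.
[code]
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def max_work_per_bit(T):
    """Szilard-engine bound: work extractable per bit of information at temperature T."""
    return k_B * T * math.log(2)

print(f"{max_work_per_bit(290.0):.3e} J/bit")  # ~2.8e-21 J at a ~290 K ocean
[/code]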
     
  5. Dec 29, 2006 #4
    StatMechGuy: "How is information related to energy? Information is related to entropy, if you're using Shannon's entropy".
    dG=dH-TdS (Gibs)
    If you accept that information is correlated to entropy, you must accept also that it is correlated to energy.
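    A minimal numeric sketch of that relation (the numbers are made up purely for illustration): the [tex]T\,dS[/tex] term is exactly what converts an entropy change into an energy scale.
[code]
T = 298.15       # K
dH = -50_000.0   # J/mol, hypothetical enthalpy change
dS = -100.0      # J/(mol K), hypothetical entropy change (e.g. from increased order)

dG = dH - T * dS
print(f"dG = {dG:.0f} J/mol")  # -50000 - 298.15*(-100) = -20185 J/mol
[/code]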

    cesiumfrog: "If entropy is chaos, or lack of knowledge, then information is a reduction of entropy and a measure of how much energy you can harness."
    How can you measure that?
     
  6. Dec 29, 2006 #5
    Doesn't the demon itself pose a problem? How much energy does it cost to track the trajectory of every molecule?
     
  7. Dec 29, 2006 #6
    I look at it this way.

    Any gain in information is a loss in entropy. Information is order, and entropy is a measure of chaos or disorder.

    Measurement increases order because it is a process of correlation, a mapping of variables in an ordered fashion. It reduces uncertainty and increases order (the sketch at the end of this post makes the "reduces uncertainty" part concrete).

    Energy is a quantification of this process.
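    A small sketch of the "reduces uncertainty" claim, using conditional entropy: for any joint distribution of a signal X and a readout Y, [tex]H(X|Y) \le H(X)[/tex]. The toy numbers below are purely illustrative.
[code]
import math

# joint distribution p(x, y) of a signal X and a noisy readout Y
p = {(0, 0): 0.4, (0, 1): 0.1,
     (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

px = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}   # marginal of X
py = {y: p[(0, y)] + p[(1, y)] for y in (0, 1)}   # marginal of Y

H_X = H(px)
H_X_given_Y = H(p) - H(py)   # chain rule: H(X|Y) = H(X,Y) - H(Y)
print(H_X, H_X_given_Y)      # 1.0 bit before the measurement, ~0.72 bits after
[/code]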
     
  8. Dec 29, 2006 #7
    Through temperature. I can have a high-entropy/low-energy state or a low-entropy/high-energy state; it depends on the temperature. So it's not that direct.
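    A hedged sketch of this point with a two-level system (illustrative units, with [tex]k_B[/tex] and the level gap both set to 1): the temperature is what fixes which entropy-energy pairing you get.
[code]
import math

def two_level(T):
    """Mean energy and entropy of a two-level system with gap 1, k_B = 1."""
    p1 = math.exp(-1.0 / T) / (1.0 + math.exp(-1.0 / T))  # excited-state probability
    E = p1                                                # <E> = gap * p1
    S = -sum(p * math.log(p) for p in (1.0 - p1, p1) if p > 0)
    return E, S

for T in (0.1, 1.0, 10.0):
    E, S = two_level(T)
    print(f"T = {T:>4}: <E> = {E:.3f}, S = {S:.3f}")
[/code]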
     
  9. Dec 30, 2006 #8
    3trQN :"Any gain in information is a loss in entropy, Information is order and entropy is a measure of chaos or disorder."

    Is there any proof that any gain in information is a loss in entropy?


    Let me rephrase my question: if we take a specific Hamiltonian (let's say time-dependent) acting on a state, we get several energy levels (in linear superposition).
    But when we measure the energy of the state, we obtain only one specific energy among all those levels. Does this mean that the information leads to the reduction, or that the measurement leads to the reduction?
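    Here is a small numerical sketch of the situation being asked about (the Hamiltonian and state are made up for illustration): the state is a superposition over energy eigenstates, but a single measurement returns just one eigenvalue, with Born probability [tex]|\langle E_n|\psi\rangle|^2[/tex].
[code]
import numpy as np

rng = np.random.default_rng(0)

A = rng.normal(size=(3, 3))
H = (A + A.T) / 2                    # a random 3x3 Hermitian "Hamiltonian"

E, V = np.linalg.eigh(H)             # energy levels and eigenvectors

psi = np.ones(3) / np.sqrt(3)        # a superposed state
probs = np.abs(V.T @ psi) ** 2       # Born probabilities |<E_n|psi>|^2

outcome = rng.choice(E, p=probs)     # one measurement -> one definite level
print("levels:", E)
print("probs :", probs)
print("one measurement gives:", outcome)
[/code]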
     
  10. Dec 30, 2006 #9
    StatMechGuy: "Through temperature. I can have a high-entropy/low-energy state or a low-entropy/high-energy state; it depends on the temperature. So it's not that direct."

    Can you help me understand what temperature is?
    Originally it was introduced to describe statistical systems. Then one obtains [tex]U = \tfrac{1}{2}kT[/tex], namely, a relation not only for a single particle but for each single degree of freedom of a single particle. Thus one obtains 1 eV ≈ 11,600 K, and therefore it correctly describes high-energy physics. ???
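    For reference, the conversion being alluded to (a one-line sketch): setting [tex]k_B T = 1\,\mathrm{eV}[/tex] gives [tex]T \approx 11{,}600\,\mathrm{K}[/tex], which is why temperatures double as energy scales in high-energy physics.
[code]
k_B_eV = 8.617333e-5           # Boltzmann constant in eV/K

T = 1.0 / k_B_eV               # temperature at which k_B*T = 1 eV
print(f"1 eV <-> {T:.0f} K")   # ~11605 K
[/code]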
     
  11. Dec 30, 2006 #10
    StatMechGuy:”How is information related to energy? Information is related to entropy, if you're using Shannon's entropy.”

    Shannon's theory is a theory of communication (it concerns information rate, not information as such). For example, white noise provides the maximum information rate.

    Cesiumfrog:” information is a reduction of entropy and a measure of how much energy you can harness.”

    I think you should also take negative statements into account. It seems to me that they provide information too. But then information can't be a measure.
     
  12. Dec 30, 2006 #11
    From my understanding of the situation, temperature can be viewed several ways. In one case, from a thermodynamic standpoint, you can define temperature as a constraint function on the configuration space when two systems are in thermodynamic equilibrium. Therefore, when two systems are in equilibrium, their temperatures are the same.

    The other way I've seen it come in is as a Lagrange multiplier for maximizing the entropy of a system at constant energy. You can define both quantities in terms of the density operator, and then maximize the entropy (defined as [tex]S = -\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})[/tex]) subject to the constraint that the energy expectation value is constant. When you do this, you recognize something that acts like the (inverse) temperature, and therefore you call it "temperature".
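    A hedged numerical sketch of that statement (the toy Hamiltonian and [tex]\beta[/tex] are chosen purely for illustration): maximizing [tex]S = -\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})[/tex] at fixed mean energy yields the Gibbs state [tex]\hat{\rho} = e^{-\beta \hat{H}}/Z[/tex], with the Lagrange multiplier [tex]\beta = 1/k_B T[/tex].
[code]
import numpy as np

H = np.diag([0.0, 1.0, 2.0])    # toy Hamiltonian (any Hermitian matrix would do), k_B = 1
beta = 0.5                      # the Lagrange multiplier, beta = 1/T

E_n = np.linalg.eigvalsh(H)     # energy levels
p = np.exp(-beta * E_n)
p /= p.sum()                    # normalize: Z = sum_n exp(-beta * E_n)

S = -np.sum(p * np.log(p))      # entropy of the Gibbs state
E_mean = np.sum(p * E_n)        # the constrained mean energy
print(f"S = {S:.3f}, <E> = {E_mean:.3f}")
[/code]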

    I believe Planck's "Treatise on Thermodynamics" gives a more historical definition of temperature in terms of ideal gases, but that definition predates quantum mechanics and is probably not the best way to go about things.

    Edit: As for the Shannon entropy thing, it is very easy to associate entropy with a lack of information about a system. In a state of zero entropy, i.e. full information, it would be possible for us to just build an enormous computer and numerically integrate the equations of motion from that point. The fact is, in a thermodynamic system we treat the system probabilistically, since what I just described is intractable. Shannon's entropy is very applicable to thermodynamics.
     
    Last edited: Dec 30, 2006
  13. Dec 30, 2006 #12
    StatMechGuy:” From my understanding of the situation, temperature can be viewed several ways.”

    In both cases you consider many-body (many-particle) systems, i.e. statistical ensembles. I would expect that, when applied to a single particle, the same notion should lead to nonsense, but that is not what happens. This is the content of my question.
     
  14. Dec 30, 2006 #13
    You can't talk about the temperature of a single particle. Temperature is a result of having a statistical ensemble. From a statistical-mechanical standpoint, temperature dictates how the particles are distributed across the energy spectrum of the system. I'm fairly confident you can't talk about temperature without talking about a large number of particles.
     
  15. Dec 30, 2006 #14
    Thank you.
     
  16. Feb 4, 2007 #15
    I think the purpose of life is to transform the energy of a sun into (genetic and structural) information.

    If only we had more knowledge to exploit outside energies...

    Here's a video about how much more energy is waiting for us.
     