Energy vs Information: Uncovering the Correlation

  • Context: Graduate
  • Thread starter: khtur
  • Tags: Energy, Information
SUMMARY

This discussion explores the correlation between information and energy through the lens of quantum mechanics and thermodynamics. Participants assert that measurement leads to information, which is intrinsically linked to entropy, particularly referencing Shannon's entropy. The conversation highlights Maxwell's demon as a thought experiment illustrating how information can enable energy extraction. Key insights include the assertion that an increase in information corresponds to a decrease in entropy, suggesting that information can be viewed as a quantification of energy-harnessing potential.

PREREQUISITES
  • Understanding of quantum mechanics principles, particularly measurement and uncertainty relations.
  • Familiarity with thermodynamic concepts, including entropy and temperature.
  • Knowledge of Shannon's entropy and its implications in information theory.
  • Basic grasp of statistical mechanics and its application to many-body systems.
NEXT STEPS
  • Research the implications of Maxwell's demon in thermodynamics and information theory.
  • Study the relationship between Shannon's entropy and thermodynamic entropy in detail.
  • Explore the concept of temperature as a statistical ensemble property in quantum systems.
  • Investigate practical applications of information theory in energy harnessing technologies.
USEFUL FOR

Physicists, thermodynamic researchers, information theorists, and anyone interested in the intersection of energy, information, and entropy in quantum mechanics.

khtur
Among the uncertainty relations in QM we have (X, P) and (E, T).
In my view (and mine only), measurement leads to information, and information is correlated with energy. Nevertheless, information is certain.
Does this mean that information is a reduction of energy?
 
How is information related to energy? Information is related to entropy, if you're using Shannon's entropy.
 
It's well known that the ocean is full of energy (being so far from absolute zero temperature), and yet we can't use that energy to power a boat.

OTOH, Maxwell pointed out that if he gave a demon enough information to know the trajectory of every molecule in that ocean, then the demon could extract every ounce of energy to use for his own purposes.

If entropy is chaos, or lack of knowledge, then information is a reduction of entropy and a measure of how much energy you can harness.
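For a quantitative handle on the information-energy link, Landauer's principle (not raised in the thread so far, but standard in this area) bounds the energy that must be dissipated when one bit of information is erased at temperature T by k_B T ln 2. A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, SI 2019)

def landauer_limit(temperature_k: float, bits: float) -> float:
    """Minimum energy (J) dissipated when erasing `bits` of information
    at temperature `temperature_k`, per Landauer's principle."""
    return bits * K_B * temperature_k * math.log(2)

# Erasing one bit at room temperature (300 K):
e_bit = landauer_limit(300.0, 1.0)
print(f"{e_bit:.3e} J per bit")  # ~2.87e-21 J
```

At room temperature this is only about 3e-21 J per bit, which is why Maxwell's demon can't win: the bookkeeping (and eventual erasure) of the molecular information costs at least as much as the demon extracts.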
 
StatMechGuy: "How is information related to energy? Information is related to entropy, if you're using Shannon's entropy."
dG = dH - T dS (Gibbs)
If you accept that information is correlated with entropy, you must also accept that it is correlated with energy.
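For what it's worth, the Gibbs relation can be played with numerically. The numbers below are entirely arbitrary, chosen only to show the entropy term flipping the sign of dG as the temperature rises:

```python
def gibbs_free_energy_change(dH: float, T: float, dS: float) -> float:
    """dG = dH - T*dS (consistent units: J, K, J/K)."""
    return dH - T * dS

# Hypothetical process: dH = -100 kJ, dS = -200 J/K.
# At low T the enthalpy term dominates (dG < 0);
# at high T the -T*dS term dominates and flips the sign.
print(gibbs_free_energy_change(-100e3, 300.0, -200.0))  # -40000.0 J
print(gibbs_free_energy_change(-100e3, 600.0, -200.0))  # 20000.0 J
```

The point of the exercise: entropy enters the energy balance only multiplied by T, which is exactly the objection raised below about the link not being direct.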

cesiumfrog: "If entropy is chaos, or lack of knowledge, then information is a reduction of entropy and a measure of how much energy you can harness."
How can you measure that?
 
cesiumfrog said:
OTOH, Maxwell pointed out that if he gave a demon enough information to know the trajectory of every molecule in that ocean, then the demon could extract every ounce of energy to use for his own purposes.

Doesn't the demon itself pose a problem? How much energy does it cost to track the trajectory of every molecule?
 
I look at it this way.

Any gain in information is a loss in entropy: information is order, and entropy is a measure of chaos or disorder.

Measurement increases order because it is a process of correlation, a mapping of variables in an ordered fashion. It reduces uncertainty and increases order.

Energy is a quantification of this process.
 
If you accept that information is correlated to entropy, you must accept also that it is correlated to energy.

Through temperature. I can have a high entropy-low energy state or a low entropy-high energy state, it depends on the temperature. So it's not that direct.
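A toy two-level system makes this concrete: the mean energy and the entropy are both set by the temperature, so neither is a direct proxy for the other. A sketch in units where k_B = 1 (the gap eps = 1 is an arbitrary choice):

```python
import math

def two_level(eps: float, T: float):
    """Mean energy and entropy (k_B = 1) of a two-level system
    with levels 0 and eps at temperature T."""
    z = 1.0 + math.exp(-eps / T)      # partition function
    p1 = math.exp(-eps / T) / z       # occupation of the upper level
    energy = p1 * eps
    entropy = -sum(p * math.log(p) for p in (1 - p1, p1) if p > 0)
    return energy, entropy

for T in (0.1, 1.0, 10.0):
    E, S = two_level(1.0, T)
    print(f"T={T:5.1f}  <E>={E:.4f}  S={S:.4f}")
```

At low T both energy and entropy are near zero; at high T the entropy saturates at ln 2 while the energy approaches eps/2. The entropy-to-energy conversion factor is the temperature, as StatMechGuy says.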
 
3trQN :"Any gain in information is a loss in entropy, Information is order and entropy is a measure of chaos or disorder."

Is there any proof that any gain in information is a loss in entropy?


Let me rephrase my question: if we take a specific Hamiltonian (let's say time-dependent) acting on a state, we'll get several energy levels (linearly superposed).
But when we measure the energy of the state, we get only one specific energy among all the energy levels. Does this mean that the information leads to the reduction, or that the measurement leads to the reduction?
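One way to make the question concrete: before the measurement, the observer's uncertainty about the outcome is the Shannon entropy of the Born probabilities |c_i|^2; after the measurement, that entropy is zero. A toy sketch (the amplitudes are made up, chosen so the probabilities sum to 1):

```python
import math
import random

# Hypothetical state: superposition over three energy levels.
amplitudes = [0.6, 0.64, 0.48]        # |c_i|; squares sum to exactly 1
probs = [a * a for a in amplitudes]   # Born probabilities

def shannon_entropy(ps):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

print("before measurement:", shannon_entropy(probs), "bits")

# The measurement yields one level, sampled with the Born probabilities...
level = random.choices([0, 1, 2], weights=probs)[0]
# ...after which the outcome distribution is certain:
post = [1.0 if i == level else 0.0 for i in range(3)]
print("after measurement: ", shannon_entropy(post), "bits")  # 0.0
```

On this reading, the measurement is what produces the reduction, and "information gained" is just the entropy difference between the two distributions.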
 
StatMechGuy:” Through temperature. I can have a high entropy-low energy state or a low entropy-high energy state, it depends on the temperature. So it's not that direct.”

Can you help me understand what temperature is?
Originally it was introduced for the description of statistical systems. One then obtains U = (1/2)kT, that is, a relation that holds not merely per particle but per single degree of freedom of a single particle. Thus one obtains 1 eV ≈ 11,600 K, and it therefore correctly describes high-energy physics. Is that right?
 
  • #10
StatMechGuy:”How is information related to energy? Information is related to entropy, if you're using Shannon's entropy.”

Shannon's theory is a theory of communication (information rate, not information). For example, white noise provides the maximum information rate.

Cesiumfrog:” information is a reduction of entropy and a measure of how much energy you can harness.”

I think you should also take negative statements into account. It seems to me that they also provide information. But then information can't be a measure.
 
  • #11
Anonym said:
Can you help me understand what temperature is?
Originally it was introduced for the description of statistical systems. One then obtains U = (1/2)kT, that is, a relation that holds not merely per particle but per single degree of freedom of a single particle. Thus one obtains 1 eV ≈ 11,600 K, and it therefore correctly describes high-energy physics. Is that right?

From my understanding of the situation, temperature can be viewed several ways. In one case, from a thermodynamic standpoint, you can define temperature as a constraint function of the configuration space when two systems are in thermodynamic equilibrium. Therefore, when two systems are in equilibrium, their temperature is the same.

The other way I've seen it come in is as a Lagrange multiplier for maximizing the entropy of a system at constant energy. You can define both quantities in terms of the density operator, and then maximize the entropy (by defining S = -k_B \,\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})) subject to the constraint that the energy expectation value is constant. When you do this, you recognize something that acts like the temperature, and therefore you call it "temperature".
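That Lagrange-multiplier construction can be demonstrated numerically: fix a spectrum and a target mean energy, then solve for the multiplier beta = 1/(k_B T) whose maximum-entropy distribution p_i ∝ exp(-beta E_i) reproduces that energy. The four-level spectrum and the target below are arbitrary choices, in units with k_B = 1:

```python
import math

def gibbs_probs(energies, beta):
    """Maximum-entropy distribution at fixed <E>: p_i ~ exp(-beta*E_i)."""
    ws = [math.exp(-beta * e) for e in energies]
    z = sum(ws)                        # partition function
    return [w / z for w in ws]

def mean_energy(energies, beta):
    ps = gibbs_probs(energies, beta)
    return sum(p * e for p, e in zip(ps, energies))

def solve_beta(energies, target_e, lo=-50.0, hi=50.0, iters=200):
    """Bisection on beta; <E> is monotone decreasing in beta."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target_e:
            lo = mid                   # too energetic: raise beta
        else:
            hi = mid
    return 0.5 * (lo + hi)

spectrum = [0.0, 1.0, 2.0, 3.0]        # arbitrary energy levels
beta = solve_beta(spectrum, target_e=0.8)
print("beta =", beta, " T =", 1.0 / beta)
```

The solver recovers a unique beta for each attainable mean energy, which is exactly why the multiplier deserves the name "temperature".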

I believe Pauli's "Treatise on Thermodynamics" gives a more historic definition of temperature in terms of ideal gases, but this definition predates quantum mechanics and is probably not the greatest way to go about things.

Edit: As for the Shannon entropy thing, it is very easy to associate entropy with a lack of information about a system. In a state of zero entropy, i.e. full information, it would be possible for us to just build an enormous computer and numerically integrate the equations of motion from that point. In practice we treat a thermodynamic system probabilistically, since doing what I just described is intractable. Shannon's entropy is very applicable to thermodynamics.
 
  • #12
StatMechGuy: "From my understanding of the situation, temperature can be viewed several ways."

In both cases you consider many-body (many-particle) systems, i.e. statistical ensembles. I would expect that, when applied to a single particle, the same notion should lead to nonsense, but this does not happen. That is the content of my question.
 
  • #13
You can't talk about the temperature of a single particle. Temperature is a result of having a statistical ensemble. From a statistical-mechanical standpoint, temperature dictates how the particles are distributed over the energy spectrum of the system. I'm fairly confident you can't talk about temperature without talking about a large number of particles.
 
  • #14
Thank you.
 
  • #15
I think the purpose of life is to transform the energy of a sun into (genetic and structural) information.

If only we had more knowledge to exploit outside energies...

http://jaxx.upcnet.ro — a video about how much more energy is waiting for us.
 