Energy vs Information: Uncovering the Correlation

  • #1
khtur
Among the uncertainty relations in QM we have (X, P) and (E, T).
From my perspective (and mine only), measurement leads to information, and information is correlated with energy. Nevertheless, information is certain.
Does this mean that information is a reduction of energy?
 
  • #2
How is information related to energy? Information is related to entropy, if you're using Shannon's entropy.
 
  • #3
It's well known that the ocean is full of energy (being so far from absolute zero temperature), and yet we can't use that energy to power a boat.

OTOH, Maxwell pointed out that if he gave a demon enough information to know the trajectory of every molecule in that ocean, then the demon could extract every ounce of energy to use for his own purposes.

If entropy is chaos, or lack of knowledge, then information is a reduction of entropy and a measure of how much energy you can harness.
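
To put a rough number on "how much energy you can harness": here is a minimal Python sketch, assuming Landauer's bound that each bit of information is worth at most k_B T ln 2 of extractable work (the standard quantification, though not stated above).

[code]
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def max_work_per_bit(temperature_kelvin: float) -> float:
    """Szilard-engine bound: maximum work extractable per bit of information."""
    return K_B * temperature_kelvin * math.log(2)

# Ocean at ~290 K: each bit the demon holds buys at most ~2.8e-21 J of work,
# so powering a boat would take an astronomical number of bits.
print(max_work_per_bit(290.0))
[/code]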
 
  • #4
StatMechGuy: "How is information related to energy? Information is related to entropy, if you're using Shannon's entropy".
dG=dH-TdS (Gibs)
If you accept that information is correlated to entropy, you must accept also that it is correlated to energy.
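
One way to make that chain explicit (a standard identification, not spelled out in this thread): Shannon entropy in bits converts into thermodynamic entropy through Boltzmann's constant, and then enters the Gibbs relation:

[tex]S_{\mathrm{thermo}} = k_B \ln 2 \; H_{\mathrm{Shannon}}, \qquad \Delta G = \Delta H - T\,\Delta S[/tex]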

cesiumfrog: "If entropy is chaos, or lack of knowledge, then information is a reduction of entropy and a measure of how much energy you can harness."
How can you measure that?
 
  • #5
cesiumfrog said:
OTOH, Maxwell pointed out that if he gave a demon enough information to know the trajectory of every molecule in that ocean, then the demon could extract every ounce of energy to use for his own purposes.

Doesn't the demon itself pose a problem? How much energy does it cost to track the trajectory of every molecule?
 
  • #6
I look at it this way.

Any gain in information is a loss in entropy: information is order, and entropy is a measure of chaos or disorder.

Measurement increases order because it is a process of correlation, a mapping of variables in an ordered fashion. It reduces uncertainty and increases order.

Energy is a quantification of this process.
 
  • #7
If you accept that information is correlated with entropy, you must also accept that it is correlated with energy.

Through temperature. I can have a high-entropy, low-energy state or a low-entropy, high-energy state; it depends on the temperature. So it's not that direct.
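
A minimal numerical sketch of what "through temperature" means, using a hypothetical two-level system (my own illustration, not from the thread): entropy and energy are tied to each other only through the thermodynamic relation 1/T = dS/dE.

[code]
import math

def stats(T: float, gap: float = 1.0):
    """Mean energy and Gibbs entropy (units k_B = 1) of levels {0, gap} at temperature T."""
    p1 = math.exp(-gap / T) / (1.0 + math.exp(-gap / T))  # upper-level occupation
    E = gap * p1
    S = -((1 - p1) * math.log(1 - p1) + p1 * math.log(p1))
    return E, S

# Finite-difference check that dS/dE = 1/T: the entropy-energy link is
# mediated by the temperature, not direct.
T = 0.7
E1, S1 = stats(T - 1e-4)
E2, S2 = stats(T + 1e-4)
print((S2 - S1) / (E2 - E1), 1.0 / T)
[/code]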
 
  • #8
3trQN: "Any gain in information is a loss in entropy: information is order, and entropy is a measure of chaos or disorder."

Is there any proof that any gain in information is a loss in entropy?


Let me rephrase my question: if we take a specific Hamiltonian (let's say time-dependent) acting on a state, we get a linear superposition of several energy levels.
But when we measure the energy of the state, we obtain only one specific energy among all the levels. Does this mean that information leads to the reduction, or that the measurement leads to the reduction?
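
As a toy picture of the question (my own sketch, using a hypothetical 2x2 Hamiltonian): the state assigns Born-rule probabilities to the energy levels, and a single measurement returns exactly one of them.

[code]
import numpy as np

H = np.array([[1.0, 0.5],
              [0.5, 2.0]])                 # hypothetical 2x2 Hamiltonian
energies, eigvecs = np.linalg.eigh(H)      # energy levels and eigenstates

psi = np.array([1.0, 1.0]) / np.sqrt(2)    # a prepared superposition
probs = np.abs(eigvecs.conj().T @ psi)**2  # Born-rule weight of each level

rng = np.random.default_rng(0)
outcome = rng.choice(energies, p=probs)    # one measurement -> one energy
print(energies, probs, outcome)
[/code]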
 
  • #9
StatMechGuy: "Through temperature. I can have a high-entropy, low-energy state or a low-entropy, high-energy state; it depends on the temperature. So it's not that direct."

Can you help me understand what temperature is?
Originally it was introduced for the description of statistical systems. Then one obtains [tex]U = \frac{1}{2} k T[/tex], a relation that holds not only for a single particle but for each degree of freedom of a single particle. Thus one obtains 1 eV ≈ 11,600 K, and therefore it correctly describes high-energy physics. Is that right?
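
(For reference, the conversion quoted above can be checked in one line, taking the standard value of Boltzmann's constant in eV/K.)

[code]
K_B_EV = 8.617333e-5   # Boltzmann constant, eV/K

print(1.0 / K_B_EV)    # ~11600 K: the temperature at which kT = 1 eV
[/code]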
 
  • #10
StatMechGuy: "How is information related to energy? Information is related to entropy, if you're using Shannon's entropy."

Shannon's theory is a theory of communication (it concerns information rate, not information as such). For example, white noise provides the maximum information rate.

cesiumfrog: "information is a reduction of entropy and a measure of how much energy you can harness."

I think you should also take negative statements into account. It seems to me that they too provide information. But then information can't be such a measure.
 
  • #11
Anonym said:
Can you help me understand what temperature is?
Originally it was introduced for the description of statistical systems. Then one obtains [tex]U = \frac{1}{2} k T[/tex], a relation that holds not only for a single particle but for each degree of freedom of a single particle. Thus one obtains 1 eV ≈ 11,600 K, and therefore it correctly describes high-energy physics. Is that right?

From my understanding of the situation, temperature can be viewed several ways. In one case, from a thermodynamic standpoint, you can define temperature as a constraint function of the configuration space when two systems are in thermodynamic equilibrium. Therefore, when two systems are in equilibrium, their temperature is the same.

The other way I've seen it come in is as a Lagrange multiplier for maximizing the entropy of a system at constant energy. You can define both quantities in terms of the density operator, and then maximize the entropy (defined as [tex]S = -\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})[/tex]) subject to the constraint that the energy expectation value is constant. When you do this, you recognize something that acts like the temperature, and therefore you call it "temperature".

I believe Pauli's "Treatise on Thermodynamics" gives a more historical definition of temperature in terms of ideal gases, but this definition predates quantum mechanics and is probably not the greatest way to go about things.

Edit: As for the Shannon entropy thing, it is very easy to associate entropy with a lack of information about a system. In a state of zero entropy, i.e. full information, it would be possible for us to just build an enormous computer and numerically integrate the equations of motion from that point. The fact is, in a thermodynamic system we treat the system as a probabilistic one, since it is intractable to do what I just said. Shannon's entropy is very applicable to thermodynamics.
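
A small numerical illustration of the Lagrange-multiplier picture (my own sketch over a hypothetical four-level spectrum, using scipy): maximizing the entropy at fixed mean energy reproduces Boltzmann weights, with the multiplier playing the role of 1/kT.

[code]
import numpy as np
from scipy.optimize import minimize

E = np.array([0.0, 1.0, 2.0, 3.0])     # hypothetical discrete spectrum
E_MEAN = 1.2                            # constraint: fixed mean energy

def neg_entropy(p):
    return np.sum(p * np.log(p))        # minimize -S  <=>  maximize S

cons = ({'type': 'eq', 'fun': lambda p: np.sum(p) - 1.0},   # normalization
        {'type': 'eq', 'fun': lambda p: p @ E - E_MEAN})    # mean energy
res = minimize(neg_entropy, x0=np.full(4, 0.25),
               bounds=[(1e-9, 1.0)] * 4, constraints=cons)

p = res.x
beta = np.log(p[0] / p[1]) / (E[1] - E[0])  # recover the multiplier 1/kT
print(p, beta)  # p_i is proportional to exp(-beta * E_i)
[/code]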
 
  • #12
StatMechGuy: "From my understanding of the situation, temperature can be viewed several ways."

In both cases considered, you use many-body (many-particle) systems (statistical ensembles). I expect that, when applied to a single particle, the same notion should lead to nonsense, but this does not happen. That is the content of my question.
 
  • #13
You can't talk about the temperature of a single particle. Temperature is a result of having a statistical ensemble. From a statistical-mechanical standpoint, temperature dictates how the particles are distributed through the energy spectrum of the system. I'm fairly confident you can't talk about temperature without talking about a large number of particles.
 
  • #14
Thank you.
 
  • #15
I think the purpose of life is to transform the energy of a sun into (genetic and structural) information.

If only we had more knowledge to exploit outside energies...

http://jaxx.upcnet.ro [Broken]: a video about how much more energy is waiting for us.
 

1. What is the difference between energy and information?

Energy is a physical quantity that is associated with the ability to do work, while information is a measure of the organization and complexity of a system. Energy can be converted into different forms, such as mechanical, electrical, or chemical energy, but information cannot be converted in the same way.

2. How are energy and information related?

There is a strong correlation between energy and information in the sense that both are necessary for a system to function effectively. Energy is required to process and transmit information, while information is needed to make efficient use of energy. This relationship is often referred to as the "energy-information nexus."

3. How does the concept of entropy play a role in the correlation between energy and information?

Entropy is a measure of the disorder or randomness in a system. In the context of energy and information, entropy can be seen as a measure of the amount of information needed to describe a system or the amount of energy needed to maintain order. As systems become more organized, the amount of energy and information needed increases.
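
As a concrete illustration (a minimal sketch; the FAQ itself gives none): the Shannon entropy of a probability distribution measures, in bits, the information needed on average to describe the system's state, and a more disordered (more uniform) distribution needs more bits.

[code]
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([0.97, 0.01, 0.01, 0.01]))  # ordered: ~0.24 bits
print(shannon_entropy_bits([0.25, 0.25, 0.25, 0.25]))  # disordered: 2.0 bits
[/code]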

4. Can energy and information be exchanged or traded?

While energy and information are closely related, they cannot be directly exchanged or traded. However, energy can be used to create or transmit information, and information can be used to optimize and conserve energy. This exchange between energy and information is crucial for the functioning of many systems.

5. How does understanding the correlation between energy and information have practical applications?

Understanding the relationship between energy and information can have numerous practical applications in fields such as engineering, biology, and economics. For example, it can help optimize energy usage in buildings, improve communication systems, and inform decision-making processes in various industries. Additionally, this understanding can lead to the development of more efficient and sustainable technologies.
