Theoretical status of the Landauer Principle

I am interested in understanding the current theoretical status of Landauer's Principle and related ideas. I am looking for key papers and results on the subject.

I will highlight one key paper.

An improved Landauer principle with finite-size corrections

Abstract:
Landauer's principle relates entropy decrease and heat dissipation during logically irreversible processes. Most theoretical justifications of Landauer's principle either use thermodynamic reasoning or rely on specific models based on arguable assumptions. Here, we aim at a general and minimal setup to formulate Landauer's principle in precise terms. We provide a simple and rigorous proof of an improved version of the principle, which is formulated in terms of an equality rather than an inequality. The proof is based on quantum statistical mechanics concepts rather than on thermodynamic argumentation. From this equality version, we obtain explicit improvements of Landauer's bound that depend on the effective size of the thermal reservoir and reduce to Landauer's bound only for infinite-sized reservoirs.
 

Demystifier

Landauer's principle is associated with statistical physics, not necessarily with quantum statistical physics. In its simplest and most general form, it says that erasing information with Shannon entropy ##S_{\rm sh}## increases the entropy of the environment by at least that amount, ##\Delta S \geq S_{\rm sh}##. It can be thought of as a version of the 2nd law of statistical physics, according to which the total entropy (which does not necessarily need to have the form of thermodynamic entropy) cannot decrease.
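To spell out the standard one-bit special case: for a thermal bath at temperature ##T## one has ##\Delta S_{\rm bath} = Q/T##, and erasing one bit corresponds to ##S_{\rm sh} = k\ln 2##, so
$$Q = T\,\Delta S_{\rm bath} \geq kT\ln 2,$$
which is the usual form of Landauer's bound.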
 
I was trying to understand the Landauer principle using a two-level system (TLS).

Consider a TLS with Hamiltonian ##H(x) = x\,|1\rangle\langle 1| + 0\,|0\rangle\langle 0|##.

The bit is initially in an unknown state, and ##x## is initially zero. We will describe a protocol to reset the bit.

We allow ##x## to increase quasistatically while the system is coupled to a heat bath at temperature ##T## (inverse temperature ##\beta##).

The infinitesimal work done is equal to the probability that the system is in ##|1\rangle## times the change in energy,

$$dW = p(x)\,dx,$$

where
##p(x) = \frac{e^{-\beta x}}{1+e^{-\beta x}} = \frac{1}{1+e^{\beta x}}##.

##x## is slowly increased to infinity in the quasistatic limit, so the process is reversible.
Integrating, the work done is
$$W = \int_0^\infty p(x)\,dx = \frac{1}{\beta}\ln 2 = kT\ln 2.$$

As ##x \to \infty## the system ends up in the ground state, and initially ##\langle E\rangle = 0##, so the system's average energy does not change.
Therefore the heat delivered to the bath equals the work done: ##Q = W = kT\ln 2##.
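As a quick numerical check of this integral (a minimal sketch in Python; units chosen so that ##k = T = 1##, hence ##\beta = 1##):

Python:
import numpy as np
from scipy.integrate import quad

# Quasistatic work to raise the upper level from x = 0 to infinity:
#   W = integral_0^inf p(x) dx, with p(x) = 1 / (1 + exp(beta*x)).
# Units: k = T = 1 (beta = 1), so the expected result is ln 2 ~ 0.6931.
beta = 1.0

def p(x):
    # same function, written as exp(-bx) / (1 + exp(-bx)) to avoid overflow at large x
    return np.exp(-beta * x) / (1.0 + np.exp(-beta * x))

W, err = quad(p, 0.0, np.inf)
print(W, np.log(2.0))  # both print ~ 0.693147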

It can be shown that the TLS can also operate as a reversible engine at Carnot efficiency. Consider the following process:

1) Start with a bit in an unknown state and with ##x = 0##. We reset the bit against a cold bath at ##T_c##, by the process described above. As ##x## goes from 0 to ##\infty##, the work done = heat supplied to the cold bath = ##kT_c\ln 2##.
2) Now ##x = \infty##. We couple the system to a hot bath at ##T_h## and slowly let ##x## decrease back to zero; the work extracted = heat drawn from the hot bath = ##kT_h\ln 2##.
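Putting the two strokes together, the net work output is ##k(T_h - T_c)\ln 2## while the heat drawn from the hot bath is ##kT_h\ln 2##, so the efficiency is
$$\eta = \frac{W_{\rm net}}{Q_h} = \frac{k(T_h - T_c)\ln 2}{kT_h\ln 2} = 1 - \frac{T_c}{T_h},$$
i.e. the Carnot efficiency, as claimed.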

Available information can be used to do work by extracting energy from a bath reversibly: heat is extracted from the bath, work is done, and the available information is lost in the process. This is a reversible process. Conversely, work can be used to erase information, and when you erase one bit a heat bath must absorb at least ##kT\ln 2## of heat.

While we are using the language of quantum mechanics, this applies equally to classical mechanics, since we are not exploiting any properties associated with coherence.
 

A. Neumaier

the infinitesimal work done is equal to the probability that the system is in ##|1\rangle## times the change in energy
$$dW = p(x)\,dx$$
where ##p(x) = \frac{e^{-\beta x}}{1+e^{-\beta x}} = \frac{1}{1+e^{\beta x}}##
Why? In maximal generality, work is (generalized) force times (generalized) displacement, dW=Fdx. But what would allow you to treat probability as a force?
 
Why? In maximal generality, work is (generalized) force times (generalized) displacement, dW=Fdx. But what would allow you to treat probability as a force?
The work done here is the change in energy. The lower level has zero energy, so it does not contribute. So the work done should be the probability that the particle is in the upper state times the change in energy of that state. I assumed the probability does not change during an infinitesimal transformation.

Fluctuations don't matter because I am working in the quasistatic limit.

Edit: also note that ##x## in my notation is an energy.
 

A. Neumaier

The work done here is the change in energy. The lower level has zero energy, so it does not contribute. So the work done should be the probability that the particle is in the upper state times the change in energy of that state. I assumed the probability does not change during an infinitesimal transformation.

Fluctuations don't matter because I am working in the quasistatic limit.

Edit: also note that ##x## in my notation is an energy.
Thus you treat energy as a generalized displacement and probability as a generalized force. I still see no physical reason why this should make sense. Usually one applies a prepared force, gets a displacement as a result, and deduces that work has been done by the force.

Your choice doesn't fit this pattern, hence seems quite arbitrary.
 
Thus you treat energy as a generalized displacement and probability as a generalized force. I still see no physical reason why this should make sense. Usually one applies a prepared force, gets a displacement as a result, and deduces that work has been done by the force.

Your choice doesn't fit this pattern, hence seems quite arbitrary.
I feel it makes sense, and it is equivalent to the usual interpretation of work as ##F\,dx## in the context of classical potentials.

For concreteness, consider the following classical potential:

##V(x) = \infty## for ##x < 0## and ##x > 1##,
##V(x) = 0## for ##x \in (0, 1/2)##,
##V(x) = E## for ##x \in (1/2, 1)##.

This is the classical analogue of the TLS I consider, with the two levels corresponding to the particle being in ##(0,1/2)## or ##(1/2,1)##.

Now change the value of ##E## to ##E + dE## via some external field. If the particle is in ##(0,1/2)##, the work done is zero; if it is in ##(1/2,1)##, the work done is ##dE##.

In the quasistatic limit, the work done is given by ##p\,dE## (in this new notation), where ##p## is the probability that the particle is in ##(1/2,1)##, given by the Boltzmann distribution.
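For this potential the probability follows from the configurational partition function (the momentum integral is common to both regions and cancels):
$$Z(E) = \int_0^{1/2} dx + \int_{1/2}^{1} e^{-\beta E}\,dx = \frac{1 + e^{-\beta E}}{2}, \qquad p(E) = \frac{\tfrac{1}{2}\,e^{-\beta E}}{Z(E)} = \frac{1}{1 + e^{\beta E}},$$
which is exactly the two-level occupation probability used above.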
 

A. Neumaier

I feel it makes sense, and it is equivalent to the usual interpretation of work as ##F\,dx## in the context of classical potentials.
I still don't think it is sensible, hence cannot discuss it further.
 

Lord Jestocost

Why? In maximal generality, work is (generalized) force times (generalized) displacement, dW=Fdx. But what would allow you to treat probability as a force?
The equation ##dW = p(x)\,dx## means nothing more than: raising an energy level ##E## that is populated with probability ##p(E)## by an amount ##dE## costs work ##p(E)\,dE##.
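One compact way to write the same thing (a standard bookkeeping identity): for a quasistatic change of the Hamiltonian, the work increment is the expectation value of ##dH## in the instantaneous thermal state ##\rho##,
$$dW = \operatorname{Tr}(\rho\,dH) = \langle 1|\rho|1\rangle\,dx = p(x)\,dx,$$
since ##dH = dx\,|1\rangle\langle 1|## for the Hamiltonian ##H(x)## above; the remaining term ##\operatorname{Tr}(d\rho\,H)## in ##dU = \operatorname{Tr}(d\rho\,H) + \operatorname{Tr}(\rho\,dH)## is the heat.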
 
I just noticed that the same argument also appears in "Information erasure without energy cost"; see Section 2, "Information erasure using a thermal reservoir".

Also, in the context of the OP, this paper is a very important development in the subject.
 

A. Neumaier

Thus you treat energy as a generalized displacement and probability as a generalized force. I still see no physical reason why this should make sense. Usually one applies a prepared force, gets a displacement as a result, and deduces that work has been done by the force.

Your choice doesn't fit this pattern, hence seems quite arbitrary.
Work is a notion that applies meaningfully only to actual systems, i.e. single systems.

As commonly understood, probability is not a property of a system but of an ensemble of systems. Thus you cannot apply a given probability to a given single process and obtain a displacement of energy as a result.
But this is exactly how Jarzynski's paper starts; look at equation 3 of

A nonequilibrium equality for free energy differences
No. He considers a classical system between equilibrium situations, and the ensembles he subsequently discusses are of imagined copies, not real ensembles. This makes a huge difference.

I just noticed that the same argument also appears in "Information erasure without energy cost"; see Section 2, "Information erasure using a thermal reservoir".
Yes, after (2.1) he makes the same meaningless assumptions as you do.

Finding an invalid statement elsewhere doesn't make the statement valid.
 

Lord Jestocost

I am interested in understanding the current theoretical status of Landauer's Principle and related ideas.
Maybe the following papers by L. B. Kish et al. and the references therein might be of help:

Critical Remarks on Landauer’s principle of erasure–dissipation

Demons: Maxwell’s demon, Szilard’s engine and Landauer’s erasure–dissipation
 
