A Re-interpretation of the third law of thermodynamics

  1. Jun 18, 2017 #1
    Consider the first paragraph of this paper - https://arxiv.org/abs/gr-qc/0611004:

    A fundamental problem in thermodynamic and statistical physics is to study the response of a system in thermal equilibrium to an outside perturbation. In particular, one is typically interested in calculating the relaxation timescale at which the perturbed system returns to a stationary, equilibrium configuration. Can this relaxation time be made arbitrarily small? That the answer may be negative is suggested by the third-law of thermodynamics, according to which the relaxation time of a perturbed system is expected to go to infinity in the limit of absolute zero of temperature. Finite temperature systems are expected to have faster dynamics and shorter relaxation times—how small can these be made? In this paper we use general results from quantum information theory in order to derive a fundamental bound on the maximal rate at which a perturbed system approaches thermal equilibrium.

    Take a system in thermal equilibrium and perturb it. How long does it take before the system relaxes back to a stationary, equilibrium configuration?

    The excerpt says that this relaxation time cannot be made arbitrarily small, because the third law of thermodynamics suggests that the relaxation time of a perturbed system goes to infinity in the limit of absolute zero temperature.
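
    For concreteness, the bound the paper goes on to derive scales as 1/T (as I read the abstract, something of the form [itex]\tau \gtrsim \frac{\hbar}{\pi k_B T}[/itex]). Here is a quick Python sketch, just to show how a bound of that form diverges as T → 0 (the exact prefactor is the paper's result, so treat the numbers as illustrative only):

[code]
import math

# Illustrative only: a relaxation-time bound of the form tau >= hbar/(pi*k_B*T).
# The 1/T scaling is the point; the prefactor is taken from my reading of the abstract.
hbar = 1.054571817e-34  # J*s
k_B = 1.380649e-23      # J/K

def tau_min(T):
    """Lower bound on the relaxation time at temperature T (in kelvin)."""
    return hbar / (math.pi * k_B * T)

for T in (300.0, 1.0, 1e-3, 1e-6):
    print(f"T = {T:g} K  ->  tau_min ~ {tau_min(T):.3e} s")
# The bound grows without limit as T -> 0, which is the "infinite relaxation
# time at absolute zero" reading of the third law I'm asking about.
[/code]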

    The third law of thermodynamics (https://en.wikipedia.org/wiki/Third_law_of_thermodynamics) is stated in the form

    The entropy of a perfect crystal at absolute zero is exactly equal to zero.

    How can this statement be re-interpreted to mean that the relaxation time of a perturbed system is infinite at zero temperature?
     
  3. Jun 18, 2017 #2

    jambaugh

    Science Advisor
    Gold Member

    The third law can be expressed in many forms. Typically as "the entropy of an isolated system never decreases". How that relates to the statement about perturbed systems is what the paper purports to show, so I'd say read, or reread, the paper. (And maybe the paper's claim is bogus; just because it is on arXiv doesn't mean it has been peer reviewed and approved for publication. I'll take a look after this post and follow up if appropriate.)

    [Edit] { upon quick review I don't think the paper's claim is "bogus". }

    Note that in statistical mechanics, temperature is also defined in terms of energy and entropy. Specifically, it is (up to the Boltzmann constant) the reciprocal of the Lagrange multiplier [itex] \beta = \frac{1}{k_B T}[/itex] you get when you maximize the entropy of a system (thus asserting thermodynamic equilibrium) under the constraint that the expectation value, or mean value, of the energy is a given value. You should focus on this to better understand the claim.

    Lagrange multipliers can be interpreted as forces of constraint and thus also apply to rates as we perturb those constraints... rather in line with what the paper claims.
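
    If it helps, here is a toy numerical version of that maximization (three invented energy levels, so purely a sketch): solving for the Lagrange multiplier [itex]\beta[/itex] that reproduces a given mean energy shows [itex]\beta[/itex] diverging, i.e. T → 0, as the constrained mean energy approaches the ground-state energy.

[code]
import numpy as np

# Toy system: three energy levels (arbitrary units, invented for illustration).
E = np.array([0.0, 1.0, 2.0])

def mean_energy(beta):
    """Mean energy of the maximum-entropy (Gibbs) distribution p_i ~ exp(-beta*E_i)."""
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

def beta_for(U_target, lo=1e-6, hi=1e3, iters=200):
    """Bisection for the Lagrange multiplier beta that gives <E> = U_target."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_energy(mid) > U_target:   # mean energy decreases as beta grows
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for U in (0.9, 0.5, 0.1, 0.01):
    b = beta_for(U)
    print(f"<E> = {U:4.2f}  ->  beta = {b:8.3f}  (T ~ 1/beta = {1/b:.4f})")
# As the constrained mean energy approaches the ground-state energy (0 here),
# beta blows up and T -> 0: equilibrium pins the system into its lowest level.
[/code]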
     
    Last edited: Jun 18, 2017
  4. Jun 18, 2017 #3

    stevendaryl

    Staff Emeritus
    Science Advisor

    Isn't that the second law?
     
  5. Jun 18, 2017 #4

    jambaugh

    Science Advisor
    Gold Member

    Well, color me pink... you are quite right, stevendaryl, and I should read more carefully.
    [edit] I added a strikethrough in case anyone reads that without reading what follows.
     
    Last edited: Jun 18, 2017
  6. Jun 18, 2017 #5

    jambaugh

    Science Advisor
    Gold Member

    I now better understand the OP's question, which is how the author got from crystals at zero temperature to an infinite relaxation time.
    The more general form of the third law is that as the temperature of any system approaches zero, its entropy approaches a finite value. That value is related to the degeneracy of the lowest energy state. I'm thinking this is a necessary assumption in the argument the paper makes about the relaxation time, which, from my quick read, invokes a bound on information transfer rates.
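
    As a sanity check on that statement (with a made-up spectrum, so treat it only as a sketch): the Gibbs entropy of a finite-level system whose ground state is g-fold degenerate tends to [itex]k_B \ln g[/itex] as T → 0, rather than to zero.

[code]
import numpy as np

k_B = 1.0  # work in units with k_B = 1

# Toy spectrum: a doubly degenerate ground state (g = 2) plus a few excited
# levels. The spacings are invented purely for illustration.
E = np.array([0.0, 0.0, 1.0, 2.0, 3.0])

def entropy(T):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i of the canonical distribution."""
    w = np.exp(-E / (k_B * T))
    p = w / w.sum()
    p = p[p > 0]          # drop numerically frozen-out levels to avoid log(0)
    return -k_B * np.sum(p * np.log(p))

for T in (5.0, 1.0, 0.2, 0.05, 0.01):
    print(f"T = {T:5.2f}   S = {entropy(T):.5f}")
print(f"k_B ln 2 = {np.log(2.0):.5f}   <- limiting value set by the ground-state degeneracy")
[/code]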
     
  7. Jun 18, 2017 #6

    radium

    Science Advisor
    Education Advisor

    The idea of such a bound on the relaxation rate has actually been around for a long time in the context of quantum critical systems (which saturate the bound). It inspired the KSS viscosity bound (which can be violated) proposed in the context of the AdS/CFT correspondence and other proposed transport bounds. Sometimes this timescale is called Planckian dissipation.

    The important idea for this bound is the 1/T scaling; the constant in front was not specified originally. The idea comes from the concept of universality: in systems that saturate this bound, the temperature is the only scale. It's also related to the phase of the wave function in a quantum system. In a quantum system there is a timescale called the decoherence time, which has no classical analogue. It is the time over which a wave function remembers its phase, and it is finite above zero temperature.
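
    To put a rough number on that 1/T timescale (the order-one prefactor is left unspecified, as noted above):

[code]
# Order-of-magnitude "Planckian" dissipation time, tau ~ hbar / (k_B * T).
# The order-one prefactor is not fixed by this scaling argument.
hbar = 1.054571817e-34  # J*s
k_B = 1.380649e-23      # J/K

def planckian_time(T):
    """Characteristic 1/T timescale at temperature T (in kelvin)."""
    return hbar / (k_B * T)

for T in (300.0, 4.2, 0.01):
    print(f"T = {T:7.2f} K  ->  hbar/(k_B T) ~ {planckian_time(T):.2e} s")
# About 2.5e-14 s at room temperature; the timescale grows without bound as T -> 0,
# consistent with the divergent relaxation time discussed above.
[/code]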
     