In order for decoherence to work, does it require the second law of thermodynamics to be valid?

In summary: the second law of thermodynamics does not have to be valid in order for decoherence to work. Decoherence is the process by which a quantum system loses coherence through interaction with its surrounding environment; it requires a large environment, but that environment need not be in thermal equilibrium.
  • #1
StevieTNZ
1,933
878
In order for decoherence to work, does it require the second law of thermodynamics to be valid?
 
  • #2


Could you elaborate? I don't see where you are coming from...

On the other hand, microscopically, do you see a possibility for the second law of thermodynamics NOT to be valid? After all, it is, microscopically, almost a tautology: macrostates with a massive number of corresponding microstates are more probable than macrostates with only a small number of corresponding microstates.
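That counting argument can be sketched numerically with a toy coin-flip model (my own illustration, not from the thread): every individual head/tail sequence is equally likely, but macrostates with many microstates dominate.

```python
from math import comb

# Toy model of the counting argument: N two-state "particles" (coins).
# Every specific head/tail sequence (microstate) is equally likely, but
# the macrostate "k heads" contains comb(N, k) microstates, so the
# balanced macrostates overwhelmingly dominate.
N = 100
total = 2 ** N

p_half = comb(N, N // 2) / total   # most probable macrostate (50 heads)
p_all = comb(N, N) / total         # all heads: a single microstate

print(f"P(50 heads)  = {p_half:.4f}")
print(f"P(100 heads) = {p_all:.2e}")
print(f"ratio        = {p_half / p_all:.2e}")
```

Even for only 100 coins the balanced macrostate is more probable than the all-heads macrostate by a factor of about 10^29, which is the microscopic sense in which the second law is "almost a tautology".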
 
  • #3


I guess what I'm asking is how does decoherence work? Some say thermal randomness is why you average out the possibilities of the wavefunction to get a classical-looking result.
 
  • #4


As for environmentally induced decoherence, you need a large environment, but it doesn't have to be in equilibrium. This is a commonly misstated/misleading condition in the literature. A thermal environment is sufficient, but not necessary.
 
  • #5


StevieTNZ said:
In order for decoherence to work, does it require the second law of thermodynamics to be valid?

Both decoherence and the second law of thermodynamics are consequences of quantum mechanics.

StevieTNZ said:
I guess what I'm asking is how does decoherence work?

The website http://www.decoherence.de/ (by Erich Joos, one of the professionals working on the subject) gives reliable information on decoherence on various levels of depth.
 
  • #6


Read the first page. Nice article. Thanks for sharing.
 
  • #7


StevieTNZ said:
I guess what I'm asking is how does decoherence work? Some say thermal randomness is why you average out the possibilities of the wavefunction to get a classical-looking result.

Why is randomness involved when describing something which is otherwise perfectly easy to understand? I find "randomness" akin to the buzzwords so readily attached to physical sentences.

[tex]|\psi>= \sum_i |i><i|\psi>[/tex]

[tex]|before>= \sum_i |i>|\epsilon><i|\psi>[/tex]

which evolves into

[tex]|after> = \sum_i |\epsilon_i><i|\psi>[/tex]

Where [tex]\epsilon[/tex] is the environment, so not only can the system be affected by the environment, but the environment can be affected by the system! Basically you could imagine leaving quantum waves to oscillate inside a box. They may decohere eventually into some kind of tangible state. This is a very simple explanation of decoherence.
 
Last edited:
  • #8


QuantumClue said:
Why is randomness involved when describing something which is otherwise perfectly easy to understand? I find "randomness" akin to the buzzwords so readily attached to physical sentences.

[tex]|\psi>= \sum_i |i><i|\psi>[/tex]

[tex]|before>= \sum_i |i><\epsilon><i|\psi>[/tex]

which evolves into

[tex]|after> = \sum_i |\epsilon_i><i|\psi>[/tex]

Where [tex]\epsilon[/tex] is the environment, so not only can the system be affected by the environment, but the environment can be affected by the system! Basically you could imagine leaving quantum waves to oscillate inside a box. They may decohere eventually into some kind of tangible state. This is a very simple explanation of decoherence.

I don't understand your equations. You have not defined your variables. Is [tex]\psi[/tex] the initial system state and [tex]\epsilon[/tex] the initial environment state? Then I think the before state of the system + environment should have a ket and not an expectation value. Furthermore, where is the decoherence? You have said nothing about interference or the reduced density matrix. If the environment is small, you have only expressed a loss of the factorization (correlation). There can still be recoherence at a later time (Poincare recurrence).
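As a minimal sketch of what I mean: assuming the intended post-interaction state keeps the system ket, i.e. [tex]|after\rangle = \sum_i |i\rangle |\epsilon_i\rangle \langle i|\psi\rangle[/tex], tracing out the environment gives the reduced density matrix

[tex]\rho_S = \mathrm{Tr}_E\, |after\rangle\langle after| = \sum_{i,j} \langle i|\psi\rangle \langle \psi|j\rangle\, \langle \epsilon_j|\epsilon_i\rangle\, |i\rangle\langle j|[/tex]

The interference (off-diagonal, [tex]i \neq j[/tex]) terms are multiplied by the environment overlaps [tex]\langle \epsilon_j|\epsilon_i\rangle[/tex]; for a large environment these overlaps rapidly become negligible, and that suppression of interference is the decoherence.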

Randomness is very appropriate in the discussion of environmentally induced decoherence. The large environment acts as a source of random noise as can be clearly seen in the Langevin and influence functional (path integral) formalisms. One can directly attribute the environmentally induced decoherence to its induced fluctuations and dissipation. The three phenomena are strictly related. Out of laziness I would cite my own (not yet published) paper: http://arxiv.org/abs/1011.3286. For stationary Gaussian processes, the three phenomena can be related:

[tex]\underbrace{\tilde{\alpha}(\omega)}_{decoherence} =\; \underbrace{\tilde{\nu}(\omega)}_{noise} \; - \omega \underbrace{ \tilde{\gamma}(\omega)}_{damping}[/tex]

These three kernels characterize the non-Markovian (algebraic) Lindblad dissipator, noise (or diffusion), and damping.
 
Last edited by a moderator:
  • #9


C. H. Fleming said:
I don't understand your equations. You have not defined your variables. Is [tex]\psi[/tex] the initial system state and [tex]\epsilon[/tex] the initial environment state? Then I think the before state of the system + environment should have a ket and not an expectation value. Furthermore, where is the decoherence? You have said nothing about interference or the reduced density matrix. If the environment is small, you have only expressed a loss of the factorization (correlation). There can still be recoherence at a later time (Poincare recurrence).

Randomness is very appropriate in the discussion of environmentally induced decoherence. The large environment acts as a source of random noise as can be clearly seen in the Langevin and influence functional (path integral) formalisms. One can directly attribute the environmentally induced decoherence to its induced fluctuations and dissipation. The three phenomena are strictly related. Out of laziness I would cite my own (not yet published) paper: http://arxiv.org/abs/1011.3286. For stationary Gaussian processes, the three phenomena can be related:

[tex]\underbrace{\tilde{\alpha}(\omega)}_{decoherence} =\; \underbrace{\tilde{\nu}(\omega)}_{noise} \; - \omega \underbrace{ \tilde{\gamma}(\omega)}_{damping}[/tex]

These three kernels characterize the non-Markovian (algebraic) Lindblad dissipator, noise (or diffusion), and damping.

[tex]|i>|\epsilon>[/tex] is the tensor product [tex]|i> \otimes |\epsilon>[/tex]. The i's form the einselected basis.

The joint basis for the system and environment is obtained by tensor multiplying the basis vectors of the subsystems together, so the joint state is the "before" state.

As for quantum randomness, you might state that it is important in understanding environmental decoherence. I argue that is only a dogma of the physical sciences. The main example used by physics today to describe randomness exists in the form of radiation. It seems, however, that there may be a lack of knowledge of the system, and not fully understanding the system is analogous to stating there is something which appears random. But otherwise, you don't really need to know if a system is random or not. You could quite easily believe in a Pilot Wave theory of matter and still understand the basic dynamics of decoherence.
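For concreteness, the tensor-product construction mentioned above can be sketched numerically (the particular state vectors here are just illustrative values):

```python
import numpy as np

# |i> ⊗ |ε>: the joint "before" state is the tensor (Kronecker) product
# of a system basis vector and the environment state.
ket0 = np.array([1.0, 0.0])       # system basis state |0>
eps = np.array([0.6, 0.8])        # some normalized environment state |ε>
joint = np.kron(ket0, eps)        # vector in the 4-dimensional joint space

print(joint)                      # [0.6 0.8 0.  0. ]
print(np.linalg.norm(joint))      # 1.0: the product state stays normalized
```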
 
Last edited by a moderator:
  • #10


Nice paper, by the way. Just looking over it now.
 
  • #11


QuantumClue said:
[tex]|i>|\epsilon>[/tex] is the tensor product [tex]|i> \otimes |\epsilon>[/tex]. The i's form the einselected basis.

The joint basis for the system and environment is obtained by tensor multiplying the basis vectors of the subsystems together, so the joint state is the "before" state.

As for quantum randomness, you might state that it is important in understanding environmental decoherence. I argue that is only a dogma of the physical sciences. The main example used by physics today to describe randomness exists in the form of radiation. It seems, however, that there may be a lack of knowledge of the system, and not fully understanding the system is analogous to stating there is something which appears random. But otherwise, you don't really need to know if a system is random or not. You could quite easily believe in a Pilot Wave theory of matter and still understand the basic dynamics of decoherence.

Yes, I thought that is what you meant to write, but look and see that you actually wrote [tex]\langle \epsilon \rangle[/tex], which is an expectation value that doesn't make sense.

Furthermore, a radiative field (like the electromagnetic field) does provide random noise to the system. Quantum fields are a class of large environments. I (and many others before me) have calculated these kernels for a scalar and electromagnetic field, e.g. in http://arxiv.org/abs/1012.5067 (also new and unpublished). So these are not divergent, but compatible explanations of environmentally induced decoherence. Even for a zero temperature electromagnetic field you drive the system with ground state fluctuations of the field. This is random in the Brownian sense, not in the anti-Bohmian sense that maybe you infer.

As for my decoherence paper, I apologize that it is a bit technical. It needs to be rewritten a bit.
 
Last edited by a moderator:
  • #12


C. H. Fleming said:
Yes, I thought that is what you meant to write, but look and see that you actually wrote [tex]\langle \epsilon \rangle[/tex], which is an expectation value that doesn't make sense.

Furthermore, a radiative field (like the electromagnetic field) does provide random noise to the system. Quantum fields are a class of large environments. I (and many others before me) have calculated these kernels for a scalar and electromagnetic field, e.g. in http://arxiv.org/abs/1012.5067 (also new and unpublished). So these are not divergent, but compatible explanations of environmentally induced decoherence. Even for a zero temperature electromagnetic field you drive the system with ground state fluctuations of the field. This is random in the Brownian sense, not in the anti-Bohmian sense that maybe you infer.

As for my decoherence paper, I apologize that it is a bit technical. It needs to be rewritten a bit.

I completely apologize, a total mistake, one I did not realize until you pointed it out. Yes, it is not an expectation value.

I do understand; many ambitious people nowadays refer to many experiments that exhibit, or may appear to exhibit, total randomness. I do not share this sentiment. I am a follower of the deterministic idea of quantum physics, where everything was predetermined (most likely at?) the big bang, where there is a [tex]\Psi[/tex]-state governing the entire universe. If I am to believe this, which I strongly do, then I do not believe that randomness is a physical entity, but rather a matter of knowledge about the system. I can make an otherwise completely randomized system determinable: I simply freeze the system, affecting the range of the quantum psi function describing it, by making periodic observations on my collection of atoms. Then I can have atoms remaining stable for many, many years, and with the right device, even forever. That will make a seemingly random system appear much more deterministic, and the reason why comes down to two linked factors: I have altered the state, and in doing so I have altered the knowledge which is maximally obtainable from the system.
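The "freezing by periodic observations" described above is essentially the quantum Zeno effect. A rough numerical sketch (an idealized two-level system with projective measurements; my own toy model, not from any cited paper):

```python
import numpy as np

# Quantum Zeno sketch: a two-level system rotating from |0> toward |1>
# by a total angle theta. Each of n equally spaced projective
# measurements collapses it back onto |0> with probability
# cos(theta/n)**2, so
#     P_survive(n) = cos(theta/n) ** (2 * n)  ->  1  as n grows:
# frequent observation "freezes" the state.
def survival(theta, n):
    return np.cos(theta / n) ** (2 * n)

theta = np.pi / 2   # unobserved, this rotation would empty |0> completely
for n in (1, 10, 100, 1000):
    print(n, survival(theta, n))
```

With a single final measurement the state has fully left |0>, while a thousand intermediate measurements keep it there with better than 99% probability.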

Yes, your paper is quite technical.
 
Last edited by a moderator:
  • #13


QuantumClue said:
I do understand; many ambitious people nowadays refer to many experiments that exhibit, or may appear to exhibit, total randomness. I do not share this sentiment. I am a follower of the deterministic idea of quantum physics, where everything was predetermined (most likely at?) the big bang, where there is a [tex]\Psi[/tex]-state governing the entire universe. If I am to believe this, which I strongly do, then I do not believe that randomness is a physical entity, but rather a matter of knowledge about the system. I can make an otherwise completely randomized system determinable: I simply freeze the system, affecting the range of the quantum psi function describing it, by making periodic observations on my collection of atoms. Then I can have atoms remaining stable for many, many years, and with the right device, even forever. That will make a seemingly random system appear much more deterministic, and the reason why comes down to two linked factors: I have altered the state, and in doing so I have altered the knowledge which is maximally obtainable from the system.

This is what I inferred from your words, but you should also understand that the open system exhibits randomness in the Brownian sense. This randomness would exist even classically (for positive temperature). The environment (or field) is very large and you cannot keep track of its every degree of freedom (there are, in fact, uncountably many). Its effect upon the system is mathematically equivalent to a stochastic process when you look at the dynamics of system observables (see Ford, Lewis and O'Connell, PRA 1988, "The Quantum Langevin Equation").
 
  • #14


C. H. Fleming said:
This is what I inferred from your words, but you should also understand that the open system exhibits randomness in the Brownian sense. This randomness would exist even classically (for positive temperature). The environment (or field) is very large and you cannot keep track of its every degree of freedom (there are, in fact, uncountably many). Its effect upon the system is mathematically equivalent to a stochastic process when you look at the dynamics of system observables (see Ford, Lewis and O'Connell, PRA 1988, "The Quantum Langevin Equation").

I don't personally see how Brownian motion implies randomness any more than a radiative system would appear random. Could you elaborate for me?
 
  • #15


QuantumClue said:
I don't personally see how Brownian motion implies randomness any more than a radiative system would appear random. Could you elaborate for me?

It doesn't; they are the same. Position coupling to the field gives you the motion damping of canonical Brownian motion, while momentum coupling to the field gives you the Abraham-Lorentz damping.

I am just pointing out that even in the classical theory where the closed system + environment evolution is widely accepted as deterministic, the open system still exhibits random dynamics. If the environment induces irreversible dynamics (e.g. decoherence), then the environment necessarily induces random fluctuations. You are free to believe that the closed system + environment is deterministic, but you should not discount the perspective of stochastic dynamics. It is as legitimate as the second law itself.
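The damping-noise link described above can be illustrated with a classical toy simulation (an Euler-Maruyama integration of a Langevin equation; my own sketch with arbitrary parameter values, not taken from the cited references):

```python
import numpy as np

# Classical Langevin toy model:
#     m dv = -m*gamma*v dt + sqrt(2*m*gamma*kT) dW
# The same gamma sets both the damping and the noise strength
# (fluctuation-dissipation), mirroring the decoherence/noise/damping
# relation quoted above.
rng = np.random.default_rng(0)
m, gamma, kT = 1.0, 1.0, 1.0
dt, steps, ntraj = 0.01, 5000, 4000

v = np.zeros(ntraj)                      # every trajectory starts at rest
kick = np.sqrt(2 * gamma * kT / m * dt)  # noise amplitude per time step
for _ in range(steps):
    v += -gamma * v * dt + kick * rng.standard_normal(ntraj)

# The ensemble equilibrates at <v^2> = kT/m regardless of the
# (deterministic) initial condition.
print(np.mean(v ** 2))
```

Even though each trajectory is generated by a simple update rule, the open system's statistics are those of a genuinely stochastic process, which is the point being made about the environment's effect.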
 
  • #16


C. H. Fleming said:
It doesn't; they are the same. Position coupling to the field gives you the motion damping of canonical Brownian motion, while momentum coupling to the field gives you the Abraham-Lorentz damping.

I am just pointing out that even in the classical theory where the closed system + environment evolution is widely accepted as deterministic, the open system still exhibits random dynamics. If the environment induces irreversible dynamics (e.g. decoherence), then the environment necessarily induces random fluctuations. You are free to believe that the closed system + environment is deterministic, but you should not discount the perspective of stochastic dynamics. It is as legitimate as the second law itself.

Thank you for clearing that up; I thought I was misunderstanding the crux of your argument.
 

1. What is decoherence and how does it relate to the second law of thermodynamics?

Decoherence is a process in physics where a quantum system, through interaction with its surrounding environment, loses quantum coherence and comes to exhibit classical behavior. It is closely related to the second law of thermodynamics, which states that the entropy, or disorder, of a closed system never decreases: as the system becomes entangled with a large environment, its reduced-state entropy increases while its coherence is lost.
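That entropy increase can be illustrated with a minimal toy calculation (my own sketch, not from the thread): as the off-diagonal coherence of a qubit's density matrix shrinks, its von Neumann entropy grows.

```python
import numpy as np

# As decoherence shrinks the off-diagonal ("coherence") element c of a
# qubit density matrix, the von Neumann entropy S = -Tr(rho ln rho)
# grows from 0 (pure superposition) to ln 2 (fully decohered mixture).
def vn_entropy(c):
    rho = np.array([[0.5, c], [c, 0.5]])
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                 # drop zero eigenvalues (0 * ln 0 -> 0)
    return float(-np.sum(w * np.log(w)))

for c in (0.5, 0.25, 0.0):
    print(c, vn_entropy(c))
```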

2. Can decoherence occur without the second law of thermodynamics?

Decoherence does not strictly require the second law as a separate assumption: it requires a large environment for the system to entangle with, but that environment does not have to be in thermal equilibrium. Both decoherence and the second law of thermodynamics are consequences of quantum mechanics, which is why the two so often appear together; a thermal environment is sufficient, but not necessary.

3. How does the second law of thermodynamics affect the predictability of a system undergoing decoherence?

The second law of thermodynamics leads to an increase in entropy, which results in the loss of information and predictability in a system undergoing decoherence. As the system interacts with its environment, the information about its quantum state becomes entangled with the environment, making it difficult to predict the system's future behavior.

4. Is there a connection between decoherence and the arrow of time?

Yes, there is a connection between decoherence and the arrow of time. The arrow of time is a concept that describes the asymmetry of time, meaning that time only moves in one direction. Decoherence is a process that leads to the emergence of classical behavior and the loss of quantum coherence, which is in line with the arrow of time and the second law of thermodynamics.

5. Can the second law of thermodynamics be violated by decoherence?

No, the second law of thermodynamics is a fundamental law of nature and cannot be violated. Decoherence is a process that is consistent with the second law and is necessary for the emergence of classical behavior. Any apparent violation of the second law would be a result of incomplete understanding or incorrect application of the law.
