In order for decoherence to work, does it require the second law of thermodynamics to be valid?

StevieTNZ
In order for decoherence to work, does it require the second law of thermodynamics to be valid?
 


Could you elaborate? I don't see where you are coming from...

On the other hand, microscopically, do you see a possibility for the second law of thermodynamics NOT to be valid? After all, microscopically it is almost a tautology: events with high probability (a massive number of corresponding microstates) are more probable than events with low probability (a small number of corresponding microstates).
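The microstate-counting point can be made concrete with a short tally (a toy sketch of my own, not from the thread):

```python
from math import comb

# Toy tally of the microstate argument: for N two-state particles, a
# macrostate with k particles "up" has comb(N, k) microstates.
# Macrostates near k = N/2 dominate overwhelmingly, which is the
# statistical content of the second law.
N = 100
total = 2 ** N
near_half = sum(comb(N, k) for k in range(45, 56))   # within 5 of N/2
near_zero = sum(comb(N, k) for k in range(0, 11))    # within 10 of k = 0

print(near_half / total)   # the large majority of all microstates
print(near_zero / total)   # a vanishingly small fraction (~1e-17)
```

On this view the second law just says that evolution toward the dominant macrostates is overwhelmingly likely, not strictly certain.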
 


I guess what I'm asking is: how does decoherence work? Some say thermal randomness is why you average over the possibilities of the wavefunction to get a classical-looking result.
 


As for environmentally induced decoherence, you need a large environment, but it doesn't have to be in equilibrium. This is a commonly misstated/misleading condition in the literature. A thermal environment is sufficient, but not necessary.
 


StevieTNZ said:
In order for decoherence to work, does it require the second law of thermodynamics to be valid?

Both decoherence and the second law of thermodynamics are consequences of quantum mechanics.

StevieTNZ said:
I guess what I'm asking is how does decoherence work?

The website http://www.decoherence.de/ (by Erich Joos, one of the professionals working on the subject) gives reliable information on decoherence on various levels of depth.
 


Read the first page. Nice article. Thanks for sharing.
 


StevieTNZ said:
I guess what I'm asking is: how does decoherence work? Some say thermal randomness is why you average over the possibilities of the wavefunction to get a classical-looking result.

Why is randomness invoked when describing something which is otherwise perfectly easy to understand? I find "randomness" akin to the buzzwords so readily attached to physical statements.

|\psi>= \sum_i |i><i|\psi>

|before>= \sum_i |i>|\epsilon><i|\psi>

which evolves into

|after> = \sum_i |i>|\epsilon_i><i|\psi>

Where \epsilon is the environment, so not only can the system be affected by the environment, but the environment can be affected by the system! Basically you could imagine leaving quantum waves to oscillate inside a box. They may decohere eventually into some kind of tangible state. This is a very simple explanation of decoherence.
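As a concrete illustration of the |before> to |after> step (a toy example of my own, not claiming this is how any real environment looks), here is a minimal numpy sketch of a qubit becoming correlated with a two-level "environment" and losing its coherence once the environment states are orthogonal:

```python
import numpy as np

# A qubit a|0> + b|1> becomes correlated with environment states |eps_i>,
# realizing |after> = sum_i |i>|eps_i><i|psi>.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
eps0 = np.array([1.0, 0.0])   # environment state correlated with |0>
eps1 = np.array([0.0, 1.0])   # orthogonal state correlated with |1>

# |after> = a |0>|eps0> + b |1>|eps1>
after = a * np.kron([1.0, 0.0], eps0) + b * np.kron([0.0, 1.0], eps1)

# Reduced density matrix of the qubit: trace out the environment.
rho = np.outer(after, after.conj()).reshape(2, 2, 2, 2)
rho_sys = np.einsum('ikjk->ij', rho)

print(np.round(rho_sys.real, 3))   # diagonal 0.5s, zero off-diagonals
```

The vanishing off-diagonals of the reduced density matrix are the signature of decoherence; had eps0 and eps1 overlapped, the off-diagonals would only be suppressed by the factor <eps0|eps1>.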
 
Last edited:


QuantumClue said:
Why is randomness invoked when describing something which is otherwise perfectly easy to understand? I find "randomness" akin to the buzzwords so readily attached to physical statements.

|\psi>= \sum_i |i><i|\psi>

|before>= \sum_i |i><\epsilon><i|\psi>

which evolves into

|after> = \sum_i |i>|\epsilon_i><i|\psi>

Where \epsilon is the environment, so not only can the system be affected by the environment, but the environment can be affected by the system! Basically you could imagine leaving quantum waves to oscillate inside a box. They may decohere eventually into some kind of tangible state. This is a very simple explanation of decoherence.

I don't understand your equations. You have not defined your variables. Is \psi the initial system state and \epsilon the initial environment state? Then I think the before state of the system + environment should have a ket and not an expectation value. Furthermore, where is the decoherence? You have said nothing about interference or the reduced density matrix. If the environment is small, you have only expressed a loss of the factorization (correlation). There can still be recoherence at a later time (Poincare recurrence).

Randomness is very appropriate in the discussion of environmentally induced decoherence. The large environment acts as a source of random noise, as can be seen clearly in the Langevin and influence-functional (path-integral) formalisms. One can directly attribute the environmentally induced decoherence to the induced fluctuations and dissipation; the three phenomena are strictly related. Out of laziness I will cite my own (not yet published) paper: http://arxiv.org/abs/1011.3286. For stationary Gaussian processes, the three phenomena can be related:

\underbrace{\tilde{\alpha}(\omega)}_{decoherence} =\; \underbrace{\tilde{\nu}(\omega)}_{noise} \; - \omega \underbrace{ \tilde{\gamma}(\omega)}_{damping}

These three kernels characterize the non-Markovian (algebraic) Lindblad dissipator, noise (or diffusion), and damping.
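The recoherence caveat above can be seen in a toy model (my own construction, not from the cited paper): couple a qubit to a single environment spin via H = g * sigma_z (x) sigma_x. Since (sigma_z (x) sigma_x)^2 = I, the propagator has the exact closed form exp(-iHt) = cos(gt) I - i sin(gt) sigma_z (x) sigma_x, and the qubit's coherence vanishes and then fully returns:

```python
import numpy as np

# One qubit coupled to a SINGLE environment spin: too small an
# environment for irreversible decoherence, so coherence recurs.
sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
K = np.kron(sz, sx)
g = 1.0

# Initial state (|0> + |1>)/sqrt(2) tensored with the environment's |0>.
psi0 = np.kron(np.array([1.0, 1.0]) / np.sqrt(2), [1.0, 0.0])

def coherence(t):
    """|off-diagonal| of the qubit's reduced density matrix at time t."""
    U = np.cos(g * t) * np.eye(4) - 1j * np.sin(g * t) * K  # exact propagator
    psi = U @ psi0
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    return abs(np.einsum('ikjk->ij', rho)[0, 1])            # trace out env

print(coherence(0.0))          # 0.5: full coherence
print(coherence(np.pi / 4))    # ~0: apparently decohered
print(coherence(np.pi / 2))    # ~0.5: coherence fully returns
```

With a genuinely large environment the overlaps of the environment branches never realign, and the recurrence time becomes effectively infinite.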
 
Last edited by a moderator:


C. H. Fleming said:
I don't understand your equations. You have not defined your variables. Is \psi the initial system state and \epsilon the initial environment state? Then I think the before state of the system + environment should have a ket and not an expectation value. Furthermore, where is the decoherence? You have said nothing about interference or the reduced density matrix. If the environment is small, you have only expressed a loss of the factorization (correlation). There can still be recoherence at a later time (Poincare recurrence).

Randomness is very appropriate in the discussion of environmentally induced decoherence. The large environment acts as a source of random noise, as can be seen clearly in the Langevin and influence-functional (path-integral) formalisms. One can directly attribute the environmentally induced decoherence to the induced fluctuations and dissipation; the three phenomena are strictly related. Out of laziness I will cite my own (not yet published) paper: http://arxiv.org/abs/1011.3286. For stationary Gaussian processes, the three phenomena can be related:

\underbrace{\tilde{\alpha}(\omega)}_{decoherence} =\; \underbrace{\tilde{\nu}(\omega)}_{noise} \; - \omega \underbrace{ \tilde{\gamma}(\omega)}_{damping}

These three kernels characterize the non-Markovian (algebraic) Lindblad dissipator, noise (or diffusion), and damping.

|i>|\epsilon> is the tensor product |i> \otimes |\epsilon>. The i's form the einselected basis.

A basis for the joint system + environment is obtained by tensor-multiplying the basis vectors of the two subsystems together - so the joint state is the before state.

As for quantum randomness, you might say it is important in understanding environmental decoherence. I argue that is only a dogma of the physical sciences. The main example physics uses today to exhibit randomness is radiation. It seems, however, that there may simply be a lack of knowledge of the system, and not understanding a system fully is indistinguishable from saying it appears random. But otherwise, you don't really need to know whether a system is random or not. You could quite easily believe in a pilot-wave theory of matter and still understand the basic dynamics of decoherence.
 
Last edited by a moderator:
  • #10


nice paper by the way. Just looking over it now.
 
  • #11


QuantumClue said:
|i>|\epsilon> is the tensor product |i> \otimes |\epsilon>. The i's form the einselected basis.

A basis for the joint system + environment is obtained by tensor-multiplying the basis vectors of the two subsystems together - so the joint state is the before state.

As for quantum randomness, you might say it is important in understanding environmental decoherence. I argue that is only a dogma of the physical sciences. The main example physics uses today to exhibit randomness is radiation. It seems, however, that there may simply be a lack of knowledge of the system, and not understanding a system fully is indistinguishable from saying it appears random. But otherwise, you don't really need to know whether a system is random or not. You could quite easily believe in a pilot-wave theory of matter and still understand the basic dynamics of decoherence.

Yes, I thought that is what you meant to write, but if you look you will see that you actually wrote \langle \epsilon \rangle, which is an expectation value and doesn't make sense there.

Furthermore, a radiative field (like the electromagnetic field) does provide random noise to the system. Quantum fields are a class of large environments. I (and many others before me) have calculated these kernels for a scalar and electromagnetic field, e.g. in http://arxiv.org/abs/1012.5067 (also new and unpublished). So these are not conflicting but compatible explanations of environmentally induced decoherence. Even for a zero-temperature electromagnetic field, the system is driven by ground-state fluctuations of the field. This is random in the Brownian sense, not in the anti-Bohmian sense that perhaps you are inferring.

As for my decoherence paper, I apologize that it is a bit technical. It needs to be rewritten a bit.
 
Last edited by a moderator:
  • #12


C. H. Fleming said:
Yes, I thought that is what you meant to write, but if you look you will see that you actually wrote \langle \epsilon \rangle, which is an expectation value and doesn't make sense there.

Furthermore, a radiative field (like the electromagnetic field) does provide random noise to the system. Quantum fields are a class of large environments. I (and many others before me) have calculated these kernels for a scalar and electromagnetic field, e.g. in http://arxiv.org/abs/1012.5067 (also new and unpublished). So these are not conflicting but compatible explanations of environmentally induced decoherence. Even for a zero-temperature electromagnetic field, the system is driven by ground-state fluctuations of the field. This is random in the Brownian sense, not in the anti-Bohmian sense that perhaps you are inferring.

As for my decoherence paper, I apologize that it is a bit technical. It needs to be rewritten a bit.

I completely apologize, a total mistake, one I did not notice until you pointed it out. Yes, it is not an expectation value.

I do understand. Many ambitious people nowadays point to experiments that exhibit, or appear to exhibit, total randomness. I do not share this sentiment. I follow the deterministic view of quantum physics, in which everything was predetermined (most likely at?) the big bang, with a \Psi-state governing the entire universe. If I believe this, which I strongly do, then I do not believe that randomness is a physical entity, but rather a matter of knowledge about the system. I can make an otherwise completely randomized system determinable: I simply freeze the system, affecting the quantum psi function describing it, by making periodic observations on my collection of atoms. Then the atoms can remain stable for many, many years, and with the right device, even forever. That will make a seemingly random system appear much more deterministic, for two reasons: I have altered the state, and in doing so I have altered the knowledge that is maximally obtainable from the system.

Yes, your paper is quite technical.
 
Last edited by a moderator:
  • #13


QuantumClue said:
I do understand. Many ambitious people nowadays point to experiments that exhibit, or appear to exhibit, total randomness. I do not share this sentiment. I follow the deterministic view of quantum physics, in which everything was predetermined (most likely at?) the big bang, with a \Psi-state governing the entire universe. If I believe this, which I strongly do, then I do not believe that randomness is a physical entity, but rather a matter of knowledge about the system. I can make an otherwise completely randomized system determinable: I simply freeze the system, affecting the quantum psi function describing it, by making periodic observations on my collection of atoms. Then the atoms can remain stable for many, many years, and with the right device, even forever. That will make a seemingly random system appear much more deterministic, for two reasons: I have altered the state, and in doing so I have altered the knowledge that is maximally obtainable from the system.

This is what I inferred from your words, but you should also understand that the open system exhibits randomness in the Brownian sense. This randomness would exist even classically (for positive temperature). The environment (or field) is very large, and you cannot keep track of its every degree of freedom (there are, in fact, uncountably many). Its effect upon the system is mathematically equivalent to a stochastic process when you look at the dynamics of system observables (see Ford, Lewis, and O'Connell, PRA 1988, "The Quantum Langevin Equation").
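A minimal classical sketch of this point (my own toy example, not from the Ford-Lewis-O'Connell paper): simulate a damped particle driven by white noise, m dv = -gamma v dt + sqrt(2 gamma kT) dW, where the fluctuation-dissipation theorem fixes the noise strength from the damping.

```python
import numpy as np

# Euler-Maruyama simulation of an ensemble of Langevin particles.
# The stationary velocity variance should settle at kT/m, as
# fluctuation-dissipation demands.
rng = np.random.default_rng(0)
m, gamma, kT = 1.0, 1.0, 1.0
dt, steps, ntraj = 0.01, 5000, 2000

v = np.zeros(ntraj)
for _ in range(steps):
    dW = np.sqrt(dt) * rng.standard_normal(ntraj)          # white-noise increment
    v += (-gamma * v * dt + np.sqrt(2.0 * gamma * kT) * dW) / m

print(v.var())   # ~ kT/m = 1.0
```

Even though the underlying microscopic dynamics here is whatever generated the noise, the open system's observables are indistinguishable from a stochastic process, which is exactly the point being made above.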
 
  • #14


C. H. Fleming said:
This is what I inferred from your words, but you should also understand that the open system exhibits randomness in the Brownian sense. This randomness would exist even classically (for positive temperature). The environment (or field) is very large, and you cannot keep track of its every degree of freedom (there are, in fact, uncountably many). Its effect upon the system is mathematically equivalent to a stochastic process when you look at the dynamics of system observables (see Ford, Lewis, and O'Connell, PRA 1988, "The Quantum Langevin Equation").

I don't personally see how Brownian motion implies randomness any more than a radiative system would appear random. Could you elaborate for me?
 
  • #15


QuantumClue said:
I don't personally see how Brownian motion implies randomness any more than a radiative system would appear random. Could you elaborate for me?

It doesn't, they are the same. Position coupling to the field gives you the motion damping of canonical Brownian motion while momentum coupling to the field gives you the Abraham-Lorentz damping.

I am just pointing out that even in the classical theory where the closed system + environment evolution is widely accepted as deterministic, the open system still exhibits random dynamics. If the environment induces irreversible dynamics (e.g. decoherence), then the environment necessarily induces random fluctuations. You are free to believe that the closed system + environment is deterministic, but you should not discount the perspective of stochastic dynamics. It is as legitimate as the second law itself.
 
  • #16


C. H. Fleming said:
It doesn't, they are the same. Position coupling to the field gives you the motion damping of canonical Brownian motion while momentum coupling to the field gives you the Abraham-Lorentz damping.

I am just pointing out that even in the classical theory where the closed system + environment evolution is widely accepted as deterministic, the open system still exhibits random dynamics. If the environment induces irreversible dynamics (e.g. decoherence), then the environment necessarily induces random fluctuations. You are free to believe that the closed system + environment is deterministic, but you should not discount the perspective of stochastic dynamics. It is as legitimate as the second law itself.

Thank you for clearing that up; I thought I was misunderstanding the crux of your argument.
 