Decoherence: Random System-Environment Interaction Explained

chafelix
In statistical physics, we have a system interacting with a (random) medium.
This is what shifts and broadens the system states and this is well understood.
Clearly the random interaction is responsible for the shift and broadening (if it were non-random, it would simply cause a renormalization of the system, leaving infinitely sharp states). Randomness is interpreted as follows: I have a large number of systems (e.g. impurities), and each one sees a DIFFERENT local field. The KEY to "memory loss", or broadening, is the RANDOM interaction with the medium.
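To make that concrete, here is a minimal numerical sketch (a toy model of my own, with an assumed Gaussian spread of local shifts): an ensemble of two-level systems, each seeing a different static local frequency shift. The ensemble-averaged coherence decays, i.e. the line is broadened, whereas a common (non-random) shift leaves the magnitude of the coherence untouched and only renormalizes the frequency.

```python
# Toy illustration of inhomogeneous broadening in statistical physics.
# Each replica of the system (e.g. an impurity) sees a DIFFERENT static
# local field, modelled here as a random frequency shift delta_k.
import numpy as np

rng = np.random.default_rng(0)
n_systems = 10_000
sigma = 1.0                                   # spread of the local shifts
deltas = rng.normal(0.0, sigma, n_systems)    # one shift per replica

t = np.linspace(0.0, 5.0, 201)

# Random medium: each replica has its own shift, so the ensemble-averaged
# coherence <exp(i*delta_k*t)>_k decays (a broadened spectral line).
coh_random = np.abs(np.exp(1j * np.outer(t, deltas)).mean(axis=1))

# Non-random medium: every replica sees the SAME shift, so the coherence
# only picks up a phase -- a pure renormalization, infinitely sharp line.
coh_uniform = np.abs(np.exp(1j * sigma * t))

print(coh_random[::50])   # ~ exp(-(sigma*t)^2 / 2): memory loss
print(coh_uniform[::50])  # stays at 1: no broadening
```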



Now I am trying to draw an analogy with decoherence where we have a system
interacting with a medium, in this case the "environment".
To get decoherence in statistical physics we still
need the system-environment interaction to
be random. I interpret this to mean that the environment consists of a large number of
microscopic entities interacting with the system via microscopic forces. In contrast
to statistical physics, though, we no longer have an ensemble of systems (e.g. Schroedinger cats) but only one. Again, however, the interaction with the environment is crucial: switch it off and there is no decoherence.
So what I am trying to understand is in what way the system-environment interaction in decoherence can be understood to be random. I.e., in Schroedinger's cat case, where we do NOT have a large number of cats, each seeing a different local environment (e.g. a broken or unbroken poison bottle), in what sense is that cat-environment interaction "random"?
 
chafelix said:
In contrast to statistical physics though we no longer have an ensemble of systems(e.g. Schroedinger cats), but only one.
It is not clear why you think that you should switch to a single event.
As I see it, decoherence is likewise about an ensemble, and then you don't face the question of how a deterministic single event contributes to a probabilistic ensemble.
 
In statistical physics I start with an ensemble of "systems", each in a state populated according to a density matrix. Now, if I read you correctly, in decoherence I have a million cats, each originally in the SAME state ('alive'), and they interact with the environment, which is always initially the same (i.e. same state, room, poison device + breaking mechanism), via some V(x,t). The Schroedinger equation is completely deterministic, so you will get the same (mixed) state as a function of time, including the final result.
The only randomness has to do with the mechanism that triggers the breaking of the bottle. Am I missing something thus far?
If not, then the randomness that must necessarily be there in V(t) has to come from some intrinsically quantum process. This is different from statistical physics, where the randomness comes from having non-identical replicas of the medium/environment, distributed according to the density matrix.
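For reference, the standard textbook way of writing down this deterministic evolution (not specific to any particular model) is: every run produces exactly the same entangled state and exactly the same reduced density matrix, and the off-diagonal terms are suppressed by the overlap of the two conditional environment states, with no classical random variable being sampled anywhere.

$$
\bigl(\alpha\,|\mathrm{alive}\rangle+\beta\,|\mathrm{dead}\rangle\bigr)\otimes|E_0\rangle
\;\xrightarrow{\;U(t)\;}\;
\alpha\,|\mathrm{alive}\rangle\,|E_{\mathrm{alive}}(t)\rangle
+\beta\,|\mathrm{dead}\rangle\,|E_{\mathrm{dead}}(t)\rangle,
$$
$$
\rho_{\mathrm{cat}}(t)=\mathrm{Tr}_E\,\rho(t)=
\begin{pmatrix}
|\alpha|^2 & \alpha\beta^*\,\langle E_{\mathrm{dead}}(t)|E_{\mathrm{alive}}(t)\rangle\\
\alpha^*\beta\,\langle E_{\mathrm{alive}}(t)|E_{\mathrm{dead}}(t)\rangle & |\beta|^2
\end{pmatrix}.
$$

Decoherence then corresponds to \(\langle E_{\mathrm{alive}}(t)|E_{\mathrm{dead}}(t)\rangle\to 0\) as the two environment states become distinguishable, which plays the role the ensemble average over random media plays in statistical physics.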
 
Decoherence describes how the wavefunction undergoes an irreversible change.
So initially we should start with a reversible situation, and then, when we introduce decoherence, we should get an irreversible change.
In your example you have already incorporated irreversibility (the cat cannot become alive again, but it can become dead), so I do not understand how to proceed with your example.
 
Decoherence describes how the wavefunction undergoes an irreversible change.
So initially we should start with a reversible situation, and then, when we introduce decoherence, we should get an irreversible change.
And, unless I misunderstand, this is the result of tracing over the environment.
But I am more interested in the "entanglement memory loss time scale", described below.

But decoherence says this happens on a time scale which is related to the strength of the interaction. Now, this is quite similar to the case in statistical physics. There, one lets
atomic systems evolve in a medium, each such system seeing a different V(t). The resulting U-matrices have random phases at long times, and this is how one gets decoherence in statistical physics, i.e. how tracing over the environment brings memory loss and broadens the spectral functions. This is the picture I am trying to understand in decoherence, i.e. the emergence of a decoherence time (measuring how fast entangled states decohere) similar to the memory-loss time in statistical physics.
So the part I am not sure I understand is: where is the randomness here? Why is the system-environment V(t) random? Is it because the environment consists of a large number of microscopic systems, each of which sees a different V(t) in its interaction with the original "system"?
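As a hedged attempt at an answer, here is a toy pure-dephasing model of my own (a single system qubit coupled to N environment spins, with couplings g_k that differ from spin to spin, e.g. because of their different distances; numerical prefactors from the specific coupling are absorbed into g_k). Nothing in it is classically random, yet the overlap of the two conditional environment states behaves just like a random-phase average and defines a decoherence time set by the couplings.

```python
# Toy pure-dephasing sketch (assumed model, not from the thread):
# one system qubit coupled to N environment spins. Each spin k contributes
# a factor |cos(g_k * t)| to the overlap of the two conditional environment
# states, so the off-diagonal element of the reduced density matrix decays
# even though the joint evolution is completely deterministic.
import numpy as np

rng = np.random.default_rng(1)
N = 200                                 # number of environment spins
g = rng.uniform(0.5, 1.5, N)            # couplings differ, e.g. with distance

t = np.linspace(0.0, 2.0, 101)
overlap = np.prod(np.abs(np.cos(np.outer(t, g))), axis=1)

# crude "decoherence time": first time the coherence factor drops below 1/e
t_dec = t[np.argmax(overlap < np.exp(-1.0))]
print("coherence factor:", overlap[::25])
print(f"estimated decoherence time: {t_dec:.3f}")
```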
 
chafelix said:
But I am more interested in the "entanglement memory loss time scale", described below.
Do you mean photon entanglement? Something like tracing back from a measurement at a coincidence counter to the photon source? Or something else?

chafelix said:
So the part I am not sure I understand is : where is the randomness here? Why is the system-environment V(t) random? Is it because the environment consists of a large number of microscopic systems and each one sees a different V(t) with the original "system" interacting with the environment?
I would say that one of the sources of randomness is the fluctuations of the electromagnetic field around its ground level. And the interaction with a large number of microscopic systems shouldn't always happen in the same way, even if they see the same V(t).
 
Ok, it looks like at least I managed to explain the question. By "entanglement memory loss time scale" I meant the characteristic time after which a superposition of products of system and environment states, e.g.
Sum_{i,j} |System_i>|Environment_j>, decoheres. Just as the
U-matrices of atomic systems in a medium, each seeing a different V(t), acquire random phases after a time scale of roughly 1/HWHM of the relevant spectral function.
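For the simplest (Markovian/Lorentzian) case the 1/HWHM estimate can be made explicit: exponential loss of phase memory and a Lorentzian spectral function are a Fourier pair, so the half-width at half-maximum and the memory-loss time are simply reciprocals of one another:

$$
C(t)=e^{\,i\omega_0 t}\,\langle e^{\,i\varphi(t)}\rangle=e^{\,i\omega_0 t-\gamma|t|}
\;\Longleftrightarrow\;
A(\omega)=\frac{1}{\pi}\,\frac{\gamma}{(\omega-\omega_0)^2+\gamma^2},
\qquad
\tau_{\mathrm{mem}}=\frac{1}{\gamma}=\frac{1}{\mathrm{HWHM}}.
$$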

I would say that one of the sources of randomness is the fluctuations of the electromagnetic field around its ground level.
Fair enough; I take this to mean that typical interactions involve light scattering or something similar. But I need to get a feeling for whether there are more such sources and for their relative importance. So, any others?

And the interaction with a large number of microscopic systems shouldn't always happen in the same way, even if they see the same V(t).
I am not sure what that means.
My thinking was that the "environment" consists of a large number of microscopic systems, none of them seeing the same interaction with the "system", among other things because their distances from it are not exactly the same.
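Under the same toy assumptions as in the sketch above (each of the N microscopic constituents of the environment contributing its own deterministic factor cos(g_k t) to the coherence, with g_k varying from constituent to constituent, e.g. with distance), a short-time expansion shows why many slightly different couplings act like randomness:

$$
\Bigl|\prod_{k=1}^{N}\cos(g_k t)\Bigr|
\;\approx\;\exp\!\Bigl(-\tfrac{t^{2}}{2}\sum_{k=1}^{N}g_k^{2}\Bigr)
\quad (g_k t\ll 1),
\qquad
\tau_{\mathrm{dec}}\sim\Bigl(\sum_{k}g_k^{2}\Bigr)^{-1/2}.
$$

So the more microscopic constituents there are, and the stronger their (individually deterministic) couplings, the shorter the decoherence time, which is the sense in which the single cat's environment acts like the random medium of statistical physics.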
 