How Does Environmentally Induced Decoherence Affect Quantum State Reduction?

  • Thread starter Feeble Wonk
  • Tags: Decoherence
In summary: The unitary dynamical evolution of the composite is pure, exact, and zero entropy. The reduced density operator of the system alone is mixed and has higher entropy, and the same holds for the reduced density operator of the environment alone; only the composite's total entropy is zero, and only the composite is exact and pure. In other words, environmentally induced decoherence arises from the interaction between a system and its environment: the system becomes "mixed" and increases in entropy, while the composite remains in a "pure", zero-entropy state.
  • #36
That's not what factorisation is - I have no idea what you mean by what you are talking about.

In MW each part of a mixed state is considered a world. It's an interpretive thing - not an actual process.

Thanks
Bill

I have no idea why you think Many Worlds is not an actual process. If you ask physicists, they will tell you that in Many Worlds there really are many worlds, or branches.
 
  • #37
jlcd said:
I have no idea why you think Many Worlds is not an actual process. If you ask physicists, they will tell you that in Many Worlds there really are many worlds, or branches.

Decoherence is a process, interpreting each outcome as a separate world isn't.

Thanks
Bill
 
  • #38
Can you give an example of factorisation, and how the critics reasoned that there are different ways to factor it?
 
  • #39
zonde said:
I am trying to understand why this argument concerns only MWI and not any non-collapse treatment of QM. Or is MWI the only non-collapse interpretation? Somehow I got a different impression from comments on different threads.
There are also other non-collapse interpretations. Let me give only two examples:
1. MWI with a priori preferred basis. For instance, it may be the position basis, so the ontology is not the state in the Hilbert space ##|\psi\rangle##, but the wave function ##\langle x_1, ... , x_n|\psi\rangle##.
2. MWI with additional variables. E.g. Bohmian interpretation where particle positions are also ontological.

What their result shows is that ##|\psi\rangle## alone is not enough.
 
  • #40
jlcd said:
Can you give an example of factorisation, and how the critics reasoned that there are different ways to factor it?

I won't discuss that paper because I don't agree with much of it. You can do a search on threads here - it has been discussed a lot - a lot more than it deserves IMHO.

But here is the issue with factorisation. Suppose you have a particle detector. In analysing how it works, it's natural to divide the composite into exactly that: the detector and what it observes. Theory shows that, as a result of decoherence, you get a mixed state where the off-diagonal elements are zero and the diagonal terms give the probabilities of detecting a particle and of not detecting it. So far so good. But what if you instead decompose it into the observed system plus half the detector on one side, and the other half of the detector on the other? You would have rocks in your head doing that - your job becomes much, much harder - but as a matter of principle you must get the same result; if you don't, something is really rotten in the state of Denmark. In many areas of physics, like balls rolling down inclined planes, you have exactly the same problem, but everyone believes, as an unstated assumption, that it doesn't make any difference - still, it's an issue. Those concerned about the factorisation problem say decoherence is just a result of factoring the composite into what's being observed and what does the observing. I personally think it's a crock - but it can't be dismissed out of hand.
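The partial-trace mechanism described here can be sketched numerically (my own illustration in numpy, not from the thread): a qubit "system" entangled with a two-state "environment" ends up, once the environment is traced out, in a mixed state whose off-diagonal elements are exactly zero.

```python
import numpy as np

# Minimal sketch (my own, not from the thread): a qubit "system" becomes
# entangled with a two-state "environment"; tracing the environment out
# leaves the system in a mixed state with zero off-diagonal terms.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Composite state (|0>|E0> + |1>|E1>)/sqrt(2), with <E0|E1> = 0
# (i.e. the environment has fully "recorded" the system state).
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho = np.outer(psi, psi.conj())          # pure state of the composite

# Partial trace over the environment (the second tensor factor).
rho_sys = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(rho_sys)   # diag(1/2, 1/2): off-diagonal elements are exactly zero
```

The composite stays pure throughout; only the reduced description of the system is mixed, which is exactly the improper mixture discussed later in the thread.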

In relation to the early universe: as the universe evolves, structures obviously develop and interact with other things. The claim of the factorisation crowd is that decoherence is simply a result of humans factoring it that way, so it doesn't explain anything - hence the claim that nothing happens in MW.

I don't agree - but that's the argument. It's also got nothing to do with observers. But that's a matter of opinion - you can have a look at the threads where it's discussed and make up your own mind.

Thanks
Bill
 
  • #41
There is a factorization problem in Newtonian mechanics?
 
  • #42
ddd123 said:
There is a factorization problem in Newtonian mechanics?

Of course there isn't. It's just putting it into perspective.

Thanks
Bill
 
  • #43
Maybe it's not about taking the factorization problem seriously. In my naive view, it's a reductio of the idea that the mathematical ket of the Universe is ontologically representative. The operations done to the ket to factor it differently are all legal, so it works as a reductio. Where do you not agree?
 
  • #44
ddd123 said:
Maybe it's not about taking the factorization problem seriously. In my naive view, it's a reductio of the idea that the mathematical ket of the Universe is ontologically representative. The operations done to the ket to factor it differently are all legal, so it works as a reductio. Where do you not agree?

I am not sure I understand your point. But as far as the wave-function of the universe goes, in many interpretations it makes no sense - e.g., what prepared the universe?

Thanks
Bill
 
  • #45
The criticized interpretation is MWI, so the reductio is directed at that. That it becomes a non sequitur in other contexts is only natural, I guess.
 
  • #46
bhobba said:
But here is the issue with factorisation. Suppose you have a particle detector. In analysing how it works, it's natural to divide the composite into exactly that: the detector and what it observes. Theory shows that, as a result of decoherence, you get a mixed state where the off-diagonal elements are zero and the diagonal terms give the probabilities of detecting a particle and of not detecting it. So far so good. But what if you instead decompose it into the observed system plus half the detector on one side, and the other half of the detector on the other? You would have rocks in your head doing that - your job becomes much, much harder - but as a matter of principle you must get the same result; if you don't, something is really rotten in the state of Denmark. In many areas of physics, like balls rolling down inclined planes, you have exactly the same problem, but everyone believes, as an unstated assumption, that it doesn't make any difference - still, it's an issue. Those concerned about the factorisation problem say decoherence is just a result of factoring the composite into what's being observed and what does the observing. I personally think it's a crock - but it can't be dismissed out of hand.

I wonder whether "rocks in the head" is a good example of factorization. When you are standing on rocky ground (say in the Andes mountains) and remember that everything is quantum (in our post-Copenhagen understanding of decoherence, where there is no classical-quantum division or boundary but everything is quantum), one can ask why the rock is on the ground and not inside your organs, like inside your head - because the universe could decompose it such that the rocks would be anywhere in your body. Is this a valid example?
 
  • #47
bhobba said:
But here is the issue with factorisation. Suppose you have a particle detector. Its natural in analysing how such works to divide the detector and what being observed into exactly that - the detector and what it observes.
This is clearly assuming the conclusion (begging the question) fallacy. In such a way you won't explain anything.
 
  • #48
I don't get the idea of decoherence, even if the factorization problem can be solved.
QM gives statistical predictions, but statistics are calculated from individual events. If we say that there is no more fundamental description than the wavefunction, then we have to represent every individual event with an identical wavefunction. But detections are rather random. So we say that the particle exists as a wavefunction until point X, when it is randomized (wavefunction collapse).
Now the decoherence idea says that an initially coherent wavefunction can become non-coherent. So the first question is: after decoherence, is each separate particle still described by exactly the same (but non-coherent) wavefunction, or does non-coherence mean that each particle is described by a slightly different wavefunction?
 
  • #49
zonde said:
This is clearly assuming the conclusion (begging the question) fallacy. In such a way you won't explain anything.

Your 'clearly' logic escapes me.

It's a fringe issue - but a genuine one.

Thanks
Bill
 
  • #50
zonde said:
I don't get the idea of decoherence, even if the factorization problem can be solved.

Then you need to study it more:
http://philsci-archive.pitt.edu/5439/1/Decoherence_Essay_arXiv_version.pdf

In particular you need to understand mixed states and the difference between proper mixed states and improper ones.

BTW it requires delving into the math - it can't be explained in words - at least I can't do it.

zonde said:
Now the decoherence idea says that an initially coherent wavefunction can become non-coherent. So the first question is: after decoherence, is each separate particle still described by exactly the same (but non-coherent) wavefunction, or does non-coherence mean that each particle is described by a slightly different wavefunction?

There is no wave-function after decoherence, because the system is in a mixed state.

Thanks
Bill
 
  • #51
bhobba said:
Your 'clearly' logic escapes me.
Probably that's because you understand the problem differently.
As I see it, the problem with factorization is that there is no way that interaction can be included in the state vector of the universe -
or, let's say, it is not clear how interactions and interacting parts can be unequivocally defined given the state vector of the universe equipped with unitary evolution.
So a way to explain it would be to show what is needed to end up with a "detector" interacting with "what it observes", rather than starting with these things.
 
  • #52
bhobba said:
In particular you need to understand mixed states and the difference between proper mixed states and improper ones.
I think I understand it. Proper mixed state is "particle here" plus "particle there".
Improper mixed state is "superposition of particle here and particle there" plus "superposition of particle here and particle there but with different phase".

The problem is that the improper mixed state shows that there can be no interference even without collapse. And that means observation does not have to collapse the wavefunction, and now we have even fewer ideas about how to arrive at the randomized detections that contribute to the same statistical result.

Basically, decoherence breaks the "collapse" explanation but does not put anything in its place.
bhobba said:
There is no wave-function after decoherence, because the system is in a mixed state.
You don't describe an individual particle with a density matrix. Of course there is a wave-function for an individual particle after decoherence.
 
  • #53
zonde said:
I think I understand it. Proper mixed state is "particle here" plus "particle there".
Improper mixed state is "superposition of particle here and particle there" plus "superposition of particle here and particle there but with different phase".

Wrong:
http://pages.uoregon.edu/svanenk/solutions/Mixed_states.pdf

I suggest you spend some time becoming familiar with the concepts mathematically.

When you can explain it, mathematically, in a post, that's when you will understand it.

Here is the outline. Quantum states, despite what you may have read, are not elements of a vector space; they are positive operators of unit trace. By definition, operators of the form |u><u| are called pure states - they are the usual states because they can be mapped to a vector space. All other states are called mixed, and it can be shown that they are convex sums of pure states, i.e. of the form Σ ci |bi><bi| where the ci are positive and sum to one. If you have an observation whose outcomes are the |bi><bi|, then the Born rule shows ci is the probability of getting |bi><bi|. Note - and this is very, very important - a mixed state is NOT a superposition.
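The outline above can be checked numerically (a sketch of mine using numpy; the particular states are arbitrary choices): a pure state is a rank-one projector, a mixed state a convex sum of projectors, and the purity tr(ρ²) separates the two.

```python
import numpy as np

# Sketch of the outline above (my own illustration; state choices arbitrary).
# A pure state is the rank-one projector |u><u|; a mixed state is a convex
# sum  Σ ci |bi><bi|  with ci >= 0 and Σ ci = 1.
u = np.array([1.0, 1.0]) / np.sqrt(2)
pure = np.outer(u, u.conj())                       # |u><u|

b0, b1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
mixed = 0.5 * np.outer(b0, b0) + 0.5 * np.outer(b1, b1)

for rho in (pure, mixed):
    assert np.isclose(np.trace(rho), 1.0)              # unit trace
    assert np.all(np.linalg.eigvalsh(rho) >= -1e-12)   # positive operator

# Purity tr(rho^2) separates them: 1 for a pure state, < 1 for a mixed one.
# Note the mixed state is NOT the superposition (|0> + |1>)/sqrt(2) = |u>,
# whose density matrix 'pure' has nonzero off-diagonal elements.
print(np.trace(pure @ pure), np.trace(mixed @ mixed))   # ~1.0 and 0.5
```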

Once that is understood then the difference between improper and proper mixed states can be explained.

Thanks
Bill
 
  • #54
bhobba said:
Quantum states, despite what you may have read, are not elements of a vector space, they are positive operators of unit trace. By definition operators of the form |u><u| are called pure states - they are the usual states because they can be mapped to a vector space.
From the link you gave:
"A pure state of a quantum system is denoted by a vector (ket) [itex]|\psi\rangle[/itex] with unit length, i.e. [itex]\langle\psi|\psi\rangle[/itex] = 1, in a complex Hilbert space H."

So you are telling me one thing but giving links that say another.
bhobba said:
I suggest you spend some time becoming familiar with the concepts mathematically.
You have not demonstrated such competence that I should take suggestions from you about how to spend my time. Considering this, it's a rude remark.
 
  • #55
zonde said:
"A pure state of a quantum system is denoted by a vector (ket) [itex]|\psi\rangle[/itex] with unit length, i.e. [itex]\langle\psi|\psi\rangle[/itex] = 1, in a complex Hilbert space H." So you are telling me one thing but giving links that say another.

No I am not.

It's basic linear algebra that |u><u|, as I said, 'can be mapped to a vector space'. In particular, it is because the operators are of unit trace that the vectors are of unit length. Here is the proof. Write the |u> in |u><u| as c|u'>, where |u'> is of unit length. Since |u><u| is of unit trace, trace(|u><u|) = 1 = |c|^2 trace(|u'><u'|) = |c|^2, i.e. |u> is of unit length.
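The trace argument can be verified numerically (my own check, using nothing beyond numpy): writing |u> = c|u'> with |u'> normalised, unit trace of |u><u| forces |c| = 1.

```python
import numpy as np

# Numerical check of the argument above (my own illustration): write
# |u> = c|u'> with |u'> of unit length; then tr(|u><u|) = |c|^2, so a
# unit-trace projector forces |c| = 1, i.e. |u> has unit length.
rng = np.random.default_rng(0)
uprime = rng.normal(size=3) + 1j * rng.normal(size=3)
uprime /= np.linalg.norm(uprime)         # |u'> of unit length

c = 1.0                                   # unit trace forces |c| = 1
u = c * uprime
P = np.outer(u, u.conj())                 # |u><u|

assert np.isclose(np.trace(P).real, abs(c) ** 2)   # tr(|u><u|) = |c|^2
assert np.isclose(np.vdot(u, u).real, 1.0)         # so <u|u> = 1
```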

You will not make progress in QM until you understand basic linear algebra and the Dirac notation.

zonde said:
You have not demonstrated such competence that I should take suggestions from you about how to spend my time. Considering this, it's a rude remark.

Instead of casting doubt on my competence, a more fruitful approach is to ask for an explanation when something seems contradictory. You will learn more that way.

Thanks
Bill
 
  • #56
I've been reading about the factorization issue that bhobba recommended, and have this question.

In radioactive decay there is phase randomization and collapse. In many worlds without factorization, does that mean there was no radioactive decay, or that it couldn't occur?
 
  • #57
For further problems with the usual attempted explanation of classical emergence through 'decoherence', see http://tinyurl.com/hn3g2tj
Comments welcome.
 
  • #58
jlcd said:
I've been reading about the factorization issue that bhobba recommended, and have this question.

I doubt that I recommended anything on factorisation. While a legitimate issue, IMHO far too much is made of it.

Thanks
Bill
 
  • #59
rkastner said:
Comments welcome.

'The idea that unitary-only dynamics can lead naturally to preferred observables, such that decoherence suffices to explain emergence of classical phenomena (e.g., Zurek 2003) has been shown in the peer-reviewed literature to be problematic. However, claims continue to be made that this approach, also known as ‘Quantum Darwinism,’ is the correct way to understand classical emergence.'

Obviously it can't. An extra interpretive assumption is required. That doesn't mean, however, that it's not the correct way to go - I don't think it is, but that means diddly squat. I read a lot on decoherence and QM interpretations, and I can't recall anyone claiming it solves the interpretive issues by itself. Occasionally we see posts here making that or similar claims - I or others quickly point out it's simply not possible, and pretty obviously so.

Thanks
Bill
 
  • #60
bhobba said:
Wrong:
http://pages.uoregon.edu/svanenk/solutions/Mixed_states.pdf

I suggest you spend some time becoming familiar with the concepts mathematically.

When you can explain it, mathematically, in a post, that's when you will understand it.

Here is the outline. Quantum states, despite what you may have read, are not elements of a vector space; they are positive operators of unit trace. By definition, operators of the form |u><u| are called pure states - they are the usual states because they can be mapped to a vector space. All other states are called mixed, and it can be shown that they are convex sums of pure states, i.e. of the form Σ ci |bi><bi| where the ci are positive and sum to one. If you have an observation whose outcomes are the |bi><bi|, then the Born rule shows ci is the probability of getting |bi><bi|. Note - and this is very, very important - a mixed state is NOT a superposition.

Once that is understood then the difference between improper and proper mixed states can be explained.

Thanks
Bill

All that you write is correct and pedagogical.
But in this post zonde has a problem with proper and improper mixed states.
You give him a link about pure and mixed states; the problem is that the words "proper" and "improper" cannot be found in the paper.
Do you have a link with the mathematical machinery for proper and improper mixed states, or do you think that the difference is a question of interpretation?
 
  • #62
I find no mathematics behind proper and improper states in this link -
just words like "you prepare", "you ignore", and so on. In Everett's thesis the observer is a system; it is part of the theory. When it has observed something it is in a given state; if it reads it again it is in another state. The physical memory is part of the model.
I am looking for something like that behind proper and improper states.
 
  • #63
naima said:
I find no mathematics in this link behind proper and improper states.

It's not a mathematical difference; it's a preparation difference. I have written on this many, many times, so one more time: a proper mixture is when states are randomly presented for observation; an improper mixture is one that was not prepared that way. It's simple, and I will not pursue it further here in an old thread that has been resurrected.

Thanks
Bill
 
  • #64
bhobba said:
It's not a mathematical difference; it's a preparation difference. I have written on this many, many times, so one more time: a proper mixture is when states are randomly presented for observation; an improper mixture is one that was not prepared that way. It's simple, and I will not pursue it further here in an old thread that has been resurrected.

There is actually a theorem behind the claim that there is no mathematical difference. I forget where I read this, but someone proved a theorem to the effect that every mixed state is obtainable by tracing out degrees of freedom from a pure state. (In general, the pure state might belong to a larger, fictitious Hilbert space, though.)
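The purification construction behind that theorem can be sketched numerically (my own illustration; the particular ρ is an arbitrary choice): diagonalise ρ = Σ pᵢ |i><i| and build |ψ> = Σ √pᵢ |i>|i> on a doubled Hilbert space, whose partial trace recovers ρ.

```python
import numpy as np

# Sketch of the purification theorem mentioned above (my own illustration;
# the particular rho is arbitrary). Diagonalise rho = Σ p_i |i><i| and build
# |psi> = Σ sqrt(p_i) |i>|i> on a doubled Hilbert space; tracing out the
# auxiliary factor recovers rho.
rho = np.array([[0.7, 0.1],
                [0.1, 0.3]])                  # a mixed state (trace 1, positive)
p, vecs = np.linalg.eigh(rho)

dim = len(p)
psi = sum(np.sqrt(p[i]) * np.kron(vecs[:, i], vecs[:, i]) for i in range(dim))

rho_full = np.outer(psi, psi.conj())          # pure state on the larger space
rho_back = rho_full.reshape(dim, dim, dim, dim).trace(axis1=1, axis2=3)

assert np.allclose(rho_back, rho)             # partial trace gives back rho
```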
 
  • #65
What is an improper vs. a proper mixed state? Any state is represented by a trace-class, positive semidefinite, self-adjoint operator with trace 1: the statistical operator. You can distinguish pure states, where the statistical operator is a projection operator, from mixed states, where it is not. If your system is in a state described by a statistical operator, all you know about it are the probabilities for outcomes of measurements. It doesn't matter how the system has been prepared in this state. I don't get the point of what's written on page 10 of the article cited in #61. How do you distinguish (by observations) between cases 2 and 3? According to standard quantum theory there is no possibility of distinguishing the two cases!
 
  • #66
vanhees71 said:
What is an improper vs. a proper mixed state?

An improper mixed state is one obtained by starting with the density matrix for a pure state and then tracing over some of the degrees of freedom. So it's really about where it came from rather than the result - the resulting density matrix is the same, whether proper or improper.
 
  • #67
But how can you distinguish proper from improper mixed states? In the example in the paper you end up with unpolarized particles in both cases, described by the statistical operator ##\hat{\rho}=\frac{1}{2} \hat{1}##. IMHO there's no way to distinguish the two cases with measurements made only on particle A (which is why you trace out particle B in this example). Only by making joint measurements on both particle A and particle B can you observe the correlations implied by the preparation in the pure two-particle state.
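The point can be illustrated numerically (my own sketch in numpy, not from the thread): both preparations give the same reduced state 1/2 on particle A alone, but a joint observable such as σₓ ⊗ σₓ takes different expectation values.

```python
import numpy as np

# My own numerical sketch of the point above: particle A alone is the
# maximally mixed state I/2 in both preparations, but a joint observable
# (here sigma_x (x) sigma_x) distinguishes them.
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Improper: reduce the entangled pure state (|00> + |11>)/sqrt(2).
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho_improper = np.outer(bell, bell)

# Proper: a 50/50 classical mixture of |00> and |11>.
rho_proper = 0.5 * np.outer(np.kron(ket0, ket0), np.kron(ket0, ket0)) \
           + 0.5 * np.outer(np.kron(ket1, ket1), np.kron(ket1, ket1))

def reduce_A(rho):
    """Partial trace over particle B (the second tensor factor)."""
    return rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# Locally on A the two cases are identical.
assert np.allclose(reduce_A(rho_improper), np.eye(2) / 2)
assert np.allclose(reduce_A(rho_proper), np.eye(2) / 2)

# A joint measurement tells them apart: <XX> = 1 vs 0.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
XX = np.kron(sx, sx)
print(np.trace(rho_improper @ XX), np.trace(rho_proper @ XX))
```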
 
  • #68
vanhees71 said:
But, how can you distinguish proper from improper mixed states?

They can't be distinguished.
 
  • #69
vanhees71 said:
But how can you distinguish proper from improper mixed states? In the example in the paper you end up with unpolarized particles in both cases, described by the statistical operator ##\hat{\rho}=\frac{1}{2} \hat{1}##. IMHO there's no way to distinguish the two cases with measurements made only on particle A (which is why you trace out particle B in this example). Only by making joint measurements on both particle A and particle B can you observe the correlations implied by the preparation in the pure two-particle state.

You can distinguish a proper from an improper mixed state by measuring a nonlocal variable. An example is given in http://philsci-archive.pitt.edu/5439/1/Decoherence_Essay_arXiv_version.pdf, Section 1.2.3 on p. 10.
 
  • #70
Of course - but then I'm not using the reduced description, I'm using the state of the full system. It was exactly the example on p. 10 of the above-mentioned paper that led to my question. As so often in these interpretational discussions, it's much ado about nothing!
 
