How Does Environmentally Induced Decoherence Affect Quantum State Reduction?

  • Thread starter: Feeble Wonk
  • Tags: Decoherence
Summary
The discussion centers on the complexities of environmentally induced decoherence and its role in quantum state reduction. Participants express confusion over the definitions of "system," "apparatus," and "environment," particularly regarding their interactions and the implications for entropy. It is clarified that while a system can transition from a pure to a mixed state due to decoherence, the overall composite system remains in a pure state. The conversation emphasizes the philosophical distinction between the potential states represented by wave functions and the actual states observed post-interaction. Ultimately, the dialogue seeks to deepen understanding of how decoherence influences quantum states without resorting to mathematical formalism.
  • #62
I find no mathematics behind proper and improper states in this link.
Just words like "you prepare", "you ignore", and so on. In Everett's thesis the observer is a system; it is part of the theory. When it has observed something it is in a given state; if it reads the result again it is in another state. The physical memory is part of the model.
I am looking for something like that behind proper and improper states.
 
  • #63
naima said:
I find no mathematics behind proper and improper states in this link.

It's not a mathematical difference; it's a preparation difference. I have written on this many, many times, so one more time: a proper mixture is one where states are randomly presented for observation; an improper mixture is one that was not prepared that way. It's simple, and I will not pursue it further here in an old thread that has been resurrected.

Thanks
Bill
 
  • #64
bhobba said:
It's not a mathematical difference; it's a preparation difference. I have written on this many, many times, so one more time: a proper mixture is one where states are randomly presented for observation; an improper mixture is one that was not prepared that way. It's simple, and I will not pursue it further here in an old thread that has been resurrected.

There is actually a theorem involved in the claim that there is no mathematical difference. I forgot where I read this, but someone proved a theorem to the effect that every mixed state is obtainable by tracing out degrees of freedom from a pure state - this is the purification theorem. (In general, the pure state might belong to a larger, fictitious Hilbert space, though.)
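
A minimal numerical sketch of that purification construction, assuming numpy; the density matrix below is an arbitrary illustrative choice, and all names are my own:

```python
import numpy as np

# Purification sketch: any mixed state rho on H equals the partial trace of a
# pure state |psi> on H (x) H'. The density matrix below is an arbitrary
# illustrative choice (Hermitian, positive semidefinite, trace 1).
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

# Spectral decomposition rho = sum_k p_k |k><k|
p, vecs = np.linalg.eigh(rho)

# Purification |psi> = sum_k sqrt(p_k) |k> (x) |k>, using a copy of H as the
# "larger, fictitious" ancilla space mentioned above
psi = sum(np.sqrt(p[k]) * np.kron(vecs[:, k], vecs[:, k]) for k in range(len(p)))

# Tracing the ancilla back out recovers rho exactly
d = rho.shape[0]
rho_full = np.outer(psi, psi.conj())
rho_back = rho_full.reshape(d, d, d, d).trace(axis1=1, axis2=3)
print(np.allclose(rho_back, rho))   # True: every mixed state arises as an improper mixture
```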
 
  • Like
Likes bhobba
  • #65
What is an improper vs. a proper mixed state? Any state is represented by a positive semidefinite, self-adjoint, trace-class operator with trace 1, the statistical operator. You can distinguish pure states, where the statistical operator is a projection operator, from mixed states, where it is not. If your system is in a state described by a statistical operator, all you know about it are the probabilities for the outcomes of measurements. It doesn't matter how the system has been prepared in this state. I don't get the point of what's written on page 10 of the article cited in #61. How do you distinguish (by observations) between cases 2 and 3? According to standard quantum theory there is no possibility to distinguish the two cases!
 
  • #66
vanhees71 said:
What is an improper vs. a proper mixed state?

An improper mixed state is one obtained by starting with the density matrix for a pure state and then tracing over some of the degrees of freedom. So it's really about where the state came from rather than what it is: the resulting statistical operator is the same whether the mixture is proper or improper.
 
  • #67
But, how can you distinguish proper from improper mixed states? In the example in the paper you end up with unpolarized particles in both cases, described by the statistical operator ##\hat{\rho}=\frac{1}{2}\hat{1}##. In my opinion there's no way to distinguish the two cases with measurements made only on particle A (which is why you trace out particle B in this example). Only if you make joint measurements on both particle A and particle B can you observe the correlations implied by the preparation in the pure two-particle state.
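
A quick numpy check of this point (my own illustrative sketch, not the paper's code): the improper mixture obtained by tracing particle B out of the singlet state and a properly prepared 50/50 ensemble yield the identical statistical operator for particle A.

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Improper mixture: trace particle B out of the pure two-particle singlet state
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)
rho_AB = np.outer(singlet, singlet)
rho_A_improper = rho_AB.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# Proper mixture: a beam prepared as 50% spin-up, 50% spin-down
rho_A_proper = 0.5 * np.outer(up, up) + 0.5 * np.outer(down, down)

print(rho_A_improper)                              # (1/2) * identity
print(np.allclose(rho_A_improper, rho_A_proper))   # True: same statistics on A alone
```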
 
  • #68
vanhees71 said:
But, how can you distinguish proper from improper mixed states?

They can't be distinguished.
 
  • Like
Likes bhobba
  • #69
vanhees71 said:
But, how can you distinguish proper from improper mixed states? In the example in the paper you end up with unpolarized particles in both cases, described by the statistical operator ##\hat{\rho}=\frac{1}{2}\hat{1}##. In my opinion there's no way to distinguish the two cases with measurements made only on particle A (which is why you trace out particle B in this example). Only if you make joint measurements on both particle A and particle B can you observe the correlations implied by the preparation in the pure two-particle state.

You can distinguish a proper from an improper mixed state by measuring a nonlocal variable. An example is given in http://philsci-archive.pitt.edu/5439/1/Decoherence_Essay_arXiv_version.pdf, Section 1.2.3 on p. 10.
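
A sketch of the idea in numpy (my own toy version, not the paper's exact example): the two preparations share the same reduced state of A, but a joint, nonlocal observable - here the projector onto the singlet state - separates them.

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

# Global state behind the improper mixture: the pure two-particle singlet
rho_global_improper = np.outer(singlet, singlet)

# Global state behind a proper mixture: a classical 50/50 ensemble of |ud>, |du>
ud, du = np.kron(up, down), np.kron(down, up)
rho_global_proper = 0.5 * np.outer(ud, ud) + 0.5 * np.outer(du, du)

# Nonlocal observable: the projector onto the singlet state itself
P_singlet = np.outer(singlet, singlet)

print(np.trace(P_singlet @ rho_global_improper))   # 1.0
print(np.trace(P_singlet @ rho_global_proper))     # 0.5 -> the joint measurement differs
```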
 
  • #70
Of course, but then I don't use the reduced description but the state of the full system. It was exactly the example on p. 10 of the above-mentioned paper which led to my question. As so often in these interpretational discussions, it's much ado about nothing!
 
  • #71
vanhees71 said:
Of course, but then I don't use the reduced description but the state of the full system. It was exactly the example on p. 10 of the above-mentioned paper which led to my question. As so often in these interpretational discussions, it's much ado about nothing!

It is not much ado about nothing. Not distinguishing these has misled some into believing that decoherence solves the measurement problem, including physicists as distinguished as Anderson. If you don't like interpretation, it must be noted that most great physicists cared deeply about it. In modern times, one can read the comments about the importance of the distinction between proper and improper mixed states in https://www.amazon.com/dp/0198509146/?tag=pfamazon01-20.
 
Last edited by a moderator:
  • Like
Likes eloheim and rkastner
  • #72
atyy said:
It is not much ado about nothing. Not distinguishing these has misled some into believing that decoherence solves the measurement problem, including physicists as distinguished as Anderson. If you don't like interpretation, it must be noted that most great physicists cared deeply about it. In modern times, one can read the comments about the importance of the distinction between proper and improper mixed states in https://www.amazon.com/dp/0198509146/?tag=pfamazon01-20.

Yeah, with the various threads on why quantum mechanics is not weird, I've been trying to clarify in my mind exactly why I still think it is weird. It's definitely the measurement problem, but I have a hard time formalizing exactly why it bothers me. Roughly speaking, orthodox quantum mechanics seems a little schizophrenic. On the one hand, most people like to assume that there is nothing going on in a measurement process that cannot be explained by quantum mechanics. But if you try to describe the whole composite system (system being measured plus system doing the measuring) using quantum mechanics, then I don't see that anything even vaguely like the QM collapse postulate--after a measurement, the system is in an eigenstate of the property being measured--happens. Nor do I see anything vaguely like the more minimal description--you get some eigenvalue, with probabilities given by the Born rule--happening, either. If we are using QM to describe the composite system, then it's hard to see why there should be definite outcomes for measurements at all, or why probabilities come into play at all.

Decoherence is where the schizophrenia comes in. If you take the density matrix of the complete system, and trace out the environmental degrees of freedom, then you end up with a mixed-state density matrix. But then people want to interpret the mixed state using the ignorance interpretation. That doesn't make sense to me--you KNOW that the mixed state didn't arise from ignorance about the true state, because you just created the mixed state by tracing out the environmental degrees of freedom. It seems as though you're willfully forgetting what you just did.

So to me, orthodox QM just doesn't make sense. Maybe one of the other interpretations--objective collapse, or many-worlds, or Bohmian mechanics--makes sense, but the orthodox interpretation doesn't. It seems like people are willfully fooling themselves.
 
Last edited by a moderator:
  • Like
Likes eloheim and Nugatory
  • #73
Well, maybe in realizing that you have to think hard to figure out what bothers you, you only realized that there's nothing to bother about. Quantum theory tells us that nature is inherently stochastic/probabilistic/statistical. So what? That's how it is!

What is (or was for quite a while) an interesting theoretical challenge is that in our everyday experience macroscopic objects obey almost exactly the laws of classical physics, and we do not see quantum interference effects in macroscopic objects. That's why it took some time to discover quantum behavior (starting with black-body radiation in the late 1880s). I think, contrary to what atyy said in #71, that this is clearly solved by decoherence and by the fact that we are simply not able to resolve the fast microscopic scales of the dynamics of many-body systems. So we get the classical world from coarse-graining the description down to the macroscopically relevant slow observables at macroscopic scales. It must also be connected with the renormalization-group formalism in QFT/statistical physics. The Wilsonian interpretation is precisely that picture of effective theories at low energy-momentum (slowly and long-distance varying) scales emerging from more microscopic theories which reveal themselves only at high energy-momentum (fast and short-distance) scales. In this sense classical theory is an effective theory of quantum theory with some range of applicability.

The so-called measurement problem is then simply the question of how microscopic systems, sufficiently isolated from the environment to reveal quantum behavior, interact with the measurement apparatus, which provides "the environment" and thereby the "classicality condition" for measurement apparatuses already discussed by Bohr in the early 1930s (i.e., before Heisenberg confused the quantum community with his collapse in the 1950s ;-)).

What always bothered me, before I learned about the work on decoherence, was this quantum-classical cut, introduced ad hoc as an explanation for the classical behavior of measurement apparatuses, and the even more ad hoc assumption of a collapse of the state, which in almost all real measurements never occurs, because the quantum object is "destroyed" in the measurement process and thus it's not even necessary to describe it as an isolated quantum system anymore. What happens at or shortly after the "measurement" is entirely a property of the measurement apparatus and not of a general theory/model of the world.
 
  • Like
Likes Nugatory
  • #74
vanhees71 said:
Well, maybe in realizing that you have to think hard to figure out what bothers you, you only realized that there's nothing to bother about. Quantum theory tells us that nature is inherently stochastic/probabilistic/statistical. So what? That's how it is!

I am not arguing with either you or stevendaryl here (I have a great deal of sympathy for both positions), but the only takeaway here may be that the two of you have different thresholds for weirdness. There is a strong element of personal taste involved when considering whether an internally consistent and empirically supported position is also satisfactory.
 
  • Like
Likes bhobba
  • #75
vanhees71 said:
Well, maybe in realizing that you have to think hard to figure out what bothers you, you only realized that there's nothing to bother about. Quantum theory tells us that nature is inherently stochastic/probabilistic/statistical.

But it doesn't tell us that, at all. The only way that probabilities come into it is by the dubious steps of separating the measurement apparatus from the thing being measured, and then treating the former in a way that is inconsistent with the way the latter is treated.
 
  • #76
stevendaryl said:
But it doesn't tell us that, at all. The only way that probabilities come into it is by the dubious steps of separating the measurement apparatus from the thing being measured, and then treating the former in a way that is inconsistent with the way the latter is treated.

Specifically, you treat the system being measured as something whose state evolves unitarily according to Schrodinger's equation, and you treat the measuring device as something that has definite outcomes for measurements. That seems inconsistent to me.
 
  • #77
vanhees71 said:
(i.e., before Heisenberg confused the quantum community with his collapse in the 1950s ;-)).
It was von Neumann who, in his 1932 book where he made QM mathematically fully respectable, also made the collapse (then called state reduction) definite and prominent. Bohm then coined the name "collapse" for state reduction in 1951. From 1986 on, people from the quantum optics community observed the collapse as quantum jumps in certain continuous measurements of single atoms in an ion trap, so that it is now in various quantum optics books; see, e.g., Section 8.2 of Gerry & Knight 2005.

It is not appropriate to blame Heisenberg for all this - I don't even know what Heisenberg contributed.
 
Last edited:
  • #78
Nugatory said:
I am not arguing with either you or stevendaryl here (I have a great deal of sympathy for both positions), but the only takeaway here may be that the two of you have different thresholds for weirdness. There is a strong element of personal taste involved when considering whether an internally consistent and empirically supported position is also satisfactory.

But that is not the issue. Bohr's position is fine - it's weird, live with it, we can do science with it. Dirac's position is also fine - it's weird, but the weirdness will presumably be resolved by quantum theory not being the final theory.

What vanhees71 is claiming is that there is no measurement problem and no classical/quantum cut in a minimal interpretation, i.e., without BM or MWI. Vanhees71's claim is extremely controversial, and as far as I can tell it is wrong, and not a matter of taste. The book by Haroche and Raimond rebuts vanhees71's position that decoherence solves the measurement problem.
 
  • Like
Likes eloheim
  • #79
stevendaryl said:
If you take the density matrix of the complete system, and trace out the environmental degrees of freedom, then you end up with a mixed-state density matrix. But then people want to interpret the mixed state using the ignorance interpretation. That doesn't make sense to me--you KNOW that the mixed state didn't arise from ignorance about the true state, because you just created the mixed state by tracing out the environmental degrees of freedom. It seems as though you're willfully forgetting what you just did.

So to me, orthodox QM just doesn't make sense.
This only proves that the talk in orthodox QM about ignorance doesn't make sense. Once one accepts that the mixed state obtained by tracing out the environmental degrees of freedom is all there is to a state of a subsystem, nothing depends anymore on knowledge or ignorance. The mixed state is a complete description of the single system. In rare cases it happens to be a pure state, for example when one looks at a single silver atom in a Stern-Gerlach experiment, projects the state onto the region where one of the produced beams lives, and traces over all degrees of freedom except the silver atom's spin. Every case of a preparation of a pure state can be explained in a similar way. Thus there is nothing at all that depends on knowledge or ignorance - except the common talk in the textbooks.
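
A toy numpy version of that Stern-Gerlach remark (my own sketch; replacing the continuous beam position by a two-level "beam" label is an assumption for illustration): projecting onto one beam region and tracing out everything but the spin leaves a pure spin state.

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
upper_beam, lower_beam = up, down   # toy two-level stand-in for the beam position

# After the magnet, spin and beam position are entangled
psi = (np.kron(up, upper_beam) + np.kron(down, lower_beam)) / np.sqrt(2)

# Project onto the region where the upper beam lives ...
P_upper = np.kron(np.eye(2), np.outer(upper_beam, upper_beam))
psi_cond = P_upper @ psi
psi_cond = psi_cond / np.linalg.norm(psi_cond)

# ... then trace over everything except the spin
rho = np.outer(psi_cond, psi_cond)
rho_spin = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(rho_spin)                                      # |up><up|
print(np.isclose(np.trace(rho_spin @ rho_spin), 1))  # True: the reduced state is pure
```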
 
  • Like
Likes vanhees71
  • #80
A. Neumaier said:
This only proves that the talk in orthodox QM about ignorance doesn't make sense. Once one accepts that the mixed state obtained by tracing out the environmental degrees of freedom is all there is to a state of a subsystem, nothing depends anymore on knowledge or ignorance. The mixed state is a complete description of the single system.

Okay, but a mixed state can potentially describe a nonzero probability of (say) a cat being dead and a cat being alive. If you don't want to talk about cats, you can replace them by any other two macroscopically distinguishable possibilities. The mixed-state formalism can account for a nonzero probability for two different macroscopically distinguishable possibilities. So either both possibilities are real (which to me means many-worlds), or only one of them is real, in which case somehow a single possibility was selected.
 
  • Like
Likes eloheim
  • #81
stevendaryl said:
but a mixed state can potentially describe a nonzero probability of (say) a cat being dead and a cat being alive.
A theoretical mixed state, yes, but not a mixed state realized in Nature according to the tracing-out rule given - unless the state of the big system from which this state was obtained by tracing out the environment was already very weird. A mixed state is admissible in the arguments only if we can tell how to prepare it, given the laws of Nature and the tracing-out rule. We can do that for pure spin states and for superpositions of tensor products of a few spin states, but even that only in carefully controlled situations. But no apparatus in the universe would prepare a cat in a mixed state of the kind you proposed. At least no known one - which is sufficient to explain why we don't observe these strange things. Nothing needs to be selected, since the state cannot be prepared in the first place.
 
  • Like
Likes Mentz114
  • #82
atyy said:
The book by Haroche and Raimond rebuts vanhees71's position that decoherence solves the measurement problem.
In which paragraph or page?
 
  • #83
A. Neumaier said:
A theoretical mixed state, yes, but not a mixed state realized in Nature according to the tracing-out rule given - unless the state of the big system from which this state was obtained by tracing out the environment was already very weird.

Well, part of the difficulty here is that we really can't do quantum mechanics with ##10^{23}## particles except in heuristic ways. So the weirdness is perhaps lost in the complexity. But it seems to me that you could set up a situation in which a microscopic difference (whether an electron is spin-up or spin-down) is magnified to make a macroscopic difference. That's what Schrodinger's cat is about. For that matter, that's what any measurement does. So if you consider it weird for a microscopic difference to be magnified to become a macroscopic difference, then such weirdness is an inherent part of the empirical content of QM.

Suppose you set things up so that:
  • The detection of a spin-up electron leads to a dead cat.
  • The detection of a spin-down electron leads to a live cat.
Then you create an electron that is in a superposition ##\alpha |\text{up}\rangle + \beta |\text{down}\rangle##, and you send it to the detector. What happens? Well, the Copenhagen interpretation would tell us that macroscopic objects like cats are classical, not quantum. So rather than leading to a superposition of a dead cat and a live cat, what we would get is EITHER a dead cat, with probability ##|\alpha|^2##, or a live cat, with probability ##|\beta|^2##. But that seems inconsistent to me. Why, for small systems, do we get superpositions, rather than alternatives, but for large systems, we get alternatives? That's the weirdness, if not outright inconsistency, of standard quantum mechanics.

Of course, some people claim that decoherence explains why we get alternatives, rather than superpositions, but I don't think it actually does that. What it explains is that superpositions rapidly spread with time: You start off with a single particle in a superposition of states, and then it interacts with more particles putting that composite system into a superposition, and that composite system interacts with the environment (the electromagnetic field) putting it into a superposition of states. The superposition doesn't go away, but it spreads to infect the whole universe (or our little part of it, anyway). But then a trace over everything other than the system of interest gives us what looks like a mixed state, where we can interpret the components of the mixture as alternatives, rather than superpositions.
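
A small numerical illustration of that spreading (my own toy model; the per-qubit record strength and the analytic form of the reduced state are illustrative assumptions):

```python
import numpy as np

# Toy decoherence model (illustrative assumptions): the system qubit starts in
# (|0> + |1>)/sqrt(2); each of N environment qubits picks up a partial record,
# ending in |e0> or |e1> depending on the system branch. The global state
#   alpha |0>|e0>^N + beta |1>|e1>^N
# stays pure, but the reduced system coherence is multiplied by <e0|e1>^N.

alpha = beta = 1 / np.sqrt(2)
theta = 0.5                                    # per-qubit record strength (assumed)
e0 = np.array([1.0, 0.0])
e1 = np.array([np.cos(theta), np.sin(theta)])  # imperfect record: <e0|e1> = cos(theta)

for N in (0, 1, 5, 20, 50):
    overlap = np.vdot(e0, e1) ** N             # <E_0|E_1> for the N-qubit environment
    rho_sys = np.array([[abs(alpha)**2,                 alpha * np.conj(beta) * overlap],
                        [np.conj(alpha) * beta * np.conj(overlap), abs(beta)**2]])
    print(N, abs(rho_sys[0, 1]))               # off-diagonal -> 0: looks like a mixture
```

The global state stays pure throughout; the coherence hasn't vanished but has been delocalized into system-environment correlations, which is why the reduced state merely looks like a mixture.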
 
  • Like
Likes Feeble Wonk
  • #84
I think that's a misunderstanding of decoherence. We don't suddenly change the interpretation when we compute reduced density matrices. In fact, we never need to compute the reduced density matrix for decoherence; we could just work with the full quantum state. It's only a matter of convenience to compute the reduced density matrix. Quantum mechanics is a theory that predicts relative frequencies for certain events. It provides us with a probability distribution for each observable. In fact, we could get rid of the Hilbert space and operators completely and reformulate QM purely as a bunch of evolution equations for these probability distributions. Decoherence explains why those probability distributions don't usually exhibit oscillatory behaviour. For example, it explains why the probability distribution for the throw of a die is ##P_i = \frac{1}{6}## and not rather ##P_1 = P_3 = P_5 = \frac{1}{3},\ P_2 = P_4 = P_6 = 0##. So decoherence explains why the probability distributions that QM predicts agree with those that we would expect classically.
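
A toy illustration of that last point (my own construction, assuming a six-level "die" register and numpy): with coherences intact, a phase-sensitive (discrete Fourier) basis shows stark interference; after the environment has killed the off-diagonals, the distribution is the classical flat one.

```python
import numpy as np

d = 6
psi = np.ones(d) / np.sqrt(d)                # coherent equal superposition
rho_coherent = np.outer(psi, psi)
rho_dephased = np.diag(np.full(d, 1 / d))    # what tracing out an environment leaves

# Measure in the discrete-Fourier basis, which is sensitive to the phases
F = np.array([[np.exp(2j * np.pi * j * k / d) / np.sqrt(d) for k in range(d)]
              for j in range(d)])

def probs(rho):
    return np.real(np.diag(F @ rho @ F.conj().T))

print(probs(rho_coherent))   # [1, 0, 0, 0, 0, 0]: interference, nothing die-like
print(probs(rho_dephased))   # [1/6, 1/6, ...]: the classical distribution
```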

What more do you expect from a physical theory than a prediction of relative frequencies? And if you don't expect more, then why does QM have problems?
 
  • #85
rubi said:
Quantum mechanics is a theory that predicts relative frequencies for certain events.

I don't think it really does that. Can you say what an "event" is, without making a macroscopic/microscopic distinction?

[edit] What I should have said is that I don't think quantum mechanics gives probabilities (relative or otherwise) without additional assumptions that seem ad hoc.
 
  • #86
stevendaryl said:
I don't think it really does that. Can you say what an "event" is, without making a macroscopic/microscopic distinction?
Yes, I think so: let ##A## be any observable you want. Let ##\sigma(A)## be its spectrum and ##\psi_a## be the generalized eigenvectors of ##A##. The set of events for this observable is ##\mathcal B(\sigma(A))##, the smallest sigma-algebra that contains all the open sets of ##\sigma(A)##, and for each such event ##B##, its probability is given by ##P(B) = \int_B |\left<\psi_a,\Psi\right>|^2\,\mathrm{d}a##. For example, ##A## could be the position operator ##\hat x(t)## at time ##t## and ##B## could just be the event "the position at time ##t## lies between 2 and 3", which would mathematically be represented by the interval ##B=(2,3)##. This should account for every event you could think of.
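
A numerical sketch of exactly that event probability, assuming scipy and an illustrative choice of state (a normalized Gaussian packet centered at ##x_0=2.5## with width ##\sigma=1##, for which ##\left<\psi_x,\Psi\right>## is just ##\Psi(x)##):

```python
import numpy as np
from scipy.integrate import quad

# Event probability P(B) = int_B |<psi_x, Psi>|^2 dx for the position observable.
x0, sigma = 2.5, 1.0

def Psi(x):
    # Normalized Gaussian wave packet (my own illustrative choice of state)
    return (2 * np.pi * sigma**2) ** (-0.25) * np.exp(-(x - x0)**2 / (4 * sigma**2))

# B = "the position lies between 2 and 3", i.e. the interval (2, 3)
P_B, _ = quad(lambda x: abs(Psi(x))**2, 2.0, 3.0)
print(P_B)   # approximately 0.383 for this packet
```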
 
Last edited:
  • #87
rubi said:
Yes, I think so: let ##A## be any observable you want. Let ##\sigma(A)## be its spectrum and ##\psi_a## be the generalized eigenvectors of ##A##. The set of events for this observable is ##\mathcal B(\sigma(A))##, the smallest sigma-algebra that contains all the open sets of ##\sigma(A)##, and for each such event ##B##, its probability is given by ##P(B) = \int_B |\left<\psi_a,\Psi\right>|^2\,\mathrm{d}a##. For example, ##A## could be the position operator ##\hat x(t)## at time ##t## and ##B## could just be the event "the position at time ##t## lies between 2 and 3", which would mathematically be represented by the interval ##B=(2,3)##. This should account for every event you could think of.

Okay, this is definitely NOT the standard way of presenting quantum mechanics, which is what I had a complaint about. But let me take your presentation. It does not, so far, have any connection to anything with empirical content. To make a connection with something observable, you have to associate probabilities with measurement outcomes. Which means that you have to face the measurement problem: what does it mean to measure some observable?
 
  • #88
stevendaryl said:
Okay, this is definitely NOT the standard way of presenting quantum mechanics, which is what I had a complaint about.
The formula ##P(B)## I wrote down is just the Born rule; I just wrote it in a way that allows you to directly plug in the events ##B## that you are interested in. I think it is fairly standard - at least, we regularly teach it this way at my university.

But let me take your presentation. It does not, so far, have any connection to anything with empirical content. To make a connection with something observable, you have to associate probabilities with measurement outcomes.
The probabilities are given by ##P(B)##. For each observable, QM allows you to compute such a probability distribution. Let's say we measure the spin of a particle. Then my formula would give you the probabilities ##P_\uparrow = P(\{\uparrow\})## and ##P_\downarrow = P(\{\downarrow\})##. These are the probabilities that predict the relative frequencies of spin measurements.

Which means that you have to face the measurement problem: what does it mean to measure some observable?
I don't understand this question. Can you explain how you would answer this question in the case of classical mechanics and how it would be different from quantum mechanics? What would it mean to measure an observable in CM?
 
  • #89
rubi said:
I don't understand this question. Can you explain how you would answer this question in the case of classical mechanics and how it would be different from quantum mechanics? What would it mean to measure an observable in CM?

To measure an observable means to set things up so that there is a correspondence between possible values of the observable and macroscopically distinguishable states of the measuring device. An example might be a pointer that pivots in a semicircle. Then you set things up so that the angle of the pointer is affinely related to the value of a real-valued observable.

Implicit in this is the assumption that the pointer actually has a definite value. If the pointer could be in a superposition of positions, then I don't know what it would mean to say that it measures an observable. And that's the case with quantum mechanics. If the system being measured is in a superposition of different values of an observable, and you let the system interact with a measurement device, I would expect (if we analyzed the measurement device itself using quantum mechanics) the result to be that the measurement device would be put into a superposition of states. (or that a larger system, including measuring device + environment, would be put into a superposition of states).
 
  • #90
The problem, which to me seems like an inconsistency in the quantum formalism, is that for a small system, such as a single electron, observables don't have definite values, in general. If an electron has spin state ##\begin{pmatrix} \alpha \\ \beta \end{pmatrix}##, what is the ##z##-component of its spin? The question doesn't have an answer. It's in a superposition of spin-up and spin-down. But if you take a macroscopic system such as a detector, and you measure the ##z##-component of the spin, you don't get a superposition of answers, you get either spin-up or spin-down. The macroscopic system has a definite state.

Why do macroscopic systems have definite states, if microscopic systems don't?
 
