Can a conscious observer collapse the probability wave?

Summary:
The discussion centers on the role of a conscious observer in collapsing the quantum wave function, particularly in the context of the double slit experiment. It is debated whether a conscious observer is necessary for this collapse or if any measurement, regardless of consciousness, suffices. The consensus leans toward the idea that any interaction or measurement leads to collapse, independent of whether the observer retains the information. Observers are considered inadequate recording devices due to the unreliability of human memory, suggesting that the interference pattern may persist even with observation. Ultimately, the conversation highlights the complexity of defining measurement in quantum mechanics and the ongoing debate surrounding the observer's role.
  • #91
Meselwulf said:
The old quantum idea before decoherence was that it took a human being to collapse the wave function.

Actually very few believed that - only Wigner, Von Neumann and their cohort. Wigner later abandoned it however.

Thanks
Bill
 
  • #92
bhobba said:
I fail to see your point. There are two types of models - stochastic (i.e. fundamentally random) and deterministic. If it were deterministic then you would be able to define a probability measure taking only the values 0 and 1 - which you can't do if Gleason's theorem holds. Do you really believe in contextuality and theories that have it, like BM? But yes, that is an assumption I make and adhere to.

Your determinism argument only makes sense if you want to assign probabilities to subspaces at all. Why should we? We know it makes sense because we observe it, but that's not a good reason for assuming it. Doing so introduces exactly what we really want to understand.


Since I accept the measurement postulate as a given, that is not an issue. The advantage of the fact that it is now a mixed state is that the interpretation is different.

If you accept the measurement postulate then you cannot derive anything relevant to solving the measurement problem from using it. Because solving the measurement problem (even partly) means to explain the origin of the measurement postulate.

'Seriously, a mixed state is an ensemble description. In fact, one of the peculiar things about the interplay between mixed state statistics and quantum statistics is that considering particles in a "mixed state" is indistinguishable from considering them in a randomly drawn pure state if that random drawing gives a statistically equivalent description as the mixed state. Worse, there are *different* ensembles of *different* pure states which are all observationally indistinguishable from the "mixed state". What describes a mixed state, or all of these ensembles, is the density matrix rho.'

Like I said, stating that a single reduced state described by a density operator is indistinguishable from an actual ensemble (no matter which realization) requires using the measurement postulate. So it does not help at all for saying anything about how measurement works. Decoherence does not solve the measurement problem, not even remotely, not with the Gleason theorem, not with MWI, just not at all.
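The ensemble indistinguishability quoted above is easy to check numerically. Here is a minimal sketch (using Python/NumPy, my own illustration, not anything from the thread): two *different* ensembles of pure qubit states produce the identical density matrix, so no measurement statistics can tell them apart.

```python
import numpy as np

# Ensemble (a): |0> and |1>, each with probability 1/2
# Ensemble (b): |+> and |->, each with probability 1/2
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

def density(kets, probs):
    """rho = sum_i p_i |psi_i><psi_i| for an ensemble of pure states."""
    return sum(p * np.outer(k, k.conj()) for p, k in zip(probs, kets))

rho_a = density([ket0, ket1], [0.5, 0.5])
rho_b = density([plus, minus], [0.5, 0.5])

# Both reduce to the maximally mixed state I/2 - observationally identical.
print(np.allclose(rho_a, rho_b))  # True
```

Any observable's statistics depend only on rho, which is why the two preparations cannot be distinguished by measurement.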
 
  • #93
bhobba said:
Actually very few believed that - only Wigner, Von Neumann and their cohort. Wigner later abandoned it however.

Thanks
Bill

I always was of the opinion that to collapse the wave function it was essential to be wearing glasses with heavy frames, a skinny black tie, and a white lab coat.
 
  • #94
f95toli said:
There are LOTS of us out there working to solve these problems, and the vast majority of us have very little interest in the "philosophy of QM": we just want our devices to work better and "decoherence theory" gives us a route for improvement.
Good post! However, I never quite got what people mean when they talk about "decoherence theory". I know the theory of open quantum systems and how decoherence arises there. Is this equivalent to "decoherence theory" or is there more to it? If yes, what are the axioms of "decoherence theory"?
 
  • #95
Jazzdude said:
Your determinism argument only makes sense if you want to assign probabilities to subspaces at all. Why should we? We know it makes sense because we observe it, but that's not a good reason for assuming it. Doing so introduces exactly what we really want to understand.

It's from the postulate that observables are Hermitian operators whose eigenvalues are the possible outcomes. The spectral theorem implies, since obviously the actual values are unimportant, that the projection operators of the decomposition give the probability of getting each outcome. That is easy to see if you consider a function of the observable that gives its expectation. Although it is a stronger assumption than the one made by Gleason's theorem, you can in fact derive the standard trace formula from the simple assumption that expectations are additive, as von Neumann did in his proof against hidden variable theories. In fact that's the precise assumption Bell homed in on in his refutation - it's not necessarily true of hidden variable theories.

Jazzdude said:
If you accept the measurement postulate then you cannot derive anything relevant to solving the measurement problem from using it. Because solving the measurement problem (even partly) means to explain the origin of the measurement postulate.

I don't get it - I really don't. The problem with observing a pure state is that it discontinuously changes to an unpredictable state, and the system cannot be assumed to have been in that state prior to observation. But, as the link I gave on mixed states said: 'a "mixed state" is indistinguishable from considering them in a randomly drawn pure state if that random drawing gives a statistically equivalent description as the mixed state'. Both are part of the measurement postulate, but the second situation does not have the problems of the first, such as in Schrödinger's Cat, where the cat can be alive and dead at the same time prior to observation. Being in a mixed state, it is either alive or dead. It does not solve all the problems - only some of them - but it does solve some.

Thanks
Bill
 
  • #96
Jazzdude, do you think the dBB interpretation solves the measurement problem?

For me, the measurement problem is mainly to explain collapse / the appearance of collapse and not necessarily to explain the Born rule. If we require an explanation for every probabilistic element of QM, we are implicitly assuming that the theory is deterministic.
 
  • #97
bhobba said:
I don't get it - I really don't.
I think this is a semantic issue. I use the following definitions:

measurement problem: explain collapse
measurement postulate: collapse + Born rule

If I get him right, Jazzdude wants to explain the measurement postulate while you want to solve the measurement problem.

/edit: I forgot the "observables are self-adjoint operators and outcomes are eigenvalues" part of the measurement postulate. This part is probably not in doubt for Jazzdude.
 
Last edited:
  • #98
kith said:
Good post! However, I never quite got what people mean when they talk about "decoherence theory". I know the theory of open quantum systems and how decoherence arises there. Is this equivalent to "decoherence theory" or is there more to it? If yes, what are the axioms of "decoherence theory"?

It makes use of the standard postulates of QM - nothing new is required.

Thanks
Bill
 
  • #99
bhobba said:
It makes use of the standard postulates of QM - nothing new is required.
I think so, too. The question is why do people talk about decoherence theory in the first place and what does it include.
 
  • #100
kith said:
I think this is a semantic issue. I use the following definitions:

measurement problem: explain collapse
measurement postulate: collapse + Born rule

If I get him right, Jazzdude wants to explain the measurement postulate while you want to solve the measurement problem.

Maybe.

To me the measurement postulate is E(R) = Tr(pR) where p is the state. I assume it's true. The measurement problem for a pure state follows from the postulate in that it's easy to see that if p is a pure state it will in general discontinuously change to another pure state. However, if p is a mixed state over the outcomes of an observation, then the interpretation of the postulate is different - the system can be assumed to be in one of those states prior to observation with a certain probability. Because decoherence converts a pure state to a mixed state there is no discontinuous change of the state - it can be assumed to be in that state prior to observation. Because of that, as the link I gave said, 'taking a partial trace amounts to the statistical version of the projection postulate.'
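The trace formula and the role of the partial trace can be seen in a two-qubit toy model (a NumPy sketch of my own, not from the thread): tracing out the "apparatus" half of an entangled pure state leaves the system in a mixed state with no interference terms, and expectation values come out of E(R) = Tr(pR).

```python
import numpy as np

# Entangled system+apparatus pure state (|0>|A0> + |1>|A1>)/sqrt(2),
# schematically the end point of decoherence with orthogonal pointer states.
psi = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)
rho_full = np.outer(psi, psi.conj())

# Partial trace over the second (apparatus) qubit: after the reshape the
# indices are (system, apparatus, system', apparatus').
rho_sys = rho_full.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho_sys.real)  # diag(0.5, 0.5): a mixed state, off-diagonals gone

# The measurement postulate E(R) = Tr(pR) for the observable R = sigma_z:
sigma_z = np.diag([1.0, -1.0])
print(np.trace(rho_sys @ sigma_z).real)  # 0.0
```

The reduced state diag(0.5, 0.5) is exactly the "either this outcome or that, with these probabilities" reading discussed above; the global state, however, is still pure.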

If that doesn't do it I am afraid I will leave it to someone else - I am sort of pooped.

Thanks
Bill
 
  • #101
I think there's some fundamental confusion about what exactly solving the measurement problem means. It means that you have to answer the questions of what a measurement is, why possible measurement results are given by the spectra of Hermitian operators, where the indeterminism comes from, why we observe a collapsed state, and why we observe the statistics given by the Born rule.

In other words, you have to give reasons for all the measurement related statements in the postulates of canonical quantum theory. This comes down to deriving the measurement postulate and all associated structure from something simpler, ideally even from nothing but unitary quantum theory.

Specifically, I am not allowed to assume that observables are given by hermitian operators whose spectrum defines the possible outcomes, I'm not allowed to assume that density operators describe ensembles, etc.

Kith, I don't think that dBB solves the measurement problem, I don't think that any established theory does.
 
  • #102
Jazzdude said:
I think there's some fundamental confusion about what exactly solving the measurement problem means. It means that you have to answer the questions of what a measurement is, why possible measurement results are given by the spectra of Hermitian operators, where the indeterminism comes from, why we observe a collapsed state, and why we observe the statistics given by the Born rule.

Ahhhh. Yes - most definitely. With that view I agree with what you write. I have my own answers to such questions and decoherence is just one small part of it. Indeed such is known as an interpretation - my interpretation is the ensemble interpretation combined with decoherence.

Jazzdude said:
Kith, I don't think that dBB solves the measurement problem, I don't think that any established theory does.

Nor do I - my view doesn't solve all the issues - I simply like it because the problems it doesn't solve I find acceptable. As I often say all current interpretations suck - you simply choose the one that sucks the least to you.

Thanks
Bill
 
Last edited:
  • #103
bhobba said:
Nor do I - my view doesn't solve all the issues - I simply like it because the problems it doesn't solve I find acceptable. As I often say all current interpretations suck - you simply choose the one that sucks the least to you.



Problem is: decoherence + ensemble interpretation doesn't solve a single thing.
You got all the usual paradoxes and unanswered questions...
 
  • #104
Quantumental said:
Problem is: decoherence + ensemble interpretation doesn't solve a single thing. You got all the usual paradoxes and unanswered questions...

Obviously since I hold to it I don't agree. But you are not the only one to hold that view - indeed there are those who believe that the ensemble interpretation (with or without decoherence) is simply a restating of the math and should not even be given the title of an actual interpretation.

Thanks
Bill
 
  • #105
Jazzdude said:
I think there's some fundamental confusion about what exactly solving the measurement problem means.
The term is not as clearly defined as you suggest and I don't think you are representing the mainstream view. Schlosshauer for example defines the measurement problem as the combination of "the problem of definite outcomes" and "the problem of the preferred basis" which is only a small part of your definition.

Jazzdude said:
Specifically, I am not allowed to assume that observables are given by hermitian operators whose spectrum defines the possible outcomes
Why not? What would be an "allowed" assumption for the observables? Why are functions on the phase space "allowed" and self-adjoint operators on the Hilbert space are not?

Jazzdude said:
In other words, you have to give reasons for all the measurement related statements in the postulates of canonical quantum theory.
We cannot give reasons for all measurement related statements in any scientific theory, because the theory has to say how the mathematical objects relate to experimentally observable quantities. I can only think of two reasons to question the validity of the postulates of a theory:
(1) the theory is not consistent
(2) there exists a simpler theory which makes the same predictions

(1) is arguably true for the postulates of orthodox QM, but the only contradiction is between unitary evolution and collapse. So if we are able to explain collapse (and many interpretations accomplish this), the inconsistencies go away. (2) may be true, but as long as we haven't found this simpler theory, we cannot claim that the current theory needs an explanation.
 
  • #106
bhobba said:
Obviously since I hold to it I don't agree. But you are not the only one to hold that view - indeed there are those who believe that the ensemble interpretation (with or without decoherence) is simply a restating of the math and should not even be given the title of an actual interpretation.


Except that you are not allowed to disagree by the laws of logic unless you actually have explanations for the quantum phenomena.

It's not "my view" that the ensemble interpretation with or without decoherence does not solve anything, it is objective reality.
 
  • #107
bhobba said:
Nor do I - my view doesn't solve all the issues - I simply like it because the problems it doesn't solve I find acceptable. As I often say all current interpretations suck - you simply choose the one that sucks the least to you.
Hahaha, that's an answer I love!

kith said:
Why not? What would be an "allowed" assumption for the observables? Why are functions on the phase space "allowed" and self-adjoint operators on the Hilbert space are not?
Simply: you are not allowed to give measurement any special role. Just as in classical physics, you would need to calculate the outcomes of a measurement apparatus by applying the equations of motion to it and finding that the calculated behavior is consistent with the display (e.g. the calculated amplitude of the needle in a galvanometer corresponds to the labels on the display). This is required to legitimize that your detector measures exactly what it is said to measure and not something entirely different.

Alternatively, if you want to say that a measurement can be represented by a self-adjoint operator, you must define exactly where this operator arises from and why applying it to the state yields the value you are searching for. Say I give you the blueprint of a detector: you must be able to calculate the corresponding self-adjoint operator it measures and prove that the measurement process using the operator is consistent with the equations of motion of the theory (thus applying it to the state is merely a short-cut to calculate the results). Guessing the observable for a detector is not rigorous enough.

And sure, in any case you need to find an adequate representation of your detector within the theory. That already implies some interpretation of which parts of the apparatus are actually relevant for the measuring and thus must be modeled (though that should be experimentally checkable). In the case of QM, and in the simplest case, one would expect that a detector can be represented by the potential (and other physical fields) it puts the measured object in, and that these themselves arise from the components your detector is built from (which in their finest decomposition will themselves be molecules, atoms and so on - thus objects the theory must describe).

That said, every theory requires a kind of 'interpretation' that translates everything we experience in reality into an adequate representation within the theory. Obviously this translation must be well defined and unique for every object. In the case of classical physics this is mostly obvious, but it becomes difficult in QM due to the fact that it focuses on describing microscopic objects. But this is the only way to construct a general theory that in principle can be applied to any problem. Otherwise a theory is not a complete description and has interpretation-related degrees of freedom that can be used to bend the results in any way needed (i.e. if the theory yields wrong results I could just say: hey, my self-adjoint measurement operator was wrong (does not represent my new detector), construct one that gives me the results I want, and say that the new operator is the adequate representation).
 
Last edited:
  • #108
Quantumental said:
Except that you are not allowed to disagree by the laws of logic unless you actually have explanations for the quantum phenomena. It's not "my view" that the ensemble interpretation with or without decoherence does not solve anything, it is objective reality.

Yeah - I guess guys like Ballentine have got it all wrong then - he uses it in his standard textbook to solve pretty much every issue and even purports to show (see Chapter 9) that any other interpretation leads to problems. Now, even though I hold to that interpretation, I am not saying I necessarily agree with him, but it does show it's not quite the 'objective reality' you seem to think it is. The fact of the matter is that what any interpretation solves, or even whether it needs to be solved, is a matter of opinion - nothing to do with 'objective reality', whatever that is in this connection.

Thanks
Bill
 
Last edited:
  • #109
Killtech said:
alternatively if you want to say that a measurement can be represented by a self-adjoint operator you must exactly define where this operator arises from and why applying it to the state yields the value you are searching for.

No - all you need is to have it as an axiom - which it is. See Chapter 2 of Ballentine.

Of course you may decide to give it a deeper justification - but you don't have to. In this case, though, I believe there is one. Suppose there is a system and an observational apparatus with n outcomes yi. Write them out as the vector sum_i yi |bi>. The problem is the yi are not invariant under a change of basis, and since the basis is an entirely arbitrary man-made thing, the outcomes should be expressed in a way that is basis invariant. By changing the |bi> to |bi><bi| we get sum_i yi |bi><bi|, a Hermitian operator whose eigenvalues are the possible outcomes of the measurement and which is basis invariant.
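The basis-invariance argument above can be checked directly. A hedged NumPy sketch (my own illustration): build O = sum_i yi |bi><bi| and verify that its eigenvalues - the possible outcomes - survive an arbitrary unitary change of basis.

```python
import numpy as np

y = np.array([1.0, -1.0, 3.0])        # the outcomes y_i
basis = np.eye(3, dtype=complex)      # columns are the |b_i>

# O = sum_i y_i |b_i><b_i| is Hermitian with eigenvalues y_i.
O = sum(yi * np.outer(b, b.conj()) for yi, b in zip(y, basis.T))

# A change of basis acts as O -> U O U^dagger for unitary U;
# the eigenvalues (the possible outcomes) are unchanged.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(A)                # a random unitary from a QR factorization
O_new = U @ O @ U.conj().T

print(np.sort(np.linalg.eigvalsh(O)))      # [-1.  1.  3.]
print(np.sort(np.linalg.eigvalsh(O_new)))  # same outcomes, up to rounding
```

The y values, basis, and random unitary are arbitrary choices for the demonstration; any Hermitian O and unitary U behave the same way.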

Thanks
Bill
 
  • #110
bhobba said:
No - all you need is to have it as an axiom - which it is. See Chapter 2 of Ballentine.
Let's take the idea to the extreme. I give you the theory of everything: the black box function theory. This theory is very simple and consists of only one axiom: for every possible setup there exists an exact black box function that always yields the right results. Of course you can measure things, and thereby obtain parts of the black box function, and use it to predict future experiments with the same setup. But because there is no interpretation available, you will never be able to derive the black box function theoretically from the experimental setup. Thus all you get is a purely empirical theory without any content at all. And of course, by definition it describes the world perfectly and is always right.

But that's not what we are searching for. We want a theory that, given any blueprint of an experimental setup, can calculate everything correctly without any need of measurements beforehand. But for that you need an interpretation to translate the setup into the terms of the theory first, so you can do all the calculations. And the theory should require minimal input; that is, if the electron charge can be derived within the theory, then this is preferable to having it as a kind of variable dependent on measurements.
 
Last edited:
  • #111
Killtech said:
But that's not what we are searching for. We want a theory that, given any possible blueprint of an experimental setup, can calculate everything correctly without any measurements beforehand. But for that you need an interpretation to translate the setup into the terms of the theory first, so you can do all the calculations.

Why do you believe QM requires measurements beforehand to make predictions?

QM is a theory about measurements but it does not require any beforehand.

Or is your beef that values cannot be assigned independent of measurement - sorry - Bell and Aspect ruled it out.

Thanks
Bill
 
Last edited:
  • #112
bhobba said:
Why do you believe QM requires measurements beforehand to make predictions?

QM is a theory about measurements but it does not require any beforehand.

Or is your beef that values cannot be assigned independent of measurement - sorry - Bell and Aspect ruled it out.

Thanks
Bill
How do you know, then, that a spin detector actually measures the spin?
You cannot check it within the theory, so you must check it experimentally. Or you rely on a different (classical) theory to give it a justification (though that theory is known to be wrong for microscopic objects).
If you get wrong results in a setup, you cannot rule out that the observable operator you were using was perhaps wrong, and if it was, it does not falsify the theory, because the theory does not give any derivation of the operator for a given detector. So the theory is fail-safe in that regard.
 
Last edited:
  • #113
Killtech said:
How do you know, then, that a spin detector actually measures the spin?
You cannot check it within the theory, so you must check it experimentally.

The same way an engineer designs anything - a combination of theory and of course testing.

Killtech said:
If you get wrong results in a setup, you cannot rule out that the observable operator you were using was perhaps wrong, and if it was, it does not falsify the theory, because the theory does not give any derivation of the operator for a given detector. So the theory is fail-safe in that regard.

That's why experimental results are checked independently. If an experiment produces anomalous results, all sorts of things are checked - but so far QM has come through unscathed.

Do you have any actual comment about QM rather than this general philosophical waffling?

Thanks
Bill
 
Last edited:
  • #114
Given any self-adjoint operator, derive a blueprint for a detector that measures it.
 
  • #115
Killtech said:
Given any self-adjoint operator, derive a blueprint for a detector that measures it.

Given E = mc^2, design a cyclotron and detector to measure it.

Sorry mate this will be my last reply to this off topic irrelevancy. I suggest you take it to the philosophy forums.

Thanks
Bill
 
  • #116
Darwin123 said:
According to decoherence theory, the isolated system containing the environmental system and the probed system really evolves by the Schroedinger equation. The "randomness" of the measured results corresponds to unknown phases in the environmental system. There is an assumption here that there are far more unknown phases in the environmental system than in the measured system. Thus, the environment is considered complex.
That doesn't cut it. In my view, decoherence theory is actually something completely different than that-- it is something that allows you to treat subsystems via projections. That's it, that's all it does. It never says anything at all about isolated systems, because we never do observations on isolated systems. That is the key statement at the very heart of "the measurement problem", and note that decoherence has nothing whatever to say about it (because decoherence theory is all about how to treat subsystems). Even with decoherence theory, which in my view is just basic quantum mechanics, one still has the unanswered question: does the isolated system evolve by the Schroedinger equation, or doesn't it? Taking a stand on that question invokes an interpretation of quantum mechanics, and decoherence theory simply doesn't help at all.

Let me give an example, the Schroedinger cat. Decoherence theory has no trouble saying why the cat is in a mixed state, so is either dead or alive-- it's because "the cat" is actually a projection from a much larger isolated system. So in "true" physical terms, there is no such thing as "the cat", it is merely a choice we make to consider only a fraction of what the reality holds. Decoherence theory is no help with this, all it does is recognize that in fact "the cat" does not exist as an independent entity in the theory of quantum mechanics, it is a kind of social construct that involves a projection from that which is treated in the physical theory. The social construct is easily constructed as being either alive or dead, and there is no contradiction with the unitary evolution of the actual physical entities treated by the Schroedinger equation (if one holds that interpretation). Hence, decoherence explains why our social constructs behave as they do (pure states project into mixed states, that's just basic quantum mechanics-- the same would be true for the social construct of "one electron" in what is actually a two-electron system, or writ large, in a white dwarf star). What decoherence does not explain is what the isolated system is doing-- why, when we observe an "alive cat" projection, is there nothing left of the "dead cat" projection, if in fact the entire system was a pure state to begin with? Decoherence has nothing at all to say about that, you still have to choose: either the state was initially pure and evolved into something whose projections became pure substates (Copenhagen), or it was initially pure and evolved into a bunch of entangled projections of which our perceptions are restricted to only one (many worlds), or it was never pure in the first place because wave functions for macro systems don't really exist, macro systems are always mixed states so are always only statistical amalgamations (the ensemble view).

One question that I haven't entirely satisfied in my own mind is why you can't consider the unknown phases as "hidden variables". The answer, to the degree that I understand it, is that the unknown phases in the decoherence model do not have the properties of a "hidden variables" defined in Bell's Theorem. When Bell proved that "hidden variables" do not explain quantum mechanics, he carefully defined "hidden variable" in a mathematically formal way. However, the phases of the waves in decoherence theory are "variables" and they are "hidden" in the broadest meaning of the words.
Yes, I think that's right-- it's like von Neumann's "no-go" theorem about hidden variables, he chose a restricted definition of how they have to behave. I believe that if one wishes to hold that macro systems evolve strictly deterministically, one has gone beyond the ensemble view (which is inherently statistical) and into the Bohmian view (which is deterministic, and involves the kind of generalized hidden variables that you are talking about).
1) Why can't the unknown phases in the environment of the probed system be considered "hidden variables"?
They can-- to a Bohmian. To someone using the ensemble interpretation, the unknown phases don't really solve the problem if you think the initial state is a pure state with unknown phases. Such a pure state must still evolve unitarily, even under decoherence, and there still is a dead cat in there just as much as an alive one. There is no way that the initial phases can all prefer an alive cat after one half-life of the apparatus, why would they turn out that way?
2) Why isn't "decoherence theory" ever called a "hidden variable" theory?
Because decoherence only explains the behavior of the projection, whereas hidden variable theory is about the whole isolated system.
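The "unknown phases" picture in the exchange above can be made concrete. A small NumPy sketch (my own toy, with an arbitrary sample size): averaging a qubit's density matrix over a uniformly random relative phase drives the off-diagonal (interference) terms toward zero while leaving the outcome probabilities on the diagonal untouched - which reproduces the mixed-state statistics but, as Ken G notes, says nothing about which outcome actually occurs.

```python
import numpy as np

rng = np.random.default_rng(1)
phases = rng.uniform(0, 2 * np.pi, size=50_000)  # unknown environmental phases

# Each realization is the pure state (|0> + e^{i phi}|1>)/sqrt(2);
# average the density matrices over the unknown phase.
rho_avg = np.zeros((2, 2), dtype=complex)
for phi in phases:
    psi = np.array([1.0, np.exp(1j * phi)]) / np.sqrt(2)
    rho_avg += np.outer(psi, psi.conj()) / len(phases)

# The diagonal stays at 1/2; the off-diagonal coherences average toward 0
# (the residue shrinks like 1/sqrt(N) with the number of phase samples).
print(np.round(rho_avg.real, 2))
```

Each individual realization is still a pure superposition; only the statistics over the unknown phases look mixed, which is exactly the gap decoherence leaves open.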
 
  • #117
bhobba said:
As I often say all current interpretations suck - you simply choose the one that sucks the least to you.
I completely agree with you that choosing an interpretation is very much making a "devil's bargain," and as such is quite subjective. But I would like to offer you an alternative to the thought that all interpretations suck, which is that what we regard as a "sucky" aspect of our devil's bargain might actually end up being a game-changing insight into how physics can move forward.

As an example, I give you the interpretation of classical mechanics that was normally adopted, which was often viewed as "sucky" in Newton's day: it said that what is going to happen is only determined by what has already happened, not by some "first cause" or what "should" happen. To many in Newton's day, this was a complete failure of the theory-- it completely sucked that you had to know what had already happened before you could know what was going to happen, that was like "passing the buck" as far as they were concerned. Some went as far as saying it didn't tell you anything at all, it was completely circular to have to know what had already happened to know what was going to happen! But no one thinks of that as a "sucky" element of classical mechanics now, instead we simply moved the goal posts of what a physical theory is supposed to do.

In other words, instead of making the interpretation fit our preconceptions about physics, we learned to modify our conceptions of physics to fit the workable interpretation of classical mechanics. I submit the only problem with quantum mechanics is that there are still too many allowable interpretations, so we cannot see what the "lesson" is that we should be using to change what we think physics is. The only thing that sucks about the interpretations is that they force us to look in different directions to see the future of physics, placing us in an uncomfortably uncertain place. That's why we still need to find the "best" interpretation, the one that guides future progress and teaches us what physics is supposed to be at this stage.
 
  • #118
Killtech said:
...
Thanks for your post. I have never thought about some of these things before, so my answer will necessarily be half-baked.

The first important question for me is how we can know how to measure a given observable, and which observable a given apparatus measures. The answer to this question is not entirely clear to me even in classical mechanics. Further input is appreciated.

You suggest that we have to construct the Hamiltonian of the apparatus and calculate explicitly that the pointer/needle/whatever points to a label which is the actual value of the observable we want to measure.

This raises a couple of issues for me. First of all, it explicitly assumes that the observable has a well-defined value at all times. This would require our QM theory to have value-definiteness (like dBB) which is a very strong assumption. Why should we assume this?

If we leave it out, decoherence brings us into close analogy with the classical case: we construct a Hamiltonian for the apparatus, the system and their interaction; we use unitary evolution; we trace over the environment; we get decoherence and the interference is gone. Most importantly, the basis we get decoherence in determines which observable is being measured.

The only thing that we don't get is a definite outcome. But once we use a collapse-free interpretation, we have a fully consistent theory.
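The recipe above can be run end to end in miniature. A hedged toy sketch (one environment qubit, with a CNOT gate standing in for the system-environment interaction; all choices are mine, not from the thread): unitary evolution entangles system and environment, and tracing out the environment leaves the system diagonal in the z basis - so z is the basis decoherence selects, i.e. the observable this particular coupling "measures".

```python
import numpy as np

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # system in |+>
env0 = np.array([1, 0], dtype=complex)                # environment in |0>
psi0 = np.kron(plus, env0)                            # initial product state

# CNOT (system = control) plays the role of the measurement interaction:
# it copies the system's z-basis information into the environment.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

psi = CNOT @ psi0                                     # unitary evolution
rho = np.outer(psi, psi.conj())
rho_sys = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace environment

# Before the coupling the system had full z-basis coherence (|+><+|);
# afterwards rho_sys is diag(0.5, 0.5): interference gone, z basis selected.
print(rho_sys.real)
```

No definite outcome appears anywhere in this calculation - the reduced state just becomes diagonal - which is precisely the remaining gap the post points at.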
 
  • #119
Ken G said:
Decoherence has nothing at all to say about that, you still have to choose: either the state was initially pure and evolved into something whose projections became pure substates (Copenhagen), or it was initially pure and evolved into a bunch of entangled projections of which our perceptions are restricted to only one (many worlds), or it was never pure in the first place because wave functions for macro systems don't really exist, macro systems are always mixed states so are always only statistical amalgamations (the ensemble view).
That's an interesting view, though I'm not sure whether it is correct. My current view is that the interpretational question really lies in the interpretation of the mixed state of the (sub)system, not in assumptions about the state of the whole, because decoherence can be derived from the unitary dynamics of the whole. Now in your view, we have already chosen an interpretation via the initial state we use. This seems uncommon, because in the theory of open systems decoherence is derived from the unitary dynamics of the whole without any interpretational questions being discussed. Also, I'm not sure whether we can always find an initial state of the whole such that the state of the subsystem is led from a pure superposition state to a pure eigenstate, as your Copenhagen version would imply.

Independent of this, I'd really like to hear your view on the measurement issues raised by Jazzdude, me and Killtech. ;-)
 
Last edited:
  • #120
kith said:
Thanks for your post. I have never thought about some of these things before, so my answer will necessarily be half-baked.

The first important question for me is: how can we know how to measure a given observable, and which observable does a given apparatus measure? The answer to this question is not entirely clear to me even in classical mechanics. Further input is appreciated.

You suggest that we have to construct the Hamiltonian of the apparatus and calculate explicitly that the pointer/needle/whatever points to a label which is the actual value of the observable we want to measure.

This raises a couple of issues for me. First of all, it explicitly assumes that the observable has a well-defined value at all times. This would require our QM theory to have value-definiteness (like dBB) which is a very strong assumption. Why should we assume this?

If we leave it out, decoherence brings us into close analogy with the classical case: we construct a Hamiltonian for the apparatus, the system and their interaction; we use unitary evolution; we trace over the environment; we get decoherence and the interference is gone. Most importantly, the basis we get decoherence in determines what observable is being measured.

The only thing that we don't get is a definite outcome. But once we use a collapse-free interpretation, we have a fully consistent theory.
As you correctly write, the basis that arises from an observable determines what is being measured. So you will have to find a relation between the Hamiltonian and your basis, and postulate that this relation is responsible for the decoherence (for all systems). Finding this relation would be a real addition to the theory and could complete it, in the sense that it provides a first vague (but somewhat defined) mechanism to determine when the collapse actually happens and what causes it.

For example, in the simple case of a hydrogen atom you could argue that only energy eigenstates have a time-independent charge density. So if you classically couple the EM field to the charge, you find that only those are no source of EM waves (although they have an angular momentum ;)). Superpositions of two energy eigenstates oscillate periodically with a frequency proportional to their energy difference, so all such superpositions lose energy this way and must be unstable. This distinguishes the energy eigenbasis from all other bases, and could be a hint of what the above-mentioned relation looks like.

The other option I see would be to try to construct a justification of measurement from the equations of motion (EOM). But one finds that these yield unphysical results in systems where a wave function interacts with multiple objects that are macroscopically far apart, because the EOM describe quantum objects as pure waves with no particle nature whatsoever. Thus they are wrong in general and rely on the measurement postulate as a supplement for macroscopic interactions. One possibility for generalizing them to yield better results at the macro level is to add non-linear terms that change the macro behavior. It makes sense to go for non-linear dynamics because they are known to produce results astonishingly similar to QM predictions: for example, they provide a source of randomness arising from chaos (sensitivity to initial conditions), collapse-like behavior, and possibly soliton solutions that behave like waves microscopically but as particles macroscopically. Finally, QED is a linearization (second quantization) of a naturally non-linear field theory (Dirac-Maxwell: the EM field classically coupled to the charge density of the wave function).
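The oscillation claim for a two-state combination is easy to check numerically. A small sketch with ħ = 1, illustrative energy values, and made-up (real) eigenfunction values at a single point x₀: the probability density of an equal superposition is periodic with period T = 2πħ/(E₂ − E₁).

```python
import numpy as np

hbar = 1.0
E1, E2 = 1.0, 3.5        # two energy eigenvalues (arbitrary units)
phi1, phi2 = 0.6, 0.8    # eigenfunction values at a point x0 (illustrative)

def density(t):
    # |c1 e^{-i E1 t/hbar} phi1 + c2 e^{-i E2 t/hbar} phi2|^2, c1 = c2 = 1/sqrt(2)
    amp = (np.exp(-1j * E1 * t / hbar) * phi1
           + np.exp(-1j * E2 * t / hbar) * phi2) / np.sqrt(2)
    return abs(amp) ** 2

# The cross term oscillates at angular frequency (E2 - E1)/hbar,
# i.e. with period T = 2*pi*hbar/(E2 - E1):
T = 2 * np.pi * hbar / (E2 - E1)
print(density(0.0), density(T))  # equal: the density is T-periodic
print(density(T / 2))            # half a period later: minimum of the beat
```

Only the energy *difference* enters the beat, which is why an energy eigenstate alone (one term) gives a strictly time-independent density.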

However, non-linear dynamics are far more complex and much harder to solve; in the case of Dirac-Maxwell, little is known about any solutions even in the simplest free case. On the other hand, a non-linear QM must provide a derivation of the measurement postulate, because the usual probability interpretation breaks down when the wave function is no longer normalized to 1. Due to the current lack of a mechanism deciding when the collapse happens, it is very difficult to guess the form of the non-linear interactions needed to reproduce it.

In any case, you need to extend QM by something to solve the measurement problem.
 
Last edited:
