The Time Symmetry Debate in Quantum Theory

  • Thread starter: TrickyDicky
  • Tags: Quantum
  • #101
TrickyDicky said:
It is in those correspondence rules that the problem arises, and depending on how one interprets the Born rule, for instance, you might have a smaller or bigger problem.

I can't follow you there. I know Ballentine pretty well, and what he shows is that the dynamics follows from the invariance of probabilities.

TrickyDicky said:
I would say the Born rule was devised having the particle picture of classical mechanics in mind. Don't you?

What do you mean by devised? Historically - probably - but so what? We now know it follows from much more general considerations having nothing to do with particles, e.g. Gleason's theorem.

Thanks
Bill
 
  • #102
TrickyDicky said:
Well, if you reduce the discussion to abstract observables without attributing them to any particular object, be it a particle, a field or whatever, you don't have a way to connect it with the physical/pragmatic side, so no measurement problem for you. But as Ballentine warned in the quote posted by DevilsAvocado, then you don't really have a physical theory, just a set of abstract axioms without connection with experiment.

I think that's what I intended to say: that the "measurement problem" is about the connection between the notion of "observable" that is a primitive in the quantum theory, and the "observable" that is something that requires a measurement apparatus and a measurement procedure. But I don't see how that supports the claim that the measurement problem has anything to do, intrinsically, with classical properties of particles.

TrickyDicky said:
I would say the Born rule was devised having the particle picture of classical mechanics in mind. Don't you?

No, I don't see much of a connection between the two. The Born rule about probabilities, or something like it, is forced on us by the assumption, or empirical fact, that an observation always produces an eigenvalue of the corresponding operator, and that operators don't commute (so it's not possible for all observables to have definite values simultaneously). I don't see that there is anything particularly particle-like about any of this.

TrickyDicky said:
When you were talking about observables, were you thinking about them in terms of properties of particles, fields...?

They are properties of a system, as a whole. The electric and magnetic field at a point in space is an observable. The mass, position, momentum, magnetic moment of a lump of iron are all observables. Yes, those observables are all macroscopic sums of observables associated with individual atoms of the iron, but the observables don't require particles to make sense of them. So I really don't understand the point you are making about the relationship between observables and particles.
 
  • #103
bhobba said:
No. Kolmogorov's axioms are clear on this point:
http://en.wikipedia.org/wiki/Probability_axioms
'This is the assumption of unit measure: that the probability that some elementary event in the entire sample space will occur is 1. More specifically, there are no elementary events outside the sample space.'

If something has probability 1 it must occur.

I think it will be easier to explain this by example. Consider a random process that produces a series of red points somewhere in a unit disk with uniform probability density. The probability that the next point will coincide with any given point A of the disk is equal to 0.

However, after the event occurs, some point of the disk will be red. At that instant, an event with probability 0 has happened.

Actually, all events that happen in such a random process are events that have probability 0.

So "event has probability 0" does not mean "impossible event".

Similarly, "probability 1" does not mean "certain event". Consider probability that the red point will land at point with both coordinates irrational. This can be shown to be equal to 1 in standard measure theory. However, there is still infinity of points that have rational coordinates, and these can happen - they are part of the disk.

In the language of abstract theory, all this is just a manifestation of the fact that equal measures do not imply that the sets are equal.
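In measure-theoretic terms, a minimal sketch (writing ##\lambda## for the Lebesgue measure on the unit disk ##D##, of area ##\pi##):

$$
P(\{A\}) = \frac{\lambda(\{A\})}{\lambda(D)} = \frac{0}{\pi} = 0,
\qquad
P\bigl((\mathbb{Q}\times\mathbb{Q}) \cap D\bigr) = \frac{\lambda\bigl((\mathbb{Q}\times\mathbb{Q}) \cap D\bigr)}{\lambda(D)} = \frac{0}{\pi} = 0,
$$

since a single point, and even the countable set of rational points, has measure zero. The complementary event "both coordinates irrational" therefore has probability 1, even though neither set is empty.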
 
  • #104
TrickyDicky said:
I would say the Born rule was devised having the particle picture of classical mechanics in mind. Don't you?

Very good point. As far as I know, there are actually two Born rules, although people tend to think they are the same.

The first rule, which works well in scattering and quantum chemistry, is the assumption that

$$
|\psi(\mathbf r)|^2 ~\Delta V,
$$

gives the probability that the particle is in the small volume element ##\Delta V## around ##\mathbf r##.

This really refers to particles and their configuration.

The second rule, I think proposed after the first one, is that

$$
p_k = |\langle \phi_k|\psi \rangle|^2
$$

gives the probability that the system in state ##\psi## will manifest energy ##E_k## (or get into state ##\phi_k##, in other versions) when a "measurement of energy" is performed (or even spontaneously, due to interaction with the environment, in other versions). This is more abstract and does not require particles.

We should really distinguish these two rules. The first one is easy and does not depend on the measurement problem, and is gauge-invariant.

The second is difficult to understand, because it is connected to measurements and is gauge-dependent - if we choose different gauge to calculate ##\psi##, we get different ##p_k##.
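To make the second rule concrete as a calculation, here is a minimal NumPy sketch (the state vector and basis below are made-up illustrative values, and gauge questions are set aside):

```python
import numpy as np

# Toy 3-level system: |psi> expressed in an assumed orthonormal
# basis {|phi_k>} of energy eigenstates (illustrative numbers only).
psi = np.array([0.6, 0.8j, 0.0], dtype=complex)
psi /= np.linalg.norm(psi)            # normalize the state

# |phi_k> taken as the standard basis vectors of C^3.
phi = np.eye(3, dtype=complex)

# Born rule (second form): p_k = |<phi_k|psi>|^2
p = np.abs(phi.conj() @ psi) ** 2

print(p)          # [0.36 0.64 0.  ]
print(p.sum())    # 1.0 -- the probabilities sum to one
```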
 
  • #105
bhobba said:
No. Kolmogorov's axioms are clear on this point:
http://en.wikipedia.org/wiki/Probability_axioms
'This is the assumption of unit measure: that the probability that some elementary event in the entire sample space will occur is 1. More specifically, there are no elementary events outside the sample space.'

If something has probability 1 it must occur.
It's true that if an event will definitely occur, then it must have probability 1. But it's not the case that if an event has probability 1, it will definitely occur. See this wikipedia page.
 
  • #106
Jano L. said:
I think it will be easier to explain this by example. Consider a random process that produces a series of red points somewhere in a unit disk with uniform probability density. The probability that the next point will coincide with any given point A of the disk is equal to 0.

However, after the event occurs, some point of the disk will be red. At that instant, an event with probability 0 has happened.

Actually, all events that happen in such a random process are events that have probability 0.

So "event has probability 0" does not mean "impossible event".

Similarly, "probability 1" does not mean "certain event". Consider probability that the red point will land at point with both coordinates irrational. This can be shown to be equal to 1 in standard measure theory. However, there is still infinity of points that have rational coordinates, and these can happen - they are part of the disk.

In the language of abstract theory, all this is just a manifestation of the fact that equal measures do not imply that the sets are equal.

lugita15 said:
It's true that if an event will definitely occur, then it must have probability 1. But it's not the case that if an event has probability 1, it will definitely occur. See this wikipedia page.

Good points that go to support Jano L.'s posts #71, #74, #78... IMO they show that Bill's reliance on Gleason's theorem cannot be used in the general case for what he thinks it can, but only for discretized, lattice models of physical systems - a very strong assumption in light of what we know. Or at least, I think most physicists still favor a continuous picture of nature, as exemplified by successful theories like GR.
bhobba said:
What do you mean by devised? Historically - probably - but so?
Not probably - certainly; you just have to read Born's original 1926 paper.


bhobba said:
We now know it follows from much more general considerations having nothing to do with particles eg Gleason's theorem.
I wouldn't be so sure we know that. See above.
 
  • #107
Jano L. said:
I think it will be easier to explain this by example. Consider a random process that produces a series of red points somewhere in a unit disk with uniform probability density. The probability that the next point will coincide with any given point A of the disk is equal to 0.

However, after the event occurs, some point of the disk will be red. At that instant, an event with probability 0 has happened.

Actually, all events that happen in such a random process are events that have probability 0.

So "event has probability 0" does not mean "impossible event".

That's certainly true, mathematically. On the other hand, in the real world, we never measure real-valued observables to infinite precision. We never really observe "The particle's momentum is ##P##"; we observe something like "The particle's momentum is somewhere in the range ##P \pm \Delta P##". For this reason, if we have two states ##|\psi\rangle## and ##|\phi\rangle## such that ##\langle \psi | \phi \rangle = 1##, they are considered the same state, as far as quantum mechanics is concerned. Adding or subtracting a set of measure zero does nothing.
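For what it's worth, Cauchy-Schwarz makes the last point precise for normalized states (a one-line sketch):

$$
|\langle \psi | \phi \rangle| \le \lVert\psi\rVert \, \lVert\phi\rVert = 1,
$$

with equality only when ##|\phi\rangle = e^{i\theta}|\psi\rangle##; the stronger condition ##\langle\psi|\phi\rangle = 1## then forces ##\theta = 0##, so the two vectors are literally the same.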
 
  • #109
stevendaryl said:
That's certainly true, mathematically. On the other hand, in the real world, we never measure real-valued observables to infinite precision. We never really observe "The particle's momentum is ##P##"; we observe something like "The particle's momentum is somewhere in the range ##P \pm \Delta P##".
Yes, but we were really discussing the theoretical difference between probabilistic and deterministic descriptions. I think the limitations of observations have no bearing on that argument.

stevendaryl said:
...if we have two states ##|\psi\rangle## and ##|\phi\rangle## such that ##\langle \psi | \phi \rangle = 1##, they are considered the same state, as far as quantum mechanics is concerned. Adding or subtracting a set of measure zero does nothing.

It does nothing to the probability. That was my point - in a probabilistic theory, the probability is 1 both for a certain and an almost certain event. We cannot adequately describe the difference between the two in such a theory. Ergo, a deterministic theory is not just a special case of a probabilistic theory. They are different kinds of theories, constructed for different purposes.
 
  • #110
T0mr said:
The electron in orbit was given as an example of a possible path. I have read about this argument before: that a classically orbiting electron should emit radiation, presumably because it is a charged object and an accelerated charged object (changing direction) will emit electromagnetic radiation. Yet if you were to put opposite charges on two spheres, one light and one heavy, and then set the lighter in orbit (in space) around the heavier, would the two spheres not act just as the two-body problem for the gravitational force?

... but electromagnetism is ##10^{39}## times stronger than gravitation ...

I think the Bohr model is pretty dead: there are incompatibilities with empirical spectral lines, and it also violates the uncertainty principle. And even if you could magically fix all that - where is your single localized particle in the double-slit experiment?

It doesn’t work...
 
  • #111
bhobba said:
This is in fact the defining property of an inertial frame - the Earth isn't exactly inertial but for many practical purposes such as this experiment it is.

I’m sorry bhobba, I’m completely lost... are you saying that the inertial frame of Earth has anything to do with the double-slit experiment??
 
  • #112
bhobba said:
QM does not insist on analogies with a classical particle model. All it assumes is position is an observable - which is a fact.

bhobba said:
In QM the symmetries are in the quantum state and observables - in classical mechanics it's in the Lagrangian.

stevendaryl said:
I don't see that, at all. To me, the "measurement problem" is the conceptual difficulty that on the one hand, a measurement has an abstract role in the axioms of quantum mechanics, as obtaining an eigenvalue of a self-adjoint linear operator, and it has a physical/pragmatic/empirical role in actual experiments as a procedure performed using equipment. What is the relationship between these two notions of measurement? The axioms of quantum mechanics don't make it clear.

I don't see that it has anything particularly to do with particles.

TrickyDicky said:
Well, if you reduce the discussion to abstract observables without attributing them to any particular object, be it a particle, a field or whatever, you don't have a way to connect it with the physical/pragmatic side, so no measurement problem for you. But as Ballentine warned in the quote posted by DevilsAvocado, then you don't really have a physical theory, just a set of abstract axioms without connection with experiment.

stevendaryl said:
I think that's what I intended to say: that the "measurement problem" is about the connection between the notion of "observable" that is a primitive in the quantum theory, and the "observable" that is something that requires a measurement apparatus and a measurement procedure. But I don't see how that supports the claim that the measurement problem has anything to do, intrinsically, with classical properties of particles.

Jano L. said:
We should really distinguish these two rules. The first one is easy and does not depend on the measurement problem, and is gauge-invariant.

The second is difficult to understand, because it is connected to measurements and is gauge-dependent - if we choose different gauge to calculate ##\psi##, we get different ##p_k##.


Guys, it’s very interesting to read this discussion, and this stuff is always hard to talk about. Still, let me give you something to chew on while the ‘battle’ continues. :wink:

http://arxiv.org/abs/0707.0401
J.S. Bell's Concept of Local Causality said:
“The beables of the theory are those elements which might correspond to elements of reality, to things which exist. Their existence does not depend on ‘observation’. Indeed observation and observers must be made out of beables.”

Or as he explains elsewhere,

“The concept of ‘observable’ ... is a rather woolly concept. It is not easy to identify precisely which physical processes are to be given the status of ‘observations’ and which are to be relegated to the limbo between one observation and another. So it could be hoped that some increase in precision might be possible by concentration on the beables ... because they are there.”

Bell’s reservations here (about the concept “observable” appearing in the fundamental formulation of allegedly fundamental theories) are closely related to the so-called “measurement problem” of orthodox quantum mechanics, which Bell encapsulated by remarking that the orthodox theory is “unprofessionally vague and ambiguous” in so far as its fundamental dynamics is expressed in terms of “words which, however legitimate and necessary in application, have no place in a formulation with any pretension to physical precision” – such words as “system, apparatus, environment, microscopic, macroscopic, reversible, irreversible, observable, information, measurement.” As Bell elaborates,

“The concepts ‘system’, ‘apparatus’, ‘environment’, immediately imply an artificial division of the world, and an intention to neglect, or take only schematic account of, the interaction across the split. The notions of ‘microscopic’ and ‘macroscopic’ defy precise definition. So also do the notions of ‘reversible’ and ‘irreversible’. Einstein said that it is theory which decides what is ‘observable’. I think he was right – ‘observable’ is a complicated and theory-laden business. Then the notion should not appear in the formulation of fundamental theory.”

As Bell points out, even Bohr (a convenient personification of skepticism regarding the physical reality of unobservable microscopic phenomena) recognizes certain things (for example, the directly perceivable states of a classical measuring apparatus) as unambiguously real, i.e., as beables.

[...]

The unprofessional vagueness and ambiguity of orthodox quantum theory, then, is related to the fact that its formulation presupposes these (classical, macroscopic) beables, but fails to provide clear mathematical laws to describe them. As Bell explains,

“The kinematics of the world, in [the] orthodox picture, is given by a wavefunction ... for the quantum part, and classical variables – variables which have values – for the classical part... [with the classical variables being] somehow macroscopic. This is not spelled out very explicitly. The dynamics is not very precisely formulated either. It includes a Schrödinger equation for the quantum part, and some sort of classical mechanics for the classical part, and ‘collapse’ recipes for their interaction.”

There are thus two related problems. First, the posited ontology is rather different on the two sides of (what Bell calls) “the shifty split” – that is, the division between “the quantum part” and “the classical part.” But then, as a whole, the posited ontology remains unavoidably vague so long as the split remains shifty – i.e., so long as the dividing line between the macroscopic and microscopic remains undefined. And second, the interaction across the split is problematic. Not only is the account of this dynamics (the “collapse” process) inherently bound up in concepts from Bell’s list of dubious terms, but the very existence of a special dynamics for the interaction seems to imply inconsistencies with the dynamics already posited for the two realms separately. As Bell summarizes,

“I think there are professional problems [with quantum mechanics]. That is to say, I’m a professional theoretical physicist and I would like to make a clean theory. And when I look at quantum mechanics I see that it’s a dirty theory. The formulations of quantum mechanics that you find in the books involve dividing the world into an observer and an observed, and you are not told where that division comes... So you have a theory which is fundamentally ambiguous...”

The point of all this is to clarify the sort of theory Bell had in mind as satisfying the relevant standards of professionalism in physics.

Don’t know why I love this paper, but I do – it’s ‘crisp & clear’...
 
  • #113
DevilsAvocado said:
I’m sorry bhobba, I’m completely lost... are you saying that the inertial frame of Earth has anything to do with the double-slit experiment??

It has nothing to do with it per se.

My comment was in relation to the claim that the measurement problem has something to do with QM holding the particle picture as fundamental. QM doesn't do that - the dynamics are, just as in Classical Mechanics, determined by symmetry arguments. There is no particle assumption other than that position is an observable, which is an experimentally verified fact.

For many practical purposes the Earth can be considered to have these symmetry properties - that was my point.

Thanks
Bill
 
  • #114
Jano L. said:
Consider probability that the red point will land at point with both coordinates irrational.

Well, the rationals have Lebesgue measure zero, and there is no way to observationally tell the difference between a rational and an irrational point, since that would require infinite measurement precision - so it's not a well-defined problem physically.

If you seriously doubt that a probability of 1 means a dead cert, then I think this thread is not the appropriate place to discuss it. I think the Set Theory, Logic, Probability and Statistics subforum is more appropriate, so I will be doing a post there.

Thanks
Bill
 
  • #115
TrickyDicky said:
Gleason's theorem can not be used in the general case for what he thinks it can, but only for discretized, lattice models of physical systems, a very strong assumption in the light of what we know, or at least I think most physicists still favor a continuous picture of nature as exemplified by successful theories like GR.

Gleason's theorem holds for infinite dimensional Hilbert spaces:
http://kof.physto.se/theses/helena-master.pdf

I have zero idea why you would think otherwise.

It even holds for non-separable spaces - not that that is of any value to QM.

The issue with Gleason's theorem is that its physical basis is a bit unclear. Mathematically, what's going on is well understood: the theorem depends on non-contextuality, and, again mathematically, contextuality is a bit of an ugly kludge. But exactly why, from a physical point of view, you require non-contextuality is open to debate. This is the exact out Bohmian Mechanics uses, and it's a valid theory. But the Hilbert space formalism is ugly if you don't assume non-contextuality - you can't define a unique probability measure, so the question becomes: what use is using a Hilbert space to begin with? And indeed, for BM the usual formulation is secondary in that interpretation.

My point is that Born's rule is not dependent on a particle model - its basis is non-contextuality in the usual formulation, or specific assumptions in other formulations like BM.

Thanks
Bill
 
  • #116
stevendaryl said:
Adding or subtracting a set of measure zero does nothing.

Exactly. This is bog-standard stuff from more advanced probability texts that take a rigorous approach. Finding probabilities associated with determining rational or irrational numbers is not a well-defined problem, since the rationals have Lebesgue measure zero.

I think a discussion on exactly what probability 0 and 1 mean is best done on the probability subforum, and I will do a post there.

Thanks
Bill
 
  • #117
DevilsAvocado said:
Guys, it’s very interesting to read this discussion, and this stuff is always hard to talk about. Still, let me give you something to chew on while the ‘battle’ continues. :wink:

http://arxiv.org/abs/0707.0401


Don’t know why I love this paper, but I do – it’s ‘crisp & clear’...

Because it brings us Bell in his own deep and intelligent words, contrary to the tradition of misinterpreting him that abounds in the QM literature :devil:.
:smile:
 
  • #118
bhobba said:
The issue with Gleason's theorem is its physical basis is a bit unclear - mathematically what's going on is well understood, it depends on non contextuality, and, again mathematically, contextuality is a bit of an ugly kludge, but exactly, from a physical point of view why you require it is open to debate. This is the exact out Bohmian Mechanics uses and its a valid theory. But the Hilbert space formalism is ugly if you don't assume it - you can't define a unique probability measure so the question is - what use is using a Hilbert space to begin with - and indeed for BM the usual formulation is secondary in that interpretation.

My point is Born's rule is not dependent on a particle model - its basis is non-contextually in the usual formulation, or specific assumptions in other formulations like BM.

Thanks
Bill
Bill, I agree with the quoted part.
Non-contextuality is a strong assumption IMO. But yes, it makes the Hilbert formalism "ugly" not to adopt it. But Gleason's theorem assumes non-contextuality, and that was the sense of my comment about the lack of generality of the theorem, as there are QM interpretations that don't assume non-contextuality (you mentioned BM, but there are also the modal interpretations and others).
I have a doubt about this, because I've seen quantum non-contextuality defined in two ways that I guess are equivalent; maybe you can help me connect them: as independence from the measurement arrangement, and as basis-independence of the probability assigned to a vector.
 
  • #119
bhobba said:
I think a discussion on exactly what probability 0 and 1 mean is best done on the probability subforum, and I will do a post there.

I am looking forward to it. However, the argument was about something different: whether a deterministic theory is a special kind of probabilistic theory. I am quite interested in what others think about this.
 
  • #120
TrickyDicky said:
Bill, I agree with the quoted part.
Non-contextuality is a strong assumption IMO. But yes, it makes the Hilbert formalism "ugly" not to adopt it. But Gleason's theorem assumes non-contextuality, and that was the sense of my comment about the lack of generality of the theorem, as there are QM interpretations that don't assume non-contextuality (you mentioned BM, but there are also the modal interpretations and others).
I have a doubt about this, because I've seen quantum non-contextuality defined in two ways that I guess are equivalent; maybe you can help me connect them: as independence from the measurement arrangement, and as basis-independence of the probability assigned to a vector.


http://arxiv.org/pdf/1207.1952v1.pdf

..."The concept of contextuality states that the outcomes of measurement may depend on what measurements are performed alongside"...
 
  • #121
TrickyDicky said:
Bill, I agree with the quoted part.
Non-contextuality is a strong assumption IMO. But yes, it makes the Hilbert formalism "ugly" not to adopt it. But Gleason's theorem assumes non-contextuality, and that was the sense of my comment about the lack of generality of the theorem, as there are QM interpretations that don't assume non-contextuality (you mentioned BM, but there are also the modal interpretations and others).

I did a Google search on the phrase "non-contextuality" and although it gets many hits, I still don't really understand what it means. Can someone give a real definition, and briefly explain why it's relevant in interpretations of quantum mechanics?
 
  • #122
stevendaryl said:
I did a Google search on the phrase "non-contextuality" and although it gets many hits, I still don't really understand what it means. Can someone give a real definition, and briefly explain why it's relevant in interpretations of quantum mechanics?

Conjugate values that do not depend on the context, that is, independent of their antecedents.

http://plato.stanford.edu/entries/kochen-specker/
https://www.physicsforums.com/showpost.php?p=4401195&postcount=12

audioloop said:
Really, you have to define locality/non-locality as subsets of contextuality/non-contextuality; contextuality is broader and subsumes locality/non-locality. Every state that is contextual with respect to the defined test of contextuality is nonlocal as per the CHSH test (Clauser, Horne, Shimony, Holt), but the converse is not true. Or, as I like to ask:

Every state that is contextual is nonlocal.
...and the inverse: is every state that is nonlocal contextual?

Measured values (attributes, characteristics, properties) in context relate just to that, the "context". Being real goes beyond properties: the possibility of values requires pre-existent objects or processes; without objects, there is no possibility of properties (values).
 
  • #123
audioloop said:
Conjugate values that do not depend on the context, that is, independent of their antecedents.

http://plato.stanford.edu/entries/kochen-specker/


https://www.physicsforums.com/showpost.php?p=4401195&postcount=12

That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.
 
  • #124
stevendaryl said:
That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.

An example of a contextual model can be seen if a context is considered to reside in the future. If Alice and Bob can signal from the future to the past as to what they are planning to measure, then entangled state correlations are easier to explain. Nothing needs to propagate faster than c for such a mechanism to operate (and to properly describe any existing experiments).

So here we have locality respected while non-contextuality is not, which is essentially what you are looking for. Most contextual models seem "strange" as in counter-intuitive. I personally don't see them as any stranger than non-local ones.
 
  • #125
stevendaryl said:
That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

Spekkens' toy model takes the epistemic view and has local and non-contextual variables (= fails to reproduce violations of Bell inequalities); here's a short introduction and here's the arXiv paper.

Jan-Åke Larsson has made a contextual extension of this toy model:

http://arxiv.org/abs/1111.3561
A contextual extension of Spekkens' toy model said:
Quantum systems show contextuality. More precisely, it is impossible to reproduce the quantum-mechanical predictions using a non-contextual realist model, i.e., a model where the outcome of one measurement is independent of the choice of compatible measurements performed in the measurement context. There have been several attempts to quantify the amount of contextuality for specific quantum systems, for example, in the number of rays needed in a KS proof, or the number of terms in certain inequalities, or in the violation, noise sensitivity, and other measures. This paper is about another approach: to use a simple contextual model that reproduces the quantum-mechanical contextual behaviour, but not necessarily all quantum predictions. The amount of contextuality can then be quantified in terms of additional resources needed as compared with a similar model without contextuality. In this case the contextual model needs to keep track of the context used, so the appropriate measure would be memory. Another way to view this is as a memory requirement to be able to reproduce quantum contextuality in a realist model. The model we will use can be viewed as an extension of Spekkens' toy model [Phys. Rev. A 75, 032110 (2007)], and the relation is studied in some detail. To reproduce the quantum predictions for the Peres-Mermin square, the memory requirement is more than one bit in addition to the memory used for the individual outcomes in the corresponding noncontextual model.

stevendaryl said:
What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.

I agree, I always thought that contextuality means that the entire measurement setup has to be taken into consideration as the context of the outcome, i.e. if Alice puts her polarizer orthogonal to Bob's, this will have an effect on the outcome of photon B = non-locality...
 
  • #126
DrChinese said:
An example of a contextual model can be seen if a context is considered to reside in the future. If Alice and Bob can signal from the future to the past as to what they are planning to measure, then entangled state correlations are easier to explain. Nothing needs to propagate faster than c for such mechanism to operate (and to properly describe any existing experiments).

I’m lost DrC... how could “Alice and Bob signal from the future” without an FTL-mechanism that is forbidden by both QM and SR?
 
  • #127
Jano L. said:
I am looking forward to it. However, the argument was about something different: that deterministic theory is a special kind of probabilistic theory. I am quite interested what others think about this.

Yea - but in discussing that you raised the issue. I still believe it is. Indeed, if it isn't, then Kochen-Specker is in deep trouble, because that's what it assumes - namely, for QM to be deterministic you need to be able to define a measure taking only the values 0 and 1.

I have started a thread over in that sub-forum about it, and already there are some interesting replies.

Thanks
Bill
 
  • #128
Here I go again...
From The Age of Entanglement:

"...But what if we let relativity enter the game even deeper? What if the detectors are in relative motion such that each detector in its own reference frame analyzes its photon before the other?...

"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"...if both measurements happen before the other, then the quantum correlation should disappear, however large the speed of the spooky action!

"Once the engineering was made feasible, "this experiment was also performed in Geneva in the spring of 1999", reported Gisin. "The two-photon interferences were still visible, independently of the relative velocity between Alice and Bob's reference frames." Alice, in her reference frame, measures her photon first; from Bob's point of view, he has measured his photon first; yet, the correlation is still present..."
 
  • #129
Charles Wilson said:
From Age of Entanglement:

Ah! :!) Thank you for reminding me. I must get that book, NOW!
 
  • #130
stevendaryl said:
I did a Google search on the phrase "non-contextuality" and although it gets many hits, I still don't really understand what it means. Can someone give a real definition, and briefly explain why it's relevant in interpretations of quantum mechanics?

Conceptually it's very simple. Suppose you have some observable ##A = y_1|b_1\rangle\langle b_1| + y_2|b_2\rangle\langle b_2| + y_3|b_3\rangle\langle b_3|##, where ##|b_3\rangle## means outcome ##|b_1\rangle## or ##|b_2\rangle## did not occur. Outcome ##|b_1\rangle## occurs with probability ##|\langle u|b_1\rangle|^2## from the Born rule. Now consider the observable ##C = y_1|b_1\rangle\langle b_1| + c_2|c_2\rangle\langle c_2| + c_3|c_3\rangle\langle c_3|##. From the Born rule, outcome ##|b_1\rangle## will occur with exactly the same probability even though the other possible outcomes are different. This is known as non-contextuality, because a property does not depend on what else you happen to be measuring with it. It allows a probability measure to be uniquely defined regardless of which basis it is part of, i.e. the other possible outcomes of an observation. Now it turns out, due to Gleason's theorem, that the assumption of non-contextuality all by itself is enough to prove Born's rule. In fact it would be a pretty silly choice of Hilbert space as the formalism for the states of QM if it wasn't true. This is what's meant by non-contextuality being unnatural and counter-intuitive.

But now look at it physically, forgetting the Hilbert space formalism. We have zero reason to believe that changing what else you measure will not affect the other things you measure at the same time - after all, you have a different apparatus. This is what's meant by saying its physical basis is unclear. And indeed, interpretations of QM such as Bohmian Mechanics exist that are contextual.
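A quick numerical sketch of the statement above (NumPy; the vectors, state, and rotation angle are made-up illustrative values): the Born probability of outcome ##|b_1\rangle## comes out the same whichever way the rest of the basis is chosen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed outcome vector |b1> in C^3 and a random normalized state |u>.
b1 = np.array([1, 0, 0], dtype=complex)
u = rng.normal(size=3) + 1j * rng.normal(size=3)
u /= np.linalg.norm(u)

def born_prob(outcome, state):
    # Born rule: p = |<outcome|state>|^2
    return np.abs(np.vdot(outcome, state)) ** 2

# Two different completions of {|b1>} to an orthonormal basis:
# context A uses |b2>, |b3>; context C uses a rotated pair |c2>, |c3>.
b2 = np.array([0, 1, 0], dtype=complex)
b3 = np.array([0, 0, 1], dtype=complex)
theta = 0.7  # arbitrary rotation in the plane orthogonal to |b1>
c2 = np.cos(theta) * b2 + np.sin(theta) * b3
c3 = -np.sin(theta) * b2 + np.cos(theta) * b3

pA = [born_prob(v, u) for v in (b1, b2, b3)]
pC = [born_prob(v, u) for v in (b1, c2, c3)]

# p(|b1>) is identical in both contexts (non-contextuality);
# the probabilities of the companion outcomes differ.
print(pA[0], pC[0])        # equal
print(pA[1:], pC[1:])      # generally different
```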

Thanks
Bill
 
  • #131
DevilsAvocado said:
I agree, I always thought that contextuality means that the entire measurement setup has to be taken into consideration as the context of the outcome, i.e. if Alice puts her polarizer orthogonal to Bob's, this will have an effect on the outcome of photon B = non-locality...

It means measurements are not affected by what else you happen to be measuring at the same time. If you have an observable ##A = \sum_i a_i |b_i\rangle\langle b_i|##, the probability of outcome ##|b_1\rangle## does not depend on the other ##|b_i\rangle##. What you mention is just one example.

Thanks
Bill
 
  • #132
Charles Wilson said:
"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"...if both measurements happen before the other, then the quantum correlation should disappear, however large the speed of the spooky action!

But that is precisely what many interpretations, such as the ensemble interpretation, specifically deny - i.e. that it's a real phenomenon. In that interpretation the state is simply something that tells us about the probabilistic behavior of a conceptual ensemble of systems. Collapse is simply selecting an element of the ensemble - nothing in a real sense occurred. Since such is possible, I am at a loss to understand stuff like the above - all you are arguing for is interpretations where it's not real, not that QM has any issues.

Thanks
Bill
 
  • #133
Yeesh!
I'll admit the pull quote is not rigorous. I hate using the word "collapse" because I think it's loaded from Bohr's Metaphysics. "But everybody else uses it..."
"So if everybody sez they're going to jump off a cliff, are you going to follow 'em?"

Well...ummm...no. I thought it was valuable to add into the discussion, however, since it pushed the tension between SR and QM. I thought that it was interesting that the Naive Realist position (Not necessarily G E Moore's NR) would state that the interference patterns should disappear and they do not. QM wins again!

But SR still believes, in their reference frame, that they won.

That's why I posted.

CW
 
  • #134
DevilsAvocado said:
... but electromagnetism is ##10^{39}## times stronger than gravitation ...

I think the Bohr model is pretty dead: there are incompatibilities with empirical spectral lines, and it also violates the uncertainty principle. And even if you could magically fix all that - where is your single localized particle in the double-slit experiment?

It doesn’t work...

How does the Bohr model violate the uncertainty principle? The Bohr model does correctly predict spectral lines for atomic hydrogen. That is something at least. I get the impression that we feel very confident we have exhausted all mechanical analogies for things like the double-slit experiment, when we cannot possibly have. There is an infinitude of possible mechanical analogies we could use to model a particle like an electron (and all but one will be wrong). I don't know if anyone has seen the experiments with silicone oil droplets, but I saw these a while back and found them very interesting:

http://www.youtube.com/watch?v=nmC0ygr08tE

Also Morgan Freeman narrates:
http://www.youtube.com/watch?v=sGCtMKthRh4
 
  • #135
Charles Wilson said:
*"...But what if we let relativity enter the game even deeper? What if the detectors are in relative motion such that each detector in its own reference frame analyzes its photon before the other?...

*"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"

*it will be done

Fundamental quantum optics experiments conceivable with satellites -- reaching relativistic distances and velocities
http://arxiv.org/abs/1206.4949
the line up:
David Rideout, Thomas Jennewein, Giovanni Amelino-Camelia, Tommaso F. Demarie, Brendon L. Higgins, Achim Kempf, Adrian Kent, Raymond Laflamme, Xian Ma, Robert B. Mann, Eduardo Martin-Martinez, Nicolas C. Menicucci, John Moffat, Christoph Simon, Rafael Sorkin, Lee Smolin, Daniel R. Terno.

Super Physics Smackdown: Relativity vs Quantum Mechanics...In Space

Read more: http://www.technologyreview.com/view/428328/super-physics-smackdown-relativity-v-quantum-mechanicsin-space/#ixzz2UyZfdG1L
From MIT Technology Review

*it will be done
(objective reduction models)

Observation of a kilogram-scale oscillator near its quantum ground state
http://iopscience.iop.org/1367-2630/11/7/073032/pdf/1367-2630_11_7_073032.pdf

http://eprints.gla.ac.uk/32707/1/ID32707.pdf

the line up:
B Abbott, R Abbott, R Adhikari, P Ajith, B Allen, G Allen, R Amin, S B Anderson, W G Anderson, M A Arain, M Araya, H Armandula, P Armor, Y Aso, S Aston, P Aufmuth...
 
  • #136
stevendaryl said:
That leaves me a little puzzled, still. Maybe somebody can give examples of a toy model that is contextual?

What I do understand is this: the argument that it is impossible for a hidden-variables theory to reproduce the predictions of quantum mechanics in an EPR-type twin-particle experiment assumes that the hidden variable is unaffected by choices made at the measuring devices. But such an effect is ruled out by locality, so I don't see why "contextuality" matters in that case.

http://digital.library.pitt.edu/u/ulsmanuscripts/pdf/31735033440391.pdf
 
  • #137
DevilsAvocado said:
I’m lost DrC... how could “Alice and Bob signal from the future” without an FTL-mechanism that is forbidden by both QM and SR?

I thought the same; I still don't know what to make of that answer.
 
  • #138
DevilsAvocado said:
I’m lost DrC... how could “Alice and Bob signal from the future” without an FTL-mechanism that is forbidden by both QM and SR?

I'm not sure if this is what he meant, but there are "time-symmetric" formulations of wave propagation in which the future affects the present in the same way the past does. It's not FTL, in the sense that the propagation speed is always <= c, although the propagation can be into the past as well as into the future. This is consistent with SR in the weak sense that there is no violation of Lorentz invariance.
 
  • #139
bhobba said:
Conceptually it's very simple. Suppose you have some observable ##A = y_1|b_1\rangle\langle b_1| + y_2|b_2\rangle\langle b_2| + y_3|b_3\rangle\langle b_3|##, where ##|b_3\rangle## means outcome ##|b_1\rangle## or ##|b_2\rangle## did not occur. Outcome ##|b_1\rangle## occurs with probability ##|\langle u|b_1\rangle|^2## from the Born rule. Now consider the observable ##C = y_1|b_1\rangle\langle b_1| + c_2|c_2\rangle\langle c_2| + c_3|c_3\rangle\langle c_3|##. From the Born rule, outcome ##|b_1\rangle## will occur with exactly the same probability even though the other possible outcomes are different. This is known as non-contextuality, because a property does not depend on what else you happen to be measuring with it. It allows a probability measure to be uniquely defined regardless of which basis it is part of, i.e. the other possible outcomes of an observation. Now it turns out, due to Gleason's theorem, that the assumption of non-contextuality all by itself is enough to prove Born's rule. In fact it would be a pretty silly choice of Hilbert space as the formalism for the states of QM if it wasn't true. This is what's meant by non-contextuality being unnatural and counter-intuitive.

But now look at it physically, forgetting the Hilbert space formalism. We have zero reason to believe that changing what else you measure will not affect the other things you measure at the same time - after all, you have a different apparatus. This is what's meant by saying its physical basis is unclear. And indeed, interpretations of QM such as Bohmian Mechanics exist that are contextual.

Thanks
Bill
I guess you meant "This is what's meant by contextuality being unnatural and counter-intuitive."

Bill, I find this answer quite reasonable.
Maybe you can help me understand this better. Especially the part where the physical basis is unclear which connects with the QM completeness/incompleteness issue.
When you say that the Hilbert formalism is silly if we don't assume non-contextuality, I interpret you to mean that non-contextuality brings an independent probabilistic picture, and that independence fits well with the superposition principle and therefore with vector spaces.
Going back to Gleason's theorem, and why I was associating a point-particle model with the Born rule: the theorem proves that the Born rule, for the probability of obtaining specific results in a given measurement, follows naturally from the structure formed by the lattice of events in a real or complex Hilbert space. Now, lattices are discrete subgroups and are formed by points (zero-dimensional discrete topology) that can be physically interpreted as point particles. But the key starting point is the Hilbert space; its linearity allows the Born rule to be interpreted as following naturally from the point-events lattice.

As you rightly say, forgetting for a moment the Hilbert formalism, we don't have compelling reasons to rule out contextuality; but the only popular contextual interpretation seems to be BM, probably because the Hilbert formalism tightly limits what one can do with a contextual interpretation.
 
  • #140
stevendaryl said:
I'm not sure if this is what he meant, but there are "time-symmetric" formulations of wave propagation in which the future affects the present in the same way the past does. It's not FTL, in the sense that propagation speed is always <= c, although the propagation can be into the past as well as into the future. This is consistent with SR in the weak sense that there is no violation of lorentz invariance.

Something like Feynman-Wheeler absorber theory? But that was refuted many years ago; among other things, it assumed no self-interaction of particles.
 
  • #141
TrickyDicky said:
Something like Feynman-Wheeler absorber theory? but that was refuted many years ago, among other things assumed no self-interaction of particles.

There are others:

Relational Blockworld:
http://arxiv.org/abs/0908.4348

Yakir Aharonov and Jeff Tollaksen's take on Time Symmetry:
http://arxiv.org/abs/0706.1232
 
  • #142
TrickyDicky said:
But that was refuted many years ago; among other things, it assumed no self-interaction of particles.

The F-W theory consists of the basic equations (action principle) and a peculiar boundary condition, which is not the only one possible.

Which part do you think was refuted?
 
  • #143
stevendaryl said:
I'm not sure if this is what he meant, but there are "time-symmetric" formulations of wave propagation in which the future affects the present in the same way the past does. It's not FTL, in the sense that propagation speed is always <= c, although the propagation can be into the past as well as into the future. This is consistent with SR in the weak sense that there is no violation of lorentz invariance.

http://prl.aps.org/abstract/PRL/v110/i21/e210403
"The role of the timing and order of quantum measurements is not just a fundamental question of quantum mechanics"
"we entangle one photon from the first pair with another photon from the second pair. The first photon was detected even before the other was created. The observed two-photon state demonstrates that entanglement can be shared between timelike separated quantum systems"the two vector formalism fit neatly with that process.

Can a Future Choice Affect a Past Measurement's Outcome?
http://arxiv.org/ftp/arxiv/papers/1206/1206.6224.pdf

https://www.physicsforums.com/showpost.php?p=4053068&postcount=31
https://www.physicsforums.com/showpost.php?p=4053118&postcount=32
https://www.physicsforums.com/showpost.php?p=4056855&postcount=36

But there is the possibility that the photons interchange information at the time of the monogamy creation (or, as I prefer to say, heterogamy: one up, one down), or that it is an inherently symmetrical process; we need more experimental testing to know.

https://www.physicsforums.com/showpost.php?p=4402245&postcount=135
http://arxiv.org/pdf/1206.4949v2.pdf
"Physical theories are developed to describe phenomena in particular regimes, and generally are valid only within a limited range of scales. For example, general relativity provides an eective description of the Universe at large length scales, and has been tested from the cosmic scale down to distances as small as 10 meters. In contrast, quantum theory provides an eective description of physics at small length scales. Direct tests of quantum theory have been performed at the smallest probeable scales at the Large Hadron Collider, 10-20 meters, up to that of hundreds of kilometers. Yet, such tests fall short of the scales required to investigate potentially signicant physics that arises at the intersection of quantum and relativistic regimes. We propose to push direct tests of quantum theory to larger and larger length scales, approaching that of the radius of curvature of spacetime, where we begin to probe the interaction between gravity and quantum phenomena"
"The tests have the potential to determine the applicability of quantum theory at larger length scales"

Super Physics Smackdown: Relativity vs Quantum Mechanics...In Space
http://www.technologyreview.com/view/428328/super-physics-smackdown-relativity-v-quantum-mechanicsin-space/#ixzz2UyZfdG1L
 
  • #144
Charles Wilson said:
Here I go again...
From The Age of Entanglement:

"...But what if we let relativity enter the game even deeper? What if the detectors are in relative motion such that each detector in its own reference frame analyzes its photon before the other?...

"...once one assumes that the collapse is a real phenomenon, and once one considers specific models, then the conflict is real and testable"...if both measurements happen before the other, then the quantum correlation should disappear, however large the speed of the spooky action!

"Once the engineering was made feasible, "this experiment was also performed in Geneva in the spring of 1999", reported Gisin. "The two-photon interferences were still visible, independently of the relative velocity between Alice and Bob's reference frames." Alice, in her reference frame, measures her photon first; from Bob's point of view, he has measured his photon first; yet, the correlation is still present..."

The paper is very interesting. Though there seems to be a tiny ‘issue’...

http://lanl.arxiv.org/abs/quant-ph/0007009 said:
However, it is possible that it is not the detector that triggers the collapse. The photons could take the decision already at the beamsplitter and go out through one output port, like in the Bohm-de Broglie pilot wave picture [26] (which much inspired Suarez). With the beam-splitter as choice-device superluminal signaling is not possible (to our knowledge). A corresponding experimental test would be more demanding, a beam-splitter would have to be in motion. A clever way-out could be the use of an acousto-optical modulator representing a beam-splitter moving with the speed of the acoustic wave. We are working on such an experiment.
[my bolding]

My absolutely unscientific guess is that the “stuff” happens at the polarizer/beam-splitter...

(Does anyone know if they proceeded with that new experiment?)


P.S: I'll get back on PM.
 
  • #145
T0mr said:
How does the Bohr model violate the uncertainty principle?

Angular momentum depends on both the radius of the orbit and the velocity of the electron in that orbit. The uncertainty principle stipulates that the radius OR velocity MUST be uncertain = angular momentum can NOT be quantized, because it can NOT be known.
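A rough sketch of the numbers, taking the ##n = 1## Bohr orbit at face value (##a_0## is the Bohr radius): the model assigns sharp values

$$
r = a_0, \qquad p = \frac{\hbar}{a_0}, \qquad \Delta r = \Delta p = 0
\quad\Rightarrow\quad \Delta r \, \Delta p = 0 < \frac{\hbar}{2},
$$

in conflict with the uncertainty relation ##\Delta r \, \Delta p \geq \hbar/2##.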

T0mr said:
The Bohr model does correctly predict spectral lines for atomic hydrogen.

Yup

T0mr said:
That is something at least.

Something is not everything... :wink:

T0mr said:
I get the impression that we feel very confident we have exhausted all mechanical analogies for things like the double-slit experiment, when we cannot possibly have.

We shall never ‘give up’, but I think it was Feynman who said that it’s proven that an alternative/succeeding theory has to be at least as ‘weird’ as QM. So, there is not much hope for a classical fruit with a big nut inside...

T0mr said:
There is an infinitude of possible mechanical analogies we could use to model a particle like an electron (and all but one will be wrong). I don't know if anyone has seen the experiments with silicone oil droplets, but I saw these a while back and found them very interesting:

Droplets are sweet, but they do nothing. I could put a rubber duck there instead, with the same result. The medium is everything... w ∙ a ∙ v ∙ e ∙ s
 
  • #146
DrChinese said:
There are others:

Relational Blockworld:
http://arxiv.org/abs/0908.4348

Ah! Captain RUTA & RBW! But how do we signal from the future in RBW with 'only' spacetimematter?

“past, present and future are co-constructed as well, there are no dynamical entities or dynamical laws in our fundamental formalism [...] accordingly, all dynamical explanation supervenes on, and is secondary to, non-dynamical topological facts about the graph world”

Yakir Aharonov and Jeff Tollaksen's take on Time Symmetry:
http://arxiv.org/abs/0706.1232

Ouch... :frown: but wait... conservation of the CPT symmetry requires time reversal to rename particles as antiparticles and vice versa... tachyonic antitelephone anyone??


(sorry, DrC hysterical lame jokes :blushing:)
 
  • #147
TrickyDicky said:
the structure formed by the lattice of events in a real or complex Hilbert space.

The technical meaning of "lattice" used here is different from what you are interpreting it as. It's an algebraic structure used in Quantum Logic:
https://www.physicsforums.com/newreply.php?do=newreply&p=4403159

Just to elaborate a bit on contextuality being unnatural in the Hilbert Space formalism.

If you choose that as your formalism, you would expect the states to tell us something about the results of experiments, so you can make predictions. Technically, that means defining some kind of measure on the states. If contextuality were true you couldn't do that, because it would depend on the basis you expand the state in. Not only that, but modern physics has taught us that coordinates (and bases are a generalization of coordinates), being an arbitrary man-made thing, are independent of the physics - this is one of the key insights of Einstein in GR with his principle of covariance (as Kretschmann pointed out to Einstein, and Einstein eventually accepted, it's a principle devoid of physical content but of great heuristic importance - however, that is another story).

Victor Stenger wrote a nice book on this a few years ago now:
http://www.colorado.edu/philosophy/vstenger/nothing.html

It's quite amusing, actually. Some people interpreted this as Stenger saying the Laws of Nature came from nothing. That wasn't the case at all - they came from symmetries, which are hardly nothing. The thing is, symmetries are so appealing to our intuition that it seems they come from nothing.

For example, that momentum exists and is conserved in an inertial frame comes from the space-translation symmetry of an inertial frame, and since that is the definition of an inertial frame you think it's pulled out of a hat and comes from nowhere. First, definitions contain no physics - the import of an inertial frame is that out there in interstellar space frames exist that are to a very high degree inertial, which is an observational matter; the universe doesn't have to be like that - it just is. And secondly, you need something to be symmetrical in - in this case it's the laws of QM - and their validity is an experimental/observational matter; they may or may not be true.

It's just that this symmetry stuff is so appealing and all-pervasive in modern physics that it seems like magic, and beautiful beyond compare - which it is - when you understand it.
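A compact way to put the momentum example in the quantum setting (a sketch, not a derivation): if the dynamics commutes with every space translation ##T(a) = e^{-ia\hat P/\hbar}##, then

$$
[\hat H, T(a)] = 0 \;\;\text{for all } a
\quad\Rightarrow\quad
[\hat H, \hat P] = 0
\quad\Rightarrow\quad
\frac{d\langle \hat P \rangle}{dt} = \frac{i}{\hbar}\,\langle[\hat H, \hat P]\rangle = 0,
$$

i.e. space-translation symmetry of the dynamics is exactly the statement that momentum is conserved.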

Thanks
Bill
 
  • #148
Jano L. said:
Which part do you think was refuted?

I am pretty sure the F-W theory has never been refuted - in fact, I think that would be pretty hard to do, since it was deliberately cooked up to be equivalent to ordinary EM but without fields. The issue with it is that no one has been able to figure out a quantum version - to the best of my knowledge, anyway.

Thanks
Bill
 
  • #149
bhobba said:
I am pretty sure the F-W theory has never been refuted - in fact, I think that would be pretty hard to do, since it was deliberately cooked up to be equivalent to ordinary EM but without fields. The issue with it is that no one has been able to figure out a quantum version - to the best of my knowledge, anyway.

The expansion of the universe is not time symmetric in the thermodynamic limit.
Feynman himself stated that self-interaction is needed to correctly account for the Lamb shift.
 
  • #150
TrickyDicky said:
The expansion of the universe is not time symmetric in the thermodynamic limit.
I am not sure what you mean. How do you check whether expansion is time-symmetric? How does it connect to the FW theory?

TrickyDicky said:
Feynman himself stated that self-interaction is needed to correctly account for the Lamb shift.

Can you give a reference? People often state many things without convincing arguments. The Lamb shift was measured originally for hydrogen, whose atom consists of two particles. It is hard to show that self-action is necessary when the main forces in play are those of the interaction between different particles, and their effect is hard to evaluate.

The Lamb shift can be explained in other ways, one of which is the interaction of the atom with other particles in the surroundings (via their EM field). Self-interaction of a particle on itself is not necessary.
 
