Bell's theorem and Harrison's (2006) inequality

  • #51
DrChinese said:
1. You know I would never imply that... :smile:

2. Um, this statement is actually wrong, Vanesch.
Aspect's test DID send local realists back to the drawing board!

Note that my statement didn't include any statement about a falsification of any LR theory. I said that certain experiments, on which quantum theory made successful predictions (and hence survived the falsification), were suggested by LR theories.
As such, these LR theories were useful in that they suggested experiments one would otherwise maybe not even have thought of performing. And that's the essence of what I'm trying to argue in this thread: that these alternative theories do have some uses (in this case, suggesting tests of QM). To me, their main use is already to show that a lot of popular textbook arguments for the necessity of QM are simply erroneous reasoning, in the same way that Bohmian mechanics was useful to show the erroneous reasoning in von Neumann's impossibility proof.

Also, maybe it sent *some* LR theorists back to the drawing board, but that would then be a naive lot. The SED model is far older than the Aspect experiments. A simple paper that fascinated me is Boyer, "Derivation of the Blackbody Radiation Spectrum without Quantum Assumptions", Phys. Rev. 182 (1969), where the SED model is in fact used.

And so have subsequent enhancements such as GHZ! That is why Santos is working so hard on refinements to his model. By now he must be on perhaps his tenth iteration/refinement to make it agree.
His models are frequently attacked and then duly reconstructed within months to keep the subject alive. Plus, he must adapt to new experimental results. And yet the QM model has NOT needed similar adjustment.

As I said, the point is not so much QM versus SED (and I have to say I don't like Santos/Marshall's "religious brotherhood" style either), but to understand the relationship.
Whether a model is found by twiddling and tweaking doesn't really matter much after the fact (I could cite again the string community and related fields, who have been twiddling and tweaking quantum models to try to get compatibility with that simple experiment: drop your pen to the floor and see it fall!). What matters is the existence, or not, of the model. It is IMO always interesting to study examples of alternatives (be it to suggest experiments, or to broaden the understanding of the reasons why there is equivalence).

Hey, I don't have a problem if Santos spends time on it. I don't even claim his model is scientifically useless (although it appears that way to me). And perhaps a future discovery will show its power, sure, I can acknowledge that. But what credit does it deserve today? It is pretty obvious that the primary way to keep his ideas alive is to hang his hat on detection inefficiency and noise as a way to escape the day of reckoning. Not very impressive to me, but if you want to give it more significance then I am OK with that.

I think it has already shown its use. I'm for instance not even sure that Kinsler would have thought of his 3rd-order correlations, or that GHZ would have proposed their tests, if, after Aspect, the world had chanted in unison about the ultimate achievement. The existence of these critics, and the existence of their alternative models, is what motivates progress. Like competition in the economy, and opposition in politics.
It points to where there has been complacent and erroneous thinking.
 
  • #52
vanesch said:
... I could cite again the string community and related, who have been twiddling and tweaking quantum models to try to get out compatibility with that simple experiment: drop your pen to the floor and see it fall! ...

That's pretty funny (and probably accurate as well)!

I have no issue at all with attacks (in the competitive spirit you mention) on QM when there is a genuine issue or alternative hypothesis. I think the experimental envelope is constantly being pushed, and I would be the last to advocate that we stop scientific study because "we already know it all".

It seems strange to see people constructing theories that say "Nature is LR but Experiments will always say QM" in the presence of Bell's Theorem. In my opinion, for SED to be a viable local realistic alternative to QM: it MUST make a prediction for entangled photon spin correlation that is at odds with Malus' Law (cos^2). I mean, that's ultimately the point of the 87% efficiency threshold that Santos claimed must be surpassed to distinguish (i.e. that there is a difference in the predictions which is being masked due to experimental loopholes). I just do not see how that makes any sense, because to assert that is essentially to say that Malus' Law is wrong too. And I consider that to be pretty fundamental.
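The arithmetic behind this point is worth spelling out. For polarization-entangled photons, QM's Malus-law form gives a same-outcome coincidence probability of cos²(θa − θb), hence a correlation E(θa, θb) = cos 2(θa − θb); at the textbook CHSH angles this exceeds the local-realist bound of 2. A minimal sketch (the angle choices are just the standard ones, nothing more is assumed):

```python
import numpy as np

def E(theta_a, theta_b):
    """QM correlation for polarization-entangled photon pairs.

    Follows from the Malus-law form: the same-outcome coincidence
    probability is cos^2(theta_a - theta_b)."""
    return np.cos(2 * (theta_a - theta_b))

# textbook CHSH settings: Alice at 0 and 45 deg, Bob at 22.5 and 67.5 deg
a1, a2 = 0.0, np.pi / 4
b1, b2 = np.pi / 8, 3 * np.pi / 8

S = E(a1, b1) + E(a2, b1) + E(a2, b2) - E(a1, b2)
print(S)  # 2*sqrt(2) ~ 2.828, above the local-realist bound of 2
```

This is why any local realistic model must either depart from the cos² correlation somewhere or invoke a loophole (such as detection efficiency) in how the correlation is sampled.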

Oops, I just realized I probably opened up another can of worms. Sorry...
 
  • #53
DrChinese said:
A Local Realistic theory is a theory built on the following ideas:

a. Locality: often considered the same thing as Lorentz invariance, it is essentially the idea that effects do not propagate faster than c.

b. Reality: In the words of Einstein, who was the ultimate local realist: "I think that a particle must have a separate reality independent of the measurements. That is: an electron has spin, location and so forth even when it is not being measured. I like to think that the moon is there even if I am not looking at it."


If that is indeed what "local realistic" means, then the terminology is completely wrong as a description of what is ruled out by empirical violations of Bell-type inequalities. There exists a valid argument from "a" (locality, plus some of the empirical predictions of QM) to "b": the EPR argument. (The original EPR version, however, was obscure as to both the logic and the point -- and is a bit archaic now that Bell has provided a more precise definition of locality. See quant-ph/0601205 for an updated version of the argument from "a" to "b".) Let me state the point clearly in words: the only way a theory can predict the empirically supported perfect correlation (when Alice and Bob both measure along the same axis the spins of a pair of spin-1/2 particles in the singlet state) and respect relativistic locality is for the theory to encode, in advance of any measurements, definite outcomes for all possible spin measurements -- i.e., locality *entails* what DrChinese above calls "realism"; i.e., "locality" entails "local realism".

Thus, what's being tested in the Bell test experiments isn't the conjunction of two premises (locality + realism = "local realism") but simply locality. Anybody who is confused about this point needs to go back and read Bell, because he explains it as well as anyone could.

This point is orthogonal to the debate about QM vs SED that's been going on here, but it seems to me a fundamental point since both sides in this other debate have swallowed this standard terminology ("local realism") without realizing that it is based on a flawed understanding of Bell's work.
 
  • #54
***
Thus, what's being tested in the Bell test experiments isn't the conjunction of two premises (locality + realism = "local realism") but simply locality. Anybody who is confused about this point needs to go back and read Bell, because he explains it as well as anyone could. ***

Perhaps you have heard about the possibility of negative probabilities or even complex amplitudes of detection?? Feynman, Dirac (even prior to the existence of the Bell inequalities :cool: -- this guy was clearly clairvoyant), Barut and others have given explicit ways to violate the Bell inequalities in local theories in this way. :cool: So your statement is clearly false (as are most crappy papers analysing what Bell had to say), and you ignore this with the same pleasure as you dismiss predeterminism.

Since Dr. Chinese here challenges the work of Santos, perhaps Dr. Chinese should explain to us how photon and electron detectors work. What do we measure exactly and to what do we imagine it corresponds to ? Then we or he could understand why it might be that a detector inefficiency is possibly a fundamental issue (and indeed the consequence of a different view upon measurement, NOT the entangled state) and not some temporary technical limitation. Also, he could illuminate us by telling how a GENUINE entangled state is produced !

Vanesch, congrats with your scientific attitude.

Careful
 
  • #55
Careful said:
Perhaps you have heard about the possibility of negative probabilities or even complex amplitudes of detection?? Feynman, Dirac (even prior to the existence of the Bell inequalities :cool: -- this guy was clearly clairvoyant), Barut and others have given explicit ways to violate the Bell inequalities in local theories in this way. :cool: So your statement is clearly false (as are most crappy papers analysing what Bell had to say), and you ignore this with the same pleasure as you dismiss predeterminism.

"Negative probability" is a contradiction in terms. Look at the axioms that need to be satisfied for something to be a "probability". Page 1 of any probability/stats textbook.

Formally (i.e., leaving aside the actual meanings of relevant concepts) your statement is true: you can violate Bell's inequalities with a local theory if you allow probability distributions that are sometimes negative. But please. If that's the best available argument against my claim, it's just a complicated way of confessing that my claim is true.
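ttn's formal concession is easy to exhibit explicitly. A sketch (my own toy construction, not Feynman's or Dirac's specific proposal): assign a joint "probability" to every local outcome assignment (a1, a2, b1, b2); forcing the weights to reproduce the singlet correlators at the CHSH angles makes some weights negative, yet the "distribution" still sums to one and yields S = 2√2:

```python
import itertools
import numpy as np

r = 1 / np.sqrt(2)
# target correlators E[A_i B_j] for the singlet state at the CHSH angles
E = {(1, 1): r, (1, 2): r, (2, 1): r, (2, 2): -r}

# joint quasidistribution over all 16 local outcome assignments:
# each assignment fixes all four outcomes in advance ("local" in form)
p = {}
for a1, a2, b1, b2 in itertools.product([-1, 1], repeat=4):
    p[(a1, a2, b1, b2)] = (1
                           + a1 * b1 * E[(1, 1)] + a1 * b2 * E[(1, 2)]
                           + a2 * b1 * E[(2, 1)] + a2 * b2 * E[(2, 2)]) / 16

total = sum(p.values())   # sums to 1, like a genuine distribution
p_min = min(p.values())   # (1 - sqrt(2))/16 < 0: not a true probability
S = E[(1, 1)] + E[(1, 2)] + E[(2, 1)] - E[(2, 2)]
print(total, p_min, S)    # total ~ 1, p_min < 0, S = 2*sqrt(2) > 2
```

The model is deterministic and local in form, which is exactly why reproducing the CHSH violation has to be paid for in negative weights; that is the substance of the point above.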


Vanesch, congrats with your scientific attitude.

I'll second that. Vanesch's comments on this thread are a much needed breath of scientific fresh air in the face of dogmatic/religious attachment to QM. My only complaint would be that one shouldn't give quite as much credit to QM as Vanesch has done here (though I know from other discussions he agrees with me about this). QM (assuming thereby we mean the orthodox or Copenhagen theory) is a bad theory. It is "unprofessionally vague and ambiguous" (Bell's description) about such crucial things as: what it's about, when its two mutually incompatible recipes for time-evolution apply (i.e., what exactly is this thing "measurement" which makes unitary evolution stand aside momentarily in favor of collapse), etc. If there were no such foundational problems with orthodox QM, I would incline toward the view that SED is pointless (along the lines of, say, some non-atomic continuum theory of matter that manages somehow to explain the ideal gas law and some of chemistry, but in today's context where it is just absolutely certain that matter is atomic, so the alternative would be at best a curiosity). But given that these foundational problems do exist, dogmatic attachment to orthodox QM is simply indefensible, and anyone who maintains this attitude (and the associated vitriolic dismissal of things like SED and Bohmian Mechanics) thereby reveals himself as a non-thinking, anti-scientific dogmatist.

Or, if you like, I could tell you what I really think. :smile:
 
  • #56
Careful said:
1. Since Dr. Chinese here challenges the work of Santos, perhaps Dr. Chinese should explain to us how photon and electron detectors work. What do we measure exactly and to what do we imagine it corresponds to ? Then we or he could understand why it might be that a detector inefficiency is possibly a fundamental issue (and indeed the consequence of a different view upon measurement, NOT the entangled state) and not some temporary technical limitation.

Also, he could illuminate us by telling how a GENUINE entangled state is produced !

Not sure I follow what you are asking...


1. Sure, I challenge the work of Santos... but more from a philosophical perspective rather than saying there is an error in it per se. Detector efficiency may be fundamental to Santos' position, but I don't think too many scientists will see it as such. I think his focus is much too narrow to gain any mainstream attention. And despite his best efforts, I do not see how a stochastic approach will ever work to accomplish his goal: a local realistic alternative to QM that respects Bell. But I could be wrong.
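For what it's worth, the detection-loophole mechanism itself is not mysterious and can be demonstrated in a few lines. A Monte Carlo sketch of a toy local model (the specific form below follows a construction I associate with N. Gisin and B. Gisin, 1999; only the stated model is assumed): when one detector's firing probability is allowed to depend on the hidden variable, the post-selected coincidences reproduce the singlet correlation −a·b and "violate" CHSH, even though everything is local:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 400_000

# hidden variable: a uniformly random unit vector shared by each pair
lam = rng.normal(size=(N, 3))
lam /= np.linalg.norm(lam, axis=1, keepdims=True)

def corr(a, b):
    """Correlation among *detected* coincidences at settings a, b."""
    A = np.sign(lam @ a)                        # Alice always fires
    B = -np.sign(lam @ b)
    detected = rng.random(N) < np.abs(lam @ b)  # Bob fires with prob |lam.b|
    return np.mean(A[detected] * B[detected])   # -> -a.b on this subsample

a1, a2 = np.array([1.0, 0, 0]), np.array([0, 1.0, 0])
b1 = np.array([1.0, 1.0, 0]) / np.sqrt(2)
b2 = np.array([1.0, -1.0, 0]) / np.sqrt(2)

S = corr(a1, b1) + corr(a1, b2) + corr(a2, b1) - corr(a2, b2)
print(abs(S))  # ~2.83 on coincidences, from a purely local model
```

Bob's average efficiency here is only 50%, far below the threshold mentioned earlier, which is exactly why such models die once detectors become efficient enough; that is the fork Santos is betting on.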

If we need to have a separate thread about the pros and cons of Santos and Marshall's work, then I would be happy to participate. However, I don't want to mislead anyone into thinking I am an expert on it. Nor should anyone think that I am denying that they are respected scientists. However, SED is less mainstream than Bohmian Mechanics, which is itself not mainstream. Given the nature of this forum, I think that is relevant.


2. Parametric Down Conversion produces entangled photon pairs. With the evidence, how could you not believe this... unless, of course, you deny the existence of entanglement a priori. The only problem is that clearly, you can measure the difference between groups of entangled photon pairs vs. pairs in which there is no entanglement. (And there are many different ways to entangle particles.) So the question is really: what do YOU call the photon pairs produced by PDC?

I do not believe there is a deeper level of reality than the HUP implies. Therefore, I do not believe there is definite real value for observables outside the context of a measurement. I consider this an orthodox view, hardly in need of further description.
 
  • #57
Careful said:
Vanesch, congrats with your scientific attitude.

I third that. Vanesch, thank you for defending my position while I was away. Your eloquent comments precisely characterize my views as well on why it is worth investigating local realistic theories such as SED.

Zapperz said:
No SED theory has ever been attempted to match the results of ARPES, RPES, even multiphoton photoemission processes. In this day and age, photoelectric effect is chicken feed.

SED has not yet been successful in this regime. However, there is ongoing work by Dan Cole, whose paper Vanesch cited. The three papers on detectors by Santos that I cited also discuss this problem, if you care to read them. I also have specific ideas along these lines, which I won't discuss.

However, there is an important point to consider about SED and local realist theories in general. If SED is correct, then the physical description of atomic processes will also be much more detailed and complex than the standard quantum mechanical treatment. In fact, the actual physics of SED is very nonlinear when modeled precisely and accurately. But this is very difficult to do because of the nonlinearity in the theory. Indeed, because analytical analyses alone of nonlinear systems are not very reliable, SED theorists are taking advantage of numerical simulations of atomic processes described by SED; and lo and behold, these numerical simulations are beginning to show that SED works where it was once thought to fail, such as in generating the probability density distribution for the position of an electron in the ground state of the hydrogen atom. The techniques of SED are also becoming extremely useful in the analysis of Casimir and van der Waals forces under various boundary conditions:

http://www.bu.edu/simulation/publications/dcole/PDF/DCColeBUPresentationApril162003.pdf

Now you might say that this is why QM is a better theory, because it gives a correct first-order description of atomic spectra and photoemission processes, whereas SED has to resort to a nonlinear description of light-atom interactions. However, a nonlinear description is what would be expected from a more fundamental and accurate stochastic local realist description of atomic physics. In fact, one could have said the same about Newtonian gravity versus general relativity when the latter was being developed. You could have argued: what is the use of a nonlinear field equation to describe, say, the motion of a test particle in a gravitational potential, when we already have a perfectly linear theory (Newtonian gravity) that does this just fine? Of course the claim was that GR would be the more fundamental and accurate description of gravity, to which Newtonian gravity is only a very good approximation; and given this claim, there would eventually be new or different predictions that GR would make against Newtonian gravity. And indeed there were.

Likewise, the same claim would be made about SED: that it gives a more accurate description of atomic-optical physics, to which standard QM is an excellent mathematical approximation. Therefore, SED will make new predictions that standard QM does not. And of course we know this is true! But such tests have yet to be carried out. So, I would say give it time.

Vanesch said:
Don't get me wrong, I don't think that any amount of funding can make a totally ill founded theory work as well as QM. But maybe SED would have had an equally successful development if it would have received as much attention. So I think it is not totally fair to ask of SED to give you the same level of actuality and sophistication as QM has today, given the hugely different amounts of means that were invested in both paths.

Indeed, this is partly true. Theoretical QM research has received several orders of magnitude more manpower, grant money, and time than has SED (BTW, I think this is the same reason that Bohmian mechanics has yet to be made fully relativistic). However, another significant reason for the limited scope of SED is the fact that the necessary mathematical and computational tools to accurately analyze the nonlinear partial differential equations of SED were only developed in the 1980s, when many researchers in the field had already become pessimistic about the theory. There is a very nice review article on the history of SED here:

http://www.bu.edu/simulation/publications/dcole/PDF/SwedenCole2005.pdf

Vanesch said:
After all, QM also faces its gorilla: gravity (with which SED has no problems for instance).

EDIT:
the chicken-feed list:
-photo-electric "lumpiness"
-black body radiation
-stability and spectrum of hydrogen
-gyromagnetic ratio for electrons up to order 6 in alpha
- Lamb shift
- Bell experiments with PDC crystals

Now, ask your average student for a list of the results that historically motivated people to finally accept quantum theory. This is what I find intriguing.

EDIT2: personally, I find this exploration more "cost-effective" than pondering for 30 years about how to tie up my shoes in 11 dimensions.

These are excellent points. Just to add to the SED chicken-feed list: the Casimir effect, Unruh-Davies radiation, and the Aharonov-Bohm effect. And the SED description of the AB effect also makes an experimentally distinguishable prediction:

"The Paradoxical Forces for the Classical Electromagnetic Lag Associated with the Aharonov-Bohm Phase Shift". Timothy H. Boyer.
http://arxiv.org/abs/physics/0506180

Vanesch said:
In fact, I fight every dogmatic religious attitude with religious conviction. In the same way as I would argue against a religious Local Realist, I argue against a religious Bohmian, or a religious quantum theorist.

Same here. In fact, I have a currently running debate with Sheldon Goldstein about the problems with the physical interpretation of the wave function in BM, as well as one with Trevor Marshall on conservation of energy issues in SED.

Vanesch said:
My point is simply that the simplicity of this SED model and the accuracy of its predictions (true, within a very restricted domain for the moment) is intriguing, and that we might learn something if only we understood why. I have a hard time believing that it is pure coincidence that quantum theory and SED models give such close results, with such different postulates. So the point is not so much SED versus QM, but how it comes about that SED and QM give the same predictions.

Exactly. From a philosophy of science perspective, if we understood what functional aspect of the mathematical structure and physical ontology of these different theories gives them much of the same predictive power, that would also be of considerable value to the scientific methodology of physics. There are many alternative formulations of physics, such as Brans-Dicke theory, which Vanesch also mentioned, or Bohmian mechanics, Everett's MWI, GRW spontaneous collapse, or even SED, which all have vastly different ontologies but are still empirically very close. As a consequence, it is very difficult, as a theorist, to know which ontological interpretation is closer to the objective truth. Developing a rigorous means by which to make this judgement would be of value for any theorist, and especially for those who work on competing theories that are very far from being experimentally testable, e.g. string theory, loop quantum gravity, Hawking's quantum cosmology, even semiclassical gravity.

Vanesch said:
One should not religiously commit to a single theory, and view competitors as personal rivals. Competitive theories are the backbone of scientific inquiry.

Yes! In fact, the claim that local realistic challenges to the standard formalism of QM can give us deeper insights into it has, in my opinion, already been proven. Einstein's critical mind allowed him to see more deeply into the foundations of quantum mechanics than many of its most ardent defenders. And the kind of philosophically motivated critical questions he asked but could not yet answer were to bear fruit barely ten years after his death, when they were taken up again by another progressive critic of standard QM: John Bell.


DrChinese said:
It seems strange to see people constructing theories that say "Nature is LR but Experiments will always say QM" in the presence of Bell's Theorem.

Indeed, it would seem strange to see people constructing such theories in the presence of Bell's Theorem. Santos and Marshall are not saying this, however. They are saying that "Nature is LR and experiments are consistent with this."

DrChinese said:
In my opinion, for SED to be a viable local realistic alternative to QM: it MUST make a prediction for entangled photon spin correlation that is at odds with Malus' Law (cos^2). I mean, that's ultimately the point of the 87% efficiency threshold that Santos claimed must be surpassed to distinguish (i.e. that there is a difference in the predictions which is being masked due to experimental loopholes). I just do not see how that makes any sense, because to assert that is essentially to say that Malus' Law is wrong too. And I consider that to be pretty fundamental.

I thought you said you were very familiar with Santos and Marshall's work? Marshall and Santos showed long ago that stochastic noise does in fact modify Malus' law, in a way that is still consistent with observation. Please read the abstract of this paper:

Stochastic optics: A local realistic analysis of optical tests of Bell inequalities
http://prola.aps.org/abstract/PRA/v39/i12/p6271_1


DrChinese said:
If we need to have a separate thread about the pros and cons of Santos and Marshall's work, then I would be happy to participate. However, I don't want to mislead anyone into thinking I am an expert on it. Nor should anyone think that I am denying that they are respected scientists. However, SED is less mainstream than Bohmian Mechanics, which is itself not mainstream. Given the nature of this forum, I think that is relevant.

I would be willing to participate in such a separate thread. However, SED's not being mainstream has not diminished the quality of the arguments or discussion in this thread. Moreover, SED is solid, peer-reviewed work, just as Bohmian mechanics is.


DrChinese said:
2. Parametric Down Conversion produces entangled photon pairs. With the evidence, how could you not believe this... unless, of course, you deny the existence of entanglement a priori. The only problem is that clearly, you can measure the difference between groups of entangled photon pairs vs. pairs in which there is no entanglement. (And there are many different ways to entangle particles.) So the question is really: what do YOU call the photon pairs produced by PDC?

DrChinese, you apparently are not very familiar with Marshall and Santos' work. They and others have accounted for PDC entanglement of photons within the stochastic optical formalism:

"What is entanglement?" Emilio Santos.
I conjecture that only those states of light whose Wigner function is positive are real states, and give arguments suggesting that this is not a serious restriction. Hence it follows that the Wigner formalism in quantum optics is capable of interpretation as a classical wave field with the addition of a zeropoint contribution. Thus entanglement between pairs of photons with a common origin occurs because the two light signals have amplitudes and phases, both below and above the zeropoint intensity level, which are correlated with each other.
http://arxiv.org/abs/quant-ph/0204020

A Local Hidden Variables Model for Experiments involving Photon Pairs Produced in Parametric Down Conversion: Alberto Casado, Trevor Marshall, Ramon Risco-Delgado, Emilio Santos.
http://arxiv.org/abs/quant-ph/0202097

A. Casado, T. W. Marshall, and E. Santos, J. Opt. Soc. Am. B 14, 494-502 (1997).

A. Casado, A. Fernández-Rueda, T. W. Marshall, R. Risco-Delgado, and E. Santos, Phys. Rev. A 55, 3879-3890 (1997).

A. Casado, A. Fernández-Rueda, T. W. Marshall, R. Risco-Delgado, and E. Santos, Phys. Rev. A 56, 2477-2480 (1997).

A. Casado, T. W. Marshall, and E. Santos, J. Opt. Soc. Am. B 15, 1572-1577 (1998).

A. Casado, A. Fernández-Rueda, T. W. Marshall, J. Martínez, R. Risco-Delgado, and E. Santos, Eur. Phys. J. D 11, 465 (2000).

A. Casado, T. W. Marshall, R. Risco-Delgado, and E. Santos, Eur. Phys. J. D 13, 109 (2001).
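Santos's positive-Wigner conjecture quoted above is easy to probe numerically. A sketch, assuming the standard convention W_n(x,p) = ((−1)^n/π) L_n(2(x²+p²)) e^{−(x²+p²)} for Fock states with ħ = 1: a coherent state's Wigner function is a displaced Gaussian, positive everywhere, while the single-photon state dips negative at the origin, which on Santos's reading is what disqualifies it as a "real" classical wave state:

```python
import numpy as np

# phase-space grid (the grid resolution is just for illustration)
x, p = np.meshgrid(np.linspace(-5, 5, 401), np.linspace(-5, 5, 401))
r2 = x**2 + p**2

# coherent state centered at (x0, p0): a positive Gaussian
x0, p0 = 1.0, -0.5
W_coh = np.exp(-((x - x0)**2 + (p - p0)**2)) / np.pi

# single-photon Fock state: L_1(u) = 1 - u gives a negative dip at the origin
W_fock1 = -(1 - 2 * r2) * np.exp(-r2) / np.pi

print(W_coh.min() >= 0)   # True
print(W_fock1.min() < 0)  # True; the value at the origin is -1/pi
```

Both functions integrate to one like genuine distributions; the negativity of the Fock-state Wigner function is the quantum-optical analogue of the negative probabilities discussed earlier in the thread.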

DrChinese said:
I do not believe there is a deeper level of reality than the HUP implies. Therefore, I do not believe there is definite real value for observables outside the context of a measurement. I consider this an orthodox view, hardly in need of further description.

That I would have to sharply disagree with. Bohmian mechanics demonstrates the opposite of what you believe: that there can be a deeper level of reality than the HUP implies, and that observables can have definite real values before a measurement.

Regards,
Maaneli
 
  • #58
** ''Negative probability" is a contradiction in terms. Look at the axioms that need to be satisfied for something to be a "probability". Page 1 of any probability/stats textbook. **

I thought you would say this. It implies you did not understand anything of this proposal (nor much about probability theory; I know Kolmogorov did not understand this either, so you are in good company). :bugeye: Negative probability could mean negative energy; the amplitudes could reveal something about how detection works... Anyway, probability only needs to be positive in the limit of infinite measurements; the mistake people like you and Shimony make is that you always assume statistics to apply to single events.

***
Formally (i.e., leaving aside the actual meanings of relevant concepts) your statement is true: you can violate Bell's inequalities with a local theory if you allow probability distributions that are sometimes negative. But please. If that's the best available argument against my claim, it's just a complicated way of confessing that my claim is true. ***

Absolutely not; again, I would invite you to think about it. As far as I know, Sorkin's proposal goes in a similar direction, but one needs to revise measurement completely (as well as stop thinking in terms of one-particle situations).

As far as that goes, I explained why BM does not solve measurement either... it is rather nonsensical that the electron goes through the left slit while the measurement apparatus points out right :cool:. The problem I have with all these stories, such as BM, Copenhagen, and MWI, is that they offer a very simple, naive way out (although BM definitely does a better job), while I hear most of these people complaining about naive realists who come up with much more subtle and intelligent constructions :smile:

Careful
 
  • #59
***
1. Sure, I challenge the work of Santos... but more from a philosophical perspective rather than saying there is an error in it per se. Detector efficiency may be fundamental to Santos' position, but I don't think too many scientists will see it as such. I think his focus is much too narrow to gain any mainstream attention. And despite his best efforts, I do not see how a stochastic approach will ever work to accomplish his goal: a local realistic alternative to QM that respects Bell. But I could be wrong.
***

I ask you how such a detector (as well as the detection process) WORKS, so that we can see whether Santos is an idiot or not.

***
2. Parametric Down Conversion produces entangled photon pairs. With the evidence, how could you not believe this... unless, of course, you deny the existence of entanglement a priori. The only problem is that clearly, you can measure the difference between groups of entangled photon pairs vs. pairs in which there is no entanglement. (And there are many different ways to entangle particles.) So the question is really: what do YOU call the photon pairs produced by PDC? ***

States produced in parametric down-conversion are product states; of course you can write them as a sum of entangled states and then claim that entanglement has been observed, which is what you say -- but if that is your case, then it is a very weak one indeed. I asked for a GENUINELY entangled state: how does one produce such a state??

I am not claiming that the entanglement correlations do not exist (although they have not been observed), but that the explanation QM attributes to them is wrong, since it depends upon unphysical processes (consciousness or action at a distance).

Careful
 
  • #60
ttn said:
I'll second that. Vanesch's comments on this thread are a much needed breath of scientific fresh air in the face of dogmatic/religious attachment to QM. My only complaint would be that one shouldn't give quite as much credit to QM as Vanesch has done here (though I know from other discussions he agrees with me about this). QM (assuming thereby we mean the orthodox or Copenhagen theory) is a bad theory.


Ugh, you can't say that either! The quantum formalism is a vastly successful formalism, if you read it in the following way: imagine a professor telling his students: "OK, today I'm going to introduce you to something called 'quantum mechanics'. First of all, it is -- as is any new scientific theory -- very strange; some say, not understandable; but I'll show you how you have to use it, how to make calculations, and I can tell you that people have done so and have always found agreement with all non-gravitational experiments (even the effect of a fixed gravitational potential can be taken into account in some simple cases, as an exception to the cited limitation). This is the main reason why you should learn it. Don't ask me what it 'means'. Just learn how to do the calculations. That's the quantum formalism [...]"

You can transpose that to any scientific theory; it is its essence. The problem is, not many people (especially students) are interested in "learning to do calculations of outcomes of experiments". People want philosophy, or better yet, they want revelation. They want to know what it actually means, and to what great secrets of nature they will be introduced. They want to know the truth, they want to know "what really happens", not simply some calculational rules. Well, it's a lesson in philosophy we receive from modern physics that, as of now (and probably for a long time to come), we won't know the "truth". That doesn't stop some from claiming they do, but this is no different from any sect guru and his adepts who claim to know the enlightenment. The only thing we finally know is that certain formal systems of calculation are extremely accurate within a certain scope of application. That's far more sobering than the Great Story of the Meaning of Life, the Universe and Everything (which was probably the main reason for many students to enroll in physics in the first place, not least because of the hype in popular literature about this). In fact, some might even regret having come to the lecture on "how do I calculate outcomes of experiments", and tell themselves that, all things considered, it would have been a better idea to learn "how to increase the contents of my bank account", the philosophical challenge of both endeavours being, upon reflection, about similarly meager.

Because of that disappointment, because of the inquiring nature of the human mind, because that's what they came here for in the first place, and because it sells more books, people cannot be satisfied with that all-too-limited set of "lectures on how do I calculate outcomes of experiments". They want to know "the Truth". Now, where there is demand, there will be supply, so that's what you get: the Truth. In other words, an ontological interpretation of the formal rules you use to get outcomes of experiments. A story, which tells you what Really Happens (TM). Exactly like the sect guru tells you the Truth, and what Really Happens.
We, as humans, need that, in order to satisfy our minds, to motivate ourselves to work through all that formal stuff, and also to find inspiration in our thinking. I call the "toy world" that is associated with a certain formalism an "ontological interpretation". You can even have some liberty in setting up such a toy world that corresponds to a given formalism (as long as it faithfully agrees with the elements of said formalism, of course). When there is such liberty, you can argue endlessly about the merits of one over the other (which is what happens here in this and related threads).
Every new scientific formalism has had its dose of "ontological criticism". With Newton, the main problem was to know what it was, physically, that was "pushing" on the planets, and was formally *represented* by the "force of gravity". Invisible angels?
With Maxwell, the question was what these "fields" were, in space where there was nothing. Vibrations in some bizarre invisible liquid? Relativity poses the question of what exactly this "space-time manifold" is. Some kind of 4-dimensional pasta in a twisted form?
But the real whopper came with quantum theory, of which Bohr simply said that it describes *nothing*. :cool: At first, the jet-setters found that a cool idea, something different than usual: hey, we're describing very accurately "nothing". All that naive lot is thinking about "stuff", but we think about "nothing" - that's way cooler! There are still a lot of adepts of the "there's nothing" view, but now that it has lost its initial fanciness as a fashionable novelty, many people start to realize that a mental picture of the toy world of "nothing" is not what they came for initially, when they wanted to learn the Truth, and that it doesn't help them think about it. Enter the Local Realists, the Everettians and the Bohmians.

Personally, I need a story too, and that's why I apply to quantum theory exactly the same kind of reasoning as to all others: take the elements of your formalism as "reality". If Maxwell has fields, take them as real. If Newton has forces, take them as real. If GR has a 4-dim manifold, take it as real. Well, if quantum theory has a unitary structure, take it as real. You then end up in MWI (that's why I consider myself, in as far as I'm thinking about a quantum toy world, an MWI-er), and many people don't like that because it looks so totally different from what we thought the world was like when we were kids. But it is no more or no less real than all that other stuff: it is real in the *toy world* that you mentally set up in order for you to picture the formalism.

SED has simply *another* toy world, and Bohmian mechanics yet another toy world. SED, because it has a totally different formalism (classical fields with noise terms) ; Bohm because it takes over the formalism of (unitary) quantum theory, and adds an extra formal element to it similar to the Newtonian formalism: particles and forces.

So it is a bit strange that a Bohmian would find the quantum formalism a "bad theory", because he includes it. He only added an extra machinery for the simple sake of being able to construct a different toy world which is closer to his intuitive desires. Nevertheless, the Bohmian relationship with quantum theory is entirely understood, because it was initially set up on purpose to be so. There's no surprise when both find identical predictions; we know mathematically why this is so (and in a rather straightforward way). Nevertheless, because of the totally different toy world offered by BM, it can offer a refreshing perspective on things like the two-slit experiment, for instance. It's fun to know you can think of that experiment in several ways (in different toy worlds), and nevertheless obtain the same results: we can think that there is "nothing", or we can think that there are "parallel worlds", or we can think that there are genuine particles guided by some non-local quantum force. These mental pictures are entirely different, although they share the same core calculation (which is nothing else but the quantum formalism, possibly embellished with some extra machinery - nevertheless, the right result finds its origin in the Schrödinger equation).

However, SED is entirely different. SED is a theory of coupled classical field equations with noise terms. There is a relationship with quantum theory of course, because in QFT the field operators satisfy similar non-linear equations (without the noise terms), but what is not understood is how these noise terms in the classical equations can mimic so many aspects of the operator solution without the noise terms. That's an entirely formal question, apart from any philosophical interference (and I think it is the important reason to consider SED up to some point). From the SED PoV, it needs to be understood how solutions to non-linear partial differential equations with noise terms agree with solutions of non-linear operator equations (in totally different spaces). From the QM PoV, it needs to be understood how the solutions to operator equations in high-dimensional spaces are well described by "simple" solutions of non-linear PDEs in 3-D with added noise.
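The formal puzzle can be made concrete with a toy model. The sketch below is my own illustration, not the actual SED field equations (SED drives its fields with the Lorentz-invariant zero-point spectrum, not white noise): a damped classical oscillator driven by random noise settles to a non-zero average energy, qualitatively the way SED's background field supplies a "zero-point" floor.

```python
import math
import random

def simulate_noisy_oscillator(omega=1.0, gamma=0.5, sigma=1.0,
                              dt=0.01, steps=200_000, seed=1):
    """Euler-Maruyama integration of a damped unit-mass oscillator driven
    by white noise (a toy stand-in for SED's background field):
        dx = v dt
        dv = (-omega^2 x - gamma v) dt + sigma dW
    Returns the time-averaged energy after discarding an initial transient."""
    rng = random.Random(seed)
    x, v = 0.0, 0.0
    energies = []
    for i in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))
        # explicit Euler step: both updates use the pre-step (x, v)
        x, v = x + v * dt, v + (-omega**2 * x - gamma * v) * dt + sigma * dW
        if i > steps // 10:  # skip the transient
            energies.append(0.5 * v**2 + 0.5 * omega**2 * x**2)
    return sum(energies) / len(energies)

E = simulate_noisy_oscillator()
print(E)  # typically within a few percent of sigma^2 / (2*gamma) = 1.0 here
```

For this toy model the stationary mean energy is sigma^2/(2*gamma): the noise strength plays the role that hbar*omega/2 plays for the quantum oscillator, which is the flavor of what SED claims at full field-theoretic scale.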

So there are two entirely different discussions here. One is pseudo-philosophical, and concerns personal preferences for toy worlds. It is only pseudo-philosophical, because the true philosophical attitude is to say that, unfortunately, we don't know what is true, and we're limited to guessing. This then leads to religious brotherhood attitudes where the Good (us) fights the Evil (them).

The other discussion is about understanding the relationship between different formalisms which (within certain limited scopes) succeed in making identical predictions. *this* is the interesting discussion.

In conclusion: I think it is wrong to say that quantum theory is a "bad theory". It works marvelously. However, I think it is also wrong to reify it, and it is enlightening sometimes to look upon its results from different angles, be it Bohmian or SED, or whatever.
 
  • #61
There is something highly unethical about talking about detection loopholes or detector efficiency while ignoring some very fundamental aspects of, and responses to, such things.

Unlike most of you, *I* have been involved in actual measurement of such things since the start of my graduate school years, and for about the last 1 1/2 years I have been making high-QE photocathodes. So I can talk about background noise, dark current, detector signal, blah blah blah till everyone turns blue. Trying to distinguish between what is "noise" and what is "signal" is a HUGE part of my work. If you look at the raw data from photoemission spectroscopy, for example (i.e. if you make cuts in the data in my avatar), you will see background noise, detector noise, dark currents, etc... Yet, according to SED (and Santos), this "random" background noise can somehow mimic "actual signal"! NO KIDDING!

How convenient can that be, when you can simply stick something in ad hoc and, voila, mimic the actual signal simply by burying something in the detector noise. Or did we forget that SED comes with its own set of assumptions about the nature of such background fluctuations? And unlike QM, many of these "assumptions" have not even been tested at the most fundamental level to see if they are consistent with observation.

Photodetector performance is such a crucial issue, and has been studied so extensively, it is not even funny. Yet, I have seen no actual study done to see how well detector performance actually matches any of SED's assumptions. If we can verify everything from the Fowler-Nordheim law at finite temperatures to the Richardson-Dushman relation for photocathodes, how come this void for SED remains? One would think this is one very fundamental aspect of verification of SED to be taken seriously. Or maybe it is because it is not falsifiable?
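For readers unfamiliar with it, the Richardson-Dushman relation mentioned here is J = A T^2 exp(-phi / (kB T)). A quick sketch with illustrative numbers (the ideal Richardson constant and a tungsten-like work function; real cathodes have a smaller effective A):

```python
import math

# Richardson-Dushman thermionic emission: J = A * T^2 * exp(-phi / (kB * T)).
# All numbers are illustrative, not fitted to any particular cathode.
A = 1.20173e6          # ideal Richardson constant, A m^-2 K^-2
kB = 8.617333e-5       # Boltzmann constant, eV/K
phi = 4.5              # work function, eV (tungsten-like)
T = 2500.0             # cathode temperature, K

J = A * T**2 * math.exp(-phi / (kB * T))
print(J)               # of order 10^3 A/m^2 at these values
```

The point of citing such laws is exactly that this kind of prediction has been checked against emission data over many orders of magnitude; nothing comparable has been done for SED's detection assumptions.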

However, the most disturbing and unethical aspect of this discussion is the complete absence of citations to the TONS of issues that have already been addressed regarding detection efficiency. All I see are references to various detection issues that somehow support SED's point of view on the Bell-type experiments (while ignoring the more stringent CHSH-type experiments). Nowhere was there any mention, by the so-called experts or students in SED, of papers such as those by S. Massar et al.[1] or A. Cabello[2], which have either formulated Bell-type inequalities that are insensitive to detector inefficiency, or shown that one can already distinguish between the quantum optics prediction and the classical one with just a detector at 69% efficiency (which we already have!). Or what about the Tittel et al.[3] experiment that analyzed its data without subtracting any accidental coincidences (something that many have claimed would reveal "non-quantum" results)?
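For readers unfamiliar with the "accidental coincidences" at issue in [3]: the standard estimate that gets subtracted is R_acc ≈ R1 · R2 · tau, for singles rates R1, R2 and coincidence window tau. A toy calculation (numbers illustrative, not taken from the cited experiments):

```python
# Textbook estimate of the accidental-coincidence rate for two detectors
# with singles rates R1, R2 and coincidence window tau. Numbers illustrative.
R1 = 50_000.0    # counts/s on detector 1
R2 = 50_000.0    # counts/s on detector 2
tau = 2e-9       # coincidence window, s

R_acc = R1 * R2 * tau
print(R_acc)     # 5.0 accidental coincidences per second
```

Whether subtracting this background is an innocent correction or an illegitimate "loophole" is precisely what the two camps disagree about; the Tittel et al. point is that their violation survived without the subtraction.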

Where are the rebuttals from the SED camp to those papers? Check any of Santos's or Marshall's published papers, and the citations to their papers that addressed many of the issues they brought up. So how come they did not address any of these? And I only did a very quick search on a few papers that I am aware of. The rest of you who, I presume, work in this field or are very much interested in it, should be sitting on a truckload of literature. So why were these types of papers, which have addressed such detector issues, WITHHELD from being listed here alongside those that were so quickly advertised?

There are more of these types of papers. This is why I find such omission here very disturbing. It somehow conveys that the issues brought up by SED are "unanswerable" and thus must be true. If you omitted such info on purpose, then shame on you. If you were simply ignorant of all of these large bodies of information, then what else have you missed that you SHOULD have known before pushing this thing onto us?

Zz.

[1] S. Massar et al., PRA 66, 052112 (2002).
[2] A. Cabello, PRA 72, 050101 (2005).
[3] W. Tittel et al., PRL 81, 3563 (1998).
 
  • #62
RandallB said:
Sorry QuantunEnigma I’m not buying it.
You just happen to join the forum here on very same day that Gordon finally has more than one pointless page on his web site, and the first post you made attempts to draw attention to that site.
If you’re not WM, you must be someone helping him – and no I’m not going to mention the name of the site here for you, I’ve seen nothing there worthy of sharing with anyone.

So just what is your point – are you looking to fill in the blanks in “W-Local” and “W-factoring” with something from DrC who does know something? Follow the path to his website info if you want to learn something worthwhile.

If you can not make your point short, direct and clear on your own website please listen to Zz and don’t waste our time with it here

As long as one person knows the secret web address then I have done an accidental good job.


I did not know it was illegal to give it out when I did.


The address had many pages long before I communicated here, but do not let facts curb your enthusiasms.


Was it Albert Einstein who said, "Rich thinkers always meet violent opposition from poor minds", please?
 
  • #63
Maaneli said:
I thought you said you were very familiar with Santos and Marshall's work? Marshall and Santos showed a long time ago that stochastic noise does in fact modify Malus's law, in a way that is still consistent with observation. Please read the abstract of this paper:

Moreover, SED is solid, peer-reviewed work, just as is Bohmian mechanics.

...That I would have to sharply disagree with. Bohmian mechanics proves the opposite of what you believe regarding the HUP or that observables don't have a definite real value before measurement.

Regards,
Maaneli

I really have to be amused at someone who touts both SED and BM in the same post.

In case it wasn't clear, I consider the idea that Malus's law is incorrect to be the death knell for any hypothesis. You may as well argue that c is really 5% higher than the usual published value and that the difference is noise. Apparently, experimental noise comes in only one kind: the kind that keeps an agenda alive.

BTW, if you think that I don't consider alternative theories and speculative hypotheses... you are completely wrong. In that regard, I am probably no different than anyone else, and I read plenty of unpublished articles. But call it what it is, and don't elevate it above proven, useful theory.
 
  • #64
DrChinese said:
1. Sure, I challenge the work of Santos... but more from a philosophical perspective rather than saying there is an error in it per se. Detector efficiency may be fundamental to Santos' position, but I don't think too many scientists will see it as such. I think his focus is much too narrow to gain any mainstream attention. And despite his best efforts, I do not see how a stochastic approach will ever work to accomplish his goal: a local realistic alternative to QM that respects Bell. But I could be wrong.

The point of SED is that the Bell violations in quantum theory are idealistic formal extrapolations which have no feasible counterpart in the empirical world. It took me some time to understand that too, but I think that SED sees quantum states that violate Bell as something like the complex analytic extension of the 1/r law: formally you can calculate it, but you'll never encounter a complex distance between two bodies.

I'll play the devil's advocate:

I already gave the example of "ideal" thermal engines. Imagine a proponent of thermodynamics (here, SED), who claims that you cannot violate the second law of thermodynamics (Bell's inequalities).
Now, Maxwell's Demon (Bell) comes along and proves the MD theorem in Formal Heat Engine Theory (QM): using ideal heat engines (100% efficient "photon detectors"), which convert heat into work perfectly, it is easy to show how to violate the second law. Here goes the proof:

The entropy change for a reversible heat exchange is given by T dS = dQ.

Now, consider a heat reservoir R1 at T1, and another R2 at T2>T1.

Take a heat engine E1, which takes an amount of heat dQ1 from R1 and converts it to work dW. With that work, drive a device E2 which dissipates it as heat into R2. Conservation of energy requires:

dQ1 = dW = dQ2 (1)

Now, the entropy change of the first heat reservoir is dS1 = - dQ1/T1, and that of R2 is dS2 = + dQ2/T2.

dS = dS1 + dS2 ; using (1), this gives:

dS = dW (1/T2 - 1/T1) = dW (T1 - T2) / (T1 T2)

Given that T2 > T1, we have that dS is negative: violation of the second law. Hence the theorem of MD tells us that in Formal Heat Engine Theory, the second law is violated.

There has been experimental evidence for this. Not that anyone has seen a RAW DATA violation of the second law, but that was due to the finite efficiency of the heat engines used (around 20-25% as of today, between heat reservoirs at 400 and 300 K).

It has been experimentally established that 1000 Joules were extracted at 300 K, and about 180 Joules were restored to the 400 K reservoir.
Now, if we correct for the 20% efficiency of our heat engine, that means that with a perfect engine, we would have 1000 Joules * 0.2 = 200 Joules (the rest was lost to inefficiency in the engine) extracted at 300 K, and 180 Joules that could ideally be restored to the 400 K reservoir.

dS1 = - 200 J / 300 K = -0.67 J/K
dS2 = + 180 J / 400 K = +0.45 J/K

dS1 + dS2 = -0.67 J/K + 0.45 J/K = -0.22 J/K

The experimental error is estimated at about 0.01 J/K, using the errors on the efficiency, the thermometers and the calorimeters, so this means that a violation of the second law of thermodynamics has been observed at more than 20 sigma.

(end devil's advocate).

You directly see the irony here. This is how SED proponents read the claims of Bell inequality violation using fair-sampling corrections.
The error in the above reasoning has of course been the hypothesis of ideal heat engines and the "correction" for the inefficiency of our experimental heat engine. SED proponents claim that ideal photodetectors with 100% efficiency fall in the same category.
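The bookkeeping of the mock analysis, including the illegitimate "correction" that manufactures the violation, can be checked in a few lines:

```python
# Entropy bookkeeping for the mock "corrected" heat-engine data above.
Q_extracted = 1000.0    # J, measured at the 300 K reservoir
Q_restored = 180.0      # J, measured at the 400 K reservoir
efficiency = 0.20

# The illegitimate step: rescale the raw data to an "ideal" 100% engine,
# the analogue of a fair-sampling correction for detector efficiency.
Q_corrected = Q_extracted * efficiency

dS_cold = -Q_corrected / 300.0   # J/K, entropy change of the cold reservoir
dS_hot = +Q_restored / 400.0     # J/K, entropy change of the hot reservoir
print(round(dS_cold + dS_hot, 2))  # -0.22 J/K: an apparent 2nd-law violation
```

With the raw, uncorrected data the numbers do not violate anything; the "violation" appears only after the efficiency rescaling, which is exactly the point of the parable.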

Given the nature of this forum, I think that is relevant.

Well, the "mainstream" clause is essentially there to keep out crackpottery. As long as it is about peer-reviewed published stuff (and, possibly, related non-peer-reviewed stuff), and the idea is not to call 90% of all working scientists misguided idiots, it is "mainstream enough".

Don't get me wrong, I'm (of course) not an avid SED proponent! But SED has scored some intriguing successes, which have a published record. Even if SED is ultimately wrong, those successes remain, and SED is a good "reality check" for finding out whether certain quantum claims are really so quantum. I think that is scientific enough to sanction discussion of it here - be it simply to clear up some errors in reasoning.


2. Parametric Down Conversion produces entangled photon pairs. With the evidence, how could you not believe this... unless, of course, you deny the existence of entanglement a priori. The only problem is that clearly, you can measure the difference between groups of entangled photon pairs vs. pairs in which there is no entanglement.
(And there are many different ways to entangle particles.) So the question is really: what do YOU call the photon pairs produced by PDC?

This is simply obtained by considering a non-linear dielectric in classical electromagnetism (and adding noise to every mode: that's the non-classical part of SED). Actually, the quantum description is derived from this (often used in practice) classical description of non-linear dielectrics: its classical series expansion in coupled modes gives you the photon couplings in the quantum version.
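Incidentally, the one constraint on PDC that both descriptions share, classical mode coupling or photon picture, is energy conservation between pump, signal and idler: omega_pump = omega_signal + omega_idler. A quick check for a degenerate pair (the 405 nm pump wavelength is illustrative):

```python
# Energy conservation in parametric down-conversion:
#   w_pump = w_signal + w_idler, i.e. 1/l_pump = 1/l_signal + 1/l_idler.
# Wavelengths illustrative; a 405 nm pump split degenerately.
lam_pump = 405e-9      # m
lam_signal = 810e-9    # m

lam_idler = 1.0 / (1.0 / lam_pump - 1.0 / lam_signal)
print(lam_idler)       # 810 nm: the degenerate partner wavelength
```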

In an SED description, an "entangled pair" simply comes down to EM pulses with or without a phase/polarisation relation, superposed on noise.

"Detection of photons" in SED is a stochastic process as a function of the incident intensity, and Santos's famous "subtraction" is the fact that the incident intensity of the pure noise modes is subtracted in a detector. Correlated pulses of light will hence give you correlations between detection events. The thing that generates a lot of funny ("quantum") effects in SED is that the noise modes are also present in the optical system (they are not independent, after-the-fact noises at the detector).

As such, you can understand that, for SED people, a "100% efficient photodetector" is nonsense, because each pulse will only have a finite probability (after subtraction) of producing a click. There's an upper limit to the probability of a click upon a pulse, and it corresponds to about the 87% needed in order to avoid Bell.
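For comparison with that figure: the ideal singlet correlation E(a,b) = -cos(a-b) gives S = 2*sqrt(2) at the standard CHSH angles, and the commonly cited symmetric-detector efficiency threshold for closing the detection loophole on CHSH is 2/(1+sqrt(2)) ≈ 82.8% (the ~87% quoted above presumably reflects the specific SED detection model rather than this generic bound):

```python
import math

# Ideal singlet correlation and CHSH value at the standard angles.
E = lambda a, b: -math.cos(a - b)
a, a2 = 0.0, math.pi / 2               # Alice's settings
b, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's settings

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)            # 2*sqrt(2) ~ 2.828, vs. the local-realist bound of 2

# Widely cited symmetric-detector efficiency threshold for CHSH
# (Garg-Mermin-type analysis of the detection loophole):
eta_min = 2 / (1 + math.sqrt(2))
print(eta_min)      # ~ 0.828
```

Below that efficiency, coincidence-only data can be reproduced by a local model without any fair-sampling assumption, which is the formal content of the SED-side complaint.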

Again, you don't have to buy this. But it points out the difference in approach, and why these people find each other idiots. SED people find QO people "idiots" because they use "corrections for over-unity devices", and QO people find SED people "idiots" because they don't accept trivial experimental corrections.

That said, I wonder indeed how SED talks itself out of GHZ experiments...
Haven't seen that yet.
 
  • #65
DrChinese said:
Welcome to PhysicsForums, QuantunEnigma. In the hopes that I am not being baited (as RandallB points out above):

A Local Realistic theory is a theory composed of the following ideas:

a. Locality: often considered the same thing as Lorentz invariance, it is essentially the idea that effects do not propagate faster than c.

b. Reality: In the words of Einstein, who was the ultimate local realist: "I think that a particle must have a separate reality independent of the measurements. That is: an electron has spin, location and so forth even when it is not being measured. I like to think that the moon is there even if I am not looking at it."

Bell discovered that QM leads to some theoretical predictions that are nonsensical (and violate b. above), such as negative probabilities. However, they are supported by experiment.

I hope this answers your question.

You were not baited, and a "mutual friend" liked your response also, he said. We had a discussion re naive realism and the problem that anyone would believe it. But your answer is not naive realism as we understand it, and so it is a good and helpful answer to build on.
 
  • #66
DrChinese said:
Bell discovered that QM leads to some theoretical predictions that are nonsensical (and violate b. above), such as negative probabilities. However, they are supported by experiment.

I hope this answers your question.

I think this cannot be correct. Please which experiments support negative probabilities?
 
  • #67
vanesch said:
Ugh, you can't say that either! The quantum formalism is a vastly successful formalism, if you read it in the following way: imagine a professor telling his students: "ok, today I'm going to introduce you to something called "quantum mechanics". First of all, it is - as is any new scientific theory - very strange; some say, incomprehensible; but I'll show you how you have to use it, how to make calculations, and I can tell you that people have done so, and have always found agreement with all non-gravitational experiments (even the effect of a fixed gravitational potential can be taken into account, which in some simple cases is an exception to that limitation). This is the main reason why you should learn it. Don't ask me what it "means". Just learn how to do the calculations. That's the quantum formalism [...] "

I would have no objection to this. The problem is that Copenhagen quantum theory is not the same thing as the quantum formalism. If students were just told "there is this formalism that allows us to calculate the probabilities for various things, but we really don't have a theory yet, i.e., nobody knows what the heck is going on to give rise to these various outcomes" that would be fine, and then maybe some of the brighter students could work on trying to develop a theory. The problem is, it isn't presented this way (because the "founders" of Copenhagen didn't think of it this way, and Copenhagen has basically been accepted as orthodoxy). It's presented as: "we have the theory, it's all worked out, we have a complete description of what's going on physically to give rise to these measurement outcomes, and the theory is: you shouldn't talk about such things, or if you do you better limit your talk to the wave function only and not ever mention that there might be a more detailed level of description that actually makes sense of some things, oh and by the way even though the wave function alone provides a complete description of physical states you shouldn't think of the wave function as describing anything physically real [?!??], it's only about our knowledge, oh and also don't worry too much about the fact that the time evolution of the wave function is different depending on whether or not someone is looking -- sure, the wf provides a complete description of physical states, but when we use that second time evolution equation we'll just switch over to thinking of the wf as only representing our knowledge so as to avoid the implication that our mere act of looking changes the physical dynamics... of course, then again, it's pretty cool that the mere act of looking changes the physical dynamics, yeah, that totally sticks it to those jerk classical physicists who believed in an objective external reality that did its thing independent of human consciousness... etc"

This is a bad theory.


The problem is, not many people (especially students) are interested in "learning to do calculations of outcomes of experiments". People want philosophy, or better yet, they want revelation.

No, people want *physics*. At least, reasonable physicists do. Look at what your view implies: Ptolemaic and Copernican models of the solar system are really the same thing, and it's merely a "philosophy" question (or something that isn't scientific, can only be answered by "revelation") which one is "really true". Or: is matter made of atoms? On your view (apparently) there is no such question, at least not as a scientific question. Sure, maybe philosophers or religious zealots could ask such a question, but good scientists know that as long as you've got some magic equations to tell you what the temperature is (or whatever) that's as far as science can go. Well I say: one look at the history of science should demonstrate immediately and conclusively that this is not as far as science can go. And anybody who says that this no longer applies in the quantum realm is trapped in a circular argument: Copenhagen is true because Copenhagen is true.


They want to know what it actually means, and to what great secrets of nature they will be introduced. They want to know the truth, they want to know "what really happens", not simply some calculational rules.

Yup. I agree completely -- assuming "they" is transposed to refer to "good scientists" rather than "philosophical/religious nuts" or whatever you had in mind...


Well, it's a lesson in philosophy we receive from modern physics, that as of now (and probably for a long time to come), we won't know the "truth".

And therefore we never will and therefore we should stop thinking about it and trying to discover it? How, in retrospect, would we judge someone who said that about astronomy in 1400 or about the basic nature of matter in 1700?



Personally, I need a story too, and that's why I apply to quantum theory exactly the same kind of reasoning as to all others: take the elements of your formalism as "reality". If Maxwell has fields, take them as real. If Newton has forces, take them as real. If GR has a 4-dim manifold, take it as real. Well, if quantum theory has a unitary structure, take it as real. You then end up in MWI (that's why I consider myself, in as far as I'm thinking about a quantum toy world, an MWI-er), and many people don't like that because it looks so totally different from what we thought the world was like when we were kids. But it is no more or no less real than all that other stuff: it is real in the *toy world* that you mentally set up in order for you to picture the formalism.

This is all beside the point. Here's the real issue: is there, or is there not, a single real world "out there" independent of us? If there is, then one and only one of the various possible "toy worlds" (i.e., theories) will correspond to the real thing. That is the true theory. It is of course true that there can be underdetermination, i.e., different theories which make the same sets of empirical predictions in some (say, present) context of knowledge. That just means it isn't yet clear which theory is true. But you seem to want to leap from this to a conclusion like "we can therefore never know which theory is true, and therefore we should quit thinking about it, quit worrying about which one is true, perhaps quit thinking that there is a real world out there at all." Well I say that's just crazy! It would have been the end of science if such a view had been accepted in the past, and nothing has changed.

BTW, to distance myself from one of the strawmen you attack, this does not mean that we must dogmatically latch onto some one theory today in the absence of sufficient evidence distancing it from the alternatives and securing its relation to the facts. If the evidence doesn't yet prove one theory right as against its competitors, then it would be irrational to believe that any one theory is definitely right. But postponing judgment until the evidence is in (and maybe this'll take a million years, who knows) is not the same as giving up entirely on the concept of there being a truth of the matter. By the way, the fact that there can still be distinct, open, viable theories at some stage in history (e.g., today) does not mean that "anything goes" and that we should accept something "unprofessionally vague and ambiguous" such as Copenhagen as also viable. Just because we don't yet know what's true doesn't mean we can't identify crap when we see it.


SED has simply *another* toy world, and Bohmian mechanics yet another toy world. SED, because it has a totally different formalism (classical fields with noise terms) ; Bohm because it takes over the formalism of (unitary) quantum theory, and adds an extra formal element to it similar to the Newtonian formalism: particles and forces.

Sure, Bohm adds something to the "wave function only" description of orthodox QM. This is the basis for all of the bogus charges that it is unnecessarily cumbersome, that it should be dismissed by Occam's razor, that it is just OQM plus some arbitrary metaphysics, etc. But the real truth is that Bohm also *subtracts* a lot of junk that is present in OQM, namely the various measurement axioms. In Bohm's theory no such axioms are needed because the "toy universe" described by that theory makes no dynamical distinction between "measurement" and "non-measurement". There is just one kind of dynamics, and it applies all the time, whether a "measurement" is happening or not -- so all of the formal rules about measurement that are "axioms" from the POV of OQM, are theorems -- implications of the basic postulates of the theory -- in Bohmian Mechanics. This is really beside the current point, but it's worth noting since so many people fail to understand this.




So it is a bit strange that a Bohmian would find the quantum formalism a "bad theory", because he includes it.

It's not the mere formalism which is a "bad theory". Copenhagen (with all its extra-formal principles, such as "completeness" - there is nothing beyond the HUP, as DrChinese spouts endlessly - and also its interpretation of some of the formal rules, namely the measurement postulates) is the bad theory.


He only added an extra machinery for the simple sake of being able to construct a different toy world which is closer to his intuitive desires.

That is completely false and unfair. The main benefit of Bohm is simply that he provides a clear, consistent theory whose postulates are 100% absolutely clear. It is, as one commentator put it, a real "physicists' theory" as contrasted with Copenhagen and its vague muddled confusions about "completeness" and anti-realism and collapse and whatnot. That Bohm also provides a simple, intuitive physical picture of quantum processes is gravy.


However, SED is entirely different.

Suppose, just for the sake of argument (and I will eat my shoes if this turns out to be true), SED was someday proved to make all the same predictions as QED. Then would it, or wouldn't it, be "entirely different"? That is, would it then, like Bohm, be (according to you) just another philosophical/metaphysical/religious/bu**sh** story to append to what's really scientific (viz, the quantum formalism)? Or would it be a genuinely different theory? Or what? I say they're all "entirely different" theories. SED, orthodox QM or QED or whatever, Bohmian Mechanics, GRW, etc. are all completely different theories. At most one of them is true because they say wildly different things about the physical world. You seem to want to equate a theory with its empirical predictions, which (I think we will agree, probably) makes QED and SED distinct theories, but renders OQM and Bohm and GRW and ... all just so many different "bedtime stories" associated with the same one physical theory. Unfortunately, in addition to being based on a crazy anti-realist premise, this way of classifying things also renders Ptolemy and Copernicus "the same theory". So much for the Copernican revolution and everything it led to in astronomy and physics...




So there are two entirely different discussions here. One is pseudo-philosophical, and concerns personal preferences for toy worlds. It is only pseudo-philosophical, because the true philosophical attitude is to say that, unfortunately, we don't know what is true, and we're limited to guessing.

That's a false dichotomy. Sure, maybe we have to guess today. But the real issue is: should we, or shouldn't we, be doing things in science that will result in us *not* just having to guess *tomorrow*? That is, should we or shouldn't we take seriously the idea that one and only one of these theories is true, and get on with the task of trying to understand and generalize them all for the sake of eventually finding out which one really is true? That's what I took you to be saying before about SED, which is what I agreed with. To whatever extent SED is able to explain many or all of the various observations that are normally cited as proof of some quantum theory (and I don't know enough to be anything but mildly skeptical that "all" could really be the case), belief in the quantum theory was premature. Scientifically, we can't be sure today; we really are "limited to guessing", which might be right. And that means the proper scientific attitude is to keep working to understand how both theories work, how exactly they manage to predict the same things even though they are so different, and hopefully use that knowledge to find some areas where they predict *different* things, so that (in some "tomorrow") we can resolve the question empirically, scientifically. It seems you apply different standards when it comes to (say) OQM vs Bohm vs GRW, and I don't understand why. It's exactly the same issue.


This then leads to religious brotherhood attitudes where the Good (us) fights the Evil (them).

Oh please. So the idea that there is a real world out there, and that it is the task of science to figure out what it's like, somehow leads to religious jihads? Is that supposed to be an argument against scientific realism or something? I think you'll find, if you look at history, that a generally scientific attitude (based on realism) correlates rather negatively with the instigation of religious jihads.



In conclusion: I think it is wrong to say that quantum theory is a "bad theory". It works marvelously.

So did Ptolemaic astronomy. But anyway, you're just equivocating again here between "quantum theory" meaning merely the quantum formalism, and its meaning the actual *theory* (Copenhagen, or whatever) that is presented in texts and believed by most people. It's the latter that's bad, not the former. (But since you seem, for some strange philosophical reason, to reject any distinction between mere formalism and theory, maybe I'm wrong to say that this is a mere equivocation on your part -- there really is some kind of substantive philosophical disagreement between us here; it's not just a minor logical error on your part.)
 
  • #68
vanesch said:
I'll play the devil's advocate: [snip]



Vanesch, that is a brilliant example. :!)
 
  • #69
QuantunEnigma said:
As long as one person knows the secret web address then I have done an accidental good job.

The address had many pages long before i communicated here but do not let facts kerb your enthusiasms.

Was it Albert Einstein said, "Rich thinkers always meet violent opposition from poor minds" please?
Gordon
(Or as you call yourself: MW, Mostly Wrong, QuantunEnigma, any more?)

Oh please, you think you're a "Rich Thinker"?
In post #66 you ask DrC for a negative probability example when he has already given you just that in his link that you quoted back to him!
In post #66 you ask DrC for a negative probability example when he has already given you just that in his link that you quoted back to him!

On Aug 20 your site had only one working page, with dozens of links marked "THIS PAGE DOWN / BEING WORKED ON". That hardly counts as "many pages".

On Aug 21 you put up about a dozen more pages, but any chance of you documenting your claim that "Bell Logic is false" in your explanations of your versions of 'locality' and 'factoring' is still buried behind "This Page Down" links.
Then on the 22nd rather than use your MW id you created the extra QuantunEnigma id to lure DrC and others to your site.

That’s not rich thinking – that’s baiting and dishonest.
You owe both DrC and Zz an apology.
 
  • #70
To Zz's point: Call it what it is... SED is a speculative work-in-progress that has yet to yield a single useful discovery. In the meantime:

Experimental violation of a Bell's inequality with efficient detection

M. A. ROWE, D. KIELPINSKI, V. MEYER, C. A. SACKETT, W. M. ITANO, C. MONROE & D. J. WINELAND (Nature, 2001)

Abstract:
Local realism is the idea that objects have definite properties whether or not they are measured, and that measurements of these properties are not affected by events taking place sufficiently far away. Einstein, Podolsky and Rosen used these reasonable assumptions to conclude that quantum mechanics is incomplete. Starting in 1965, Bell and others constructed mathematical inequalities whereby experimental tests could distinguish between quantum mechanics and local realistic theories. Many experiments have since been done that are consistent with quantum mechanics and inconsistent with local realism. But these conclusions remain the subject of considerable interest and debate, and experiments are still being refined to overcome 'loopholes' that might allow a local realistic interpretation. Here we have measured correlations in the classical properties of massive entangled particles (9Be+ ions): these correlations violate a form of Bell's inequality. Our measured value of the appropriate Bell's 'signal' is 2.25 +/- 0.03, whereas a value of 2 is the maximum allowed by local realistic theories of nature. In contrast to previous measurements with massive particles, this violation of Bell's inequality was obtained by use of a complete set of measurements. Moreover, the high detection efficiency of our apparatus eliminates the so-called 'detection' loophole.

To Vanesch's analogy (with thermodynamics): We are being asked to accept that "noise" accounts for violation of Bell Inequalities. Yet regardless of detector efficiency, the results are the same! Aspect's early, inefficient tests yielded almost precisely the same results (as compared to the predictions of QM) as the later, more refined tests. Where is the movement towards the SED-predicted values you would expect as visibility increases?

And to anyone who is actually in SED's court: If Malus' Law does not hold, please answer the following question: what are the values for the coincidence rate at 0, 22.5 and 45 degrees (as compared to the cos^2 function from both classical optics and QM)? A specific value, so we have something to discuss... after all, it can't match QM without running afoul of Bell...
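As a numeric reference point for this question (my own sketch, not a claim about what stochastic optics predicts), here are the QM/Malus cos^2 values at those three angles, plus the CHSH 'signal' QM predicts at the standard analyzer settings, to set against the local-realist bound of 2 and Rowe et al.'s measured 2.25:

```python
import math

def coincidence_qm(theta_deg):
    """Normalized coincidence rate predicted by QM (and classical
    optics) for polarizers at relative angle theta: Malus-law cos^2."""
    return math.cos(math.radians(theta_deg)) ** 2

# The three angles asked about above:
for theta in (0, 22.5, 45):
    print(theta, round(coincidence_qm(theta), 4))
# cos^2(0) = 1.0, cos^2(22.5) ~ 0.8536, cos^2(45) = 0.5

def E(theta_deg):
    """QM correlation E = cos(2*theta) for polarization-entangled photons."""
    return math.cos(2 * math.radians(theta_deg))

# CHSH 'signal' S = |E(a-b) - E(a-b') + E(a'-b) + E(a'-b')|
# at the standard settings a = 0, a' = 45, b = 22.5, b' = 67.5 degrees:
S = abs(E(0 - 22.5) - E(0 - 67.5) + E(45 - 22.5) + E(45 - 67.5))
print(round(S, 4))  # 2*sqrt(2) ~ 2.8284, above the local-realist bound of 2
```

Any specific SO answer would have to reproduce (or measurably deviate from) exactly these numbers, modulo the detector-efficiency caveats argued over in this thread.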
 
Last edited:
  • #71
ttn said:
Vanesch, that is a brilliant example. :!)

Talk about both sides of the fence... :smile:
 
  • #72
ZapperZ said:
There is something highly unethical about talking about detection loopholes or detector efficiency while ignoring some very fundamental aspects of, and responses to, such things.

Unlike most of you, *I* have been involved in actual measurement of such things since the start of my graduate school years, and for about the last 1 1/2 years I have been making high-QE photocathodes. So I can talk about background noise, dark current, detector signal, blah blah blah till everyone turns blue. Trying to distinguish between what is "noise" and what is "signal" is a HUGE part of my work. If you look at the raw data from photoemission spectroscopy, for example (i.e. if you make cuts in the data in my avatar), you will see background noise, detector noise, dark currents, etc... Yet, according to SED (and Santos), these "random" background noises can somehow mimic an "actual signal"! NO KIDDING!

How convenient can that be when you can simply stick something in ad hoc, and voila, you can mimic the actual signal simply by burying something in the detector noise. Or did we forget that SED comes with its own set of assumptions about the nature of such background fluctuations? And unlike QM, many of these "assumptions" have not even been tested at the most fundamental level to even see if they are consistent with observation.

Zapper, the stochastic noise isn't really ad hoc in SED. It is actually derived from the necessary equilibrium condition between fluctuating charges and the fluctuating radiation emitted by those charges. In fact, the ZP radiation interacting with these charges is equivalent to the radiation reaction of the classical charges. This is called the fluctuation-dissipation theorem. And the radiation emitted is indeed randomly phased and polarized. For optics, this radiation constitutes a background noise with real, physical effects, because the zero-point fields are real. Stochastic optics approximates the ZP radiation as a stochastic noise term, with the E and B fields of the noise fluctuating to an average intensity scaled by hf/2. The physical ideas behind SED are discussed concisely on page 3 of this review article by Cole:
http://www.bu.edu/simulation/publica...enCole2005.pdf
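The hf/2-per-mode zero-point background is easy to illustrate numerically. This is just Planck's radiation law with the zero-point term included (a standard formula; nothing here is specific to Cole's article or to the SED derivation):

```python
import math

h = 6.62607015e-34   # Planck constant, J s
k = 1.380649e-23     # Boltzmann constant, J/K

def mean_energy_per_mode(f, T):
    """Planck's 'second' radiation law: average energy of a radiation
    mode of frequency f at temperature T, i.e. the usual thermal term
    plus the hf/2 zero-point term that SED treats as a real,
    fluctuating background."""
    zero_point = 0.5 * h * f
    if T == 0:
        return zero_point
    x = h * f / (k * T)
    return zero_point + h * f / math.expm1(x)

f = 5e14  # an optical frequency, Hz
print(mean_energy_per_mode(f, 0))    # exactly hf/2: the zero-point term survives at T = 0
print(mean_energy_per_mode(f, 300))  # at 300 K the thermal part is utterly negligible at optical f
```

At optical frequencies and room temperature the thermal term is exponentially suppressed, so the zero-point part dominates completely, which is why SED treats it as the relevant background for optical experiments.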

And the method by which the stochastic noise is deduced from the Wigner function is found in this article which I already cited:

"What is entanglement?" Emilio Santos.
I conjecture that only those states of light whose Wigner function is positive are real states, and give arguments suggesting that this is not a serious restriction. Hence it follows that the Wigner formalism in quantum optics is capable of interpretation as a classical wave field with the addition of a zeropoint contribution. Thus entanglement between pairs of photons with a common origin occurs because the two light signals have amplitudes and phases, both below and above the zeropoint intensity level, which are correlated with each other.
http://arxiv.org/abs/quant-ph/0204020

But even if you still considered this inadequate, you cannot dismiss the physical consequences of classical fields + stochastic noise. Like Vanesch is emphasizing, it is certainly not just a coincidence that these fundamental tenets of SED explain all the physical phenomena which were the justification for abandoning classical electrodynamics and optics and accepting QM and quantum optics.



ZapperZ said:
Photodetector performance is such a crucial issue, and has been studied so extensively, it is not even funny. Yet I have seen no actual study done to see how well detector performance actually matches any of SED's assumptions. If we can verify everything from the Fowler-Nordheim law at finite temperatures to the Richardson-Dushman relations for photocathodes, how come this void for SED remains? One would think this is one very fundamental aspect of the verification of SED to be taken seriously. Or maybe it is because it is not falsifiable?

However, the most disturbing and unethical aspect of this discussion is the complete absence of citations to the TONS of issues that have already been addressed regarding detection efficiency. All I see are references to various detection issues that somehow support SED's point of view on the Bell-type experiments (while ignoring the more stringent CHSH-type experiments). Nowhere was there any mention, by the so-called experts or students in SED, of papers such as those by S. Massar et al.[1] or A. Cabello[2], which have either formulated a Bell-type inequality that is insensitive to detector inefficiency, or shown that one can already distinguish between the quantum optics prediction and the classical one with just a detector at 69% efficiency (which we already have!). Or what about the Tittel et al.[3] experiment that analyzed its data without subtracting any accidental coincidences (something that many have claimed would reveal "non-quantum" results)?

Where are the rebuttals from the SED camp to those papers? Check any of Santos or Marshall's published papers and citations to their papers that addressed many of the issues that they brought up. So how come they did not address any of these? And I only did a very quick search on a few papers that I am aware of. The rest of you who, I presumed, work in this field or very much interested in it, should have a truckload of literature that you are sitting on. So why were these types of papers that have addressed such detector issues WITHHELD from being listed here alongside those that were so quickly advertized?

There are more papers of this type. This is why I find such an omission here very disturbing. It somehow conveys that the issues brought up by SED are "unanswerable" and thus must be true. If you omitted such info on purpose, then shame on you. If you simply were ignorant of all of these large bodies of information, then what else have you missed that you SHOULD have known before pushing this thing onto us?

Zz.

[1]S. Massar et al. PRA 66, 052112 (2002).
[2] A. Cabello PRA 72, 050101 (2005).
[3] W. Tittle et al. PRL 81, 3563 (1998).

As has already been mentioned, in SED there are some preliminary ideas for theories of detection and how detectors work. I also already cited these papers by Santos. But keep in mind that, though SED has not completely addressed detection theory, it must be qualified that it was not until 2002 that the stochastic optics formalism was completed to account for all the optical tests of Bell's theorem. Moreover, Marshall has since retired from the field due to his health, making Santos essentially the only one actively working in the field. And both men are in their early 70's. So progress along the lines of formulating a theory of detection will understandably be very slow until there is young blood in this field.

Regarding those detection efficiency papers you mentioned: while I was unaware of the papers you cited, the test requiring the 69% efficiency is not relevant to stochastic optics; to refute SO requires detection efficiencies >87%. Moreover, that paper does not reference stochastic optics at all. I don't know the details of Tittel's paper. Massar's proposal may have been misunderstood by you. I just did an arxiv search under his name and found an even more recent paper than the one you cite, in which he says the following:

Violation of local realism vs detection efficiency (2003) Serge Massar, Stefano Pironio
"We put bounds on the minimum detection efficiency necessary to violate local realism in Bell experiments. These bounds depends of simple parameters like the number of measurement settings or the dimensionality of the entangled quantum state. We derive them by constructing explicit local-hidden variable models which reproduce the quantum correlations for sufficiently small detectors efficiency."
http://lanl.arxiv.org/abs/quant-ph/0210103

So these detector-efficiency bounds apply to the particular LHV models that Massar et al. constructed. They do not make any reference to any of Santos or Marshall's work. Moreover, the LHV model Massar constructs is clearly not the same as Santos and Marshall's. So we don't know how, or even if, these detector-efficiency bounds apply to stochastic optics.
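For reference, the often-quoted symmetric-detection-efficiency thresholds for a loophole-free CHSH test can be computed directly. These are general textbook bounds against any local hidden-variable model (the Garg-Mermin and Eberhard results), not the model-specific 87% stochastic-optics figure or the 69% figure from the papers under discussion:

```python
import math

# Symmetric detection-efficiency thresholds for a loophole-free
# CHSH test, valid against ANY local hidden-variable model:
eta_garg_mermin = 2 / (1 + math.sqrt(2))  # maximally entangled pair: ~82.8%
eta_eberhard = 2 / 3                      # non-maximally entangled limit: ~66.7%

print(round(eta_garg_mermin, 4))  # 0.8284
print(round(eta_eberhard, 4))     # 0.6667
```

Below these efficiencies one can always construct some LHV model reproducing the observed coincidences, which is exactly why the various papers cited in this thread put so much weight on the precise efficiency figure.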

Furthermore, I think all those papers are also undercut by the fact that in his 2006 FQX grant abstract, Kwiat still acknowledges that in the proposed "loop-hole free" test, "The greatest technical challenge for this experiment will be closing the detection loophole by exceeding the approximately 70% limit for total photon-detection efficiency".

It's important to understand that papers in this field of semiclassical approaches to optics and QM are fairly scattered. It is very difficult to be an expert in the work of Santos and Marshall, whose publications span 15 years, and also to be aware of every possible spin-off or criticism of their work since then. So no one - at least not me - is withholding information. Not being aware of every single publication in this field, or not being able to answer every single criticism posed, does not at all imply that SED can't answer these questions. It just hasn't been given the time or manpower to address them.
 
Last edited by a moderator:
  • #73
***
Experimental violation of a Bell's inequality with efficient detection

M. A. ROWE, D. KIELPINSKI, V. MEYER, C. A. SACKETT, W. M. ITANO, C. MONROE & D. J. WINELAND (Nature, 2001)

Abstract: [snip]
***

I thought that one did not close the locality loophole (the two detectors were too close to each other). Anyway, both loopholes seem to be interconnected. But I am still waiting for your response about how efficient photon and electron detectors work (I am just curious). Perhaps ZapperZ can provide an explanation for us for, say, an electron detector (I promise not to turn blue)?!

Careful
 
  • #74
DrChinese said:
To Zz's point: Call it for what it is... SED is a speculative work-in-progress that has yet to yield a single useful discovery. In the meantime:

Experimental violation of a Bell's inequality with efficient detection

M. A. ROWE, D. KIELPINSKI, V. MEYER, C. A. SACKETT, W. M. ITANO, C. MONROE & D. J. WINELAND (Nature, 2001)

Abstract: [snip]

To Vanesch's analogy (with thermodynamics): We are being asked to accept that "noise" accounts for violation of Bell Inequalities. Yet regardless of detector efficiency, the results are the same! Aspect's early inefficient tests yield almost precisely the same results as the later, more refined tests (as compared to the predictions of QM). Where is the movement towards the SED predicted values you might expect when visibility increases?

And to anyone who is actually in SED's court: If Malus' Law does not hold, please answer the following question: What are the values for the coincidence rate at 0, 22.5 and 45 degress (as compared to the cos^2 function from both classical optics and QM)? A specific value, so we have something to discuss... after all, it can't match QM without running afoul of Bell...

DrChinese, I recall that the answer to your questions about Malus' Law are given in the paper by Marshall and Santos that I asked you to read:

Stochastic optics: A local realistic analysis of optical tests of Bell inequalities
http://prola.aps.org/abstract/PRA/v39/i12/p6271_1

I don't have full access to PROLA from my computer, but hopefully you do. The answers are in there.

Also, if stochastic optics is fundamentally correct, then it is not at all surprising that Aspect's experimental results are the same as the current ones. I don't see the beef here.

Moreover, the paper you cited does not apply to SO; as I already mentioned, the theory has not yet been applied to electrons, let alone ions. But even then, you should know that the paper you cite is also no longer considered a valid test of local realism, because the two subsystems (the two ions) were not really separated during measurement, and so the test cannot be considered a real implementation of a detection-loophole-free test of Bell inequalities, even if it represents relevant progress in this direction. And this is a conclusion confirmed in personal correspondence with Kwiat and Genovese.

~M
 
  • #75
Maaneli said:
1. Moreover, the paper you cited does not apply to SO; as I already mentioned, the theory has not yet been applied to electrons, let alone ions.

2. But even then, you should know that the paper you cite is also no longer considered a valid test of local realism, because the two subsystems (the two ions) were not really separated during measurement, and so the test cannot be considered a real implementation of a detection-loophole-free test of Bell inequalities, even if it represents relevant progress in this direction. And this is a conclusion confirmed in personal correspondence with Kwiat and Genovese.

1. Glad to know that only photons are local realistic. :smile:

2. This is a valid experiment which closes the detection loophole. They do not claim that strict Einsteinian locality is maintained. If you already believe in local reality, then requiring spacelike separation isn't relevant to the detection issue. (Think about it...)
 
  • #76
Careful said:
1. I thought that one did not close the locality loophole (the two detectors were too close to each other).

2. Perhaps Zapperz can provide an explanation for us for say an electron detector (I promise not to turn blue) ?!

Careful

1. Why would a local realist raise this objection? I can see why ttn might, but that doesn't quite make sense from you.

2. Why do you keep raising this issue? If you have a point to make, say it! You can google this info as fast as anyone.
 
  • #77
Maaneli said:
Zapper, the stochastic noise isn't really ad-hoc in SED.

"What is entanglement?" Emilio Santos.
I conjecture that only those states of light whose Wigner function is positive are real states, and give arguments suggesting that this is not a serious restriction.

How can you say these are not ad hoc, and then quote what is clearly a "shove-in-by-hand" conjecture?

But even if you still considered this inadequate, you cannot dismiss the physical consequences of classical fields + stochastic noise. Like Vanesch is emphasizing, it is certainly not just a coincidence that these fundamental tenets of SED explain all the physical phenomena which were the justification for abandoning classical electrodynamics and optics and accepting QM and quantum optics.

But you're willing to somehow consider it purely a coincidence that the inequalities are violated in ALL of the Bell-type, CHSH-type, and GHZ-type experiments.

As has already been mentioned, in SED there are some preliminary ideas for theories of detection and how detectors work. I also already cited these papers by Santos. But keep in mind that though SED has not completely addressed detection theory, it must be qualified that it was only until 2002 that the stochastic optics formalism was completed to account for all the optical tests of Bell's theorem. Moreover, Marshall has since retired from the field due to his health, thus making Santos essentially the only one actively working in the field. And both men are in their early 70's. So progress along the lines of formulating a theory of detection will understandably be very slow, until there is young blood in this field.

Then people who are trumpeting its horn are putting the cart WAY before the horse, like maybe in the next county. SED has a serious shortcoming in verifying the most fundamental aspect of what it is predicting, at the detector level. Those of us who depend on a detector detecting VALID signals are amazed that one could get something that is nothing more than background noise to mimic a real signal. If you truly think that is true, then stop using your modern electronics, because the band structures of many semiconductors that we verified using photoemission are obviously WRONG.

This is why I asked for SED to spew out an agreement with ARPES, etc. If you cannot produce something that is the CLOSEST to the fundamental level of your theory, how do you know that everything you build on top of it is valid?

Regarding those detection efficiency papers you mentioned, while I was unaware of the papers you cited, the test requiring the 69% efficiency is not relevant to stochastic optics. To refute SO requires detection efficiencies >87%. Moreover, that paper does not reference stochastic optics at all. I don't know the details of Tittel's paper.Massar's proposal may have been misunderstood by you. I just did an arxiv search under his name and found an even more recent paper than the one you cite, in which he says the following:

Violation of local realism vs detection efficiency (2003) Serge Massar, Stefano Pironio
"We put bounds on the minimum detection efficiency necessary to violate local realism in Bell experiments. These bounds depends of simple parameters like the number of measurement settings or the dimensionality of the entangled quantum state. We derive them by constructing explicit local-hidden variable models which reproduce the quantum correlations for sufficiently small detectors efficiency."
http://lanl.arxiv.org/abs/quant-ph/0210103

So, these resistances to detector efficiencies put bounds on the validity of the LHV's that Massar et al. constructed. They do not make any reference to any of Santos or Marshall's work.

Look at the paper *I* cited. They cited Santos's work. I also want to let you know that I would rather be given the published version of a paper, and not the arxiv reference - unless of course you don't want me to do a citation index search on these things.

It's important to understand that papers in this field of semiclassical approaches to optics and QM are fairly scattered. It is very difficult to be an expert in the work of Santos and Marshall, whose publications span 15 years, and to also be aware of every possible spin-off or criticism of their work since then. So no one - at least not me - is withholding information. Not being aware of every single publication ever in this field, or not being able to answer every single criticism posed, does not at all imply that SED can't answer these questions. It just hasn't been given the time or man power to address them.

I strongly disagree. I came from the high-Tc superconductivity field, in which the number of papers published PER WEEK exceeds that of many fields in a year! And I don't want to put this field down, but I bet you the frequency of papers published in this field is minuscule compared with condensed matter. So your excuse of not knowing of the existence of such-and-such a paper rings hollow in my book, especially given that you earlier pleaded ignorance of the GHZ experiments. All I did today was a quick citation index search on one of Santos's papers, and out came several other citations. If someone who isn't an "expert" in this field and has only followed it closely as an interested party can obtain such references, how difficult can this be?

Zz
 
  • #78
DrChinese said:
1. Glad to know that only photons are local realistic.

2. This is a valid experiment which closes the detection loophole. They do not claim that strict einsteinian locality is maintained. If you already believe in local reality, then requiring spacelike separation isn't relevant to the detection issue. (Think about it...)

To the first point: haha, you're so funny. Again, just wait until SO can be extended to fermions. You can't say only photons are local realistic just because the formalism has not been applied to massive particles; you simply have to be agnostic about the latter.

To the second point, you can't be serious :rolleyes:. If the ions are not separate systems, then it is even easier to suggest local causal influences between the ions. I agree that complete spacelike separation isn't relevant to confirming local reality - which is how a local realist would interpret all these experiments, since they still agree with certain LHV models. However, it is obviously a necessary (though not sufficient) condition for testing the existence of nonlocal reality, if it really does exist.

~M
 
  • #79
ZapperZ said:
How can you say these are not ad hoc, and then quote what is clearly a "shove-in-by-hand" conjecture?

The ZPE in SED is not ad hoc, as I have already shown. The stochastic-optical approximation of the ZP radiation in SED as noise may be considered somewhat ad hoc. But there are degrees of arbitrariness in a theory. This is clearly discussed in the review article I cited, and in this earlier paper:

Stochastic optics: A local realistic analysis of optical tests of Bell inequalities
http://prola.aps.org/abstract/PRA/v39/i12/p6271_1

What Santos does in his 'Entanglement' paper is first give various arguments for why all real states of light have a positive Wigner function. He then shows trivially how one can interpret that function as an actual probability distribution of the amplitudes of the radiation modes. Thus quantum optics becomes a disguised stochastic theory, in which the states of light are probability distributions defined on the set of possible realizations of the electromagnetic field. He then goes on to say,

"We propose the name stochastic optics for the stochastic interpretation of quantum optics derived from the Wigner function. From another point of view, the stochastic interpretation provides an explicit hidden variables theory where the amplitudes of the electromagnetic field are the "hidden" variables! The most dramatic consequence of stochastic optics is that the vacuum is no longer empty, but filled with a random electromagnetic radiation having an energy hf/2 per radiation mode, on the average, as is shown in eq. (5). That radiation corresponds precisely to the additional term introduced by Max Planck in his second radiation law (see e.g. ). The picture that emerges is that space contains a random background of electromagnetic waves providing what we shall call a zeropoint field (ZPF)."

To me this is not very ad hoc. Starting with this defensible argument that all real states of light have a positive Wigner function, everything else follows. Of course, don't take my word for it. Read the paper.

http://arxiv.org/abs/quant-ph/0204020
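Santos's positivity criterion is easy to probe numerically. As a minimal sketch (standard quantum optics conventions with hbar = 1, not Santos's own notation), the Wigner function of an n-photon Fock state is positive everywhere for the vacuum, but negative at the origin for a single photon, which is precisely the kind of state his conjecture would exclude as a "real" state of light:

```python
import math

def wigner_fock(n, x, p):
    """Wigner function of the n-photon Fock state (hbar = 1 convention):
    W_n(x, p) = ((-1)^n / pi) * exp(-(x^2 + p^2)) * L_n(2*(x^2 + p^2))."""
    s = 2 * (x**2 + p**2)
    # Laguerre polynomial L_n(s) via the three-term recurrence
    # (k+1) L_{k+1}(s) = (2k + 1 - s) L_k(s) - k L_{k-1}(s)
    L_prev, L = 1.0, 1.0 - s          # L_0 and L_1
    if n == 0:
        Ln = L_prev
    elif n == 1:
        Ln = L
    else:
        for k in range(1, n):
            L_prev, L = L, ((2 * k + 1 - s) * L - k * L_prev) / (k + 1)
        Ln = L
    return ((-1) ** n / math.pi) * math.exp(-(x**2 + p**2)) * Ln

print(wigner_fock(0, 0, 0))   # vacuum: +1/pi, positive everywhere
print(wigner_fock(1, 0, 0))   # one-photon state: -1/pi, negative at the origin
```

So the conjecture that only positive-Wigner states are "real" amounts to denying physical reality to single-photon (and other odd-Fock) states as such, which is exactly what makes it a substantive, testable restriction rather than a harmless relabeling.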
ZapperZ said:
But you're willing to somehow consider it purely a coincidence that the inequalities are violated in ALL of the Bell-type, CHSH-type, and GHZ-type experiments.

I don't understand this sentence. I am willing to consider the possibility that all of these experiments are consistent with local hidden variables that also violate the inequalities. In fact these inequalities describe a specific class of LHVs which are violated. And there are different classes of LHVs, such as stochastic optics, that also violate those inequalities. So if LHVs are fundamental, and QM is an accurate approximation of these inequality-violating LHVs, then it is perfectly reasonable why these experiments violate the said inequalities. I don't see how that makes me consider these experiments to be a coincidence.
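For concreteness, the "specific class of LHVs which are violated" can be made explicit: any deterministic local model is bounded by |S| <= 2 in the CHSH combination, while QM reaches 2*sqrt(2). A quick brute-force check (my own sketch, not from any of the cited papers):

```python
import itertools
import math

# CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
# A deterministic local model pre-assigns outcomes A1, A2 (Alice's two
# settings) and B1, B2 (Bob's two settings), each +/-1; enumerate all 16.
best = 0
for A1, A2, B1, B2 in itertools.product([+1, -1], repeat=4):
    S = A1*B1 + A1*B2 + A2*B1 - A2*B2
    best = max(best, abs(S))

print(best)                        # 2: no deterministic local model exceeds |S| = 2
print(round(2 * math.sqrt(2), 3))  # 2.828: the quantum-mechanical (Tsirelson) maximum
```

A stochastic LHV is just a convex mixture of these deterministic strategies, so the bound of 2 applies to it as well; models like stochastic optics escape the bound only by exploiting detection/sampling assumptions, not by violating this arithmetic.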
ZapperZ said:
Then people who are trumpeting its horn are putting the cart WAY before the horse, like maybe in the next county. SED has a serious shortcoming in verifying the most fundamental aspect of what it is predicting at the detector level. Those of us who depend on a detector detecting VALID signals are amazed that one could get something that is nothing more than background noise to mimic a real signal. If you truly think that is true, then stop using your modern electronics, because the band structures of many semiconductors that we verified using photoemission obviously are WRONG.

This is why I asked for SED to spew out an agreement to ARPES, etc. If you cannot produce something that is the CLOSEST to the fundamental level of your theory, how do you know that all those you build on top of are valid?

SED and SO theorists know that what they build on top is valid simply because it works, meaning that to the extent that SED and SO have been analyzed and applied, they have been as successful as QM in making predictions consistent with current experiments. And it is these approaches which the SED theorists build upon. Now by no means does that guarantee that the current approach of SED will continue to be successful. In fact, I am willing to bet that for SED to develop an accurate theory of detection and atomic band structure, it will have to be modified in some way, or some ingenious new approach to the problem will have to be developed. Now I could give you my thoughts on precisely how one could do this, but I suspect that would only be suitable for the independent research forum. Nevertheless, I think SED and SO's current successes are good enough to build upon.
ZapperZ said:
Look at the paper *I* cited. They cited Santos's work. I also want to let you know that I would rather be given the published version of a paper, and not the arxiv reference - unless of course you don't want me to do a citation index search on these things.

The paper you cited cites only one paper by Santos, and it appears along with two other citations in the context of just completing the solution to an equation. Moreover, that paper is from 1992. The complete stochastic-optical formalism based on Wigner functions was developed from 1996-2002. Also, in that paper, there is no reference to the term stochastic optics. It is not enough to just see the citation of Santos. You have to know in what context it is used and how old it is in relation to other papers by the same author.

Also, I gave the arXiv reference because it was full access and I don't know if you have full access to PROLA. Moreover, if the paper is published, it says so, and in which journal, on the arXiv page. In fact, here it is.
http://scitation.aip.org/getabs/servlet/GetabsServlet?prog=normal&id=PLRAAN000068000006062109000001&idtype=cvips&gifs=yes
ZapperZ said:
I strongly disagree. I came from the high-Tc superconductivity field, in which the number of papers being published PER WEEK exceeds many fields in a year! And I don't want to put this field down, but I bet you the frequency of papers published in this field is minuscule when compared with condensed matter. So your excuse of not knowing the existence of such-and-such a paper rings hollow in my book, especially earlier when you pleaded ignorance of GHZ experiments. All I did today was do a quick citation index on one of Santos's papers and out came several other citations. If someone who isn't an "expert" in this field and has only followed it closely as an interested party can obtain such references, how difficult can this be?

Zz

It's true that your field has more publications per week than any other field. However, I would guess that even then you don't have the time to read every paper, and even for those you do read, my guess is that you are not an expert on them. Or, at least, it takes some time to become an expert. Also, you are a professional working CM physicist whose job it is to read these papers. And you seem to have been doing this for at least 20 years. I, on the other hand, am still a full-time physics student and have only seriously entered this field of theoretical and experimental quantum and stochastic optics in the past year. And I don't want to have a trivial knowledge of every publication in the field. I am taking the time to study the Marshall and Santos papers carefully. So you can see the situation is different.

~M
 
Last edited by a moderator:
  • #80
Maaneli said:
To the second point, you can't be serious :rolleyes:. If the ions are not separate systems, then it is even easier to suggest local causal influences between the ions...

Now you are being funny.

QM needs nothing else to fill in the blanks, and neither would an explicit non-local theory such as Bohmian Mechanics (nor MWI for that matter). A local realistic theory would! There would need to be a previously unknown force or causal agent. And that, my friend, should be detectable using conventional means. And they would CERTAINLY need to be explicitly present in SED to explain this experiment.

So throw out the experimental results that bother you, and sleep tight tonight. :smile:
 
  • #81
DrChinese said:
Now you are being funny.

QM needs nothing else to fill in the blanks, and neither would an explicit non-local theory such as Bohmian Mechanics (nor MWI for that matter). A local realistic theory would! There would need to be a previously unknown force or causal agent. And that, my friend, should be detectable using conventional means. And they would CERTAINLY need to be explicitly present in SED to explain this experiment.

So throw out the experimental results that bother you, and sleep tight tonight. :smile:

There is a causal agent. I'm not sure what you mean by conventional means, but SO does show you how to test for it experimentally. I have already cited the papers on this thread proposing such experiments.

Also, you should realize that your opinion of that experiment you cited contradicts the opinions of specialists in the field such as Kwiat.

~M
 
Last edited:
  • #82
Maaneli said:
I don't understand this sentence. I am willing to consider the possibility that all of these experiments are consistent with local hidden variables that also violate the inequalities. In fact these inequalities describe a specific class of LHVs which are violated. And there are different classes of LHVs, such as stochastic optics, that also violate those inequalities. So if LHVs are fundamental, and QM is an accurate approximation of these inequality-violating LHVs, then it is perfectly reasonable why these experiments violate the said inequalities. I don't see how that makes me consider these experiments to be a coincidence.

The reason he said ALL is that there are many types of tests, including particles other than light. And there are many permutations, some including 3, 4 and more particles. And yet ALL of the results are exactly in keeping with the predictions of QM. Quantum erasers, where is the noise there?

Zz points out that the world of applied physics is so much larger, you must make a conclusion based on the weight of ALL of the evidence; and not just based on a small fraction.

Sure, some of the arguments of SED might have had some merit back in the late 70's/early 80's when the Bell tests were getting going. But today, SED has been left in the dust relative to the subject of local realism. This is because QM makes specific predictions about a wide range of behaviors, those predictions have been tested, and QM was not falsified (as it could have been) in the process. That is good science.
 
  • #83
Maaneli said:
I don't understand this sentence. I am willing to consider the possibility that all of these experiments are consistent with local hidden variables that also violate the inequalities. In fact these inequalities describe a specific class of LHVs which are violated.

You have no ability to say that (no SED proponent can) when you haven't shown how CHSH and GHZ experiments agree with SED. Even Santos hasn't made that claim!

SED and SO theorists know that what they build on top is valid simply because it works, meaning that to the extent that SED and SO have been analyzed and applied, they have been as successful as QM in making predictions consistent with current experiments. And it is these approaches which the SED theorists build upon. Now by no means does that guarantee that the current approach of SED will continue to be successful. In fact, I am willing to bet that for SED to develop an accurate theory of detection and atomic band structure, it will have to be modified in some way, or some ingenious new approach to the problem will have to be developed. Now I could give you my thoughts on precisely how one could do this, but I suspect that would only be suitable for the independent research forum. Nevertheless, I think SED and SO's current successes are good enough to build upon.

Then go ahead and build on top of them. But until you have gone back to the fundamental pillars and verify that they are valid, all you have is simply a phenomenological theory.

Look at what is happening to QM. Even with all its successes, we STILL test its fundamental building blocks, such as the superposition principle. This is what the Stony Brook/Delft experiments were designed to test. I've been asking SED proponents for YEARS to show me ONE complete simulation of a typical photoemission spectroscopy result. Just ONE. This will truly test the fundamental assumptions of SED in regards to how photons interact with a material to produce an electrical signal, AND that this signal can, in turn, tell us something about the material itself!

This is at such a fundamental and basic level that I don't understand why SED people don't just jump into it. Success in it means that you have a whole bunch of condensed matter physicists paying closer attention to what you have to say!

It's true that your field has more publications per week than any other field. However, I would guess that even then you don't have the time to read every paper, and even for those you do read, my guess is that you are not an expert on them.

At the very least, I'm AWARE of them, and if the same subject came up, I would have remembered reading something about it. I certainly would not have withheld information on it even if it was contrary to what I was going to present.

Or, at least, it takes some time to become an expert. Also, you are a professional working CM physicist whose job it is to read these papers. And you seem to have been doing this for at least 20 years. I, on the other hand, am still a full-time physics student and have only seriously entered this field of theoretical and experimental quantum and stochastic optics in the past year. And I don't want to have a trivial knowledge of every publication in the field.

I find that rather incredible. I mean, GHZ papers are "trivial knowledge"? I pay attention to everything that Zeilinger publishes, and I'm not even in the SAME field! I mean, this guy could get the Nobel Prize some time in my lifetime. I would hardly think that such a paper, appearing in PRL and Nature no less, would be considered "trivial knowledge".

Zz.
 
  • #84
ZapperZ said:
You have no ability to say that (no SED proponent can) when you haven't shown how CHSH and GHZ experiments agree with SED. Even Santos hasn't made that claim!

I don't know what Santos and Marshall think of the CHSH and GHZ experiments, but I have dispatched an e-mail asking them.

ZapperZ said:
Look at what is happening to QM. Even with all its successes, we STILL test its fundamental building blocks, such as the superposition principle. This is what the Stony Brook/Delft experiments were designed to test. I've been asking SED proponents for YEARS to show me ONE complete simulation of a typical photoemission spectroscopy result. Just ONE. This will truly test the fundamental assumptions of SED in regards to how photons interact with a material to produce an electrical signal, AND that this signal can, in turn, tell us something about the material itself!

This is at such a fundamental and basic level that I don't understand why SED people don't just jump into it. Success in it means that you have a whole bunch of condensed matter physicists paying closer attention to what you have to say!

If photoemission spectroscopy would be that convincing to CM physicists, then maybe you're right. Maybe that is a good place for SED theorists to focus their energy. As of right now, efforts are fairly scattered. As an example, there was an SED conference back in 2001. Read the subjects of the talks given:
http://www.bu.edu/simulation/conferences.html

ZapperZ said:
At the very least, I'm AWARE of them, and if the same subject came up, I would have remembered reading something about it. I certainly would not have withheld information on it even if it was contrary to what I was going to present.

Well, now I know about the GHZ experiments. Keep in mind that Vanesch was aware of the inequality, but not aware of the experiments, and he is also interested in SED and quantum optics, the latter probably for longer than I have been. Also, I hope you're not implying that I intentionally withheld information.

ZapperZ said:
I find that rather incredible. I mean, GHZ papers are "trivial knowledge"? I pay attention to everything that Zeilinger publishes, and I'm not even in the SAME field! I mean, this guy could get the Nobel Prize some time in my lifetime. I would hardly think that such a paper, appearing in PRL and Nature no less, would be considered "trivial knowledge".

Zz.

Well, my efforts and attention have also been focused elsewhere. You may pay attention to everything Zeilinger publishes, but you are also clearly very ignorant of Santos and Marshall's papers and theories. So there is always a tradeoff in one's "awareness". And once again, I have only seriously become interested in this field in the past year, where I initially started from the very basics of classical optics and Bell's theorem. You can't start from the most advanced and cutting-edge information about the field when you are just starting out, even if it is "trivial knowledge". I don't think this is a trivial feat for a full-time physics student who is still taking courses and is still conducting experimental research on single-bubble sonoluminescence, in parallel with research on the foundations of QM and quantum optics.

~M
 
  • #85
RandallB said:
Gordon
(Or as you call yourself: MW, Mostly Wrong, QuantunEnigma, any more?)

Oh please, you think you're a “Rich Thinker”?
In post #66 you ask DrC for a negative-probability example when he has already given you just that in the link that you quoted back to him!

On Aug 20 your site had only one working page, with links to dozens of THIS PAGE DOWN BEING WORKED ON pages. That hardly counts as “many pages”.

On Aug 21 you put up about a dozen more pages, but any chance that you might document your claim that “Bell Logic is false” in your explanations of your versions of W ‘locality’ and ‘factoring’ is still buried behind “This Page Down” links.
Then on the 22nd, rather than use your MW id, you created the extra QuantunEnigma id to lure DrC and others to your site.

That’s not rich thinking – that’s baiting and dishonest.
You owe both DrC and Zz an apology.

Gordon says he is committed to returning as soon as he has "revised one large page with appendages. He says the new site was converted from the old one on 12 August. Factoring etc was up on the old site for years which contained more than 2Mb."

You are so emotionally obsessed about "Gordon"; why do you not give him a phone call? Like me, you have or can easily get his number, and it might make you healthier in your poor mind.

I did not refer to myself as a rich thinker, but I see you understand who was being addressed.

I am not a mouthpiece for WM, and I do not think you are a mouthpiece for DrChinese. So speak for yourself about an experiment that proves negative probabilities. What a joke. Did they come out with negative relative frequencies?

If not, then you know the secret address where DrChinese's example is discussed and worked over, I believe? (What date did you see it?) So stop hiding behind the gurus and think for yourself, unless you think that science is like religion and facts do not matter any more?

I forgot to say, about your "Then on the 22nd rather than use your MW id you created the extra QuantunEnigma id to lure DrC and others to your site": I told you it was a mistake. And there is no money to be made at the site, so why would anyone want to lure anyone there?

Hope you soon reply with more about the proof of negative probabilities.
 
Last edited:
  • #86
**
2. Why do you keep raising this issue? If you have a point to make, say it! You can google this info as fast as anyone. **

Well, YOU keep on saying that detector efficiency cannot possibly be a fundamental thing. I, on the other hand, think the arguments of Santos and Marshall are cleverly done. Since I am a theorist, I only remember some old photon/electron detectors (the "bubble" chamber) which we studied at university during the first two years (and like Maaneli, I think it is better to think deeply about fewer topics than to read about them all). So all I am asking here is how one can be sure that, if one detects a signal, it corresponds to only one electron scattering off the apparatus? Or, when no detection is made, that no electron is passing? I guess one should also take into account electrons being annihilated in the apparatus... So, it is a reasonable question of interest to many, I guess... (perhaps Vanesch, who plays with detectors a lot, can give some answer?)

EDIT: clearly the detector efficiency depends upon the distance between the source and the detector if one does not think of particles or photons as bullet-like objects (and clearly also on the wavelength...).

Careful
 
Last edited:
  • #87
Careful said:
So all I am asking here is how one can be sure that, if one detects a signal, it corresponds to only one electron scattering off the apparatus? Or, when no detection is made, that no electron is passing?

I cannot say as certainly for electrons, but for photons this issue has been well addressed. This is an issue that has banged around a while in the Local Realist group, which is why I was asking to understand your point.

Detector manufacturers use entangled photons as a technique to calibrate their detectors to an extremely high degree of accuracy. I don't have the figures for a specific manufacturer in front of me, but it is high and addresses your question exactly: a detection matches one photon, no detection matches no photon. The basic concept can be seen in studies like the following:

http://marcus.whitman.edu/~beckmk/QM/grangier/Thorn_ajp.pdf (J. J. Thorn, M. S. Neel, V. W. Donato, G. S. Bergreen, R. E. Davies, and M. Beck)

"Here we present an experiment, suitable for an undergraduate laboratory, that unequivocally demonstrates the quantum nature of light. Spontaneously downconverted light is incident on a beamsplitter and the outputs are monitored with single-photon counting detectors. We observe a near absence of coincidence counts between the two detectors—a result inconsistent with a classical wave model of light, but consistent with a quantum description in which individual photons are incident on the beamsplitter. More explicitly, we measured the degree of second-order coherence between the outputs to be g(2)(0) = 0.0177 +/- 0.0026, which violates the classical inequality g(2)(0) >= 1 by 377 standard deviations."

I am certain that Santos is familiar with these types of experiments, but chooses to reject the results. However, the combination of these experiments with detector-efficiency tests is quite convincing to anyone who reads them. I will look for some information from one of the manufacturers. Zz may be able to add something on this as well.

Edit to add: the Perkin Elmer model SPCM-AQR-13-FC (single-photon counting module) was used, and it has a dark count rate of less than 250 per second. The experiments were conducted with over 7000 coincidences per second, by way of comparison.
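For what it's worth, the figures quoted above can be combined in two lines of arithmetic to see how far the measurement sits from the classical bound, and how small the dark-count contamination is (my own computation from the quoted numbers, not from the paper):

```python
# Figures quoted in this post: g2(0) = 0.0177 +/- 0.0026 against the
# classical bound g2(0) >= 1, and < 250 dark counts/s against > 7000
# coincidences/s.
g2, sigma = 0.0177, 0.0026
n_sigma = (1.0 - g2) / sigma
print(round(n_sigma, 1))   # ~377.8 sigma below the classical bound
                           # (the paper quotes 377 from unrounded values)

dark_fraction = 250.0 / 7000.0
print(round(100 * dark_fraction, 1))  # dark counts at most ~3.6% of signal rate
```

So even taking the dark-count bound at face value, noise is a few percent of the signal, while the classical-wave prediction is excluded by hundreds of standard deviations.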
 
Last edited by a moderator:
  • #88
QuantunEnigma said:
I told you it was a mistake.
If you can show me where you admitted to making a mistake in intentionally playing tag team with MW on this thread to promote his website ("I didn't know" is a 5-year-old's excuse, not an admission of error or an apology), then I'll acknowledge that as an apology, but I don't find anything like that in your posts.

an experiment that proves negative probabilities ……… Did they come out with negative relative frequencies? ……. you know the secret address where DrC example is
Where did you see a claim that an experiment produces negative probability counts (frequencies)? That is a calculation from using the Bell theorem.
And what secret address? - you quoted it in your post #66;
follow his reference links if you don’t like DrC’s short version.

stop hiding behind the gurus and think for yourself unless you think that science is like religion and facts do not matter any more?

What gurus? Is that what you consider MW to be? I've shared with the people that count on here where my position disagrees with theirs, but I don't demand anyone accept or listen to my ideas until I find a real way to support them.

Fact is, it is you and MW that demand to be listened to, just for claiming Bell Logic is Wrong or ‘Silly’.
MW claims to have documented it for years but has been unable to cut and paste a single relevant word or fact supporting such a huge claim (religion?) anywhere I can find.
Gordon was asked to “show the beef” on this two years ago – tell him to put it on his website not here – if it is more than a little Chicken or Turkey on the bun, maybe I’ll find it. The two of you have done more than enough to destroy your own credibility to justify posting here.

Yet you claim a success in post #66 in hooking one person to take the time to vainly search for a straight answer on MW’s web site;
and then insult me and my “poor mind”, for doing so.
So this is my last response to you as if you keep this up you are Zz’s problem anyway – he does a needed and good job at keeping this overall forum rational and is more than fair. (By the way, Thanks Zz for what you do for the group that goes unseen)
 
  • #89
RandallB said:
If you can show me where you admitted to making a mistake in intentionally playing tag team with MW on this thread to promote his website ("I didn't know" is a 5-year-old's excuse, not an admission of error or an apology), then I'll acknowledge that as an apology, but I don't find anything like that in your posts.

Where did you see a claim that an experiment produces negative probability counts (frequencies)? That is a calculation from using the Bell theorem.
And what secret address? - you quoted it in your post #66;
follow his reference links if you don’t like DrC’s short version.
What gurus? Is that what you consider MW to be? I've shared with the people that count on here where my position disagrees with theirs, but I don't demand anyone accept or listen to my ideas until I find a real way to support them.

Fact is, it is you and MW that demand to be listened to, just for claiming Bell Logic is Wrong or ‘Silly’.
MW claims to have documented it for years but has been unable to cut and paste a single relevant word or fact supporting such a huge claim (religion?) anywhere I can find.
Gordon was asked to “show the beef” on this two years ago – tell him to put it on his website not here – if it is more than a little Chicken or Turkey on the bun, maybe I’ll find it. The two of you have done more than enough to destroy your own credibility to justify posting here.

Yet you claim a success in post #66 in hooking one person to take the time to vainly search for a straight answer on MW’s web site;
and then insult me and my “poor mind”, for doing so.
So this is my last response to you as if you keep this up you are Zz’s problem anyway – he does a needed and good job at keeping this overall forum rational and is more than fair. (By the way, Thanks Zz for what you do for the group that goes unseen)

I thank all of the group for their contributions. They are the gurus that I meant, and not MW, nor the way you are DrChinese's mouthpiece sometimes. You do not hide behind MW, so he is not the guru I was talking about. And I see our correspondence is ruining the flow, so I am closing down on this subject, which is out of my depth...

I thought I said that I made a mistake somewhere - for which I am always sorry... so I have no problem saying sorry again to everyone. You will learn that 25-year-olds and 75-year-olds make mistakes too.

Also, I knew what SED was, and I hope you will learn also that your words "proof of negative probability" are different from "experimental proof of negative probabilities".

The secret site is the one that you and I are monitoring, that is all. The one that cannot be mentioned, like this [SPAM link removed - the NEXT time I have to do this, you're gone! - Zz.], because I do not know how to make it a hidden legal link to show what we are talking about. If you would show me how, please, I would edit this posting and leave in peace.

And I could not have put it in number 66, because I did not know how then either. Are you maybe mistaking it for another address? I will look at number 66 again.
 
Last edited by a moderator:
  • #90
QuantunEnigma said:
because I do not know how to make it a hidden legal link to show what we are talking about.

Re-read the Guidelines that you have agreed to. Personal, unverified theories and ideas are NOT allowed. If you have a personal theory to push, do so in the IR forum. You have been told this before, and I very seldom repeat what has been said.

Furthermore, citation to some website does NOT strengthen your case! So stop doing that. This is not how science, and especially physics, is done. If you think you have a valid idea, cite a respected peer-reviewed journal.

I expect that this is the last time I have to explain this.

Zz.
 
  • #91
Thanks for that link, DrChinese.
 
  • #92
***
Detector manufacturers use entangled photons as a technique to calibrate their detectors to an extremely high degree of accuracy. I don't have the figures for a specific manufacturer in front of me, but it is high and addresses your question exactly: a detection matches one photon, no detection matches no photon. The basic concept can be seen in studies like the following: ****

No, no: detection efficiency depends on the frequency of the light, the distance to the source, and so on... Sudarshan and Glauber studied this in the sixties; you cannot obtain (even in QFT) a genuine violation of the Bell inequalities with optical light (this is also a point Santos makes), so your manufacturer probably has some more detailed information in small print in the booklet of the apparatus. Now, for massive particles (like electrons or kaons), it appears to me you can make the probability of detection extremely high (for example by bombarding the electrons with high-frequency photons), with only a negligible dependency upon distance (but please go ahead and tell us a better way of doing it). So, I have read that some 11 years ago a conclusive Bell test with kaons was devised (and traditional realists agreed upon the conclusive nature of the latter); you can read about that in Franco Selleri's book. Hmm, where can we read the test results?

Could I ask you why you believe the dark current (or dark count) to be the only parameter which has something to do with detector efficiency? Dark current is just the internal thermal current of electrons in the apparatus under the assumption of absence of external fields. This is what you subtract from the received signal, but that does not automatically imply that the received signal is what you get from the source. I guess the dark current could interfere with the received signal so that some distortion of the latter occurs (although it could very well average out, in which case the dark current would be irrelevant - could someone comment on this?), but that does not fix your efficiency. Moreover, it is not so clear what this has to do with the zero-point field in SED (the latter would only produce quantum corrections to the dark current, I presume); there the zero-point field is non-thermal and has no observable effects (it is believed to provide atomic stability, of course) unless one has different boundary conditions for the electromagnetic field or some particle is accelerated through it (as in the Unruh effect). This is, for example, how they can explain the Casimir-Polder effect and how a zero-point field can influence the dynamics of interacting fields (the detection process).
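One quantitative point worth adding here (my own back-of-envelope estimate, not from the thread): dark counts matter far less for coincidence measurements than for singles, because for uncorrelated counts the accidental coincidence rate within a window of width tau is approximately R1 * R2 * tau. With dark rates of a few hundred per second and a nanosecond-scale window (the window value below is purely illustrative), accidentals are vanishingly rare:

```python
# For two detectors firing at uncorrelated rates R1 and R2, the accidental
# coincidence rate within a coincidence window of width tau is roughly
# R_acc = R1 * R2 * tau.
R_dark = 250.0   # dark counts per second per detector (illustrative upper bound)
tau = 5e-9       # assumed 5 ns coincidence window (illustrative value)

R_acc = R_dark * R_dark * tau
print(R_acc)     # ~3.1e-4 accidental coincidences per second: negligible
```

So whatever dark current does to the singles rates, it cannot by itself generate a coincidence signal at thousands of counts per second; the efficiency question (which detections are missed) is a separate issue from the noise question (which detections are spurious).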

Moreover, if you use presumed sources of "entangled photons" as a way to calibrate your detectors, it becomes rather impossible (from a logical point of view) to consider the possibility that entanglement might not exist in the first place, no? :-)

Careful
 
Last edited:
  • #93
ZapperZ said:
Re-read the Guidelines that you have agreed to. Personal, unverified theories and ideas are NOT allowed. If you have a personal theory to push, do so in the IR forum. You have been told this before, and I very seldom repeat what has been said.

Furthermore, citation to some website does NOT strengthen your case! So stop doing that. This is not how science, and especially physics, is done. If you think you have a valid idea, cite a respected peer-reviewed journal.

I expect that this is the last time I have to explain this.

Zz.

QUOTE OF ZZ ACTION
The secret site is the one that you and I are monitoring, that is all. The one that cannot be mentioned, like this [SPAM link removed - the NEXT time I have to do this, you're gone! - Zz.], because I do not know how to make it a hidden legal link to show what we are talking about. If you would show me how, please, I would edit this posting and leave in peace. END

Sir, I am writing to ask you please to correct the words of your deletion, unless I am mistaken. The site IS NOT A SPAM SITE because, unlike Physics Forums, it has no advertisements, and I can see no reward for clicks or views. "It is a free public-service science blog with references to 100 peer-reviewed articles, and it is based on peer-reviewed and published theory that has not been changed or refuted or modified."

You are the one who can delete, but in fairness your deletion should not distort facts, unless I do not understand what a SPAM SITE is, please. Can you say "site deleted" or "personal science blog deleted"?

Also, because the site is edited every day I think it cannot work under a loss of editorial freedom which your referral appears to require. The author thinks that.

I am not arguing with your right to delete, only asking that you be fair like a scientist and present the facts, please. Unless I am wrong.
 
  • #94
QuantunEnigma said:
QUOTE OF ZZ ACTION
The secret site is the one that you and I are monitoring, that is all. The one that cannot be mentioned, like this [SPAM link removed - the NEXT time I have to do this, you're gone! - Zz.], because I do not know how to make it a hidden legal link to show what we are talking about. If you would show me how, please, I would edit this posting and leave in peace. END

Sir, I am writing to ask please correct the words of your deletion. Unless I am mistaken. The site IS NOT A SPAM SITE because not like Physics Forums it has no advertisements and I can see no reward for clicks or views. "It is a free public service science blog with references to 100 peer reviewed articles and it is based on peer reviewed and published theory that has not been changed or refuted or modified."

You are the one who can delete but in fairness your deletion should not distort facts unless i do not understand what is a SPAM SITE please. Can you say? Site deleted or personal science blog deleted.

Also, because the site is edited every day I think it cannot work under a loss of editorial freedom which your referral appears to require. The author thinks that.

I am note arguing with you right to delete, only that be fair like a scientist and present facts please. Unless i am wrong

Then cite the exact paper that supports your position, and not simply a whole webpage! That webpage is espousing a personal theory that is not backed by conventional physics, regardless of whether it contains published papers or not. You have been told about our rules against speculative theories. Take it or leave it.

Zz.

P.S. I'll call it crackpot spam if that makes you feel any better.
 
Last edited:
  • #95
Careful said:
1. Could I ask why you believe the dark current (or dark count) to be the only parameter which has something to do with detector efficiency? Dark current is just the internal thermal current of electrons in the apparatus under the assumption of absence of external fields. This is what you subtract from the received signal, but that does not automatically imply that the received signal is what you get from the source. I guess the dark current could interfere with the received signal so that some distortion of the latter occurs (although it could very well average out, in which case dark current would be irrelevant - could someone comment on this?), but that does not fix your efficiency. Moreover, it is not so clear what this has to do with the zero point field in SED (the latter would only produce quantum corrections to the dark current, I presume); there the zero point field is non-thermal and has no observable effects (it is believed to provide atomic stability, of course) unless one has different boundary conditions for the electromagnetic field or some particle is accelerated through it (as in the Unruh effect). This is, for example, how they can explain the Casimir-Polder effect and how a zero point field can influence the dynamics of interacting fields (the detection process).

2. Moreover, if you use presumed sources of ``entangled photons'' as a way to calibrate your detectors, it becomes rather impossible (from a logical point of view) to consider the possibility that entanglement might not exist in the first place, no? :-)

Careful

1. I didn't mean to imply that the detector dark count is the only variable involved. It's not. There are plenty of experimental setup issues that affect total efficiency, such as beam splitters, filters and polarizers. I was really trying to focus on the detector itself as a source of false readings.

2. Yes, this does make logical sense as a way to calibrate. It doesn't really matter whether you call it entanglement or not; there is an effect that is readily measured, and it is really undeniable. There are virtually no coincidences when the PDC source is turned off (i.e. no coincidences due to noise). Once the PDC source is turned on, there are virtually no double detections on one side (i.e. 3-fold coincidences). I don't know the exact rate of 1-fold coincidences (i.e. detection of only one of a pair), so I will see if I can drill into that statistic.
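The coincidence-counting logic described here can be sketched with a toy Monte Carlo. This is an editor's illustration only: the pair probability, detector efficiency, and dark-count probability below are made-up round numbers, not values from any actual PDC experiment.

```python
import random

def run_trials(n_trials, pair_rate, efficiency, dark_prob, rng):
    """Toy model of coincidence counting in a PDC experiment.

    Each trial is one detection window. A photon pair is present with
    probability pair_rate; each of the two detectors fires on a real
    photon with probability `efficiency`, and fires spuriously (dark
    count) with probability `dark_prob`. All numbers are illustrative.
    """
    coincidences = 0
    for _ in range(n_trials):
        pair = rng.random() < pair_rate
        click_a = (pair and rng.random() < efficiency) or rng.random() < dark_prob
        click_b = (pair and rng.random() < efficiency) or rng.random() < dark_prob
        if click_a and click_b:
            coincidences += 1
    return coincidences

rng = random.Random(0)
# Source off: only dark counts, so accidental coincidences are rare.
off = run_trials(100_000, 0.00, 0.7, 1e-4, rng)
# Source on: genuine pairs dominate; expect roughly
# n_trials * pair_rate * efficiency**2 coincidences.
on = run_trials(100_000, 0.05, 0.7, 1e-4, rng)
print("source off:", off, "| source on:", on)
```

With the source off, the accidental rate is of order `dark_prob**2` per window, which is why the noise-only coincidence count is essentially zero in such setups.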
 
  • #96
**1. I didn't mean to imply that the detector dark count is the only variable involved. It's not. There are plenty of experimental setup issues that affect total efficiency, such as beam splitters, filters and polarizers. I was really trying to focus on the detector itself as a source for false readings. **

Right! Now, you seem to be stuck with the von Neumann measurement postulate (or consciousness) and regard photons as non-local plane waves. You cannot do that: (a) it makes no sense, you completely deny local physics; (b) such waves are not normalizable, hence you cannot even apply the von Neumann postulate here. Rather, you have to make a local model of detection (remember: the dynamics of QFT is local) and apply the reduction at a later stage (which gives you genuinely different results!). Now, localized wave packets (with characteristic wavelength lambda) will spread out, and local quantities (probability density) will decrease, affecting the probability of detection...
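The spreading claim can be made quantitative in the simplest case. Here is a minimal sketch (editor's example, not Careful's model): a free 1-D Gaussian wave packet in natural units, whose peak probability density falls as the width grows with time.

```python
import math

def peak_density(t, sigma0, hbar=1.0, m=1.0):
    """Peak probability density |psi(0, t)|^2 of a free 1-D Gaussian
    wave packet with initial width sigma0.

    The width grows as sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0**2))**2),
    so the normalized peak density 1 / (sqrt(2*pi) * sigma(t)) decreases.
    Natural units (hbar = m = 1) for illustration only.
    """
    sigma_t = sigma0 * math.sqrt(1.0 + (hbar * t / (2.0 * m * sigma0**2)) ** 2)
    return 1.0 / (math.sqrt(2.0 * math.pi) * sigma_t)

for t in (0.0, 1.0, 5.0):
    print(f"t = {t}: peak density = {peak_density(t, sigma0=1.0):.4f}")
```

The monotonic decrease of the peak density is what a local detection model would couple to, which is the point being made about detection probability above.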

**
2. Yes, this does make logical sense as a way to calibrate. It doesn't really matter if you call it entanglement or not, there is an effect that is readily measured and it is really undeniable. There are virtually no cases of coincidences when the PDC source is turned off (i.e. no coincidences due to noise). **

Again, you ignore the possibility that when the source is turned on, there is an enhancement of such coincidences (as you would expect from local physics). I guess you can expect a delay between the arrival time of the ``created pair'' and the arrival time of the original one. Moreover, in experiments, not all ``photons'' will have the same spatial density, and many of them can get lost (as is the case in the low-frequency range).

Now, you seem to be confused about my position. I am a local realist, and I could offer you good reasons why a non-local world view is nonsensical (actually it is pretty easy to give a crazy local (in the strict sense!) theory behind the von Neumann measurement postulate if you are creative enough), but I prefer to keep an open mind. As far as I am aware, it is possible to give a local theory of EPR with a local measurement dynamics (and without denying reality) - see my allusion to negative probabilities (you might be interested to read about this).

However, on the other hand, it is still possible to claim that a ``less quantal'' point of view is possible - see SED, or better, Barut self-field. So one would better do the experiment with neutral kaons, or do experiments in which local realist predictions (deviating from the QM ones) can be checked. This would settle the matter more easily. But in ANY case (there is at worst a lack of creativity), there is no obvious problem with local realism as far as I am concerned; only some types of ``naive'' local hidden variable theories (the ones assuming screening off *and* dichotomic outcomes) could be killed off - but where is the conclusive series of experiments?

Careful
 
Last edited:
  • #97
Careful said:
Now, you seem to be confused about my position. I am a local realist, and I could offer you good reasons why a non-local world view is nonsensical (actually it is pretty easy to give a crazy local (in the strict sense!) theory behind the von Neumann measurement postulate if you are creative enough), but I prefer to keep an open mind. As far as I am aware, it is possible to give a local theory of EPR with a local measurement dynamics (and without denying reality) - see my allusion to negative probabilities (you might be interested to read about this).

Sure, I am confused any time a local realist tells me an experiment should not be considered because non-local influences were not adjusted for. If you believe there is a local model which differs from Malus in its predictions, then the thing I am immediately interested in is: what are the specific values that are really present?

Sure, I am interested in negative probabilities because I think any local realistic model will yield probabilities greater than 1 or less than zero somewhere along the line. The reason I "know" this is because of Bell's discovery.
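The "probabilities outside [0, 1]" claim can be made concrete with Wigner's form of Bell's inequality. The sketch below (editor's illustration; the three angles are a standard textbook choice, not DrChinese's) uses the singlet-state prediction and shows that the quantum result violates the bound, so any underlying joint assignment of predetermined outcomes would need a negative population fraction.

```python
import math

def p_both_up(theta_a, theta_b):
    """Singlet-state probability that both spin-1/2 particles give 'up'
    along axes theta_a and theta_b (radians):
    P(a+, b+) = (1/2) * sin^2((theta_a - theta_b) / 2)."""
    return 0.5 * math.sin((theta_a - theta_b) / 2.0) ** 2

# Wigner's inequality: any local-realistic assignment of predetermined
# outcomes requires P(a+, b+) <= P(a+, c+) + P(c+, b+).
a, b, c = 0.0, math.pi / 2, math.pi / 4
lhs = p_both_up(a, b)                    # QM: 0.25
rhs = p_both_up(a, c) + p_both_up(c, b)  # QM: about 0.146
print(f"P(a+,b+) = {lhs:.4f}, bound = {rhs:.4f}, satisfied: {lhs <= rhs}")
# The bound is violated, so the implied fraction of the population with
# outcomes (a+, c-, b+) would have to be negative.
```

This is the counting argument behind the claim: the three pairwise quantum probabilities cannot all come from one non-negative joint distribution over predetermined outcomes.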
 
  • #98
DrChinese said:
I am interested in negative probabilities because I think any local realistic model will yield probabilities greater than 1 or less than zero somewhere along the line. The reason I "know" this is because of Bell's discovery.

Could you give an example of a theory which is "local" but not "realistic" and which agrees with the QM predictions here? Or more basically, do you think that such a theory is (in light of Bell's discovery) possible?

If not, why use the cumbersome and vague phrase "local realism" when, evidently, you just mean "local"?
 
  • #99
ttn said:
Could you give an example of a theory which is "local" but not "realistic" and which agrees with the QM predictions here? Or more basically, do you think that such a theory is (in light of Bell's discovery) possible?

If not, why use the cumbersome and vague phrase "local realism" when, evidently, you just mean "local"?

1. If you accept, as I do, that the Heisenberg Uncertainty Principle is a "complete" description of reality, then I believe you have a non-realistic position. So any interpretation of orthodox QM that matches would be such a theory.

2. I cannot avoid the phrase "local realistic" in my discussion, because this terminology is generally accepted following Bell's work. If you believe in locality, I believe you must therefore reject realism. I also believe that it *might* be possible to construct a non-local theory which is realistic, but I am not sure about this.
 
  • #100
DrChinese's "negative probabilities" rebutted?

DrChinese said:
Sure, I am interested in negative probabilities because I think any local realistic model will yield probabilities greater than 1 or less than zero somewhere along the line. The reason I "know" this is because of Bell's discovery.

Dear David, a rebuttal of your ''negative probability case" (from your website) may be found at [link deleted]

I would welcome any comments, especially any which make your case clearer to the general reader. Of course, if I've misrepresented it, please let me know that too.

wm
 
Last edited by a moderator:
