Careful said:
I do not dispute that QM is successful in calculating the spectra of the H and He atoms (you cannot predict above He) and in explaining the Lamb shift.
?? I think that QM has rather more successes to its name than just H and He! I am even convinced that the "classical field" approaches (the coupled Dirac-Maxwell fields + possibly some noise) have serious problems with configurations heavier than He. At best these theories give the same predictions as the Hartree-Fock method with the self-consistent potential, but it is well known in quantum chemistry that this gives a good approximation which is sometimes not good enough, and one needs to add things like "configuration interaction" to get closer to experimental values.
However, I am convinced that these successes have perfect (albeit more difficult and subtle) classical explanations. A good step in this direction is the theory of stochastic electrodynamics, which reproduces a good number of these so-called exclusive results of second quantization within a first-quantized framework (such as the Casimir effect, and the H atom I believe). Barut and Dowling have done considerable work on this issue ...
Yes, that's fascinating work, I agree. The problem with most of these approaches, however, is that they tackle ONE SPECIFIC aspect of the quantum predictions, and that we can then only vaguely hope that they will, one day, be as successful as the standard quantum machinery in all the rest.
It is just not reasonable to accept the quantum machinery for just about all the predictions it makes, *except* for those very few predictions that kill off your original belief of how nature ought to be.
The reason why this work is 1) fascinating and 2) probably misguided can be found by "reductio ad absurdum". Indeed, if these classical theories were correct, their computations would be a gazillion times simpler than quantum computations. Non-linear partial differential equations in 3 dimensions are, computationally, peanuts compared to, say, the Feynman path integral in QFT, and can be attacked far more easily with finite-element methods than QFT can. It would reduce quantum chemistry, and even nuclear physics, to something computationally just as easy as weather forecasting. Not that weather forecasting is so simple, but it is doable, while QFT calculations only start to be tractable with lattice techniques. So if it were possible to do so, it would have been done long ago.
My intention is not at all to dispose of standard QM; I am perfectly aware of the fact that it provides an effective way of calculating the statistics of experiments with microscopic objects when applied to the *correct* problems with considerable thought. However, it also gives the wrong answers in some cases, and it does not provide any insight into the dynamics of a single particle.
I would like to know in what specific cases quantum theory comes up with the wrong experimental predictions which have been falsified by experiment.
I want to obtain *insight* into the microworld, which I believe obeys the same laws as the macroworld (that is, GR and electromagnetism); therefore entanglement is a crucial issue and one should not take it lightly.
In a way, I *also* adhere to a belief: it is that there are a few fundamental principles on which the entire formalism of physical theory has to be constructed.
Concerning your remark about the extrapolation of success, I can only say that people have been looking for over 50 years for a perpetuum mobile; I hope one is not going to look 100 years for entanglement. By the way, Newton's theory was also correct for 300 years.
If you want my guess, I don't think quantum mechanics in its present form will still be around (except as a useful approximation) 300 years from now - or it will, but then because of lack of progress (for instance, lack of experimental input on quantum gravity phenomena). But as of now, it is still the best thing we have - and it has to be admitted that it is vastly more successful in vastly different fields than anything that tries to rival it. At best you get *identical* predictions in certain areas. Entanglement is a very standard part of the quantum formalism, and *is* confirmed by many experiments in the sense that these calculations DO correspond to predictions that are verified: see further.
Concerning Hartree, you have to include a classical radiation field determined by the probability currents of the particles. In that way, you obtain a QM where each particle has its own wave function and where interactions propagate via a classical Maxwell field determined by the sum over all probability currents times the appropriate charges (I think Barut called this the self-field approach to QED; it's non-linear, of course).
I expected that this is what you meant but wasn't sure. It is indeed the self-consistent field method used in quantum chemistry. A good approximation, but with known deviations from experiment, which is improved upon by configuration-interaction techniques, which are nothing else but "entanglements" of the different electrons. Even in the H2 molecule, this is experimentally visible (although small). More successes can be found with the H2O molecule, especially the angle between the two bonds, for which the Hartree-Fock self-consistent field method gives 106.1 degrees, while the CI technique gives 104.9 degrees (experiment being 104.5 degrees). I took this from "Modern Quantum Chemistry" by Szabo and Ostlund.
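To make the distinction concrete, here is a minimal toy sketch (the model, grid, and parameters are all invented for illustration - this is not the self-field QED or the real H2O calculation above): a one-dimensional two-electron lattice model, where a Hartree-style self-consistent mean field is compared with exact diagonalization. The mean-field (product-state) energy lies above the exact ground-state energy; the gap is the correlation energy that CI-type methods recover.

```python
# Toy Hartree SCF vs exact diagonalization for two electrons in 1D.
# All parameters are invented for this sketch.
import numpy as np

N, L = 24, 10.0                       # grid points, box length
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# One-particle Hamiltonian: finite-difference kinetic term + harmonic trap
lap = (np.diag(np.full(N - 1, 1.0), -1) - 2 * np.eye(N)
       + np.diag(np.full(N - 1, 1.0), 1)) / dx**2
h1 = -0.5 * lap + np.diag(0.5 * x**2)

# Soft-Coulomb repulsion between the two electrons
W = 1.0 / np.sqrt((x[:, None] - x[None, :])**2 + 1.0)

# Hartree SCF loop: each electron moves in the mean field of the other's density
phi = np.linalg.eigh(h1)[1][:, 0]
for _ in range(200):
    v_h = W @ phi**2                  # mean-field (Hartree) potential
    phi_new = np.linalg.eigh(h1 + np.diag(v_h))[1][:, 0]
    if np.linalg.norm(np.abs(phi_new) - np.abs(phi)) < 1e-10:
        phi = phi_new
        break
    phi = phi_new

# Full two-particle Hamiltonian on the product grid (index = i*N + j)
H2 = (np.kron(h1, np.eye(N)) + np.kron(np.eye(N), h1)
      + np.diag(W.ravel()))

psi_mf = np.kron(phi, phi)            # uncorrelated product state
E_mf = psi_mf @ H2 @ psi_mf           # variational: E_mf >= E_exact
E_exact = np.linalg.eigvalsh(H2)[0]   # includes all "entanglement"/correlation
print(E_mf, E_exact)
```

The product state here plays the role of the self-consistent field answer, and the exact diagonalization plays the role of full CI: the mean field is close, but the last bit of the energy only appears once the two-electron wave function is allowed to be entangled.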
There are many, many examples like this. The problem with the CI technique is of course the huge system of equations that it generates - hence my proof by contradiction against a classical theory doing the same thing: if it worked, it would have been done long ago.