Demystifier said:
Concerning virtual particles, I am not saying that you do not need to take into account their mathematical contribution to measurable quantities. Of course you do. But I am saying that you can obtain the same result on measurable quantities by applying a different calculation method (a non-perturbative method) in which the concept of a virtual particle does not even make sense.
No. You are missing an important point about 'virtual particles' here.
As I tried to explain, there is no black and white distinction.
Demystifier said:
Is a photon ever a real particle to you?
They come from quantization of the electromagnetic field. But strictly speaking, anything we refer to as a photon in an experiment comes from perturbation theory. You could claim that no "photons" exist, that they are all just 'artifacts' of perturbation theory, and that the full theory contains only the full fields.
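For concreteness (a standard free-field sketch, nothing specific to this exchange): the photon operators come from the mode expansion of the quantized electromagnetic field, e.g. in Coulomb gauge with box normalization,
$$\hat{\mathbf A}(\mathbf x,t)=\sum_{\mathbf k,\lambda}\sqrt{\frac{\hbar}{2\epsilon_0 V\omega_k}}\left[\hat a_{\mathbf k\lambda}\,\boldsymbol\epsilon_{\mathbf k\lambda}\,e^{i(\mathbf k\cdot\mathbf x-\omega_k t)}+\text{h.c.}\right],\qquad \omega_k=c|\mathbf k|,$$
and a "photon" is a quantum created by the corresponding creation operator a†_{kλ} acting on the free vacuum. In the fully interacting theory there is no unique decomposition of this kind at finite times, which is the sense in which "photon" is tied to the free/perturbative description.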
Demystifier said:
Again, do you consider a photon emitted from the sun and absorbed by our eye are real or virtual? Let's make it simpler. A photon is emitted during de-excitation of an atom and is absorbed by another atom. Was this photon real or not?
As I tried to explain earlier, the only consistent way we currently have to define a particle as "real" is off at infinity, as an asymptotic state. At finite times there is a full continuum, with some definitions being appropriate extensions for particular applications, but ultimately there is no black-and-white distinction between real and virtual particles.
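To sketch what "off at infinity" means in practice: scattering theory defines particles as the asymptotic in/out states, and what experiments ultimately probe are S-matrix elements between them,
$$S_{\beta\alpha}=\langle\,\beta,\text{out}\,|\,\alpha,\text{in}\,\rangle,$$
with the in states defined in the limit t -> -infinity and the out states in the limit t -> +infinity. At any finite time the interacting field admits no unique particle decomposition, and that is exactly where the real/virtual distinction blurs.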
Your proposed definition doesn't resolve any of these issues.
Demystifier said:
There is no such thing as a sum of fluctuations. There is only a sum of quantum states, represented e.g. by wave functions. In non-local realistic theories such as the Bohmian interpretation such wave functions are real and evolve deterministically, so they automatically contain the stuff you call "deterministic off-shell fluctuations".
No. In a realist theory, you are requiring the "fluctuation" to have a definite (although unknown) value. Saying it is in a definite superposition of values is no different (that is just a change of basis). You will not get the correct answer.
Also, if you are saying the vacuum is "really" in a definite state containing off-shell values, then one could measure the component of the vacuum in such states ... i.e. one could measure the vacuum to be in a state that violates Lorentz symmetry, etc. That is incorrect. It is not merely that the vacuum expectation value (i.e. the average over many measurements) has Lorentz symmetry; it is that no individual measurement will violate special relativity.
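The standard statement behind this, as a sketch: in a Poincaré-invariant QFT the vacuum is invariant as an exact operator statement, not merely on average,
$$U(\Lambda,a)\,|0\rangle=|0\rangle$$
for every Lorentz transformation Λ and translation a, so no individual measurement on the vacuum can single out a preferred frame. A vacuum that "really" carried a definite off-shell configuration would pick out such a frame.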
Demystifier said:
1. In BM, everything is determined by the initial conditions, but the initial conditions themselves are essentially random.
Again, because you want a realist theory, by "random" here you have to mean an unknown but definite value. That is different from truly random.
Consider the spin of an electron.
If you measure the electron to be in the spin-up state (S_z = +1/2) and then subsequently measure S_y, half the time you will get S_y = -1/2 and the other half you will get S_y = +1/2. Quantum mechanics says this is truly random: S_y was NOT in a definite but unknown state until you measured it.
You, instead, are claiming it was.
So in quantum mechanics it is random. But in your BM interpretation, "essentially random" really means only unknown initial conditions. Multiple interactions will then have correlations that differ from true randomness. The two are not equivalent.
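To spell the spin example out (standard bookkeeping, with hbar = 1): prepare the electron in the S_z = +1/2 eigenstate and expand it in the S_y eigenbasis,
$$|{+}_y\rangle=\tfrac{1}{\sqrt2}\big(|{+}_z\rangle+i\,|{-}_z\rangle\big),\qquad |{-}_y\rangle=\tfrac{1}{\sqrt2}\big(|{+}_z\rangle-i\,|{-}_z\rangle\big),$$
$$P\big(S_y=\pm\tfrac12\big)=\big|\langle\pm_y\,|\,{+}_z\rangle\big|^2=\tfrac12.$$
Quantum mechanics treats those 50/50 outcomes as irreducibly random; a hidden-variable reading instead has to encode them in definite but unknown pre-existing values, and that is where the correlations can come out differently.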
Demystifier said:
2. To have nonlocality and relativity at the same time, the concept of an "initial" condition should be radically revised. A part of it can be in the past, while another part of it can be in the future.
I must be frank here. I feel this is teetering on metaphysics/philosophy rather than actual physics. It proposes a great deal of additional structure that does not even sound testable.
Let me make a historical analogy. Consider Lorentz and his aether theory. He found that it had an interesting symmetry that forbade detecting the aether, and he found this before Einstein even published his paper in 1905. The troubling thing is that even after people came to understand relativity and the powerful understanding that came with it, Lorentz would not give up on the aether. To him, the symmetry was just a mathematical trick: rods really did shrink and clocks really did run slow because their internal interactions were different when they were moving with respect to the aether.
The two theories predicted the same experimental results, so one could claim that we merely have two different but equally valid interpretations of the same physics. However, in Lorentz's theory the Lorentz symmetry is just a mystical conspiracy of the math and several ad hoc conjectures, while in special relativity the Lorentz symmetry is fundamental and those same conjectures can be derived.
Which is physics? I hope most would agree that an aether which cannot be experimentally verified, which requires demoting seemingly fundamental symmetries to mere 'coincidences' in order to cast the equations in terms of interactions with it, and which, worse yet, provides no new predictions, is at best metaphysics.
The lesson: good physics does not come from mangling theories to insist on a priori beliefs based on intuition.
Your theories do the same. You destroy locality, but in such a contrived manner that it cannot be measured. Lorentz symmetry is now emergent, yet probing deeper at any level will not give details of this 'emergence'. And now you are laying the groundwork to allow anything to be explained away by saying it deals with "initial" conditions in the future. Things will become even more convoluted if you actually manage to derive the Lamb shift and the magnetic moment of the electron (it would be hard to take your "new QFT" seriously without at least showing how such calculations could be done in principle and that the correct results should follow ... relativistic quantum mechanics alone gets the wrong result; it took field theory to get the rest, so such calculations really demonstrate that final step from non-relativistic quantum mechanics -> relativistic quantum mechanics -> quantum field theory).
Any such theories are not seen by the mainstream as attempts to resolve 'foundational issues of quantum mechanics', but as hiding behind the excuse of foundational issues to try to wedge antiquated a priori intuition back into the theory.
They serve as examples not of what additional interpretations are compatible with the theory, but of what must be sacrificed in order to force particular, unnecessary requirements into it. These sacrifices are too big. They prevent any possibility of making new predictions or advances, so the theories are dead ends. Just as Lorentz aether theory now serves only as a historical example, so too do these ideas. No new predictions have come from them.