Insights How I Stopped Worrying and Learned to Love Orthodox Quantum Mechanics - Comments

Summary:
The discussion centers on the interpretation of quantum mechanics, particularly the merits of Bohmian Mechanics as a coherent alternative to traditional quantum mechanics (QM). Participants express interest in the implications of non-relativistic versus relativistic frameworks, questioning the existence of non-relativistic particles in a fundamentally relativistic universe. The conversation also touches on the chiral fermion problem and the potential for string theory to provide a more fundamental understanding of particle behavior. There is a debate regarding the definitions of orthodox QM and the measurement problem, with differing opinions on the interpretations presented in Peres' work. Overall, the dialogue reflects a deep engagement with the foundational issues in quantum mechanics and the search for clarity in its interpretations.
  • #91
bluecap said:
(quoted from Wikipedia) Spatial information would be exhibited by states represented as functions on configuration space. The transitions may be non-deterministic or probabilistic or there may be infinitely many states.
This is my whole "worry" about quantum mechanics. How is classical physics "realized" from "infinitely many states"? It is quite simple to understand how one degree of freedom (spin up/down) can have only two possible outcomes in the real, physical world we know and trust. But are there any constraints within quantum physics that only allow classically physical results for more complex systems, or is that just "shut up and calculate" and the answers always come out realistic once applied?

stevendaryl said:
String theory is an attempt to have a theory in which there is only one type of object (not a particle, I guess, since it's not a point-mass).
I made that "point" and demystifier replied that a string can split into 2 strings...
 
  • #92
stevendaryl said:
A theory with just one particle would be pretty bizarre. But it might be possible, if that one particle travels back and forth through time (if you take literally the idea that an antiparticle is a particle moving back in time).
Any further info about this requirement would be appreciated! In my mind, retrocausality could be avoided with complete knowledge of the variables; is that not the case?
 
  • #93
bluecap said:
Oh, I didn't mean the Schrödinger wave function written in the position basis. I meant the state vectors (or whatever) used by Many Worlds and Bohmian mechanics, where they are objective. If MWI and the Bohmians can make them objective, why can't Copenhagen make them objective?

Sorry for these basic questions (but I'd not ask more in this thread). I'll leave you experts to discuss stuff more professionally in this professional Insight thread.. thanks..
Of course, all I said about the wave function is equally valid for the representation-free formulation, which indeed makes QT much clearer.

Of course, the quantum state is objective also in the minimal interpretation. We haven't even discussed this question in the entire thread yet. The state is operationally defined by an equivalence class of preparation procedures and as such is independent of any subjective influence.
 
  • #94
bluecap said:
So what's the problem with the view that the collapse in Copenhagen occurs after decoherence? Then you know the location of the cut (after decoherence), as bhobba seemed to be saying above, so there is no need to figure out where the classical-quantum cut is.

Maybe I'm misunderstanding something about decoherence, but in my superficial way of thinking about it, there isn't an objective, precise moment of decoherence. Somebody will please correct me if I'm wrong about this, but the way I think of it is that in any experiment, there is a division of the universe into:
  1. The system being studied (say, a particle)
  2. The apparatuses (apparati?) used to prepare and later measure the system.
  3. The rest of the universe (the "environment")
The system being studied can only briefly be described using a wave-function (pure state). After it interacts with systems 2 and 3, its state becomes entangled with the states of other (generally macroscopic) systems. At that point, unless you are using a wave function for the entire universe, you are forced to describe the system of interest using mixed states (density matrices), where the degrees of freedom due to systems 2 and 3 are "traced over". A density matrix can be interpreted using classical probability: the system is in this or that state, we just don't know which, and the density matrix gives the various probabilities. After you've switched to a mixed-state description, you're free to think that the wave function of the system of interest has "collapsed", and you just don't know what state it's collapsed into. (This is slightly different from the "collapse" interpretation, which says that the act of measurement causes the collapse. There doesn't actually have to be a measurement or observation, as long as the system of interest gets entangled with the environment.)

Decoherence is just the process by which one system becomes hopelessly entangled with an environment so that for practical purposes, we switch from a pure state description to a mixed state description. But the whole decoherence process as I understand it (which I very well may not) depends on our splitting the universe into a system of interest plus everything else. So there is no objective decoherence process.
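A minimal numerical sketch of that tracing-over step, assuming the simplest toy case of one system qubit entangled with a single environment qubit (plain numpy, purely illustrative):

Code:
import numpy as np

# System alone in a pure superposition (|0> + |1>)/sqrt(2):
psi_sys = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi_sys, psi_sys)           # off-diagonal 0.5's = coherence

# After interacting with one environment qubit the joint state is entangled:
# (|0>|E0> + |1>|E1>)/sqrt(2), with orthogonal environment states.
psi_joint = np.zeros(4)
psi_joint[0] = psi_joint[3] = 1.0 / np.sqrt(2)  # |0,E0> and |1,E1> components
rho_joint = np.outer(psi_joint, psi_joint)

# "Trace over" the environment to get the reduced state of the system.
rho_reduced = np.einsum('iaja->ij', rho_joint.reshape(2, 2, 2, 2))

print(np.round(rho_pure, 3))     # [[0.5 0.5] [0.5 0.5]]  coherent pure state
print(np.round(rho_reduced, 3))  # [[0.5 0. ] [0.  0.5]]  off-diagonals gone: mixed

The off-diagonal terms of the system's density matrix vanish once the environment is traced out, which is the practical content of "switching to a mixed-state description".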
 
  • Like
Likes martinbn
  • #95
vanhees71 said:
Of course, the quantum state is objective also in the minimal interpretation. We haven't even discussed this question in the entire thread yet. The state is operationally defined by an equivalence class of preparation procedures and as such is independent of any subjective influence.

I don't think that defining a state as an equivalence class of preparation procedures eliminates subjectivity. The notion of equivalence of preparation procedures requires a judgement of when two preparation procedures are the same. That seems subjective to me.
 
  • #96
stevendaryl said:
Maybe I'm misunderstanding something about decoherence, but in my superficial way of thinking about it, there isn't an objective, precise moment of decoherence. Somebody will please correct me if I'm wrong about this, but the way I think of it is that in any experiment, there is a division of the universe into:
  1. The system being studied (say, a particle)
  2. The apparatuses (apparati?) used to prepare and later measure the system.
  3. The rest of the universe (the "environment")
The system being studied can only briefly be described using a wave-function (pure state). After it interacts with systems 2 and 3, its state becomes entangled with the states of other (generally macroscopic) systems. At that point, unless you are using a wave function for the entire universe, you are forced to describe the system of interest using mixed states (density matrices), where the degrees of freedom due to systems 2 and 3 are "traced over". A density matrix can be interpreted using classical probability: the system is in this or that state, we just don't know which, and the density matrix gives the various probabilities. After you've switched to a mixed-state description, you're free to think that the wave function of the system of interest has "collapsed", and you just don't know what state it's collapsed into. (This is slightly different from the "collapse" interpretation, which says that the act of measurement causes the collapse. There doesn't actually have to be a measurement or observation, as long as the system of interest gets entangled with the environment.)

Decoherence is just the process by which one system becomes hopelessly entangled with an environment so that for practical purposes, we switch from a pure state description to a mixed state description. But the whole decoherence process as I understand it (which I very well may not) depends on our splitting the universe into a system of interest plus everything else. So there is no objective decoherence process.

I think what Bill meant was that the cut occurs the moment the system loses phase coherence by becoming hopelessly entangled with an environment; then the wave function (or state vector) collapses into one value (in a collapse interpretation). We use the density matrix only as a tool to trace out the environment, even if the superposition is, as far as we know, still theoretically there. So the objective decoherence process occurs when the system loses phase coherence, which may occur before we do any tracing. Maybe Bill can clarify this, as he is well versed in decoherence and the cut...
 
  • #97
martinbn said:
My question was why the classical/quantum cut is a problem. Now you are just making general statements about the measurement problem.

The classical/quantum cut is the very definition of the measurement problem. They are equivalent.

martinbn said:
That is very strange. It's like complaining that in such-and-such a book on algebraic geometry, where sets are used, there is no reference to Russell's paradox. If there are calculational mistakes, you can point them out. But there is a difference between the foundations as the basics and the logical foundations. A book on the foundations of differential geometry will likely not talk about set theory and mathematical logic, and that is not an error.

No, it's complaining about a book on mathematics that claims that standard mathematics is wrong! Ballentine claims standard physics is wrong. Sorry, but standard physics is right, and Ballentine is rubbish.
 
  • #98
stevendaryl said:
I don't think that defining a state as an equivalence class of preparation procedures eliminates subjectivity. The notion of equivalence of preparation procedures requires a judgement of when two preparation procedures are the same. That seems subjective to me.
You can decide about the state only by measurement, and there's nothing subjective about it. Complete state determination can, of course, only be done on ensembles, never by just a single measurement due to the probabilistic nature of the quantum state, but what do you think is subjective about it? E.g., you can determine a system to be in a pure state by doing a simultaneous von Neumann filter measurement of a complete set of compatible observables. This is an objective procedure, but it can be realized in different ways using different measurement and filter devices. That's why I talked about "an equivalence class of preparation procedures".
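As a minimal sketch of what complete state determination on an ensemble can look like in the simplest case of a single qubit: the density matrix is fixed by the measured expectation values of the three Pauli observables via rho = (I + <sx> sx + <sy> sy + <sz> sz)/2. The expectation values below are made-up numbers, purely for illustration:

Code:
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# Hypothetical relative frequencies from three large sub-ensembles,
# one per Pauli observable (illustrative numbers, not real data).
exp_x, exp_y, exp_z = 0.70, 0.00, 0.71

# State reconstruction: rho = (I + <sx> sx + <sy> sy + <sz> sz) / 2
rho = (I2 + exp_x * sx + exp_y * sy + exp_z * sz) / 2

print(np.round(rho, 3))
print("trace :", np.trace(rho).real)          # 1.0 by construction
print("purity:", np.trace(rho @ rho).real)    # close to 1 => nearly a pure state

Any two procedures that lead to the same reconstructed rho count as the same preparation in this operational sense, which is one way to read "an equivalence class of preparation procedures".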
 
  • #99
vanhees71 said:
You can decide about the state only by measurement, and there's nothing subjective about it.

I don't agree that that's true. A measurement occurs when the state of the system of interest becomes correlated with a macroscopic variable that we can check ourselves. The subjectivity is the choice of which variable will count as a measurement.
 
  • #100
jerromyjon said:
I made that "point" and demystifier replied that a string can split into 2 strings...
String theory is not only a theory of strings. Perturbative string theory is just an approximation of M-theory, which contains branes.
 
  • #101
haushofer said:
Perturbative string theory is just an approximation of M-theory, which contains branes.
So that makes them higher-dimensional than the particles of the standard model, but still particles... are they in addition to strings, or do they constitute them?
 
  • #102
atyy said:
No, it's complaining about a book on mathematics that claims that standard mathematics is wrong! Ballentine claims standard physics is wrong. Sorry, but standard physics is right, and Ballentine is rubbish.
Or standard physics is incomplete, but no one knows for sure yet.
 
  • #103
vanhees71 said:
Well, there is no cut; at least nobody could empirically prove that there is anything that doesn't follow quantum theory but must be described classically. Classical physics is understood as an effective description of quantum physics for sufficiently coarse-grained observables of macroscopic objects, and decoherence is among the strongest mechanisms at work to let macroscopic objects appear classical. Another hint is that the classical-quantum cut is artificial and can often be shifted from one part of the description of a system to another, as applicable.
Making the claim over and over again that the Measurement Problem and/or the Classical/Quantum Cut are issues that have been resolved by techniques such as decoherence is simply factually false. You can ignore the problems for most practical purposes if they don't interest you, but you are mistaken to assume they have been resolved.

But don't take my word for it; read the works of top level physicists who work in the foundations of quantum physics. As Anthony Leggett (winner of the 2003 Nobel Prize) says: Decoherence is a technical trick for pretending to have solved the measurement problem.
 
  • Like
Likes Demystifier, Boing3000 and dextercioby
  • #104
Physics Footnotes said:
Making the claim over and over again that the Measurement Problem and/or the Classical/Quantum Cut are issues that have been resolved by techniques such as decoherence is simply factually false. You can ignore the problems for most practical purposes if they don't interest you, but you are mistaken to assume they have been resolved.

But don't take my word for it; read the works of top level physicists who work in the foundations of quantum physics. As Anthony Leggett (winner of the 2003 Nobel Prize) says: Decoherence is a technical trick for pretending to have solved the measurement problem.

Decoherence just scrambles the probabilities; it doesn't produce an outcome the way collapse does.
I've been trying to understand vanhees71's position because I'd like to become an ensemble interpretation proponent too; all these Copenhagen, MWI, and BM views seem ad hoc and so medieval, and I'd love to stop worrying and learn to love orthodox quantum mechanics too. vanhees71 said "the quantum state is objective also in the minimal interpretation." How does he treat the problem of outcomes? In a few sentences, can you summarize his views? Is his also the view of mainstream physicists who think the problem of outcomes is not necessary? How does an outcome occur in vanhees71's minimal ensemble interpretation for a single system? Does he believe single systems don't literally exist, or does he believe they exist and he just wants to block thinking about it, so he focuses on the minimal interpretation? In his arguments he seems to be saying single systems don't exist, or does he simply mean it's not necessary to think about them? What does he actually think, according to those who have discussed with him for many years? I just want to understand it in another choice of words, which others can express so I'd understand it better. It would take me a week to read all his messages in the archive, so I'd like some pointers to his main punchline from those who have thoroughly understood him.

And so as not to be off topic, I've been wondering: in conventional Bohmian mechanics, are all the particles identical? Remember, it is the wave function that does all the work; it just pushes the particles around via the quantum potential. So are the particles in, say, an electron and a quark identical particles (in BM) that you could interchange with no effect? Again, remember that the properties of the particle are all stored in the wave function or state vector, so that when the particle accelerates in the atom it doesn't lose energy, because the energy is in the wave function and, by some dynamics with the quantum potential, it doesn't lose energy; it just pushes the worker particle around like a slave.
 
  • #105
jerromyjon said:
Or standard physics is incomplete, but no one knows for sure yet.

No, that's not what I meant when I said Ballentine says standard physics is wrong. Ballentine claims Copenhagen makes wrong predictions. Thus Ballentine claims quantum mechanics has already been falsified. That is untrue.

Further, Ballentine avoids the classical-quantum cut and collapse, leading to wrong physics in his book. The classical-quantum cut and collapse are the clearest indications that quantum mechanics is incomplete. Because of Ballentine's error, some who read his book make wrong arguments in favour of the possible completeness of quantum mechanics.
 
  • #106
Physics Footnotes said:
Decoherence is a technical trick for pretending to have solved the measurement problem.

Bingo - it doesn't - it just morphed it.

I had a note about my view of locating the classical-quantum cut just after decoherence. There is nothing that says you have to do that - it's simply that, after understanding decoherence, it's the most reasonable place to put it - resolving many issues. But that says nothing about it actually being there.

What it does, however, is disprove von Neumann's infinite-regress argument that, since there is no place inherently different from any other at which to place the cut, the only real place that is different is the consciousness of the observer - we now know a place that is different: just after decoherence.

It's pretty much standard textbook stuff:
https://www.amazon.com/dp/3540357734/?tag=pfamazon01-20

Schlosshauer clearly explains what it does solve and what it does not solve.

I will repeat - it does not solve the measurement problem. The problem comes in three parts I will not detail (read the book if interested). It solves the first two - but stands impotent before the third - technically, how does an improper mixed state become a proper one; colloquially, why do we get any outcomes at all? There are numerous views on that - mine is - who cares - it's just the way nature is. Others have a different view.
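A minimal numerical illustration of that third part, under the same toy assumption of one system qubit entangled with one environment qubit: the improper mixture obtained by tracing out the environment and a proper 50/50 ignorance mixture are represented by exactly the same density matrix, so nothing measured on the system alone can tell them apart (plain numpy, illustrative only):

Code:
import numpy as np

# Improper mixture: trace the environment out of the entangled pure state
# (|0>|E0> + |1>|E1>)/sqrt(2).
psi = np.zeros(4)
psi[0] = psi[3] = 1.0 / np.sqrt(2)
rho_improper = np.einsum('iaja->ij', np.outer(psi, psi).reshape(2, 2, 2, 2))

# Proper mixture: a classical 50/50 ensemble of systems prepared in |0> or |1>.
rho_proper = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

print(np.allclose(rho_improper, rho_proper))  # True: identical statistics

That numerical identity is why decoherence alone cannot say how the improper mixture becomes a proper one.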

Make up your own mind - it doesn't really affect anything. I have said it before, and will say it again: the value of studying various interpretations is to understand the formalism better - what is it really saying and what is interpretation. A common one is this collapse idea. On a first reading of QM you think it has collapse on observation - some textbooks even have it as a postulate. But MW, BM and Stochastic Mechanics all do not have collapse, so it can't be part of the formalism - which it isn't, as you will be acutely aware if you study Ballentine.

Thanks
Bill
 
  • Like
Likes Daria K
  • #107
bhobba said:
Bingo - it doesn't - it just morphed it.

I had a note about my view of locating the classical-quantum cut just after decoherence. There is nothing that says you have to do that - it's simply that, after understanding decoherence, it's the most reasonable place to put it - resolving many issues. But that says nothing about it actually being there.

What it does, however, is disprove von Neumann's infinite-regress argument that, since there is no place inherently different from any other at which to place the cut, the only real place that is different is the consciousness of the observer - we now know a place that is different: just after decoherence.

Is there a way, or no way, to prove that the outcome occurs right after decoherence? Because if there is a delay, it means that even when the phase of the system decoheres (when it suffers decoherence), the von Neumann cut can still be moved. And if the outcome really occurs after decoherence, does this show that the natural state of the wave function or state vector is coherence, and that if something decoheres it, it suddenly collapses? (Assuming collapse is correct, or let's say we are discussing collapse instead of MWI or BM.) It reminds me of Penrose's gravitationally induced collapse, where spacetime sort of gets destabilized when the coherence of the system is lost, so the particle collapses (because spacetime is telling it to collapse?). And again, is there a way, or no way, to prove that the outcome occurs right after decoherence (or to locate the classical-quantum cut right after decoherence)?

It's pretty much standard textbook stuff:
https://www.amazon.com/dp/3540357734/?tag=pfamazon01-20

Schlosshauer clearly explains what it does solve and what it does not solve.

I will repeat - it does not solve the measurement problem. The problem comes in three parts I will not detail (read the book if interested). It solves the first two - but stands impotent before the third - technically, how does an improper mixed state become a proper one; colloquially, why do we get any outcomes at all? There are numerous views on that - mine is - who cares - it's just the way nature is. Others have a different view.

Make up your own mind - it doesn't really affect anything. I have said it before, and will say it again: the value of studying various interpretations is to understand the formalism better - what is it really saying and what is interpretation. A common one is this collapse idea. On a first reading of QM you think it has collapse on observation - some textbooks even have it as a postulate. But MW, BM and Stochastic Mechanics all do not have collapse, so it can't be part of the formalism - which it isn't, as you will be acutely aware if you study Ballentine.

Thanks
Bill
 
  • #108
atyy said:
No, that's not what I meant when I said Ballentine says standard physics is wrong. Ballentine claims Copenhagen makes wrong predictions. Thus Ballentine claims quantum mechanics has already been falsified. That is untrue.

Ballentine does not do that. Ballentine has two issues I am aware of:

1. He considers the only version of Copenhagen to be the one where the wave-function is objectively real. That's poppycock - the vast majority of versions of Copenhagen treat the wave-function, like Bayesian probability, as a kind of rational being's expectation. It's actually pretty close to his Ensemble interpretation except for a different view of probability - one is Bayesian, the other frequentist. Many interpretations of QM are like that - just a rehash of arguments about the meaning of probability:
http://math.ucr.edu/home/baez/bayes.html

2. The above is from his otherwise excellent textbook - but gee, nobody is perfect and you have to take the book overall - from that viewpoint, IMHO it's still by far the best text out there - just my view of course. But elsewhere he has made another error - he states decoherence has no bearing on interpretive issues. Rubbish - it has revolutionized our understanding of QM interpretations, clearly pinpointing the real issue of QM I stated above. But then again, Ballentine believes his ensemble interpretation solves all issues anyway. He is correct - but decoherence has deepened our understanding of that and many other interpretations. We also have interpretations like decoherent histories where it's part of the interpretation itself.

Thanks
Bill
 
  • #109
bluecap said:
Is there a way, or no way, to prove that the outcome occurs right after decoherence?

I stated clearly - and will restate it - NO. Von Neumann proved you can put the quantum-classical cut virtually anywhere. As far as I know, and having gone through the proof myself many moons ago, it's still valid. It's an unprovable interpretive assumption to place it there. However, it solves many issues in one stroke.

Just to be 100% sure on this - you can't do it - you can't prove the cut happens anywhere.

Thanks
Bill
 
  • #110
bhobba said:
Ballentine does not do that.

Yes, Ballentine does do that.

Error 1: In his discussion of the spin recombination experiment, he says that experimental data are inconsistent with Copenhagen.
Error 2: He is wrong in his discussion of the quantum Zeno paradox.

Neither error is incidental, but comes from his fundamental dislike of standard physics.
 
  • #111
vanhees71 said:
Of course, all I said about the wave function is equally valid for the representation-free formulation, which indeed makes QT much clearer.

Of course, the quantum state is objective also in the minimal interpretation. We haven't even discussed this question in the entire thread yet. The state is operationally defined by an equivalence class of preparation procedures and as such is independent of any subjective influence.

After analyzing vanhees71's statements and reading some of the archives for some hours, I think he is really a Copenhagenist in disguise! Here's why. First, vanhees71 stated:

1. "There is no cut"
2. "Of course, the quantum state is objective also in the minimal interpretation".

But Neumaier stated elsewhere:

"The minimal interpretation is significantly different from any version that deserves (in my view) to be called Copenhagen. In the Copenhagen interpetation (prevailing until the 1970es), each single object is in a well-defined (though possibly unknown) pure state, which collapses to a different state upon measurement. In contrast, in the (much later sensibly defined) minimal, statistical interpretation, the state is a property of the source (i.e., preparation procedure), not of the single quantum object. If you call the minimal interpretation a flavor of Copenhagen then the term ''Copenhagen interpretation'' loses its discriminating meaning."

Reference https://www.physicsforums.com/threa...presented-as-such.850860/page-21#post-5377217

In a bona fide statistical interpretation, the quantum state is not objective. But vanhees71 clearly stated: "Of course, the quantum state is objective also in the minimal interpretation," so vanhees71 is really a Copenhagenist at heart. And I think it's a reasonable view. The pragmatic, bona fide statistical interpretation proponents are those who believe only the measured statistics in the detectors make sense, and who blank out what happens between emission and detection. vanhees71 is not that. I'd like to know: are mainstream physicists mostly bona fide statistical interpretation proponents, or hidden or unwilling Copenhagenists like vanhees71?
 
  • #112
bluecap said:
""Of course, the quantum state is objective also in the minimal interpretation" so vanhees71 is really a Copenhagenist by heart.

I have already stated that Ballentine gets Copenhagen wrong - that is old news. He believes Copenhagenists think the wave function is real. Most versions do not believe that, although you can easily get that view reading some older textbooks.

Vanhees, like me, believes in the Ensemble interpretation, which has a different view of what the wave-function is. It's simply the frequentist and Bayesian views of probability rehashed.

This one error does not invalidate that entire excellent textbook, and it has been rehashed over and over again - there is simply no need to keep going over it.

Thanks
Bill
 
  • Like
Likes Demystifier and bluecap
  • #113
bhobba said:
I have already stated that Ballentine gets Copenhagen wrong - that is old news. He believes Copenhagenists think the wave function is real. Most versions do not believe that, although you can easily get that view reading some older textbooks.

Vanhees, like me, believes in the Ensemble interpretation, which has a different view of what the wave-function is. It's simply the frequentist and Bayesian views of probability rehashed.

This one error does not invalidate that entire excellent textbook, and it has been rehashed over and over again - there is simply no need to keep going over it.

Thanks
Bill

Oh, actually this is the first time I've heard about this. I'll read Ballentine tomorrow, curious to see what all the fuss is about. Thanks for the tips. Btw, do you consider the quantum state as objective, or are you concerned only with the Bayesian and frequentist aspects of it? Then you are a genuine Ensemble Interpretation proponent, while vanhees71 is more of a hybrid ensembler/Copenhagenist, right? He believes the quantum state is objective while you are agnostic. We mustn't categorize from the book or author only, but from technical considerations. Many thanks.
 
  • #114
bluecap said:
Oh, actually this is the first time I've heard about this. I'll read Ballentine tomorrow, curious to see what all the fuss is about. Thanks for the tips. Btw, do you consider the quantum state as objective, or are you concerned only with the Bayesian and frequentist aspects of it? Then you are a genuine Ensemble Interpretation proponent, while vanhees71 is more of a hybrid ensembler/Copenhagenist, right? He believes the quantum state is objective while you are agnostic. We mustn't categorize from the book or author only, but from technical considerations. Many thanks.

Both Vanhees and I advocate the Ensemble interpretation. Why you believe he is some kind of hybrid beats me.

The ignorance ensemble just applies it to the mixed state after decoherence - that's all.

Thanks
Bill
 
  • Like
Likes bluecap
  • #115
bhobba said:
I have already stated that Ballentine gets Copenhagen wrong - that is old news. He believes Copenhagenists think the wave function is real. Most versions do not believe that, although you can easily get that view reading some older textbooks.

Vanhees, like me, believes in the Ensemble interpretation, which has a different view of what the wave-function is. It's simply the frequentist and Bayesian views of probability rehashed.

This one error does not invalidate that entire excellent textbook, and it has been rehashed over and over again - there is simply no need to keep going over it.

Thanks
Bill
In my opinion the problem is that bluecap and I have different understandings of the words "real" and "objective". First of all, I must admit, I don't know what philosophers mean by "real" or "realistic interpretation". It's such a mess of different meanings that I like to avoid using this word, and I don't know where in physics I need it anyway. A better distinction is whether you have an ontological or epistemological interpretation of the quantum state (which in the formalism is represented by the statistical operator): In the ontological version, the philosopher believes that the state, represented by the statistical operator, really is in one-to-one relation with the described object. In my opinion this interpretation has been refuted already in the very early days of modern QT: An electron is observed as a point-particle-like object when one makes a position measurement (e.g., by putting a photo plate in its way) but not as a smeared-out continuous charge or mass distribution. That's why Born introduced the probability distribution, and I think that QT is only consistent with all observations and also with the relativistic causality structure of spacetime if one accepts this probabilistic meaning of the wave function. Thus I think the state is epistemic, i.e., it is a concise description of our knowledge about a system due to some preparation procedure bringing the system into this state. This implies that what some Copenhagen flavors of interpretation call "collapse" is just an update of our knowledge when measuring an observable on a system, which, after an analysis of the interaction between the system and the measurement apparatus, enables me to associate another state with the system (although very often, the system is simply destroyed by the measurement, e.g., a photon gets absorbed in being registered via the photoeffect, which enables its registration by the measurement device like a photoplate or a modern CCD camera).

Another question is, whether there is subjectivity in QT, and I don't think so. It's objectively defined what a quantum state is. It's independent of the individual researcher what it means to prepare a photon with a certain momentum distribution and polarization state, and in principle anybody can objectively prepare photons in the so described state.

The natural sciences don't deal with subjective notions but restrict themselves strictly to objective properties of observable (and quantifiable) phenomena.
 
  • Like
Likes bhobba and fresh_42
  • #116
vanhees71 said:
In my opinion the problem is that bluecap and I have different understandings of the words "real" and "objective". First of all, I must admit, I don't know what philosophers mean by "real" or "realistic interpretation". It's such a mess of different meanings that I like to avoid using this word, and I don't know where in physics I need it anyway. A better distinction is whether you have an ontological or epistemological interpretation of the quantum state (which in the formalism is represented by the statistical operator): In the ontological version, the philosopher believes that the state, represented by the statistical operator, really is in one-to-one relation with the described object. In my opinion this interpretation has been refuted already in the very early days of modern QT: An electron is observed as a point-particle-like object when one makes a position measurement (e.g., by putting a photo plate in its way) but not as a smeared-out continuous charge or mass distribution. That's why Born introduced the probability distribution, and I think that QT is only consistent with all observations and also with the relativistic causality structure of spacetime if one accepts this probabilistic meaning of the wave function.

I was influenced by Everett, who somehow was able to make the state, represented by the statistical operator, really in one-to-one relation with the described object without any collapse. Remember that in Many Worlds the wave function forms many worlds, where instead of collapsing to one eigenstate, all eigenstates exist; but I think he made a trick somewhere. In case you know how Everett did it, please share how the trick is done in one message and that's it; I'd no longer ask more in this thread.

Sorry for this novice question. Don't worry, when Demystifier returns from his weekend hiatus tomorrow I'll leave the discussion so the experts can have more productive discussions without novices disturbing the tone. Thanks a lot vanhees71! Btw, I'm not a philosopher, but an applied science novice.

Thus I think the state is epistemic, i.e., it is a concise description of our knowledge about a system due to some preparation procedure bringing the system into this state. This implies that what some Copenhagen flavors of interpretation call "collapse" is just an update of our knowledge when measuring an observable on a system, which, after an analysis of the interaction between the system and the measurement apparatus, enables me to associate another state with the system (although very often, the system is simply destroyed by the measurement, e.g., a photon gets absorbed in being registered via the photoeffect, which enables its registration by the measurement device like a photoplate or a modern CCD camera).

Another question is, whether there is subjectivity in QT, and I don't think so. It's objectively defined what a quantum state is. It's independent of the individual researcher what it means to prepare a photon with a certain momentum distribution and polarization state, and in principle anybody can objectively prepare photons in the so described state.

The natural sciences don't deal with subjective notions but restrict themselves strictly to objective properties of observable (and quantifiable) phenomena.
 
  • #117
bluecap said:
I was influenced by Everett, who somehow was able to make the state, represented by the statistical operator, really in one-to-one relation with the described object without any collapse. Remember that in Many Worlds the wave function forms many worlds, where instead of collapsing to one eigenstate, all eigenstates exist; but I think he made a trick somewhere. In case you know how Everett did it, please share how the trick is done in one message and that's it; I'd no longer ask more in this thread.

Sorry for this novice question. Don't worry, when Demystifier returns from his weekend hiatus tomorrow I'll leave the discussion so the experts can have more productive discussions without novices disturbing the tone. Thanks a lot vanhees71! Btw, I'm not a philosopher, but an applied science novice.
Let me clarify my point so I'd get a quick answer from those who already know. The wave function is spread out, so it can't be the particle itself. This is why Born proposed the probability interpretation about a century ago. This is also what vanhees71 was talking about; that's why he said this view was abandoned in 1927. But in Everett's relative-state formulation, or Many Worlds generally, what you have is simply more superposition and entanglement, yet just the same you still have spread-out waves in each branch. So without the Born amplitude-squared probability rule, how does the wave manifest as a particle (in each branch)? Can someone who knows (like stevendaryl) please answer this? Thanks.

Most discussions I read in the archives are about how to get the Born rule so that worlds exist according to their weight or probability, but nowhere is it mentioned how the wave turns into a particle in each branch (I've been searching for hours on end). So I hope someone can correct my misconception (if you think it's necessary to reply to this in a separate thread, then please create one to avoid watering down the issues in this thread, which is about the condensed matter quasiparticle thing).

Btw, in Bohmian mechanics, the mechanism by which the wave manifests as a particle is simply the quantum potential that pushes the particle around. The only problem here is that it's difficult to reconcile with QFT, where particles are annihilated and created (the quantum potential only pushes things around; it doesn't create or annihilate particles). That's why I find Demystifier's condensed matter analogy of our relativistic particles as quasiparticles intriguing, as it can explain QFT particles. And I'd like to know if there is a way to refute it… So if there is, and there is no way for BM to explain QFT, then I have to be stuck with MWI or Copenhagen as most likely (with the others, Objective Collapse, Cramer's/Kastner's Transactional, etc., in decreasing order of plausibility or vice versa), and at least we have one fewer as we eliminate Bohmian mechanics due to its severe inability to be relativistic. Many thanks.
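A minimal numerical sketch of that "pushing around," using the standard first-order de Broglie-Bohm guidance equation dx/dt = (hbar/m) Im(dpsi/dx / psi) for a free Gaussian packet (units hbar = m = 1; the packet width, start point, and step sizes are illustrative assumptions, and the guidance equation is used in place of the equivalent quantum-potential formulation):

Code:
import numpy as np

hbar = m = 1.0
sigma0 = 1.0

def psi(x, t):
    # Freely spreading Gaussian wave packet (unnormalized; the normalization
    # cancels in dpsi/psi below).
    st = sigma0 * (1 + 1j * hbar * t / (2 * m * sigma0**2))
    return np.exp(-x**2 / (4 * sigma0 * st))

def velocity(x, t, dx=1e-5):
    # Guidance equation: v = (hbar/m) * Im( dpsi/dx / psi )
    dpsi = (psi(x + dx, t) - psi(x - dx, t)) / (2 * dx)
    return (hbar / m) * np.imag(dpsi / psi(x, t))

# Integrate one Bohmian trajectory starting at x = 1 (simple Euler steps).
x, t, dt = 1.0, 0.0, 1e-3
for _ in range(5000):
    x += velocity(x, t) * dt
    t += dt

print("x(t=5)            :", round(x, 3))
print("packet spread x0*s:", round(np.sqrt(1 + (hbar * t / (2 * m * sigma0**2))**2), 3))

The trajectory simply rides the spreading of the packet: the wave does all the dynamical work and the particle is carried along, which is the point being made about BM above.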
 
  • Like
Likes vanhees71 and Demystifier
  • #118
jerromyjon said:
So that makes them higher-dimensional than the particles of the standard model, but still particles... are they in addition to strings, or do they constitute them?
In perturbative string theory, branes are solitonic objects. In M-theory, strings aren't there anymore.
 
  • Like
Likes bhobba
  • #119
martinbn said:
But the theory shouldn't tell you where to put the cut. It should be able to handle all possible scenarios. It's like asking from classical mechanics to tell you what the forces are, or what coordinates you should use.
Classical mechanics does not tell us what the forces are, but it is experiments that tell us what the forces are. On the other hand, experiments do not seem to tell us where the cut is.

Concerning the question what coordinates one should use, this is not a good analogy because in principle any coordinates are OK, except that in some coordinates the problem looks more complicated. By contrast, it is certainly not the case that any cut is OK.
 
  • Like
Likes atyy and vanhees71
  • #120
bluecap said:
That's right. For a week I kept wondering how the wave function decides to collapse after it is decohered. My analogy (silly as it is) is that the wave function is very sensitive and commits suicide (collapses) when any of its secrets is known (or it loses phase coherence). I'll continue to think about it but won't mention it in this thread again.

So as not to be off topic: Demystifier's idea of our particles, like the electron and quark, as relativistic quasiparticles (like phonons) from condensed matter physics is great, with the real Bohmian particles as the non-relativistic ontology. Actually, I first heard of it early this year from his paper, and I'd like to ask Demystifier what the speed limit of the real Bohmian particles is: is it not limited by c? If you don't know, I hope Demystifier can answer this when he gets back. Thanks.
No speed limit at the fundamental level. (Which, as a byproduct, may also solve the black-hole information paradox.)
 
