Classical and Quantum Mechanics via Lie Algebras

The discussion centers on the draft of a book titled "Classical and Quantum Mechanics via Lie Algebras," which aims to demonstrate the similarities between classical and quantum mechanics through the lens of Lie algebra. The author seeks feedback to enhance the presentation of the material, which includes a thermal interpretation of quantum mechanics, arguing that quantum mechanics can be understood in a coherent manner by considering thermodynamic principles. Key points include the assertion that classical and quantum mechanics are fundamentally similar and that fields, rather than particles, should be viewed as the primary entities in physics. Critics express skepticism about the thermal interpretation's alignment with modern probabilistic views of nature, while supporters highlight its unique ability to reconcile deterministic and stochastic interpretations of quantum mechanics. The thread emphasizes the need for clarity and rigor in discussing these complex topics.
  • #91
A. Neumaier said:
Yes. It may well turn out that gravitation is a pure thermodynamic effect. But in my book and lectures I am sticking to the most solidly accepted part of quantum mechanics, to avoid any unnecessary friction.
Yes, well understood, and I concur. It is obvious that you are not explicitly trying to imply anything outside the standard model, but rather just expressing it from a particular context. I just threw this in as an extension, more or less as an afterthought, to illustrate that the interpretation could potentially run deeper than what you intend to convey.

A. Neumaier said:
This was Schroedinger's idea, but turned out to be not realizable as the dimensions are vastly different. In the thermal interpretation, the ontological status of beables is given to the field expectations, which are true fields in spacetime rather than objects in a high-dimensional space. This is the improvement upon Schroedinger and the reason why everything works neatly and intuitively.
This is where your thinking on the subject appears superior to mine. I was well aware that once you take the analogies with ontological parts of the field, rather than the field itself, too seriously, you run into very distinct problems. The expectation values in QM are simply NOT the positions and momenta of distinct parts the way they are in classical physics. I was not trying to suggest that the analogy provided held in the particulate sense, only that in defining a massive particle in terms of a deformable field, the apparently localized structure is really no more distinct than a wave is in classical physics. I am still trying to work through the details of precisely how you deal with "field expectations" in an ontological sense. Just because it makes sense in a general way is no guarantee of a lack of incongruities, but I have nothing to indicate specific incongruities as yet.


A. Neumaier said:
I never heard of this term. Could you please provide a reference?
Here is one from Phys. Rev. A 54 (1996): http://www.ino.it/~azavatta/References/PRA54p3489.pdf

This phenomenon has also been used in "ghost imaging", which allows a camera to take a picture of something the camera cannot see. It is also used in single-pixel detector setups, and some argue that this is evidence that the effect does not depend on non-local quantum correlations.
http://arxiv.org/abs/0812.2633

I am also very curious about the experiment showing interference from uncorrelated, separable light sources, which is given in the context of the cross-beam experiments mentioned in the paper.
http://arxiv.org/abs/physics/0504166
Though I do not know how much weight to put on these results.
 
  • #92
Now I think I have a more complete picture of your interpretation. I went back to some of your earlier work. Primarily:
http://lanl.arxiv.org/abs/quant-ph/0303047

You place no judgment at all on the noncommutativity of QM, other than as an empirical fact, and conceptually work with the "expectation values" as fluid properties in the classical thermodynamic sense. Thus Hilbert space remains a separate construct, with no specific ontological status assigned directly to it. That certainly escapes many classical issues while still maintaining a direct and unmodified formal transition from one to the other. The difficulty, it appears, is then making the point to people who are so accustomed to assigning distinct empirical properties to distinct points in space.

How do you deal with the conservation issue with wave cancellations? In effect it boils down to: if two quantum waves overlap so as to cancel, what happens to the energy associated with those waves? If they simply become non-existent, there appears to be a conservation violation. Dirac got around this by simply assuming particles could only self-interact, hence they did not really disappear.

This self-interaction hypothesis is, however, dubious:
http://arxiv.org/abs/quant-ph/0312026
 
  • #93
I'm only just beginning to be introduced to the thermal approach, but what I've seen so far shows promise. On another thread, I was grappling with the question of how we should regard "what a classical apparatus knows about itself", in a sense. The usual interpretation is that the projection onto the measuring device is a mixed state, and further, that a mixed state is "in a definite state we just don't know which." Then when we look, it merely confirms what was true before we looked. Although this runs into no problems in experience, it does not seem to be strictly true to our own theory-- our own theory tells us that a mixed state is just a mixed state, and not a definite state that we just don't know which. The latter conjures the concept of a probability distribution, but the former seems more inherently "fuzzy" to me. It seems your language offers the possibility of putting that difference on a firmer basis.
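To make that last point concrete, here is a minimal toy calculation of my own in Python (nothing from the thermal approach itself, just the standard textbook example): the reduced state of one member of an entangled pair and a plain 50/50 ignorance mixture come out as the exact same density matrix, so the formalism itself offers nothing to certify the "definite state we just don't know which" reading.

```python
import numpy as np

# One qubit of the Bell pair (|00> + |11>)/sqrt(2), with the partner traced out.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_full = np.outer(bell, bell.conj())
# Partial trace over the second qubit: reshape to (2,2,2,2) and sum matched indices.
rho_A = rho_full.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# A "definite but unknown" 50/50 mixture of |0> and |1>:
rho_mix = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

print(np.allclose(rho_A, rho_mix))  # → True: the two density matrices coincide
```

Since every measurement statistic is computed from the density matrix, no experiment on that one qubit can separate the two stories.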

If you consider the subtext of what I'm saying, I am suggesting that maybe you can take your idea even farther: out of the nebulous quantum realm, where "anything goes" pretty much, and into the well-worn classical realm, where surely there are no new surprises. But the status of a "mixed state" was always a bit nebulous in the classical realm too-- we say that the air in this room, treated classically, is in a definite state "we just don't know which one", but how do we really know that this is what the classical theory asserted? There is no instrument or perceptive agent anywhere that has the power to tell the definite state of the air in the room, so on what basis did we claim there was such a state?

On the other hand, if I shuffle a deck of cards, I might struggle with wondering if every card is in a definite microstate of internal particles, but I don't have difficulty asserting that the order of the cards is definite, even before the cards are looked at. This conforms to our tests, because we can objectively determine the order of the cards. So does the concept of "resolution" come up here too? Is the status of a deck of cards really something different, from an information theory standpoint, than the microstate of the air in this room? Was there fuzziness in classical physics that we just never noticed?
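A rough back-of-envelope sketch (my own numbers in Python, purely illustrative) of why the deck feels different from the air:

```python
import math

# Bits needed to pin down the exact order of a shuffled 52-card deck:
deck_bits = math.log2(math.factorial(52))  # ~225.6 bits, fully checkable by reading the cards

# A cubic centimeter of air holds roughly 2.7e19 molecules; even a single
# bit per molecule dwarfs the deck, and no instrument can read those bits out.
air_bits_lower_bound = 2.7e19

print(f"deck order: {deck_bits:.1f} bits")
print(f"air microstate: > {air_bits_lower_bound:.1e} bits (unreadable in practice)")
```

The deck's ~226 bits are all verifiable by just looking; the air's microstate is astronomically beyond any possible readout, which is exactly where the "definite state we just don't know which" claim gets shaky.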
 
  • #94
Ken G said:
I'm only just beginning to be introduced to the thermal approach, but what I've seen so far shows promise. On another thread, I was grappling with the question of how we should regard "what a classical apparatus knows about itself", in a sense. The usual interpretation is that the projection onto the measuring device is a mixed state, and further, that a mixed state is "in a definite state we just don't know which."
In the thermal interpretation, the classical properties are manifest as the expectations of the field operators. This matches naturally with a hydrodynamical description of classical matter. Thus the problem of identifying the classical properties simply vanishes. Statistical mechanics guarantees that the fields are measurable in a coarse-grained sense, because they are in a mixed (thermal) state.
Ken G said:
There is no instrument or perceptive agent anywhere that has the power to tell the definite state of the air in the room, so on what basis did we claim there was such a state?
One doesn't need a fully precise description, since the observation of a field is itself necessarily coarse-grained. It is enough that the description matches the actually possible resolution. In the thermal interpretation, this is guaranteed by the standard results of statistical mechanics.
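The scaling behind this can be sketched in a few lines of Python (a toy model of my own, not anything from the book): scatter N particles into coarse cells and the relative fluctuation of the cell density shrinks like one over the square root of the occupation, so a smooth field description matches any realistic resolution.

```python
import random, math

random.seed(0)
N, cells = 100_000, 10
counts = [0] * cells
# Drop each particle into a random coarse-graining cell:
for _ in range(N):
    counts[random.randrange(cells)] += 1

mean = N / cells
# Relative spread of the coarse-grained density across cells:
rel_fluct = math.sqrt(sum((c - mean) ** 2 for c in counts) / cells) / mean

# Both numbers are of order 0.01, i.e. 1/sqrt(10000 per cell):
print(rel_fluct, 1 / math.sqrt(mean))
```

With macroscopic occupation numbers (10^20 rather than 10^4 per cell) the fluctuations fall far below anything an instrument resolves.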
 
  • #95
my_wan said:
Now I think I have a more complete picture of your interpretation. I went back to some of your earlier work. Primarily:
http://lanl.arxiv.org/abs/quant-ph/0303047

You place no judgment at all on the noncommutativity of QM, other than as an empirical fact, and conceptually work with the "expectation values" as fluid properties in the classical thermodynamic sense. Thus Hilbert space remains a separate construct, with no specific ontological status assigned directly to it. That certainly escapes many classical issues while still maintaining a direct and unmodified formal transition from one to the other. The difficulty, it appears, is then making the point to people who are so accustomed to assigning distinct empirical properties to distinct points in space.
In quantum field theory, expectations of the field operators also apply to any point in space, and indeed one gets from the thermal interpretation very naturally the hydrodynamical description of classical matter, with definite properties at every point. Thus there is no such difficulty, once one thinks in terms of quantum fields.
my_wan said:
How do you deal with the conservation issue with wave cancellations?
The situation is not really different from that with a classical Maxwell field, where waves can destructively interfere.
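As a minimal numerical illustration (my own sketch, assuming two equal-amplitude scalar waves): the intensity at phase difference d is 2 + 2 cos d, and averaged over a full fringe pattern the total comes out to exactly the sum of the two individual intensities. The dark fringes are compensated by the bright ones; no energy disappears.

```python
import math

# Two unit-amplitude classical waves meeting with phase difference d:
#   intensity I(d) = |1 + e^{i d}|^2 = 2 + 2*cos(d).
M = 1000  # sample the fringe pattern over one full period of d
total = sum(2 + 2 * math.cos(2 * math.pi * k / M) for k in range(M)) / M

print(total)  # → 2.0, the sum of the two individual intensities (1 + 1)
```

Cancellation is always local; globally the energy is only redistributed across the pattern.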
 
  • #96
A. Neumaier said:
In the thermal interpretation, the classical properties are manifest as the expectations of the field operators. This matches naturally with a hydrodynamical description of classical matter. Thus the problem of identifying the classical properties simply vanishes. Statistical mechanics guarantees that the fields are measurable in a coarse-grained sense, because they are in a mixed (thermal) state.
Yes, that is an attractive feature. Some people are left unsatisfied by that; they want an ontology that is crisp, but then they face questions that cannot be answered satisfactorily. Your approach loosens the ontology, but the questions evaporate. I call that a good trade, but others seem to prefer to keep the questions instead. That way all they have to do is sweep the questions under the rug and they have the best of all possible worlds, though it is something of a mild self-delusion, I would say.
It is enough that the description matches the actually possible resolution.
And that's the heart of it right there-- when is that enough, and when is it not enough. To me, it raises the same issue that the Copenhagen interpretation raises: which is paramount, our ontologies or our epistemologies? Are we trying to know what is, or are we trying to fit our images of what is to what we can know? It sounds like you are siding with Copenhagen and taking that latter approach, and that is one of the things I like about your approach. You also seem to be more specific about things that Copenhagen is willing to leave uncharacterized.
 
  • #97
A. Neumaier said:
The situation is not really different from that with a classical Maxwell field, where waves can destructively interfere.
It's interesting that you say this, because on several other threads about weak measurements and Bohm interpretations and so on, I've been trying hard to draw the parallels between the quantum and classical pictures. I'm finding many people are unwilling to consider those kinds of parallels-- I even had one person tell me I was embarrassing myself by trying to point them out! There's a kind of myth that "quantum is quantum and classical is classical and never the twain shall meet." I'm not sure where that thinking comes from; it seems to completely ignore the correspondence principle. Maybe it's because educators have had to stress "quantum weirdness" in order to get students interested in that nether world. If so, they may have succeeded too well!
 
  • #98
Dear Ken, can't you see anything wrong with the thermal interpretation? Dr. Neumaier was basically claiming that when a 430-atom molecule in the form of a buckyball was sent through the double slit, the slits literally sliced the buckyball into hundreds or thousands of pieces and spattered them across the detector. And since a detector consists of millions of electrons, one of these gets triggered, and we erroneously think the one triggered marks the location of the original buckyball, when it was just a part of it. This was possible because, according to him, the buckyball being emitted was not a particle to start with but a field, which is undefined. As a more distinct example, in case you haven't grasped the basics of this interpretation: it's as if you sent a cow through the double slit. It slices the cow into dozens of pieces, and when, say, the kidney hits one of the electrons in the detector, we think the cow is located at that detector position. Dr. Neumaier's reasoning was that this is possible because the cow was a field to start with. Now, with all our experimental might, can't we test this outrageous claim of Dr. Neumaier's? Recalling all your knowledge as a full-fledged physicist, can you think of a way to scrutinize it? If you meet your fellow physicists in the lab, please ask if they can think of a way to test Dr. Neumaier's conjecture, and whether there aren't already existing tests that might have refuted it that we might not be aware of, such as a test that established the particle nature of matter in an absolute way. Thank you.
 
  • #99
rodsika said:
Dear Ken, can't you see anything wrong with the thermal interpretation? Dr. Neumaier was basically claiming that when a 430-atom molecule in the form of a buckyball was sent through the double slit, the slits literally sliced the buckyball into hundreds or thousands of pieces and spattered them across the detector. And since a detector consists of millions of electrons, one of these gets triggered, and we erroneously think the one triggered marks the location of the original buckyball, when it was just a part of it. This was possible because, according to him, the buckyball being emitted was not a particle to start with but a field, which is undefined.
It might be putting words in his mouth, but if I trust that your rendition is accurate, I would say that I do see that as a potentially valid picture, even if a bit bizarre at first look.

If I understand the perspective, he might say that the buckyball isn't really a buckyball in the first place; it is a field that we have labeled a buckyball because when we have lots of it we have lots of buckyballs, and when we get just one, we assume there was already one there. But we don't really know what was already there; it's just an assumption on our part. We assert its existence and find no contradiction, but that's not the same as saying we know it existed, if there isn't really anything there called a "real buckyball" in the first place. To be honest, I'm rather sympathetic to that approach, because I like keeping careful track of what we know versus what we are just assuming we know.
It's as if you sent a cow through the double slit. It slices the cow into dozens of pieces, and when, say, the kidney hits one of the electrons in the detector, we think the cow is located at that detector position.
That doesn't sound like a fair characterization, because a kidney is a distinguishably different piece of a cow, whereas the "buckyball field" he is talking about doesn't have distinguishably different parts like that (we are not actually breaking the bonds in there, after all).

Please ask if they can think of a way to test Dr. Neumaier's conjecture, and whether there aren't already existing tests that might have refuted it that we might not be aware of, such as a test that established the particle nature of matter in an absolute way.
I'm pretty sure Dr. Neumaier's approach is designed to make all the same predictions as more typical modes of thought, so we already know it's not going to be testable. Instead, the question is, does this mindset make certain troubling questions go away, or does it build a picture of "what is" that we find repugnant in some way? I'm afraid questions like that are always going to be matters of personal taste, but I do see a certain plausibility in the idea that there really is no such thing as "a cow." A better ontology might assert the existence of fields of attributes of various systems, or combinations of systems, such that when you put them all together you end up with something we recognize as a cow by virtue of gross similarities of behavior and appearance to our mental image of what a cow is.
 
  • #100
Ken G said:
It might be putting words in his mouth, but if I trust that your rendition is accurate, I would say that I do see that as a potentially valid picture, even if a bit bizarre at first look.

If I understand the perspective, he might say that the buckyball isn't really a buckyball in the first place; it is a field that we have labeled a buckyball because when we have lots of it we have lots of buckyballs, and when we get just one, we assume there was already one there. But we don't really know what was already there; it's just an assumption on our part. We assert its existence and find no contradiction, but that's not the same as saying we know it existed, if there isn't really anything there called a "real buckyball" in the first place. To be honest, I'm rather sympathetic to that approach, because I like keeping careful track of what we know versus what we are just assuming we know.
That doesn't sound like a fair characterization, because a kidney is a distinguishably different piece of a cow, whereas the "buckyball field" he is talking about doesn't have distinguishably different parts like that (we are not actually breaking the bonds in there, after all).

I'm pretty sure Dr. Neumaier's approach is designed to make all the same predictions as more typical modes of thought, so we already know it's not going to be testable. Instead, the question is, does this mindset make certain troubling questions go away, or does it build a picture of "what is" that we find repugnant in some way? I'm afraid questions like that are always going to be matters of personal taste, but I do see a certain plausibility in the idea that there really is no such thing as "a cow." A better ontology might assert the existence of fields of attributes of various systems, or combinations of systems, such that when you put them all together you end up with something we recognize as a cow by virtue of gross similarities of behavior and appearance to our mental image of what a cow is.

No, we are not talking about an ensemble, but a single-buckyball-at-a-time double slit experiment. Dr. Neumaier said that after the buckyball is emitted, the slits slice the field into various fragments and these hit the detector in all regions. Since his field is literal, with left, middle, and right portions maintained after it passes through the slits, the left field would focus on the left, the middle field on the middle, and the right field on the right, although diffraction and interference would also produce constructive and destructive interference. Let's analyze using just the single-buckyball experiment; let's forget ensembles, as we are scrutinizing what happens in a single buckyball emission and detection. If there is a test that can show a single buckyball is still found at the detector, then this would refute Dr. Neumaier's conjecture. Can you think of a test or other experimental setup which doesn't use electrons as detection elements? Again, Dr. Neumaier's argument was that a detector is composed of millions of electrons as detection elements, so a smeared, splattered field can trigger just one of them, because once that one is triggered, it uses up all the available energy in the detection circuits, leaving the rest of the electrons in passive mode, unable to fire. So if you can think of a way to detect the buckyball or photon without using electrons, then his conjecture can be put to experimental test and be falsifiable. If Dr. Neumaier is right, then the measurement problem is solved, we can mention this in all physics textbooks from here on, and he becomes immortalized in the Physics Hall of Fame in the company of Einstein and Bohr.
 
  • #101
rodsika said:
No, we are not talking about an ensemble, but a single-buckyball-at-a-time double slit experiment.
But that's just it: how do you know you have a single buckyball in the first place? All you have is a way of emitting a flux of buckyballs that is emitting them rather rarely, but you have no idea at what point you "actually have a buckyball." You only know when you detect them. If there never really was a buckyball in there, just some kind of field we are interpreting as a buckyball because it produces buckyball detections, how are you going to say you had a single buckyball in there? You are starting with assumptions that are not used in the thermal approach, and you cannot show your assumptions are correct, you can only show they work for you-- that doesn't mean one cannot get similar successes without making those assumptions. As I said, I'm pretty sure we are talking about all the same predicted outcomes, so what can we really be talking about here other than the assumptions we make?

Since his field is literal, with left, middle, and right portions maintained after it passes through the slits, the left field would focus on the left, the middle field on the middle, and the right field on the right, although diffraction and interference would also produce constructive and destructive interference.
Yes, the approach takes the field very literally, granting it a reality in spacetime. That's the part I'm not crazy about; I don't see any particular reason to grant a physical ontology to the fields and not the particles, or to the particles and not the fields. I think they're all mental constructs. But that doesn't make me right and him wrong.

Can you think of a test or other experimental setup which doesn't use electrons as detection elements?
Sure, use buckyballs. But we already know what you'll see; the experiments have been done. We're just trying to figure out how to talk about what we are seeing.
 
  • #102
Ken G said:
But that's just it: how do you know you have a single buckyball in the first place? All you have is a way of emitting a flux of buckyballs that is emitting them rather rarely, but you have no idea at what point you "actually have a buckyball." You only know when you detect them. If there never really was a buckyball in there, just some kind of field we are interpreting as a buckyball because it produces buckyball detections, how are you going to say you had a single buckyball in there? You are starting with assumptions that are not used in the thermal approach, and you cannot show your assumptions are correct, you can only show they work for you-- that doesn't mean one cannot get similar successes without making those assumptions. As I said, I'm pretty sure we are talking about all the same predicted outcomes, so what can we really be talking about here other than the assumptions we make?

Yes, the approach takes the field very literally, granting it a reality in spacetime. That's the part I'm not crazy about; I don't see any particular reason to grant a physical ontology to the fields and not the particles, or to the particles and not the fields. I think they're all mental constructs. But that doesn't make me right and him wrong.
 
We can use an electron microscope to view a 430-atom buckyball, right? Let's say it is 5 nanometers in diameter. So after we view one, we send it out from the emitter; here we know one buckyball is sent out, and let's say we only do this once, with no second buckyball. A double slit should just nudge its position after the slit, and we should still find the buckyball at the detector if we tried to find it. Dr. Neumaier was claiming that the buckyball can no longer be found; that is, the buckyball is no longer the original 5-nanometer object but is literally fragmented into different components, much like a grenade, and this field is splattered all over the detector. And you are saying we can't even test this out?

Sure, use buckyballs. But we already know what you'll see; the experiments have been done. We're just trying to figure out how to talk about what we are seeing.
Done? Can they modify the setup so the detector elements aren't electrons but something else?

Don't worry if anything you mention can refute Dr. Neumaier. I don't think he would excommunicate anyone who falsified him; he is humble enough that he may even acknowledge that person, although it is true other Ph.D.s would be angry and shun that person in their circle. Also, he lives in Austria, a world away from the United States academic circle.
 
  • #103
rodsika said:
 
We can use an electron microscope to view a 430-atom buckyball, right? Let's say it is 5 nanometers in diameter. So after we view one...
OK, then that's a detected buckyball, just like if it hits the detecting screen. You can view it as the end of one experiment. Now what do you do?
We send it out from the emitter; here we know one buckyball is sent out, and let's say we only do this once...
That's just an assumption. How do you know you sent it to the emitter? Maybe you just sent a bunch of fields there that you are interpreting as a buckyball. And when did it actually leave on its way? Maybe there is no specific moment when it left, being a field; there is only a moment when it was detected, being a detection. One must always watch very carefully where you make assumptions that are not in fact in evidence in the experimental setup. So you say, OK, I'll watch that darn buckyball the whole time, so I'll know everything there is to know about it all the time, but then you are detecting it all the time-- it's not the same single experiment, it's a long string of new experiments, and its ultimate outcome could be qualitatively different from that one single diffraction experiment we are trying to analyze.

that is, the buckyball is no longer the original 5-nanometer object but is literally fragmented into different components, much like a grenade, and this field is splattered all over the detector. And you are saying we can't even test this out?
Yes, that's correct, we cannot test that out because the intermediate states are not tested in the experiment you specify, and if you do test those states, it's a different experiment and may come out quite differently in the end. The way I like to put it is, any version of realism must contend with this apparent truth: a question that is not asked in a definitive way can also not be answered in a definitive way (where by "definitive" I mean "empirically demonstrated").
Don't worry if anything you mention can refute Dr. Neumaier. I don't think he
would excommunicate anyone who has falsified him. He is humble enough that he
may even acknowledge that person.
I'm sure that's true, he's a good scientist. But the first goal of any quantum interpretation is to make sure it arrives at all the same predictions as more standard approaches, because the standard approaches have been tested well. If he had a theory that made any testably different predictions, the nature of his rhetoric would be very different. Instead of "look what this way of thinking gives you in terms of descriptive power", it would sound like "do this particular experiment, find this particular result, and give me my Nobel prize."
Although it is true other Ph.D.s would be
angry, and shun that person in his circle. Also he is living in Austra so world
away from the United States Academic Circle.
I don't actually think the United States Academic circle would have any problem with his approach, it's just another interpretation. They might say "oh no, not another interpretation, that's all we need," but the truth is, every interpretation does have its lessons and insights, and I think this one does too. An amazing thing about scientific ontology is that it can be completely different for different scientists, but if the epistemology is agreed on, the science proceeds without much of a hitch. Even quantum mechanics!
 
  • #104
Ken G said:
That's just an assumption. How do you know you sent it to the emitter? Maybe you just sent a bunch of fields there that you are interpreting as a buckyball. And when did it actually leave on its way? Maybe there is no specific moment when it left, being a field; there is only a moment when it was detected, being a detection. One must always watch very carefully where you make assumptions that are not in fact in evidence in the experimental setup. So you say, OK, I'll watch that darn buckyball the whole time, so I'll know everything there is to know about it all the time, but then you are detecting it all the time-- it's not the same single experiment, it's a long string of new experiments, and its ultimate outcome could be qualitatively different from that one single diffraction experiment we are trying to analyze.

Why are you making it so difficult and confusing? I'll explain again: if we can really see the buckyball using an electron microscope, then it's a particle. I guess the field argument is that an aggregate of field can be a particle. For example, my body's atomic components may be fields, but my biochemical body is a whole object. Now, we can treat the buckyball as a whole object because we can see it directly, just like we see a blood cell. Say, after you view the buckyball using the electron microscope, you pick it up with a finger and throw it at the double slit; it should still appear at the detector. But Dr. Neumaier was claiming that a double slit just slices up matter. Now you can't say that we don't know what we are sending, because we have seen the buckyball directly. Or let's take a more concrete example: say you pick up a rat and throw it at the double slit; the rat should still appear at the detector whole. What Dr. Neumaier was claiming was that the rat arrives at the detector mutilated into a number of pieces. I hope you understand what I'm saying, or I'll have to rephrase it again in my reply in case you don't get it.
 
  • #105
I think I know now what you missed, Ken. Dr. Neumaier's claim was that he has solved the measurement problem. Let's first go to QFT and the measurement problem.

Let me illustrate. According to Mr. Butoxy in a thread in the QM forum:

"In quantum field theory the field does not replace the wave-function.
Wave-functions are still there, and they still collapse.
In elementary quantum mechanics, the dynamical quantity is position. Here, the
quantum mechanical uncertainties are captured by the wave-functions which are
functions of position. Its square magnitude has the interpretation of the
probability of finding the particle at a certain position.

Similarly, in quantum field theory, the dynamical quantity is the value of the
field at every spatial point, called the field configuration. The field
configuration may be a plane-wave, or something static like the electric field
in a capacitor. Here, the quantum mechanical uncertainties are captured by
wave-functions which are functions of field configurations. Its square magnitude
has the interpretation of the probability of finding the field with a certain
field configuration. Note that here, we are potentially talking about waves of
waves.

Wave-functions are still there in quantum field theory. And they collapse when
you make a measurement. The measurement problem is not solved."

Now, how does Dr. Neumaier's claim differ from the above?
His claim was that the wave function never collapses. His field is like a
classical field. So a molecule treated as a field just travels classically and,
upon reaching the detector, there is no concept of collapse as in QFT.

Well, I guess you like his approach because you also want to make even weak
measurements classical, as seen in the other thread. Is this why you are biased
in supporting Dr. Neumaier when his conjecture is not even standard QFT, as
discussed above? Please think it through clearly. If he is wrong, it doesn't mean
you are wrong too, so don't resist falsifying him.
 
  • #106
rodsika said:
I think I know now what you missed, Ken. Dr. Neumaier's claim was that he has solved
the measurement problem.
I have not seen much of what Dr. Neumaier has claimed, so can't speak to anything but my sense of the general usefulness of his interpretation. I would indeed be skeptical that he has solved the measurement problem, but there may be issues of language here. There are several different aspects to the measurement problem, the "hard" problem is how you get a single outcome from something that the theory treats as a probability amplitude/distribution/expectation, it doesn't really matter which. This is a toughy, even classical physics has not solved it in the context of chaotic systems, because whether or not we can characterize a state as "definite" depends on whether or not we can use the concept for predictions. States that can never be used for prediction, no matter how accurately known they are, can't really be considered "definite", so I think "definite" turns out to be a fuzzy concept. In other words, how nature decides the weather is still a question we have not mastered, so I don't see how interpretations of quantum mechanics could have "solved" that one either. But he may be referring to some particular aspect, some troubling question that he made go away. That wouldn't surprise me at all, if his approach can do that. Maybe he reduced the quantum measurement problem to a classical measurement problem.

The quote by Mr. Butoxy sounded reasonable to me, but he knows more about QFT than I do. The conclusion seems to be that focusing on fields does not eliminate the uncertainty issue and the strangeness of how nature achieves a quasi-definite state from a theory that is enmeshed in uncertainty. But Neumaier does seem to have reduced the nature of the uncertainty to something much more classical, or so it seems to me.
Now, how does Dr. Neumaier's claim differ from the above?
His claim was that the wave function never collapses. His field is like a
classical field. So a molecule treated as a field just travels classically and,
upon reaching the detector, there is no concept of collapse as in QFT.
I'm not sure what that means, even the language above of one electron in the detector getting the energy from the field sure sounds like a collapse to me. Any time you have a perceived outcome that is more definite than the ontological status of the agents you used to predict that outcome, you have what might be called a collapse. But I don't know what claims are being made by Mr. Neumaier. Before your issue seemed to be with the ontology of his interpretation itself, not his claims about what one can do with that ontology. Those are two different things.
Well, I guess you like his approach because you also want to make even weak
measurements classical, as seen in the other thread.
I don't want to make weak measurements classical, I want to make averages of weak measurements classical, because averages of quantum results are just what classical results are. That doesn't mean everything is classical; it just means that we can use classical analogs to understand much of what is going on. Dr. Neumaier does seem to be adept at taking that to the max.

Is this why you are biased
in supporting Dr. Neumaier when his conjecture is not even standard QFT, as
discussed above?
It's not at all clear that the thermal interpretation is not standard QFT. Claims about what one can do with it are something different. I don't consider myself to be biased, all I can do is take what limited expertise I have and bring it to bear on my opinions of what is going on in these various interpretations. I think the very first thing to establish is whether the thermal interpretation makes all the same predictions, because if it does, it is by definition an acceptable ontology. Then what claims can be made on that ontology are a very different issue, and would arise next.
 
  • #107
I'm not sure what that means, even the language above of one electron in
the detector getting the energy from the field sure sounds like a collapse to
me. Any time you have a perceived outcome that is more definite than the
ontological status of the agents you used to predict that outcome, you have what
might be called a collapse. But I don't know what claims are being made by Mr.
Neumaier. Before your issue seemed to be with the ontology of his interpretation
itself, not his claims about what one can do with that ontology. Those are two
different things.

Ken, your reply made me realize one thing. When I read this thread previously, I
thought Dr. Neumaier was talking about the direct current (D.C.) source in the
detector, where a single electron trigger would use up the energy of the current.
But your reply made me realize it was the impinging field itself from which the
electron got the energy. So let me now ask this question directly to him.

Dear Dr. Neumaier.
First question: if a buckyball composed of 430 atoms hits the detector, you claim
that its field hits all regions of the detector at once. Now, when an electron
is triggered, it gets all the energy of the impinging field. But the
energy of an impinging buckyball is more than the energy of an impinging
electron. So why is only one electron triggered? Multiple electrons, say 4 of
them, should be triggered in this manner, because the field has enough ionization
energy even for 5 electrons. What is your answer?

Second question: after the buckyball field's energy is absorbed by the electron,
what happens to the buckyball that lost the energy? What does it mean to have a
field that no longer has energy? Are you saying the buckyball's 430 atoms simply
vanish into thin air after their energy is absorbed by the electron? If not, how
does an energyless 430-atom field behave, versus when it has energy?

Third question: what produces the definite outcome which Ken was mentioning above?
Please address his comment about it. Apparently quantum and even classical
equations should only be stochastic. There should be no definite outcome unless
human consciousness perceives it (see more details in his message). What is your
solution to this?

Fourth question (I forgot to add this): if all is field and there is no particle, how can electrons even exist in the detector if there are no particles in the first place?
Thanks.
 
  • #108
 
Ken, let's just focus on the more substantial photoelectric effect. Dr. Neumaier
was claiming that Einstein was wrong that a photon is a particle. Here he
explains how a pure photon field can trigger the detector. I think you are an expert
in waves, as seen in your weak-measurement trajectory defence in the QM forum, so
please comment on the following from his original presentation. Please take time
with it, as it is crucial in deciding whether or not to take back
Einstein's Nobel Prize for deceiving the world that photons are particles. Remember,
de Broglie got the idea that matter is a wave from Einstein's conjecture, and
wave/particle duality has confused the world for over a century. Which parts of the
following do you agree with, and which not?
http://arnold-neumaier.at/physfaq/topics/photodetection
------------------------ The photoelectric effect ------------------------
The photoelectric effect http://en.wikipedia.org/wiki/Photoelectric_effect is
usually explained (following Einstein, who received the Nobel prize for this
explanation) by saying that a sufficiently energetic photon falling on a
photosensitive substance causes the latter to eject a single electron, which is
then magnified by a photomultiplier to produce a macroscopic and hence
observable effect - the ''click'' of the detector. This is commonly used in
discussions of experiments on entangled photons carried out by Alice and Bob,
who make statistics on clicks to prove or disprove things, or to communicate
secret information.

In the semiclassical picture known to Einstein 1905, currents are produced by
discrete electrons. In 1905, when Einstein proposed his explanation, the
photoelectric effect was a clear indication of the particle nature of light,
since no other model was available that could have explained the process.
Einstein's explanation was so important for the development of the subject that
he received the 1921 Nobel prize for it, a few years before modern quantum mechanics
was born. The modern concept of a photon was created only later (Lewis 1926,
Dirac 1927).

According to today's knowledge, just like Bohr's atomic model, Einstein's
explanation of the photoeffect is too simplistic, and is not conclusive. Now,
100 years later, his picture is known to be only approximate; currents
in metals are in fact produced by the continuous electron fields of QED.
Discrete semiclassical particles are just very rough approximations.

Indeed, the argument of Einstein put forward for the discrete nature of
radiation is spurious, since it ignores the quantum nature of the detector
(which was of course completely unknown at the time). As one can read in the
standard reference for quantum optics, L. Mandel and E. Wolf, Optical Coherence
and Quantum Optics, Cambridge University Press, 1995, the clicks in a photon
detector are an artifact of photodetection caused by the quantum nature of
matter, rather than proof of single photons arriving.

Mandel and Wolf write (on p.629, in the context of localizing photons), about
the temptation to associate with the clicks of a photodetector a concept of
photon particles: ''Nevertheless, the temptation to interpret the electronic
signal registered by a photodetector as due to a photon that is localized in
some sense is quite strong.'' The wording suggests that one should resist the
temptation, although this advice is usually not heeded. However, the advice is
sound since a photodetector clicks even when it detects only classical light!
This follows from the standard analysis of a photodetector, which treats the
light classically and only quantizes the detector.

Sections 9.1-9.5 show that the electron field responds to a classical external
electromagnetic radiation field by emitting electrons according to Poisson-law
probabilities, very much like the behavior Einstein interpreted in terms of light
particles. Thus the quantum detector produces discrete Poisson-distributed
clicks, although the source is completely continuous, and there are no photons
at all in the quantum mechanical model. The state space of this quantum system
consists of multi-electron states only. So here the multi-electron system
(followed by a macroscopic decoherence process that leads to the multiple dot
localization of the emitted electron field) is responsible for the creation of
the dot pattern. This proves that the clicks cannot be taken to be a proof of
the existence of photons.
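This claim can be sketched numerically. The following toy model is a hypothetical illustration (not part of the original FAQ, and the numbers are made up): a detector whose click probability in each short time step is proportional to a perfectly smooth classical intensity still produces discrete, randomly spaced clicks.

```python
import random

def detector_clicks(intensity, dt, steps, seed=0):
    """Model a photodetector as a Poisson process whose rate is
    proportional to a continuous classical intensity I(t).
    Returns the list of click times: discrete events produced by
    a perfectly continuous input."""
    rng = random.Random(seed)
    clicks = []
    for k in range(steps):
        t = k * dt
        # probability of a click in this small interval ~ I(t) * dt
        if rng.random() < intensity(t) * dt:
            clicks.append(t)
    return clicks

# A completely smooth, constant classical intensity...
clicks = detector_clicks(lambda t: 5.0, dt=0.001, steps=100_000)
# ...still yields discrete, randomly spaced clicks,
# roughly intensity * total_time of them on average.
print(len(clicks))
```

The discreteness here comes entirely from the detector model, never from the incident intensity, which mirrors the argument in the text.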

Note that initially, only single photoelectrons are emitted, which would leave
no experimental trace without being magnified. A macroscopic magnification is
needed to make the photoelectrons observable. In a photodetector, a
photomultiplier is used to produce an observable current. In the case of
detection by a photographic plate, the detector is a photoemulsion, and the
photoelectrons are magnified via a chemical reaction that produces tiny dots
whose density is proportional to the incident intensity of the electromagnetic
radiation.

(The table of contents of the book by Mandel & Wolf is at
http://www.cambridge.org:80/servlet/...TEM_ENT_ID=233 If you are new to quantum
optics and want to have a shortcut through this book of over 1100 pages: At
first, you need enough classical background. To update your math, read or review
Sections 2.1-2.3 and 3.1 and go back to the pieces from Chapter 1 that you need
to make sense of these sections. Classical physics in a simplified setting
without polarization starts in Chapter 4 and 5, where you need at first only
4.1-4.3 and 5.6-5.7 -- again, reading omitted stuff you need for understanding
that as you go along. Full classical electromagnetism is covered in Chapters
6-8. You need 6.1-6.5. The quantum part starts in Chapter 9. You'd read 9.1-9.5,
10.1-10.5, 10.9, 10.10, 11.1-8, 11.13, 12.1-12.4, 12.10, 13.1-13.3, 14.1-14.6.,
15.1-3, 18.1-4, 20.1-6, 22.4. Then you have an overview over the central part of
quantum optics, and are well prepared to start a second, thorough reading of the
whole book.)

Section 12.11 is about the problems with photon position, and that there is no
associated operator, but only a POVM. It is in this section that they made the
remark referred to above. Sections 14.1-14.5 show that the semiclassical picture
of Chapter 9 holds with small corrections also in the quantum case, and is
virtually unaltered in case of coherent light.

We conclude that the discreteness of the clicks must be caused by the quantum
nature of matter, since there is nothing discrete in an incident classical
external radiation field.

I discussed the situation in some more detail in a public lecture given in 2008,
http://arnold-neumaier.at/ms/lightslides.pdf See Section 3 (pp.35-44);
names in smallcaps are accompanied by references, given at the end of the
slides.

Note that this holds even for very faint light. In deep-field astronomy,
'photographs' of astronomical objects perhaps several billion light years
distant are routinely taken using CCD detectors. The time interval between
individual events on a CCD array of a few cm^2 can be several minutes or more
in some cases.

To explain the image, it is enough that the detector elements on the plate
respond locally according to a Poisson process with probability rate determined
by the incident energy density. This means it fires randomly at the rate
determined at each moment from the incident faint field. No memory is needed,
and energy loss is irrelevant (except for the efficiency of the process). The
local detector elements will respond independently and rarely but occasionally,
and waiting long enough will precisely reproduce the averaged intensity profile
- the goal of the imaging.
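The imaging described above can be sketched in a toy simulation (my own illustration, with made-up numbers, not from the FAQ): independent detector elements firing rarely, at a rate set by a smooth local intensity, reproduce the intensity profile after a long exposure.

```python
import math, random

def long_exposure(profile, pixels, frames, seed=1):
    """Each pixel fires independently and rarely, with probability per
    frame proportional to the local incident intensity. The accumulated
    counts reproduce the averaged intensity profile."""
    rng = random.Random(seed)
    counts = [0] * pixels
    for _ in range(frames):
        for i in range(pixels):
            if rng.random() < profile(i / (pixels - 1)):
                counts[i] += 1
    return counts

# A smooth Gaussian intensity profile, very faint (at most 2% per frame).
profile = lambda x: 0.02 * math.exp(-((x - 0.5) / 0.15) ** 2)
counts = long_exposure(profile, pixels=21, frames=20_000)
print(counts)  # counts peak near the middle pixels, matching the profile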

It doesn't make sense to somehow count photons classically and pretend that each
of the myriads of photons created in a distant star is a spherical wave
spreading out through space to be ''collapsed'' when entering the CCD detector.
The detector doesn't see the myriads of these extremely faint spherical waves
and decides to collapse just one of them. Instead, it ''sees'' the energy
density; according to its value, it feels more or less ''motivated'' to respond,
resulting in a Poisson statistics. The reason is that in QED, the local mean
energy density is an observable field, whereas the concept of a photon number
density cannot even be meaningfully defined.

 
  
 
  • #109
rodsika said:
 
Ken, let's just focus on the more substantial photoelectric effect. Dr. Neumaier
was claiming that Einstein was wrong that a photon is a particle. Here he
explains how a pure photon field can trigger the detector.
That is certainly a pretty radical stance. Here is my take on it-- I don't have time to study the situation exhaustively, but my initial impression is that Dr. Neumaier is doing two different things that should be analyzed separately:
1) he is offering an alternative way to think about photons and radiation
2) he is critiquing the standard way (Einstein's) of thinking about photons.
As for point #1, I can see nothing overtly wrong in what he is saying. This gets back to my earlier point about the non-uniqueness of the equivalent ways we can translate the action of a theory into descriptive expressions (like photon) to help us successfully execute that theory. All too often, when we find a successful language for executing a theory, we think we have "found the truth the theory implies", but this is poor logic. To conclude that, we need to show that our language is unique, and that is often not true in the least. I suspect this is yet another example of that phenomenon.

If I'm right, then Dr. Neumaier's language is fully equivalent in terms of the execution of a theory into making testable predictions, though it sounds ontologically vastly different. This is actually not surprising, I see it all the time. As a particularly stark example, you may have been told that forces produce acceleration, like that was an ontologically true description. Imagine I said no, there's no such thing as forces, instead particles move so as to minimize a mathematical quantity, dependent on energy considerations, called the "action", you might say "get out, how can you say there is no such thing as forces, was Newton an idiot?" But you see, my statement is exactly equivalent to Newton's, even though the ontology sounds totally different. This really happens all the time, we should neither be bothered by it, nor take it too seriously. We shouldn't conclude that the previous ontology was wrong or its proponents were deluded fools. Instead, we just ask: if this new ontology is indeed equivalent, what new insights does it offer? No fuss, no muss.

About the only claim he makes that raises a red flag for me is: "The reason is that in QED, the local mean energy density is an observable field, whereas the concept of a photon number density cannot even be meaningfully defined. " I'm not sure what he means here, the mathematics of creation and annihilation operators is often regarded ontologically as being about discrete photons, and is very useful for executing the theory. He may mean something else by not being meaningfully defined though, I just don't know enough about it. Anyway, field theorists are quite comfortable with the photon concept and will be in no rush to toss it overboard, but that doesn't make Dr. Neumaier wrong when he claims certain advantages for his way of thinking. The key point is, it's just not an either/or situation, better is to be conversant in all the perspectives-- you never know where the next insight will come from. I'm just very suspicious of using science to arrive at some objectively true ontology, in my view all scientific theories are effective theories, and nonunique ontologies are just toys we play with.
 
  • #110
Perhaps Neumaier can comment on this to clear up some possible issues and correct anything I might get wrong.

It appears to me part of the issue with getting the interpretive content of the description is in how Gibbs ensembles are embedded in the QM wavefunction in such a way that it makes it look as if the wave structure and the wavefunction is the same thing. On one hand the thermal interpretation is taking the wave structure seriously, where particles are localized waves, though a wave does not have to be localized in all cases. On the other hand the wavefunction does not just define the state of the wave, it defines an ensemble of all possible states the wave can potentially be in given the constraints. I will again go to a very rough classical analog to describe the significance of ensembles in the QM wavefunction.

A Gibbs ensemble, when used to describe a classical event such as a dice roll, conceptually involves making many mental copies of that die and treating the die as if it were all of those copies at once. Then the state with the highest probability is the state that occurs most often when you roll the die. A fair die presumably has equal probabilities for each face.
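The role of the ensemble in the dice analogy can be sketched numerically (a toy illustration of my own; the function name and numbers are made up): rolling many mental copies at once, the relative frequency of each face estimates its probability in the ensemble.

```python
import random
from collections import Counter

def dice_ensemble(copies, weights=None, seed=2):
    """Roll many mental copies of a die at once; the relative
    frequency of each face estimates its probability in the ensemble.
    weights=None gives a fair die."""
    rng = random.Random(seed)
    faces = [1, 2, 3, 4, 5, 6]
    rolls = rng.choices(faces, weights=weights, k=copies)
    return Counter(rolls)

# A fair die: every face settles near probability 1/6.
freq = dice_ensemble(60_000)
for face in range(1, 7):
    print(face, freq[face] / 60_000)  # each close to 0.1667
```

Only one copy "actually happens"; discarding the rest afterwards is the analog of what, in the wavefunction picture, gets called a collapse.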

Now imagine a wave (water) tank where you are creating solitons on one side and seeing where they hit on the other. Each soliton will have a 'fairly' definite position and trajectory. To build a theory of where a soliton is likely to hit on the far side of the tank, what one soliton does is not very useful. So you create a wavefunction that includes not just what a particular wave does, but a probability density that defines the relative probabilities for countless many solitons. The wavefunction defining this probability density has the same basic structure as the wavefunction defining a particular wave, because even a particular wave has a density distribution on the surface similar to the probability density of the ensemble. This gives the false impression that if the wavefunction is not real then the wave must not be real, leading back to the particle picture of matter. It is also this ensemble of "probable" waves that appears to collapse, when in this picture the wave is still there; the measurement just showed which state of the Gibbs ensemble the actual wave possessed. It was and remains a wave the whole time, without collapse. You then throw away all those ensemble waves, like the dice rolls that did not happen, and call that a wavefunction collapse.

In this way all the wave mechanics of standard QM is ontologically maintained while the wavefunction is no more real than a dice that lands on countless many sides at once. Does this make sense or need any corrections?
 
  • #111
Ken G said:
I've been trying hard to draw the parallels between the quantum and classical pictures. I'm finding many people are unwilling to consider those kinds of parallels-- I even had one person tell me I was embarrassing myself by trying to point them out! There's a kind of myth that "quantum is quantum and classical is classical and never the twain shall meet." I'm not sure where that thinking comes from, it seems to completely ignore the correspondence principle, but maybe it's because educators have had to stress "quantum weirdness" in order to get students interested in that nether world. If so, they may have succeeded too well!

Yes. That's why I am stressing that quantum mechanics is nearly classical if viewed in the right way.
 
  • #112
rodsika said:
if you sent a cow to the
double slit, it would slice the cow into dozens of pieces.

It damages the slit, and the experiment is over.
 
  • #113
rodsika said:
Dr. Neumaier said that after the buckyball was emitted, the
slits slice the field into various fragments and these hit the detector in all
regions.
I never said this. The field behaves like a water wave when it goes through a slit - it changes its shape and gradually expands. There are no fragments anywhere.
 
  • #114
rodsika said:
Now, how does Dr. Neumaier's claim differ from the above?
His claim was that the wave function never collapses. His field is like a
classical field.

The field remains a field even when some collapse of the wave function happens. For the field is about expectation values, and these don't change their nature when the wave function collapses, or rather when the density matrix decoheres under the influence of the environment.

Thus the question of a collapse simply becomes irrelevant to the interpretation.
 
  • #115
rodsika said:
First question: if a buckyball composed of 430 atoms hits the detector, you claim
that its field hits all regions of the detector at once. Now, when an electron
is triggered, it gets all the energy of the impinging field. But the
energy of an impinging buckyball is more than the energy of an impinging
electron. So why is only one electron triggered? Multiple electrons, say 4 of
them, should be triggered in this manner, because the field has enough ionization
energy even for 5 electrons. What is your answer?
What I said about a single electron referred to the case when an electron was sent through the slit.
For a buckyball, it is less clear what precisely happens. One would have to do a quantum statistical mechanics calculation to find out what really happens. (This is like other experiments: in simple cases, one can analyze the situation without calculation, based on known principles; in more complex cases one needs to go through the calculations.) I might do some such calculations at some time, but they are time-consuming, and currently I don't have the time for that.
rodsika said:
Second question: after the buckyball field's energy is absorbed by the electron,
what happens to the buckyball that lost the energy? What does it mean to have a
field that no longer has energy?
The buckyball is not only energy; it also carries a carbon field. This is initially distributed along the detector, and presumably concentrates after a short time at a random position.
rodsika said:
Third question: what produces the definite outcome which Ken was mentioning above?
Please address his comment about it. Apparently quantum and even classical
equations should only be stochastic.
Even classical stochastic equations can be used to predict phenomena with good accuracy, if the noise is small, although there are no definite (i.e., infinitely precise) results.
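This point can be illustrated with a toy stochastic differential equation, dx = -x dt + sigma dW, integrated by the Euler-Maruyama method (a sketch of my own, not from the book; all numbers are made up): with small noise, the stochastic trajectory stays close to the deterministic prediction, even though no run is infinitely precise.

```python
import math, random

def euler_maruyama(x0, drift, sigma, dt, steps, seed=3):
    """Integrate the stochastic equation dx = drift(x) dt + sigma dW
    with the Euler-Maruyama scheme. With small sigma the trajectory
    stays close to the deterministic one."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        x += drift(x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return x

drift = lambda x: -x          # deterministic part: exponential decay
x_noisy = euler_maruyama(1.0, drift, sigma=0.01, dt=0.001, steps=2000)
x_exact = math.exp(-2.0)      # deterministic solution at t = 2
print(abs(x_noisy - x_exact))  # small: the noisy prediction is still accurate
```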
 
  • #116
A. Neumaier said:
Yes. That's why I am stressing that quantum mechanics is nearly classical if viewed in the right way.
A crystal clear examination of what are essential differences between the two, and what can be viewed as similarities, would be a very worthwhile program indeed.
 
  • #117
rodsika said:
 
Ken, let's just focus on the more substantial photoelectric effect. Dr. Neumaier
was claiming that Einstein was wrong that a photon is a particle.
 
You exaggerate. Einstein's picture has some validity, else it wouldn't have been that useful.
But that picture is not _needed_ to explain the photo effect. Thus if one wants to dispense with the particle picture in order to have a more sensible ontological picture of the world, one can do so without harm.
 
  • #118
Ken G said:
About the only claim he makes that raises a red flag for me is: "The reason is that in QED, the local mean energy density is an observable field, whereas the concept of a photon number density cannot even be meaningfully defined. " I'm not sure what he means here, the mathematics of creation and annihilation operators is often regarded ontologically as being about discrete photons, and is very useful for executing the theory.
Your statement does not contradict mine. Photons have well-defined momentum states, to which the creation-annihilation picture applies, but this is not enough to guarantee a photon density.
The latter does not exist because of the lack of a position operator.
 
  • #119
my_wan said:
It appears to me part of the issue with getting the interpretive content of the description is in how Gibbs ensembles are embedded in the QM wavefunction in such a way that it makes it look as if the wave structure and the wavefunction is the same thing. On one hand the thermal interpretation is taking the wave structure seriously, where particles are localized waves, though a wave does not have to be localized in all cases. On the other hand the wavefunction does not just define the state of the wave, it defines an ensemble of all possible states the wave can potentially be in given the constraints. I will again go to a very rough classical analog to describe the significance of ensembles in the QM wavefunction.
There are two kinds of waves:

1. Those in configuration space, the wave functions. These are just computational tools to work out the predictions of QM. These may collapse under the influence of the environment, but in the thermal interpretation this is nearly irrelevant, just contributing a little to dissipation.

2. Those in real, 3D space. Here quantum fields are located, and these are ontologically relevant fields in the thermal interpretation.
 
  • #120
Ken G said:
A crystal clear examination of what are essential differences between the two, and what can be viewed as similarities, would be a very worthwhile program indeed.

On the level of wave functions, there are huge differences, since these have no classical equivalent. On the level of density matrices, or in the Heisenberg picture, the differences are marginal (more precisely, of the order of the Planck constant). My book

Arnold Neumaier and Dennis Westra,
Classical and Quantum Mechanics via Lie algebras,
2008, 2011. http://lanl.arxiv.org/abs/0810.1019

shows how similar the classical and the quantum worlds are when consistently and from the start treated without significant reference to wave functions.
 
