QM Eigenstates and the Notion of Motion

strangerep
vanhees71 said:
I'd be interested to hear more arguments against my statement that nothing moves in an energy eigenstate, but that's of course off-topic here.
I'll risk a quick off-topic answer here, since I think it's straightforward QM, not vague "interpretation" stuff. :oldbiggrin:

In QM (e.g., Ballentine p81), for a free particle, ##H = \frac12 \, M\, V\cdot V + E_0##. So in the ground state ##|E_0\rangle## we have ##H|E_0\rangle = E_0 |E_0\rangle,## hence ##V^2 |E_0\rangle = 0##.

I'm reasonably sure that a zero eigenvalue for the ##V^2## operator corresponds to "not moving".

But in a higher energy eigenstate, we get a non-zero eigenvalue for ##V^2##, which suggests "moving".

(I leave to the Mentors' discretion whether to convert this into a new thread in the QM forum.)
 
strangerep said:
I'll risk a quick off-topic answer here, since I think it's straightforward QM, not vague "interpretation"….
That I agree with, but it seems from #4 of the offending thread that the OP there was thinking about a bound electron in the nucleus’s Coulomb potential. Is there a similarly quick and convincing way of extracting a value of ##v^2##?
 
What is the working theoretical definition of "motion" for a quantum system? We saw tracks in the cloud chamber, so from a practical standpoint "motion" is a variation in time of space coordinates assigned to the lab the cloud chamber resides in.
 
vanhees71 said:
I'd be interested to hear more arguments against my statement that nothing moves in an energy eigenstate
Nothing moves in a normalizable energy eigenstate.

But plane waves are counterexamples to your general statements, giving unnormalizable energy eigenstates at arbitrary kinetic energies.

In general, a system has a partly continuous spectrum iff it has moving parts.

dextercioby said:
What is the working theoretical definition of "motion" for a quantum system? We saw tracks in the cloud chamber, so from a practical standpoint "motion" is a variation in time of space coordinates assigned to the lab the cloud chamber resides in.
A subsystem is in motion iff the kinetic energy operator of its center of mass has a nonzero expectation value.

strangerep said:
In QM (e.g., Ballentine p81), for a free particle, ##H = \frac12 \, M\, V\cdot V + E_0##. So in the ground state ##|E_0\rangle## we have ##H|E_0\rangle = E_0 |E_0\rangle,## hence ##V^2 |E_0\rangle = 0##.

But in a higher energy eigenstate, we get a non-zero eigenvalue for ##V^2##, which suggests "moving".
For a free particle, the only bound state is the ground state. The spectrum consists of the interval ##[E_0,\infty)##. The solutions of the time-independent Schrödinger equation for ##E>E_0## are plane waves with frequency ##\omega=(E-E_0)/\hbar##, corresponding to uniform motion with kinetic energy ##E-E_0##.
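To make the continuum concrete, here is a minimal numerical sketch (my own illustration, assuming ##\hbar = m = 1## and a large box with Dirichlet walls as an infrared regulator, not anything from the posts above): the discretized free-particle Hamiltonian has eigenvalues that crowd together just above ##E_0 = 0##, approximating the continuous interval ##[E_0,\infty)## of the plane waves.

```python
import numpy as np

# Free particle in a large box (hbar = m = 1), 3-point finite differences.
# As the box size L grows, the discrete eigenvalues crowd together just
# above E_0 = 0, approximating the continuous spectrum [E_0, oo).
N, L = 800, 200.0
dx = L / N
H = (np.diag(np.full(N, 1.0 / dx**2))
     + np.diag(np.full(N - 1, -0.5 / dx**2), 1)
     + np.diag(np.full(N - 1, -0.5 / dx**2), -1))

E = np.linalg.eigvalsh(H)
print(E[0])          # ~1e-4: essentially the ground state E_0 = 0
print(E[10] - E[0])  # small spacing that shrinks further as L grows
```

The eigenvectors are (standing-wave superpositions of) plane waves, each with nonzero kinetic energy above the ground state.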
 
  • Like
Likes mattt, dextercioby, strangerep and 1 other person
A. Neumaier said:
Nothing moves in a normalizable energy eigenstate.
If the energy eigenstate is not normalizable, then you can't prepare the system in this state, of course. Then you must build "wave packets" of these scattering states to approximate it; thus it's not an energy eigenstate anymore, and then indeed something may move.

Here we discuss bound states like the hydrogen atom. In a hydrogen atom in its ground state nothing moves.
 
vanhees71 said:
I'd be interested to hear more arguments against my statement that nothing moves in an energy eigenstate, but that's of course off-topic here.

...

If the energy eigenstate is not normalizable, then you can't prepare the system in this state, of course.
From an instrumentalist perspective, in the same way we associate a state with a preparation, we should associate the posed question with an experiment. E.g. "Does the system, when prepared in an energy eigenstate, move?" could be interpreted as "What is the probability that sequential determinations of the position of the centre of mass of the system all return the same result?"
 
  • Like
Likes Nugatory and dextercioby
Let ##A## be an arbitrary observable, represented by the self-adjoint operator ##\hat{A}##. Then the operator that represents the time derivative ##\dot{A}## is (independently of the chosen picture of time evolution!)
$$\mathring{\hat{A}}=\frac{1}{\mathrm{i}\hbar}[\hat{A},\hat{H}].$$
Thus the expectation value of the time derivative is
$$\frac{\mathrm{d}}{\mathrm{d} t} \langle A \rangle=\langle E|\mathring{\hat{A}}|E \rangle=\frac{1}{\mathrm{i}\hbar} E \left(\langle E|\hat{A}|E \rangle-\langle E |\hat{A}|E \rangle\right)=0,$$
where I have used that
$$\hat{H} |E \rangle=E |E \rangle, \quad \langle E|\hat{H}=E \langle E|,$$
which follows from the self-adjointness of ##\hat{H}##.
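The identity can be checked numerically (a sketch of my own, with random Hermitian matrices standing in for ##\hat{H}## and the arbitrary observable ##\hat{A}##):

```python
import numpy as np

# Numerical check: for any eigenvector |E> of H and any observable A,
# <E|[A,H]|E> = (E - E)<E|A|E> = 0, exactly as in the derivation above.
rng = np.random.default_rng(0)
N = 30
B = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
H = (B + B.conj().T) / 2                    # random Hermitian "Hamiltonian"
C = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A = (C + C.conj().T) / 2                    # arbitrary Hermitian observable

w, V = np.linalg.eigh(H)
for k in range(N):                          # every energy eigenstate
    E = V[:, k]
    val = E.conj() @ (A @ H - H @ A) @ E
    assert abs(val) < 1e-10                 # d<A>/dt = 0 in each |E>
print("all commutator expectations vanish")
```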
 
vanhees71 said:
Let ##A## be an arbitrary observable, represented by the self-adjoint operator, ##\hat{A}##. Then the operator that represents the time derivative ##\dot{A}## is...
Of course the time derivative of the expectation value of an arbitrary observable is zero for a stationary state. Sure, the explicit calculation verifying this is still nice. But verifying something obvious does not settle the current disagreement. It even gives the impression that you don't understand what the disagreement is about.
 
  • Like
Likes dextercioby
Obviously, I don't understand where there can be a disagreement about the fact that energy eigenstates are stationary states. It follows even more simply from the Schrödinger equation. If ##|\Psi(0) \rangle=|\Psi_0 \rangle=|E \rangle##, where ##|E \rangle## is an energy eigenvector, then it follows that
$$|\Psi(t) \rangle=\exp(-\mathrm{i} \hat{H} t/\hbar) |\Psi(0) \rangle=\exp(-\mathrm{i} \hat{H} t/\hbar) |E \rangle=\exp(-\mathrm{i} E t/\hbar) |E \rangle.$$
The state thus is
$$\hat{\rho}(t)=|\Psi(t) \rangle \langle \Psi(t)|=|E \rangle \langle E|=\text{const}.$$
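The phase cancellation in the projector can also be verified numerically (my own sketch, with ##\hbar = 1## and a random Hermitian ##\hat{H}##, not part of the original post):

```python
import numpy as np

# |psi(t)> = exp(-i H t)|E> picks up only a phase, so the density matrix
# rho(t) = |psi(t)><psi(t)| stays exactly |E><E| (hbar = 1).
rng = np.random.default_rng(1)
N = 8
B = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
H = (B + B.conj().T) / 2
w, V = np.linalg.eigh(H)
E0 = V[:, 0]                                  # an energy eigenstate

rho0 = np.outer(E0, E0.conj())
for t in (0.7, 3.1, 12.9):
    # propagate: U(t) = V diag(exp(-i w t)) V^dagger applied to E0
    psi_t = V @ (np.exp(-1j * w * t) * (V.conj().T @ E0))
    rho_t = np.outer(psi_t, psi_t.conj())
    print(np.max(np.abs(rho_t - rho0)))       # ~0 at every time
```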
So in which sense do you think that "something is moving" even if the system is in an energy eigenstate?
 
  • Like
Likes Lord Jestocost
  • #10
vanhees71 said:
Let ##A## be an arbitrary observable, represented by the self-adjoint operator, ##\hat{A}##. Then the operator that represents the time derivative ##\dot{A}## is (independently of the chosen picture of time evolution!)
$$\mathring{\hat{A}}=\frac{1}{\mathrm{i}\hbar}[\hat{A},\hat{H}].$$
Thus the expectation value of the time derivative is
$$\frac{\mathrm{d}}{\mathrm{d} t} \langle A \rangle=\langle E|\mathring{\hat{A}}|E \rangle=\frac{1}{\mathrm{i}\hbar} E \left(\langle E|\hat{A}|E \rangle-\langle E |\hat{A}|E \rangle\right)=0,$$
where I have used that
$$\hat{H} |E \rangle=E |E \rangle, \quad \langle E|\hat{H}=E \langle E|,$$
which follows from the self-adjointness of ##\hat{H}##.
I don't strongly disagree with associating motion with a changing expectation value, so long as the association is made explicit. But I don't think motion is exclusively associated with the expectation value. E.g., the paper https://arxiv.org/abs/gr-qc/9210010 shows how we might model the noise in sequences of measurements on the same system (as opposed to repeated measurements, each on different members of an ensemble) with equations of motion and a Langevin force. We can see that this sense of motion would emerge even if the system is always prepared in the ground state.
 
  • Like
Likes gentzen and dextercioby
  • #11
I missed the background to the thread; the link to the quotation in the first post seems broken, so perhaps I am missing something...
vanhees71 said:
Obviously, I don't understand, where there can be a disagreement about the fact that energy eigenstates are stationary states.
As others ask how is motion defined?
Is it certain observables or our information that "moves"?

Motion to me means change of some parameter with time with respect to some space. Position in 3D space seems like the standard meaning if not qualified.

The stationarity of energy eigenstates imo refers to our information. So a stationary state often captures a stationary information state that reflects, via the conjugate variable, a periodic motion.

But I suppose it's something more delicate that's being discussed?

For example, in the ground state of hydrogen, it seems there are radial oscillations, described by the stationary s orbital with angular symmetry? Do you label this "motion" or not? It seems like a matter of definition.

/Fredrik
 
  • Like
Likes dextercioby
  • #12
Fra said:
I missed the background to the thread, the link to the quotation i the first post seems broken so perhaps i am missing something..
The original discussion was probably the one stopped shortly after
DrClaude said:
vanhees71 said:
Then, how can an energy eigenstate describe a moving particle? It's a stationary state!
But ##\langle p^2 \rangle \neq 0##. I guess it comes down to what "moving" means :smile:
I guess vanhees71 started this thread (because the other thread got closed), then tried to delete it again, but already got the answer from strangerep. (Or maybe a moderator deleted vanhees71's post, still the answer and this thread survived.)
 
  • #13
If we know that the system is in an exact energy eigenstate then this implies the information that it's a stationary state and thus that nothing is changing with time. In pop-sci language I'd say it means "nothing moves".

This thread was somehow initiated from an off-topic side discussion in the advisor launch. I've also seen that they forgot to copy one of my postings from there. Its content is what I've posted now in #9.
 
  • Like
Likes dextercioby
  • #14
gentzen said:
The original discussion was probably the one stopped shortly after

I guess vanhees71 started this thread (because the other thread got closed), then tried to delete it again, but already got the answer from strangerep. (Or maybe a moderator deleted vanhees71's post, still the answer and this thread survived.)
This thread is an offshoot of https://www.physicsforums.com/threads/electrons-at-absolute-zero-do-they-still-move.1050269/

Although I don't get why this should be of importance.
 
  • Like
Likes dextercioby
  • #15
vanhees71 said:
If we know that the system is in an exact energy eigenstate then this implies the information that it's a stationary state and thus that nothing is changing with time. In pop-sci language I'd say it means "nothing moves".
I see, with that definition what you say makes sense to me.

(I thought the discussion was about some duality between different information states, and such as states vs changes of the same state, or conjugate spaces and which one is more primary, but it seems not.)

/Fredrik
 
  • #16
vanhees71 said:
Thus the expectation value of the time derivative is
$$\frac{\mathrm{d}}{\mathrm{d} t} \langle A \rangle=\langle E|\mathring{\hat{A}}|E \rangle=\frac{1}{\mathrm{i}\hbar} E \left(\langle E|\hat{A}|E \rangle-\langle E |\hat{A}|E \rangle\right)=0,$$
But in the statistical interpretation that you always advocate, this only says that in an energy eigenstate, nothing happens on the average. In each single realization, something might happen.
 
  • Like
Likes dextercioby
  • #17
vanhees71 said:
If we know that the system is in an exact energy eigenstate then this implies the information that it's a stationary state and thus that nothing is changing with time.
Such statements may be fine as long as it is a purely academic discussion with no risk to ever lead to mistakes (or "costly" disputes) in practical applications. Practice tends to be messy, and induces the desire to make things simpler than they actually are. One such discussion I still have in mind was about "The ‘stable’ velocity for the electron for an orbit around the nucleus equals, ##v=\sqrt{\frac{eQ}{4\pi \epsilon_0 mr}}## (3.103)" on page 83 of Thomas Verduin's "Quantum Noise Effects in e-Beam Lithography and Metrology" PhD thesis. The discussion was about the momentum transfer between the primary electron and a secondary electron "kicked out" of an inner shell at a certain ionization energy. Because that electron in the inner shell is in a stationary state, "somebody" was convinced that assigning a velocity to it (before the collision) and using that "classical" information for computing the momentum transfer was "completely wrong". So even after I fixed some artifacts of that specific "formulation" above, it still felt "too classical" to him. I am an instrumentalist in those matters; the formulas have to work well in their specific scenarios, and interpolate somewhat reasonably between scenarios. Focusing on one specific aspect over all others simply risks generating totally wrong results for some scenarios. (And believe me, the formulas can get messy, no matter how you compute it.)

gentzen said:
Momentum without movement. Just like in Bohmian mechanics. Since nearly all physicists agree that this is a shortcoming of Bohmian mechanics, my guess would be that vanhees71 is wrong in this specific case. I just can't believe that Bohmian mechanics should be right in this respect. It will often be the ground state of a harmonic oscillator, and of course oscillating is what it will do.
Perhaps vanhees71 is right, but then I would like to see how he argues that Bohmian mechanics is correct in this "momentum without movement" aspect. I don't fully understand it yet; I plan to read a two-paper series by Peter Holland on that problem at some point, but I am fully aware that vanhees71 has something completely different in mind.

Currently, I am actually working on something even more complicated, namely crystal effects in SEM. And because there the question of which classical computations are appropriate for which part arose again, I worked out a closely related riddle arising in Bohmian mechanics. If you want to understand my position, look at Demystifier's initial reaction when I tried to explain my resolution. (As a mathematician, I believe in "conservation of difficulty" and that questions can ultimately be resolved, but I certainly don't believe that the solution will always be the most simple one that first comes to your mind, before you have tried to work out the tricky parts in detail.)
gentzen said:
Demystifier said:
Can you be more specific about your limit? I mean, can you express it with math, or with pictures, rather than with words?
Take a "simple" window function, for example a (suitably shifted) Hann function ("raised cosine window"): ##w_L(x):=\sin^2(\pi x/L) \chi_{(-L,0)}(x)##. Take ##\phi_L(x,0):=w_L(x)\exp(ikx)## to be the initial wavefunction, for the 1D scenario with a potential barrier starting at ##x=0##. This initial wavefunction is now used both for the distribution of particles, and for solving the time dependent Schrödinger equation. For a given ##L##, we get (a distribution of) trajectories, and those trajectories may cross each other, because of the time dependence. And now we investigate the limit for ##L \to \infty##, especially the limit of the phase distribution of ##\phi_L(x,t)##, where we are allowed to suppress a global phase factor like ##\exp(-i\omega t)##.
 
  • Like
Likes dextercioby
  • #18
gentzen said:
Such statements may be fine as long as it is a purely academic discussion with no risk to ever lead to mistakes (or "costly" disputes) in practical applications. Practice tends to be messy, and induces the desire to make things simpler than they actually are. One such discussion I still have in mind was about "The ‘stable’ velocity for the electron for an orbit around the nucleus equals, ##v=\sqrt{\frac{eQ}{4\pi \epsilon_0 mr}}## (3.103)" on page 83 of Thomas Verduin's "Quantum Noise Effects in e-Beam Lithography and Metrology" PhD thesis. The discussion was about the momentum transfer between the primary electron and a secondary electron "kicked out" of an inner shell at a certain ionization energy. Because that electron in the inner shell is in a stationary state, "somebody" was convinced that assigning a velocity to it (before the collision) and using that "classical" information for computing the momentum transfer was "completely wrong". So even after I fixed some artifacts of that specific "formulation" above, it still felt "too classical" to him. I am an instrumentalist in those matters; the formulas have to work well in their specific scenarios, and interpolate somewhat reasonably between scenarios. Focusing on one specific aspect over all others simply risks generating totally wrong results for some scenarios. (And believe me, the formulas can get messy, no matter how you compute it.)
Well, I'm convinced that one can understand this in correct quantum-mechanical terms. In the classical theory there is no "stable velocity for the electron for an orbit around the nucleus". That problem was famously solved with the formulation of modern quantum mechanics, getting rid of all inconsistencies of the pseudo-quantum theory a la Bohr.
gentzen said:
Perhaps vanhees71 is right, but then I would like to see how he argues that Bohmian mechanics is correct in this "momentum without movement" aspect. I don't fully understand it yet; I plan to read a two-paper series by Peter Holland on that problem at some point, but I am fully aware that vanhees71 has something completely different in mind.
I don't think that Bohmian mechanics helps in any way here. I don't know how Bohmian mechanics describes a bound electron, though. Do you have a reference where this is discussed?
gentzen said:
Currently, I am actually working on something even more complicated, namely crystal effects in SEM. And because there the question of which classical computations are appropriate for which part arose again, I worked out a closely related riddle arising in Bohmian mechanics. If you want to understand my position, look at Demystifier's initial reaction when I tried to explain my resolution. (As a mathematician, I believe in "conservation of difficulty" and that questions can ultimately be resolved, but I certainly don't believe that the solution will always be the most simple one that first comes to your mind, before you have tried to work out the tricky parts in detail.)
You can, of course, make things more complicated than necessary by using Bohmian mechanics, which adds unobservable elements ("trajectories") to quantum theory that are not useful for finding the most efficient way of calculating a given physical problem.
 
  • #19
The question is whether something moves in an energy eigenstate. This question by itself is vague, so it doesn't have a unique answer. There are, however, several inequivalent ways in which this question can be made precise. Among them, there are at least two precise versions in which the answer is yes.

1. Prepare the system in an energy eigenstate and then perform a single measurement of the velocity defined by the operator ##\hat{v}##. Can the measurement outcome ##v## be different from zero? The answer is - yes it can.

2. Use weak measurement to measure the trajectory of the particle, without affecting its wave function. The trajectory obtained this way corresponds to a moving trajectory, which in fact looks exactly like the Bohmian trajectory.
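Point 1 can be illustrated for the hydrogen ground state (my own sketch, in atomic units; the 1s momentum-space density ##|\phi(p)|^2 = 8/\bigl(\pi^2 (1+p^2)^4\bigr)## is the standard textbook result): a single momentum measurement is almost surely nonzero, even though ##\langle \vec p \rangle = 0## by symmetry.

```python
import numpy as np

# Hydrogen 1s state in momentum space (atomic units):
# |phi(p)|^2 = 8 / (pi^2 (1 + p^2)^4).  The radial density below
# integrates to 1, and <p^2> = 1, consistent with the virial theorem
# (<T> = p^2/2 = 1/2), even though <p> = 0 by spherical symmetry.
p = np.linspace(0.0, 80.0, 800001)
dp = p[1] - p[0]
radial = 4.0 * np.pi * p**2 * 8.0 / (np.pi**2 * (1.0 + p**2)**4)

norm = radial.sum() * dp          # total probability -> 1
p2 = (p**2 * radial).sum() * dp   # <p^2> -> 1
print(norm, p2)
```

So a velocity measurement on the eigenstate yields a spread of nonzero outcomes, exactly as claimed.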
 
  • Like
Likes gentzen and Morbert
  • #20
Demystifier said:
The question is whether something moves in the energy eigenstate. This question by itself is vague, so it doesn't have a unique answer. There are, however, several inequivalent ways how this question can be made precise. Among them, there are at least 2 different precise versions in which the answer is - yes.

1. Prepare the system in an energy eigenstate and then perform a single measurement of the velocity defined by the operator ##\hat{v}##. Can the measurement outcome ##v## be different from zero? The answer is - yes it can.
I think the precise definition that "something is moving" is that its state changes with time, and in this sense if the system is prepared in an energy eigenstate there's "nothing moving", because then the state ##\hat{\rho}(t)=|E \rangle \langle E|## is time-independent (argued within the Schrödinger picture of time evolution for simplicity).

Ad 1. Of course, you can get ##v \neq 0##, but this doesn't imply that anything "is moving": the velocity is simply undetermined before the measurement, and measuring it gives, with a probability determined by the prepared state, some value, which may be different from 0.
Demystifier said:
2. Use weak measurement to measure the trajectory of the particle, without affecting its wave function. The trajectory obtained this way corresponds to a moving trajectory, which in fact looks exactly like the Bohmian trajectory.
Do you mean the "trajectory" of a charged particle as seen in a cloud chamber? Here indeed the state of the particle changes, and it's not in an energy eigenstate. It loses energy by ionizing the vapour molecules in the cloud chamber! What this has to do with Bohmian trajectories is not clear to me.
 
  • #21
vanhees71 said:
I think the precise definition that "something is moving" is that its state changes with time
This definition is precise, but not directly measurable.
 
  • #22
vanhees71 said:
Do you mean the "trajectory" of a charged particle as seen in a cloud chamber?
No, the weak measurement is something completely different.
 
  • #23
vanhees71 said:
I think the precise definition that "something is moving" is that its state changes with time, and in this sense if the system is prepared in an energy eigenstate there's "nothing moving",
I'd say it's the information about the future, as defined by the agent's expectation, that doesn't change with time; this is fine. But even if it did change with time, in a unitary way, the total information sort of doesn't change anyway if you include "knowledge of the Hamiltonian" in the agent's information.

But I think that a stationary state of the information itself sometimes encodes, via conjugate variables, that the future itself is expected to be uncertain and thus in a way "changing" or "moving", but not necessarily in a way that implies a defined trajectory. Whether it's chaotic beyond tracking or just uncertain makes no difference to the agent, I think; the result is the same: unpredictable "change".

/Fredrik
 
  • #24
vanhees71 said:
Well, I'm convinced that one can understand this in correct quantum-mechanical terms. In the classical theory there is no "stable velocity for the electron for an orbit around the nucleus". That problem was famously solved with the formulation of modern quantum mechanics, getting rid of all inconsistencies of the pseudo-quantum theory a la Bohr.
(Edit: In the end, it is the application of some Virial theorem which hides behind that paragraph which includes the sentence with that "stable velocity for the electron for an orbit around the nucleus".)
The problem for me is this focus on "correct", i.e., consistent, application of quantum mechanics. If this insistence in the end leads to simulations with systematic deviations from what one can measure in experiments, for parts which could have been understood (and "simulated") well enough in terms of a mixture of classical and quantum terms (not in the sense of Bohr's old pseudo-quantum theory, but still in terms of Bohr's newer mixing of classical images with quantum theory), then I get the impression that a better mastery of interpretational issues (in an instrumentalist sense) by physicists would be a good thing.

This is similar to designing an optical system while somebody always protests against the use of geometrical optics to design most parts of the system, always insists that a proper understanding in wave-optical terms is possible, and in the end fails to optimize the relevant properties of the optical system.

vanhees71 said:
I don't think that Bohmian mechanics helps in any way here. I don't know, how Bohmian mechanics describes a bound electron though. Do you have a reference, where this is discussed?
Where is the problem? Bound states are normalizable; they give a "canonical" solution to the time-dependent Schrödinger equation, so you can just apply "standard Bohmian mechanics". But the trajectories in that solution are all constant, i.e., nothing is moving at all.

vanhees71 said:
You can, of course, make things more complicated than necessary by using Bohmian mechanics, which adds unobservable elements ("trajectories") to quantum theory, which are not useful for anything to get the most efficient way of calculating a given physical problem.
What you get is an "exact model" where you can investigate your confusing problem and try to work out some of the tricky parts in detail. And by "conservation of difficulty," those details often also help you better understand what might be "some possible" way of calculating a given physical problem. A drawback of such an "exact model" is that it often prefers "some specific way" of calculating over other ways, and the most efficient way of calculating might have been one of the other ways.
 
  • #25
vanhees71 said:
Let ##A## be an arbitrary observable, represented by the self-adjoint operator, ##\hat{A}##. Then the operator that represents the time derivative ##\dot{A}## is (independently of the chosen picture of time evolution!)
$$\mathring{\hat{A}}=\frac{1}{\mathrm{i}\hbar}[\hat{A},\hat{H}].$$
Thus the expectation value of the time derivative is
$$\frac{\mathrm{d}}{\mathrm{d} t} \langle A \rangle=\langle E|\mathring{\hat{A}}|E \rangle=\frac{1}{\mathrm{i}\hbar} E \left(\langle E| \hat{A}|E \rangle-\langle E |\hat{A}|E \rangle\right)=0,$$
where I have used that
$$\hat{H} |E \rangle=E |E \rangle, \quad \langle E|\hat{H}=E \langle E|,$$
which follows from the self-adjointness of ##\hat{H}##.
I had a similar thought about this as already expressed by @A. Neumaier. Just because the expectation value vanishes doesn't mean that (e.g.,) the variance also vanishes, since it involves
$$\mathring{\hat{A}}^2 ~\propto~ [\hat{A},\hat{H}] \, [\hat{A},\hat{H}]
~=~ \hat{A} \hat{H} \hat{A}\hat{H} - \hat{A} \hat{H} \hat{H}\hat{A}
~-~ \hat{H} \hat{A} \hat{A}\hat{H} + \hat{H} \hat{A} \hat{H}\hat{A},$$
and you don't get zero in general when you put this inside ##\langle E| \dots |E \rangle##.

This is related to what happens if one queries the magnitude of velocity squared. Its expectation value does not vanish, even though the expectations of the individual velocity components do.
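In the harmonic oscillator this can be checked directly (a sketch of my own, assuming ##\hbar = m = \omega = 1## and a truncated number basis): there ##\mathring{\hat{x}} = -\mathrm{i}[\hat{x},\hat{H}]## is just ##\hat{p}##, whose expectation in ##|n\rangle## vanishes while its square does not.

```python
import numpy as np

# Harmonic oscillator (hbar = m = omega = 1) in a truncated number basis:
# xdot = -i[x, H] equals p here; its expectation in |n=3> vanishes,
# but <xdot^2> = n + 1/2 does not.
N = 60
a = np.diag(np.sqrt(np.arange(1.0, N)), 1)   # annihilation operator
x = (a + a.T) / np.sqrt(2.0)                 # position operator
H = a.T @ a + 0.5 * np.eye(N)                # eigenvalues n + 1/2

xdot = -1j * (x @ H - H @ x)                 # Heisenberg velocity operator
psi = np.zeros(N); psi[3] = 1.0              # eigenstate |n=3>
print(abs(psi @ xdot @ psi))                 # ~0: mean velocity vanishes
print((psi @ (xdot @ xdot) @ psi).real)      # 3.5 = n + 1/2: variance doesn't
```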
 
  • #26
vanhees71 said:
I think the precise definition that "something is moving" is that its state changes with time, and in this sense if the system is prepared in an energy eigenstate there's "nothing moving", because then the state ##\hat{\rho}(t)=|E \rangle \langle E|## is time-independent (argued within the Schrödinger picture of time evolution for simplicity).
The word "stationary" can be seriously misleading in the description of statistical ensembles. True, you have ## \langle x \rangle = 0 ## for a harmonic oscillator in an energy eigenstate (with the phase maximally uncertain). But looking at the formula $$
\langle x(t) x(0) \rangle = \textstyle
{\hbar \over m \omega} \langle n + \frac 1 2 \rangle \cos (\omega t) \ ,
$$ can you really teach your students that an oscillator doesn't move when it is in the "stationary" state ## n = 3 ## ?
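For what it's worth, this correlation function is easy to reproduce numerically (my own sketch, ##\hbar = m = \omega = 1##; since ##\hat{H}## is diagonal in the number basis, the propagator is just a diagonal phase matrix):

```python
import numpy as np

# Two-time correlation <n| x(t) x(0) |n> for the oscillator
# (hbar = m = omega = 1, truncated number basis).  Its real part
# equals (n + 1/2) cos(omega t), matching the formula above.
N, n = 60, 3
a = np.diag(np.sqrt(np.arange(1.0, N)), 1)
x = (a + a.T) / np.sqrt(2.0)
energies = np.arange(N) + 0.5                # H is diagonal: E_k = k + 1/2

psi = np.zeros(N); psi[n] = 1.0
for t in (0.0, 0.5, 1.0, 2.0):
    U = np.diag(np.exp(-1j * energies * t))  # exp(-i H t)
    xt = U.conj().T @ x @ U                  # Heisenberg-picture x(t)
    c = (psi @ (xt @ x) @ psi).real
    print(t, c, (n + 0.5) * np.cos(t))       # the two columns agree
```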
 
  • #27
This is an autocorrelation function, describing fluctuations. In which sense do you think it describes that the oscillator is "moving"?
 
  • #28
vanhees71 said:
In which sense do you think it describes that the oscillator is "moving"?
Strange question. Doesn't ## x ## usually denote the position of a harmonic oscillator?
 
  • #29
Yes, and its expectation value is time-independent if the oscillator is in an energy eigenstate. What you considered is an autocorrelation function, describing fluctuations around this stationary expectation value.

It's as with a gas at a finite temperature in its rest frame in global thermal equilibrium. Nothing moves, and on average the gas molecules are at rest, but of course they are still fluctuating around. The average velocity (the expectation value of a time derivative) is 0, but its autocorrelation function or simply ##\langle v^2 \rangle \neq 0##.
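The gas analogy can be made quantitative with a quick sample (my own sketch, units with ##k_B = m = 1##):

```python
import numpy as np

# Ideal gas in equilibrium (k_B = m = 1, temperature T): each velocity
# component is Gaussian with variance T, so <v> = 0 while <v^2> = 3T.
# Nothing moves on average, yet every molecule fluctuates.
rng = np.random.default_rng(42)
T, n_mol = 2.0, 1_000_000
v = rng.normal(0.0, np.sqrt(T), size=(n_mol, 3))

print(v.mean(axis=0))               # ~ [0, 0, 0]
print((v**2).sum(axis=1).mean())    # ~ 3 * T = 6.0
```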
 
  • #30
vanhees71 said:
What you considered is an autocorrelation function, describing fluctuations around this stationary expectation value.
I think your view is too myopic. The "fluctuations" are not random, but strictly periodic, and in perfect agreement with what you'd expect classically. (Assuming of course that the oscillator remains unperturbed.)
 
  • #31
What I expect from classical mechanics is rather described by a coherent or squeezed state than by an energy eigenstate!
 
  • #32
WernerQH's point was what I thought the thread was about. What is a periodic "movement" in Q space is stationary in the conjugate P space. That way, one can argue that what is stationary or not depends on the agent's way of encoding things. Of course the Fourier transform is good because it makes encoding periodic phenomena simple, or even stationary.

/Fredrik
 
  • #33
vanhees71 said:
This is a autocorrelation function, describing fluctuations. In which sense do you think it describes that the oscillator is "moving"?
WernerQH said:
Strange question. Doesn't ## x ## usually denote the position of a harmonic oscillator?
vanhees71 said:
Yes, and its expectation value is time-independent if the oscillator is in an energy eigenstate. What you considered is a autocorrelation function, describing fluctuations around this stationary expectation value.
Wow, so the disagreement really comes down to how you (vanhees71) interpret the word "moving". For me, the most important part of "moving" is that properties (like position) in a stationary state are not necessarily constant over time. But you basically seem to have no objections to this, you just want to call that "fluctuations" instead of "moving".

It seems like for you, "moving" implies a more systematic way of being not-constant, like for example changing position continuously over time with a certain velocity. I don't believe in that one either. WernerQH's example is nice, even if for me it feels just as "unphysical" (only to the other extreme) as the constant non-moving trajectories of stationary states in Bohmian mechanics.

vanhees71 said:
It's as with a gas at a finite temperature in its rest frame in global thermal equilibrium. Nothing moves, and on average the gas molecules are at rest, but of course they are still flucuating around. The average velocity (the expectation value of a time derivative) is 0, but its autocorrelation function or simply ##\langle v^2 \rangle \neq 0##.
And here you gave a nice detailed description making it clear that you don't insist on everything being always constant over time, but that it is mostly the absence of systematic movement which is important to you.
 
  • #34
For me moving means that the state of a system is time-dependent. In QT energy eigenstates are time-independent, and thus the system is "not moving". It's the solution to the problem of unstable atoms in the classical picture, where accelerating electrons crash into the nuclei within a very short time due to radiative energy loss. In QT that's not the case: neglecting the quantization of the electromagnetic field, you get electronic energy eigenstates which are the stationary states, and the atoms are thus stable in these states forever. The expectation value of the time derivative of any observable is 0, but of course the observables don't take determined values and thus fluctuate around their time-independent expectation values, and thus the expectation values of squares of time derivatives of observables are not 0, although "nothing is moving".
 
  • #35
vanhees71 said:
For me moving means that the state of a system is time-dependent.
That's how I understood you some posts up as well. No big point in arguing just about definitions.

vanhees71 said:
In QT energy eigenstates are time-independent and thus the system is "not moving". It's the solution to the problem of unstable atoms in the classical picture, where accelerating electrons crash into the nuclei within a very short time due to radiative energy loss. In QT that's not the case: Neglecting the quantization of the electromagnetic field you get electronic energy eigenstates which are the stationary states, and the atoms are thus stable in these states forever.
Interestingly, in my preferred interpretation, the choice of encoding the information in the best way is also what makes the agent fit, which relates to stability as well. If the agent can find a transformation that transforms the pattern of incoming data streams into a stationary code, that must be a massive evolutionary advantage. This is how I prefer to interpret it. The Fourier transform is just one of many possible transforms.

I recall that during my first introduction to QM, I felt that the role of the Fourier transform was suspiciously a key concept, but one that was not really motivated in any deeper way beyond the standard "wave-particle duality" stuff. I think I have come to a much deeper understanding of this over the years, and I think there are maybe yet deeper motivations for all this ahead of us.

/Fredrik
 
  • #36
To translate all this philosophy about information into physics: The information about the system is encoded in the state of the system, i.e., the statistical operator ##\hat{\rho}##. That's all information you can have about a system according to QT. You don't need agents or other fictitious elements but just the statistical operator!
 
  • Haha
Likes WernerQH and Fra
  • #37
vanhees71 said:
It's the solution for the problem of instable atoms in the classical picture
It would be a rather poor solution if it relied on wording, on just avoiding the word "motion". We all know that classical theory is only "approximately" valid (think of Rydberg atoms). But it is not helpful to shun the correspondence principle altogether, and insist on a peculiar usage of the word "motion" that is at odds with how most physicists use the term.
 
  • #38
One doesn't merely avoid the word "motion" but with QT has discovered a theory with a huge realm of applicability, where you have stable bound states of electrons to atomic nuclei. Indeed the classical theory is only an approximation to QT. The energy eigenstates provide solutions, where you have an electrostatic configuration, which is, in contradistinction to the classical theory, stable. Such a solution does not exist in the classical theory, and that's why in classical theory atoms couldn't exist as stable objects.
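The classical instability can even be quantified. A back-of-the-envelope sketch (the standard textbook Larmor estimate, with rounded constants; my own illustration) of the time for a classical electron to spiral from the Bohr radius into the nucleus, ##t = a_0^3/(4 r_e^2 c)##:

```python
# Classical hydrogen atom: time for the electron to spiral from the Bohr
# radius into the nucleus by Larmor radiation loss, t = a0^3 / (4 re^2 c).
a0 = 5.29e-11   # Bohr radius [m]
re = 2.818e-15  # classical electron radius [m]
c = 2.998e8     # speed of light [m/s]

t_collapse = a0**3 / (4 * re**2 * c)
print(f"classical collapse time ~ {t_collapse:.1e} s")   # ~1.6e-11 s
```

So a classical atom would collapse in about ##10^{-11}## seconds; the stationary energy eigenstates of QT are what remove this catastrophe.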

The correspondence principle is nowadays substituted by symmetry principles, which let us derive what QT looks like for specific problems. There's no need anymore for hand-waving arguments à la Bohr.
 
  • #39
Fra said:
... If the agent can find a transformation that transforms the pattern of incoming data streams into a stationary code, that must be a massive evolutionary advantage. ...
... I think I have come to a much deeper understanding of this over the years, and I think there are maybe yet deeper motivations for all this ahead of us.
If you feel that your understanding is so deep now, then try to write down something self-contained. Maybe just the solution to some specific riddle (the easiest route, because you don't need to "convert" anyone), perhaps some coherent interpretation (like the thermal interpretation), or some specific theorem (like the quantum de Finetti theorem).

vanhees71 said:
To translate all this philosophy about information into physics: ... You don't need agents or other fictitious elements ...
What I would find interesting is to do something with your agents. If all this talk about agents in the end just boils down to the perspective of a single agent, then the charge of solipsism sooner or later becomes quite justified. Probability is important in quantum physics, and probability is closely related to game theory, which is concerned with the interactions of "many agents" with each other. Now game theory is messy, more messy than physics in certain ways, but less messy than the actual biological and political realities out there. How can your philosophy help us with those issues related to agents and game theory?
 
  • #40
vanhees71 said:
The correspondence principle is nowadays substituted by symmetry principles, which let us derive how QT looks like for specific problems. There's no need anymore for hand-waving arguments a la Bohr.
In my diploma exam I was actually questioned about the correspondence principle. The examiner was a condensed matter theorist, who obviously did not consider it outdated! How do you explain to your students how quantum physics blends into classical physics? Unfortunately there is not a single established interpretation of quantum theory, so it would appear reasonable to me to expose students to a great variety of pictures (concepts, notions) so that they can hone their intuition (and discover the limits of applicability of those pictures). Rather than insisting on one "correct" picture. Aren't future physics teachers among your students? I can't help but feel pity for them and their future pupils.
 
  • #41
WernerQH said:
Unfortunately there is not a single established interpretation of quantum theory, so it would appear reasonable to me to expose students to a great variety of pictures (concepts, notions) so that they can hone their intuition (and discover the limits of applicability of those pictures). Rather than insisting on one "correct" picture. Aren't future physics teachers among your students? I can't help but feel pity for them and their future pupils.
Sorry, but this rant is off-topic, nasty, and doesn't even make sense. In Germany, quantum physics is not really taught at school, so whatever physics teachers are taught at university about its interpretation should have a negligible influence on their future pupils.

And if you wanted to teach different pictures to your students, then you should start with some existing expositions of those pictures. vanhees71 has no objections to presentations like section "3.7 Interpretations of Quantum Mechanics" in Weinberg's book or even entire books like "Verständliche Quantenmechanik: Drei mögliche Weltbilder der Quantenphysik" by Detlef Dürr and Dustin Lazarovici (or its English version). He even recommends those. I see no problem that he favors Ballentine's interpretation and his book, among other reasons because that allows him to defend Einstein's position without embracing all the surrounding philosophical discussions. In the end such discussions would just take valuable time away from his students.

Maybe one question is whether comprehensive books like "Do We Really Understand Quantum Mechanics?" by Franck Laloë should be recommended too. But really reading and understanding such a book would amount to diving into current research in quantum foundations, which might not be the best idea for a future physics teacher (or a future particle physics researcher).
 
Last edited:
  • Like
Likes dextercioby
  • #42
gentzen said:
If you feel that your understanding is so deep now, then try to write down something self-contained.
At some point I have in mind to publish, but I feel there are enough interpretations already, so I do not want to publish just another interpretation that makes no substantial difference to the open problems. And I am under no pressure to publish anything before I feel ready. It's not near ready yet.

I do not like to read such papers myself, at least not after having read enough of them. I enjoyed many writings of the QM founders in the past, but at this point I don't want to read just another of those papers. It's as bad and empty as the other extreme: axiomatic reconstructions where the axioms are chosen without physical motivation.

After all, my interpretation is, I am well aware, strange and complicated relative to, say, the Copenhagen interpretation (more so than standard QBism), so I expect no one to buy into it until its methods can be shown to solve real problems, and that is, to be fair, my problem. I have no intention of convincing anyone; I just try to stick to what I think is rational reasoning, but to each his own.
gentzen said:
What I would find intersting is to do something with your agents. If all this talk about agents in the end just boils down to the perspective of a single agent, then the charge of solipsism sooner or later becomes quite justified.
I agree completely. I have of course thought about this. To just end up with everything being arbitrary would be pointless; it's not what I seek.
gentzen said:
Probability is important in quantum physics, and probability is closely related to game theory, which is concerned with the interactions of "many agents" among each other. Now game theory is messy, more messy than physics in certain ways, but less messy than the actual biological and political realities out there. How can your philosophy help us with those issue related to agents and game theory?
Yes, game theory is the right perspective from which to see what I am talking about. (That's not to say one should jump into the formal game-theory literature and expect the exact math.)

A short comment, which as always is a balancing act, since I avoid putting details on the forums due to the guidelines. Mentors are free to delete the post if I crossed some lines.

In my view the agents/observers are the players (and the agents are of course simply matter subsystems, no brains or physicists needed), but there is no objective agreement on the "rules of the game"; the only rule is that the survivor wins: do what you can to survive. An agent's set of "strategies" is constrained by its physical limits, so the "strategy space" must necessarily scale (or rather evolve) with the complexity (mass) of the agent. This implies an evolution of law, a phrase coined by Lee Smolin, but his ideas were specifically about, for example, cosmological natural selection, where the laws mutate at each big bang and are frozen from then on. In principle I think the same way, except that I see no such clear line; somewhere around the TOE energy scale I expect the laws to be fixed enough, which is why we don't see variations of the laws when looking out into space.

The conceptual quest in this perspective is simple enough to be explained like this:

After some evolution, can we infer which populations of agents, encoding which strategies, are most likely to appear in the low-energy limit, without ending up in a landscape problem similar to string theory's?

Could these things correspond to (be isomorphic to) matter and their respective interactions?

And the unification of all interactions should follow from how new interactions become possible as agents grow in complexity. This is a natural reason why the laws of physics must become simpler the closer we get to unification. They only LOOK complex when seen from the fictive external asymptotic observer that is the conventional perspective in QFT. I.e., the theory, when properly scaled (not just renormalized in the regular way), must become very simple; simple enough to avoid the fine-tuning problem of string theory, for example.

The strategy is: formulate this in terms of mathematics and algorithms/computations, work it out, and try to make contact with the familiar concepts, such as space, time, mass, energy, charge, etc.

/Fredrik
 
  • #43
WernerQH said:
In my diploma exam I was actually questioned about the correspondence principle. The examiner was a condensed matter theorist, who obviously did not consider it outdated! How do you explain to your students how quantum physics blends into classical physics? Unfortunately there is not a single established interpretation of quantum theory, so it would appear reasonable to me to expose students to a great variety of pictures (concepts, notions) so that they can hone their intuition (and discover the limits of applicability of those pictures). Rather than insisting on one "correct" picture. Aren't future physics teachers among your students? I can't help but feel pity for them and their future pupils.
For sure, I don't bother my students with fruitless philosophical speculations. I admit that I have not yet found a way of teaching QT that I'm really satisfied with, so I take refuge in a blend with the "historical approach": I start with a short review of the historical development that led to modern quantum mechanics, emphasizing from the first moment on that everything before Heisenberg, Born, Jordan, Schrödinger, and Dirac is outdated and not a consistent picture. Concerning modern QT itself, of course I treat only non-relativistic QM in terms of wave mechanics, since with wave mechanics, in my opinion, you get the most intuitive picture which at the same time is closest to the full abstract content of the theory. You can't help it: QT is considerably more abstract than classical point-particle mechanics and also a bit more abstract than classical field theory, but that's how physics is in the 21st century. I also cover spin and the Pauli equation and, as a final topic, entanglement and the Bell inequality. Concerning interpretation, I present them with the minimal statistical interpretation, with the Born rule as the key postulate. I don't see any merit in thinking that one needs more than the minimal interpretation to do physics and to understand the phenomena related to QT.
 
  • Like
Likes hutchphd, gentzen, dextercioby and 1 other person
  • #44
gentzen said:
Sorry, but this rant is both off-topic, nasty, and doesn't even make sense.
Thanks for trying to moderate. :-) But I can't understand why it should be off-topic to criticize what I perceive as a distortion of the term "motion" as most people use it.
gentzen said:
In Germany, quantum physics is not really taught at school, so whatever physics teachers are taught in university about its interpretation should have a negligible influence on their future pupils.
It's a long time since I went to school, and we didn't have quantum physics then. But my impression from Physik Journal, the web site of Heisenberg-Gesellschaft, or physikerboard.de is that there is much effort to introduce elements of quantum physics (of course not quantum theory) already at school. And I think that some of the peculiar views expressed by van Hees can be detrimental to a young physics teacher who is supposed to explain these concepts.

Does a harmonic oscillator oscillate? Only if it is not in an energy eigenstate? How do you prepare it in the state ## n = 3 ## ? The formula in my previous post #26 is most easily derived by summing over energy eigenstates, but using coherent states you can obtain the exact same formula. I can't make sense of @vanhees71's point that one is allowed to speak of motion only when one uses coherent states.
 
  • #45
gentzen said:
Sorry, but this rant is both off-topic, nasty, and doesn't even make sense. In Germany, quantum physics is not really taught at school, so whatever physics teachers are taught in university about its interpretation should have a negligible influence on their future pupils.
Fortunately that's not entirely true. An idea about quantum theory is part of the general knowledge every high-school student should acquire before graduating, and indeed there is some QM in the high-school curricula. Fortunately the didactics also tends to cut down the amount of "old quantum theory" discussed. This is of course done at a more qualitative level, adapted to the very limited mathematical prerequisites the German high-school system offers, but at least one covers wave mechanics, the double-slit experiment, Stern-Gerlach, and the particle in the infinite potential box. In my time we even had the Schrödinger equation and the harmonic oscillator. For the hydrogen atom only the ground state was explicitly treated; otherwise it was explained in a qualitative way. However, I had an exceptionally good high-school teacher.
gentzen said:
And if you would want to teach different pictures to your students, then you should start with some existing expositions of those pictures. vanhees71 has no objections to presentations like section "3.7 Interpretations of Quantum Mechanics" in Weinberg's book or even entire books like "Verständliche Quantenmechanik: Drei mögliche Weltbilder der Quantenphysik" by Detlef Dürr and Dustin Lazarovici (or its english version). He even recommends those. I see no problem that he favors Ballentine's interpretation and his book, among others because that allows him to defend Einstein's position without embracing all the surrounding philosophical discussions. In the end such discussions would just drag away valuable time from his students.
Indeed, one should keep all this philosophical confusion away from the students. It doesn't in any way help to understand the physics. Concerning philosophy, I think what QT teaches us is that we can understand, to a certain extent, natural phenomena which are way beyond what we directly perceive by our senses, which are adapted to the macroscopic environment we have to survive in, but that this understanding can only be expressed in a rather abstract mathematical way. On the other hand, abstraction makes things simpler rather than more complicated, because it helps to get rid of all kinds of distractions and enables a presentation of the theory in terms of its "bare bones". For that, the abstract rigged-Hilbert-space formalism (aka Dirac's bra-ket formalism) is the most clear and simple exposition, but that's of course out of reach at the high-school level.
gentzen said:
Maybe one question is whether comprehensive books like "Do We Really Understand Quantum Mechanics?" by Franck Laloë should be recommended too. But really reading and understanding such a book would amount to dive into current research in quantum foundations, which might not be the best idea for a future physics teacher (or a future particle physics researcher).
One should understand the physics first before reading such a book concerned with interpretational issues. One must also not forget that, incomprehensibly to me, there seems to be no consensus on the "right interpretation" of quantum mechanics yet. So it's an open, and in my opinion also quite ill-specified, interdisciplinary research topic on the boundary between physics and philosophy, which is one more argument to keep it entirely out of the discussion at high school.

However, you can of course discuss the 2022 Nobel Prize in physics just with the well-understood established physical theory in its minimal interpretation. At least the math is pretty simple. All you need is elementary algebra in finite-dimensional Hilbert spaces for the quantum part and some basic probability theory for the local-realistic-hidden-variable theory.
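That claim is easy to verify concretely. A minimal sketch of my own, using only 2×2 matrix algebra, computes the CHSH combination for the spin singlet at the standard measurement angles and recovers ##|S| = 2\sqrt{2} > 2##:

```python
import numpy as np

# Pauli matrices; measurements are spins along angles in the x-z plane.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2).
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Correlation <A(a) B(b)> in the singlet; equals -cos(a - b)."""
    return float(np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi))

# Standard CHSH settings.
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = corr(a, b) - corr(a, bp) + corr(ap, b) + corr(ap, bp)
print(abs(S))   # 2*sqrt(2) ~ 2.828, beyond the local-realistic bound of 2
```

Any local realistic hidden-variable theory obeys ##|S| \le 2##; the quantum prediction ##2\sqrt{2}## is what the 2022-Nobel experiments confirmed.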

What of course cannot be discussed is the issue most discussed here, locality vs. relativistic causality, which is in fact not a problem but is resolved by the microcausality constraint of modern relativistic QFT; i.e., there are no "actions at a distance" but only "long-ranged correlations", which are stronger than predicted by any local realistic hidden-variable theory.
 
  • #46
vanhees71 said:
Nothing moves, and on average the gas molecules are at rest, but of course they are still flucuating around.
This would mean that everything moves, but that you take a coarse-grained view only!? But what would this mean for an electron? Since the expectation refers to the ensemble, it would mean that every realization of the electron moves, but the net effect for the ensemble is zero.

... Unless you are adhering to my thermal interpretation, according to which the expectations are the real things observable, and correlations only tell about the amount of their intrinsic uncertainty!
 
Last edited:
  • #47
So far I follow your "interpretation". ;-).
 
  • #48
vanhees71 said:
The information about the system is encoded in the state of the system, i.e., the statistical operator ##\hat{\rho}##. That's all information you can have about a system according to QT.
But in the statistical interpretation, this would be a true statement only for an ensemble of identically prepared systems, not for a single system!
 
  • Like
Likes dextercioby
  • #49
vanhees71 said:
There's no need anymore for hand-waving arguments a la Bohr.
But all your arguments are pure handwaving when applied to a single system rather than to an ensemble!
vanhees71 said:
So far I follow your "interpretation". ;-).
Ah, finally? Note that there is nothing more to my interpretation than that! Everything else is just the application of this to various issues regarding single systems!
 
  • #50
The only thing I do not understand concerning your interpretation is how you define expectation values when you forbid the use of Born's rule, but I don't think that we'll ever come to a consensus about this.

What I also don't understand is your narrow interpretation of the "ensemble". Of course quantum theory also applies to statistics made with a single system. E.g., a single electron in a Penning trap is measured over very long times to get expectation values from the corresponding currents. Also, a gas in a container (a "canonical ensemble") consists of a fixed number of molecules, and statistical physics describes the coarse-grained macroscopic ("collective") observables of this "single system". For macroscopic systems the fluctuations of these collective observables are tiny compared to the average values themselves, and that's why you get "classical behavior".
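The point about long-time statistics on a single system can be illustrated with a toy model (my own sketch, not Penning-trap physics): for a stationary stochastic record, the time average over one long run reproduces the ensemble mean and variance.

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical stationary record of a fluctuating observable: a discrete
# Ornstein-Uhlenbeck (AR(1)) series with ensemble mean 0 and variance 1,
# standing in for a long monitored signal from a single system.
n, rho = 200_000, 0.99
noise = rng.normal(size=n) * np.sqrt(1 - rho**2)
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = rho * x[i - 1] + noise[i]   # stationary variance 1 by construction

time_avg = x.mean()    # time average over the single run: close to 0
time_var = x.var()     # close to the ensemble variance 1
print(time_avg, time_var)
```

One long record of one system suffices to estimate the "ensemble" statistics, which is the sense in which the trap experiments measure expectation values.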

The only other assumption you need, in order to use the maximum-entropy principle, is the H-theorem, according to which the constraints must be imposed using the additive conserved quantities, leading to the microcanonical, canonical, or grand-canonical ensembles.

One way to get off-equilibrium physics is to use the ##\Phi##-derivable approximations (the 2PI/CJT/Luttinger-Ward/Baym-Kadanoff formalism) leading to the Kadanoff-Baym equations; the "coarse graining" is done formally via the gradient expansion, which leads to quantum transport equations, and this is indeed an expansion in powers of ##\hbar##. The H-theorem can also be derived in this way.

In the toy-model cases (e.g., ##\phi^4## theory in (1+2) spacetime dimensions) where you can numerically solve both the Kadanoff-Baym equation and the transport equation, one finds that the semiclassical transport equations indeed describe the dynamics well, particularly the long-time limit, which of course leads to the usual standard Bose-Einstein or Fermi-Dirac distributions, as expected.
 
Back
Top