A QM is Feynman path ensemble - is QFT Feynman field ensemble?

Jarek 31
While classical mechanics uses a single action-optimizing trajectory, QM can be formulated as a Feynman ensemble of trajectories.
As in the derivation of Brownian motion, it is mathematically convenient to use nonphysical, nowhere-differentiable trajectories - should it be so?

Can this connection be extended to field theories - can we see QFT as a Feynman ensemble of fields?
Of nowhere-differentiable ones, or of more physical ones?
With concrete field configurations corresponding to each Feynman diagram?

Generally: what are the similarities and differences between classical and quantum field theories?
E.g. in classical field theory we remove infinities with regularization, in quantum with renormalization and cutoffs - is there any connection between them?
 
Yes, in QFT the path integral is over field configurations.

There is a relationship between the statistical mechanics (ensembles) of classical fields and the path integral in quantum field theory.
https://www.damtp.cam.ac.uk/user/tong/sft/two.pdf
These are notes on statistical field theory, but the similarity with path integrals in quantum field theory is discussed in Section 2.3.
 
Great, I would also like to think so, but I am not completely certain it is appropriate (?)
A Feynman diagram is a kind of higher abstraction level - there should be e.g. ensembles of EM field configurations behind the charged/magnetic-dipole particles in Feynman diagrams (?)


So maybe we can also translate between their methods of handling infinities?
In classical field theories we usually use a potential for this purpose - e.g. in sine-Gordon ( https://en.wikipedia.org/wiki/Sine-Gordon_equation ), strictly enforcing ##\varphi = \pm 1## would require infinite energy for a kink going e.g. from -1 to +1; the potential smooths it to a finite-energy transition.
Is there a connection between such classical potential-based regularization and QFT renormalization?
 
I remember statistical field theory from Fisher's great book; also, in the perturbative QFT approximation we indeed get Feynman diagrams from a field ensemble - by using the Taylor expansion and the Gaussian integration trick.
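The Gaussian integration trick boils down to Wick's (Isserlis') theorem: moments of a zero-mean Gaussian factor into sums over pairings of the legs. A quick Monte-Carlo sanity check of the simplest case (toy parameters, pure Python, not from the thread):

```python
import random
import statistics

# Wick's (Isserlis') theorem for a zero-mean Gaussian: <x^4> = 3 <x^2>^2,
# i.e. the 4-point function is a sum over the 3 pairings of the four legs.
random.seed(0)
sigma = 1.3                     # hypothetical width of the Gaussian "field"
xs = [random.gauss(0.0, sigma) for _ in range(200_000)]

m2 = statistics.fmean(x * x for x in xs)     # sampled <x^2>
m4 = statistics.fmean(x ** 4 for x in xs)    # sampled <x^4>

print(m4 / (3 * m2 * m2))       # close to 1: three pairings of four legs
```

The same pairing structure, applied term by term to the Taylor expansion of the interaction, is what generates the Feynman diagrams.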


But I really miss the question of e.g. the electric field around an electron in such a Feynman diagram - can we say anything about it (e.g. after also fixing the momenta), about the ensemble of such fields?
Shouldn't it be ##E \sim 1/r^2##, leading to a ##\rho \sim 1/r^4## mean energy density around such an electron?
However, that would integrate to infinite energy - classically requiring regularization/deformation: smoothing to finite energy, while in QFT we remove this infinity with cutoffs and renormalization - shouldn't there be a connection between them?
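The divergence can be written out explicitly - the standard classical self-energy estimate (a sketch, with an assumed short-distance cutoff ##a##):

```latex
% With E ~ q/r^2, the energy density is u = E^2/2 ~ 1/r^4, so the total
% field energy inside radius R diverges at the lower limit:
\int_0^{R} u(r)\, 4\pi r^2\, dr \;\propto\; \int_0^{R} \frac{dr}{r^2}
  \;=\; \left[ -\frac{1}{r} \right]_0^{R} \;\to\; \infty .
% A short-distance cutoff r > a renders it finite, ~ (1/a - 1/R):
% this cutoff is the classical side of the regularization question above.
```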

QFT is approached from two opposite worlds: nonperturbative (e.g. lattice), completely forgetting about particles, and abstract Feynman diagrams - aren't there attempts at intermediate approaches, e.g. just asking for the ensembles of field configurations behind each Feynman diagram?

Aren't soliton models an example of such an intermediate approach? To consider their scattering with incomplete knowledge, we also need to consider an ensemble of scenarios: Feynman diagrams.
Here is a Feynman-diagram article for the sine-Gordon model: "On the renormalization of the sine-Gordon model" https://arxiv.org/pdf/hep-th/0505276.pdf
 
I do not know if this is a good rephrasing:
the Wick rotation (1 -> i) relates two physical models, classical to quantum.
Does changing 1 -> (1 + i)/sqrt(2) correspond to an intermediate model between classical and quantum?
 
Wick rotation is a bit different - it exchanges e.g. statistical field theory with quantum field theory.

Earlier: "Wick rotating" the Feynman path ensemble, we get the Boltzmann path ensemble - e.g. as in Euclidean path integrals, maximal entropy random walk, or the Ising model - this mathematical analogy allows e.g. constructing Bell-violation examples, or QM-like computers in the Ising model.

In this QM-statistical physics analogy:
- mechanics: the classical approximation means a single action-optimizing trajectory, QM means the Feynman path ensemble,
- statistics: approximately, the system is in the single lowest-energy state, but more precisely it is in a Boltzmann ensemble over the possible states.
 
Heidi said:
I do not know if this is a good rephrasing:
the Wick rotation (1 -> i) relates two physical models, classical to quantum.
Does changing 1 -> (1 + i)/sqrt(2) correspond to an intermediate model between classical and quantum?
Wick rotation is just a mathematical trick to map a calculational problem in vacuum QFT from Minkowski to Euclidean space. In this Euclidean version of the QFT it's easier to argue about some formal mathematical aspects, e.g., in renormalization theory (see the famous work by BPHZ). You can evaluate all the (perturbative) proper vertex functions and thus ##N##-point Green's functions of the theory in Euclidean field theory, but then you have the trouble of analytically continuing back to "real time" (Minkowski space). This has nothing to do with some "classical to quantum" transition or something like that.

Another example where imaginary times occur in physics is in (both relativistic and non-relativistic) many-body QFT. In thermal equilibrium you need to evaluate expectation values of field operator products ("thermal Green's functions") with the statistical operator ##\hat{\rho}=\exp(-\beta \hat{H}+\mu \beta \hat{Q})/Z## (and you also have to evaluate ##Z##, the partition sum, too!). Apart from the partition sum in the denominator, the statistical operator looks similar to a time-evolution operator ##\hat{U}=\exp(-\mathrm{i} \hat{H} t)## with ##t=-\mathrm{i} \beta##.

This implies that you can formulate a perturbation theory like in vacuum QFT when formally setting ##t=-\mathrm{i} \tau##, where ##\tau \in (0,\beta)##. Since the expectation values of some operator ##\hat{A}## are given by the trace ##\mathrm{Tr}(\hat{\rho} \hat{A})##, it turns out that you have to impose periodic (antiperiodic) boundary conditions for bosonic (fermionic) field operators.

The results are thus perturbative Feynman rules and Feynman diagrams which look identical to the vacuum Feynman diagrams, but instead of an integral over the energies as in the vacuum case you have sums over the discrete Matsubara frequencies ##\omega_n = 2 \pi n/\beta## (bosons) or ##\omega_n = (2n+1) \pi/\beta## (fermions) with ##n \in \mathbb{Z}##, and the propagators are those of the Euclidean-field-theory free fields.

Also this has nothing to do with some classical vs. quantum transition.
 
vanhees71 said:
Also this has nothing to do with some classical vs. quantum transition.
I have to admit that I don't understand the difference between a classical and a quantum phase transition (?)
For example, the transverse-field Ising model is usually pointed to as the simplest quantum-phase-transition model ... but it is not a problem to also solve it as just a Boltzmann ensemble, and it turns out the predictions are quite similar: "Classical Ising chain in transverse field" https://www.sciencedirect.com/science/article/pii/S0304885306016295
 
Sure, but isn't this rather the (semi-)classical limit of (equilibrium) statistical mechanics, i.e., where you can approximate the Bose-Einstein/Fermi-Dirac distributions by the Boltzmann distribution?
 
  • #10
Bose-Einstein/Fermi-Dirac distributions can also be derived combinatorially: just choosing whether the number of particles per node is any natural number or restricted to {0,1}, with the energy e.g. as the ##n(n-1)## term in Bose-Hubbard.

The Wick rotation difference is more subtle, e.g.:
- the minimal-energy-state approximation is natural for the Boltzmann distribution, while for Feynman this kind of approximation needs something more subtle, e.g. the saddle-point method,
- in the Boltzmann distribution we lose the phase - e.g. removing interference,
- in Boltzmann we get attraction to the lowest energy state, while in Feynman the excited states are still stable (parabolic vs hyperbolic evolution) - e.g. Euclidean path integrals are numerically used mainly to find the ground state, and in maximal entropy random walk excited states are metastable.

Here is a MERW ( https://en.wikipedia.org/wiki/Maximal_entropy_random_walk - Boltzmann path ensemble) evolution example in a defected lattice - it sees excited states, but they are not stable as in Schrodinger: it converges to a stationary density as in the Schrodinger ground state.
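The parabolic-vs-hyperbolic contrast can be illustrated on a toy two-level system (illustrative numbers only, not from the MERW example):

```python
import cmath
import math

# Toy two-level system with energies E0 < E1, starting mostly in the
# excited level. Real-time (Feynman, e^{-iEt}) evolution keeps the excited
# occupation constant; imaginary-time (Boltzmann/Euclidean, e^{-E*tau})
# evolution drains it into the ground state.
E = [0.0, 1.0]
c = [0.4, math.sqrt(1 - 0.4 ** 2)]   # amplitudes in the energy eigenbasis

def occupations(weights):
    norm = sum(abs(w) ** 2 for w in weights)
    return [abs(w) ** 2 / norm for w in weights]

t = 50.0
real_time = [c[n] * cmath.exp(-1j * E[n] * t) for n in range(2)]
imag_time = [c[n] * math.exp(-E[n] * t) for n in range(2)]

print(occupations(real_time))   # excited occupation unchanged: [0.16, 0.84]
print(occupations(imag_time))   # collapsed onto the ground state: ~[1.0, 0.0]
```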
 
  • #12
Exactly - they focus on the "Euclidean gauge"; there is no imaginary 'i' in the exponent - this is the Wick-rotated picture, focused on the lowest-energy ground state.
 
  • #13
Sure, lattice QCD only works in the Euclidean QFT formulation. Real-time quantities, e.g. transport coefficients, are a big problem there!
 
  • #14
Indeed, but mathematically this makes it just a ("classical") Boltzmann distribution - statistical mechanics on fields, neglecting the quantum phase - blurring the boundary between classical and quantum mechanics.

Or MERW - just a "classical" random walk chosen according to Jaynes' principle of maximal entropy ( https://en.wikipedia.org/wiki/Principle_of_maximum_entropy ) - mathematically leading to the same stationary probability distribution as the ground state of quantum models like Hubbard or Schrodinger.

It raises the question of understanding the classical-quantum boundary.
The single lowest-energy state -> Boltzmann ensemble step is analogous to the single optimal-action trajectory -> Feynman ensemble step.

The naive response is that phase is crucial for QM - true e.g. for interference, but many "quantum" phenomena are already there in the Boltzmann ensemble, like Anderson localization ( https://en.wikipedia.org/wiki/Anderson_localization ), already obtained in MERW ( https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.102.160602 ).
 
  • #15
Jarek 31 said:
Wick rotation is a bit different - it exchanges e.g. statistical field theory with quantum field theory.
Jarek 31 said:
"Wick rotating" the Feynman path ensemble, we get the Boltzmann path ensemble - e.g. as in Euclidean path integrals,
Feynman paths (resp. fields) form an ensemble only in imaginary time QM (resp. Euclidean QFT), since only there one has a meaningful (i.e., positive) measure. In real time QM the Feynman measure is well-defined but complex, and one cannot associate stochastic processes (hence ensembles) to complex measures. In interacting Minkowski QFT, the Feynman measure is not even well-defined.
 
  • #16
Jarek 31 said:
Indeed, but mathematically this makes it just a ("classical") Boltzmann distribution - statistical mechanics on fields, neglecting the quantum phase - blurring the boundary between classical and quantum mechanics.

Or MERW - just a "classical" random walk chosen according to Jaynes' principle of maximal entropy ( https://en.wikipedia.org/wiki/Principle_of_maximum_entropy ) - mathematically leading to the same stationary probability distribution as the ground state of quantum models like Hubbard or Schrodinger.

It raises the question of understanding the classical-quantum boundary.
The single lowest-energy state -> Boltzmann ensemble step is analogous to the single optimal-action trajectory -> Feynman ensemble step.

The naive response is that phase is crucial for QM - true e.g. for interference, but many "quantum" phenomena are already there in the Boltzmann ensemble, like Anderson localization ( https://en.wikipedia.org/wiki/Anderson_localization ), already obtained in MERW ( https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.102.160602 ).
No, the imaginary-time (Matsubara) formalism takes full quantum statistics into account. The ground state is of course a special case, namely ##T \rightarrow 0## (##\beta \rightarrow \infty##).
 
  • #17
Regarding "imaginary time", I haven't seen any reasonable interpretation (?) - in spacetime we have 4 real coordinates.
So maybe instead of interpreting it as imaginary time, can't we just interpret it as a Boltzmann distribution - statistical physics? (just the mathematically universal principle of maximal entropy).

Regarding Feynman ensembles, indeed the phase makes it more complicated; classical mechanics is obtained in approximations like the saddle-point method or the van Vleck determinant.

Regarding excited states in the Euclidean/imaginary-time, statistical-physics interpretation: I have mentioned that they are indeed there - but as metastable, no longer stable.
The Feynman ensemble leads to hyperbolic PDEs with eigenvalues on the unit circle, while Boltzmann leads to parabolic PDEs with eigenvalues on the real axis.
As in the MERW evolution diagram above, starting close to an excited state, the density will first localize there, but finally it will "deexcite" to the ground state.
It is different from e.g. Schrodinger, where excited states are stable - an excited atom requires an external perturbation to deexcite.
Should an atom remain excited indefinitely without external perturbations?
 
  • #18
As I said, "imaginary time" is just a mathematical trick to evaluate Feynman diagrams (both in vacuum and equilibrium QFT). For the fully general off-equilibrium case there's the Schwinger-Keldysh time-contour formalism, and here you are best off with the completely real time contour (but that's a matter of opinion, because there's also "thermo-field dynamics", where you first go along the real axis, then half-way down to ##t_{\text{end}}-\mathrm{i} \beta/2##, then back parallel to the real axis to ##t_{\text{ini}}-\mathrm{i} \beta/2##, and then again parallel to the imaginary axis to the final point ##t_{\text{ini}}-\mathrm{i} \beta## :-).
 
  • #19
Sure, "imaginary time" is rather just a mathematical trick.

But the Boltzmann distribution isn't - it is the effect of the principle of maximal entropy, e.g. while fixing the mean energy. https://en.wikipedia.org/wiki/Principle_of_maximum_entropy :
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).
The principle of maximal entropy is just universal combinatorics - e.g. from the Stirling approximation:
##\binom{n}{pn} \sim 2^{n h(p)}##
for the Shannon entropy ##h(p) = -p \lg(p) - (1-p) \lg(1-p)##, with its single maximum ##h(1/2)=1##.
Hence, for length-##n## sequences of 0s and 1s, focusing on sequences with ##pn## ones, the safest choice is the entropy-maximizing ##p=1/2##, as that subset asymptotically dominates any other assumption about ##p##.
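The claimed dominance is easy to check numerically - a small pure-Python sketch (toy values of n and p, nothing from the thread):

```python
import math

# Check binom(n, pn) ~ 2^{n h(p)}: the exponential growth rate of the
# number of length-n 0/1 sequences with a fraction p of ones equals the
# Shannon entropy h(p), maximized at p = 1/2.
def h(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n = 10_000
for p in (0.1, 0.3, 0.5):
    rate = math.log2(math.comb(n, int(p * n))) / n   # empirical exponent
    print(round(rate, 3), round(h(p), 3))            # pairs nearly agree
```

The gap between the two columns shrinks like log(n)/n, which is the Stirling correction.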

While the "imaginary time" interpretation is just a mathematical trick, the mathematically equivalent "Boltzmann distribution" interpretation is literally the safest choice from a universal mathematical perspective ...
So which interpretation is "better"?
 
  • #20
Indeed, the canonical or grand-canonical statistical operator derives from the maximum-entropy principle and can be argued for most convincingly by introducing entropy as a measure of the lack of information, which must be maximized given the known constraints in order to find the statistical description with minimal prejudice. On the other hand, it also follows from kinetic theory, aka the Boltzmann(-Uehling-Uhlenbeck) equation.
 
  • #21
Such a Boltzmann perspective is more natural than "rotation to imaginary time" - it can be well understood.
The difficulty of the Feynman ensemble is the added quantum phase, e.g. leading to interference.

For example, there are hydrodynamical analogs of the Casimir effect - e.g. two plates in a liquid get attracted if the tank is shaken.
The quantum Casimir effect, instead of such external shaking, needs some intrinsic energy source driving the phase evolution ... like the de Broglie clock/zitterbewegung of the particle, confirmed experimentally for the electron ( https://link.springer.com/article/10.1007/s10701-008-9225-1 ).
 
  • #22
Jarek 31 said:
While the "imaginary time" interpretation is just a mathematical trick, the mathematically equivalent "Boltzmann distribution" interpretation is literally the safest choice from a universal mathematical perspective ...
So which interpretation is "better"?
Euclidean QFT is the analytic continuation of ordinary (Minkowski or Galilei) QFT to imaginary time. It changes the dynamics completely and is mathematically equivalent (Osterwalder-Schrader theorem) only by changing the meaning of all terms. A physical interpretation is possible only in the Minkowski version. In the Euclidean version all information obtained must be reinterpreted before it makes physical sense - e.g., Euclidean decay rates become Minkowski frequencies. Thus the only safe interpretation is the real-time = Minkowski version (the Feynman pseudo-stochastic view), while the imaginary-time = Euclidean version (the stochastic ensemble view) is only a computational proxy for some field theories (in practice only lattice QCD).

Except in equilibrium, where time does not matter and 4D QFT reduces to 3D statistical mechanics with its standard stochastic interpretation.
 
  • #23
But from the other side, the Boltzmann distribution is the mathematically universal application of the maximal-entropy principle - there are very strong mathematical motivations to use it.
Why can't we focus for a moment on this motivation - see the situation from just this perspective?
E.g. as an approximation before adding the quantum phase of the Feynman ensemble?

Also, a practical difference between them is whether excited states are stable (Feynman) or metastable (Boltzmann ensemble) - e.g. is an excited atom a stable or a metastable state?
In other words: should an excited atom in a completely empty and calm universe deexcite?

Ps. For classical objects with wave-particle duality, orbit quantization is also observed (e.g. https://www.nature.com/articles/ncomms4219 , https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.113.104101 ), as are quantum-like statistics (e.g. https://journals.aps.org/pre/abstract/10.1103/PhysRevE.88.011001 ).
 
  • #24
A. Neumaier said:
Except in equilibrium, where time does not matter and 4D QFT reduces to 3D statistical mechanics with its standard stochastic interpretation.
With the important qualification that in this case the fields must obey periodic (for bosons) or antiperiodic (for fermions) boundary conditions in imaginary time, with period ##\beta=1/T##.
 
  • #25
The equilibrium in the intersection of these two worlds (quantum and statistical physics) is the ground state, and usually (e.g. for Schrodinger) it is non-degenerate and its wavefunction can be chosen real and nonnegative (a generalization of the Frobenius-Perron theorem) - the quantum phase disappears (e.g. Faris' articles like https://projecteuclid.org/journals/duke-mathematical-journal/volume-42/issue-3/Degenerate-and-non-degenerate-ground-states-for-Schrödinger-operators/10.1215/S0012-7094-75-04251-9.short ).
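A quick numeric illustration of that Frobenius-Perron statement, for a toy discretized 1D Schrodinger operator (randomly chosen potential, purely illustrative):

```python
import random

# Frobenius-Perron check: for a discrete 1D Schrodinger operator
# H = -Laplacian + V (tridiagonal), the ground state can be chosen with
# a single sign. Shift to M = c*I - H, which has nonnegative entries;
# its Perron eigenvector is exactly H's ground state.
random.seed(1)
N = 30
V = [random.uniform(0.0, 2.0) for _ in range(N)]   # arbitrary potential
c = 10.0                       # any c larger than the diagonal of H

def apply_M(v):                # v -> (c*I - H) v, all entries of M >= 0
    out = []
    for i in range(N):
        x = (c - 2.0 - V[i]) * v[i]
        if i > 0: x += v[i - 1]
        if i < N - 1: x += v[i + 1]
        out.append(x)
    return out

psi = [random.uniform(-1.0, 1.0) for _ in range(N)]  # random sign pattern
for _ in range(5000):          # power iteration toward the Perron vector
    psi = apply_M(psi)
    norm = sum(x * x for x in psi) ** 0.5
    psi = [x / norm for x in psi]

print(all(x > 0 for x in psi) or all(x < 0 for x in psi))   # True: one sign
```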
 
  • #26
But the equilibrium case is very special. The max entropy principle only holds for completely isolated systems in equilibrium. In some approximation it holds for systems at fixed time in local equilibrium, but this requires singling out a time, hence is not a good basis for relativistic quantum fields.
 
  • #27
The Schrodinger equation also usually represents an isolated system: a nucleus and one or a few electrons.
And from experiment we know that such an excited atom has a tendency to deexcite to the ground-state equilibrium - exactly as the Boltzmann path ensemble would predict.

In contrast, the Feynman path ensemble predicts that such an atom should stay excited forever ...
 
  • #28
Only if you neglect spontaneous emission due to the coupling of the charges to the quantized em. field.
 
  • #29
Sure, there is no doubt that the Feynman path ensemble can be extended to agree with experiments ... but Boltzmann just agrees in this matter "out of the box".
 
  • #30
Jarek 31 said:
The Schrodinger equation also usually represents an isolated system: a nucleus and one or a few electrons.
And from experiment we know that such an excited atom has a tendency to deexcite to the ground-state equilibrium - exactly as the Boltzmann path ensemble would predict.
No - an excited state of an isolated system is completely stable. It decays to the ground state only when exposed to an external field. The quantitatively exact decay is not described by statistical mechanics but by perturbation theory (interaction with a continuous spectrum). See, e.g., the Wikipedia article on spontaneous emission; a simpler model is the Weisskopf-Wigner atom.
 
  • #31
The isolated system, however, is given by the nucleus, the electrons and the (quantized) em field. That's why, even without an additional electromagnetic field, the atom gets deexcited by spontaneous emission of photons (as you write yourself).
 
  • #32
Sure, from the Feynman-ensemble perspective.
But simultaneously there is also the thermodynamical/statistical-physics perspective, e.g. the tendency to increase entropy, for example by spreading energy as in
"excited atom -> deexcited atom + photon"
... against QM unitarity maintaining the von Neumann entropy.

I think the main problem with understanding quantum mechanics is trying to see everything from a single perspective, while physics is not that simple - there are multiple perspectives complementing each other.
We have wave-particle duality: while the Feynman ensemble focuses on the wave part, the Boltzmann ensemble is more focused on the particle part of the duality.
 
  • #33
There is no wave-particle duality in modern quantum theory. The example with the atom is QFT at 0 temperature, i.e., there's no Boltzmann ensemble.
 
  • #34
So why do we observe entropy growth in unitary quantum evolution?
 
  • #35
Entropy stays constant under unitary time evolution.
 
  • #36
Exactly - while in the physics around us it clearly grows.
You cannot escape statistical mechanics - the mathematically universal principle of maximal entropy - mathematically it is there in the widely and successfully used Boltzmann ensembles, even if you call it "Wick-rotated Feynman".
 
  • #37
vanhees71 said:
The isolated system, however, is given by the nucleus, the electrons and the (quantized) em field. That's why, even without an additional electromagnetic field, the atom gets deexcited by spontaneous emission of photons (as you write yourself).
Not really. Your isolated system has no excited states. The excited states of @Jarek31 are solutions of the time-independent Schrödinger equation without an external field! These do not decay but form the discrete part of the spectrum of the Hamiltonian. In order to get the excited atom to decay, the energy lost must be carried by an emitted photon. To make this possible, one has to add an interaction with the electromagnetic field!
 
  • #38
vanhees71 said:
Entropy stays constant under unitary time evolution.
Jarek 31 said:
Exactly - while in the physics around us it clearly grows.
We observe this growth only in thermally isolated but mechanically nonisolated subsystems. These do not follow a unitary evolution but a dissipative one. In place of the Schrödinger equation one has a Lindblad equation, obtained in good approximation by contracting a larger isolated system (following unitary dynamics with constant entropy) to the system actually observed.

In most of observable physics, the system is not even approximately thermally isolated, and entropy has no reason to increase. It decreases in many chemical reactions observable at everyday temperatures; you only need to bother to do the calculations! Instead, the principle governing most of macroscopic physics is the decrease of free energy (usually Gibbs free energy, Helmholtz free energy, or enthalpy, depending on the boundary conditions).
 
  • #39
Jarek 31 said:
Boltzmann ensembles, even if you call it "Wick rotated Feynman".
Boltzmann ensembles have nothing to do with "Wick rotated Feynman". A Wick rotation changes a physical system into a completely different system that bears hardly any relationship to the original one.

Boltzmann ensembles make sense only for weakly interacting collections of atoms. Already a crystal is very far from a Boltzmann ensemble.
 
  • #40
A. Neumaier said:
Boltzmann ensembles have nothing to do with "Wick rotated Feynman"
Except that the formulas are the same, and we cannot escape the mathematically universal principle of maximal entropy in the Boltzmann ensemble - it is hidden in any model with incomplete knowledge (i.e. anything short of The Wavefunction of The Universe).
vanhees71 said:
There is no wave-particle duality in modern quantum theory.
Maybe let us take a look at "Imaging the atomic orbitals of carbon atomic chains with field-emission electron microscopy": https://journals.aps.org/prb/abstract/10.1103/PhysRevB.80.165404
They literally made photos of orbitals - stripping electrons from a single carbon atom, shaping the EM field to act as a lens, and determining with a matrix of detectors where in the orbital the electrons came from - getting electron densities (the s, p orbitals are nicely seen).
Doesn't it use both the wave and the corpuscular nature of the electron?

 
  • #41
A. Neumaier said:
Not really. Your isolated system has no excited states. The excited states of @Jarek31 are solutions of the time-independent Schrödinger equation without an external field! These do not decay but form the discrete part of the spectrum of the Hamiltonian. In order to get the excited atom to decay, the energy lost must be carried by an emitted photon. To make this possible, one has to add an interaction with the electromagnetic field!
If you want to describe spontaneous emission you have to include the quantized em field. Here you have the atom (the nucleus, the electron, and the static Coulomb potential) as an open subsystem. That's why it can spontaneously emit one or more photons, deexciting an atom initially in an excited eigenstate of its energy. So indeed, you have to add the interaction with the em field. Where is the contradiction?
 
  • #42
vanhees71 said:
If you want to describe spontaneous emission you have to include the quantized em field. Here you have the atom (the nucleus, the electron, and the static Coulomb potential) as an open subsystem. That's why it can spontaneously emit one or more photons, deexciting an atom initially in an excited eigenstate of its energy. So indeed, you have to add the interaction with the em field. Where is the contradiction?
In the fact that to define what an excited state is, you need to consider a different isolated system - one without the em field.
 
  • #43
If only there were a simple, straightforward way, like just using the maximal-entropy principle - the Boltzmann ensemble ...

Here they observe nice (n,l)-like quantization (distance and angular momentum) ... for classical objects with wave-particle duality: https://www.nature.com/articles/ncomms4219

 
  • #44
Jarek 31 said:
Except that formulas are the same
Same formula does not imply same physics. The formula ##x(t)=e^{-\alpha t}## appears in very different domains of physics where it has very different meanings - except that something decays.
Jarek 31 said:
Doesn't it use both wave and corpuscular nature of electron?
The images of orbitals essentially depict a charge density - see my Theoretical Physics FAQ.
Neither the wave nor the corpuscular nature plays a role, though the Schrödinger equation is used to define the states in which the orbitals have the textbook form. But this is just harmonic analysis on the sphere...
Jarek 31 said:
hidden in models with incomplete knowledge
These are not described by the Schrödinger equation. I suggest that you read the book
  • Calzetta and Hu, Nonequilibrium Quantum Field Theory
to see how quantum field theory covers incomplete knowledge. It is far from Boltzmann's qualitative picture, and agrees quantitatively with various experimentally accessible limits.
 
  • #45
The Boltzmann distribution also appears in many places - the universal maximal-entropy principle says it is the safest assumption in incomplete-knowledge situations (with e.g. the mean energy fixed):
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy
Unless working with The Wavefunction of The Universe, quantum models work in incomplete-knowledge situations - they have a hidden averaging over the unknowns.
While it can be expressed in many languages, there is always the combinatorial domination of the parameters maximizing entropy - you cannot escape that.

The photos of orbitals indeed nicely show e.g. the s, p orbitals (a consequence of the wave nature resonating to a standing wave, described by the stationary Schrodinger equation), but they are obtained by averaging over the positions of single electrons there (the corpuscular nature).
The wave-particle duality is at the heart of physics and quantum mechanics - we can focus on one of these natures in some perspective/approximation, but we shouldn't forget about the complete picture.
 
  • #46
Jarek 31 said:
for fixed e.g. mean energy
Fixed mean energy is very uncommon in experimental situations. Usually the temperature or the pressure is fixed, and entropy is not maximized.
 
  • #47
Mathematically, the Boltzmann ensemble is just the principle of maximal entropy for weighted possibilities - with the mean weight constrained.
These weights are usually energies, but e.g. for paths the weight should be the energy integrated over time.
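A minimal numeric sketch of this claim - Boltzmann weights arising from the max-entropy principle under a mean-weight constraint (toy energy levels, all values hypothetical):

```python
import math
import random

# Max-entropy with a fixed mean energy gives the Boltzmann distribution
# p_i = exp(-beta*E_i)/Z. Solve for beta by bisection, then verify that
# any perturbation preserving normalization and mean energy only lowers
# the entropy.
E = [0.0, 1.0, 2.0, 3.5]
U = 1.2                                   # target mean energy

def boltzmann(beta):
    w = [math.exp(-beta * e) for e in E]
    Z = sum(w)
    return [x / Z for x in w]

def mean_energy(p):
    return sum(pi * e for pi, e in zip(p, E))

def entropy(q):
    return -sum(x * math.log(x) for x in q if x > 0)

lo, hi = -50.0, 50.0                      # bisection on beta
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_energy(boltzmann(mid)) > U: lo = mid
    else: hi = mid
p = boltzmann((lo + hi) / 2)

# Perturb within the constraint surface (sum d = 0, sum d*E = 0) and
# check that the Boltzmann point is an entropy maximum there.
random.seed(0)
n, sE, sEE = len(E), sum(E), sum(e * e for e in E)
ok = True
for _ in range(100):
    r = [random.gauss(0, 1) for _ in E]
    sr, srE = sum(r), sum(ri * e for ri, e in zip(r, E))
    det = n * sEE - sE * sE               # solve d = r - a*1 - b*E
    a = (sr * sEE - srE * sE) / det
    b = (n * srE - sE * sr) / det
    d = [ri - a - b * e for ri, e in zip(r, E)]
    q = [pi + 1e-3 * di for pi, di in zip(p, d)]
    ok = ok and entropy(q) <= entropy(p)
print(ok, round(mean_energy(p), 6))   # True 1.2
```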
 
  • #48
Jarek 31 said:
Mathematically, the Boltzmann ensemble is just the principle of maximal entropy for weighted possibilities - with the mean weight constrained.
These weights are usually energies, but e.g. for paths the weight should be the energy integrated over time.
But what you get out depends on which expectations you assume to be given. Hence the principle is empty unless you make very strong assumptions, valid only under very restrictive conditions. Rather than toying around with your limited intuition, look first at how modern nonequilibrium thermodynamics is done! Then see whether you can add something substantial by creative modification. Starting at a point much older than 100 years is unlikely to lead you to something interesting and new...
 
  • #49
A. Neumaier said:
Starting at a point much older than 100 years is unlikely to lead you to something interesting and new...
Are you saying that they made combinatorics obsolete? (e.g. the combinatorics leading to the maximal-entropy principle, which says what the safest assumptions are in incomplete-knowledge situations).

If not, and if in incomplete-knowledge situations they get the same formulas, then maybe it is just a new exciting dressing for long-known universal mathematics.
 
  • #50
Jarek 31 said:
Can this connection be taken to field theories - can we see QFT as Feynman ensemble of fields?
With concrete field configurations corresponding to each Feynman diagram?
I am afraid that the answer is no.
In the quantum case, unlike in the classical case, the path integral is over virtual configurations, not over concrete, observable ones.
In the classical case we can observe the Ising values on a set of spins on the x-axis after the Wick rotation, but for the quantum spin of a single particle, the path of the spin from t=0 to t=1 cannot be observed. If you try to look at it, you get path information and you change the system.
 
