The thermal interpretation of quantum physics

  • #121
DarMM said:
I wasn't sure if it also required an argument that the slow mode manifold was in fact disconnected, and I couldn't think of one. Metastability of states on the full manifold decaying into those on the slow manifold is enough, provided the slow mode manifold is disconnected. Is there a reason to expect this in general?
As soon as there are two local minima in the compactified universe (i.e., including minima at infinity), the answer is yes. For geometric reasons, each local minimizer has its own catchment region, and these are disjoint. This accounts for the case where the slow modes are fixed points. But similar things hold more generally. It is the generic situation, while the situation of a connected slow manifold is quite special (though of course quite possible).
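To make the geometric picture concrete, here is a minimal numerical sketch (my own illustration, not from the papers, with an assumed toy potential): for the double well ##V(x) = (x^2-1)^2##, gradient flow from any starting point converges to one of the two local minimizers, and the two catchment regions are the disjoint half-lines ##x<0## and ##x>0##.

```python
# Toy illustration (assumed potential, not from the papers): the catchment
# regions of the two minima of V(x) = (x^2 - 1)^2 under gradient flow.
import numpy as np

def gradient_flow(x0, step=1e-2, n_steps=10_000):
    """Follow x' = -V'(x) with V(x) = (x^2 - 1)^2, i.e. V'(x) = 4x(x^2 - 1)."""
    x = x0
    for _ in range(n_steps):
        x -= step * 4 * x * (x**2 - 1)
    return x

for x0 in np.linspace(-2, 2, 9):
    print(f"start {x0:+.2f} -> minimum {gradient_flow(x0):+.3f}")
# Every start with x0 < 0 ends at -1 and every start with x0 > 0 ends at +1;
# the two catchment regions are disjoint, separated by the unstable point 0.
```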

DarMM said:
This somewhat confuses me. I would have thought the notion of reducibility just means it can be completely decomposed, i.e. the total system is simply a composition of the two subsystems and nothing more. What subtlety am I missing?
To 'reduce' is a vague notion that can mean many things. For example, reductionism in science means the possibility of reducing all phenomena to physics.

In the three papers I used ''reduced description'' for any description of a single system obtained by coarse-graining, whereas decomposing a system into multiple subsystems and a description in these terms is a far more specific concept.
 
  • Like
Likes DarMM
  • #122
A. Neumaier said:
What you say here is correctly understood, but the final argument is not yet conclusive. The reason why the measurement result is often discrete - bi- or multistability - is not visible.
For those reading, what is lacking here is that I simply said:
DarMM said:
In essence the environment drives the "slow large scale modes" of the measuring device into one of a discrete set of states
The "drives" here is vague and doesn't explain the mechanism.

What happens is that if the slow manifold is disconnected, then states on the manifold of all modes of the device are metastable and, under disturbance from the environment, decay into a state on one of the components of the slow manifold.

In other words, the macroscopically observable features of the device need only a small amount of environmental noise to fall into one of a discrete set of minima, corresponding to the discrete outcome readings, since other, more general states of the device are only metastable. A toy simulation of this noise-induced settling is sketched below.
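A hedged toy simulation of the mechanism just described (my own sketch; the double-well potential stands in for the slow large-scale modes): overdamped Langevin dynamics ##dx = -V'(x)\,dt + \sqrt{2\varepsilon}\,dW## in ##V(x) = (x^2-1)^2##. A state prepared at the unstable point ##x=0## is metastable; a small amount of noise makes every run settle into one of the two discrete minima.

```python
# Toy sketch (assumed potential): noise-driven decay of a metastable state
# into one of a discrete set of minima, via Euler-Maruyama integration of
# the overdamped Langevin equation dx = -V'(x) dt + sqrt(2 eps) dW.
import numpy as np

rng = np.random.default_rng(0)
eps, dt, n_steps = 0.01, 1e-3, 50_000

def settle(x0):
    x = x0
    for _ in range(n_steps):
        x += -4 * x * (x**2 - 1) * dt + np.sqrt(2 * eps * dt) * rng.normal()
    return x

outcomes = [settle(0.0) for _ in range(20)]
print(np.round(outcomes, 2))  # every run ends near -1 or +1, never in between
```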
 
  • #123
Thanks @A. Neumaier , I think I've an okay (I hope!) grasp of this view now.
 
  • #124
DarMM said:
I think I've an okay (I hope!) grasp of this view now.
Why don't you state a revised 4-point summary of your view of my view? Then I'll give you (again) my view of your view of my view!

Note that I edited my post #121 to account for the case where the slow manifold isn't just centered around a discrete number of fixed points.
 
  • Like
Likes DarMM
  • #125
vanhees71 said:
You even use specific probabilistic/statistical notions like "uncertainty" and define it in the usual statistical terms as the standard deviation/2nd cumulant.
I spent several pages on uncertainty (Subsection 2.3 of Part II) to show that uncertainty is much more fundamental than statistical uncertainty. For example, consider the uncertainty of the diameter of the city of Vienna. It is not associated with any statistics but with the uncertainty of the concept itself.

The thermal interpretation treats all uncertainty as being of this kind, and says that the quantum formalism always predicts this conceptual, nonstatistical uncertainty. In addition, in the special case where we have a large sample of similarly prepared systems, it also predicts the statistical uncertainty.
 
Last edited:
  • #126
A. Neumaier said:
Whereas according to the thermal interpretation, if we have complete knowledge about the system we know all its idealized measurement results, i.e., all q-expectation values, which is equivalent to knowing its density operator.

If you want to assess the thermal interpretation you need to discuss it in terms of its own interpretation and not in terms of the statistical interpretation!
Ok, this makes sense again. So for you determinism doesn't refer to the observables but to the statistical operators (or states). That's of course true in the minimal statistical interpretation too. So the thermal interpretation is again equivalent to the standard interpretation; you only relabel the language associated with the math, not talking about probability and statistics but only about q-expectation values. I can live with that easily :-).
 
  • #127
A. Neumaier said:
our deterministic universe

So do you think alpha decay or spontaneous emission is also deterministic?
 
  • #128
ftr said:
So do you think alpha decay or spontaneous emission is also deterministic ?
On the fundamental level, yes, since each observed alpha decay is something described by some of the observables in the universe. But, like casting a die, it is practically indeterministic.
 
  • #129
vanhees71 said:
So for you determinism doesn't refer to the observables but to the statistical operators (or states).
No; it is not an alternative but both! It refers to the (partially observable) beables, which are the q-expectations. This determinism is equivalent to the determinism of the density operator.
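For the simplest case of a qubit, this equivalence can be made explicit (a hedged sketch of mine, not taken from the papers): the q-expectations of the three Pauli matrices already determine the density operator via ##\rho = \tfrac12(1 + \langle\sigma_x\rangle\sigma_x + \langle\sigma_y\rangle\sigma_y + \langle\sigma_z\rangle\sigma_z)##.

```python
# Qubit sketch: the q-expectations <sigma_i> = Tr(rho sigma_i) determine rho.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

def q_expectation(rho, A):
    """q-expectation <A> = Tr(rho A)."""
    return np.trace(rho @ A).real

rho = 0.5 * (I2 + 0.3 * sx + 0.4 * sy + 0.5 * sz)  # some mixed state

# Reconstruct the state from its q-expectations alone.
bloch = [q_expectation(rho, s) for s in (sx, sy, sz)]
rho_rec = 0.5 * (I2 + sum(b * s for b, s in zip(bloch, (sx, sy, sz))))
print(np.allclose(rho, rho_rec))  # True: the q-expectations fix the state
```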
vanhees71 said:
That's of course true in the minimal statistical interpretation too. So the thermal interpretation is again equivalent to the standard interpretation,
No, because the meaning assigned to ''observable'' and ''state'' is completely different.

For you, only eigenvalues are observed; for the thermal interpretation, eigenvalues are almost never observed. As in classical physics!

For you, the state of the universe makes no sense at all; for the thermal interpretation, the state of the universe is all there is (on the conceptual level), and every other system considered by physicists is a subsystem of it, with a state completely determined by the state of the universe. As in classical physics!

For you, quantum probability is something irreducible and unavoidable in the foundations; for the thermal interpretation, probability is not part of the foundations but an emergent phenomenon. As in classical physics!

How can you think that both interpretations are equivalent?

Only the things they try to connect - the formal theory and the experimental record - are the same, but how they mediate between them is completely different (see post #99).
vanhees71 said:
you say on the one hand the thermal interpretation uses the same mathematical formalism, but it's all differently interpreted.
vanhees71 said:
you only relabel the language associated with the math, not talking about probability and statistics but only about q-expectation values. I can live with that easily :-).
The language associated with the math - that's the interpretation!

One can associate with it Copenhagen language or minimal statistical language - which is what tradition did, resulting in nearly a century of perceived weirdness of quantum mechanics by almost everyone - especially
  • by all newcomers without exception and
  • by some of the greatest physicists (see the quotes at the beginning of Section 5 of Part III).
Or one can associate thermal, nonstatistical language with it, restoring continuity and common sense.

Everyone is free to pick their preferred interpretation. It is time to change preferences!
 
Last edited:
  • Like
Likes dextercioby and Mentz114
  • #130
vanhees71 said:
Einstein [...] insisted on the separability of the objectively observable physical world. [...] It's of course impossible to guess what Einstein would have argued about the fact that the modern Bell measurements show that you either have to give up locality or determinism.
Well, I am not Einstein, and therefore have more freedom.

Also, the nonlocality of Bell has nothing to do with locality in the sense of relativity theory. I explained this in Subsection 2.4 of Part II, where I showed that Bell nonlocality is fully compatible with special relativity when the extendedness of physical objects is properly taken into account.

The thermal interpretation has both determinism and properly understood locality, namely as independent preparability in causally disjoint regions of spacetime, consistent with the causal commutation rules of relativistic quantum field theory. It also has Bell nonlocality in the form of extended causality; see Subsection 2.4 of Part II and the discussion in this PF thread.

Maybe Einstein would have been satisfied.
 
  • #131
A. Neumaier said:
Maybe Einstein would have been satisfied.
Maybe not, he would have found a hole in your theory right away :smile:. Seriously, what about tunneling?
 
  • #132
ftr said:
Maybe not, he would have found a hole in your theory right away :smile:.
Maybe. There are many others who might want to try and find such a hole in my interpretation! Not my theory - the theory is standard quantum physics!
ftr said:
what about tunneling?
This is just a particular way a state changes with time.

Consider a bistable symmetric quartic potential in 1D with a tiny bit of dissipation inherited from the environment. Initially, the density is concentrated inside the left well, say; at large enough time it is concentrated essentially equally in both wells. The position has little uncertainty initially and a lot of uncertainty once the tunneling process is completed.
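A hedged numerical sketch of this (my own, with an assumed potential ##V(x) = 5(x^2-1)^2##, units ##\hbar = m = 1##, and without the dissipative environment): building a left-localized state from the lowest two eigenstates and evolving it exactly shows the density spreading from the left well into both wells, with the position uncertainty growing accordingly.

```python
# Toy sketch (assumed units and potential): tunneling in a symmetric
# bistable quartic potential, via exact evolution in the lowest doublet.
import numpy as np

n, L = 600, 6.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
V = 5 * (x**2 - 1) ** 2
# Kinetic energy -1/2 d^2/dx^2 by second-order finite differences.
T = (np.diag(np.full(n, 1.0)) - 0.5 * np.diag(np.ones(n - 1), 1)
     - 0.5 * np.diag(np.ones(n - 1), -1)) / dx**2
E, phi = np.linalg.eigh(T + np.diag(V))
phi0, phi1 = phi[:, 0] / np.sqrt(dx), phi[:, 1] / np.sqrt(dx)

def stats(psi):
    p = np.abs(psi) ** 2 * dx
    mean = np.sum(x * p)
    return mean, np.sqrt(np.sum((x - mean) ** 2 * p))

psi = (phi0 + phi1) / np.sqrt(2)         # localized in one well
if stats(psi)[0] > 0:                    # fix eigh's arbitrary sign choice
    phi1 = -phi1
    psi = (phi0 + phi1) / np.sqrt(2)

t = np.pi / (2 * (E[1] - E[0]))          # a quarter of the tunneling period
psi_t = (phi0 * np.exp(-1j * E[0] * t)
         + phi1 * np.exp(-1j * E[1] * t)) / np.sqrt(2)

print("t = 0:   <x> = %+.3f, sigma = %.3f" % stats(psi))
print("t = T/4: <x> = %+.3f, sigma = %.3f" % stats(psi_t))
# Initially <x> is near -1 with small uncertainty; a quarter period later the
# density sits essentially equally in both wells and sigma is of order 1.
```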
 
Last edited:
  • #133
Let's see if I got the terminology straight.

In standard QM, observables are self-adjoint operators. The thermal interpretation refers to these as q-observables instead (paper I, p.3). Historically, before their mathematical nature was completely understood, Dirac referred to them as q-numbers.

The standard QM usage of the term "observable" is a bit strange because self-adjoint operators are not observable in the everyday sense of the word. The thermal interpretation tries to move closer to the everyday usage of the word and defines observables as "numbers obtainable from observations" (paper I, p.3). This is similar to what Dirac called c-numbers (although I think he included complex numbers in the concept and the thermal interpretation probably doesn't).

In standard QM, expectation values and probabilities are inherently probabilistic properties of self-adjoint operators. Since they are "numbers obtainable from observations", they are observables in the thermal interpretation and calling them q-expectations and q-probabilities is done in reference to the usage in standard QM but doesn't reflect anything probabilistic in their mathematical definition. I'm not sure whether eigenvalues should also be called observables. Is the definition of observable tied to whether an experiment can actually be performed in a sufficiently idealized form?

Is this correct so far? I think it is a bit unfortunate to use the term "observable" in the thermal interpretation at all because the term is so deeply ingrained in standard QM which makes it prone to misunderstandings.
 
  • #134
kith said:
Let's see if I got the terminology straight.

In standard QM, observables are self-adjoint operators. The thermal interpretation refers to these as q-observables instead (paper I, p.3). Historically, before their mathematical nature was completely understood, Dirac referred to them as q-numbers.

The standard QM usage of the term "observable" is a bit strange because self-adjoint operators are not observable in the everyday sense of the word. The thermal interpretation tries to move closer to the everyday usage of the word and defines observables as "numbers obtainable from observations" (paper I, p.3). This is similar to what Dirac called c-numbers (although I think he included complex numbers in the concept and the thermal interpretation probably doesn't).

In standard QM, expectation values and probabilities are inherently probabilistic properties of self-adjoint operators. Since they are "numbers obtainable from observations", they are observables in the thermal interpretation and calling them q-expectations and q-probabilities is done in reference to the usage in standard QM but doesn't reflect anything probabilistic in their mathematical definition. I'm not sure whether eigenvalues should also be called observables. Is the definition of observable tied to whether an experiment can actually be performed in a sufficiently idealized form?

Is this correct so far?
Yes. Eigenvalues of a q-observable are state-independent, hence are not even beables.
In the thermal interpretation, q-expectations are defined for many non-Hermitian operators (for example annihilation operators!) and then may be complex-valued.
Note that an observation in the thermal interpretation can be anything that can be reproducibly computed from experimental raw data - reproducible under repetition of the experiment, not of the computations. This can be real or complex numbers, vectors, matrices, statements, etc.
kith said:
I think it is a bit unfortunate to use the term "observable" in the thermal interpretation at all because the term is so deeply ingrained in standard QM which makes it prone to misunderstandings.
Actually, in the papers I avoid the notion of an observable because of the possible confusion. I use beable for what exists (all functions of q-expectations) and say that some beables are observable (not observables!). But I sometimes call the traditional self-adjoint operators q-observables (the prefix q- labels all traditional notions that in the thermal interpretation would result in a misleading connotation) and sometimes informally call the observable beables ''observables'' (which matches the classical notion of an observable). However, if you see this done in the papers, please inform me (not here but preferably by email) so that I can eliminate it in the next version.
 
Last edited:
  • #135
vanhees71 said:
whether one can use these concepts to teach QM 1 from scratch, i.e., can you start by some heuristic intuitive physical arguments to generalize the Lie-algebra approach of classical mechanics in terms of the usual Poisson brackets of classical mechanics? Maybe that would be an alternative approach to QM which avoids all the quibbles with starting with pure states and then only finally arrive at the general case of statistical operators as description of quantum states?
I would introduce quantum mechanics with the qubit, which is just 19th century optics. This produces the density operator, the Hilbert space, the special case of pure states, Born's rule (aka Malus' law), the Schrödinger equation, and the thermal interpretation - all in a very natural way.

To deepen the understanding, one can discuss classical mechanics in terms of the Lie algebra of phase space functions given by the negative Poisson bracket, and then restrict to a rigid rotor, described by an so(3) given by the generators of angular momentum. This example is the one given in the last two paragraphs of post #63, and also provides the Lie algebra for the qubit.

Next one shows that this Lie algebra is given by a scaled commutator. This generalizes and defines the Lie algebras that describe quantum mechanics. Working out the dynamics in terms of the q-expectations leads to the Ehrenfest equations. Then one can introduce the Heisenberg, Schrödinger, and interaction picture and their dynamics.

Then one has everything, without any difficult concepts beyond the Hilbert space and the trace, which appeared naturally. There is no need yet to mention eigenvalues and eigenvectors (these come when discussing stationary states), the subtle problems with self-adjointness (needed when discussing boundary conditions), and the spectral theorem (needed when defining the exponentials ##U(t)=e^{\pm itH}##). The latter two issues are completely absent as long as one works within finite-dimensional Hilbert spaces; so perhaps doing initially some quantum information theory makes sense.
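A hedged sketch of the dynamics step (my own illustration; the Hamiltonian and initial state are arbitrary choices): for a qubit with ##H = \tfrac{\omega}{2}\sigma_z##, one can integrate the von Neumann equation ##\dot\rho = -i[H,\rho]## and check the Ehrenfest relation ##\tfrac{d}{dt}\langle A\rangle = \langle i[H,A]\rangle##, which here gives ##\tfrac{d}{dt}\langle\sigma_x\rangle = -\omega\langle\sigma_y\rangle##.

```python
# Qubit sketch (assumed H and initial state): von Neumann dynamics of the
# density operator and the Ehrenfest equation for the q-expectations.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
omega = 1.0
H = 0.5 * omega * sz

def comm(A, B):
    return A @ B - B @ A

rho = 0.5 * (np.eye(2) + 0.8 * sx)      # mixed state, Bloch vector (0.8, 0, 0)
dt, n_steps = 1e-4, 31_416              # evolve for about pi, half a turn
for _ in range(n_steps):
    rho = rho + dt * (-1j) * comm(H, rho)   # Euler step of drho/dt = -i[H,rho]

expect = lambda A: np.trace(rho @ A).real
print(round(expect(sx), 2))             # ~ -0.8: the Bloch vector rotated by pi

# Ehrenfest check: d<sx>/dt = <i[H, sx]> = -omega <sy> at the current state.
dsx_dt = np.trace(rho @ (1j * comm(H, sx))).real
print(np.isclose(dsx_dt, -omega * expect(sy)))  # True
```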
vanhees71 said:
First one has to understand the most simple cases to understand the meaning of an interpretation.
The Stern-Gerlach experiment is a very good example for that. [...] how would the analogous calculation work with the thermal interpretation
The calculations are of course identical, since calculations are not part of the interpretation.

But the interpretation of the calculation is different: In the thermal interpretation, the Ag field is concentrated along the beam emanating from the source, with a directional mass current. The beam is split by the magnetic field into two beams, and the amount of silver on the screen at the end measures the integrated beam intensity, the total transported mass. This is in complete analogy to the qubit treated in the above link. Particles need not be invoked.
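As a small sketch of this reading (mine, with an assumed beam state): for a beam prepared with spin along +x, the integrated intensities of the two partial beams are the q-expectations of the projectors onto the ##\sigma_z## eigenspaces, and these, rather than individual particle events, are what the deposited silver measures.

```python
# Stern-Gerlach sketch (assumed beam state): the silver deposited in each
# spot measures the q-expectation of a projector, i.e. a beam intensity.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
P_up = 0.5 * (np.eye(2) + sz)           # projector onto sz = +1
P_dn = 0.5 * (np.eye(2) - sz)           # projector onto sz = -1

rho = 0.5 * (np.eye(2) + sx)            # beam polarized along +x

up = np.trace(rho @ P_up).real
dn = np.trace(rho @ P_dn).real
print(up, dn)   # 0.5 0.5: half of the total silver mass in each spot
```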
 
Last edited:
  • Like
Likes Mentz114
  • #136
vanhees71 said:
for me your thermal interpretation is not different from the standard interpretation as expressed by van Kampen in the following informal paper: https://doi.org/10.1016/0378-4371(88)90105-7
It is very different.

van Kampen uses the standard assumptions of the Copenhagen interpretation, with pure states associated to single systems (p.99, after theorem III) and with collapse (which he claims to deduce on p.106, but his argument is sketchy exactly here: he deduces the collapse of the measured system from the silently assumed collapse of system+detector). His alleged ''proof'' is discussed in detail in Bell's paper ''http://www.johnboccio.com/research/quantum/notes/bell.pdf'' on pp.14-17.
 
  • #137
A. Neumaier said:
It is very different.

van Kampen uses the standard assumptions of the Copenhagen interpretation, with pure states associated to single systems (p.99, after theorem III) and with collapse (which he claims to deduce on p.106, but his argument is sketchy exactly here: he deduces the collapse of the measured system from the silently assumed collapse of system+detector). His alleged ''proof'' is discussed in detail in Bell's paper ''http://www.johnboccio.com/research/quantum/notes/bell.pdf'' on pp.14-17.

Thanks for posting a link to that essay. I think Bell summarizes pretty well what I find unsatisfactory about most textbook descriptions of quantum mechanics.
 
  • #138
stevendaryl said:
Thanks for posting a link to that essay. I think Bell summarizes pretty well what I find unsatisfactory about most textbook descriptions of quantum mechanics.
Then you should like the thermal interpretation, which suffers from none of what Bell complains about! It just takes a little getting used to...
 
Last edited:
  • Like
Likes dextercioby and stevendaryl
  • #139
A. Neumaier said:
No; it is not an alternative but both! It refers to the (partially observable) beables, which are the q-expectations. This determinism is equivalent to the determinism of the density operator.

No, because the meaning assigned to ''observable'' and ''state'' is completely different.

For you, only eigenvalues are observed; for the thermal interpretation, eigenvalues are almost never observed. As in classical physics!

For you, the state of the universe makes no sense at all; for the thermal interpretation, the state of the universe is all there is (on the conceptual level), and every other system considered by physicists is a subsystem of it, with a state completely determined by the state of the universe. As in classical physics!

For you, quantum probability is something irreducible and unavoidable in the foundations; for the thermal interpretation, probability is not part of the foundations but an emergent phenomenon. As in classical physics!

How can you think that both interpretations are equivalent?
Well, we obviously have very different views on the fundamental meaning of QT, and that leads to mutual misunderstandings.

If really only the very coarse-grained, FAPP-deterministic macroscopic values were observable (or "beables", to use this confusing funny language), QT would never have been discovered. In fact we can observe more detailed things for small systems, and these details are even very important to make the observed fact of the atomistic structure of matter consistent with classical physics, particularly the everyday experience of the stability of matter.

You are of course right that neither the state nor the observable operators by themselves are deterministic in standard QT; only the expectation values (of which the probabilities or probability distributions are special cases) are "beables", i.e., they are the picture- and representation-independent observable predictions of the theory.

It is also of course true that for macroscopic systems the possible resolution of real-world measurement devices is far too coarse to measure the "eigenvalues", i.e., the microscopic details of ##\mathcal{O}(10^{23})## microscopic degrees of freedom.

What I think is still not clarified is the operational meaning of what you call q-expectations. For me they have no different meaning in the standard minimal interpretation and your thermal interpretation, because they are the same in the formal math (##\operatorname{Tr} \hat{A} \hat{\rho}##, obeying the same equations of motion) and also the same operationally, namely just what they are called, i.e., expectation values.

Of course, "the state of the entire universe" is a fiction in any physics. It's principally unobservable and thus subject of metaphysical speculation in QT as well as classical physics.

A. Neumaier said:
Only the things they try to connect - the formal theory and the experimental record - are the same, but how they mediate between them is completely different (see post #99).
So in fact they ARE the same.
A. Neumaier said:
The language associated with the math - that's the interpretation!

One can associate with it Copenhagen language or minimal statistical language - which is what tradition did, resulting in nearly a century of perceived weirdness of quantum mechanics by almost everyone - especially
  • by all newcomers without exception and
  • by some of the greatest physicists (see the quotes at the beginning of Section 5 of Part III).
Or one can associate thermal, nonstatistical language with it, restoring continuity and common sense.

Everyone is free to pick their preferred interpretation. It is time to change preferences!
For me it's very hard to follow an interpretation that asks me to understand a "thermal language" that is "not statistical". I already have a very hard time with traditional axiomatized "phenomenological thermodynamics", where, e.g., the central notion of entropy is defined by introducing temperature as an integrating factor of an abstract Pfaffian form. The great achievement of the Bernoullis, Maxwell, and above all Boltzmann was to connect these notions with the underlying fundamental dynamical laws of their time in terms of statistical physics, and that very general foundation has so far withstood all the "revolutions" of 20th-century physics, i.e., relativity (which anyway is just a refined classical theory for the description of space and time and thus not as revolutionary as it appeared at the turn of the 20th century) and QT (which indeed in some sense can be considered really revolutionary in breaking with the deterministic world view).

The "apparent weirdness" of QT is for me completely resolved by the minimal statistical interpretation. It's not QT is weird but our prejudice that our "common sense", trained by everyday experience with rough macroscopic observables (or preceptions if you wish), tells us the full structure of matter.

In my opinion we should stop talking about QT as "something weird nobody understands" and rather state that it's the most detailed theory we have so far. The real problems are not in these metaphysical quibbles of the last millennium but in the open unsolved questions of contemporary physics, which are

• a consistent quantum description of the gravitational interaction: does it imply a "quantization of spacetime", as suggested by the mathematical description of gravity as a geometrical feature of the space-time manifold (a pseudo-Riemannian/Lorentzian manifold as in GR, or rather the more natural extension to an Einstein-Cartan manifold, gauging the Lorentz group), or is something completely new ("revolutionary") needed? I think the answers to these questions are completely open at the moment, and despite many mathematically fascinating ideas (string and M-theory, loop quantum gravity, ...) I fear we'll have a very hard time without any empirical glimpse into what might be observational features of "quantum effects on gravitation and/or the space-time structure".

• the nature of what's dubbed "Dark Energy" and "Dark Matter", which may be related to the question of quantum gravity too. Here too, I think it's hard to expect any progress without some empirical guidance as to what the physics beyond the standard model may be.
 
  • #140
vanhees71 said:
The "apparent weirdness" of QT is for me completely resolved by the minimal statistical interpretation. It's not QT is weird but our prejudice that our "common sense", trained by everyday experience with rough macroscopic observables (or preceptions if you wish), tells us the full structure of matter.

I don't think that's true. It's neither true that the weirdness is resolved by the minimal interpretation, nor is it true that it has anything to do with prejudice by "common sense". The minimalist interpretation is pretty much what Bell was criticizing in his essay. To quote from it:
Here are some words which, however legitimate and necessary in application, have no place in a formulation with any pretension to physical precision: system, apparatus, environment, microscopic, macroscopic, reversible, irreversible, observable, information, measurement.
 
  • #141
stevendaryl said:
Thanks for posting a link to that essay. I think Bell summarizes pretty well what I find unsatisfactory about most textbook descriptions of quantum mechanics.
My summary of what I find unsatisfactory in most textbook descriptions is given in Section 5.2 of Part I. It intersects Bell's in some respects but not in others.
 
  • #143
"The thermal interpretation gives a natural, realistic meaning to the standard formalism of quantum mechanics and quantum field theory in a single world, without introducing additional hidden variables"

This is a satisfying take on the whole ensemble of QM. I'm satisfied with the fields and the statistical side of the story. I hope I get the picture right: approximate position, so an exact eigenstate doesn't exist? Everything is an approximation, with no need for collapse (or the interpretation is silent on it). It is still considered to be in the genre of statistical interpretations of QM. Does it relate to the equipartition theorem (all modes of excitation carry heat)?
 
  • #144
Just to understand. Take a particle reaction like ##\pi^{+} \rightarrow \mu^{+} + \nu_{\mu}##. In the thermal interpretation I assume what is happening here is that locally devices probe ##\pi^{+}, \mu^{+}, \nu_{\mu}## fields (of course these are not fundamental fields, but let's ignore that for now). Via interaction with the fields each of the devices' slow modes are placed into a bistable state and environmental noise triggers these to decay into the detection/non-detection states?
 
  • #145
julcab12 said:
I'm satisfied with the fields and the statistical side of the story. I hope I get the picture right: approximate position, so an exact eigenstate doesn't exist? Everything is an approximation, with no need for collapse (or the interpretation is silent on it). It is still considered to be in the genre of statistical interpretations of QM. Does it relate to the equipartition theorem (all modes of excitation carry heat)?
Eigenstates do not matter, except for computational purposes. Every observable quantity has an associated intrinsic state-dependent uncertainty within which it can be (in principle) determined. It is meaningless to ask for more accuracy, just as it is ridiculous to ask for the position of an apple to mm accuracy. Statistics enters whenever a single value has too much uncertainty, and only then. In this case, the uncertainty can be reduced by calculating means, as within classical physics. Collapse is the continuous but very fast change of the coarse-grained state in a dissipative environment. There is no relation to the equipartition theorem.
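A small numerical illustration of the last point (my own sketch with made-up numbers): averaging N readings, each with individual uncertainty ##\sigma##, reduces the uncertainty of the result to ##\sigma/\sqrt{N}##, exactly as in classical measurement practice.

```python
# Toy sketch (made-up numbers): reducing uncertainty by calculating means.
import numpy as np

rng = np.random.default_rng(1)
sigma, N = 0.5, 10_000
readings = 3.0 + sigma * rng.normal(size=N)   # noisy readings of the value 3

print(readings.std())    # ~ 0.5   : uncertainty of a single reading
print(readings.mean())   # ~ 3.000 : good to sigma / sqrt(N) = 0.005
```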
 
Last edited:
  • Like
Likes vanhees71 and julcab12
  • #146
DarMM said:
Just to understand. Take a particle reaction like ##\pi^{+} \rightarrow \mu^{+} + \nu_{\mu}##. In the thermal interpretation I assume what is happening here is that locally devices probe ##\pi^{+}, \mu^{+}, \nu_{\mu}## fields (of course these are not fundamental fields, but let's ignore that for now). Via interaction with the fields each of the devices' slow modes are placed into a bistable state and environmental noise triggers these to decay into the detection/non-detection states?
At each time ##t## you have three operator-valued effective 4-currents A,B,C (for simplicity, to avoid having to write Greek letters and/or indices). When the reaction center is at the origin, the reaction A##\to##B+C proceeds as follows: At large negative times the A-density (q-expectation of the time component of the 4-current) is concentrated along the negative z-axis, and the A-current (q-expectation of the 3-vector of space components of the 4-current) is concentrated along the positive z-axis; the 4-currents B and C essentially vanish.

If the reaction happened (which depends on the details of the environment) then, at large positive times, the 4-current A is negligible, the B-density and C-density are concentrated along two (slightly diverging) rays emanating from the origin in such a way that momentum conservation holds, and the B-current and C-current are concentrated along these rays, too. Otherwise, at large positive times, the A-density is concentrated along the positive z-axis, and the A-current is concentrated along the positive z-axis, too, and the 4-currents B, C remain negligible. During the reaction time when the fields are concentrated near the origin, one can interpolate the asymptotic happening in an appropriate way; no problems there. The details are defined by the interaction.

The manifold of slow modes splits into a basin corresponding to the decayed state (with two continuous angle parameters labeling the possible modes) and one basin corresponding to the undecayed state. The metastable transition state at time zero determines, together with the environmental fluctuations, which basin is chosen and which direction is taken. This is comparable to what happens when a classical thin iron bar is bent by longitudinal pressure in a random direction, though in that case the bar must bend, so that there is only one basin, with modes labelled by a single angle. In both cases, one of the continuous labels appears due to the rotational symmetry of the setting around the z-axis. In the case of the decay reaction, the second continuous label arises through another, infinitesimal symmetry at the saddle point at the origin.
 
Last edited:
  • Like
Likes DarMM
  • #147
DarMM said:
Via interaction with the fields each of the devices' slow modes are placed into a bistable state and environmental noise triggers these to decay into the detection/non-detection states?
A. Neumaier said:
If the reaction happened (which depends on the details of the environment) then, at large positive times, the 4-current A is negligible, the B-density and C-density are concentrated along two (slightly diverging) rays emanating from the origin in such a way that momentum conservation holds, and the B-current and C-current are concentrated along these rays, too.
Actually, this is only one of the possible scenarios, probably what happens when the decay occurs inside a dense medium (a secondary decay in a bubble chamber, say).

For a collision experiment in vacuum, there is probably not enough environmental interaction near zero, and after reaching the collision region the B,C fields should rather take (in case a reaction happens) a rotationally symmetric shape. In this case, the path-like particle nature appears only later, when the spherical fields reach a detector. The metastability of the detector forces the two spherical fields to concentrate along two paths, and momentum conservation makes these paths lie weighted-symmetric to the z-axis (they would be geometrically symmetric if the decay products had equal mass). The details are those reported in Mott's 1929 paper (ref. [35] of Part I).

In both cases, the detection process creates the seeming particle nature of the observation record.
 
  • Like
Likes dextercioby and DarMM
  • #148
A. Neumaier said:
spherical fields reach a detector

Are these physical fields? What are they composed of?
 
  • #149
ftr said:
Are these physical fields, what is this field composed of.
Fields are not really composed of anything (except perhaps in some formal sense of other fields). They describe properties of Nature at arbitrary spacetime points.

In the present case, the fields of interest are the effective fields given by the currents of the particles involved in the reaction.
 
  • #150
So in essence particles are fields of nothing but numbers, sometimes localised and at other times greatly extended, correct?
 
