What is the role of entropy in particle decay theory?

ribbie
Maybe my question is a bit more philosophical than scientific.
In the macroscopic world, entropy means that things become more chaotic and less orderly as they decay. It seems to me somewhat paradoxical that in the quantum world, later-generation particles "decay" into the first-generation particles that form matter.
I realize, of course, that the elementary particles that are the final result are at a lower energy level than the original generations. But even the terminology of "later" generations decaying into "first" generations seems to suggest that two opposite kinds of world exist: in the macroscopic world, order decays into chaos, and in the quantum world, chaos decays into order.
Does anyone have any ideas on this?
 
ribbie said:
In the macroscopic world, entropy means that things become more chaotic and less orderly as they decay. It seems to me somewhat paradoxical that in the quantum world, later-generation particles "decay" into the first-generation particles that form matter.
Entropy has a precise mathematical definition, and it is not "disorder" - that's something that you find in simplified explanations aimed at people who aren't ready for the math. Use the proper definition and the paradox that you're seeing will go away.
 
The second generation does not have more entropy than the first generation. The second generation is so called because its particles were discovered later. They were discovered later because they have larger mass, and so require a stronger accelerator to be produced in the laboratory. For the same energy, particles with larger mass have less entropy because the phase space of allowed states is smaller. Therefore, the decay from more massive towards less massive particles is consistent with the increase of entropy.
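As a rough numerical illustration of the phase-space point (my own sketch, not part of the post above): in natural units with ##c = 1##, a particle of total energy ##E## and mass ##m## has momentum ##|\vec{p}| = \sqrt{E^2 - m^2}##, so the sphere of accessible momenta shrinks as the mass grows. The energy and mass values below are arbitrary choices for illustration.

Code:
import math

# Toy illustration: in natural units (c = 1), a particle with total
# energy E and mass m has momentum |p| = sqrt(E^2 - m^2), so the
# accessible momentum-space volume ~ (4/3) pi |p|^3 shrinks with mass.
E = 1.0  # fixed total energy (arbitrary units)

for m in [0.0, 0.3, 0.6, 0.9]:  # illustrative masses below E
    p = math.sqrt(E**2 - m**2)
    vol = 4.0 / 3.0 * math.pi * p**3
    print(f"m = {m:.1f}: |p| = {p:.3f}, momentum-space volume ~ {vol:.3f}")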
 
Nugatory said:
Entropy has a precise mathematical definition, and it is not "disorder" - that's something that you find in simplified explanations aimed at people who aren't ready for the math. Use the proper definition and the paradox that you're seeing will go away.
Thanks for your reply, Nugatory. Is there any way the mathematical definition of entropy can be explained in words?
 
Demystifier said:
The second generation does not have more entropy than the first generation. The second generation is so called because its particles were discovered later. They were discovered later because they have larger mass, and so require a stronger accelerator to be produced in the laboratory. For the same energy, particles with larger mass have less entropy because the phase space of allowed states is smaller. Therefore, the decay from more massive towards less massive particles is consistent with the increase of entropy.
I realize that the second generation does not have more entropy than the first generation, that is an inherent part of my question. But, you did enlighten me as to why the generations' numbering seems somehow to be "backwards" from a layman's perspective. Thanks.
 
ribbie said:
Thanks for your reply, Nugatory. Is there any way the mathematical definition of entropy can be explained in words?
Actually there are two equivalent definitions, one from thermodynamics and one from statistical mechanics. I don't know of an easy way of explaining the thermodynamic one, but the one from statistical mechanics is somewhat intuitive: it's proportional to the logarithm of the number of microscopic ways that a system can realize a given macroscopic state. If I arrange 50 coins in a row, there is only one way for them all to be heads-up: #1 is heads-up AND #2 is heads-up AND ... That is a low-entropy state. However, there are about ##1.2\times 10^{14}## ways to have 25 heads and 25 tails; that is a higher-entropy state. Intuitively, we expect that randomly flipping the coins around can turn the first state into the second but not the other way around - this is an example of entropy increasing.
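To make the coin counting concrete, here is a minimal sketch (my addition, using only Python's standard library) that computes ##W## and ##S \propto \log W## for the two macrostates in the example above:

Code:
import math

n = 50  # coins in a row

for heads in [50, 25]:
    W = math.comb(n, heads)  # number of arrangements with this many heads
    S = math.log(W)          # entropy in units of Boltzmann's constant
    print(f"{heads} heads-up: W = {W:.3e}, S/k = {S:.2f}")

The all-heads macrostate has ##W = 1## and hence ##S = 0##, while the 25-heads macrostate has ##W \approx 1.26\times 10^{14}##.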
 
Thanks for that explanation, Nugatory.
Maybe my question doesn't really have anything to do with entropy after all. I'll try to reformulate it.
There seem to be two directions from which you can get an elementary particle. Either you can "shatter" matter until it yields some, or you could, theoretically, penetrate the quantum world, "capture" a third- or second-generation particle, and wait until it decays into a stable elementary particle.
Do you see what I'm getting at?
 
I don't see what you are getting at. You can indeed create particles in very different ways. What's amazing is that, thanks to quantum theory, particles of the same type are exactly the same: they are indistinguishable. An electron is an electron at utmost precision. Any electron has, e.g., precisely the same electric charge ##-e##, which is precisely opposite to a proton's charge ##+e##. It doesn't matter where the electron comes from. It may be made in an annihilation process, or in the conversion of a photon hitting another heavy particle (pair production), or in whatever other process. It's always an electron. It's not clear to me what this has to do with entropy in your opinion.
 
vanhees71 said:
It's not clear to me what this has to do with entropy in your opinion.
A spontaneous decay happens easily, but the decay products cannot so easily recombine into the initial particle. That certainly has to do with entropy.
 
Sure, but a single decay of a particle doesn't produce entropy. It's going from one pure state to another pure state. In a kinetic equation both decay and recombination are subject to detailed balance (which, by the way, follows from unitarity of the S-matrix and does not rely on time-reversal and parity invariance). E.g., if you have a decay ##1 \rightarrow 23## you also have a recombination ##23 \rightarrow 1##, and the corresponding kinetic equation for particle 1 looks like
$$\mathrm{D}_t f_1 = \int \widetilde{\mathrm{d}^3 \vec{p}_2} \, \widetilde{\mathrm{d}^3 \vec{p}_3} \, |\mathcal{M}|^2 \left[f_2 f_3 (1 \pm f_1) - f_1 (1 \pm f_2)(1 \pm f_3)\right] \delta^{(4)}(p_1 - p_2 - p_3).$$
Of course, as with any collision term, this decay-recombination term also adds to the entropy production in the corresponding entropy-balance inequality (Boltzmann H-theorem).
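To see the detailed-balance structure in action, here is a deliberately stripped-down toy version of the above (my own simplification: the quantum-statistical ##1 \pm f## factors, the momentum integrals, and ##|\mathcal{M}|^2## are all dropped, and both rates are set to 1). Starting from only particles of species 1, each decay ##1 \rightarrow 23## produces one 2 and one 3, so ##n_2 = n_3 = 1 - n_1##:

Code:
# Toy rate equation: dn1/dt = (gain from 2 3 -> 1) - (loss from 1 -> 2 3)
n1 = 1.0   # start with only species 1
dt = 0.01
for _ in range(5000):
    n2 = 1.0 - n1              # n2 = n3 by construction
    n1 += dt * (n2 * n2 - n1)  # Euler step

n2 = 1.0 - n1
print(f"equilibrium n1 = {n1:.4f}")            # -> (3 - 5**0.5)/2 ~ 0.382
print(f"gain = {n2*n2:.4f}, loss = {n1:.4f}")  # equal: detailed balance

The populations relax to the fixed point where the gain and loss terms cancel, which is exactly the detailed-balance condition inside the square bracket of the kinetic equation.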
 
vanhees71 said:
Sure, but a single decay of a particle doesn't produce entropy. It's going from one pure state to another pure state.

Well, there are two different definitions of "entropy" at work here. There is the von Neumann entropy,
$$S = -\mathrm{tr}(\rho \log \rho)$$
(where ##\rho## is the density matrix), which is zero for any pure state. But there is also the Boltzmann entropy,
$$S = k \log W,$$
where ##W## is the number of microstates that give rise to the same macrostate. (To make rigorous sense of this, I think you need some kind of "coarse-graining" for saying when two states are close enough as to be macroscopically indistinguishable.)

A decay of a single particle into three or more particles increases the Boltzmann entropy, I'm pretty sure. A decay into exactly two might not increase the Boltzmann entropy, because the (kinematic) state of each particle is uniquely determined by the other.
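As a quick numerical check of the pure-state claim (a sketch I'm adding, using numpy), the von Neumann entropy of a qubit can be computed from the eigenvalues of ##\rho##; it vanishes for a pure state and equals ##\log 2## for the maximally mixed state:

Code:
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)  # eigenvalues of the density matrix
    evals = evals[evals > 1e-12]     # 0 log 0 -> 0 by convention
    return -np.sum(evals * np.log(evals))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|, a pure state
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])  # maximally mixed state

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # log 2 ~ 0.693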
 
vanhees71 said:
Sure, but a single decay of a particle doesn't produce entropy. It's going from one pure state to another pure state.
Sure, but if you want to detect the single decay, then you also need to take into account the state of the macroscopic detector entangled with the decay products. In principle this can also be described by a pure state, but in practice it cannot. So for practical purposes the decay products need to be described in terms of mixed states, which involves entropy.

More generally, in principle, anything in physics can be described by pure states. Consequently, in principle, nothing has entropy. Nevertheless we still talk about entropy because it is an efficient way to express our limited practical knowledge.
 