What is the role of entropy in particle decay theory?

In summary, the conversation discusses the concept of entropy in both the macroscopic and quantum worlds. It is noted that entropy has a precise mathematical definition and is not simply "disorder." The apparent paradox of higher-generation particles decaying into the first-generation particles that form matter is also discussed, with the explanation that the generation numbering reflects the order of discovery rather than any reversal of entropy. The statistical definition of entropy is briefly explained as being proportional to the logarithm of the number of ways a system can be in a given state. The question then shifts to the relationship between entropy and the creation and decay of particles. It is concluded that while a single decay does not produce entropy, the overall process of decay and recombination contributes to entropy production.
  • #1
ribbie
Maybe my question is a bit more philosophical than scientific.
In the macroscopic world entropy means that things become more chaotic and less orderly as they decay. It seems to me somewhat paradoxical that in the quantum world lower generation particles "decay" into the first generation particles that form matter.
I realize of course that the elementary particles that are the final result are at a lower energy level than the original generations. But even the terminology of "later" generations decaying into "first" generations seems to suggest that there are two opposite types of world in existence. In the macroscopic world, order decays into chaos and in the quantum world, chaos decays into order.
Does anyone have any ideas on this?
 
  • #2
ribbie said:
In the macroscopic world entropy means that things become more chaotic and less orderly as they decay. It seems to me somewhat paradoxical that in the quantum world lower generation particles "decay" into the first generation particles that form matter.
Entropy has a precise mathematical definition, and it is not "disorder" - that's something that you find in simplified explanations aimed at people who aren't ready for the math. Use the proper definition and the paradox that you're seeing will go away.
 
  • #3
The second generation does not have more entropy than the first generation. It is called the second generation because its particles were discovered later. They were discovered later because they have larger mass, and so require a stronger accelerator for production in the laboratory. For the same energy, particles with larger mass have less entropy because the phase space of allowed states is smaller. Therefore, the decay from more massive towards less massive particles is consistent with the increase of entropy.
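The phase-space point can be illustrated numerically with a small toy sketch (my own, not from the thread): in the rest frame of a parent of mass ##M##, a two-body decay gives each daughter a fixed momentum magnitude, and that momentum, and with it the size of the accessible momentum space, shrinks as the daughter masses grow. Natural units with ##c = 1##; the function name is mine.

```python
import math

def daughter_momentum(M, m1, m2):
    """Momentum magnitude of each daughter, in the rest frame of a
    parent of mass M decaying into two particles of masses m1 and m2
    (standard two-body decay kinematics, natural units)."""
    if m1 + m2 > M:
        raise ValueError("decay kinematically forbidden")
    return math.sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)

# For a fixed parent mass, heavier daughters get less momentum,
# i.e. a smaller momentum-space volume is available to them.
for m in (0.0, 0.1, 0.3, 0.45):
    print(f"daughter mass {m:.2f}: momentum {daughter_momentum(1.0, m, m):.3f}")
```

For equal daughter masses ##m## this reduces to ##\sqrt{1 - 4m^2}/2## (with ##M = 1##), which decreases monotonically towards zero at threshold.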
 
  • #4
Nugatory said:
Entropy has a precise mathematical definition, and it is not "disorder" - that's something that you find in simplified explanations aimed at people who aren't ready for the math. Use the proper definition and the paradox that you're seeing will go away.
Thanks for your reply, Nugatory. Is there any way the mathematical definition of entropy can be explained in words?
 
  • #5
Demystifier said:
The second generation does not have more entropy than the first generation. It is called the second generation because its particles were discovered later. They were discovered later because they have larger mass, and so require a stronger accelerator for production in the laboratory. For the same energy, particles with larger mass have less entropy because the phase space of allowed states is smaller. Therefore, the decay from more massive towards less massive particles is consistent with the increase of entropy.
I realize that the second generation does not have more entropy than the first generation, that is an inherent part of my question. But, you did enlighten me as to why the generations' numbering seems somehow to be "backwards" from a layman's perspective. Thanks.
 
  • #6
ribbie said:
Thanks for your reply, Nugatory. Is there any way the mathematical definition of entropy can be explained in words?
Actually there are two equivalent definitions, one from thermodynamics and one from statistical mechanics. I don't know of an easy way of explaining the thermodynamic one, but the one from statistical mechanics is somewhat intuitive: It's proportional to the logarithm of the number of ways that a system can be in a given state. If I arrange 50 coins in a row, there is only one way for them all to be heads-up: #1 is heads-up AND #2 is heads-up AND ... That is a low-entropy state. However, there are about ##1.2\times{10}^{14}## ways to have 25 heads and 25 tails; that is a higher entropy state. Intuitively, we expect that randomly flipping the coins around can turn the first state into the second but not the other way around - this is an example of entropy increasing.
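The coin counting above can be checked directly; a minimal sketch, with the function name my own:

```python
import math

def log_multiplicity(n, k):
    """Log of the number of arrangements with k heads out of n coins;
    in units of Boltzmann's constant this is the statistical entropy."""
    return math.log(math.comb(n, k))

# All 50 heads: exactly one arrangement, so the entropy is zero.
print(log_multiplicity(50, 50))        # 0.0
# 25 heads and 25 tails: about 1.26e14 arrangements, a much higher entropy.
print(math.comb(50, 25))               # 126410606437752
print(log_multiplicity(50, 25))
```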
 
  • #7
Thanks for that explanation, Nugatory.
Maybe my question doesn't really have anything to do with entropy after all. I'll try to rephrase it.
There seem to be two directions from which you can get an elementary particle. Either you can "shatter" matter until it yields some. Or you could--theoretically--penetrate the quantum world, "capture" a third or second generation particle and wait until it decays into a stable elementary particle.
Do you see what I'm getting at?
 
  • #8
I don't see what you are getting at. You can indeed create particles in very different ways. What's amazing is that, thanks to quantum theory, particles of the same type are exactly the same: they are indistinguishable. An electron is an electron to the utmost precision. Any electron has, e.g., precisely the same electric charge ##-e##, which is precisely opposite to a proton's charge ##+e##. It doesn't matter where the electron comes from. It may be made in an annihilation process, or in the conversion of a photon hitting another heavy particle (pair production), or in whatever other process: it's always an electron. It's not clear to me what this has to do with entropy in your opinion.
 
  • #9
vanhees71 said:
It's not clear to me what this has to do with entropy in your opinion.
A spontaneous decay happens easily, but the decay products cannot so easily recombine into the initial particle. That certainly has to do with entropy.
 
  • #10
Sure, but a single decay of a particle doesn't produce entropy. It's going from one pure state to another pure state. In a kinetic equation both decay and recombination are subject to detailed balance (which, by the way, follows from unitarity of the S-matrix and does not rely on time-reversal or parity invariance). E.g., if you have a decay ##1 \rightarrow 23## you also have a recombination ##23 \rightarrow 1##, and the corresponding kinetic equation for this process for particle 1 looks like
$$\mathrm{D}_t f_1 = \int \widetilde{\mathrm{d}^3 \vec{p}_2} \widetilde{\mathrm{d}^3 \vec{p}_3} |\mathcal{M}|^2 [f_2 f_3 (1 \pm f_1) - f_1 (1 \pm f_2) (1 \pm f_3)] \delta^{(4)}(p_1-p_2-p_3).$$
Of course, as any collision term, this decay-recombination collision term also adds to entropy production in the corresponding entropy-balance inequality (Boltzmann H-theorem).
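A toy numerical sketch of the detailed-balance structure in the collision term above (my own drastic simplification: classical statistics, the ##1 \pm f## factors and all momentum dependence dropped, and a common rate constant set to 1): number densities obeying ##\dot n_1 = n_2 n_3 - n_1## relax to an equilibrium where the decay and recombination rates balance.

```python
def evolve(n1, n2, n3, rate=1.0, dt=1e-3, steps=20000):
    """Forward-Euler integration of toy rate equations for 1 <-> 2+3:
    gain of species 1 from recombination, loss from decay."""
    for _ in range(steps):
        flux = rate * (n2 * n3 - n1) * dt
        n1 += flux
        n2 -= flux
        n3 -= flux
    return n1, n2, n3

# Start with only the parent present; it decays until detailed balance
# is reached, where n2 * n3 = n1.
n1, n2, n3 = evolve(1.0, 0.0, 0.0)
print(f"equilibrium: n1={n1:.3f}, n2={n2:.3f}, n3={n3:.3f}")
```

With these initial conditions the conserved combinations force ##n_2 = n_3 = 1 - n_1##, so the equilibrium value is ##n_1 = (3 - \sqrt{5})/2 \approx 0.382##.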
 
  • #11
vanhees71 said:
Sure, but a single decay of a particle doesn't produce entropy. It's going from one pure state to another pure state.

Well, there are two different definitions of "entropy" at work here. There is the von Neumann entropy:

##S = -\mathrm{tr}(\rho \log \rho)##

(where ##\rho## is the density matrix), which is zero for any pure state. But there is also the Boltzmann entropy:

##S = k \log W##

where ##W## is the number of microstates that give rise to the same macrostate. (To make rigorous sense of this, I think you need some kind of "coarse-graining" for saying when two states are close enough as to be macroscopically indistinguishable.)

A decay of a single particle into three or more particles increases the Boltzmann entropy, I'm pretty sure. A decay into exactly two might not increase the Boltzmann entropy, because the (kinematic) state of each particle is uniquely determined by the other.
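The two cases can be checked numerically; a minimal sketch (my own helper, which takes the eigenvalues of ##\rho## directly, since once the density matrix is diagonalized, ##-\mathrm{tr}(\rho \log \rho)## reduces to ##-\sum_i p_i \log p_i##):

```python
import math

def von_neumann_entropy(eigenvalues):
    """S = -tr(rho log rho), evaluated from the eigenvalues of rho;
    terms with p = 0 contribute nothing (p log p -> 0)."""
    return -sum(p * math.log(p) for p in eigenvalues if p > 0)

print(von_neumann_entropy([1.0, 0.0]))   # pure state: entropy 0
print(von_neumann_entropy([0.5, 0.5]))   # maximally mixed qubit: ln 2
```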
 
  • #12
vanhees71 said:
Sure, but a single decay of a particle doesn't produce entropy. It's going from one pure state to another pure state.
Sure, but if you want to detect the single decay, then you also need to take into account the state of the macroscopic detector entangled with the decay products. In principle this can also be described by a pure state, but in practice it cannot. So for practical purposes the decay products need to be described in terms of mixed states, which involves entropy.

More generally, in principle, anything in physics can be described by pure states. Consequently, in principle, nothing has entropy. Nevertheless we still talk about entropy because it is an efficient way to express our limited practical knowledge.
 

1. What is entropy?

Entropy is often described informally as a measure of disorder or randomness, but more precisely it counts the number of microscopic configurations consistent with a system's macroscopic state. It is a fundamental concept in thermodynamics and is associated with the second law of thermodynamics, which states that the total entropy of a closed system never decreases over time.

2. How does entropy relate to particle decay?

In particle decay, a heavy particle typically turns into two or more lighter particles, so the number of particles increases over time. Because lighter particles carrying the same total energy have more accessible momentum states, decay enlarges the available phase space and therefore increases entropy.

3. Can entropy be reversed?

In a closed system, the total entropy will always increase or remain constant over time. However, entropy can decrease locally in a subsystem that exchanges energy with its surroundings, provided the entropy of the surroundings increases by at least as much. The total entropy of the whole system still does not decrease.

4. How is entropy measured?

Entropy is typically measured in units of joules per kelvin (J/K). For a reversible process at temperature T, the entropy change is the heat transferred divided by that temperature, ΔS = Q_rev/T.
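A minimal worked example of ΔS = Q/T (the latent-heat value is a textbook approximation):

```python
# Entropy change for a reversible heat transfer at constant temperature:
# melting 1 kg of ice at 273.15 K, with latent heat of fusion ~3.34e5 J/kg.
Q = 3.34e5       # heat absorbed, in joules
T = 273.15       # melting temperature, in kelvin
delta_S = Q / T  # entropy gained by the ice, in J/K
print(f"{delta_S:.0f} J/K")
```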

5. How does entropy affect the universe?

The second law of thermodynamics, which is based on the concept of entropy, states that the total entropy of the universe never decreases. This means that, on the largest scales, the universe tends towards states of ever higher entropy over time.
