
I Entropy and particle decay

  1. Aug 18, 2016 #1
    Maybe my question is a bit more philosophical than scientific.
In the macroscopic world, entropy means that things become more chaotic and less orderly as they decay. It seems somewhat paradoxical to me that in the quantum world the later-generation particles "decay" into the first-generation particles that form matter.
    I realize of course that the elementary particles that are the final result are at a lower energy level than the original generations. But even the terminology of "later" generations decaying into "first" generations seems to suggest that there are two opposite types of world in existence. In the macroscopic world, order decays into chaos and in the quantum world, chaos decays into order.
    Does anyone have any ideas on this?
     
  3. Aug 18, 2016 #2

Nugatory

    Staff: Mentor

    Entropy has a precise mathematical definition, and it is not "disorder" - that's something that you find in simplified explanations aimed at people who aren't ready for the math. Use the proper definition and the paradox that you're seeing will go away.
     
  4. Aug 18, 2016 #3

Demystifier

    Science Advisor

The second generation does not have more entropy than the first. It is called the second generation because its particles were discovered later. They were discovered later because they have larger masses, and so require a stronger accelerator to be produced in the laboratory. At the same energy, particles with larger mass have less entropy because their phase space of allowed states is smaller. The decay from more massive towards less massive particles is therefore consistent with the increase of entropy.
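The phase-space point can be illustrated with a toy calculation (my own sketch, not from the thread; units with c = 1 and made-up masses): at fixed energy E, a particle of mass m can carry momentum at most p_max = sqrt(E² − m²), so a heavier particle has a smaller momentum-space ball of allowed states.

```python
from math import sqrt, pi

def p_max(E, m):
    """Maximum momentum of a relativistic particle of mass m at energy E (c = 1)."""
    return sqrt(E * E - m * m)

def momentum_space_volume(E, m):
    """Volume of the momentum-space ball |p| <= p_max -- a crude stand-in
    for the phase space of allowed states at fixed energy."""
    return (4 / 3) * pi * p_max(E, m) ** 3

E = 1.0                  # arbitrary energy units
light, heavy = 0.1, 0.9  # hypothetical masses; "heavy" plays a later-generation particle

# The lighter particle has far more momentum space available at the same energy.
print(momentum_space_volume(E, light) > momentum_space_volume(E, heavy))  # True
```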
     
    Last edited: Aug 19, 2016
  5. Aug 18, 2016 #4
Thanks for your reply, Nugatory. Is there any way the mathematical definition of entropy can be explained in words?
     
  6. Aug 18, 2016 #5
    I realize that the second generation does not have more entropy than the first generation, that is an inherent part of my question. But, you did enlighten me as to why the generations' numbering seems somehow to be "backwards" from a layman's perspective. Thanks.
     
  7. Aug 18, 2016 #6

Nugatory

    Staff: Mentor

Actually there are two equivalent definitions, one from thermodynamics and one from statistical mechanics. I don't know of an easy way of explaining the thermodynamic one, but the one from statistical mechanics is somewhat intuitive: it's proportional to the logarithm of the number of ways that a system can be in a given state. If I arrange 50 coins in a row, there is only one way for them all to be heads-up: #1 is heads-up AND #2 is heads-up AND so on. That is a low-entropy state. However, there are about ##1.26\times 10^{14}## ways to have 25 heads and 25 tails; that is a higher-entropy state. Intuitively, we expect that randomly flipping the coins around can turn the first state into the second but not the other way around - this is an example of entropy increasing.
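The coin counting can be checked directly; a minimal sketch (the numbers are computed, not quoted from the thread):

```python
from math import comb, log

all_heads = comb(50, 50)   # exactly one microstate: every coin heads-up
half_heads = comb(50, 25)  # number of microstates with 25 heads and 25 tails

print(all_heads)                  # 1
print(half_heads)                 # 126410606437752, about 1.26e14
print(log(all_heads))             # 0.0 -- the low-entropy state (S = ln W, with k = 1)
print(round(log(half_heads), 2))  # 32.47 -- the higher-entropy state
```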
     
  8. Aug 20, 2016 #7
    Thanks for that explanation, Nugatory.
    Maybe my question doesn't really have anything to do with entropy after all. I'll try to redesign it.
    There seem to be two directions from which you can get an elementary particle. Either you can "shatter" matter until it yields some. Or you could--theoretically--penetrate the quantum world, "capture" a third or second generation particle and wait until it decays into a stable elementary particle.
    Do you see what I'm getting at?
     
  9. Aug 21, 2016 #8

vanhees71

    Science Advisor
    2016 Award

I don't see what you are getting at. You can indeed create particles in very different ways. What's amazing is that, thanks to quantum theory, particles of the same type are exactly the same; they are indistinguishable. An electron is an electron to the utmost precision. Any electron has, e.g., precisely the same electric charge ##-e##, which is precisely opposite to a proton's charge ##+e##. It doesn't matter where the electron comes from. It may be made in an annihilation process, or in the conversion of a photon hitting another heavy particle (pair production), or in whatever other process. It's always an electron. It's not clear to me what this has to do with entropy in your opinion.
     
  10. Aug 22, 2016 #9

Demystifier

    Science Advisor

    A spontaneous decay happens easily, but the decay products cannot so easily recombine into the initial particle. That certainly has to do with entropy.
     
  11. Aug 22, 2016 #10

vanhees71

    Science Advisor
    2016 Award

Sure, but a single decay of a particle doesn't produce entropy. It goes from one pure state to another pure state. In a kinetic equation both decay and recombination are subject to detailed balance (which, by the way, follows from the unitarity of the S-matrix and does not rely on time-reversal or parity invariance). E.g., if you have a decay ##1 \rightarrow 23## you also have a recombination ##23 \rightarrow 1##, and the corresponding kinetic equation for this process for particle 1 looks like
    $$\mathrm{D}_t f_1 = \int \widetilde{\mathrm{d}^3 \vec{p}_2} \widetilde{\mathrm{d}^3 \vec{p}_3} |\mathcal{M}|^2 [f_2 f_3 (1 \pm f_1) - f_1 (1 \pm f_2) (1 \pm f_3)] \delta^{(4)}(p_1-p_2-p_3).$$
    Of course, as any collision term, this decay-recombination term also adds to the entropy production in the corresponding entropy-balance inequality (Boltzmann H theorem).
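The detailed-balance structure of that collision term can be mimicked with a toy classical rate equation (a sketch of mine: the quantum ##1 \pm f## factors are dropped and the rate constants are invented). The gain and loss terms settle into balance, just as the bracket in the collision term vanishes at equilibrium:

```python
# Toy rate equations for 1 <-> 2 3 with n2 = n3 by symmetry and
# n1 + n2 conserved:  dn1/dt = -G_dec * n1 + G_rec * n2 * n3
G_dec, G_rec = 1.0, 0.5  # made-up decay and recombination rate constants
n1, n2 = 1.0, 0.0        # start with only the parent species present
dt = 1e-3

# Forward-Euler integration until the densities relax to equilibrium.
for _ in range(200_000):
    rate = -G_dec * n1 + G_rec * n2 * n2
    n1 += rate * dt
    n2 -= rate * dt

# At equilibrium the loss term G_dec * n1 balances the gain term G_rec * n2^2.
print(abs(G_dec * n1 - G_rec * n2 * n2) < 1e-6)  # True
```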
     
  12. Aug 22, 2016 #11

stevendaryl

    Staff Emeritus
    Science Advisor

Well, there are two different definitions of "entropy" at work here. There is the von Neumann entropy:

    $$S = -\mathrm{tr}(\rho \log \rho)$$

    (where ##\rho## is the density matrix), which is zero for any pure state. But there is also the Boltzmann entropy:

    $$S = k \log W$$

    where ##W## is the number of microstates that give rise to the same macrostate. (To make rigorous sense of this, I think you need some kind of "coarse-graining" for saying when two states are close enough to be macroscopically indistinguishable.)
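The von Neumann formula is easy to evaluate numerically; a small sketch with NumPy (the two example states are mine, chosen for illustration):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho log rho), computed from the eigenvalues of the density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues: 0 * log 0 -> 0 by convention
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])    # a pure state |0><0|
mixed = np.array([[0.5, 0.0],
                  [0.0, 0.5]])   # the maximally mixed qubit state

print(von_neumann_entropy(pure))   # zero for any pure state
print(von_neumann_entropy(mixed))  # ln 2, about 0.693, for the maximally mixed state
```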

    A decay of a single particle into three or more particles increases the Boltzmann entropy, I'm pretty sure. A decay into exactly two might not increase the Boltzmann entropy, because the (kinematic) state of each particle is uniquely determined by the other.
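The two-body point is concrete kinematics: in the parent's rest frame, energy-momentum conservation fixes the magnitude of each daughter's momentum to a single value via the standard two-body decay formula (sketch below; the masses are made up, c = 1).

```python
from math import sqrt

def two_body_momentum(M, m1, m2):
    """Daughter momentum magnitude for a decay M -> m1 + m2 with the parent at
    rest (c = 1): p = sqrt([M^2 - (m1+m2)^2] [M^2 - (m1-m2)^2]) / (2M)."""
    return sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)

# Hypothetical masses: a parent of mass 10 decaying into daughters of mass 1 and 2.
p = two_body_momentum(10.0, 1.0, 2.0)

# Both daughters carry exactly this momentum magnitude, back to back; only the
# decay axis is free, so there is no momentum-magnitude freedom left to count.
# Consistency check: the daughters' energies add up to the parent's mass.
print(round(sqrt(p**2 + 1.0**2) + sqrt(p**2 + 2.0**2), 6))  # 10.0
```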
     
  13. Aug 23, 2016 #12

Demystifier

    Science Advisor

    Sure, but if you want to detect the single decay, then you also need to take into account the state of the macroscopic detector entangled with the decay products. In principle this can also be described by a pure state, but in practice it cannot. So for practical purposes the decay products need to be described in terms of mixed states, which involves entropy.

    More generally, in principle, anything in physics can be described by pure states. Consequently, in principle, nothing has entropy. Nevertheless we still talk about entropy because it is an efficient way to express our limited practical knowledge.
     
    Last edited: Aug 23, 2016