
Decay and scattering: What happens between the final and the initial state?

  1. Nothing happens as it is not measured

  2. Something may happen but it is irrelevant as it is not measured

  3. QFT describes accurately what happens

  4. Something interesting happens but QFT does not describe it

  1. Apr 5, 2007 #1


    Science Advisor

    For processes of particle decay and inelastic scattering, quantum field theory (QFT) predicts well the probabilities of various final states for given initial states. Technically, this is described by the S-matrix, which is the unitary-evolution matrix describing the transitions from t=-infinity to t=infinity.
    But what happens in between at intermediate times?
    How exactly do the initial particle(s) get transformed into the final particles?
    Is that a continuous process or an instantaneous jump?
    If it is a jump, when and where exactly does it happen?
    In the detector? Or much before, during the collision itself?
    Can QFT answer these questions at all?
    Are these questions really physically relevant? Are they physical questions, or purely philosophical ones?

    Here, I do not ask you to give, with confidence, a final answer, but merely to express your opinion and intuition about it.
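
    To make the S-matrix statement concrete, here is a minimal numerical sketch (my own toy example, not a real QFT calculation): a small unitary matrix connecting one initial state to two possible final states yields only transition probabilities, with no statement about what happens at intermediate times.

```python
import numpy as np

# Toy "S-matrix": a 2x2 unitary rotation standing in for the full
# QFT S-operator.  The angle is an arbitrary illustrative parameter.
theta = 0.3
S = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

initial = np.array([1.0, 0.0])   # the prepared state |i>
amplitudes = S @ initial         # <f|S|i> for each final state f
probabilities = np.abs(amplitudes) ** 2

print(probabilities)             # probabilities of the two outcomes
print(probabilities.sum())       # unitarity: must equal 1
```

    The point of the sketch: everything QFT hands us is in `S`; the formalism is silent about the trajectory between t = -infinity and t = +infinity.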
  3. Apr 5, 2007 #2
    Purely philosophical. There is no scientific way to talk about that, since it is not measurable by definition. :smile:

    If string theory is correct, we could imagine things going continuously from one particle to another. If LQG is, we can, on the contrary, imagine things going in discrete steps... Well, my two cents.
  4. Apr 5, 2007 #3

    Hans de Vries

    Science Advisor

    My preference tends to go to option 3 here, though probably for the
    opposite reason from what you might expect. Of course, one adds the
    amplitudes and not the probabilities of the different diagrams. Thus:
    all of the diagrams must exist simultaneously.
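
    Hans's point that one adds amplitudes, not probabilities, can be shown with a two-amplitude toy calculation (the numbers are invented for illustration, not tied to any actual diagram):

```python
import cmath

# Two hypothetical diagram amplitudes for the same process,
# with a relative phase of pi (destructive interference).
A1 = 0.6 + 0.0j
A2 = 0.4 * cmath.exp(1j * cmath.pi)

# Quantum rule: add the amplitudes first, then square.
p_amplitudes_first = abs(A1 + A2) ** 2            # ~0.04
# Wrong rule: square each amplitude, then add.
p_probabilities_first = abs(A1)**2 + abs(A2)**2   # 0.52

print(p_amplitudes_first, p_probabilities_first)
```

    The large gap between the two numbers is exactly why the diagrams cannot be treated as independent, mutually exclusive histories.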

    So yes, subdividing a complex process into separate things we
    artificially label "virtual particles" should always be done with
    this in mind.

    Now, for me personally, this is ALWAYS the case when one artificially groups
    an extended object together by giving it a single label. For real particles
    just as well as for virtual particles!

    Does the wave-function of a "real" particle at some space-time point
    have a notion that it is part of a "real" photon or a "real" electron?
    That is, is each point in space-time effectively labeled as such, or
    is it just we who put these labels there? Let me use a (rather
    different) metaphor:

    Does a moving air molecule have a notion that it is part of a spoken
    lie or a spoken truth? We humans have absolutely no problem grouping
    a bunch of moving air molecules together and labeling them in such a
    way! We do this on a daily basis!

    Now, for me there is always this artificiality in labeling extended
    objects as a whole, and it is wrong to somehow assume that each
    individual part, or point, of the extended object also bears this
    label physically.

    In this sense one could say that "virtual" particles are no more
    artificial than "real" particles are.

    This point of view has as a consequence that one cannot take
    unitarity as something automatically implied. Rather, one has to
    find a physical explanation of the effect.

    Yes, these are all the central questions.

    The final state particles are typically monochromatic, with a momentum in
    a given direction. This implies that one can define a rest frame where the
    phase is synchronized: the same everywhere.
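
    A minimal way to see the rest-frame phase synchronization Hans describes, in standard plane-wave conventions (my sketch, with ħ = c = 1):

```latex
% Plane wave for a massive particle of energy E and momentum p:
\psi(t,\mathbf{x}) \propto e^{-i(Et - \mathbf{p}\cdot\mathbf{x})}
% In the rest frame, \mathbf{p} = 0 and E = m, so
\psi(t,\mathbf{x}) \propto e^{-imt}
% i.e. the phase depends on time only and is the same at every
% spatial point -- "synchronized" across the whole wave.
```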

    So, one might suspect that such a phase-synchronization mechanism is
    at the base of the projection process which selects one of the many
    possibilities.
    A photon in an interference experiment has multiple momenta (in
    different directions) at each space-time point. Furthermore, it
    doesn't have a rest frame. Projecting out a monochromatic,
    single-momentum state can only take place during an interaction
    where the speed is less than c.

    I would give a non-perturbative QFT treatment a better chance.

    QFT often uses a lot of simplifications, for instance the use of
    transverse photons, which is not a Lorentz-invariant treatment.
    (A transverse photon has a longitudinal component in other frames.)

    Spin polarization is generally handled in a purely statistical way:
    an electron with spin A has an X% chance to be found with spin B.
    There is no deeper physical explanation of these effects in QFT.
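
    The statistical statement about spins can be made explicit; here is a sketch of the standard spin-1/2 projection rule (the angles are my own illustrative choices):

```python
import numpy as np

def spin_overlap_prob(angle_between_axes):
    """P(found along B | prepared along A) = cos^2(theta/2) for a
    spin-1/2 particle, theta being the angle between axes A and B."""
    return np.cos(angle_between_axes / 2.0) ** 2

print(spin_overlap_prob(0.0))        # same axis: 1.0
print(spin_overlap_prob(np.pi / 2))  # perpendicular axes: 0.5
print(spin_overlap_prob(np.pi))      # opposite axes: ~0.0
```

    QFT supplies this rule and nothing beneath it, which is Hans's point: the probability is computed, not mechanistically explained.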

    Regards, Hans.
  5. Apr 5, 2007 #4
    I strongly disagree. A real particle can be measured. An air molecule can be trapped in a box. On the contrary, virtual particles are mere mathematical objects which cannot, by definition, be measured.
  6. Apr 5, 2007 #5

    Hans de Vries

    Science Advisor

    Likewise, many physicists, (not me) will claim that the wave function of a
    real particle is a mere mathematical object.

    In many QFT textbooks you'll find the claim, at one place or another, that:
    "Most so-called real photons are, strictly speaking, virtual, in the sense that
    they are emitted at one place and absorbed at another."

    There is no way to distinguish, by measurement, this process of
    emission and absorption for real particles from the one for virtual
    particles.

    It is not impossible, "by definition", as you say. We don't "define" nature,
    we just try to describe it.

    Regards, Hans.
  7. Apr 5, 2007 #6
    The unobservability of an absolute phase is at the heart of the so successful gauge principle. I am aware of the Aharonov-Bohm effect, but want to point out that this corresponds only to a phase shift. Actually, mere interferences are already sufficient to understand that phase shifts are observable.
    I have never considered this a serious difficulty. This is a rhetorical argument. If you want me to accept that "real" photons have a billionth-of-an-eV mass, I'm fine with that :smile:

    Let me describe a fairly recent event in electron scattering. One-photon exchange has been taken for granted for decades. I myself describe my data every day using a single virtual-photon exchange. It so happens that for some polarization observables (that is, not averaging over spins), it has been shown that two-photon exchange can produce a few-percent deviations at the 1 GeV scale. Older theoreticians did not expect this to happen at all.

    The way I picture it, truncating the infinite sum over Feynman diagrams is most of the time a very good approximation to what is actually happening, but interpreting single Feynman diagrams as real processes might be misleading. To me, the virtual particles are mere convenient tools.
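
    The "truncating the infinite sum" picture can be mimicked with any convergent series; a generic sketch (exp(x) standing in for the diagram expansion, nothing QED-specific):

```python
import math

def partial_sum(x, max_order):
    """Sum of x^n / n! up to max_order, like keeping Feynman
    diagrams only up to a given order in the coupling."""
    return sum(x**n / math.factorial(n) for n in range(max_order + 1))

x = 0.1  # a small "coupling": low orders are already excellent
exact = math.exp(x)
for order in (1, 2, 3):
    print(order, abs(partial_sum(x, order) - exact))
# The error shrinks rapidly with order, yet no single term
# of the sum "is" the process -- only the full sum has meaning.
```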
    Sure. But we define mathematical objects. As you pointed out, one cannot, strictly speaking, measure a really real photon. But on the other hand, they are at first defined as massless, with regard to the Poincaré group.

    This is a very old debate, and as usual, most probably just a matter of taste :smile:
  8. Apr 5, 2007 #7


    Science Advisor
    Homework Helper
    Gold Member

    I agree with this point of view. The way I see the whole Feynman diagram expansion is as a mere convenient trick to remember the mathematical expressions obtained from a perturbation expansion. Nothing more. I don't see virtual particles as being more "physical" than, say, the ghost particles generated by gauge fixing non-abelian gauge theories. Those are clearly unphysical but it's convenient to see their contributions as coming from loops containing particles obeying the "wrong" statistics.
  9. Apr 6, 2007 #8


    Science Advisor
    Homework Helper

    This is the heart of the issue. I agree.
  10. Apr 6, 2007 #9


    Science Advisor

    I like this. See
    especially Fig. 1 that summarizes various possible answers to the main question of this thread.
  11. Apr 7, 2007 #10
    I have voted option three, because I feel that QFT already does a fairly good job, although it is far from perfect. Some tweaking and new phenomenology will be necessary to create a more precise QFT that WILL consistently and accurately describe the workings of every interaction between the initial and final states (with very minimal errors, of course). The sum of all applicable Feynman diagrams is a nice place to start, but I sometimes feel that the quark flavor transitions are a bit hairy in QFT, and I have often wondered whether the idea of composite quarks and leptons is viable and/or valid. Recent evidence against the null result for D0 - Dbar0 mixing is fairly ominous, and makes me think new physics is needed here.

    So, to sum it all up, I say yes to option three because I believe a more advanced and edited QFT will be able to describe the unseen inner actions between initial and final states...
  12. Apr 7, 2007 #11


    Science Advisor
    Homework Helper
    Gold Member

    You have always posted extremely interesting posts in the past, so I can't help using the opportunity to ask you what the situation is concerning D0 - Dbar0 mixing?! What is going on there?

  13. Apr 7, 2007 #12


    Science Advisor

    I'm equally in favor of 3 and 4 - that is, QFT may well do a good job, but who knows for sure? Now there is, here I go again, a large literature on this subject, going back to the early days of scattering theory, when the Weisskopf-Wigner resonance structure was developed. When non-relativistic QM is included there is close to an infinite amount of work, particularly in quantum optics, which deals with transitions taking place over finite times, and for which perturbation theory does not work well. The topic is huge.

    See Cohen-Tannoudji, Atom-Photon Interactions, great on resolvents; Mandel and Wolf's book on Quantum Optics, great on non-perturbative problems.

    Reilly Atkinson

  14. Apr 10, 2007 #13
    Take a look at this URL: http://arxiv.org/PS_cache/arxiv/pdf/0704/0704.0120v2.pdf

    BES-III has found evidence for D_0 - Dbar_0 mixing that is inconsistent with the null result by almost 4 standard deviations! This just came out on ArXiv on April 3rd.

    You should also take a look at:
    http://arxiv.org/PS_cache/arxiv/pdf/0704/0704.1000v1.pdf

    The Belle Collaboration just posted this on ArXiv on April 7th. Looks like their figures for the mixing parameters "x" and "y" also show that the null result no longer appears valid.
    Last edited by a moderator: May 2, 2017