
Relativistic Muon Lifetimes

  1. Dec 12, 2009 #1
    Given the lifetime of a muon as 2.197 microseconds, a rest mass of 105.65 MeV, and a total particle energy of 10 GeV, I need to calculate how far the particle will travel, as measured in the lab (Earth) rest frame, before it decays.

    2. Relevant equations

    Attempts/Understanding Thus Far

    The equations aren't complicated, but I can't quite make them mesh with my understanding. If deltaT' is the moving time, dividing it by gamma to get the stationary period will always yield a shorter amount of time passing for the stationary period; isn't this the opposite of what I want?

    And, in order to figure out the total distance traveled, as measured in the rest frame, is it enough to compute v*t' = L', then multiply by gamma to get the rest-frame distance? Do I need to convert the time as well? Or would this lead to too many factors of gamma?
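    For concreteness, the Lorentz factor implied by the given numbers can be computed directly from the total energy and rest mass-energy (a minimal sketch; both quantities in MeV):

```python
# Lorentz factor from total energy: gamma = E_total / (m c^2)
E_total = 10_000.0   # total muon energy, MeV (10 GeV)
m_rest = 105.65      # muon rest mass-energy, MeV
gamma = E_total / m_rest
print(gamma)  # ~94.65
```

    With gamma in hand, the only remaining question is which time (proper or dilated) to multiply by v.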
  3. Dec 12, 2009 #2


    Homework Helper

    deltaT is the proper time: the duration of an event as measured in the reference frame in which the event occurs at a single location. Here, deltaT would be 2.197 microseconds, since that's how long the muon "thinks" it lives. The dilated time deltaT' = gamma*deltaT is what the lab observer measures.

    After getting deltaT' and figuring out v, distance is just v*deltaT'. (In fact, distance is always v*deltaT' in any reference frame as long as v and deltaT' are both measured in said frame.)
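    Putting the prescription above into numbers (a sketch, assuming the lab frame is the "rest frame" the problem asks about):

```python
import math

tau = 2.197e-6             # proper lifetime of the muon, seconds
gamma = 10_000.0 / 105.65  # total energy / rest mass-energy (both in MeV)
c = 2.998e8                # speed of light, m/s

t_lab = gamma * tau                     # dilated lifetime measured in the lab
beta = math.sqrt(1.0 - 1.0 / gamma**2)  # v/c recovered from gamma
distance = beta * c * t_lab             # lab-frame distance = v * deltaT'
print(distance)  # ~6.2e4 m, i.e. about 62 km
```

    Note there is only one factor of gamma: it enters once when dilating the lifetime, and v and deltaT' are then both lab-frame quantities.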
  4. Dec 13, 2009 #3
    But if deltaT' is the time frame of the muon, I must divide it by gamma to get the amount of time it'll take in the earth's reference frame, right?

    But that'll yield a smaller number. So the particle would seem to decay quicker, and thus its clock would be running fast according to the earth, right? I haven't taken Relativity yet; this is for a nuclear physics class, so I'm just running off of my own research over the years. It seems like the equation makes the wrong clock slow down, and I can't see (though I'm sure I am wrong) where I'm wrong in my understanding.

    If deltaT' is 2.19 microseconds, then dividing gives deltaT = 0.031 microseconds (gamma = 70.66); but shouldn't it be that if 2.19 microseconds of deltaT' pass at 0.9999c, then the other observer measures the same event as lasting about 154 microseconds?
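    As a numerical check of the 0.9999c example above (a sketch; these are the poster's illustrative numbers, not the assigned problem's 10 GeV figures):

```python
import math

beta = 0.9999                            # v/c in the example
gamma = 1.0 / math.sqrt(1.0 - beta**2)   # Lorentz factor, ~70.7
tau = 2.197                              # proper lifetime, microseconds

t_lab = gamma * tau  # what the stationary (Earth) observer measures
print(gamma, t_lab)  # ~70.71, ~155 microseconds
```

    So the stationary observer does measure the longer interval; the longer time comes from multiplying the proper lifetime by gamma, not dividing.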