
Simulate Gamma Rays from Radioactive Decay

  1. Aug 9, 2012 #1
    Hello, I am trying to simulate the gammas from certain radioactive decays, but I am really puzzled as to how to approach this. The site I'm using as a reference lists the intensities of the different gammas corresponding to a specific decay.

    What confuses me is that, for example, the intensities do not always add up to 100. That is understandable, since more than one gamma can be emitted per decay, but then how would one simulate it?

    For example, let's say I have X -> Y + gamma, and let's say there are two different gammas, a and b, with intensities 50 and 60 respectively. The way I think about it is that out of 100 decays, I get a 50% of the time and b 60% of the time. But how does one tell a computer to do this?

    It appears to me that normalizing the intensities wouldn't work, so throwing a single random number between 0 and 1 won't work either. Is the best approach just to draw a different random number for each intensity?
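
    The independent-random-number approach described above can be sketched as follows. This is a minimal Python sketch, not a definitive implementation; the line names and the 50%/60% intensities are the made-up values from the question, not real nuclear data. Each gamma line is treated as an independent Bernoulli trial per decay, which is why the intensities are free to sum to more than 100%.

    ```python
    import random

    # Hypothetical gamma lines: name -> absolute intensity
    # (fraction of decays in which that gamma is emitted).
    lines = {"a": 0.50, "b": 0.60}

    def simulate_decay(lines, rng=random.random):
        """Return the gamma lines emitted in one decay.

        Each line is an independent Bernoulli trial: line k is emitted
        with probability equal to its absolute intensity, so a single
        decay may emit zero, one, or several gammas.
        """
        return [name for name, p in lines.items() if rng() < p]

    # Sanity check: over many decays, each line's emission rate should
    # approach its intensity, even though the intensities sum to > 1.
    n = 100_000
    counts = {name: 0 for name in lines}
    for _ in range(n):
        for name in simulate_decay(lines):
            counts[name] += 1
    ```

    Note that this sketch ignores any correlations between the gammas (e.g. a cascade where b is only emitted after a); if the decay scheme specifies such branching, the draws would have to follow the level scheme rather than be fully independent.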
  3. Aug 9, 2012 #2



    If your question is how to make a computer program randomly pick between two options, the simplest solution would be something like:

    real :: gamma, energy1, energy2, r
    call random_number(r)          ! r is uniform in [0,1)
    gamma = energy2
    if (r > 0.5) gamma = energy1
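
    The same threshold-on-a-random-number pattern in Python, for comparison. The energy values here are hypothetical placeholders, not taken from any real decay scheme.

    ```python
    import random

    # Pick one of two gamma energies with equal probability.
    # The energies are placeholder values for illustration.
    energy1, energy2 = 100.0, 200.0   # hypothetical energies, keV

    i = random.random()               # uniform in [0, 1)
    gamma = energy1 if i > 0.5 else energy2
    ```

    For the non-exclusive intensities in the original question, one such comparison per gamma line, each with its own random number and its own intensity as the threshold, reproduces the listed emission rates.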