Simulate Gamma Rays from Radioactive Decay

SUMMARY

This discussion focuses on simulating gamma rays emitted from radioactive decay, specifically addressing the challenge of handling non-normalized intensity values. The user presents a scenario involving two gamma emissions with intensities of 50 and 60, questioning how to effectively simulate the emission probabilities. The consensus suggests using a random number generator to select between the gamma energies based on their respective intensities, rather than normalizing the values. A simple code snippet is provided to illustrate the implementation of this random selection process.

PREREQUISITES
  • Understanding of radioactive decay processes
  • Familiarity with gamma ray emission and intensity concepts
  • Basic programming skills, particularly in random number generation
  • Knowledge of conditional statements in programming
NEXT STEPS
  • Research methods for simulating radioactive decay in programming languages like Python or C++
  • Learn about Monte Carlo simulations for probabilistic modeling
  • Explore libraries for random number generation and statistical analysis
  • Investigate how to handle multiple emissions in decay simulations
USEFUL FOR

Researchers in nuclear physics, software developers working on simulation tools, and educators teaching concepts of radioactive decay and gamma emissions.

Marioqwe
Hello, I am trying to simulate the gammas from certain radioactive decays, but I am really puzzled as to how to approach this. The site I'm using as a reference lists the intensities of the different gammas corresponding to a specific decay.

The thing that confuses me is that, for example, some intensities do not add up to 100, which is understandable since more than one gamma may be emitted per decay. But then, how would one simulate this?

For example, let's say I have X -> Y + gamma, and let's say there are two different gammas, a and b, with intensities 50 and 60 respectively. The way I think about it is that out of 100 decays, 50% of the time I get a and 60% of the time I get b. But how does one tell a computer to do this?

It appears to me that normalizing the intensities wouldn't work, and so throwing a single random number between 0 and 1 won't work. Is the best approach to just draw a different random number for each intensity?
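A minimal sketch of that per-line approach in Python. The energy values are placeholders, and treating each listed intensity as an independent per-decay probability is an assumption for illustration:

```python
import random

# Hypothetical gamma lines: (energy in MeV, intensity as probability per decay).
# The energies here are placeholders, not values from the thread.
lines = [(0.511, 0.50),   # gamma a: intensity 50 per 100 decays
         (1.275, 0.60)]   # gamma b: intensity 60 per 100 decays

def simulate_decay():
    """Return the list of gamma energies emitted in one decay.

    Each line is sampled with its own random number, so a single
    decay can emit zero, one, or both gammas -- which is why the
    intensities need not sum to 100.
    """
    return [energy for energy, p in lines if random.random() < p]
```

Over many simulated decays, gamma a should then appear in about 50% of events and gamma b in about 60%, matching the listed intensities without any normalization.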
 
Marioqwe said:
It appears to me that normalizing the intensities wouldn't work and so throwing a single random number between 0 and 1 won't work. Is the best approach to just do a different random number for each intensity?

If your question is how to make a computer program randomly pick between two options, the simplest solution would be something like:

real :: gamma, energy1, energy2, r
energy1 = a   ! energy of gamma a
energy2 = b   ! energy of gamma b
call random_number(r)   ! r is uniform in [0,1)
if (r > 0.5) then
    gamma = energy1
else
    gamma = energy2
end if
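For comparison, the same 50/50 pick might look like this in Python (the energy values are placeholders, not data from the thread):

```python
import random

energy1 = 0.511  # placeholder energy for gamma a, in MeV
energy2 = 1.275  # placeholder energy for gamma b, in MeV

def pick_gamma():
    """Pick one of the two gamma energies with equal probability."""
    if random.random() > 0.5:
        return energy1
    return energy2
```

Note that this chooses each energy exactly half the time and always emits exactly one gamma per call. To honor intensities of 50 and 60 per 100 decays, where a decay can emit both gammas or neither, a separate random number per gamma line is the more faithful approach.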
 
