
Determining background radiation in a decay spectrum

  1. Apr 10, 2012 #1
    I was calibrating some newly purchased software on our lab computers when I noticed that the decay spectrum on the screen did not look exactly like the spectra in published material. I attribute this to background radiation (correct me if I am wrong, please), so I decided the best way to measure the background rate would be to run the detector without a radioactive source.

    My question: how do I then use this data to remove the background from generated spectra? I initially thought I would run the detector without the source for the same amount of time as with the source, but that would mean running it for days at a time for weaker sources...which seems rather impractical. Could someone point me in the right direction, please?
     
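    A common way to handle the timing mismatch is to take a separate (possibly much shorter) background run and scale it by the ratio of live times before subtracting it channel by channel. Below is a minimal sketch of that idea, assuming both spectra share the same detector settings and channel binning and that the acquisition software reports live time; the function name subtract_background and all variable names are illustrative, not part of any particular software package.

        import numpy as np

        def subtract_background(source_counts, source_live_time, bg_counts, bg_live_time):
            # Per-channel background subtraction with live-time scaling.
            # Assumes both spectra were taken with the same detector, gain,
            # and channel binning.
            source_counts = np.asarray(source_counts, dtype=float)
            bg_counts = np.asarray(bg_counts, dtype=float)

            # Scale the background run to the source run's live time, so a
            # short background acquisition can correct a much longer source run.
            scale = source_live_time / bg_live_time
            net = source_counts - scale * bg_counts

            # Poisson counting uncertainty on the net counts:
            # sigma^2 = N_source + scale^2 * N_background
            sigma = np.sqrt(source_counts + scale**2 * bg_counts)
            return net, sigma

        # Example: a 24 h source run corrected with a 1 h background run.
        # net, err = subtract_background(src_hist, 86400.0, bg_hist, 3600.0)

    The longer the background run, the smaller the statistical noise it adds back into the net spectrum, but it does not need to match the source run's duration.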
  3. Apr 10, 2012 #2
    If you are looking at nuclear gammas (e.g., Cs137), the local environment (e.g., shielding) can affect the amplitude of the Compton edge and backscatter peaks.
     
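    For orientation, the standard textbook relations (not specific to this setup) put the backscatter peak and Compton edge for a photon of energy $E_\gamma$ at
    $$E_{\text{bs}} = \frac{E_\gamma}{1 + 2E_\gamma/m_e c^2}, \qquad E_{\text{edge}} = E_\gamma - E_{\text{bs}},$$
    so for the 662 keV line of Cs-137 these fall at roughly 184 keV and 477 keV.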