determining background radiation in a decay spectrum

by physguy09
Tags: background, background radiation
Apr10-12, 10:28 AM
I was calibrating some newly purchased software on our lab computers when I noticed that the decay spectrum on the screen did not look quite like the ones in published material. I attribute this to background radiation (correct me if I'm wrong), so I decided the best way to measure the background rate would be to run the detector without a radioactive source present.

My question: how do I then use this data to remove the background from generated spectra? I initially thought I would run the detector without the source for the same amount of time as I had with the source, but for weaker sources that would mean running it for days at a time... which seems rather impractical. Could someone point me in the right direction, please?
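For what it's worth, since background counts in each channel scale roughly linearly with acquisition (live) time, the background run does not need to match the source run length: a shorter background run can be rescaled by the ratio of the two acquisition times before subtraction. A minimal sketch of that idea (the function name and the example counts are made up for illustration):

```python
import numpy as np

def subtract_background(source_counts, t_source, bg_counts, t_bg):
    """Subtract a time-scaled background spectrum from a source spectrum.

    Counts per channel grow linearly with acquisition time, so a short
    background run can be rescaled by t_source / t_bg instead of
    matching the full source run length.
    """
    scale = t_source / t_bg
    net = np.asarray(source_counts, float) - scale * np.asarray(bg_counts, float)
    # Counting statistics can push weak channels slightly negative; clip to zero.
    return np.clip(net, 0.0, None)

# Hypothetical example: 1-hour source run, 10-minute background run
source = np.array([120, 340, 95, 60])   # counts per channel, with source
bg = np.array([5, 10, 8, 4])            # counts in the same channels, no source
net = subtract_background(source, t_source=3600, bg_counts=bg, t_bg=600)
```

Note that rescaling a short background run inflates its statistical uncertainty per channel (the scaled error grows with the scale factor), so the background run still needs to be long enough that its scaled fluctuations are small compared with the peaks of interest.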
Bob S
Apr10-12, 02:55 PM
If you are looking at nuclear gammas (e.g., Cs137), the local environment (e.g., shielding) can affect the amplitude of the Compton edge and backscatter peaks.
