Gamma ray spectroscopy: Germanium detector efficiency calibration.

quantumlolz
Hi all,

Got a bit of a problem with a lab experiment at uni (I'm not sure if this is the right place to post this, mods feel free to move it if necessary!)

Anyway: We're trying to get a plot of efficiency against energy for a planar germanium detector. We've got spectra for different sources at a fixed distance from the detector.

The main question really is: how do we calculate efficiency? I've got Gilmore's "Practical Gamma Ray Spectroscopy" open in front of me at the efficiency calibration section. The only definition that seems to treat the source-detector distance as important (the lab script specifies that we take spectra at a particular source/detector distance...) is the absolute full-energy peak efficiency. The equation given for this is:

efficiency = full-energy peak count rate / (source activity × probability of emission of the particular gamma ray being measured)

(I think this is the one we're meant to be using - but how does one then calculate the probability??)
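To make the formula concrete, here's a minimal sketch in Python. The count, time, and activity numbers are made up; the emission probability 0.851 is the tabulated value for the 661.7 keV line of Cs-137 (that "probability" is just the branching fraction you look up in a nuclear data table). The decay-correction helper is there because the activity quoted on a source is usually from its calibration date, not the day of the measurement.

```python
import math

def decay_corrected_activity(a0_bq, elapsed_s, half_life_s):
    """Activity now, given the calibrated activity a0_bq measured elapsed_s ago."""
    return a0_bq * math.exp(-math.log(2) * elapsed_s / half_life_s)

def fep_efficiency(net_peak_counts, live_time_s, activity_bq, p_gamma):
    """Absolute full-energy peak efficiency at one gamma energy.

    net_peak_counts -- background-subtracted counts in the full-energy peak
    live_time_s     -- spectrum live time in seconds
    activity_bq     -- source activity (decays/s) at measurement time
    p_gamma         -- emission probability of the line (tabulated)
    """
    count_rate = net_peak_counts / live_time_s   # counts per second in the peak
    emission_rate = activity_bq * p_gamma        # gammas of this energy emitted per second
    return count_rate / emission_rate            # dimensionless efficiency

# Illustrative numbers only (the activity and counts are hypothetical):
activity = decay_corrected_activity(a0_bq=37_000, elapsed_s=5 * 365.25 * 86400,
                                    half_life_s=30.08 * 365.25 * 86400)  # Cs-137, t1/2 ~ 30.08 y
eff = fep_efficiency(net_peak_counts=52_000, live_time_s=600,
                     activity_bq=activity, p_gamma=0.851)
print(f"efficiency at 661.7 keV: {eff:.4f}")
```

Repeating this for each strong line of each source gives the efficiency-vs-energy points for the plot.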

I'll be massively grateful to anyone who can help, because I'm really pretty confused...
 
The first two lines of Bob S's #2 are basically the answer you need.

@Bob S: I don't think they need the x-ray attenuation coefficients for germanium, since they're trying to measure the efficiency empirically, not predict it theoretically. They also don't need to calculate the solid angle (assuming the source sits at the same distance from the detector as it will in the actual experiment). Nearby scattering materials shouldn't matter either, since they're measuring the photopeak efficiency: a photon that scatters on the way in won't produce a photopeak event in the Ge, just a count at some lower (Compton-scattered) energy.
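Once efficiencies have been measured at several energies, a common empirical way to turn them into the efficiency-vs-energy curve is to fit a low-order polynomial to ln(efficiency) against ln(energy). A minimal sketch with placeholder data points (substitute your own measured values):

```python
import numpy as np

# Placeholder calibration points: energies in keV and measured absolute
# full-energy peak efficiencies -- replace with your own results.
energies_kev = np.array([59.5, 122.1, 661.7, 1173.2, 1332.5])
efficiencies = np.array([0.012, 0.015, 0.0040, 0.0024, 0.0021])

# Fit ln(eff) as a quadratic in ln(E) -- a common empirical calibration form.
coeffs = np.polyfit(np.log(energies_kev), np.log(efficiencies), deg=2)

def efficiency_at(energy_kev):
    """Interpolated efficiency from the fitted curve."""
    return np.exp(np.polyval(coeffs, np.log(energy_kev)))

print(f"fitted efficiency at 500 keV: {efficiency_at(500.0):.4f}")
```

Keep in mind the fit is only trustworthy within the energy range spanned by the calibration points.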
 
