For my advanced physics lab course this semester, I conducted an experiment using a hyper-pure germanium (HPGe) detector to measure the energy of the gamma ray released upon the formation of deuterium (which I'll call DF below).
Essentially, I used a neutron source to bombard a hydrogen-rich target (I used both paraffin wax and water) to form deuterium, and then used the detector to record a spectrum, which I fit with a Gaussian peak to measure the centroid of the suspected DF peak.
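For reference, the peak fit looks roughly like this (a minimal sketch, not my actual script; the spectrum here is simulated placeholder data, and the window, channel numbers, and initial guesses are just illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_plus_line(x, amp, mu, sigma, slope, offset):
    """Gaussian peak sitting on a linear background."""
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + slope * x + offset

# Placeholder data: in the real analysis this is the HPGe spectrum restricted
# to a window of channels around the suspected DF peak.
rng = np.random.default_rng(0)
channels = np.arange(3550, 3650)
counts = rng.poisson(gauss_plus_line(channels, 500, 3600, 3.0, -0.1, 420))

# Rough initial guesses: amplitude, centroid, width, background slope/offset
p0 = [counts.max(), channels[np.argmax(counts)], 3.0, 0.0, counts.min()]
popt, pcov = curve_fit(gauss_plus_line, channels, counts.astype(float), p0=p0,
                       sigma=np.sqrt(np.maximum(counts, 1)), absolute_sigma=True)

centroid = popt[1]                  # peak centroid in channel units
centroid_err = np.sqrt(pcov[1, 1])  # 1-sigma statistical uncertainty from the fit
print(f"centroid = {centroid:.2f} +/- {centroid_err:.2f} channels")
```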
The detector was cooled to liquid-nitrogen temperature. It output pulses whose height depends on the energy absorbed by the crystal, and these pulses were sorted into bins (channels) according to their height. To associate energies with these channels, I needed to calibrate the detector using known radioactive sources.
I did a long run of the background room radiation and used several Bi-214 peaks in the background to linearly calibrate the detector in the approximate range of the DF energy (literature value 2224.57 keV).
My measured DF energy is 2221.7(7) keV for paraffin wax and 2222.2(7) keV for water. To obtain these values I used 4 calibration peaks, fit a line to their known energies as a function of channel number, and then plugged the measured channel of my DF peak into that line to get the energy. I also tried many other calibrations using different peaks and different numbers of peaks, and I consistently get a value around 2221-2222 keV, at least 4 standard deviations below the literature value.
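Concretely, the calibration step is essentially the following (again only a sketch: the channel centroids below are made up, and the Bi-214 energies are approximate, so substitute the actual fitted centroids and tabulated line energies):

```python
import numpy as np

# Placeholder calibration points: fitted centroids (channels) of four Bi-214
# background lines and their approximate literature energies in keV.
# These numbers are illustrative only, not my measured values.
cal_channels = np.array([2846.0, 2979.7, 3555.2, 3948.2])
cal_energies = np.array([1764.5, 1847.4, 2204.2, 2447.9])

# Linear calibration: energy = gain * channel + offset
gain, offset = np.polyfit(cal_channels, cal_energies, 1)

# Convert the fitted DF centroid (channel units) to an energy
df_channel = 3588.0  # placeholder for the fitted DF peak centroid
df_energy = gain * df_channel + offset
print(f"DF energy = {df_energy:.1f} keV  (meaningless with these placeholder inputs)")
```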
Does anyone have any idea why my measured value is consistently lower by about 2-3 keV? Is there some process I may be ignoring in my analysis? I do not know a lot about nuclear physics or spectroscopy, so I have no idea why this discrepancy exists.
Right now my best guess is a change in detector temperature between when I did my calibration and when I did the DF measurements, but honestly I am not sure.
Your help would be appreciated, as I'm writing a formal report on this and I would like to say something other than "I have no idea why this is 2-3 keV different".