
Data Analysis: Gamma ray attenuation

  1. Mar 1, 2009 #1
    Hello, I'm attempting to analyse the data from an experiment I performed in the lab, but I'm having some trouble understanding how to properly apply the statistical methods we learned to this specific problem.

    Essentially, the experiment consisted of placing a gamma-ray source near a detector and counting the number of photons detected in an interval of 2 minutes. We then placed increasing thicknesses of a given material between the source and the detector and observed how the number of photons counted (in 2 minutes) was affected.

    As such I have a table of thickness versus number of counts. Due to time constraints, we could only perform the experiment twice, so for each thickness I have two values for the number of photons counted. I figured that I could take the average and standard deviation of each pair and use those to plot a curve with error bars. The problem is that two of the pairs have exactly the same number of counts (this seemed insanely unlikely given that the counts are on the order of 500 and the other standard deviations are around 30 to 40), so for these two thicknesses I have a standard deviation of zero.

    Looking at the curve, it seemed apparent that an exponential would be a good fit, so that is what I am attempting. However, my curve-fitting program is running into serious problems because of the two values with "zero" error. I expect the error to increase with the number of counts, so I can't just set a constant error value. My trouble is how to handle these zero errors in a sound manner. Any help would be enormously useful because I'm at an utter loss as to what to do here!
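    A small sketch of the situation (Python, with invented thickness and count values, and scipy.optimize.curve_fit standing in for the curve-fitting program): two coincident repeat runs give a sample standard deviation of zero, whereas the √N counting errors discussed later in the thread stay nonzero and let the weighted fit go through.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative numbers only, not the actual lab data.
thickness = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # absorber thickness (cm)
run1 = np.array([520, 410, 330, 265, 210])        # counts, first run
run2 = np.array([505, 410, 318, 265, 198])        # counts, second run (two pairs coincide)

mean_counts = (run1 + run2) / 2
# Sample std of two values is |x1 - x2| / sqrt(2): zero wherever the runs agree,
# which is exactly what breaks a fit weighted by 1/sigma^2.
sample_std = np.abs(run1 - run2) / np.sqrt(2)

# Using the Poisson counting error sqrt(N) instead gives nonzero weights everywhere.
sigma = np.sqrt(mean_counts)

def attenuation(x, n0, mu):
    """Exponential attenuation model N(x) = N0 * exp(-mu * x)."""
    return n0 * np.exp(-mu * x)

popt, pcov = curve_fit(attenuation, thickness, mean_counts,
                       p0=(500, 0.5), sigma=sigma, absolute_sigma=True)
```

    With the invented data above the fit recovers a sensible initial count rate and attenuation coefficient, while a fit weighted by `sample_std` would divide by zero.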
  3. Mar 16, 2009 #2
    Did you ever resolve this issue?

    One might consider looking at the systematic error from the detector and then making a good-faith estimate of the error it induces. The likelihood of getting exactly the same number of counts twice is an interesting question nonetheless.
  4. Mar 16, 2009 #3
    In the end I assumed that the number of counts has an error equal to the square root of the number of counts (I heard this is what one does, for reasons I don't understand at all). Once I took that into account, it was a matter of finding the weighted mean along with its associated error. I still have no idea why we assume that the error on the number of counts from a radioactive source is the square root of the count. If anyone could explain, it'd be useful.

    As it was, I just blindly followed the statistical procedure I'd been taught, somewhat rushed, to get an answer that seemed reasonable. It annoys me that we never do a proper statistical treatment. We had an 8-week course on data analysis and error propagation in first year, but I hardly think 8 hours of lectures suffices. Ah well...
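    The weighted-mean step described above can be sketched like this (Python, with invented counts; the weights are the usual 1/σᵢ², with σᵢ = √Nᵢ as in the post):

```python
import math

def weighted_mean(counts):
    """Inverse-variance weighted mean of repeat counts, taking sigma_i = sqrt(N_i).

    Since sigma_i^2 = N_i for Poisson counts, the weights are w_i = 1 / N_i.
    The error on the weighted mean is 1 / sqrt(sum of weights).
    """
    weights = [1.0 / n for n in counts]
    total_w = sum(weights)
    mean = sum(w * n for w, n in zip(weights, counts)) / total_w
    error = 1.0 / math.sqrt(total_w)
    return mean, error

# Example pair of repeat runs (made-up values around the ~500 counts in the thread):
m, e = weighted_mean([505, 520])
```

    For two nearly equal counts this gives a mean close to the plain average and an error close to √(N/2), as one would expect from averaging two measurements.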
  5. Mar 16, 2009 #4
    That method comes from the Poisson distribution. If you expect events to occur at an average rate [itex]\lambda[/itex] and the counts in your experiment over a period [itex]t[/itex] are random, then the distribution is
    [itex]P(n) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}[/itex],
    where [itex]n[/itex] is the number counted and [itex]P(n)[/itex] is the probability of getting [itex]n[/itex] counts. You can compute
    [itex]\langle n\rangle = \sum_{n=0}^\infty n\,P(n), \qquad \langle n^2\rangle = \sum_{n=0}^\infty n^2\,P(n)[/itex]
    to find that the mean value of this distribution and its standard deviation are
    [itex]\mu = \lambda t, \qquad \sigma = \sqrt{\lambda t}.[/itex]

    When you do a counting experiment like this, your result is your best guess for what the mean actually is. Your uncertainty is the square root of your number of counts.
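    A quick numerical check of this (Python with NumPy; the value λt = 500 is chosen to match the ~500 counts mentioned in the thread):

```python
import numpy as np

# Simulate many counting experiments with expected count lambda*t = 500
# and check that the spread of the counts is close to sqrt(500) ≈ 22.4.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=500, size=100_000)

mean = counts.mean()
std = counts.std()
```

    So a single measurement of N counts is indeed best summarised as N ± √N.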
    Last edited: Mar 16, 2009