
Uncertainty in measurement (GRE question)

  1. Jul 20, 2011 #1
    This is actually a GRE problem; I'm working through them all for study purposes, but I don't understand how to do this one at all.

    1. The problem statement, all variables and given/known data
    A student makes 10 one-second measurements of the disintegration of a sample of long-lived radioactive isotope and obtains the following values.
    3, 0, 2, 1, 2, 4, 0, 1, 2, 5
    How long should the student count to establish the rate to an uncertainty of 1 percent?

    2. Relevant equations
    I have no idea, but I would guess that the standard deviation is involved:
    [tex]\sigma_{x}^{2}=\langle x^{2}\rangle-\langle x\rangle^{2}[/tex]

    3. The attempt at a solution
    I am unsure how time gets involved in this. Using the formula above, the variance comes out to 2.4 (so a standard deviation of about 1.5).
    Any help is appreciated.
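    (The sample statistics quoted above can be checked with a short plain-Python sketch; the data values are the ten counts from the problem statement:)

    ```python
    import math

    counts = [3, 0, 2, 1, 2, 4, 0, 1, 2, 5]  # ten one-second counts
    n = len(counts)
    mean = sum(counts) / n                          # <x>
    var = sum(x**2 for x in counts) / n - mean**2   # <x^2> - <x>^2
    print(mean, var, math.sqrt(var))  # 2.0, 2.4, ~1.549
    ```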
  3. Jul 21, 2011 #2
    Here's the Wikipedia article on the Poisson distribution: http://en.wikipedia.org/wiki/Poisson_distribution

    The Poisson distribution is the probability mass function for the number of events that occur during some interval when the average number of events per interval is a known constant. (The count will not always equal the average because the process is random; the Poisson distribution describes the spread.) For a radioactive isotope, an "event" is a decay. As the sample decays, the expected number of decays per second decreases, but I think we are meant to ignore that since the isotope is "long-lived," and presumably the sample is very large. The Poisson distribution has the useful property that its (ideal, not sample) standard deviation is the square root of its mean: [tex]\sigma=\sqrt{\mu}[/tex].
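    (That property can be checked numerically straight from the Poisson pmf, [tex]P(k)=\mu^{k}e^{-\mu}/k![/tex], summed over enough terms; a plain-Python sketch, with the mean set to 2 to match the data in this thread:)

    ```python
    import math

    mu = 2.0  # mean decays per second, from the sample average below
    # Poisson pmf, truncated at k = 99 (the tail beyond that is negligible)
    pmf = [mu**k * math.exp(-mu) / math.factorial(k) for k in range(100)]
    mean = sum(k * p for k, p in enumerate(pmf))
    var = sum(k**2 * p for k, p in enumerate(pmf)) - mean**2
    print(mean, var)  # both ~2.0, so sigma = sqrt(mean)
    ```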

    The average of the ten numbers you listed is 2, so we can take that to be the average number of decays per second. If you count over a longer interval t, the expected count grows as 2t and the standard deviation as the square root of 2t.

    This problem is asking you to find the time required for the standard deviation divided by the average to reach 0.01.
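    (Working that condition out, as my own sketch rather than part of the original post: for a count of expected size N = rate × time, the fractional uncertainty is [tex]\sqrt{N}/N = 1/\sqrt{N}[/tex], so you solve 1/sqrt(N) = 0.01 for N and divide by the rate:)

    ```python
    rate = 2.0     # counts per second, from the sample average
    target = 0.01  # desired fractional uncertainty (1 percent)
    # fractional uncertainty of a Poisson count N is sqrt(N)/N = 1/sqrt(N)
    N = (1 / target) ** 2  # counts needed
    t = N / rate           # seconds of counting
    print(N, t)  # 10000.0 counts, 5000.0 seconds
    ```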