DM107
Hi,
I came across a question in an exam that I couldn't relate to any topic of physics I had studied.
It goes like this:
A detector is used to count the number of gamma rays emitted by a radioactive source. If the number of counts recorded in exactly 20 seconds is 10000, what is the error in the counting rate per second?
Can someone please let me know what concept is involved, so that I can try to solve it?
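(For anyone finding this later: the relevant concept is Poisson counting statistics. Radioactive decays are independent random events, so the standard deviation of N recorded counts is sqrt(N), and the error in the rate is sqrt(N)/t. Here that gives sqrt(10000)/20 = 100/20 = 5 counts per second. Below is a minimal Python sketch, my own illustration rather than anything from the exam, that simulates repeated 20-second measurements to check this; the true rate of 500 counts/s is an assumed value chosen to match the measured one.)

```python
import numpy as np

# Assumed setup: decays arrive as a Poisson process, so the count in a
# fixed interval is Poisson-distributed with mean true_rate * duration.
rng = np.random.default_rng(0)

true_rate = 500.0   # assumed true rate, counts per second
duration = 20.0     # counting interval from the problem, seconds
trials = 100_000    # number of simulated 20 s measurements

# Each trial draws one Poisson count with mean 500 * 20 = 10000.
counts = rng.poisson(true_rate * duration, size=trials)

# Convert each measurement to a counting rate per second.
rates = counts / duration

print(f"std of counts:       {counts.std():.1f}  (theory: sqrt(10000) = 100)")
print(f"std of rate per sec: {rates.std():.2f}  (theory: 100/20 = 5)")
```

Running this, the simulated spread of the counts comes out close to 100 and the spread of the per-second rate close to 5, matching the sqrt(N) rule.)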