A detector is used to count the number of gamma rays from a radioactive source. If the number of counts is 10000 in exactly 20 s, what is the error in the counting rate per second?
Homework Equations - none that I know of.
The Attempt at a Solution - In our syllabus we only covered the Geiger-Müller counter, which has a dead time, and from that we can calculate the error. However, no dead time is given here, so I have no idea how to approach this. Please help.
Options given: 5%, 22.4%, 44.7%, 220% (all absolute values).
Question source: TIFR GS 2010.
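For what it's worth, here is a sketch of the only approach I can think of: treating the counts as Poisson-distributed (an assumption on my part; the problem doesn't state the statistics), so that the standard deviation of N counts is sqrt(N) and the absolute error in the rate N/t is sqrt(N)/t:

```python
import math

# Data from the problem statement
counts = 10000   # total counts N
t = 20.0         # counting time in seconds

rate = counts / t                  # counting rate = 500 counts/s
# Assuming Poisson statistics: sigma_N = sqrt(N),
# so the absolute error in the rate is sqrt(N)/t
rate_err = math.sqrt(counts) / t   # = 100/20 = 5 counts/s
rel_err = rate_err / rate          # = 0.01, i.e. 1% relative error

print(rate, rate_err, rel_err)     # prints: 500.0 5.0 0.01
```

Under that assumption the absolute error would be 5 counts/s, but I'm not sure this is the intended method given the options listed.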