1. The problem statement, all variables and given/known data
A detector is used to count the number of gamma rays from a radioactive source. If the number of counts is 10000 in exactly 20 seconds, what is the error in the counting rate per second?

Options given: 5%, 22.4%, 44.7%, 220% (all absolute values). Question source: TIFR GS 2010.

2. Relevant equations
No idea.

3. The attempt at a solution
In our syllabus we only covered the Geiger-Muller counter, which has a dead time, and from that we can calculate the error. However, no dead time is given here, so I have no idea how to approach this. Please help.
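
One approach I have seen for counting experiments (assuming the counts follow Poisson statistics, which is my guess here, not something from our syllabus) is that the error on N counted events is sqrt(N), so the error on the rate would be sqrt(N)/t. A quick sketch of that arithmetic:

```python
import math

# Sketch assuming Poisson counting statistics (my assumption, not given
# in the problem): for N counted events, the standard deviation is sqrt(N).
N = 10000      # total counts
t = 20.0       # counting time in seconds

rate = N / t                # counting rate in counts per second
sigma_N = math.sqrt(N)      # Poisson error on the total count
sigma_rate = sigma_N / t    # error on the rate, in counts per second

print(rate, sigma_rate)     # 500.0 counts/s with an error of 5.0 counts/s
```

I am not sure whether this is the intended method, since it ignores dead-time effects entirely.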