Hello! What is the difference between the sensitivity and the efficiency of a detector? In the detector books I found, they seem to be treated as two different concepts, but based on the descriptions I can't work out the difference. Efficiency seems to be the fraction of particles you detect out of the total number of incident particles, while sensitivity seems to indicate whether you can detect a given kind of particle at all. From this I understand that they are essentially the same thing, except that sensitivity is basically a yes/no (0 or 1) answer to "can you detect anything?", while efficiency is a number between 0 and 1 saying how much you actually see. It seems like efficiency implies sensitivity, i.e. zero efficiency means zero sensitivity. What am I missing? Thank you!
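To make my understanding concrete, here is a toy sketch of how I currently read the two terms (the function names and the threshold interpretation are my own invention, not definitions from the books):

```python
# Toy sketch of my current reading of "efficiency" vs "sensitivity".
# These definitions are my own interpretation, not textbook ones.

def efficiency(n_detected: int, n_incident: int) -> float:
    """Detection efficiency: fraction of incident particles registered, in [0, 1]."""
    return n_detected / n_incident

def is_sensitive(eff: float) -> bool:
    """My reading: a detector is 'sensitive' to a particle iff its efficiency is nonzero."""
    return eff > 0.0

print(efficiency(30, 100))   # 0.3: the detector sees 30% of the particles
print(is_sensitive(0.3))     # True
print(is_sensitive(0.0))     # False: zero efficiency would imply zero sensitivity
```

Under this reading, sensitivity carries no information that efficiency doesn't already contain, which is why I suspect I'm misunderstanding one of the definitions.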