Sensitivity vs efficiency

  • Thread starter kelly0303
  • #1
Hello! What is the difference between the sensitivity and the efficiency of a detector? In the detector books I found, they seem to be treated as two different concepts, but based on the descriptions I can't understand the difference. Efficiency seems to be how many particles you detect out of the total number of particles, while sensitivity seems to show whether you can detect any particle at all. From this I understand that they are the same thing, but sensitivity is basically a 0 or 1 saying whether you can detect anything, while efficiency is a number between 0 and 1 saying how much you actually see. It seems like efficiency implies sensitivity, i.e. zero efficiency means zero sensitivity. What am I missing? Thank you!
 

Answers and Replies

  • #2
gleem
Science Advisor
Education Advisor
Efficiency is the fraction of particles entering the detector that are detected. Sensitivity can be defined as the ratio of the count rate to a unit flux of particles (particles/sec/cm²). I am only familiar with one detector for which sensitivity is used, and that is a short BF3 neutron detector used for thermal neutron detection.
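To make the two definitions above concrete, here is a toy sketch in Python. All the numbers (flux, detector area, detected fraction) are hypothetical, chosen only to show that sensitivity in this sense works out to effective area times efficiency:

```python
# Toy illustration of the two definitions above. All numbers are made up.

particles_entering = 10_000   # particles traversing the detector
particles_detected = 2_500    # particles that produce a recorded count

# Efficiency: fraction of entering particles that are detected (dimensionless).
efficiency = particles_detected / particles_entering

# Sensitivity: count rate per unit incident flux.
flux = 100.0   # particles / s / cm^2 (assumed uniform beam)
area = 10.0    # cm^2, assumed detector face area

count_rate = flux * area * efficiency   # counts / s
sensitivity = count_rate / flux         # counts/s per (particle/s/cm^2)

print(efficiency)    # 0.25
print(sensitivity)   # 2.5, i.e. area * efficiency
```

Note the flux cancels: the sensitivity defined this way is just the detector's effective area (geometric area times efficiency), which is why it carries units while efficiency is a pure number.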
 
  • #3
I largely see sensitivity used in the context of physics analyses: how common must a physical process be for us to expect to see, e.g., 1 event, or how large must it be for us to see a signal with a significance of 5 sigma, and so on.

Efficiency is much simpler: we have an event (or even a single particle); how likely are we to find it?
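A small sketch of this analysis-level notion of sensitivity, with hypothetical numbers: given an expected background and a per-event detection efficiency, how many signal events must be produced so that a simple S/√B significance estimate reaches 5 sigma?

```python
import math

# Hypothetical numbers: expected background and the probability of
# reconstructing a signal event (the "efficiency" in the reply above).
background = 400.0   # expected background events (assumed)
efficiency = 0.8     # chance to find a given signal event (assumed)

# Require S_detected / sqrt(B) >= 5 for a 5-sigma signal.
s_detected_min = 5.0 * math.sqrt(background)   # detected signal events needed
s_produced_min = s_detected_min / efficiency   # produced signal events needed

print(s_detected_min)   # 100.0
print(s_produced_min)   # 125.0
```

The point of the sketch: efficiency is one ingredient of the sensitivity, but the sensitivity also depends on the background and on how much data you have, so the two concepts are not interchangeable.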
 
