Optical detector sensitivity at low and medium flux


Discussion Overview

The discussion centers on the sensitivity of optical detectors, particularly whether a detector can exhibit better sensitivity at medium flux levels than at low flux levels near the readout noise floor. Participants explore the implications of readout noise and the potential use of offset background signals to enhance detection capabilities.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Eric questions if an optical detector can have better sensitivity at medium flux than at low flux due to readout noise, suggesting that a detector might discriminate between very close power levels (100 mW and 100.1 mW) while failing to detect lower levels (0.1 mW).
  • One participant describes a scenario involving an amplifier and oscilloscope, noting that adding a "light bias" may not improve the signal-to-noise ratio, but could potentially enhance sensitivity if the detector operates on a steeper slope of its characteristic curve.
  • Another participant expresses uncertainty about whether the signal-to-noise ratio improves with increased light flux, referencing a specific case of a TV camera using a solar cell that benefited from light bias.
  • There is a discussion about the relationship between output electrical power and input optical flux, with one participant asserting that diode current is typically proportional to optical power, indicating a linear relationship.

Areas of Agreement / Disagreement

Participants express differing views on the effects of light bias on sensitivity and signal-to-noise ratios, and there is no consensus on whether sensitivity improves with increased light flux.

Contextual Notes

Some assumptions about the characteristics of optical detectors and the behavior of noise in relation to signal levels remain unresolved. The discussion includes varying interpretations of how sensitivity is defined and measured.

Laloum Eric
Dear all,

This is my first post and I'm not sure if this is the right place to ask, so don't hesitate to advise me if necessary.
I would like to know if it's possible for an optical detector (the complete system including sensor and front-end electronics) to have better sensitivity at medium flux than at low flux near the detector readout noise. Practically speaking, is it possible to have a detector capable of discriminating between 100 mW and 100.1 mW but unable to detect 0.1 mW, notably because of readout noise? If that were the case, then a way to detect a low-level signal below the readout noise would be to add an offset background signal (with the background not too high, so that the background noise stays below the detector readout noise).

Thank you.

Eric
 
Imagine the amplifier connected to an oscilloscope. In the absence of a light signal, suppose we see 10 mV of noise on the trace. Now turn on the weak light signal to be detected, and suppose the trace is raised by 10 mV. If we add some "light bias", the trace will be raised further, say by 1 volt, but will still be thickened by the same 10 mV of noise as before. So there is no advantage.
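To make this concrete, here is a minimal numerical sketch (not from the thread; all numbers and variable names are illustrative) of a linear detector with additive Gaussian readout noise. The DC "light bias" shifts both traces equally and leaves the detectability of the weak signal unchanged:

```python
# Sketch: a linear detector with additive Gaussian readout noise.
# A DC light bias raises the trace but does not change the SNR.
import numpy as np

rng = np.random.default_rng(0)

readout_noise_mV = 10.0   # RMS readout noise seen on the trace
signal_mV = 10.0          # weak signal to be detected
n = 100_000               # number of simulated samples

for bias_mV in (0.0, 1000.0):
    off = bias_mV + readout_noise_mV * rng.standard_normal(n)              # signal off
    on = bias_mV + signal_mV + readout_noise_mV * rng.standard_normal(n)   # signal on
    # Detectability = separation of the two traces / noise spread
    snr = (on.mean() - off.mean()) / off.std()
    print(f"bias = {bias_mV:7.1f} mV  ->  SNR = {snr:.2f}")

# Both cases print SNR ~ 1: the bias raises the trace by a constant,
# but the 10 mV noise thickening, and hence the detectability, is unchanged.
```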
But if the detector, as a typical diode, has a square-law characteristic, then adding light bias will raise the sensitivity of the detector by bringing the operating point onto a steeper slope. Whether the signal/noise ratio is improved by doing so I am uncertain. I did see a TV camera (using mechanical scan), utilising a solar cell as the pick-up device, in which light bias was used to good effect.
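A small sketch of the slope argument, assuming an idealised square-law response V = k·P² (the constant k and the bias values are illustrative, not from the thread): the small-signal response to a fixed modulation dP grows with the bias point, as the post describes, while the noise question remains separate.

```python
# Sketch: small-signal response of an idealized square-law detector
# V_out = k * P**2. The slope dV/dP = 2*k*P grows with the operating point,
# so a light bias increases the response to a small modulation dP.
k = 1.0    # arbitrary square-law constant
dP = 0.1   # small optical modulation (arbitrary units)

for P_bias in (0.0, 1.0, 10.0):
    response = k * (P_bias + dP) ** 2 - k * P_bias ** 2   # output change for dP
    slope = 2 * k * P_bias                                # analytic small-signal slope
    print(f"P_bias = {P_bias:5.1f}  response to dP: {response:.3f}  slope: {slope:.1f}")
```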
Incidentally, if you use an oscilloscope (or spectrum analyser) displaying dB on the Y axis, it gives the appearance that the noise is reduced as the signal gets bigger, but this is just an artifact of the decibel scale; the noise actually remains the same.
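A quick numeric check of this display effect (illustrative numbers, not from the thread): the noise voltage is held fixed while the signal grows, and its apparent size on a dB scale shrinks.

```python
# Sketch: a fixed 10 mV noise excursion looks smaller and smaller on a dB
# display as the signal grows, even though the noise voltage is constant.
import math

noise_mV = 10.0
for signal_mV in (20.0, 100.0, 1000.0):
    ripple_dB = 20 * math.log10((signal_mV + noise_mV) / signal_mV)
    print(f"signal = {signal_mV:7.1f} mV  ->  apparent noise ripple: {ripple_dB:.2f} dB")

# The ripple shrinks from ~3.5 dB to ~0.09 dB while the noise voltage
# has not changed at all.
```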
 
Thank you for your detailed answer. I guess the square law refers to the fact that the output electrical power varies as the square of the input optical flux (the output current being linear in the light flux). Still, I thought that detector sensitivity is expressed as current versus light flux. So can we really say that the sensitivity slope increases with increasing light flux?
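For reference, the standard photodiode relations (textbook forms, not spelled out in the thread; here R_lambda denotes the responsivity and R_L an assumed load resistor) reconcile the two pictures: the photocurrent is linear in the optical power, while the electrical power delivered to a load is quadratic in it.

```latex
% Photocurrent is linear in optical power; electrical power in a load
% resistor R_L is quadratic in it (standard textbook relations).
\[
  I = R_\lambda \, P_{\mathrm{opt}}, \qquad
  P_{\mathrm{elec}} = I^2 R_L = R_\lambda^2 \, P_{\mathrm{opt}}^2 \, R_L
\]
```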
 
I think the diode current is usually proportional to optical power, so it is linear. I am not sure why my camera example used light bias; it was a very old type of solar cell.
 
