Optical detector sensitivity at low and medium flux

  1. Jan 1, 2017 #1
    Dear all,

    This is my first post and I'm not sure if this is the right place to ask, so please don't hesitate to advise me if necessary.
    I would like to know whether it is possible for an optical detector (the complete system, including the sensor and the front-end electronics) to have better sensitivity at medium flux than at low flux near the detector readout noise. Practically speaking, is it possible to have a detector capable of discriminating between 100 mW and 100.1 mW, yet unable to detect 0.1 mW on its own, notably because of readout noise? If that were the case, then a solution for detecting a low-level signal below the readout noise would be to add an offset background signal (not too high a background, of course, so that the background noise stays below the detector readout noise).

    Thank you.

    Eric
     
  2. Jan 1, 2017 #2

    tech99

    Gold Member

    Imagine the amplifier connected to an oscilloscope. In the absence of a light signal, suppose we see 10 mV of noise on the trace. Now turn on the weak light signal to be detected, and suppose the trace is raised by 10 mV. If we add some "light bias", the trace will be raised further, say by 1 volt, but will still be thickened by the same 10 mV of noise as before. So there is no advantage.
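    To put rough numbers on that picture, here is a minimal sketch (hypothetical values, assuming a linear detector and purely additive readout noise; Python/NumPy is used only for illustration):

    Code (Python):
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    read_noise = 0.010 * rng.standard_normal(n)   # 10 mV rms readout noise
    signal = 0.010                                 # weak signal raises the trace by 10 mV
    bias = 1.0                                     # optional "light bias" offset of 1 V

    # Traces without and with the light bias (linear detector: contributions just add)
    trace_no_bias   = signal + read_noise
    trace_with_bias = bias + signal + read_noise

    # SNR = mean shift produced by the signal / rms noise; the bias drops out
    snr_no_bias   = trace_no_bias.mean() / trace_no_bias.std()
    snr_with_bias = (trace_with_bias.mean() - bias) / trace_with_bias.std()
    print(snr_no_bias, snr_with_bias)   # essentially identical, about 1 in both cases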
    But if the detector, like a typical diode, has a square-law characteristic, then adding light bias will raise the sensitivity of the detector by bringing the operating point onto a steeper part of the curve. Whether the signal-to-noise ratio is improved by doing so, I am uncertain. I did see a TV camera (using mechanical scanning) that used a solar cell as the pick-up device, in which light bias was used to good effect.
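    To make the square-law point concrete, a small sketch (purely illustrative: it assumes an output proportional to the square of the optical power, with an arbitrary constant k):

    Code (Python):
    # Square-law detector: out = k * P**2, so the small-signal slope d(out)/dP = 2*k*P
    # grows with the operating point; a light bias P_bias makes the same small wiggle dP
    # produce a much larger output change. (Whether the SNR improves is a separate question.)
    k = 1.0           # arbitrary detector constant (hypothetical)
    dP = 0.1e-3       # 0.1 mW wiggle
    P_bias = 100e-3   # 100 mW light bias

    change_no_bias   = k * ((0 + dP) ** 2 - 0 ** 2)              # ~1e-8
    change_with_bias = k * ((P_bias + dP) ** 2 - P_bias ** 2)    # ~2e-5, roughly 2000x larger
    print(change_no_bias, change_with_bias)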
    Incidentally, if you use an oscilloscope (or spectrum analyser) displaying dB on the Y axis, it gives the appearance that the noise is reduced as the signal gets bigger, but this is just an artifact of the decibel scale; the noise actually remains the same.
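    A quick illustration of that dB-display effect (hypothetical numbers; the absolute noise ripple is 10 mV in both cases):

    Code (Python):
    import numpy as np

    noise = 0.010                  # 10 mV of noise, the same in both cases
    for signal in (0.020, 1.0):    # a 20 mV signal and a 1 V signal
        ripple_db = 20 * np.log10((signal + noise) / (signal - noise))
        print(f"{signal * 1e3:6.0f} mV signal -> ripple spans about {ripple_db:.2f} dB")
    # On the dB axis the ripple looks far smaller on the big signal,
    # even though the noise voltage has not changed.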
     
  3. Jan 2, 2017 #3
    Thank you for your detailed answer. I guess the square law relates to the fact that the output electrical power varies as the square of the input optical flux (the output current being linear in the light flux). Still, I thought that detector sensitivity is expressed as current versus light flux. So can we really say that the sensitivity slope increases with increasing light flux?
     
  4. Jan 2, 2017 #4

    tech99

    Gold Member

    I think the diode current is usually proportional to optical power, so it is linear. I am not sure why my camera example used light bias - it was a very old type of solar cell.
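    As a sketch of the linear case (hypothetical responsivity value; the point is only that for i = R * P the small-signal slope di/dP = R is the same at every operating point, so a light bias adds offset but not sensitivity):

    Code (Python):
    R = 0.5           # responsivity in A/W (hypothetical value)
    dP = 0.1e-3       # 0.1 mW wiggle
    for P_bias in (0.0, 100e-3):
        di = R * (P_bias + dP) - R * P_bias
        print(f"bias {P_bias * 1e3:5.1f} mW -> current change {di * 1e6:.1f} uA")
    # Both cases give the same 50 uA change: the bias shifts the output but not the slope.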
     