Spectrometer sensitivity (photons/count) at different wavelengths


Discussion Overview

The discussion focuses on the sensitivity of spectrometers at different wavelengths and how it affects the measurement of irradiance from various radiation sources. Participants explore calibration methods and the implications of responsivity curves for the reliability of relative irradiance measurements across different wavelengths.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions whether spectrometers can provide reliable relative irradiance measurements when comparing different wavelengths due to varying sensitivity (photons/count).
  • Another participant suggests that the resolution of this issue depends on the type of spectrometer and mentions the concept of a responsivity curve, particularly in the context of IR detectors.
  • It is noted that calibration can be performed against a source with a known broadband spectrum to correct for sensitivity differences across wavelengths.
  • A participant emphasizes the importance of comparing test samples directly to known samples for effective calibration, while also considering factors like optical setup and temperature.
  • Calibration is discussed as a method not only for converting counts to intensity values but also for addressing relative intensity issues caused by different sensitivities at various wavelengths.
  • Another participant mentions a different calibration method for wavelength accuracy that involves sources emitting at known frequencies, indicating that these methods vary based on specific requirements.

Areas of Agreement / Disagreement

Participants generally agree that calibration is essential for addressing sensitivity issues in spectrometers, but there is no consensus on a single calibration method, as approaches may vary depending on the type of spectrometer and specific experimental conditions.

Contextual Notes

Participants highlight the dependence on specific calibration methods and the characteristics of different sensors, such as bolometers and silicon photodetectors, which may behave differently in terms of wavelength sensitivity.

fog37
TL;DR: Spectrometer sensitivity at different wavelengths does not provide correct relative irradiance...
Hello Everyone,

I am trying to better understand how a spectrometer must be used to measure the wavelength content of the radiation from a specific source.
All spectrometers measure irradiance over some wavelength range (e.g., UV-VIS), but the sensitivity (photons/count) is not the same at all wavelengths. This means that even if the radiation contains the same energy at two different wavelengths ##\lambda_1## and ##\lambda_2##, the spectrometer will report a higher irradiance at the wavelength where fewer photons are needed to produce a count, even though the two irradiances are actually equal...

Does that mean that spectrometers cannot provide a reliable relative irradiance when comparing different wavelengths? How do we correct for that?
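
To make the issue concrete, here is a toy sketch (all numbers are invented for illustration) of how equal irradiances at two wavelengths produce unequal raw counts, and how dividing by the per-wavelength sensitivity restores the correct ratio:

[CODE=Python]
# Toy numbers only: equal true irradiance at 500 nm and 900 nm,
# but the system produces twice as many counts per unit irradiance at 500 nm.
true_irradiance = {500: 1.0, 900: 1.0}       # W/m^2 (hypothetical)
counts_per_irr  = {500: 1000.0, 900: 500.0}  # counts per (W/m^2) (hypothetical)

raw_counts = {wl: true_irradiance[wl] * counts_per_irr[wl] for wl in true_irradiance}
print(raw_counts)   # {500: 1000.0, 900: 500.0} -> 900 nm wrongly looks weaker

# Dividing by the calibrated per-wavelength sensitivity restores the true ratio:
corrected = {wl: raw_counts[wl] / counts_per_irr[wl] for wl in raw_counts}
print(corrected)    # {500: 1.0, 900: 1.0}
[/CODE]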

Thanks!
 
How you resolve this issue (called the responsivity curve in the language of IR detectors and photodiodes; probably called other things in other contexts) depends on what kind of spectrometer you are dealing with. For FTIR, you can calibrate the spectrum by taking a reference spectrum off a very non-dispersive material (gold in the IR, for example). This procedure also corrects for the spectrum of the broadband light source. There isn't a one-size-fits-all procedure.
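
A minimal sketch of that ratioing idea (synthetic arrays with invented shapes; only the cancellation matters):

[CODE=Python]
import numpy as np

# Toy wavenumber grid and a made-up instrument function S(nu)*R(nu)
# (source spectrum times detector responsivity):
nu = np.linspace(500.0, 4000.0, 200)                     # cm^-1
instrument = 1.0 + 0.5 * np.sin(nu / 500.0)              # invented S*R, never zero
band = 1.0 - 0.8 * np.exp(-((nu - 1700.0) / 150.0)**2)   # invented absorption band

reference = instrument        # single-beam scan of a flat reference (gold in the IR)
sample = instrument * band    # single-beam scan of the sample

# Ratioing cancels S*R identically, leaving only the sample's own response:
print(np.allclose(sample / reference, band))   # True
[/CODE]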
 
fog37 said:
This means that even if the radiation contains the same energy at two different wavelengths ##\lambda_1## and ##\lambda_2##, ...
It also assumes that the effective slit width is the same at both wavelengths.

You need to find a source that radiates a known broadband spectrum. Then you can calibrate the sensitivity of your system.
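
A minimal sketch of that calibration, assuming the lamp's certified spectral irradiance is available on the spectrometer's own wavelength grid (all curves invented):

[CODE=Python]
import numpy as np

wl = np.linspace(350.0, 800.0, 100)                      # nm (toy grid)
lamp_true = 1e-3 * (1.0 + (wl - 350.0) / 450.0)          # certified lamp irradiance (invented)
resp_true = 800.0 * np.exp(-((wl - 550.0) / 150.0)**2)   # unknown system responsivity (invented)

lamp_counts = lamp_true * resp_true       # what the spectrometer records for the lamp
responsivity = lamp_counts / lamp_true    # recovered counts per unit irradiance

# Any later measurement is corrected by dividing by this curve:
unknown_true = np.full_like(wl, 2e-3)     # a flat test source (invented)
unknown_counts = unknown_true * resp_true
print(np.allclose(unknown_counts / responsivity, unknown_true))   # True
[/CODE]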

Knowing the predictable characteristics of the sensor employed can resolve the problem. Some sensors, such as bolometers, register thermal energy almost independently of wavelength.
 
There are two levels of concern here, depending upon the system and your needs. The most direct calibration is to set up the system and compare the test sample directly to a known sample. Usually I run a known-source spectrum, then the test-source scan, then a known-source spectrum again. The happy news is that silicon photodetectors are very linear in response, so that is all you really need.
Sometimes you don't have the luxury of calibrating in situ at the time of the test. Then you need to calibrate beforehand and also know a priori how changes in the optical setup (calibration vs. test) may affect your result. Be aware that optics (slit width, focal number, etc.) and temperature can be important. It is easy to get lost in the minutiae.
Spectrometers are exceedingly clever and precise when well used.
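
A minimal sketch of that known/test/known sandwich (hypothetical numbers; averaging the bracketing scans guards against drift during the run):

[CODE=Python]
import numpy as np

# Hypothetical counts for a calibration standard scanned before and after the test:
known_before = np.array([ 980.0, 1510.0, 1990.0, 1480.0,  960.0])
known_after  = np.array([1020.0, 1490.0, 2010.0, 1520.0, 1040.0])
test_counts  = np.array([ 400.0,  900.0, 1600.0, 1200.0,  500.0])

known_true = np.array([1.0, 1.5, 2.0, 1.5, 1.0])   # certified irradiance of the standard

# Average the bracketing scans to reduce the effect of drift during the test,
# then form the per-wavelength sensitivity and correct the test scan:
resp = 0.5 * (known_before + known_after) / known_true   # counts per unit irradiance
print(test_counts / resp)   # estimated test-source irradiance, same units as known_true
[/CODE]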
 
I see. So calibration is the solution.

I thought that calibration would only convert the vertical axis, which measures counts, into correct intensity values in ##W/m^2##. But I guess calibration can also provide a correction factor that takes care of the different sensitivities at different wavelengths, which cause the relative-intensity issue I am describing...
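
In symbols (notation assumed here, not from the thread): if ##N(\lambda)## is the raw count spectrum and ##R(\lambda)## the calibrated responsivity in counts per unit irradiance, then

$$E(\lambda) = \frac{N(\lambda)}{R(\lambda)},$$

so a single calibration fixes both the absolute scale of the vertical axis and the wavelength-to-wavelength shape of the spectrum.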
 
And of course there is an entirely different method for calibrating the wavelength accuracy, which involves sources that emit at known frequencies. These again depend upon your exact requirements but are not particularly arcane or complicated. Wonderful instruments.
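
A minimal sketch of that wavelength calibration, assuming a handful of lamp lines have been identified on the detector (the pixel positions are invented; the Hg/Ar line wavelengths are standard values):

[CODE=Python]
import numpy as np

# Pixel positions where known lamp lines were observed (invented positions):
pixels = np.array([112.0, 388.0, 741.0, 1310.0])
# Corresponding reference wavelengths in nm (Hg 404.66, 435.83, 546.07; Ar 696.54):
lines_nm = np.array([404.66, 435.83, 546.07, 696.54])

# Fit a low-order polynomial mapping pixel index -> wavelength:
coeffs = np.polyfit(pixels, lines_nm, deg=2)
pixel_to_wavelength = np.poly1d(coeffs)
print(pixel_to_wavelength(600.0))   # wavelength assigned to pixel 600
[/CODE]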
 
