1. Nov 13, 2013

elegysix

Here's what I'm thinking. The sun is too bright to measure directly with our equipment. If I calibrate without a filter and then capture the reference spectrum both with and without the filter, I can model how the filter changes the spectrum. That way I can capture the solar spectrum through the filter and transform the data as if the filter had not been used.

What I mean to say is, if
$I_{r}(\lambda)=Y(\lambda)$
and
$I_{r+f}(\lambda)=G(\lambda)Y(\lambda)$
then is it valid to argue that
$I_{s}(\lambda)=\frac{I_{s+f}(\lambda)}{G(\lambda)}$

Or is it that $G(\lambda)$ itself depends on the intensity $I$?

where
$I_{r}$ is the irradiance of the reference
$I_{r+f}$ is the irradiance of the reference measured through a filter
$I_{s}$ is the irradiance of the sample
$I_{s+f}$ is the irradiance of the sample measured through the same filter
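In other words, the two reference measurements give the filter response as $G(\lambda)=I_{r+f}(\lambda)/I_{r}(\lambda)$, which can then be divided out of the filtered sample measurement. A minimal NumPy sketch of that idea (all array values here are made up for illustration, not real measurements):

```python
import numpy as np

# Hypothetical spectra sampled at a few wavelengths (illustrative values only).
I_r  = np.array([1.0, 2.0, 3.0, 2.0])    # reference, no filter
I_rf = np.array([0.5, 0.5, 1.5, 0.2])    # reference through filter
I_sf = np.array([0.25, 0.4, 1.2, 0.15])  # sample through filter

# Filter transmittance inferred from the two reference measurements:
# G(lambda) = I_{r+f}(lambda) / I_r(lambda)
G = I_rf / I_r

# Recover the unfiltered sample spectrum by dividing out G:
# I_s(lambda) = I_{s+f}(lambda) / G(lambda)
I_s = I_sf / G
```

This assumes the detector response is linear and that each array element corresponds to one wavelength bin, i.e. a per-wavelength multiplication rather than a convolution.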

2. Nov 14, 2013

Andy Resnick

As a first approximation, that approach is fine. The main shortcomings of this approach are 1) if G(λ) << 1 (the filter transmits very little light at some wavelengths) and/or 2) if your detector is a coarse-grained spectrometer (say, a color camera). Problem (1) introduces error by amplifying noise, and problem (2) means that the measured spectrum is a convolution, not a multiplication, so the 'division' step is actually a deconvolution.

The filter transmittance should not vary with intensity unless it has been specifically designed to do so (e.g. a saturable absorber).
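One way to guard against problem (1) is to flag wavelengths where G falls below some trust threshold instead of dividing by a near-zero value. A sketch of that idea, assuming the same NumPy array representation as above (the threshold and all data values are arbitrary, illustrative choices):

```python
import numpy as np

G    = np.array([0.5, 0.25, 0.01, 0.1])   # inferred transmittance (made up)
I_sf = np.array([0.25, 0.4, 0.02, 0.15])  # sample through filter (made up)

G_MIN = 0.05  # arbitrary threshold: below this, 1/G amplifies noise too much

# Divide only where the filter transmits enough light; mark the rest NaN.
valid = G >= G_MIN
I_s = np.full_like(I_sf, np.nan)
I_s[valid] = I_sf[valid] / G[valid]
```

Any relative noise in the filtered measurement is multiplied by 1/G on division, so a bin with G = 0.01 inflates its noise a hundredfold; masking such bins is usually safer than reporting them.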