Short version: how do I estimate the error in measured light intensity vs. wavelength for a given monochromator bandpass width?

Long version: I have a project where we will pass white light through liquids and then examine the spectral intensity of the transmitted light with a monochromator. Intensity is measured with a regular photodiode. The light source contributes about 0.02 V of noise, and the photodiode output range is roughly 0-10 V.

I measured the bandpass of the monochromator with a HeNe laser and got 1.1 nm for a 0.1 mm slit and 3.9 nm for a 0.5 mm slit. With the small slit the output signal peaks around 2.2 V; with the larger slit I get up to 4.9 V.

I'm concerned that the small slit won't give me enough resolution in the intensity values, but I'm also hesitant to measure bands much wider than 1 nm at any given point. For a given bandpass width, roughly how much error should I expect in my results? Thanks.
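For reference, here is the back-of-the-envelope estimate I've done so far, treating the 0.02 V noise as a signal-independent floor so the fractional error is just noise divided by peak signal (the function name and loop values are mine; I'm not sure this is the right error model, which is partly what I'm asking):

```python
# Rough fractional-error estimate from my measured numbers,
# assuming the ~0.02 V noise does not scale with signal level.
NOISE_V = 0.02

def fractional_error(signal_v, noise_v=NOISE_V):
    """Noise expressed as a fraction of the measured signal."""
    return noise_v / signal_v

# (slit width mm, measured bandpass nm, peak photodiode output V)
for slit_mm, bandpass_nm, peak_v in [(0.1, 1.1, 2.2), (0.5, 3.9, 4.9)]:
    err = fractional_error(peak_v)
    print(f"{slit_mm} mm slit ({bandpass_nm} nm bandpass): "
          f"noise is {err * 100:.2f}% of peak signal")
```

By this crude measure the narrow slit gives me roughly 0.9% noise at peak and the wide slit roughly 0.4%, but I don't know whether that is the right way to think about the error contributed by the bandpass width itself.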