Say you use an LED and direct its light through a gas. You measure the light transmitted through the gas and the light entering the gas to obtain the transmission. Since the Beer–Lambert law is wavelength dependent and the LED is broadband across, for example, a CO2 band, how can you determine the transmission of this wavelength-integrated light, with the ultimate goal of determining the concentration of the species investigated?

The Beer–Lambert law states that I/I0 = exp(-kv*l) = exp(-sigma*N*l), where kv (the absorption coefficient) is wavelength dependent and sigma is the absorption cross section. You have calculated kv per wavelength, but since the light measured is broadband, how can you 'simulate' the transmission with the known kv?

I guess the problem is that the light contains many wavelengths (not just one), whereas the Beer–Lambert law applies to a single wavelength because the absorption coefficient is wavelength dependent. So how can you extract the concentration information in such an arrangement? The detector sees a passband of wavelengths, and each wavelength has its own extinction coefficient. This means the detector sees a summation of the effects of all the individual wavelengths within the passband. I hope this makes sense.
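To make the question concrete, here is a minimal numeric sketch of the forward model I have in mind: apply Beer–Lambert per wavelength, weight by the LED spectrum, integrate, and then invert the band-averaged transmission for N. All the numbers are made up for illustration (a Gaussian LED spectrum and a single Lorentzian line standing in for the real CO2 band), not real spectroscopic data:

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative (not physical) parameters
wl = np.linspace(4.1e-6, 4.5e-6, 4000)            # wavelength grid [m]
led = np.exp(-((wl - 4.3e-6) / 0.05e-6) ** 2)      # LED spectral shape S(lambda)
gamma = 0.002e-6                                   # line half-width [m]
sigma = 1e-22 * gamma**2 / ((wl - 4.26e-6)**2 + gamma**2)  # cross section [m^2]
L = 0.1                                            # path length [m]

def transmission(N):
    """Band-averaged transmission for number density N [m^-3]:
    T(N) = integral(S * exp(-sigma*N*L)) / integral(S)."""
    T_lambda = np.exp(-sigma * N * L)              # monochromatic Beer-Lambert
    return np.trapz(led * T_lambda, wl) / np.trapz(led, wl)

# Forward model: generate a 'measured' transmission for a known N ...
N_true = 1e22
T_meas = transmission(N_true)

# ... then invert it. T(N) decreases monotonically with N, so a
# bracketing root finder recovers the concentration.
N_est = brentq(lambda N: transmission(N) - T_meas, 1e18, 1e26)
```

Note that because of the spectral averaging, -ln(T_meas) is no longer strictly linear in N (the band "saturates" non-uniformly), which is why the inversion is done numerically rather than by simply dividing by sigma*L.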