velkyr
Homework Statement
"A source emits light with two monochromatic components of wave-lengths λ1 = 510.50 nm and λ2 = 510.90 nm. Using the Rayleigh criterion, find the minimum number of slits of a grating that must be illuminated by a beam from the source in order to resolve these components."
Homework Equations
[itex]\theta = \frac{1.22\lambda}{D}[/itex]
[itex]m\lambda = dsin\theta[/itex]
[itex]R = \frac{\lambda}{\Delta\lambda} = mN[/itex]
The Attempt at a Solution
First, I substituted λ1 and λ2 into the Rayleigh criterion and assumed they would have an equal aperture diameter, giving an equation of the form [itex]\frac{1.22\lambda_1}{\theta_1} = \frac{1.22\lambda_2}{\theta_2}[/itex]. Of course, this left two unknown values of θ. So I then tried the resolving-power formula with the two wavelengths and calculated R = λ/Δλ ≈ 1276.25. However, not knowing the order of the light being received, I couldn't use this to calculate N.
My current thinking is to assume m = 1 and use the diffraction-maximum condition to find θ in each case. However, if I were to then use those angles in the Rayleigh criterion to calculate D, I'm not sure what relevance the aperture diameter actually has to finding the number of slits... I would really appreciate a nudge in the right direction, because I'm running out of ideas. Thank you.
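For what it's worth, the arithmetic of the resolving-power route can be sketched numerically. This is only a check of the numbers under the assumption (not stated in the problem) that the grating relation R = λ/Δλ = mN applies and that we try a few orders m; it is not a full solution:

```python
import math

# Grating resolving-power check: R = lambda / d_lambda = m * N  =>  N = R / m.
lam1 = 510.50e-9          # m, first monochromatic component
lam2 = 510.90e-9          # m, second monochromatic component
d_lam = lam2 - lam1       # separation to be resolved, 0.40 nm

R = lam1 / d_lam          # chromatic resolving power (using lam1, as in the attempt)
print(f"R = {R:.2f}")     # about 1276.25

# The order m is not given; list the minimum whole number of slits for a few orders.
for m in (1, 2, 3):
    N = math.ceil(R / m)  # N must be an integer, so round up
    print(f"m = {m}: N >= {N}")
```

This reproduces the R ≈ 1276.25 from the attempt and shows why the answer depends on the assumed order m: for first order, roughly 1277 slits would be needed, and fewer at higher orders.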