1. The problem statement, all variables and given/known data
A laser is a light source that emits a diffraction-limited beam (like waves diffracting through a wide slit) of diameter 2 mm. Ignoring any scattering due to the earth's atmosphere, calculate how big a light spot would be produced on the surface of the moon, 240,000 miles away. Assume a wavelength of approximately 600 nm.

2. Relevant equations
d = 2 mm
L = 240,000 miles
λ = 600 nm
d sinΘ = λ

3. The attempt at a solution
I am using the small-angle approximation, so Θ ≈ λ/d from d sinΘ = λ.
Converting to consistent units: Θ = (6e-7 m)/(0.002 m) = 3e-4 rad.
The angle comes out dimensionless (in radians), so the units check out.
Now, to find how big the light spot is, since Θ is small, tanΘ ≈ Θ, so the spread to one side is x = L·Θ = 240,000 miles × 3e-4 = 72 miles.
One thing to watch: 72 miles is the distance from the beam axis to the first minimum on one side only. The beam spreads symmetrically, so the full spot diameter should be about 2x ≈ 144 miles (the initial 2 mm is negligible at this distance).
Looking good?
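As a quick numerical sanity check of the arithmetic above, here is a short sketch that evaluates Θ = λ/d and the resulting spread at the moon's distance (assuming the single-slit formula from the post; variable names are illustrative):

```python
# Diffraction-limited spread of a laser beam at the moon (sketch).
wavelength = 600e-9   # lambda, in metres (600 nm)
d = 2e-3              # beam diameter, in metres (2 mm)
L_miles = 240_000     # earth-moon distance, in miles

# Small-angle approximation: first minimum of single-slit pattern at
# theta = lambda / d (radians, dimensionless).
theta = wavelength / d
print(theta)  # 3e-4 rad

# For small theta, tan(theta) ~ theta, so the spread on one side is just L*theta.
half_width_miles = L_miles * theta
print(half_width_miles)  # 72 miles to the first minimum on one side

# The beam spreads symmetrically about the axis, so the full spot
# diameter is roughly twice the one-sided spread.
spot_diameter_miles = 2 * half_width_miles
print(spot_diameter_miles)  # ~144 miles
```

Working in metres for λ and d keeps the angle dimensionless, after which L can stay in miles since the answer is wanted in miles anyway.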