1. The problem statement, all variables and given/known data

How far away can a human eye distinguish two car headlights 2.1 m apart? Consider only diffraction effects and assume an eye pupil diameter of 5.0 mm and a wavelength of 550 nm. What is the minimum angular separation an eye could resolve when viewing two stars, considering only diffraction effects?

2. Relevant equations

θ = (1.22 λ)/D, where D is the aperture (pupil) diameter.

3. The attempt at a solution

If θ = 2 sin⁻¹(0.5d/l), where d is the separation between the objects and l is the distance to them, then setting θ = (1.22 λ)/D with λ = 550 × 10⁻⁹ m, D = 0.005 m, and d = 2.1 m, and solving for l, yields l = 8.9658 × 10⁵ m. This is incorrect, though, and I'm not sure where I went wrong. Any help is appreciated, thanks.
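As a quick numerical check of the attempt above, here is a sketch using the small-angle approximation (for θ ≪ 1 rad, 2 sin⁻¹(0.5d/l) ≈ d/l, so l ≈ d/θ), keeping θ in radians throughout; the variable names are my own:

```python
import math

# Given values from the problem statement
wavelength = 550e-9   # m
pupil = 5.0e-3        # m, pupil diameter D
separation = 2.1      # m, headlight spacing d

# Rayleigh criterion for a circular aperture (result is in radians)
theta = 1.22 * wavelength / pupil

# Small-angle approximation: d/l ≈ theta, so the maximum distance is
distance = separation / theta

print(f"theta    = {theta:.3e} rad")
print(f"distance = {distance:.3e} m")
```

If the answer from this radians-only calculation differs from the attempt by a large factor, a degrees/radians mix-up in the sin⁻¹ step is a likely culprit.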