The question is as follows: A point source emits a spherical wave with λ = 500 nm. If an observer is far from the source and interacts with the light only over a small area, the local wave can be approximated as a plane wave. How far from the source must the observer be so that the phase of the wave deviates by less than 36° across an illuminated spot 3.9 cm in diameter? I honestly don't know where to start with this one. I have all my other homework done except for this problem. Please help!
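Not a full solution, but here is a sketch of one common way to set this problem up (an assumption on my part, not necessarily the method your course intends): compare the curved spherical wavefront to a flat plane across the spot. The extra path at the edge of the spot is the sagitta, Δ ≈ D²/(8R) for R ≫ D, and requiring the corresponding phase 2πΔ/λ to stay below 36° (i.e. Δ < λ/10) gives a minimum distance R.

```python
import math

# Sketch under the sagitta approximation (assumed approach, not confirmed
# by the problem statement): the wavefront bulge across a spot of diameter
# D at distance R from the source is
#     delta = sqrt(R**2 + (D/2)**2) - R  ≈  D**2 / (8*R)   for R >> D.
# Demanding a phase error 2*pi*delta/lam below the limit rearranges to
#     R > 2*pi*D**2 / (8*lam*phase_limit).

lam = 500e-9                     # wavelength in m
D = 0.039                        # spot diameter in m (3.9 cm)
phase_limit = math.radians(36)   # allowed phase deviation, 36° in radians

R_min = 2 * math.pi * D**2 / (8 * lam * phase_limit)
print(f"Minimum distance: {R_min:.1f} m")  # roughly 3.8 km
```

With 36° being exactly one tenth of a cycle, the condition reduces neatly to Δ < λ/10, which is why the numbers come out so cleanly; worth double-checking against whatever plane-wave criterion your textbook uses.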