For those of you familiar with the single-slit diffraction experiment (often discussed alongside Young's double-slit experiment), I am having trouble determining how far apart the minima are. The problem is as follows: a laser with wavelength λ = 650 nm shines through a single slit of width a = 0.04 mm onto a screen 10 m away. How far apart are the minima? I know that the minima satisfy sin θ = mλ/a for m = 1, 2, 3, … However, does a have to be in meters? Nanometers? Millimeters?
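To answer my own units question while working through it: the units of λ and a only need to be consistent with each other, since sin θ is dimensionless, but converting everything to meters is safest once the screen distance enters. A quick sketch of the arithmetic, using the small-angle approximation sin θ ≈ tan θ = y/L (valid here since a ≫ λ makes the angles tiny):

```python
# Single-slit minima: a * sin(theta_m) = m * lambda, for m = 1, 2, 3, ...
# Convert everything to meters so the units are consistent.
wavelength = 650e-9   # 650 nm
a = 0.04e-3           # 0.04 mm
L = 10.0              # distance to the screen, in meters

# Small-angle approximation: sin(theta) ~ tan(theta) = y / L,
# so the m-th minimum lands at y_m = m * wavelength * L / a.
y1 = 1 * wavelength * L / a   # first minimum
y2 = 2 * wavelength * L / a   # second minimum
spacing = y2 - y1             # adjacent minima are separated by wavelength * L / a

print(y1, y2, spacing)
```

With these numbers the spacing comes out to λL/a = (650e-9 m)(10 m)/(4e-5 m) ≈ 0.16 m, so adjacent minima on one side of the pattern are about 16 cm apart.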