< Mentor Note -- thread moved to HH from the technical physics forums, so no HH Template is shown >

Hi guys, I have a question I'm struggling with and was hoping you could help:

Monochromatic electromagnetic radiation with wavelength λ from a distant source passes through a slit. The diffraction pattern is observed on a screen 2.50 m from the slit. If the width of the central maximum is 6.00 mm, what is the slit width a if the wavelength is 500 nm (visible light)?

What I have done is use y = R(m + 1/2)λ / d, with:

y = 3.00 mm (half of the central-maximum width)
R = 2.50 m
m = 0
λ = 500 nm

Rearranging for the slit width:

d = R(m + 1/2)λ / y
d = (2.50 × (1/2) × 500×10^-9) / (3×10^-3)

This gives d = 2.08×10^-4 m, but that answer is wrong. What am I doing wrong? Thanks!
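In case it helps to check the arithmetic, here is a minimal sketch that just reproduces the calculation as I set it up (variable names are mine, not from the textbook):

```python
# Reproduce the calculation posted above (the formula itself may be the problem)
R = 2.50        # slit-to-screen distance, m
lam = 500e-9    # wavelength, m
y = 3.00e-3     # half of the central-maximum width, m
m = 0           # order used in the formula

# d = R(m + 1/2)*lambda / y, as written in the post
d = R * (m + 0.5) * lam / y
print(d)  # ≈ 2.08e-4 m, matching the value I obtained
```

So the arithmetic itself checks out; the issue must be with the formula or the values I plugged in.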