I am stuck on a problem that is driving me crazy. Assuming the limits of visible light are 430 nm - 680 nm, design a grating that will spread the first-order spectrum through an angular range of 20 degrees.

I have tried a couple of different ways to solve this; let me step through my thought process. I start from the grating equation d*sin(theta) = n*lambda, where d is the distance between slits and n is the order of the maximum. Since we are looking at the first order, n = 1 and it drops out. Noting that theta changes as lambda changes, I first tried taking the derivative of the equation. I found this method in the textbook, but it states that it only applies when the change in lambda is much less than lambda, which is not my case. I tried it anyway and got a final answer of d = 835.4 nm. When I back-substitute this into the original equation, calculate the angle for each wavelength, and take the difference, it turns out to be 23.5 degrees. Close, but not 20.

Abandoning this method, I then tried writing two equations for the angles of red and violet light, theta_red = arcsin(lambda_red / d) and likewise for violet, and set theta_red - theta_violet = 20 degrees. But I end up with d = (250 nm) / sin(20 degrees) = 730.9 nm, and when I substitute that into the two equations to find each angle, the difference of the angles comes out to about 32 degrees. Even further off the mark than my first try.

I know I skipped a lot of details here, but there is just too much for me to put down. I feel like I am just not attacking this in the correct way. Can someone put me on track? If you need more detail about a particular approach I tried, let me know and I will post it. Thanks.
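In case it helps, here is the quick check I ran to back-substitute each candidate spacing. It is just my own sketch (the function name and the 430/680 nm endpoints are my setup, not from the textbook): it inverts d*sin(theta) = lambda at each end of the visible range and reports the first-order angular spread.

```python
import math

def spread_deg(d_nm, lam_violet=430.0, lam_red=680.0):
    """First-order angular spread in degrees for slit spacing d_nm,
    using theta = arcsin(lambda / d) from d*sin(theta) = lambda."""
    theta_v = math.degrees(math.asin(lam_violet / d_nm))
    theta_r = math.degrees(math.asin(lam_red / d_nm))
    return theta_r - theta_v

# Attempt 1 (derivative method): d = 835.4 nm -> about 23.5 degrees
print(spread_deg(835.4))
# Attempt 2 (arcsin difference): d = 730.9 nm -> about 32.4 degrees
print(spread_deg(730.9))
```

Neither spacing gives the 20-degree spread the problem asks for, which is what makes me think my setup is wrong rather than my arithmetic.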