1. The problem statement, all variables and given/known data

Light from a laboratory sodium lamp has two strong yellow components at 589.5923 nm and 588.9953 nm. How far apart in the first-order spectrum will these two lines be on a screen 1.00 m from a grating having 10,000 lines per centimeter?

2. Relevant equations

The grating equation: a sin(theta) = m * lambda

3. The attempt at a solution

Okay, so... I'm using Hecht's fourth edition of Optics and am struggling with question 33, reproduced above. I'm not sure whether I need anything besides the grating equation, but on inspection it seems to be the only one that fits the given information and what I'm trying to find. I might be wrong, though.

My plan is to find the angular separation: solve for theta with each of the two wavelengths, and then use trigonometry with the 1.00 m grating-to-screen distance to work out how far apart the lines land on the screen. Does that sound viable?

I did it and ended up with 36.13 degrees for the 589.5923 nm line and 36.09 degrees for the 588.9953 nm line. I took the difference of the tangents, multiplied by 1.00 m, and got 0.0011 m. Did I do this correctly, or should I have used an equation specific to spectroscopy, given that the question hinted about the "first-order spectrum"? I just took that to mean m = 1. Thanks for any help you can provide!
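In case it helps to check the arithmetic, here's a quick numerical sketch of the approach described above. It's just my own attempt written out in Python, assuming the grating spacing is a = 1 cm / 10,000 lines and that y = L tan(theta) gives the position on the screen:

```python
import math

# Grating: 10,000 lines per cm -> spacing a = 1 cm / 10,000 (assumed)
a = 1e-2 / 10_000        # grating spacing in metres (1.0e-6 m)
L = 1.00                 # grating-to-screen distance in metres
m = 1                    # "first-order spectrum" taken to mean m = 1

lam1 = 589.5923e-9       # wavelengths in metres
lam2 = 588.9953e-9

# Grating equation: a sin(theta) = m * lambda
theta1 = math.asin(m * lam1 / a)
theta2 = math.asin(m * lam2 / a)

# Screen position of each line: y = L * tan(theta)
dy = L * (math.tan(theta1) - math.tan(theta2))

print(math.degrees(theta1), math.degrees(theta2), dy)
# angles come out near 36.13 and 36.09 degrees; dy is roughly a millimetre
```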