This might be a bit of a silly question, but it's been driving me nuts for a couple of hours now. Background first: I'm reading Serway's Physics for Scientists & Engineers, Vol. 2, 8th Ed., and I'm currently in the optics chapters (light and whatnot). Section O3.3 (O3 is generally about light going through two openings and meeting up at some point on a surface) gives certain equations/formulas, but the problem is that it doesn't specify whether the "phase difference" φ it introduces is measured in radians or degrees, or how the formulas are meant to be used. Thus far I've been working with angles in degrees, so so far so good. Then I reached that part, and it bore enough similarities to the wave functions from previous chapters (it even points this out), so I figured φ would be measured in rad.

Here are the given equations:

φ = 2π·d·sin(θ)/λ

where φ is the phase difference, d is the distance between the two openings, and λ, as in the earlier chapters, is the wavelength of the light.

I = Imax·cos²(π·d·sin(θ)/λ) = Imax·cos²(φ/2)

where I is the intensity we can measure at the point where the two waves meet.

My problem is that when I moved on to the exercises, I couldn't make any sense of it. I assumed θ would be measured traditionally in degrees, and when the time came to compute φ, I'd convert my result into radians. As we know, π = 3.14 rad = 180 degrees. But all of my results are out of whack. Sometimes I get the correct answer by not converting degrees into rads; other times I convert them and get slightly different results. Can anyone who's read this explain what I'm missing? If you want, I can post an exercise as an example. Any help is appreciated!
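In case it helps show what I'm doing, here's a small Python sketch of how I've been computing φ and the intensity ratio. The slit separation, wavelength, and angle here are made-up example numbers (not from the book); the only point is where the degree-to-radian conversion happens:

```python
import math

# Made-up example values (not from an actual exercise):
d = 100e-6       # slit separation, meters
lam = 500e-9     # wavelength, meters
theta_deg = 0.1  # angle to the point on the screen, degrees

# math.sin expects radians, so I convert theta first
theta_rad = math.radians(theta_deg)

# phi = 2*pi*d*sin(theta)/lambda -- the 2*pi factor means the
# result comes out directly in radians (one wavelength of path
# difference = 2*pi rad of phase difference)
phi = 2 * math.pi * d * math.sin(theta_rad) / lam

# I/Imax = cos^2(phi/2); cos also expects radians here
intensity_ratio = math.cos(phi / 2) ** 2

print(phi, intensity_ratio)
```

So as I understand it, θ is an ordinary geometric angle (fine in degrees, as long as you convert before taking the sine), while φ comes out of the formula already in radians because of the 2π factor. Is that the right reading, or am I off?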