# Observing wavelength at an angle

1. Jan 20, 2016

I recently came across an equation stating that $\lambda_{ob} = \frac{\lambda}{\cos\alpha}$, where $\alpha$ is the angle of the observer relative to the wave's direction of propagation. I guess I can kind of understand that a person perpendicular to the wavefront (i.e. $\alpha = 0$, so $\cos\alpha = 1$) would see the normal wavelength, but am just failing to understand what exactly it even means to observe the wavelength at an angle. How does standing perpendicular to the direction of propagation make it look like the wave has an infinite wavelength?

2. Jan 20, 2016

### Staff: Mentor

It does not, but if you look at the phase difference between different points of your "observation plane", they all have the same phase. This equation tells you how far apart maxima (or specific phases) are on your observation plane. I don't think it is useful to call this "wavelength", however.
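A quick way to see where the formula comes from (my own sketch, using the same symbols as above): a plane wave has phase $\varphi(x) = 2\pi x/\lambda$ along its propagation direction. A point a distance $s$ along an observation line tilted by $\alpha$ from that direction sits at $x = s\cos\alpha$, so $\varphi = \frac{2\pi s\cos\alpha}{\lambda}$, and successive maxima ($\Delta\varphi = 2\pi$) are separated by $\Delta s = \frac{\lambda}{\cos\alpha}$. As $\alpha \to 90°$, $\cos\alpha \to 0$ and that spacing diverges.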

3. Jan 20, 2016

Hmmm... do you mind expanding on that? How exactly do the maxima look infinitely far away from each other if you're simply looking at the wave perpendicular to its direction of propagation?

Also, isn't the distance separating maxima essentially what an observed wavelength is?

4. Jan 20, 2016

### Staff: Mentor

The distance along the observation plane is "infinite" (you see the same maximum everywhere at the same time). The distance along the travel direction (the real wavelength) does not change.
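This is easy to check numerically. Here is a small sketch (not from the thread; it assumes a snapshot of a plane wave $\cos(2\pi x/\lambda)$ and a straight observation line tilted by $\alpha$ from the propagation direction) that measures the spacing of maxima along the line:

```python
import numpy as np

lam = 1.0                   # wavelength of the plane wave (arbitrary units)
alpha = np.deg2rad(60.0)    # angle between observation line and propagation direction

# Sample points a distance s along the tilted observation line.
s = np.linspace(0, 10 * lam / np.cos(alpha), 100001)
x = s * np.cos(alpha)            # projection onto the propagation (x) axis
field = np.cos(2 * np.pi * x / lam)  # plane-wave amplitude at one instant

# Find local maxima of the field along the line and their average spacing.
interior = (field[1:-1] > field[:-2]) & (field[1:-1] > field[2:])
maxima = s[1:-1][interior]
spacing = np.diff(maxima).mean()

print(spacing, lam / np.cos(alpha))  # both ≈ 2.0 for alpha = 60°
```

For $\alpha = 60°$ the maxima along the line are $\lambda/\cos\alpha = 2\lambda$ apart, even though the real wavelength along the travel direction is still $\lambda$; as $\alpha$ approaches $90°$ the measured spacing grows without bound.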

5. Jan 20, 2016

Thinking of the wave as stationary and myself moving against the wavefront at the angle $\alpha$ really helped! Thank you!