Consider a lightwave having a phase velocity of 3 x 10^8 m/s and a frequency of 6 x 10^14 Hz. What is the shortest distance along the wave between any two points that have a phase difference of 30 degrees? What phase shift occurs at a given point in 1 microsecond, and how many waves have passed by in that time?
velocity = frequency x wavelength
The Attempt at a Solution
I have calculated the wavelength by dividing the speed of light by the frequency, which gives 5 x 10^-7 m (500 nm), but I am stuck and don't know how to proceed. Any help would be amazing, thank you!
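The remaining steps follow from the same relation v = f x wavelength: a 30-degree phase difference corresponds to 30/360 of a wavelength, and the number of cycles passing a point in time t is f x t. A minimal sketch of the arithmetic (variable names are my own, not from the problem):

```python
# Sketch of the full calculation, assuming v = f * wavelength
v = 3e8        # phase velocity, m/s
f = 6e14       # frequency, Hz
lam = v / f    # wavelength: 5e-7 m (500 nm)

# Shortest distance for a 30-degree phase difference:
# 30 degrees is 30/360 = 1/12 of a full cycle, so 1/12 of a wavelength.
d = lam * (30 / 360)

# In t = 1 microsecond, the number of complete waves passing a point is f * t,
# and the total phase shift is that many full cycles (360 degrees each).
t = 1e-6
n_waves = f * t            # 6e8 waves
phase_shift_deg = 360 * n_waves

print(lam, d, n_waves)
```

Note that d works out to about 4.2 x 10^-8 m, and 6 x 10^8 waves pass the point in one microsecond.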