1. The problem statement, all variables and given/known data

Consider a light wave having a phase velocity of 3 x 10^8 m/s and a frequency of 6 x 10^14 Hz. What is the shortest distance along the wave between any two points that have a phase difference of 30 degrees? What phase shift occurs at a given point in 1 microsecond, and how many waves have passed by in that time?

2. Relevant equations

velocity = frequency x wavelength

3. The attempt at a solution

I calculated the wavelength by dividing the phase velocity by the frequency: lambda = (3 x 10^8 m/s) / (6 x 10^14 Hz) = 5 x 10^-7 m (500 nm), but I'm stuck and don't know how to proceed. Any help would be amazing, thank you!
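A quick numerical sanity check of the attempt, as a sketch in Python. It assumes only the standard relations v = f * lambda, the proportionality (phase difference)/360 degrees = d/lambda for the shortest separating distance, and (cycles passed at a fixed point) = f * t; the variable names are just for illustration.

```python
import math

v = 3e8    # phase velocity, m/s (given)
f = 6e14   # frequency, Hz (given)

# Wavelength from v = f * lambda
wavelength = v / f            # about 5e-7 m, i.e. 500 nm

# Shortest distance for a 30-degree phase difference:
# d / wavelength = 30 / 360, so d = wavelength / 12
d = wavelength * (30 / 360)

# At a fixed point over t = 1 microsecond:
t = 1e-6
phase_shift_rad = 2 * math.pi * f * t   # total phase advance, omega * t
n_waves = f * t                         # number of whole waves that pass

print(wavelength, d, phase_shift_rad, n_waves)
```

Note that the phase advance in 1 microsecond is an enormous number of radians (2 * pi * 6e8), consistent with 6 x 10^8 complete waves passing the point in that time.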