1. The problem statement, all variables and given/known data

I'm given two frequencies: 4,000 Hz and 5,000 Hz. They are completely in phase at time ##t = 0##. I am to find the time it takes for them to become completely out of phase.

2. Relevant equations

3. The attempt at a solution

I haven't grasped waves very well so far in physics. My teacher said that being completely out of phase occurs at a phase shift of one half wavelength, that is, at ##\pi## rad. Where do I start? The only idea I have is sketched below, and I'm not sure it's right.
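As a rough sketch (treat it as a guess, not a worked solution): if each signal can be written as ##\cos(2\pi f t)## and both start in phase at ##t = 0##, then I'd expect the phase difference between them at time ##t## to be

$$\Delta\phi(t) = 2\pi f_2 t - 2\pi f_1 t = 2\pi (f_2 - f_1)\,t.$$

If "completely out of phase" means ##\Delta\phi = \pi##, then setting ##2\pi (f_2 - f_1)\,t = \pi## would give ##t = \dfrac{1}{2(f_2 - f_1)}##. Is that the right way to set it up, or am I missing something?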