Suppose we have an ultrasonic transmitter and an ultrasonic receiver separated by some fixed distance ##d##, with both devices connected to an oscilloscope. The transmitter generates ultrasonic waves at a frequency ##f## that we can change. The receiver is a piezoelectric element that translates the (time-varying) pressure exerted on it into an electric signal. The oscilloscope then shows two sinusoidal traces (amplitude as a function of time) with the same frequency but with a phase shift between them (because the speed of sound is finite and it takes time for the wave to reach the receiver), i.e. a time delay ##\Delta t##.

Now, I wonder whether the time delay between these two waves (which are essentially the same wave, just shifted in phase) changes with frequency. I believe it does, because, if I'm not mistaken, ##\Delta \varphi = \frac{2\pi}{\lambda} \cdot d \propto f## and there is a linear relationship between the phase shift ##\Delta \varphi## and the time delay ##\Delta t##. But is it even possible to obtain the speed of sound (the velocity of the wave) from an ##f(\Delta t)## graph, i.e. knowing the time delay for different values of the frequency?
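To make my reasoning concrete, here is a small Python sketch of what I have in mind, assuming a non-dispersive medium (so the true delay ##d/v## is the same at every frequency) and made-up placeholder values for ##d## and ##v##; the idea is that the slope of ##\Delta \varphi## versus ##f## would then encode the speed of sound:

```python
import numpy as np

# Sketch of the assumed setup: in a non-dispersive medium the true
# propagation delay d / v is the same at every frequency, so the phase
# shift between the two scope traces grows linearly with frequency:
# delta_phi = 2*pi*f*delta_t = 2*pi*f*d/v.

v_true = 343.0   # assumed speed of sound in air, m/s (placeholder value)
d = 0.50         # assumed transmitter-receiver spacing, m (placeholder value)

freqs = np.linspace(25e3, 45e3, 9)           # test frequencies, Hz
delta_t = (d / v_true) * np.ones_like(freqs) # true time delay, independent of f
delta_phi = 2 * np.pi * freqs * delta_t      # phase shift, linear in f

# Fit delta_phi(f): the slope is 2*pi*d/v, so v = 2*pi*d / slope.
slope, intercept = np.polyfit(freqs, delta_phi, 1)
v_est = 2 * np.pi * d / slope
print(f"recovered speed of sound: {v_est:.1f} m/s")
```

(In a real measurement the oscilloscope would only give the phase modulo ##2\pi##, so I suppose the measured phases would have to be unwrapped, e.g. with `np.unwrap`, before doing such a fit.)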