Given a sound wave with wavelength ##\lambda = 0.628 m## and period ##T = 2 ms##.
The stopwatch is started at the exact moment the wave is at its minimum, call it ##-A##. After ##3.5 ms##, at a distance ##0.157 m## from the point of origin, the wave has reached its maximum, ##A##.
Why is it so?
According to my reasoning, the wave should be at 0 at that time: it takes ##2 ms## to return to the minimum, and of the remaining ##1.5 ms##, the first ##0.5 ms## brings it to 0, ##1 ms## to the maximum, and ##1.5 ms## back to 0. But this is incorrect.
Edit: I would understand if we were given just the distance traveled: ##0.157 m## is exactly a quarter wavelength from the point of origin, so the wave would be at its maximum there. But then why the ##3.5 ms##?
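As a sanity check, the situation can be modeled numerically. This is a minimal sketch assuming a sinusoidal traveling wave that starts at its minimum at the origin, ##y(x,t) = -A\cos\left(2\pi\left(\frac{t}{T} - \frac{x}{\lambda}\right)\right)## (the amplitude ##A = 1## is an arbitrary choice):

```python
import math

# Given values (amplitude A = 1 is arbitrary, for illustration only)
A = 1.0          # amplitude
T = 2e-3         # period: 2 ms
lam = 0.628      # wavelength: 0.628 m

# Assumed wave model: at the origin the wave starts at its minimum -A at
# t = 0, and the oscillation at position x lags in phase by 2*pi*x/lam.
def y(x, t):
    return -A * math.cos(2 * math.pi * (t / T - x / lam))

t = 3.5e-3       # 3.5 ms
x = 0.157        # 0.157 m, which is exactly lambda / 4

print(y(0.0, t))   # approximately 0: the origin is crossing zero
print(y(x, t))     # approximately 1.0 = A: the maximum
```

Under this model, the origin is indeed at 0 at ##t = 3.5 ms## (as the reasoning above says), but the point ##0.157 m = \lambda/4## away lags by a quarter period, so it shows the maximum at that instant.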