
For example, if I have

x'' + ω₀²x = cos(ωt)

If we suppose that ω ≠ ω₀, then the general solution should look something like:

x(t) = c₁cos(ω₀t) + c₂sin(ω₀t) + (1/(ω₀² − ω²))cos(ωt)

This is mostly fine with me. But consider what happens as ω → ω₀ while still ω ≠ ω₀: obviously the amplitude of the oscillator should be huge. However, that amplitude does not depend on time, which is to say that at the exact moment we introduce the driving force, the amplitude of the oscillator *instantaneously* becomes enormous. That is hard to believe, because I would expect the object to start deviating from its simple oscillations slowly and grow over time.

I know that when ω = ω₀ there *is* a factor of *t* in the amplitude, but that is not the case here.

**Is it because** the superposition of the two sinusoids makes it *seem* like the initial amplitude is small? That is, when the driving force is introduced, the waves align such that the oscillating body does not appear to have a huge amplitude, but over time they align so that the body evidently does have huge oscillations. This would imply, though, that the oscillations would become small again; in other words, we would expect long beats. Is this correct?
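The beat picture can be checked numerically. Below is a small sketch (not part of the original question) assuming zero initial conditions x(0) = x'(0) = 0 and illustrative values ω₀ = 1, ω = 0.95; with those initial conditions the solution collapses to a two-cosine difference, which a trig identity rewrites as a slow beat envelope:

```python
import math

w0 = 1.0   # natural frequency (illustrative value)
w = 0.95   # driving frequency, close to w0

# With x(0) = x'(0) = 0, the general solution above reduces to:
# x(t) = (cos(w t) - cos(w0 t)) / (w0^2 - w^2)
def x(t):
    return (math.cos(w * t) - math.cos(w0 * t)) / (w0**2 - w**2)

# The identity cos A - cos B = 2 sin((A+B)/2) sin((B-A)/2) rewrites this as
# a fast oscillation under a slowly varying envelope (beats):
def x_beats(t):
    return (2.0 * math.sin((w0 - w) * t / 2.0)
                * math.sin((w0 + w) * t / 2.0)) / (w0**2 - w**2)

# The two forms agree, and the motion starts near zero amplitude,
# growing toward the maximum beat amplitude 2/(w0^2 - w^2) over time.
for t in [0.1, 10.0, 30.0]:
    assert abs(x(t) - x_beats(t)) < 1e-9

print(x(0.1))                # small at early times
print(2.0 / (w0**2 - w**2))  # maximum of the beat envelope
```

So the amplitude does not jump instantaneously: the envelope 2 sin((ω₀ − ω)t/2)/(ω₀² − ω²) starts at zero and only reaches its (large) maximum after a long beat period.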

Or is it more likely that, after the driver begins, the motion converges onto the driven oscillation? Why, and how would one describe that mathematically?