- #1
chocolatesheep
Hi everyone.
I was wondering about this.
If the position of an object changes in time, the object has a velocity.
If the velocity of an object changes in time, the object is accelerating (or decelerating).
If the acceleration of an object changes in time, could we hypothetically have an "acceleration of acceleration"?
I have this scenario in my head:
An asteroid is at rest, far away from Earth. Earth's gravity starts pulling the asteroid towards it. The asteroid will accelerate towards Earth ever so slightly (since gravitational force depends on distance).
The closer it gets to Earth, not only will it move faster, but it will also accelerate faster.
So we will have the change of acceleration in time:
[itex]a' = \lim_{\Delta t \to 0}\frac{\Delta a}{\Delta t}[/itex]
So this "acceleration of acceleration" (usually called jerk) would be measured in [itex]\frac{m}{s^3}[/itex], or [itex]\frac{m/s^2}{s}[/itex] if that appeals more.
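As a rough sketch of the limit definition above, you can estimate this quantity from sampled accelerations with a finite difference (the sample values and spacing here are made up for illustration):

```python
# Hypothetical example: estimate jerk from sampled accelerations.
dt = 0.5  # seconds between samples (assumed)
a_samples = [0.0, 0.9, 2.1, 3.0]  # m/s^2, made-up measurements

# Forward differences: jerk ~ delta a / delta t, in m/s^3
jerk = [(a2 - a1) / dt for a1, a2 in zip(a_samples, a_samples[1:])]
print(jerk)
```

Shrinking dt makes this approach the limit in the formula above.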
Would this be useful? It would let us calculate the exact acceleration at any given moment, as opposed to only having the average acceleration.
But that's almost beside the point, since we can always calculate the exact acceleration if we know how far the asteroid is from the planet.
However, if we needed to know the exact speed of the asteroid after, let's say, 20 hours, we would get an incorrect answer if we treated the acceleration as if it were constant.
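This can be checked numerically. The following is a minimal sketch (the starting distance and the simple integration step are my own assumptions, not part of the scenario above): an asteroid starting at rest falls under inverse-square gravity, and the true final speed comes out higher than the constant-acceleration prediction.

```python
# Sketch: asteroid falling from rest under inverse-square gravity,
# compared with the naive constant-acceleration answer.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # mass of Earth, kg

r = 1.0e9          # initial distance from Earth's centre, m (assumed)
v = 0.0            # starts at rest
dt = 1.0           # time step, s
t_end = 20 * 3600  # 20 hours, s

a0 = G * M / r**2  # initial acceleration

t = 0.0
while t < t_end:
    a = G * M / r**2   # gravity strengthens as r shrinks
    v += a * dt
    r -= v * dt
    t += dt

v_const = a0 * t_end   # prediction if acceleration stayed at a0
print(v, v_const)      # the integrated speed is slightly higher
```

From this far out the discrepancy after 20 hours is small, but it is there, and it grows quickly as the asteroid gets closer.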
So, for a constant [itex]a'[/itex]: [itex]a_1 = a_0 + a' t[/itex]
so if [itex]a_0 = 0[/itex] then [itex]a_1 = a' t[/itex]
But [itex]v_1 = v_0 + at[/itex] only holds when [itex]a[/itex] is constant; since [itex]a[/itex] here grows linearly from 0 to [itex]a' t[/itex], we have to use the average acceleration [itex]\bar{a} = \frac{1}{2} a' t[/itex]
and if [itex]v_0 = 0[/itex] then [itex]v_1 = \bar{a} t[/itex]
=> [itex]v_1 = \frac{1}{2} a' t^2[/itex] if the object starts moving from rest
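As a quick sanity check on the constant-jerk case, integrating [itex]a(t) = a' t[/itex] numerically from rest reproduces the [itex]\frac{1}{2} a' t^2[/itex] result (the jerk value and duration here are arbitrary choices for the check):

```python
# Sanity check: with constant jerk j, a(t) = j*t, and the speed
# from rest should come out as j*t**2/2.
j = 0.5        # assumed jerk, m/s^3
dt = 1e-4      # integration step, s
t_end = 10.0   # s

v = 0.0
t = 0.0
while t < t_end:
    v += (j * t) * dt   # accumulate velocity from a(t) = j*t
    t += dt

print(v, 0.5 * j * t_end**2)  # both come out close to 25
```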
Does any of this make sense?
I'm waiting for someone to point out a flaw in this.