# Relativistic mass, time dilation, length contraction, and traveling near c

## Main Question or Discussion Point

A spaceship with an arbitrarily large quantity of fuel cells departs Earth and accelerates away from it with a fixed trajectory until it reaches .9c. It continues to accelerate, but never reaches c because that is impossible for any object with mass.

From the frame of reference of the crew onboard the ship, their mass, rate of time, and length remain unchanged, so from their perspective, what is keeping the spaceship from continuing to accelerate to c?

From the frame of reference of mission control on Earth, what is keeping the spaceship from accelerating to c? I believe the answer to this is that its relativistic mass increases as its velocity approaches c, so that an infinite amount of energy would be required to fuel any further acceleration. However, I have a possibly erroneous understanding that time dilation and length contraction fit into this somewhere. Is the key that as the spacecraft approaches c, its time becomes so dilated, and its length so contracted, that it is getting no "bang for its buck" from the fuel it is using? If so, how do the spaceship's time dilation and length contraction affect its velocity/acceleration as seen by mission control?

As always, thank you.

-coktail

mathman
From the crew's perspective the spaceship has v = 0, i.e. the spaceship is at rest in its own frame.

From the viewpoint of mission control I suggest looking at the equations for special relativity.

Thank you, mathman. However, I was hoping for an explanation more in layman's terms.

Nugatory
Mentor
Thank you, mathman. However, I was hoping for an explanation more in layman's terms.
For the crew perspective, it doesn't get any simpler than mathman's answer: In the ship frame, the speed and hence the acceleration of the ship is always zero.

For the ground perspective, your explanation about the mass increase is just fine - the mass of the ship, as measured in the ground frame increases with velocity while the force remains constant: Plug an increasing mass and a constant force into F=ma, and you'll get a decreasing acceleration.

However, there is a bit of cheating in this explanation. I'll go into that a bit later unless someone else gets to it before me.

mfb
Mentor
In the ship frame, the speed and hence the acceleration of the ship is always zero.
You have to be careful here: the ship is not an inertial frame.
There is an inertial frame in which the ship is at rest at a specific point in time - in that frame there is no time dilation or length contraction for that single moment, but the ship is accelerating.

@coktail: Consider some point in time (for earth) where the ship travels at some relativistic speed. Let's assume that it accelerates with a photon drive - the most efficient possible drive in terms of momentum per reaction mass. In the perspective of the ship, the photons all have an energy of E=10 eV (or whatever, does not matter), and a corresponding momentum of p=E/c.
For observers on earth, the light source is moving, and you get doppler-shift: Both the classical part and time dilation. The energy (and therefore momentum) per photon is much lower for observers on earth. In addition, time dilation also reduces the rate of photon emission.
The same effects apply to other particles, too - their velocity relative to the ship and their emission rate is decreased.
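The redshift mfb describes can be put into numbers. Below is a minimal Python sketch of the relativistic Doppler factor for a receding source (the 10 eV photon energy is mfb's example value; the function name is mine):

```python
import math

def doppler_factor(beta):
    # Relativistic Doppler factor for a source receding at beta = v/c.
    # It combines the classical Doppler shift with time dilation.
    return math.sqrt((1 - beta) / (1 + beta))

E_ship = 10.0  # eV per exhaust photon in the ship frame (mfb's example value)
for beta in (0.0, 0.5, 0.9, 0.99):
    print(f"beta = {beta:4.2f}: photon energy seen from Earth = "
          f"{E_ship * doppler_factor(beta):5.2f} eV")
```

The same factor also suppresses the photon emission rate as seen from Earth, which is why the drive looks ever less effective from the ground.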

Another point of view: Let's assume the ship accelerates from 0 to .01 c in one day - something you can treat without special relativity. Now, the ship just keeps accelerating with the same thrust (as viewed from the ship). What is its velocity after 2 days? Well, we have to use the relativistic velocity addition - the result is close to .02, as we are well below c. However, it becomes more significant later: If the ship is at .9c and accelerates for another day, the result is not .91c, but (0.9+0.01)/(1+0.9*0.01)=0.9019.
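mfb's numbers can be checked directly with the relativistic velocity-addition formula. A small Python sketch (speeds in units of c; the function name is mine):

```python
def add_velocities(u, v):
    # Relativistic velocity addition; both speeds in units of c.
    return (u + v) / (1 + u * v)

print(add_velocities(0.01, 0.01))  # early on: ~0.02, classical addition is fine
print(add_velocities(0.90, 0.01))  # later: ~0.9019, noticeably less than 0.91
```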

For the crew perspective, it doesn't get any simpler than mathman's answer: In the ship frame, the speed and hence the acceleration of the ship is always zero.

If the crew keeps on accelerating at a constant rate, say, 1G, then they are able to determine that they are not inertial, given that objects would fall to the floor.

So if they see their velocity increasing over time compared to their launchpad on the Earth, what do they experience when their velocity nears that of c?

Do they start burning more and more fuel to accelerate less and less quickly?

Nugatory
Mentor
You have to be careful here: the ship is not an inertial frame.
There is an inertial frame where the ship is at rest at a specific point in time - there, time dilation and length contraction is not there for this single moment in time, but the ship is accelerating.
Quite true, both that the ship is not an inertial frame and that there is always an inertial frame in which the ship is at rest at a specific point in time. The first point means that the ship's crew feels as if they're in a gravitational field, but they can still consider themselves at rest.

The second point is, of course, the key to the non-cheating explanation that I promised OP. Starting with what mfb says below:
Another point of view: Let's assume the ship accelerates from 0 to .01 c in one day - something you can treat without special relativity. Now, the ship just keeps accelerating with the same thrust (as viewed from the ship). What is its velocity after 2 days? Well, we have to use the relativistic velocity addition - the result is close to .02, as we are well below c. However, it becomes more significant later: If the ship is at .9c and accelerates for another day, the result is not .91c, but (0.9+0.01)/(1+0.9*0.01)=0.9019.
This says that the ship's acceleration as measured in the ground frame is getting ever smaller as the speed gets closer to c, so that the ship never quite gets all the way to the speed of light (although it can get arbitrarily close). So where did this relativistic mass concept come from? Well, back to F=ma... The way that you measure the mass of an object is to apply a known force to it and see what acceleration results. We know that the force is constant, and we see that the acceleration is getting smaller as the velocity increases, so we infer that the mass is increasing.
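Iterating mfb's one-day boost shows this numerically. A Python sketch, assuming the same 0.01c ship-frame boost per day as in his example:

```python
def add_velocities(u, v):
    # Relativistic velocity addition; both speeds in units of c.
    return (u + v) / (1 + u * v)

v = 0.0
for day in range(1, 501):
    v = add_velocities(v, 0.01)  # one more day of the same ship-frame boost
    if day in (1, 100, 300, 500):
        print(f"day {day:3d}: v = {v:.6f} c")
```

The ground-frame speed creeps toward c but never reaches it, which is exactly the ever-shrinking coordinate acceleration described above.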

You guys are awesome. Thank you so much for taking the time and ENERGY (he he). These forums are the best thing.

Nugatory
Mentor
So if they see their velocity increasing over time compared to their launchpad on the Earth, what do they experience when their velocity nears that of c?

Do they start burning more and more fuel to accelerate less and less quickly?
No.

As far as the ship's crew is concerned, they are at rest (v=0) in a gravitational field. They puke a bit of rocket exhaust astern at velocity Δv and now they're moving forward at speed (m/M)Δv, where M is the rest mass of the ship and m is the rest mass of that bit of exhaust; all of this relative to the frame in which they had been at rest before the rocket gave them another boost. And after that, they're still at rest in the new frame, so can repeat this process ad infinitum.

Of course their speed relative to the earth is changing as we keep on adding up these boosts. But that is irrelevant to how their rocket responds to the kick from the exhaust, just as you don't spend a lot of time thinking about the many miles per second you're moving relative to the Lesser Magellanic Cloud when you're working the throttle or brakes of a car you're driving through city streets.

Another way of thinking about it... Suppose the crew never looks out the window to see whether and how quickly the Earth is moving away from them... Maybe the earth blows up and is completely destroyed while they aren't looking... Then what could it possibly mean to say "as their velocity nears c"? That's c relative to what?

A spaceship with an arbitrarily large quantity of fuel cells departs Earth and accelerates away from it with a fixed trajectory until it reaches .9c. It continues to accelerate...

From the frame of reference of the crew onboard the ship, their mass, rate of time, and length remains unchanged, so from their perspective, what is keeping the spaceship from [STRIKE]continuing to[/STRIKE] accelerating to c?

Earth is 0 mph; why won't the ship observers ever calculate their speed to be c?

If the laws of physics remain unchanged, x amount of input should yield a constant amount of acceleration throughout the acceleration period, right? So that even when the ship observers calculate their speed as 0.9c (with Earth as the comparison), they still measure the same fuel-input-to-1g-acceleration ratio.

Is it that the contracted rulers of the ship's pre-SR observers lead them to determine that the Earth has begun to accelerate towards them, symmetrically to their length contraction*?

Maybe I am wrong here, but it seems very cool how the g-force remains unchanged - maybe better queried as "Is the 'sensation' of g-force invariant?" Oh yes! Proper acceleration, proper time, proper length - or more clearly, proper length + proper time = proper acceleration. Neat! That train of thought would have been so much more boring with the blatant: 1 m/s² + the principle of relativity.

*Ah, and from a coordinate-acceleration perspective (again with Earth as the comparison at 0,0), the "decreased" coordinate acceleration from the ship observers' perspective, compared to the proper acceleration, is what gamma would be (or whatever it should be called here). Is that right?

But what if they DO examine their speed relative to Earth? You say their speed changes as they keep adding boosts, so what would they see as the reason they can't reach c?

Nugatory
Mentor
But what if they DO examine their speed relative to Earth? You say their speed changes as they keep adding boosts, so what would they see as the reason they can't reach c?

If the ship crew does examine their speed relative to the earth, they'll still conclude that they're at rest; it will be the earth that's moving away from them.

Whether you think in terms of the earth at rest and the ship moving away from the earth (earth frame), or the ship at rest and the earth moving away from the ship (ship frame), you have to use the relativistic formula for velocity addition as mfb described.

Say I'm sitting in a spaceship at rest (as far as I can tell) while the earth is moving away from me at speed -v. I fire the engines to boost my speed by +Δv.

In the frame in which the ship had been at rest, the earth is still moving to the left at v. The ship is now moving to the right at the speed Δv. But from either the earth frame or the post-boost ship frame, the new relative speed is not v+Δv - it's $$\frac{v+\Delta{v}}{1+\frac{v\Delta{v}}{c^2}}$$
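One way to see why no sequence of such boosts ever reaches c: under this addition law, the rapidity φ = artanh(v/c) adds linearly, and tanh of any finite rapidity is strictly below 1. A Python sketch (the rapidity framing is mine, not from the posts above):

```python
import math

dv = 0.01               # each boost, in units of c
phi = math.atanh(dv)    # rapidity of one boost; rapidities add linearly
for n in (1, 10, 100, 1000):
    v = math.tanh(n * phi)  # speed after n identical boosts
    print(f"{n:4d} boosts: v = {v:.8f} c")
```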

A spaceship with an arbitrarily large quantity of fuel cells departs Earth and accelerates away from it with a fixed trajectory until it reaches .9c. It continues to accelerate, but never reaches c because that is impossible for any object with mass.

From the frame of reference of the crew onboard the ship, their mass, rate of time, and length remain unchanged, so from their perspective, what is keeping the spaceship from continuing to accelerate to c?[..]
From that perspective, the Earth and all the rest of the universe are accelerating towards c due to their rocket engine. When you use an accelerating frame's perspective, you easily run into some weird (twisted) stuff.

But what if they DO examine their speed relative to Earth? You say their speed changes as they keep adding boosts, so what would they see as the reason they can't reach c?

Yea my post was mostly me asking a question and answering it, posted to ask if it's a correct interpretation.

The way the OP was worded I understood that the Earth was the agreed upon rest frame, and from which the ship observers would calculate their speed.

Not sure why the responses speak of the Earth appearing to accelerate away from the ship observers; it's been said they left Earth and accelerated away. So they undoubtedly know they are the ones in motion, comparatively.

I think the reason the ship observers never calculate their speed to be c is because:

1. We Earth observers would see the ship's acceleration slowly decrease.
2. The ship observers wouldn't "feel" or measure any reduced acceleration.
3. The ship observers would measure less distance between Earth and themselves than what the Earth observers would measure.

From number 3 the ship observers would determine Earth has begun to accelerate towards them, at a rate equal to however gamma shortens their rulers as the ship observers continue to accelerate.

So the ship observers never calculate their speed as c because Earth just won't stay put...(at 0x,0y) from their perspective.

So where did this relativistic mass concept come from? Well, back to F=ma... The way that you measure the mass of an object is to apply a known force to it and see what acceleration results. We know that the force is constant, and we see that the acceleration is getting smaller as the velocity increases, so we infer that the mass is increasing.
See, this is why Relativistic Mass is a Bad Thing and should be eradicated from all physics classrooms everywhere.

No, you cannot obtain the relativistic mass from F=ma for a relativistic rocket, because when the Earth observer measures F/a, he does NOT get the relativistic mass $\gamma m$. He obtains the expression $\gamma^3 m$. And in general, the result obtained for F/a will differ from both of these values depending on the angle between F and a. Foolishly teaching that there is a "mass increase" at relativistic speeds (as opposed to new laws of physics with Newton's Laws as their low-energy limits) leads immediately to these sorts of misconceptions.

At any rate, the OP's question of to what effect Earth observers attribute the rocket's decreasing acceleration presupposes Newton's Laws, i.e. that F=ma is correct. ("If the rocket's coordinate acceleration is slowing, then there must be some reason having to do with F or m, because that's what Newton's Laws say"). The answer is really that Newton's Laws are wrong at speeds nearing c. F does not, in fact, equal ma. Constant force does not, in fact, lead to constant acceleration. The relationship between them is more complicated than that, and has to do with the ways that different inertial frames are related to each other, through the Lorentz Transformation. So the Earth observers attribute the slowing coordinate acceleration to their knowing the correct laws of motion: special relativity.
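The γ³ relation mentioned above can be made concrete: for a force parallel to the motion, F = γ³ma, so at constant thrust the coordinate acceleration falls off as 1/γ³. A Python sketch (F and m in arbitrary units; the function name is mine):

```python
import math

def coordinate_acceleration(F, m, beta):
    # For a force parallel to the motion, SR gives F = gamma^3 * m * a,
    # so a = F / (gamma^3 * m) -- not the Newtonian a = F / m.
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return F / (gamma**3 * m)

F, m = 1.0, 1.0  # constant thrust, constant rest mass (arbitrary units)
for beta in (0.0, 0.5, 0.9, 0.99):
    print(f"v = {beta:4.2f} c: a = {coordinate_acceleration(F, m, beta):.4f} F/m")
```

Note that F/a here gives γ³m, not the "relativistic mass" γm, which is the misconception the post above is pointing out.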

The rocket observers, meanwhile, feel the exact same force of acceleration the whole time. (More precisely, their accelerometers [masses on springs] always indicate a constant, unchanging acceleration). If they periodically drop a traffic cone out the window, they can observe this traffic cone to accelerate away from them at the same rate every time, forever. Nonetheless, they never reach c, because all light rays continue to pass them locally at c. No matter how much they accelerate, they make no headway towards catching up with light rays, ever. Also notice how there is a difference between proper acceleration (as measured by accelerometers and traffic cones onboard ship) and the coordinate acceleration measured by Earth. In Newtonian physics, these quantities are always the same, but in SR they differ.
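The constant-proper-acceleration scenario has a standard closed form: after proper time τ at proper acceleration a, the speed relative to the launch frame is v = c·tanh(aτ/c). A Python sketch for a 1G burn (constants rounded):

```python
import math

g = 9.81        # m/s^2, roughly 1G proper acceleration
c = 2.998e8     # m/s, speed of light
year = 3.156e7  # seconds per year

for years in (0.5, 1.0, 2.0, 5.0):
    tau = years * year             # proper time elapsed on the ship
    beta = math.tanh(g * tau / c)  # v/c = tanh(a * tau / c)
    print(f"{years:3.1f} ship-years at 1G: v = {beta:.6f} c")
```

However long the burn, tanh stays below 1: the crew feels the same 1G forever, yet the launch-frame speed never reaches c.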

If they periodically drop a traffic cone out the window, they can observe this traffic cone to accelerate away from them at the same rate every time, forever. Nonetheless, they never reach c, because all light rays continue to pass them locally at c. No matter how much they accelerate, they make no headway towards catching up with light rays, ever.
Why do the traffic cones fall behind at the same rate? Wouldn't the first cone accelerate away more slowly than the last cone?

They are moving at a faster and faster velocity compared to the earth, so why do the cones not recede faster and faster each time they release one? At non-relativistic speeds, ISTM that this would happen.

Would they use fuel at a greater rate later in the flight in order to maintain a 1G acceleration?

Would they notice that even though they are burning fuel faster and faster, the rate of change of the earth's distance is slowing?

Would the rate of change of the earth's redshift diminish as they approach a relative velocity near c? And conversely, if they are aimed at a distant target which is comoving with the earth, wouldn't they notice a change in blueshift as they go faster and faster, and that the rate of change of the blueshift changes as they approach a relative velocity of c?

I seem to be missing something basic here.

In a situation in which there is a launchpad on a small planet and a target which is also so small and distant that the effects of its gravity can be ignored, ISTM that the rocket crew will notice a rate of redshift from the launchpad and a rate of blueshift from the target that changes over time, despite the fact that their on-board accelerometer indicates a steady 1G acceleration. It also seems to me that they would burn fuel at a faster and faster rate in order to maintain 1G acceleration.

Am I wrong?

And given that the sensors detect changing rates of redshift and blueshift, along with increasing fuel requirements, the crew would know that they are approaching c. No?

why do the traffic cones fall behind at the same rate? Wouldn't the first cone accelerate away more slowly than the last cone?

They are moving at a faster and faster velocity compared to the earth, so why do the cones not recede faster and faster each time they release one? At non-relativistic speeds, ISTM that this would happen.
Because the cones are initially at rest relative to the spacecraft. If the spacecraft were not accelerating at all, then each time they dropped a cone, that cone would simply drift along with the ship, at rest. The cones do not fall behind the craft because of its velocity, but because of its acceleration.

There is no experiment whose outcome depends on the speed of the craft. If the falling-away rate of a traffic cone was greater for later cones than earlier cones, then the pilot would be able to measure his speed that way, and that is not allowed.

Would they use fuel at a greater rate later in the flight in order to maintain a 1G acceleration?
No, there is no experiment the pilot can do that will depend on his speed. That is the Principle of Relativity. If he required more fuel at high speeds as compared to low speeds, he would have an absolute speed that he can measure, and he can't do that.

Would they notice that even though they are burning fuel faster and faster, the rate of change of the earth's distance is slowing?
Yes, they would notice that Earth does not have a constant coordinate acceleration. Assuming, of course, that Earth hasn't fallen behind their Rindler Horizon, in which case they wouldn't notice anything about it at all.

Would the rate of change of the earth's redshift diminish as they approach a relative velocity near c? And conversely, if they are aimed at a distant target which is comoving with the earth, wouldn't they notice a change in blueshift as they go faster and faster, and that the rate of change of the blueshift changes as they approach a relative velocity of c?
They certainly do note changes in redshift and blueshift of these bodies. Earth, in fact, will eventually redshift to infinity. This is due to their changes in relative motion.

ISTM that the rocket crew will notice a rate of redshift from the launchpad and a rate of blueshift from the target that changes over time, despite the fact that their on-board accelerometer indicates a steady 1G acceleration.
Yes, because redshifts and blueshifts depend on relative velocity, which is changing.

It also seems to me that they would burn fuel at a faster and faster rate in order to maintain 1G acceleration.
Absolutely not, since that would reveal the absolute velocity of the craft. In a real rocket, in fact, it would require less and less fuel, since the mass of the rocket decreases as fuel is expended.

And given that the sensors detect changing rates of redshift and blueshift, along with increasing fuel requirements, the crew would know that they are approaching C. No?
They are allowed to detect that the speed of the Earth, relative to them, is changing. They are NOT allowed to know that they are approaching c in any absolute sense. You, right now, are traveling at nearly c in the frame of some cosmic ray. Can you detect this at all?