aderowbotham said:
It seems rational to me that one may compare the rate of spacetime expansion to the speed of light, in much the same way that one could compare the speed at which I might run *up* an escalator which itself is moving downwards. Or the speed at which an ant might walk across the surface of an inflating balloon.
So given that we can detect light / radiation that has been travelling for nearly 13 billion years, it seems logical that the average speed of expansion since the big bang must greatly *exceed* the speed of light. Otherwise that light would have reached us a long time ago.
Is that valid?
phyzguy said:
You're on the right track. The escalator and "ant on the balloon" analogies are good pictures to keep in mind. I strongly recommend these papers to help answer your questions. One is at a popular level (attached) and one is more detailed:
http://arxiv.org/abs/astro-ph/0310808v2
Phyzguy already gave a clear, concise answer and there may be no need to say more. I'll throw in some additional stuff just in case you or somebody else can use it. Phyzguy attached a great 2005 SciAm article by Lineweaver and Davis.
Ned Wright has a visual animation of the "ant on the balloon" idea:
http://www.astro.ucla.edu/~wright/Balloon2.html
You can get this just by googling "wright balloon model".
======================
One fine point to be clear about is how you define time and distance. There is a kind of standard "universe time" that goes along both with the gradual cooling of the ancient light (the CMB) filling the universe and with the standard Friedmann-equation model of the universe. This timescale is sometimes called "Friedmann time" because of its role in the model. It is the time measured by an observer who is at rest relative to the ancient light, so that the light looks the same to him in all directions. The solar system and Earth are approximately at rest in this sense, so for most purposes we can ignore the distinction.
In an environment where distances are changing there will necessarily be several possible definitions of distance. One that is commonly used is the "proper" distance: the distance you would measure by ordinary means, like radar signal timing or a very, very long tape measure, if you could stop the expansion at a specified moment.
It is sometimes called the "freeze-frame" distance. This definition of distance is the one employed in the Hubble law you see used all the time: v = Hd.
Here d is a proper distance, v is the rate at which that distance is increasing, and H is the Hubble parameter at that specified moment...at that given "age of the universe".
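As a quick illustration (my own numbers, not part of the original discussion), here is the Hubble law as literal arithmetic in Python, using the round value H0 = 71 km/s per megaparsec that comes up below and a galaxy at an arbitrary proper distance of one billion light years:

```python
# Rough sketch: plug a proper distance into v = H*d.
# H0 = 71 km/s per Mpc and the unit conversions are round illustrative values.

H0 = 71.0                  # km/s per megaparsec
Mpc_in_ly = 3.26e6         # light years in one megaparsec (approx.)
c = 3.0e5                  # speed of light, km/s (approx.)

d_ly = 1.0e9               # example galaxy: proper distance of 1 billion light years
d_mpc = d_ly / Mpc_in_ly   # about 307 Mpc

v = H0 * d_mpc             # rate at which that proper distance is growing, km/s
print(f"v = {v:.0f} km/s, which is {v/c:.3f} c")
```

By the same multiplication, a galaxy at a proper distance of around 20 billion light years comes out above c, which is where the next paragraphs are headed.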
It is true that the current distances to most of the galaxies we can see are increasing at rates > c.
If you translate the present-day Hubble parameter H0 = 71 km/s per megaparsec into ordinary language, it says that large-scale distances are currently increasing by 1/140 of a percent every million years.
Ask yourself what is 1/140 of a percent of 14 billion light years. Well, one percent of that distance is 140 million light years, and 1/140 of that is one million light years. So that distance is increasing by a million light years every million years. At rate c.
Most of the galaxies we have seen are currently farther away than that, so according to the standard cosmological model the present-day distances to them are increasing faster than c.
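If you want to check that arithmetic, here is a small back-of-envelope sketch of my own (same assumed H0 = 71 km/s per megaparsec, rounded unit conversions); it also computes the Hubble distance c/H0 that is defined a little further down:

```python
# Rough check of the unit conversion, using rounded constants.

H0 = 71.0                  # km/s per megaparsec (assumed round value)
km_per_ly = 9.46e12        # kilometers in one light year (approx.)
Mpc_in_ly = 3.26e6         # light years in one megaparsec (approx.)
s_per_Myr = 3.156e13       # seconds in a million years (approx.)
c = 3.0e5                  # speed of light, km/s (approx.)

# fraction by which a large-scale distance grows in a million years
growth_per_Myr = H0 * s_per_Myr / (Mpc_in_ly * km_per_ly)
print(f"growth per million years: {growth_per_Myr:.6f} (roughly 1/140 of a percent)")

# the distance whose recession rate is exactly c: the Hubble distance c/H0
hubble_dist_ly = (c / H0) * Mpc_in_ly
print(f"Hubble distance: {hubble_dist_ly/1e9:.1f} billion light years")
```

With these rounded numbers the Hubble distance comes out near 14 billion light years, which is the figure used above.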
There is one apparent paradox. Many of the galaxies we observe today were already receding faster than c when they emitted the light now reaching us.
The seeming paradox is: how could the light ever have gotten here? The light, like the ant on the balloon, running towards us at a constant speed, would at first have been swept back by the expansion and would actually have been getting farther away, losing ground so to speak.
The solution to the paradox is explained by Lineweaver and Davis somewhere in that SciAm article. The light has been able to make it to us because the expansion rate H has diminished over time.
So the ant hangs in there, stubbornly trying to get to us: at first it loses ground, but (as H(t) decreases) after a while it holds at a roughly constant distance, and ultimately (as H(t) decreases some more) it begins to make real progress towards us.
The "Hubble distance" c/H(t) is the distance which at time t is increasing exactly at rate c.
As H(t) decreases, the Hubble distance increases. If H decreases fast enough, then light which was originally outside the Hubble radius, being dragged back away from us, will eventually find itself inside the Hubble radius and start to make progress towards us.
You get the Hubble distance from the law v = Hd simply by setting v = c and solving c = Hd for the distance d.
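To see the Lineweaver-Davis point play out numerically, here is a toy sketch of my own (not from the article): a photon aimed at us from outside the Hubble radius in a simple matter-dominated model, where H(t) = 2/(3t) so H falls and the Hubble distance c/H grows. The photon's proper distance D obeys dD/dt = H(t)*D - c.

```python
# Toy model (my own illustration, not from the article): a photon emitted
# toward us from outside the Hubble radius in a matter-dominated universe,
# where H(t) = 2/(3t). Its proper distance obeys dD/dt = H(t)*D - c.
# Units: time in Gyr, distance in Gly, so c = 1.

c = 1.0                          # Gly per Gyr

def H(t):                        # decreasing Hubble rate, in 1/Gyr
    return 2.0 / (3.0 * t)

t, dt = 1.0, 0.001               # start 1 Gyr after the big bang
D0 = 2.0 * c / H(t)              # emit from twice the Hubble distance, i.e. outside it
D, D_max = D0, D0

while D > 0 and t < 100:
    D += (H(t) * D - c) * dt     # expansion drags the photon out, its own motion pulls it in
    D_max = max(D_max, D)
    t += dt

print(f"emitted at D = {D0:.1f} Gly, drifted out to about {D_max:.2f} Gly,")
print(f"then turned around and arrived at roughly t = {t:.1f} Gyr")
```

In this toy model the photon loses ground until the growing Hubble distance 3ct/2 catches up with it (around 3.6 Gly out), then makes steady progress and reaches us at roughly t = 8 Gyr, which is exactly the behaviour described above.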
This may all be obvious to you, or it may be more detail than you want. Phyzguy already gave the perfect short answer, so this is just extra stuff thrown in on the chance it might be useful.