I understand that spacetime is expanding, as demonstrated by the increase in redshift with distance. And I also understand that light travels *through* spacetime, which is itself expanding. It seems rational to me that one may compare the rate of spacetime expansion to the speed of light, in much the same way that one could compare the speed at which I might run *up* an escalator which is itself moving downwards, or the speed at which an ant might walk across the surface of an inflating balloon. So given that we can detect light/radiation from distances approaching 13 billion light-years, it seems logical that the average speed of expansion since the big bang must greatly *exceed* the speed of light; otherwise that light would have reached us a long time ago. Is that valid?
You're on the right track. The escalator and "ant on the balloon" analogies are good pictures to keep in mind. I strongly recommend these papers to help answer your questions. One is at a popular level and one is more detailed: http://arxiv.org/abs/astro-ph/0310808v2
No. Even if the universe wasn't expanding, light would still take 13 billion years to travel 13 billion light-years. You also haven't defined what it would mean to average the speed of expansion. Keep in mind that the universe may be infinite.
Sure; light, by definition, takes 13bn years to travel 13bn light-years. Though yes, I didn't account for the fact that 13bn light-years was only the distance covered on the particular journey of the light we are recording now, not necessarily the distance now or the distance when the light left the source. So if light has taken 13bn years to reach us, the logical possible reasons seem to be:

1. The source was 13bn light-years distant from us at the time the light left it, and there has been no expansion or contraction of the space between us and it since. This is unlikely given the observed redshift.

2. The source was closer to us when the light left, but the light took 13bn years to reach us because the expansion of spacetime between us and the source occurred at such a rate that the light took 13bn years to catch up with us. So actually, in the case of visible objects, this rate must be slightly less than the speed of light, otherwise the light would never reach us. And I guess we can't directly tell (without extrapolation) how distant the object is now.

By 'average' I was simply referring to the fact that the rate of expansion may have changed over time (whatever "time" means in this context).
Phyzguy already gave a clear, concise answer and there may be no need to say more. I'll throw in some additional stuff just in case you or somebody else can use it.

Phyzguy attached a great 2005 SciAm article by Lineweaver and Davis. Here is Ned Wright's visual animation of the "ant on the balloon" thing: http://www.astro.ucla.edu/~wright/Balloon2.html You can get this just by googling "wright balloon model"

======================

One fine point to be clear about is how you define time and distance. There is a kind of standard "universe time" that goes along both with the gradual cooling of the ancient light (CMB) filling the universe and with the standard Friedmann-equation model of the universe. This timescale is also sometimes called "Friedmann time" because of its role in the model. It is the time measured by an observer who is at rest relative to the ancient light; it looks the same to him in all directions. The solar system and Earth are approximately at rest in this sense, so for most purposes we can ignore the distinction.

In an environment where distances are changing there will necessarily be several possible definitions of distance. One that is commonly used is the "proper" distance: the distance you would measure by ordinary means, like radar signal timing or a very, very long tape measure, if you could stop expansion at a specified moment. It is sometimes called the "freeze-frame" distance.

This definition of distance is the one employed in the Hubble law you see used all the time: v = Hd. Here d is a proper distance, v is the rate that distance is increasing, and H is the Hubble parameter at that specified moment, at that given "age of the universe".

It is true that the current distances to most of the galaxies we can see are increasing at rates > c. If you translate the present-day Hubble parameter H_{0} = 71 km/s per megaparsec into ordinary language, it says that large-scale distances are currently increasing by 1/140 of a percent every million years.
Ask yourself what 1/140 of a percent of 14 billion light-years is. Well, one percent of that distance is 140 million light-years, and 1/140 of that is one million light-years. So that distance is increasing by a million light-years every million years: at rate c. Most of the galaxies we have seen are currently farther than that, so according to the standard-model cosmos the present-day distances to them are increasing faster than c.

There is one apparent paradox. Many of the galaxies we observe today were already receding faster than c when they emitted the light now reaching us. The seeming paradox is: how could the light ever have gotten here? The "ant" of the light, running towards us at a constant speed, would at first have been swept back by expansion and would actually have been getting farther away, losing ground, so to speak.

The solution to the paradox is explained by Lineweaver and Davis somewhere in that SciAm article. The light has been able to make it to us because the expansion rate H has diminished over time. So the ant hangs in there, stubbornly trying to get to us, and at first is losing ground, but (as H(t) decreases) after a while keeps at a constant distance, and ultimately (as H(t) decreases some more) begins to make real progress towards us.

The "Hubble distance" c/H(t) is the distance which at time t is increasing exactly at rate c. As H(t) decreases, the Hubble distance increases. If it decreases fast enough, then light which was originally outside the Hubble radius, and being dragged back away from us, will eventually be within the Hubble radius and start to make progress towards us. You get the Hubble distance from the law v = Hd simply by setting v = c and solving c = Hd for the distance d.

This may all be obvious to you, or it may be more detail than you want. Phyzguy already gave the perfect short answer, so this is just extra stuff thrown in on the chance it might be useful.
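The two figures above (the "1/140 of a percent per million years" and the "14 billion light-year" Hubble distance) can be checked with a few lines of arithmetic. This is just a sketch; the conversion constants are standard values, not anything taken from the thread:

```python
# Numeric check of the figures quoted above (a sketch, standard constants).
KM_PER_MPC = 3.0857e19    # kilometres in one megaparsec
SEC_PER_MYR = 3.156e13    # seconds in one million years
C_KM_S = 299792.458       # speed of light in km/s
LY_PER_MPC = 3.2616e6     # light-years in one megaparsec

H0 = 71.0                 # km/s per Mpc, the value quoted above

# Fractional growth of large-scale distances per million years:
frac_per_myr = H0 / KM_PER_MPC * SEC_PER_MYR
# about 7.3e-5, i.e. roughly 1/140 of a percent per million years

# Hubble distance c/H0: the proper distance currently receding at exactly c
d_hubble_bly = C_KM_S / H0 * LY_PER_MPC / 1e9
# about 13.8 billion light-years, the "14 billion" used above

print(frac_per_myr, d_hubble_bly)
```

Anything currently farther than that 13.8 billion light-year mark has a proper distance growing faster than c, which is the situation described for most observed galaxies.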
Why slightly? It would depend on the galaxy's distance from ours. There isn't any over-all speed of expansion (this many m/s) for all of spacetime.
Because in my example the light left the object when it was much closer to us, say 1bn ly (I didn't specify a distance). The light from the source would have reached us much sooner than 13bn years later if the rate of expansion between us and the source had been much smaller. The values in the example are just arbitrary, of course. OK, but my example was an object 13bn ly away (by the end of the light's journey). On the large scale, space is homogeneous, as I understand it. Anyway, I need to go away and read up on the various linked papers before I spout any more hypothetical musings!
The issue isn't homogeneity. In a homogeneous universe, there is not a single expansion speed for spacetime, in units of m/s. The Hubble law says that more distant galaxies are receding from us more rapidly in proportion to their distance.
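To illustrate that proportionality numerically (my numbers, just plugging hypothetical distances into v = Hd with the H_0 = 71 km/s/Mpc quoted earlier in the thread):

```python
# Hypothetical distances, to show there is no single "expansion speed":
# with v = H*d, every distance has its own recession velocity.
H0 = 71.0               # km/s per Mpc
C_KM_S = 299792.458     # speed of light in km/s

for d_mpc in (100, 1000, 4222, 10000):   # 4222 Mpc is roughly c/H0
    v = H0 * d_mpc                        # recession speed in km/s
    print(f"d = {d_mpc:5d} Mpc  ->  v = {v / C_KM_S:.2f} c")
# prints ratios of about 0.02, 0.24, 1.00, and 2.37 times c
```

Beyond roughly 4222 Mpc (the Hubble distance for this H_0) the recession rate exceeds c, which is why "the speed of expansion" has no single value in m/s.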
Thanks to ALL for a great discussion. Please help me understand a SEEMING contradiction in the excellent Lineweaver & Davis article (exact quotes below) about the rate of universe expansion: is it changing UP or DOWN?

"The rate at which the distance between galaxies increases follows a distinctive pattern discovered by American astronomer Edwin Hubble in 1929: the recession velocity of a galaxy away from us (v) is directly proportional to its distance from us (d), or v = Hd. The proportionality constant, H, is known as the Hubble constant and quantifies how fast space is stretching —not just around us but around any observer in the universe."

"In models of the universe that fit the observational data, the denominator increases faster than the numerator, so the Hubble constant decreases. In this way, the Hubble distance gets larger."

"The recent discovery that the rate of cosmic expansion is accelerating makes things even more interesting."
Early in the history of the universe, the universe was "matter-dominated", and the mutual gravitational interaction of the matter slowed the expansion and caused the Hubble constant to decrease with time. When the universe was about half its present age, the density of matter had fallen to the point that things gradually switched over so that the universe became "lambda dominated", and the Hubble constant is now increasing with time. You can see this in the middle graph on page 3 of the "Expanding Confusion" article, where the size of the Hubble sphere was initially increasing with time, and is now decreasing with time.
No, the Hubble constant is never increasing with time; it is always decreasing.

Edit: OK, finished with wood chopping, so I will try to elaborate a little bit. There is a common misconception that if the universe is in accelerated expansion, the value of the Hubble constant must be increasing with time. A probable cause for this is misunderstanding what the quantity called the Hubble constant means. It defines the velocity now of some object (a galaxy) some distance away from, for example, us. If nothing acts on that object (gravity or dark energy), then that velocity should remain the same at any time in the future. But since that object is moving away with that velocity, thus increasing the distance, the value of the Hubble constant must drop in inverse proportion with time. So in an empty universe the Hubble constant simply becomes H(t) = 1/t.

The answer to whether the universe is accelerating or not is in the second derivative of the scale factor (a) with respect to proper time. If it is positive, then the universe is said to be accelerating.
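That empty-universe case can be seen in a toy calculation (my example, arbitrary units): a galaxy coasting away at a fixed speed has d(t) = v·t, so the H inferred from v = Hd falls as 1/t even though nothing decelerates it.

```python
# Empty-universe toy model: constant recession speed, no forces acting.
v = 1.0                       # fixed recession speed (arbitrary units)
for t in (1.0, 2.0, 4.0):
    d = v * t                 # distance grows linearly with time
    H = v / d                 # inferred Hubble parameter: H(t) = 1/t
    print(f"t = {t}:  d = {d},  H = {H:.2f}")
# H comes out 1.00, 0.50, 0.25: falling, with zero acceleration
```

So a decreasing H says nothing by itself about acceleration or deceleration; that information sits in the second derivative of the scale factor.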
There appears to be some confusion. The Hubble constant is decreasing and (according to the standard model) expected to continue decreasing on into the future, though at a diminished rate. Since H is decreasing, the reciprocal c/H, called the Hubble radius, is increasing; but you say it is "now decreasing with time." What can have caused this misunderstanding?
Ut, in your post #11 you point to a clear VERBAL CONTRADICTION. It is actually not a real contradiction in the math model, but arises because of a poor choice of words.

Briefly, the key quantity to keep track of is called the scale factor a(t). This is our handle on the size. It enters into the formula for distance, and distances expand as a(t) increases. You probably know some calculus. The time derivative da/dt can be written a'(t). The real rate of expansion is a'(t). It is positive because a(t) is increasing; when expansion decelerates, a'(t) decreases, and acceleration means a'(t) increases.

The Hubble quantity H(t) is defined to be a'(t)/a(t). It is the proportional rate of increase, and it is convenient to work with. It corresponds approximately (for near galaxies), but not exactly, with the slope of redshift versus distance. H(t) is handy. But you can obviously have expansion accelerate while H(t) decreases, because "accelerate" just means a'(t) increases, and the denominator a(t) can be increasing so fast that even if a'(t) is gradually rising, the ratio H = a'/a still declines.

=====================

To look at the math, you can google "friedmann equations". Or somebody may suggest something better. In the math there is no contradiction. The contradiction is only verbal, caused by an inadequacy of language. We use the word "rate" both for the real nuts-and-bolts rate of increase a'(t), and also for the proportional or fractional rate of increase a'/a. We shouldn't, but we don't have a separate word. So the English language confuses these two kinds of expansion rates and we get what sounds like a contradiction.

======================

BTW there is a very good translation of the Gilgamesh poem. I can't think of the translator's name. David Ferry, that's it. http://www.amazon.com/gp/product/0374523835/ It seems to be a fine, accurate translation but also easy and graceful, effective as literature. Real poetry in English.
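The "expansion accelerates while H falls" situation described above can be checked with a one-line toy scale factor (my example, not a realistic cosmology): take a(t) = t², which is accelerating since a''(t) = 2 > 0, yet H(t) = a'/a = 2/t falls.

```python
# Toy scale factor a(t) = t**2: accelerating expansion with decreasing H.
def a(t):     return t ** 2            # scale factor
def a_dot(t): return 2.0 * t           # a'(t), the real expansion rate: rising
def H(t):     return a_dot(t) / a(t)   # = 2/t, the proportional rate: falling

for t in (1.0, 2.0, 4.0):
    print(f"t = {t}:  a' = {a_dot(t):4.1f} (rising)   H = {H(t):.2f} (falling)")
# a' goes 2, 4, 8 while H goes 2.00, 1.00, 0.50
```

So the nuts-and-bolts rate a'(t) and the fractional rate a'/a really can move in opposite directions, which is the whole resolution of the verbal contradiction.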
I don't like what I've seen of the Stephen Mitchell translation. I have the Ferry version upstairs. You have taken your Nick from one of the characters in that great ancient narrative poem...
Hi Marcus: Nice to meet a connoisseur of ancient Babylonian literature, and many thanks for the explanation. I am going to think it over diligently. All in all it seems that the universe is not homogeneous and is unpredictable, and that we do not and CANNOT know what is going on in its far-distant regions NOW. Who knows, a contraction of space, with a vector towards us, might long since have begun. Best Regards
My apologies. Calimero is right. Since the Hubble constant is [tex]\frac{\dot{a}}{a}[/tex], where a is the scale factor, it is never increasing with time in an expanding universe. A lambda-dominated universe expands exponentially and has an H which is constant in time. I agree that the second derivative of a is what separates acceleration from deceleration. I should have said: "Early in the history of the universe, the universe was "matter-dominated", and the mutual gravitational interaction of the matter slowed the expansion and caused the expansion to decelerate. When the universe was about half its present age, the density of matter had fallen to the point that things gradually switched over so that the universe became "lambda-dominated", and the expansion is now accelerating. You can see this in the middle graph on page 3 of the "Expanding Confusion" article, where the size of the Hubble sphere was initially increasing with time, and is now decreasing with time."
Since spacetime is expanding, causing areas of our universe to disappear from view, with regions far away receding from us at superluminal speeds, can we view these effects as spacetime dividing into a multiverse of separate spacetimes? Are these separate spacetimes now, in effect, other dimensions of spacetime, separate and beyond our own?
The problem with the "spacetime" theory is that space is NOTHING, which is infinite in distance, and there is no such thing as an entity called "time". "Time" is only our description of what is happening compared with something else that is happening, i.e. the clock's hands indicate one hour has passed for each 1/24 rotation of the Earth. Therefore, since "time" itself as an entity does not exist, it cannot be slowed or sped up, which throws the theory of time dilation, or "time travel", out the window.

The theory of "time" slowing down when nearing the speed of light is bogus. If a nuclear-powered satellite's clock seems to change its "rate of time", it's because the rate of nuclear fusion or fission changes when put into a weaker or stronger gravitational field, not because "time" itself has changed. It's the same as if I mix 5-minute epoxy at 70 degrees: it takes 5 minutes to cure; however, if I mix it at 35 degrees it will take 8 minutes to cure. Time has not changed, only the conditions have changed.

So the "space-time continuum" is BS, only something from science fiction. The equation E=mc^2 CANNOT be valid, because there is no way to quantify the speed of light to put it into the equation, other than by using numbers created by man's selection of units, i.e. "miles per second", "kilometers per second", and so on, which will give different numbers for the same velocity. The speed of light has NOTHING to do with energy and mass.