# Distances to galaxies using the Hubble Relation: look-back time

1. Mar 10, 2012

### Froglet

Hi all, this is my first post; apologies if it seems a bit basic. I suspect there's something fundamental I'm not getting here.

Say I observe the spectrum of a galaxy, and I calculate its redshift. I then use the Hubble Relation to find its distance, which is, say, ten billion light years.

So far so good ... but doesn't this also mean I'm actually observing the galaxy as it was ten billion years ago, so in the intervening time it will have receded further with the expansion of the Universe, so isn't the real distance much greater? Would this affect our current estimate for the age of the Universe?

Put another way, I understand how Hubble's Law can apply to relative distances in space, but I'm not certain how we can calibrate it with an accurate distance at the present time if we are always looking into the distant past when we observe.

Many thanks.

2. Mar 11, 2012

### Chronos

Yes, light from a galaxy ten billion light years away took ten billion years to reach us, so we see the galaxy as it was ten billion years ago, when it was ten billion years younger than it is now.

3. Mar 11, 2012

### Chalnoth

Right, it isn't all that easy to do. But it also isn't all that hard. We generally estimate these things by measuring two different aspects: the redshift of an object, and its apparent distance.

The redshift of most objects is very easy to measure to extreme accuracy, because it can be read directly off the spectral lines. The redshift tells us how much our universe has expanded since that light was emitted. For example, an object at a redshift of $z=1$ emitted its light when our universe was half the size it is now (size goes as $1/(z+1)$).
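That $1/(z+1)$ relation is simple enough to put in a couple of lines of Python (the function name is just for this sketch):

```python
def scale_factor_ratio(z):
    """Size of the universe at emission relative to now: a_emit / a_now = 1 / (1 + z)."""
    return 1.0 / (1.0 + z)

# An object at z = 1 emitted its light when the universe was half its present size.
print(scale_factor_ratio(1.0))  # -> 0.5
```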

The second point is the apparent distance. This is generally less accurate, but can be based on a number of different things, depending upon how we measure it. The most common is measuring distance from apparent brightness, as with supernovae. The further away an object is, the dimmer it appears.

When we measure the distances and redshifts of lots and lots of objects, we get a reasonably-accurate picture of how our universe has expanded over time.

4. Mar 11, 2012

### Chronos

Another useful and confirming indicator is the angular diameter distance.

5. Mar 11, 2012

### Chalnoth

Which, for the uninitiated, is how big it looks. Curiously enough, at very large distances objects actually start to look larger the further away they are.
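A quick numerical sketch of why that happens, using a toy flat-ΛCDM model (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7 are assumed round numbers, not values from this thread): the angular-diameter distance is d_A = d_C / (1 + z), and while the comoving distance d_C keeps growing with z, the 1/(1+z) factor eventually wins, so d_A peaks (around z ≈ 1.6 for these parameters) and more distant objects look bigger again.

```python
import math

H0, OM, OL = 70.0, 0.3, 0.7        # assumed toy parameters
D_H = 299792.458 / H0              # Hubble distance in Mpc

def comoving(z, steps=2000):
    # Midpoint integration of D_H * integral_0^z dz' / E(z'), E = sqrt(OM(1+z)^3 + OL)
    dz = z / steps
    return D_H * sum(dz / math.sqrt(OM * (1 + (i + 0.5) * dz)**3 + OL)
                     for i in range(steps))

def angular_diameter(z):
    # d_A = d_C / (1 + z): the extra 1/(1+z) eventually beats the growth of d_C.
    return comoving(z) / (1 + z)

# Scan redshifts and find where the apparent size is smallest (d_A largest).
zs = [0.5 + 0.1 * i for i in range(40)]
peak = max(zs, key=angular_diameter)
```

Beyond that peak redshift, an object of fixed physical size subtends a *larger* angle the further away it is, which is the curious behavior mentioned above.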

6. Mar 11, 2012

### Froglet

Thanks for your replies...I'm still a bit hazy on this, I'm afraid.

I think part of my problem is that I'm imagining the redshift of a galaxy to be produced by a 'police siren' type Doppler Shift rather than a cosmological redshift. They aren't the same thing, are they?

Am I right in saying that a cosmological redshift tells us how much the Universe has 'stretched' from the point of its formation up until now (our present time), whereas a standard Doppler shift from my galaxy (distance=10 billion light years) would simply be telling us how quickly that galaxy was receding 10 billion years ago?

I'm still struggling with this look-back time thing. Say we use a secondary distance indicator to a galaxy, like a Type Ia supernova that we observe in that galaxy, and we measure it to be 5 billion light years away. Is this the true distance to the galaxy *now*, or the distance that it was at some time in the past?

After all, when we observe Betelgeuse in Orion (distance = 640 light years) aren't we seeing that star not as it is now, but as it was 640 years ago?

7. Mar 11, 2012

### Chalnoth

That depends entirely upon which distance measure you use.

The normal distance measure that most people use is the current distance, which can be understood as the light travel time that we would get if we stopped the expansion right now, freezing everything in place, and bounced some light rays around.

A really extreme example of this is the cosmic microwave background. The matter that emitted the part of the cosmic microwave background which we see today was a mere 43 million light years away. However, our universe has expanded by a factor of 1090 since then (meaning distances are now 1090 times as large as they were when the CMB was emitted). So the matter that emitted the CMB which we see now some 13.7 billion years ago is now 47 billion light years away.

The reason why the light travel time of 13.7 billion years lies between the emission distance and the current distance is that the expansion rate has been slowing down for most of the past 13.7 billion years. Early on, the expansion was extremely fast, so fast that a light ray starting 43 million light years away and traveling towards us was actually losing ground: the expansion created more space between us and the light ray than the light ray could traverse at any given time. Eventually the expansion slowed to the point that this light ray was able to start making headway, finally arriving at Earth some 13.7 billion years after it started.
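The "distance now" figure quoted above can be roughly reproduced with a short numerical integration. This is only a sketch under assumed round-number parameters (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7, radiation ignored), so the answer comes out a bit below the quoted value:

```python
import math

H0 = 70.0                      # km/s/Mpc (assumed)
OM, OL = 0.3, 0.7              # matter and dark-energy fractions (assumed)
D_H = 299792.458 / H0          # Hubble distance in Mpc
MPC_TO_GLY = 3.2616e-3         # 1 Mpc is about 3.2616 million light years

def comoving_distance_gly(z, steps=20000):
    """Proper distance now: D = D_H * integral_0^z dz' / E(z'), E = sqrt(OM(1+z)^3 + OL)."""
    dz = z / steps
    integral = sum(dz / math.sqrt(OM * (1 + (i + 0.5) * dz)**3 + OL)
                   for i in range(steps))
    return D_H * integral * MPC_TO_GLY

# Distance now to the matter that emitted the CMB (z ~ 1090): roughly 45 Gly
# with these toy numbers, in the same ballpark as the ~47 Gly quoted above.
print(comoving_distance_gly(1090))
```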

8. Mar 11, 2012

### Xotica

Correct. Remember, the speed of light (c) is invariant. If you go outdoors and glance at the sun, the sunlight you currently see is actually a relic: approximately eight minutes have elapsed between photon emission and retinal detection.

9. Mar 11, 2012

### Froglet

So using this normal distance measure, if I observed a galaxy to be five billion light years away using the supernova method, then I would say that it was *actually* five billion light years away at this present time. Is that correct?

10. Mar 11, 2012

### Chalnoth

The problem is, "actually" is a tricky concept in General Relativity, at least when it comes to distances. There is no "actual distance" in General Relativity. Any distance measure we come up with is just a matter of convention.

11. Mar 11, 2012

### marcus

I completely agree. It's also a concise, intuitive, well-worded statement that captures the gist of what is essential and what initially gives people trouble. You are using the conventional idea of cosmic time (as measured by observers at rest relative to the CMB) when you say "now", so that idea, and the idea of current distance, is already implicitly out there. Froglet asks a natural question about this normal or current distance measure:

Yeah, I think that's right. It's a natural follow-up question to make sure you understand. The actual now distance can be measured using a standard candle, like the blow-up of a known type of supernova. If we have a good idea of the original wavelengths and "wattage" we would have seen from, say, 100 light years away, then we look at how much dimmer the light is and how much the wavelengths are elongated (which reduces the energy of the light and contributes to the dimming). After taking account of whatever else could have dimmed the light, the MAIN reason is the distance: the energy is spread out over a much larger sphere whose radius is the current distance.

Chally can correct me if I'm wrong, but I would say, and I think you, Froglet, understand it this way too, that when we have a standard candle we are actually directly measuring the current distance (after some adjustments for things like the wavelengths being stretched out and slight attenuation due to dust, etc.).

The only quibble is about how we SAY it. Most folks aren't comfortable saying *actual* distance because it seems to slight the other distance measures in use.
The technically correct term is something like the current "proper" distance: the distance that would be measured at a particular moment of universal time, in the way Chalnoth described, by freezing expansion and using some conventional method like radar.
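The dimming described above can be sketched numerically. The two redshift factors (the photons lose energy, and they also arrive at a slower rate) are conventionally folded into a "luminosity distance" d_L = (1 + z) × (distance now); the function names here are just for this note:

```python
import math

def luminosity_distance(d_now, z):
    # Flux from a standard candle falls as 1/d_L^2, with d_L = (1 + z) * d_now:
    # one factor of (1+z) for the redshifted photon energy, and one for the
    # stretched-out arrival rate of the photons.
    return (1.0 + z) * d_now

def observed_flux(luminosity, d_now, z):
    # Inverse-square law over the sphere of radius d_L (dust etc. ignored).
    return luminosity / (4.0 * math.pi * luminosity_distance(d_now, z)**2)
```

Inverting `observed_flux` for `d_now`, given a known intrinsic luminosity, is the standard-candle measurement described above.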

Last edited: Mar 11, 2012
12. Mar 11, 2012

### Froglet

Thanks for everyone's feedback on this, I've found it really interesting. I think I've sorted out my initial misconception.

Okay, here goes...

I take the spectrum of a galaxy and observe its redshift. Using the Hubble Relation, I calculate the galaxy's distance to be ten billion light years.

My problem was: I'm observing the galaxy as it was in the distant past, so doesn't that mean that it is now much further away than I observe it?

The answer is no, because the light that I observed from the galaxy was emitted when the Universe was much smaller, and the galaxy was much closer to me, so in the extra time that has elapsed since the light was emitted, the expansion of the Universe has carried the galaxy out to where I'm observing it to be right now.

So the redshift that I observed is directly proportional to how far the galaxy has been carried along by the stretching of space-time since the light was emitted.

How does this sound?

13. Mar 11, 2012

### marcus

Chalnoth is the main person responding to your questions in this thread so I should wait. But I have to say that sounds right on!
By convention you have to add one to the redshift z number to get the ratio of wavelengths (just an occasionally annoying detail). A redshift z = 0.5 means that the wavelength now is 1+z = 1.5 times what it was when the light was emitted.

And that 1+z is also the ratio of distances (aside from a little random motion things might have). The distance now is 1.5 times what it was then (for our z=.5 example).
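In code form (trivial, but it makes the bookkeeping explicit; the names are just for this sketch):

```python
def wavelength_ratio(z):
    """lambda_observed / lambda_emitted = 1 + z."""
    return 1.0 + z

def distance_then(d_now, z):
    """Proper distance at emission = distance now / (1 + z), ignoring peculiar motion."""
    return d_now / (1.0 + z)

# The z = 0.5 example: wavelengths (and distances) are stretched by 1.5,
# so a galaxy 15 Gly away now was 10 Gly away when the light left it.
print(wavelength_ratio(0.5))      # -> 1.5
print(distance_then(15.0, 0.5))   # -> 10.0
```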

If you want some illustrative examples, I keep a link handy, to an easy to use calculator, in my signature.
http://www.einsteins-theory-of-relativity-4engineers.com/cosmocalc.htm
Just type .5 into the z box and it will tell you distance then (when light emitted) and distance now (when light received). And light travel time. And recession rates then and now, and so on.
The calculator is called something like "cosmocalc" or "cosmological calculator 2010" because it is based on 2010 data and best-fit model parameters. It's one of many such online thingees. A guy named Jorry who posts here at PF set it up.

Last edited: Mar 11, 2012
14. Mar 12, 2012

### jobigoud

I am sorry if I'm about to add to the confusion by not understanding what you mean, but I think you had it right initially and that you are mistaken now.

- The observed galaxy is indeed further away now than what we observe it to be (if we measure its distance using the lookback time).
- It was closer to us 10 billion years ago, but the recession speed is not limited to the speed of light (it can be greater).
- What has always puzzled me is that this "look back" distance we measure, in light years, doesn't correspond to any physical distance. It's not the distance separating us from the object now, nor is it the distance separating us from it at the time the photons were emitted. It doesn't have any reality at a single point in time: it is the distance between us now and it then. It's much more a measure of time, the time the photons spent reaching us, than a meaningful measure of distance.
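That point, that the light-travel "distance" is really a time, can be made concrete: the lookback time is its own integral, distinct from both the emission distance and the distance now. A sketch with assumed round-number parameters (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7, radiation ignored):

```python
import math

H0, OM, OL = 70.0, 0.3, 0.7     # assumed toy parameters
HUBBLE_TIME_GYR = 977.8 / H0    # 1/H0 in Gyr, for H0 in km/s/Mpc

def lookback_time_gyr(z, steps=20000):
    """t_lb = (1/H0) * integral_0^z dz' / ((1+z') E(z')), E = sqrt(OM(1+z)^3 + OL)."""
    dz = z / steps
    return HUBBLE_TIME_GYR * sum(
        dz / ((1 + zp) * math.sqrt(OM * (1 + zp)**3 + OL))
        for zp in ((i + 0.5) * dz for i in range(steps)))

# A galaxy at z = 0.5 is seen roughly 5 billion years in the past with these
# toy numbers, even though neither its distance then nor its distance now
# is 5 billion light years.
print(lookback_time_gyr(0.5))
```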

Last edited: Mar 12, 2012
15. Mar 12, 2012

### Froglet

I thought my solution seemed a bit too simple!

But I've had another thought: isn't the important issue that the redshift-distance measurement scale is a reasonable measure of *relative* distance rather than *absolute* distance, because the scale is calibrated using other 'rungs' on the cosmological distance ladder (Type Ia supernovae, Tully-Fisher, Cepheids, etc.)?

16. Mar 12, 2012

### Chalnoth

Well, there are a few absolute distance measurements, which generally use particular geometries of a specific object. For example, the supernova SN 1987A, prior to exploding, expelled a couple of rings of matter. I'm not entirely sure what sort of violent past event caused these rings of matter to be expelled, but the point is that they were expelled, and at the time the supernova went off, these rings of matter were approximately one light year away from the supernova.

How do we know this? Well, we saw those rings illuminate right around a year after the supernova went off. Furthermore, this supernova was close enough that we can actually observe the apparent size of these rings using the Hubble space telescope. So we use the time it took to illuminate these rings to get an accurate measure of the distance the rings are from the source, and then the apparent size of the rings in our telescope tells us, to a very high degree of accuracy, how far away the supernova was. And it was about 168,000 light years away.
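The geometry of that measurement is simple enough to sketch: a physical size from the light-travel delay, an angular size from the telescope, and the small-angle formula. The numbers below are illustrative stand-ins in the spirit of the method (a light-year-scale ring radius and a sub-arcsecond apparent radius), not the published values; the real analysis also has to account for the ring's inclination.

```python
import math

ring_radius_ly = 0.66          # physical ring radius from the light delay (assumed)
angular_radius_arcsec = 0.81   # apparent ring radius in the telescope (assumed)

# Small-angle formula: distance = physical size / angle-in-radians.
theta = angular_radius_arcsec * math.pi / (180.0 * 3600.0)
distance_ly = ring_radius_ly / theta
print(round(distance_ly))      # on the order of the quoted 168,000 light years
```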

17. Mar 12, 2012

### Froglet

As I understand it, the first 'rung' on the ladder would be finding the distance to Venus at greatest elongation using radar-ranging, which is an absolute measure (I think). Then we use some basic trigonometry to find the Earth's distance to the Sun, which could then be used to calibrate stellar parallax measurements to the nearby stars and so on.

18. Mar 12, 2012

### Chalnoth

Well, that has been the classical measure. But as time moves forward we are finding more and more absolute measurements of distances of things further away. This significantly reduces the number of "rungs" on the distance ladder, improving the overall accuracy and reducing the impact of systematic effects.

Incidentally, it is actually entirely possible to use supernova data to measure the expansion of the universe without any knowledge whatsoever of their intrinsic brightness (and therefore no information at all about their absolute distance). All you need is an independent measurement of the expansion rate, or some other measurement that constrains the cosmological parameters in a significantly different way than supernovae do.