# Space Expanding Proportional to Distance

1. Dec 23, 2015

### Daniel K

Hi guys.
I have a question regarding the expansion of space.
Alright, so I know that space expands proportionally to the amount of distance. However, I'm not sure why this occurs. Can someone inform me on that?

Additionally, I know any individual placed anywhere in the universe would observe galaxies closer to them moving at slower speeds than galaxies that are farther away. However I don't see how this is possible. I'll elaborate on that.

Let's say that there are two groups of observers. One observer group is closer to galaxy A than the other group and they say that galaxy A has moved 50 light years away from galaxy B in the past 1 million years. The other group, being stationed farther away from galaxy A, communicates that they observe galaxy A has moved 70 light years away from galaxy B in the past 1 million years. How far has galaxy A really moved from galaxy B?

Thanks!

2. Dec 24, 2015

### Jorrie

Nobody can be sure of the "why" of expanding space. There are theories that give us some clues, e.g. cosmic inflation, which gave the spatial expansion a severe kick-start and then left it to expand at a decreasing rate. Google "cosmic inflation" to learn about that.

Considering large-scale expansion only, space expands at the same rate everywhere at the same cosmic time. The recession rate at a given time depends on the Hubble value H(t) and the proper distance d between the galaxies and observers: $v_{rec} = H(t)\,d$. The distance that you are after is simple to calculate: assume a value of H(t) and the distances between the various observers and galaxies, and do the sums. Note that H(t) changes negligibly over a million years, so you can use $H_0 \approx 68$ km/s/Mpc, which converts to 1/144 % per million years.
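Jorrie's rule $v_{rec} = H(t)\,d$ and the quoted unit conversion can be sketched numerically. This is a minimal illustration only; the constants are standard unit conversions, and the distances fed in at the end are arbitrary example values:

```python
# Hubble's law sketch: recession rate is proportional to proper distance.
# Assumes H0 = 68 km/s/Mpc, as quoted above.

H0_KM_S_MPC = 68.0
KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
S_PER_MYR = 3.156e13     # seconds in one million years

# Fractional growth of a distance per million years (units cancel):
growth_per_myr = H0_KM_S_MPC / KM_PER_MPC * S_PER_MYR
print(f"growth per million years: {growth_per_myr:.4%}")  # about 1/144 %

def recession_rate(distance_mpc: float) -> float:
    """Recession rate in km/s for a galaxy at the given proper distance."""
    return H0_KM_S_MPC * distance_mpc

# Twice the distance gives twice the recession rate.
print(recession_rate(100.0), recession_rate(200.0))  # 6800.0 13600.0
```

The same proportionality is what makes the original question answerable: pick distances, multiply by H, and compare.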

3. Dec 24, 2015

### Daniel K

Thanks for the response, although I still have a few remaining questions.
I understand this. My confusion stems from why an object's distance is proportional to its recession speed.

I'm not sure if I completely follow this, but I think I see where my confusion stems from. Is it true that a more distant observer would see one galaxy receding from another galaxy faster than a closer observer would? Sorry if this is confusing.

4. Dec 24, 2015

### Bandersnatch

No, it's not true.
Say the two observers are labelled 1 and 2.
Observer 1 is at distance d from galaxy A, and distance 2d from galaxy B. I.e., the separation between the two galaxies is d. He sees that galaxy A is receding at velocity $V_A=dH$, whereas galaxy B is receding at $V_B=2dH=2V_A$. The relative velocity between the two galaxies is $V_r=V_B-V_A=dH$.
Observer 2 is at distance 2d from galaxy A, and galaxy B is 3d away. The recession velocities observed are $V_A=2dH$ and $V_B=3dH$. The relative recession velocity between the two is $V_r=V_B-V_A=dH$, i.e. it's the same.
It's also the same as the recession velocity of galaxy A as seen from galaxy B, since the distance between them is d, and V=dH.

In other words, the observed relative recession velocity between two objects depends only on the distance between those objects, and not on how far they are from the observer.
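Bandersnatch's cancellation can be checked numerically. This is a sketch in one dimension with both galaxies on the same side of each observer (matching the setup above); the value of H and the positions are arbitrary illustrative numbers:

```python
# Check that the relative recession velocity between two galaxies does not
# depend on where the observer sits (1D sketch, arbitrary units).

H = 0.07  # Hubble value in arbitrary units

def relative_recession(obs: float, gal_a: float, gal_b: float) -> float:
    """Difference of the recession velocities of B and A as seen from obs."""
    v_a = H * abs(gal_a - obs)
    v_b = H * abs(gal_b - obs)
    return v_b - v_a

d = 100.0
# Observer 1: A at distance d, B at 2d.  Observer 2: A at 2d, B at 3d.
print(relative_recession(0.0, d, 2 * d))   # H*d
print(relative_recession(-d, d, 2 * d))    # same result: H*d
```

Both calls return $H d$: the observer's distance cancels, exactly as in the algebra above.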

5. Dec 24, 2015

### Jorrie

True, except that you should avoid using "traveling" and "faster" when we are talking about galaxies that recede from us. It creates wrong impressions and confusion. The receding galaxies of the standard model are not "traveling" anywhere - they are comoving with the expansion of space and as such can be viewed as stationary in their local space. We refer to their increasing proper distance as recession and speak of recession rate, not speed.

A distant galaxy that is twice as far from us recedes at twice the recession rate of the nearer galaxy.

If you have not done so, read the balloon analogy in the cosmology FAQ. It makes the concepts very easy to understand. I am trying to give you pointers toward figuring these things out yourself, rather than just telling you that it works so and so...

Edit: I see now that you have asked a slightly different question from the one that I answered. Bandersnatch has answered your original question.

Last edited: Dec 24, 2015
6. Dec 24, 2015

### Bandersnatch

Jorrie, it's not true. The relative recession velocity is the same.

7. Dec 24, 2015

### Jorrie

Yup, sorry, I've corrected it in a note. The question was formulated rather strangely.

8. Dec 24, 2015

### Daniel K

Let me correct what I said.
I was asking if a closer observer would see two galaxies receding from each other more slowly, and therefore covering less distance, than a more distant observer would.

9. Dec 24, 2015

### Jorrie

Your question is still very imprecise, but generally, the answer is no. Bandersnatch gave you the complete rationale for that. The relative recession rate between two distant galaxies depends solely on their distance apart and the Hubble constant, not on the distance of the observer (which essentially cancels out in the sums that Bandersnatch did).

10. Dec 24, 2015

### Daniel K

So if you measure the recession rate between two objects it won't matter how close you are?

11. Dec 24, 2015

### Bandersnatch

That's right. All observers, everywhere, see all distances in the universe increasing at the same fractional rate. If you measure the distance between the same pair of galaxies, you will observe it growing at the same rate no matter how far from it you are.

12. Dec 24, 2015

### phinds

This is common pop-science terminology and it even slops over into serious discussions sometimes, but that doesn't make it right. The consensus is that space does NOT expand, because it is not a thing, like a sheet of rubber, that even CAN expand. What happens is that things get farther apart, which is what is called recession. I refer you to "metric expansion" and suggest you read the balloon analogy article referenced in my signature, which discusses this further.