This is my first post, but I've been a spectator for a long time now. I've been working through some of the basics of cosmic expansion, and there's one apparent contradiction I came upon that I can't seem to resolve. I've looked around some of the similar threads but couldn't find anything satisfying, so I'll ask it myself.

If the expansion of the universe can be described using the scale factor as d(t) = d0*a(t), then by differentiating I find that d'(t) = d0*a'(t) (I'm just following http://en.wikipedia.org/wiki/Scale_factor_(cosmology)).

So this tells me that if a'(t) = const (as was thought to be the case before the discovery of accelerating expansion), then the recession speed d'(t) of a given galaxy should be constant, right?

But if I now look at Hubble's law (which I can even derive from the formula for d(t)), I find that d'(t) = (a'(t)/a(t))*d(t) = H*d(t), or simply v = H*D. Doesn't this mean that as the distance becomes greater, the speed also becomes greater, so the galaxy is accelerating? Somehow these two expressions must be consistent. What's up?!
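To make the two expressions concrete, here is a minimal numerical sketch, assuming a toy linear scale factor a(t) = 1 + k*t (so a'(t) = k is constant) and made-up illustrative values for k and d0. It evaluates both d'(t) = d0*a'(t) and Hubble's law v = H(t)*d(t) for the same galaxy at several times:

```python
# Toy check: constant a'(t) vs Hubble's law v = H*D.
# k and d0 are illustrative values, not physical ones.
k = 0.1           # a'(t), assumed constant
d0 = 100.0        # the galaxy's distance at t = 0 (when a = 1)

def a(t):
    """Linear scale factor: a(t) = 1 + k*t, so a'(t) = k."""
    return 1.0 + k * t

def d(t):
    """Proper distance: d(t) = d0 * a(t)."""
    return d0 * a(t)

for t in [0.0, 5.0, 10.0]:
    H = k / a(t)               # H(t) = a'(t)/a(t), decreases with time
    v_hubble = H * d(t)        # Hubble's law: v = H(t) * D
    v_direct = d0 * k          # differentiated form: d'(t) = d0 * a'(t)
    print(t, v_hubble, v_direct)
```

Both columns come out identical at every t: Hubble's law gives H(t)*d(t) = (k/a)*(d0*a) = d0*k, the same constant as d'(t). The galaxy's distance D grows, but H(t) shrinks in exactly the same proportion, so v = H*D applied to one galaxy over time stays constant, while at any fixed time more distant galaxies do recede faster.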

I'd be grateful for any help.