# Age of the universe in earth's reference frame?

VantagePoint72
I understand that the ~13.8 billion year age of the universe is in the cosmological frame, i.e., representing the coordinate time elapsed in a comoving reference frame. Of course this means (as has been discussed plenty of times here) that observers in different frames would measure a different age due to Lorentz and gravitational time dilation effects. What I haven't been able to find is a conversion of this age to "earth time".

Now, I realize there are a number of issues with such a conversion; not least among the difficulties is that the Earth itself is only about 4.5 billion years old. So, perhaps we can consider a more reasonable goal. Our measurements of the age of the Earth are invariably made with earth-based clocks: that is, according to the radioactive decay of isotopes that, to very good approximation, have been at rest in the Earth's reference frame for their whole existence. Gravitational effects will also have been quite small, since the Earth's geometry and makeup haven't changed much (as far as GR is concerned) since its formation. Thus, we can comfortably interpret the 4.5 billion year age of the Earth as being our time, according to our clocks.

Would it be possible for us to determine how much comoving time elapsed for the universe during these 4.5 billion Earth years? Or, equivalently, if a comoving observer were created with the universe at the instant of the Big Bang, how much proper time would they measure before they observed the formation of the Earth (assuming their worldline brought them nearby in space at just the right time)? We know our current velocity with respect to the CMB from measurements of the dipole anisotropy, but do our cosmological models allow us to calculate its past values, as we (presumably) would need to for such a conversion?


Gold Member
Dearly Missed
That's a good question to be asking. I think there's hardly any difference and I will explain why I think that. But it's certainly a question we need to ask.
I will just address the effect of the Solar System's motion relative to observers at rest with respect to the ancient light (the microwave background).

It is only about 370 km/s so on the order of 1/1000 of the speed of light.

I guess you could think of that as made up of the Galaxy's speed of around 600 km/s partly canceled by the Solar System's orbital speed of around 250 km/s, which at the present time points in a somewhat backwards direction. When we come around to the other side of the Milky Way, then at least for a while our orbital speed might be roughly aligned with the Galaxy's speed and we will be going roughly 850 km/s relative to the Background.

But still that is only on the order of 3/1000 of the speed of light.

As I recall the "Galactic Year" or period of the Solar System's orbit around Galactic Center is about 225 to 250 million years. So you can figure out roughly how many times we have been around the Galactic Center. With speed relative to background slowly fluctuating as indicated.
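As a quick sanity check of that orbit count, here is a one-step Python calculation using the rough figures above (4.5 billion years for the Earth and a ~240-million-year galactic year — both just the approximate numbers from this post):

```python
earth_age = 4.5e9        # Earth's age in years
galactic_year = 2.4e8    # one orbit of the Galactic Center, ~225-250 million years

# How many trips around the Galactic Center the Solar System has made
# since the Earth formed (very rough, since the period itself is uncertain)
orbits = earth_age / galactic_year
print(f"roughly {orbits:.0f} trips around the Galactic Center")
```

So on the order of twenty orbits, with the speed relative to the Background slowly fluctuating over each one.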

It's a bit like a bug riding on a sailing Frisbee, sometimes his rotation speed partially cancels the forward motion and sometimes it adds. And most of the time it doesn't exactly do either.

What I'm trying to say is that any speed/acceleration effect has to be really tiny. A comoving observer---i.e. one at rest relative to the CMB---would estimate almost the same age of the universe as we would.

Also I think that the gravitational well of the Galaxy that we are in is rather shallow. I forget what the escape speed from the Galaxy is. Probably something on the order of 500 km/s (from our position), I would guess. So gravitational dilation wouldn't significantly affect our measurement of time.

These are merely hunches on my part. I didn't work it out. There are several other people who might correct me on this and might even have some precise figures! This is just to get the ball rolling. Hopefully others will comment. It's late here. I'll check your thread tomorrow.
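For anyone who wants rough numbers, both effects can be sketched in a few lines of Python. The 370 km/s and the guessed ~500 km/s escape speed are the figures from above; the weak-field estimate $\Phi/c^2 \approx v_{esc}^2/2c^2$ for the gravitational term is my own assumption, so treat this as a back-of-envelope check, not a calculation:

```python
import math

C = 299_792.458      # speed of light, km/s
V_CMB = 370.0        # our speed relative to the CMB, km/s (figure from above)
V_ESC = 500.0        # guessed escape speed from the Galaxy at our position, km/s

# Special-relativistic slowdown of our clocks relative to a comoving observer:
# 1 - sqrt(1 - v^2/c^2), which is tiny for v/c ~ 1/1000
sr_slowdown = 1 - math.sqrt(1 - (V_CMB / C) ** 2)

# Weak-field gravitational slowdown, roughly Phi/c^2 ~ v_esc^2 / (2 c^2)
gr_slowdown = V_ESC ** 2 / (2 * C ** 2)

total = sr_slowdown + gr_slowdown
print(f"fractional slowdown: {total:.1e}")
print(f"offset over 4.5 Gyr: ~{total * 4.5e9:,.0f} years")
```

Both terms come out around a part in a million, so the accumulated offset over the Earth's whole history is only thousands of years — far below the precision of any cosmological age estimate.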

Homework Helper
I'll go along with that ... or: any difference will be less than the uncertainty in the approximate age of the Earth and the comoving time involved in the first place. However, I eagerly await a better answer :)

VantagePoint72
Thanks for the initial replies. One point that perhaps should be clarified before others weigh in. Saying that $t$ years have passed during some interval in the cosmological frame could be interpreted in one of two ways: either (1) the comoving observer measured $t$ in some convenient units—say seconds—by their watch and then converted units as $1\ \mathrm{yr} \approx 3.2 \times 10^7\ \mathrm{s}$, or (2) the observer had a telescope pointed at our solar system and counted out $t$ revolutions of the Earth around the Sun. The slight ambiguity is due to the fact that we're quoting the age of the universe in one reference frame using a unit of time that's naturally defined in another—the revolutions of our planet.

My understanding is that when cosmologists say the universe is 14 (ish) billion years old in the cosmological frame, they mean in the sense of (1). We would say that observers in any frame would all give the age of the universe (or the earth) as the same in Earth years in the sense of (2), since that would be tantamount to converting to their frame and then back to the Earth's frame—i.e., in the absence of GR, applying a Lorentz transformation and then the inverse transformation. Aside from it then being silly to say that the quoted time is in any sense according to the cosmological frame, this is obviously impossible for the age of the entire universe since the Earth hasn't been around that long to be watched! Nonetheless, I bring this up because there was at least one previous thread (that, unfortunately, I can't find) in which a commenter said the age of the universe in Earth's reference frame is exactly the same as in the cosmological frame because, presumably, he believed years were being used in the sense of (2). I think this is wrong, and just wanted to bring this point up in advance in case it resurfaced.

Again, thanks for the preliminary thoughts, I hope others add to them!

Homework Helper
The slight ambiguity is due to the fact that we're quoting the age of the universe in one reference frame using a unit of a time that's naturally defined in another—the revolutions of our planet.
I don't think anyone uses revolutions of the Earth as the basis for measurements of time in physics discussions. At least, not since the 60's. These days it is sort of taken for granted that our standard clock is the hyperfine transition of the ground state of caesium-133, with the usual ideas about what constitutes "proper time".

I would agree that "cosmological time" would be reasonably interpreted as that measured by a clock stationary in the cosmological frame. I think OP has done well to describe what this means in the context of the question being asked.

Fair enough to try to head off a known misunderstanding though.

DaveDash
My understanding is that because the universe is assumed to be isotropic and homogeneous, proper time and coordinate time are the same thing in any reference frame, on a large enough scale.

However, if we were on a comet traveling along at 0.9c and we were asked to calculate the age of the Earth, watching our solar system for the creation of the Earth, what would we determine?

We would simply determine the Earth was the same age as it was. We would only see the Earth later than when it was created (due to the long time it would take light to travel to us). However, we ourselves would have been traveling for a longer time.
So we would still think the Earth is the same age, but we would disagree about exactly WHEN the Earth was created, and we would also disagree with the current date and time.

Conversely when we look at stars and galaxies moving away from us, we don't disagree about their age (in fact proper time - the age of something - is invariant).

That's at least my understanding. Maybe I did not understand the question however.

VantagePoint72
We would only see the Earth later than when it was created (due to the long time it would take light to travel to us)

Sorry, but this line makes me pretty skeptical of the rest of what you've said. Relativity has nothing to do with the transit time of light, so I'm not sure how that worked its way into your answer. The time assigned to an event by an observer in some reference frame is the 'true' time it occurs in that frame, not the time its light reaches the observer. I don't really get the rest of the point you're making about everyone agreeing on the age (with, if I'm reading right, relativity of simultaneity taking care of the bookkeeping). Suppose the Earth were completely destroyed by a giant meteor tomorrow and we, somehow overcoming the obstacle of being dead, asked, "What age did the Earth reach before it was destroyed?" We're asking about the time interval between two well-defined events: the formation and destruction of the planet. By time dilation, this interval measured in our proper time will be longer than in the proper time of an observer who took off in a rocket after the Earth's formation, jetted around for a while, and flew home before the meteor strike. Hence, you have someone who disagrees with our measurement of the Earth's age. Proper time between two events is NOT invariant; it depends on the path you take between them.

Homework Helper
@LastOneStanding: that's what I was thinking too: different observers disagree about how long things take.

Gold Member
Dearly Missed
...
Would it be possible for us to determine how much comoving time elapsed for the universe during these 4.5 billion Earth years? Or, equivalently, if a comoving observer were created with the universe at the instant of the Big Bang, how much proper time would they measure before they observed the formation of the Earth (assuming their worldline brought them nearby in space at just the right time)? We know our current velocity with respect to the CMB due to measurement anisotropies, but do our cosmological models allow us to calculate its past values, as we (presumably) would need to know for such a conversion?

...using a unit of a time that's naturally defined in another—the revolutions of our planet...

As I guess you know, time in the Earth frame is defined by the atomic clock and not by revolutions or rotations. Orbit period can vary. So when we give an estimate of how long ago something happened in Earth years, it does not refer to orbit periods but to some conventional number of Cesium clock seconds.

So then your question comes down to a question about Universe time. How does it relate to the Atomic Clock standard? We have to imagine a stationary Observer out in intergalactic space who has his own Cesium clock, which tells Universe time.

He points his telescope at us and observes our clocks. Ahah! he says, their Cesium is moving at 370 km/s, so it is a bit slowed down compared with my Cesium. So when they publish figures about the Age of Expansion (being 13.754 or 13.755 billion Earth Cesium years), unless they have accounted for their own motion they will have published a faulty estimate!

I personally, being comoving (i.e. stationary), he says, know the real age and they, poor fellows, have it wrong.

So that is kind of interesting! You asked a nice question! But we do not have to worry about the length of the Cesium second when Earth and sun are in another part of their orbit around the Milky Way center! Then the length of the Cesium second will be slightly different (from the Observer's standpoint). But that does not matter. We publish our estimates of the Age in terms of the second we have NOW. We do want them to be right in terms of the standard units of time we have now.

We want our whole history of the expansion of the U to be right in terms of today's standardized units. So the answer to your question comes down to asking if cosmologists take into account the 370 km/s motion that we have NOW.

That 370 km/s motion relative to the CMB will affect how our second compares with the Observer's second. So it could affect the accuracy of our estimates! in the Observer's eyes (sub specie aeternitatis, so to speak).

But finally then, by how much? We are talking 1/1000 of the speed of light.
So the clock-rate factor is the square root of (1 minus one millionth), i.e., about $1 - 5\times 10^{-7}$.
The upshot is that our estimates of age and plots of times when things happened do not achieve that level of precision.
They would have to get a lot better before anyone would bother to make that correction so that they are really correct in terms of Universe time. So let's think of them as already correct, out to the indicated level of precision.
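Spelling out that square root in Python (the 1/1000 is the rough figure from above, and 13.8 billion years is the usual round number for the age):

```python
import math

beta = 1e-3                     # v/c ~ 1/1000, as above
rate = math.sqrt(1 - beta**2)   # our clock rate relative to Universe time
slowdown = 1 - rate             # ~5e-7, i.e. half of one part per million

# Accumulated over the whole expansion, that is only thousands of years --
# far below the 13.754-vs-13.755 (i.e. million-year) precision quoted above.
print(f"slowdown: {slowdown:.1e}")
print(f"over 13.8 Gyr: ~{slowdown * 13.8e9:,.0f} years")
```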
It's actually a fun question, LastOne, thanks for raising it!

VantagePoint72
Thanks for revisiting this, Marcus. Question, though. Isn't this:
He points his telescope at us and observes our clocks. Ahah! he says, their Cesium is moving at 370 km/s, so it is a bit slowed down compared with my Cesium. So when they publish figures about the Age of Expansion (being 13.754 or 13.755 billion Earth Cesium years) unless they have taken account for their own motion they will have published a faulty estimate!
backwards? Since we quote our estimates for the universe's age in the cosmological frame, doesn't that mean that we're using the intergalactic observer's seconds? You said, "We publish our estimates of the Age in terms of the second we have NOW," but isn't that equivalent to saying, "We publish our estimates of the Age in terms of our own reference frame," which isn't true? We use the comoving frame! It's the fact that we know the age of the universe according to someone else's clock but not our own that is the problem.

Mind you, I do find your argument convincing that the difference will be negligible. Well, at least for converting the age of the Earth to cosmological time, not necessarily for converting the age of the universe to Terran time. In the latter case, I would think that our clocks, had we been around, would have diverged from the comoving clocks by quite a lot in the early stages of the universe when it was denser, inflating, etc. For the Earth's whole existence, though, the universe looked pretty much like it does now.

Gold Member
Dearly Missed
Since we quote our estimates for the universe's age in the cosmological frame, doesn't that mean that we're using the intergalactic observer's seconds?
No.
Our seconds are defined using our actual atomic clocks. A clock here on Earth would run a bit slower than one in intergalactic space.

We use his FRAME but we work in our UNIT. And the unit is defined using present-day clocks, which are affected by present-day motion (not motion in some past epoch).

You said, "We publish our estimates of the Age in terms of the second we have NOW," but isn't that equivalent to saying, "We publish our estimates of the Age in terms of our own reference frame,"

No. It is not equivalent. I think you may be confusing reference frame with unit.

Cosmology is done essentially from a comoving perspective. The solar system's motion is adjusted for. There is something called Universe time.

But science is done using the standard atomic clock second, as defined TODAY (not having to do with how an hypothetical Cesium clock might have run in some past era when the universe was denser and hotter, or when the Earth's orbital period was different etc etc.)

I would think that our clocks, had we been around, would have diverged from the comoving clocks by quite a lot in the early stages of the universe when it was denser, inflating, etc.

Yes. But that is irrelevant to the definition of the metric system of units.
=====================

But overall, as you acknowledge, the difference in time is so slight (between us and an observer at universe rest in intergalactic space) that the number of atomic clock seconds he counts (for some event to occur) and the number we count are essentially the same out to more decimal places than we ordinarily estimate stuff in cosmology.
The difference here seems too negligible to discuss further.

VantagePoint72
I understand that a reference frame is not the same as a unit—but it seems to me that quoting a measurement using a unit defined in another reference frame is identical to quoting the result in that reference frame. This is what I was discussing earlier in my comment about what exactly is meant by a year. Let me try to illustrate what I mean, and hopefully you will be able to point out exactly where my understanding has gone wrong:

Consider the standard twin paradox scenario. Alice stays on Earth and measures time in A-seconds. Bob takes the rocket trip and measures time in B-seconds. Now suppose we're interested in the time interval between when Bob departs and when he returns. As we know, Bob is accelerated and so will measure less proper time between the two events than Alice will. Or, in other words, on average B-seconds are longer than A-seconds. Now, if I understand you correctly, you're saying that the way we measure cosmic time would be analogous to Alice measuring the length of Bob's trip in Bob's reference frame, but using her own units; that is, using A-seconds. But the factor by which Bob's proper time is less than Alice's is the same factor by which an average B-second is longer than an A-second. To transform from "Alice's frame with A-seconds" to "Bob's frame with B-seconds", Alice multiplies her time interval by some factor, and then to transform from "Bob's frame with B-seconds" to "Bob's frame with A-seconds" she divides by the exact same factor. Thus, the length of the trip in Bob's frame using A-seconds is exactly the same as the length of the trip in Alice's frame.
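To make the cancellation concrete, here is a numerical sketch of the argument with made-up numbers (say Bob cruises at 0.6c, so the average dilation factor is 1.25; the specific values are invented purely for illustration):

```python
gamma = 1.25                  # average time-dilation factor (e.g. cruising at 0.6c)
trip_A_frame_A_sec = 10.0     # trip length in Alice's frame, in A-seconds

# Alice's frame -> Bob's frame: Bob's proper time is shorter by gamma,
# counted in his own B-seconds
trip_B_frame_B_sec = trip_A_frame_A_sec / gamma

# Now re-express that same interval in Alice's unit: on average,
# 1 B-second = gamma A-seconds, so multiply by the very same factor
trip_B_frame_A_sec = trip_B_frame_B_sec * gamma

# The two factors cancel exactly, which is the point being argued
print(trip_B_frame_A_sec == trip_A_frame_A_sec)   # True
```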

So, while units and reference frames are not the same thing, it seems to me that transforming to a particular reference frame and converting to the units of that reference frame yield the same result. Saying "In the comoving frame but using our seconds" is mathematically the same as saying "In our reference frame"...