Questioning the timescape model

  • Context: Undergrad 
  • Thread starter: timmdeeg
SUMMARY

The discussion centers on the comparison between the standard homogeneous cosmological model, ΛCDM, and the timescape cosmology, which incorporates the backreaction of inhomogeneities. The timescape model, as presented in the paper "Supernovae evidence for foundational change to cosmological models," suggests that dark energy is a geometric effect arising from the varying clock rates of cosmological observers. It has been shown to provide a superior fit to Type Ia supernova observations compared to ΛCDM, even though both models yield the same expansion factor of 1100 since recombination. The conversation highlights the implications of these findings for the understanding of cosmic expansion and the role of the equivalence principle.

PREREQUISITES
  • Understanding of ΛCDM cosmology and its implications
  • Familiarity with timescape cosmology and its theoretical framework
  • Knowledge of Type Ia supernova observations and their significance in cosmology
  • Basic grasp of the equivalence principle and its applications in cosmological models
NEXT STEPS
  • Research the implications of backreaction phenomena in cosmological models
  • Explore the mathematical framework of timescape cosmology and its predictions
  • Investigate the role of the equivalence principle in various cosmological contexts
  • Analyze recent Type Ia supernova data and its impact on cosmological model validation
USEFUL FOR

Astronomers, cosmologists, and researchers interested in the evolution of cosmological models, particularly those examining the nature of dark energy and cosmic expansion dynamics.

timmdeeg
"Supernovae evidence for foundational change to cosmological models" https://arxiv.org/pdf/2412.15143
The paper claims:

We compare the standard homogeneous cosmological model, i.e., spatially flat ΛCDM, and the timescape cosmology which invokes backreaction of inhomogeneities. Timescape, while statistically homogeneous and isotropic, departs from average Friedmann-Lemaître-Robertson-Walker evolution, and replaces dark energy by kinetic gravitational energy and its gradients, in explaining independent cosmological observations.

The paper seems well regarded in the community; here are some citations:

https://arxiv.org/pdf/2511.17160
5.1 The timescape model
The timescape cosmology [39, 40, 66–68] represents the first example in the literature of an averaged cosmological model applied to observational data [77–80]. In particular, it has been shown to provide an excellent fit to Type Ia supernova observations [77, 78], even outperforming the standard ΛCDM model in this context with recent observations [79, 80]. Interestingly, the model provides an alternative perspective on the origin of dark energy, proposing that it arises purely as a geometric effect, resulting from backreaction phenomena coupled to calibration effects of the varying clock rates among cosmological observers.
5.1.1 Timescape: theoretical framework
Within the timescape framework, the Universe is partitioned into two distinct domain classes, walls and voids [39, 40, 66–68]. Physically, walls correspond to marginally bound regions of the Universe, containing overdense matter structures, whereas voids are identified with underdense domains undergoing faster-than-average expansion.
Each domain is simplified and modelled by a dust-sourced FLRW spacetime, with no cosmological constant, and with a curvature depending on the domain type: spatially flat for walls and negatively curved for voids. ...
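(For orientation, and only as a schematic in standard notation that the quoted paper may not use verbatim: a dust-only FLRW domain with no Λ obeys
$$\Big(\frac{\dot a_w}{a_w}\Big)^2 = \frac{8\pi G}{3}\,\rho_w \quad\text{(walls, } k=0\text{)}, \qquad \Big(\frac{\dot a_v}{a_v}\Big)^2 = \frac{8\pi G}{3}\,\rho_v + \frac{|k_v|}{a_v^2} \quad\text{(voids, } k<0\text{)},$$
so the negatively curved, underdense void regions expand faster than the flat wall regions at late times.)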

https://inspirehep.net/files/12a747b2f7cf8415b0631049b4e98ac1
The void and wall regions are each defined by locally homogeneous and isotropic FLRW solutions with their own respective scale factors ##a_v## and ##a_w##, which are combined as a disjoint union to form the Buchert volume-average scale factor,
$$\bar{a}^3 = f_{vi}\, a_v^3 + f_{wi}\, a_w^3 , \qquad (1.55)$$
where ##f_{vi}## and ##f_{wi}## correspond to the initial fractions of void and wall regions by volume in the universe.
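As a quick numerical sanity check of eq. (1.55) (my own made-up numbers, not taken from the paper):

```python
# Minimal sketch of the Buchert-style volume average in eq. (1.55).
# The fractions and scale factors below are illustrative placeholders only.

f_vi, f_wi = 0.6, 0.4      # hypothetical initial void/wall volume fractions (sum to 1)
a_v, a_w = 1.2, 0.9        # hypothetical void and wall scale factors at some time

a_bar = (f_vi * a_v**3 + f_wi * a_w**3) ** (1 / 3)   # volume-average scale factor
print(round(a_bar, 3))     # ~1.099 for these numbers, between a_w and a_v
```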

https://arxiv.org/pdf/2504.01669
https://arxiv.org/pdf/2503.13391
https://arxiv.org/pdf/2502.20494
https://inspirehep.net/files/ee43eed060da936a2951abf3da5f1b68

Understood simply, it seems that it is mainly the voids "undergoing faster-than-average expansion" that in effect replace the assumption of the cosmological constant. But surely the model has to be confirmed by further investigations.

We know that since recombination the temperature dropped from around 3000 K then to 2.7 K now, which corresponds to an increase of the scale factor by a factor of about 1100 since then. So it seems that ΛCDM and timescape yield the same expansion of the universe till today in spite of causing accelerated expansion in very different ways. Coincidentally? Though this doesn't exclude that the expansion history is different. Would you agree with this reasoning?

Regarding the curvature, it is said above "spatially flat for walls and negatively curved for voids". Our universe is spatially flat according to the power spectrum of the cosmic microwave background. How does this fit with the quote? As far as I know the spatial curvature can't change over time.
 
timmdeeg said:
So it seems that ΛCDM and timescape yield the same expansion of the universe till today in spite of causing accelerated expansion in very different ways. Coincidentally?
No. The expansion factor of 1100 is based on data, not models--all models have to account for it somehow.

timmdeeg said:
Though this doesn't exclude that the expansion history is different.
Yes, that's how the different models can have the same expansion factor of 1100 from recombination to today despite having very different mechanisms: by varying the expansion history. The expansion factor of 1100 doesn't tell you how long it took between those two times, or how the expansion rate varied from then to now. Many very different models can all have the same expansion factor but still have varying times and expansion rates in between.
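As a toy illustration (numbers made up, only to show the point): a pure matter history with ##a \propto t^{2/3}## reaches a factor of 1100 when ##t_0 / t_{\rm rec} = 1100^{3/2} \approx 3.6\times 10^{4}##, while a pure de Sitter history ##a \propto e^{Ht}## reaches the same factor after ##\Delta t = \ln(1100)/H \approx 7/H##. Both match the observed factor, with completely different elapsed times and expansion rates in between.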
 
PeterDonis said:
No. The expansion factor of 1100 is based on data, not models--all models have to account for it somehow.
Thanks, got it.
 
timmdeeg said:
We know that since recombination the temperature dropped from around 3000 K then to 2.7 K now, which corresponds to an increase of the scale factor by a factor of about 1100 since then. So it seems that ΛCDM and timescape yield the same expansion of the universe till today in spite of causing accelerated expansion in very different ways. Coincidentally?
Like timmdeeg, I would like to know if it is a coincidence that the anomalous acceleration of the expansion of the universe can be reinterpreted as a gravitational effect of some kind, or if the equivalence principle has something to do with it.
 
javisot said:
I would like to know if it is a coincidence that the anomalous acceleration of the expansion of the universe can be reinterpreted as a gravitational effect of some kind
I'm not sure what you mean by "a coincidence". The model under discussion is a speculative model.

javisot said:
if the equivalence principle has something to do with it.
As far as I can tell, the equivalence principle plays no role in the speculative model under discussion here. One would not expect it to, since the effects being accounted for are not local--they extend over patches of spacetime much too large for spacetime curvature to be negligible.
 
PeterDonis said:
I'm not sure what you mean by "a coincidence". The model under discussion is a speculative model.


As far as I can tell, the equivalence principle plays no role in the speculative model under discussion here. One would not expect it to, since the effects being accounted for are not local--they extend over patches of spacetime much too large for spacetime curvature to be negligible.
In "Supernovae evidence for foundational change to cosmological models" they say

Physically, it is motivated by an extension of Einstein’s Strong Equivalence Principle to cosmological averages at small scales (∼ 4 – 15 Mpc) where perturbations to average isotropic expansion and average isotropic motion cannot be observationally distinguished (Wiltshire 2008).
 
javisot said:
Like timmdeeg, I would like to know if it is a coincidence that the anomalous acceleration of the expansion of the universe can be reinterpreted as a gravitational effect of some kind, or if the equivalence principle has something to do with it.
What I meant by "coincidence": there are two models (one established, the other one speculative) which explain the measured accelerated expansion today (using the same supernova data) very differently, but nevertheless agree that the universe has expanded by a factor of 1100 since recombination till today.
 
timmdeeg said:
but nevertheless agree that the universe has expanded by a factor of 1100 since recombination till today.
The point is that the ratio of scale factors is also the ratio of CMB temperatures. At recombination the temperature was about 3000 K by definition, because that's the temperature above which the universe is mostly opaque plasma, and today the temperature is about 2.7 K, so the ratio of scale factors is 1100 and all models agree on the expansion since recombination - unless they're completely abandoning FLRW-style cosmology and saying expansion isn't a thing at all, or contesting the plasma model, or contesting the measurement of the CMB temperature today. I don't think these models are doing any of that.
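Explicitly, since the CMB temperature scales as ##T \propto 1/a## (a kinematic redshift effect, not a model choice),
$$\frac{a_0}{a_{\rm rec}} = \frac{T_{\rm rec}}{T_0} \approx \frac{3000\ \mathrm{K}}{2.7\ \mathrm{K}} \approx 1100.$$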

Where they might differ (and I expect they do) is the functional form of ##a(t)##, and the values of ##t## at recombination and today.
 
Ibix said:
The point is that the ratio of scale factors is also the ratio of CMB temperatures.
Yes, but isn't it astonishing that otherwise very different models predict this very same ratio of scale factors? Of course they have to, so it's not a coincidence (wrong wording).
 
  • #10
Ibix said:
Where they might differ (and I expect they do) is the functional form of ##a(t)##, and the values of ##t## at recombination and today.
Good point. And the values of ##t## might depend on where one lives, in a wall or in a void.
 
  • #11
timmdeeg said:
Yes, but isn't it astonishing that otherwise very different models predict this very same ratio of scale factors?
No, because they share the notion of a scale factor, and with that the temperature ratio is the same as (actually, the inverse of) the scale factor ratio. The point is that "the time of recombination" is literally "the time at which the temperature was 1100 times what it is today", so in any model with a scale factor in the FLRW sense the scale factor was 1100 times smaller. It's pretty much a definition.
timmdeeg said:
Of course they have to, so it's not a coincidence (wrong wording).
It's a tautology, rather than a coincidence. If you had a model that broke the simple connection between scale factor and CMB temperature (somehow) and you still found the 1100 ratio, that would be a coincidence. I don't think timescape does that. It might argue (I'm not sure) that the 2.7 K measurement is off due to unaccounted effects of local density variation, but I doubt it would be a major error source in the 1100 number.
 
  • #12
Ibix said:
The point is that "the time of recombination" is literally "the time at which the temperature was 1100 times what it is today", so in any model with a scale factor in the FLRW sense the scale factor was 1100 times smaller. It's pretty much a definition.
So ##t## now isn't special. Right now both models predict the same "amount" of accelerated expansion where the temperature ratio is 1100. If this holds independently of ##t##, wouldn't it mean that in the far future both models predict the same "amount" of accelerated expansion when the temperature ratio is, say, 2000? Correct?

It's hard to trust. ΛCDM predicts asymptotically exponential expansion in the very far future. But timescape? Only if it mimics a time-independent Λ.
 
  • #13
timmdeeg said:
In "Supernovae evidence for foundational change to cosmological models" they say

Physically, it is motivated by an extension of Einstein’s Strong Equivalence Principle to cosmological averages at small scales (∼ 4 – 15 Mpc) where perturbations to average isotropic expansion and average isotropic motion cannot be observationally distinguished (Wiltshire 2008).
That immediately makes me question the validity of the paper. The whole point of the equivalence principle, in each of its forms (weak, Einstein, and strong), is that it only applies locally, over a scale where spacetime curvature is negligible and an individual patch of spacetime on that scale can be considered to be flat. That is not the case in any cosmological model, including the "foundational change" model proposed in the paper, on scales of 4 to 15 Mpc.
 
  • #14
timmdeeg said:
What I meant by "coincidence": there are two models (one established, the other one speculative) which explain the measured accelerated expansion today (using the same supernova data) very differently, but nevertheless agree that the universe has expanded by a factor of 1100 since recombination till today.
I already answered this in post #2: that expansion factor is an observational fact, which all models must account for. So obviously it's not a coincidence that two very different models that both claim to account for the observational facts will have the same property.
 
  • #15
timmdeeg said:
isn't it astonishing that otherwise very different models predict this very same ratio of scale factors?
No, it's what you should expect of any cosmological model that makes it to the stage of getting published in a peer-reviewed journal. It only seems "astonishing" to you because you don't see all the other models that have been thought of in various private contexts by cosmologists, and never get anywhere near being published because they fail this obvious and simple observational test.
 
  • #16
timmdeeg said:
If this holds independently of ##t##, wouldn't it mean that in the far future both models predict the same "amount" of accelerated expansion when the temperature ratio is, say, 2000? Correct?
Of course, because the temperature ratio has to be equal to the ratio of scale factors, and the ratio of scale factors is the ratio by which the universe has expanded during whatever time period you're considering. The same argument I and @Ibix have given for why the two have to match now, applies at any time.

There are many reasons to be skeptical of the timescape model, but this isn't one of them.
 
  • #17
timmdeeg said:
the values of ##t## might depend on where one lives, in a wall or in a void.
Not the values of coordinate time ##t##--the values of proper time at that coordinate time ##t## that have elapsed for a comoving observer (one who sees the universe around him as isotropic). In a wall, that proper time at a given coordinate time ##t## (which is determined by the average expansion of the universe) will be smaller than ##t##, while in a void, it will be larger.
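Schematically (notation mine, and only a sketch of how such a model can parametrise this): if ##\bar\gamma \equiv dt/d\tau_{\rm wall} \ge 1## is the lapse of a wall observer's clock relative to the volume-average time ##t##, then ##\tau_{\rm wall} = \int dt/\bar\gamma \le t##, while for a void-centred observer the analogous ratio is below one, giving more elapsed proper time.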
 
  • #18
PeterDonis said:
That immediately makes me question the validity of the paper. The whole point of the equivalence principle, in each of its forms (weak, Einstein, and strong), is that it only applies locally, over a scale where spacetime curvature is negligible and an individual patch of spacetime on that scale can be considered to be flat. That is not the case in any cosmological model, including the "foundational change" model proposed in the paper, on scales of 4 to 15 Mpc.
Unexpectedly for me this seems a crucial point!

PeterDonis and Ibix, many thanks for enlightening comments.
 
  • #19
PeterDonis said:
That immediately makes me question the validity of the paper. The whole point of the equivalence principle, in each of its forms (weak, Einstein, and strong), is that it only applies locally, over a scale where spacetime curvature is negligible and an individual patch of spacetime on that scale can be considered to be flat. That is not the case in any cosmological model, including the "foundational change" model proposed in the paper, on scales of 4 to 15 Mpc.
Equivalence Principle on cosmological scales:

https://arxiv.org/abs/2502.02426
https://arxiv.org/abs/2412.08627
https://arxiv.org/abs/2011.05876
https://arxiv.org/abs/1803.02771
 
  • #20
javisot said:
Equivalence Principle on cosmological scales:

https://arxiv.org/abs/2502.02426
https://arxiv.org/abs/2412.08627
https://arxiv.org/abs/2011.05876
https://arxiv.org/abs/1803.02771
The fact that these papers claim to be testing "the equivalence principle on cosmological scales" doesn't mean the concept actually makes sense, at least as it is used in GR, or that what they are testing actually is that thing.

For example, in the 2018 paper abstract, they seem to be defining the "equivalence principle" for dark matter as "whether its free fall is characterised by geodesic motion, just like baryons and light". There is no way to even test that in GR, since in GR free fall is characterised by geodesic motion by definition. So what they are actually doing is testing alternative theories of gravity other than GR, not testing whether the EP as it is in GR holds. And indeed, at the end of the abstract, we find: "Our analysis shows therefore that relativistic effects bring new and complementary constraints to alternative theories of gravity."

Also, supposing we are observing the motion of dark matter in some distant galaxy somehow (how is not at all clear since it's dark and we don't see it), how do we know it actually is in free fall? We can't attach an accelerometer to it. So if, hypothetically, we were to find that it moved differently than baryons or light in the same location with the same initial velocity, how would we know this was a violation of the EP, and not an effect of some non-gravitational interaction that affects the dark matter differently from the baryons or the light? (After all, we can't attach accelerometers to the baryons or the light either, so we don't actually know that they are in free fall.) This kind of issue would affect all of the papers you cite, even supposing they were using a coherent concept of "equivalence principle on cosmological scales" in the first place.
 
  • #21
PeterDonis said:
The fact that these papers claim to be testing "the equivalence principle on cosmological scales" doesn't mean the concept actually makes sense, at least as it is used in GR, or that what they are testing actually is that thing.

For example, in the 2018 paper abstract, they seem to be defining the "equivalence principle" for dark matter as "whether its free fall is characterised by geodesic motion, just like baryons and light". There is no way to even test that in GR, since in GR free fall is characterised by geodesic motion by definition. So what they are actually doing is testing alternative theories of gravity other than GR, not testing whether the EP as it is in GR holds. And indeed, at the end of the abstract, we find: "Our analysis shows therefore that relativistic effects bring new and complementary constraints to alternative theories of gravity."

Also, supposing we are observing the motion of dark matter in some distant galaxy somehow (how is not at all clear since it's dark and we don't see it), how do we know it actually is in free fall? We can't attach an accelerometer to it. So if, hypothetically, we were to find that it moved differently than baryons or light in the same location with the same initial velocity, how would we know this was a violation of the EP, and not an effect of some non-gravitational interaction that affects the dark matter differently from the baryons or the light? (After all, we can't attach accelerometers to the baryons or the light either, so we don't actually know that they are in free fall.) This kind of issue would affect all of the papers you cite, even supposing they were using a coherent concept of "equivalence principle on cosmological scales" in the first place.
You're right, I didn't share the papers with the intention of correcting anything. If you look at all four papers, they all state verbatim at some point that "the application of the equivalence principle on cosmological scales is problematic and subject to debate." My intention was to show that there is at least some debate on the matter.
 
  • #22
javisot said:
If you look at all four papers, they all state verbatim at some point that "the application of the equivalence principle on cosmological scales is problematic and subject to debate."
I'm not sure that's the actual mainstream position. AFAIK the actual mainstream position is the one I stated earlier, that the EP is only valid locally, on scales small enough that spacetime curvature is negligible, which does not include the scales the papers are studying.

javisot said:
My intention was to show that there is at least some debate on the matter.
If there are other papers that responded to the ones you cited with counter-arguments, leading to an actual back and forth in the literature, then there would be debate, yes. I'm not sure there actually has been any such back and forth about this particular issue, though. Just publishing papers making a claim and saying there is "debate" does not mean there actually is debate. Lots of papers get published that go nowhere at all, because no one else in the field thinks them even worth responding to, let alone starting a debate about their claims.
 
  • #23
PeterDonis said:
That immediately makes me question the validity of the paper. The whole point of the equivalence principle, in each of its forms (weak, Einstein, and strong), is that it only applies locally, over a scale where spacetime curvature is negligible and an individual patch of spacetime on that scale can be considered to be flat. That is not the case in any cosmological model, including the "foundational change" model proposed in the paper, on scales of 4 to 15 Mpc.
Such scales refer to galaxy clusters, which as I understand it don't feel isotropic expansion. If we consider them as spacetime patches containing galaxies as point particles, why then can't we assume flat spacetime within them?
 
  • #24
timmdeeg said:
Such scales refer to galaxy clusters, which as I understand it don't feel isotropic expansion.
Galaxy clusters are assumed to be gravitationally bound, yes. However...

timmdeeg said:
why then can't we assume flat spacetime within them?
Because they're gravitationally bound systems. You can't assume spacetime is flat even on the scale of, say, the solar system; if you did, you'd be predicting that the planets didn't orbit the Sun.
 
  • #25
PeterDonis said:
Because they're gravitationally bound systems. You can't assume spacetime is flat even on the scale of, say, the solar system; if you did, you'd be predicting that the planets didn't orbit the Sun.
So, if you say "that it (the equivalence principle) only applies locally, over a scale where spacetime curvature is negligible and an individual patch of spacetime on that scale can be considered to be flat", does it mean that tidal effects are negligible? From this point of view I would assume that locality means much, much smaller than the size of a galaxy cluster, though it depends on how one defines "negligible".
 
  • #26
timmdeeg said:
does it mean that tidal effects are negligible?
Yes.

timmdeeg said:
From this point of view I would assume that locality means much, much smaller than the size of a galaxy cluster
No, it means that tidal effects are negligible. Tidal effects are not negligible over distance scales much, much, much smaller than the size of a galaxy cluster. As I have already pointed out, they are not negligible even on the scale of the Earth-Moon system--or, for that matter on the scale of the Earth itself.
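A rough Newtonian order-of-magnitude check (my own illustrative numbers, not from the thread) of how large tidal effects already are on the scale of the Earth:

```python
# Differential ("tidal") acceleration due to the Moon across one Earth radius,
# using the standard Newtonian estimate 2*G*M*R / d**3. Illustrative values only.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_moon = 7.35e22    # lunar mass, kg
R_earth = 6.37e6    # Earth radius, m
d = 3.84e8          # mean Earth-Moon distance, m

tidal = 2 * G * M_moon * R_earth / d**3
print(f"{tidal:.1e} m/s^2")   # ~1.1e-6 m/s^2: tiny, but it drives the ocean tides
```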
 
  • #27
PeterDonis said:
Tidal effects are not negligible over distance scales much, much, much smaller than the size of a galaxy cluster. As I have already pointed out, they are not negligible even on the scale of the Earth-Moon system--or, for that matter on the scale of the Earth itself.
Instead, looking for negligible tidal effects one would consider areas far away from gravitational centers like stars and galaxies. Within a galaxy cluster there might be such areas, and if connected these could add up to scales of 4 to 15 Mpc. Perhaps that's what the authors refer to. Not sure if this reasoning makes sense.
 
  • #28
timmdeeg said:
looking for negligible tidal effects one would consider areas far away from gravitational centers like stars and galaxies.
That still doesn't help, because the area that needs to be covered in the models being proposed can't be limited to just such areas. They also have to include areas occupied by the gravitational centers.
 
