Explaining the Hubble tension with fundamental physics

The Hubble tension (or Hubble discrepancy) is a disagreement between the Hubble constant as measured in the local universe today and the value inferred from early-universe observations, extrapolated to today under the standard cosmological model. @mfb recently listed the relevant measurements.
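To put rough numbers on it (these are just the commonly quoted round figures, for illustration; see the measurement list for the real values and error budgets): CMB-based analyses give roughly 67 km/s/Mpc, while the local distance ladder gives roughly 73 km/s/Mpc, a gap of several standard deviations.

```python
# Back-of-the-envelope size of the tension, using commonly quoted round
# values (illustrative only; see the actual measurement list for details).
import math

H0_early, err_early = 67.4, 0.5   # km/s/Mpc, CMB-based (Planck-like)
H0_local, err_local = 73.0, 1.4   # km/s/Mpc, local distance ladder (SH0ES-like)

diff = H0_local - H0_early
err = math.sqrt(err_early**2 + err_local**2)
print(f"difference = {diff:.1f} km/s/Mpc ({100 * diff / H0_early:.0f}%)")
print(f"naive significance = {diff / err:.1f} sigma")
# prints roughly: difference = 5.6 km/s/Mpc (8%); naive significance = 3.8 sigma
```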

There are some threads about this in the Cosmology forum, including one which would explain the discrepancy as due to a local inhomogeneity in the universe's density ("local underdensity"). But I thought that in this thread we might focus on "fundamental physics" explanations (new forces and particles, modified gravity, etc).

Until today I hadn't come across any explanations that I found compelling, and I believe Nima Arkani-Hamed recently said (at Strings 2019?) that no one has any good ideas. But the cover story in a recent New Scientist led me to a paper by Agrawal, Obied, and Vafa which tries to explain the tension in the context of Vafa's ideas about dark energy in string theory.

The standard explanation of dark energy in string theory has been to say that there is a small positive cosmological constant, the result of near-cancellations between positive and negative contributions to the vacuum energy. That would mean we are living in a de Sitter geometry. But Vafa's hypothesis is that de Sitter geometry is not actually possible in string theory, and that instead we are living in flat space, with dark energy coming not from a cosmological constant but from the potential energy of a slowly rolling "quintessence" field.
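For orientation, the generic quintessence setup (nothing here is specific to Vafa's construction) is a scalar field ##\phi## rolling in a potential ##V(\phi)##:
$$\ddot\phi + 3H\dot\phi + V'(\phi) = 0, \qquad w_\phi = \frac{\tfrac{1}{2}\dot\phi^2 - V(\phi)}{\tfrac{1}{2}\dot\phi^2 + V(\phi)},$$
so when the field rolls slowly (##\dot\phi^2 \ll V##) the equation of state is close to, but not exactly, ##w = -1##, whereas a true cosmological constant has ##w = -1## identically.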

The way I visualize what Vafa envisages can be seen from the image at 36m30 here. I think one should interpret that blob as a torus with two holes (a genus-2 surface), and as a sketch of the extra dimensions. Not a literal sketch, since there are more than two extra dimensions in string theory; but the idea is that the extra dimensions should contain at least two "holes". Then we are to envision the standard model as arising e.g. from branes wrapped around one of the holes, and the quintessence field as the "modulus" (scalar parameter) describing the radius of the other hole. The size of that second hole is slowly growing, and that produces the behavior desired in a quintessence model of dark energy.
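To make the "modulus as quintessence" step a bit more concrete (this is just the generic lore of dimensional reduction, not a statement about this particular genus-2 picture): if ##R## is the radius of the relevant cycle, the canonically normalized modulus is roughly ##\phi \propto M_{\rm Pl}\ln R##, and energy stored in fluxes or wrapped branes typically falls off as a power of ##R##, so that
$$V(\phi) \sim V_0\, e^{-c\,\phi/M_{\rm Pl}}$$
for some order-one constant ##c##. A cycle whose radius slowly grows therefore corresponds to a scalar rolling down an exponential potential, which is the textbook quintessence behavior.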

Vafa says in his talk that the quintessence field doesn't interact directly with any standard matter fields (although they will interact indirectly, via gravity), but there will be other degrees of freedom associated with that second hole, and this can be where dark matter comes from. He could be referring to branes wrapped around the second hole; or to other moduli associated with the hole, and their superpartners.

Now where does the Hubble tension come from? Another conjecture of Vafa's is the "distance conjecture". This says that if you have a field-theoretic limit of a string theory vacuum, and one of the parameters varies by more than about a Planck unit, then new light degrees of freedom will appear. One way this can come about is again through the extra dimensions changing size. Increasing one of the field-theory parameters means e.g. changing the shape of a torus in the compact dimensions; as one circle of the torus shrinks, string states winding that circle become light, and if the other circle grows fatter, momentum (Kaluza-Klein) modes along it become light, so either way a tower of new states comes into play. (By the way, "distance" in "distance conjecture" refers to distance in parameter space; the varying parameter traverses a "distance" of about a Planck unit.)
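The usual schematic statement, with ##\alpha## an order-one constant, is that a field excursion ##\Delta\phi \gtrsim M_{\rm Pl}## brings down a tower of states,
$$m_{\rm tower}(\phi) \sim m_0\, e^{-\alpha\,\Delta\phi/M_{\rm Pl}},$$
and the torus example is just the tree-level mass formulas ##m_{\rm KK} \sim n/R## for momentum modes and ##m_{\rm wind} \sim wR/\alpha'## for winding modes: shrink a circle and the winding states become light, grow it and the Kaluza-Klein states become light.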

Vafa's concept seems to be that as the quintessence modulus grows, the associated dark matter becomes lighter, and a tower of new dark-matter states becomes active; together, all this can explain the Hubble tension. In the New Scientist article, he says that this could mean that in the long-term future, the universe gains an extra macroscopic dimension. I suppose one should envision the second hole in the image above as growing so large that it becomes macroscopic. However, this specific possibility is not emphasized in the paper.
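My schematic reading of how this would feed into the expansion history (the actual model in the paper has more structure, so treat this as a caricature): if the dark matter mass depends on the quintessence modulus, then
$$\rho_{\rm DM}(a) = m\big(\phi(a)\big)\, n_{\rm DM}(a), \qquad n_{\rm DM} \propto a^{-3}, \qquad m(\phi) \sim m_0\, e^{-c\,\phi/M_{\rm Pl}},$$
so as ##\phi## rolls at late times the dark matter density dilutes slightly faster than the usual ##a^{-3}##. That modifies the late-time expansion history relative to a plain ##\Lambda##CDM fit to the early-universe data, which is where the shift in the inferred ##H_0## is supposed to come from.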

I am not actually advocating this explanation. For one thing, I don't see how Vafa's approach to dark energy gets around the usual problem that if supersymmetry is broken, there should be a cosmological constant anyway. Also, a string vacuum can contain so many moving parts that it's not surprising the Hubble tension can be imitated. And I prefer the MOND-like approach to dark matter of Khoury and Berezhiani, and have not thought at all about its compatibility with Vafa's approach to dark energy.

Nonetheless, I do consider this approach to explaining the Hubble tension notable. It ties into deep issues in string theory, concerning the possible low-energy behavior of the theory, and it may have an elegant geometric visualization. It's also the first approach to the Hubble tension I've seen that resembles a first-principles explanation, as opposed to just a fudge of cosmological-model parameters.
 

MathematicalPhysicist

And they think that Science Fiction is hard to visualize...
Anyway, couldn't it be that the cosmological models on which we base our extrapolation to today are just flat-out false?
 
Anyway, couldn't it be that the cosmological models on which we base our extrapolation to today are just flat-out false?
Why do they work so well in many other places then? And what can replace them - lead to an agreement for the Hubble constant while staying consistent with everything else?
 

MathematicalPhysicist

Why do they work so well in many other places then? And what can replace them - lead to an agreement for the Hubble constant while staying consistent with everything else?
Well, a model can explain one thing approximately but be flat-out wrong about others.
I don't see any way out of it: every model we can suggest will eventually turn out to be false, especially if we take Popper's criterion for science.
The other approach is to be not even wrong...

Which books do you suggest for reading up on cosmological models?
I have both of Weinberg's cosmology books, and I learnt GR from Schutz, though I haven't finished reading it.
 

ohwilleke

Why do they work so well in many other places then? And what can replace them - lead to an agreement for the Hubble constant while staying consistent with everything else?
The working hypothesis of lambdaCDM is that the Hubble constant is due to the cosmological constant in classical general relativity (a.k.a. "lambda") which is a global and uniform property of space-time.

But it is entirely possible that the Hubble constant is merely an average value of a phenomenon with some other cause which is not entirely homogeneous. This would have evaded detection until now if the variation around the mean of this non-homogeneous phenomenon were of about the same order of magnitude as cutting-edge observational precision.
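As a toy numerical illustration of that last point (the numbers below are invented purely to show the statistical issue, not taken from any measurement or model): if the real spatial variation is comparable to the per-measurement error bars, individual measurements cannot easily tell an inhomogeneous expansion rate apart from a single uniform value.

```python
# Toy illustration only: invented numbers, no physical model implied.
# If the intrinsic spatial scatter of the expansion rate were comparable to
# the per-measurement uncertainty, the observed spread would look consistent
# with a single uniform value.
import random

random.seed(0)
mean_rate = 70.0          # hypothetical mean expansion rate, km/s/Mpc
intrinsic_scatter = 1.0   # hypothetical real variation from place to place
measurement_error = 1.5   # typical per-measurement uncertainty

measurements = [
    random.gauss(random.gauss(mean_rate, intrinsic_scatter), measurement_error)
    for _ in range(10)
]
print("simulated measurements:", [round(m, 1) for m in measurements])
print(f"spread: {max(measurements) - min(measurements):.1f} km/s/Mpc, "
      "comparable to the error bars alone")
```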

In much the same way, the temperature of the cosmic microwave background radiation was long thought to be uniform, until we finally had enough observational precision to distinguish anisotropies on the order of tens of microkelvin (a few parts in 100,000).

There are lots of proposals out there for such a non-homogeneous, more localized source of the Hubble expansion, one that varies across places and times.

One such proposal (which I find attractive, and which has been published but lacks wide acceptance) is that dark matter phenomena arise from graviton self-interaction: gravitons that, absent such interactions, would leave a galaxy instead add to the strength of the gravitons already pulling matter at its fringes toward the center. The diversion of these gravitons from escaping the galactic system makes gravity between galaxies weaker, giving rise to an apparent pulling-apart of galaxies faster than you would expect if this phenomenon did not happen. This explains the cosmic coincidence problem and why the Hubble constant is merely an average of a localized phenomenon that varies over time and space, rather than a uniform cosmological constant. See A. Deur, "A possible explanation for dark matter and dark energy consistent with the Standard Model of particle physics and General Relativity" (August 14, 2018), a proceeding for a presentation given at Duke University, April 2014, based on A. Deur, Phys. Lett. B 676, 21 (2009) and A. Deur, MNRAS 438, 1535 (2014).

Inhomogeneous distributions of "quintessence" are another example of such a theory.
 
The working hypothesis of lambdaCDM is that the Hubble constant is due to the cosmological constant in classical general relativity (a.k.a. "lambda") which is a global and uniform property of space-time.
That is wrong.
The cosmological constant has an influence on the Hubble constant but it is not the reason for the expansion.
But it is entirely possible that the Hubble constant is merely an average value of a phenomenon with some other cause which is not entirely homogeneous. This would have evaded detection until now if the variation around the mean of this non-homogeneous phenomenon were of about the same order of magnitude as cutting-edge observational precision.
We know this is the case, and we have known that for decades. Matter, radiation and the cosmological constant are all influencing the expansion, and matter doesn't have a uniform distribution.
In much the same way, the temperature of the cosmic microwave background radiation was long thought to be uniform, until we finally had enough observational precision to distinguish anisotropies on the order of tens of microkelvin (a few parts in 100,000).
Who expected it to be perfectly uniform?
 
