Vanadium 50 said:
I'm told that it is totally impossible for that to be the case today. Of course, you were also told the exact same thing back then.
The statistical errors in the data, and the identified systematic measurement errors, are probably too small to resolve the problem, even if they are somewhat understated.
The late time Hubble constant estimates have moderately large errors amenable to slight adjustment, but the Planck CMB-based calculation has such a large data set and involves such precise observations that, at least within the model within which the calculation is done, the errors are tiny. And the tension is too big to resolve with errors in the late time Hubble constant measurements alone. It also makes sense to doubt that the late time measurements are really the problem because, while they have larger error bars, these measurements are also robust, with many different methods of measuring the Hubble constant in the late time era largely confirming each other.
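To put rough numbers on the size of the mismatch (a back-of-the-envelope sketch in Python, using the commonly quoted round values of 67.4 ± 0.5 km/s/Mpc for Planck and 73.0 ± 1.0 km/s/Mpc for the SH0ES late time measurement, and treating the two as independent Gaussian measurements):

```python
# Back-of-the-envelope estimate of the Hubble tension in "sigma" units.
# Assumes the two measurements are independent and Gaussian; values (km/s/Mpc)
# are the commonly quoted round numbers and are illustrative only.

h0_planck, err_planck = 67.4, 0.5   # Planck 2018 CMB fit, assuming LambdaCDM
h0_shoes,  err_shoes  = 73.0, 1.0   # SH0ES Cepheid-calibrated supernovae

diff = h0_shoes - h0_planck
combined_err = (err_planck**2 + err_shoes**2) ** 0.5
print(f"difference: {diff:.1f} km/s/Mpc")
print(f"combined 1-sigma error: {combined_err:.2f} km/s/Mpc")
print(f"tension: {diff / combined_err:.1f} sigma")   # roughly 5 sigma
```

Even doubling the late time error bar in that toy calculation only brings the mismatch down to roughly 2.7 sigma, which is why modestly inflated late time error bars alone can't plausibly close the gap.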
Many preprints by independent authors so far this year have reached the same conclusion.
See, e.g., Kumar (2024), Gialamas (2024), Colgáin (2024), Roy (2024), Pogosian (2024), He (2024), Calderon (2024), Signorini (2024), and Akarsu (2024). Di Valentino and Brout have even edited a whole book on the Hubble tension, largely concurring with this conclusion.
Stacy McGaugh speculates that the problem may be "a systematic in the interpretation of the CMB data rather than in the local distance scale."
In other words, the tension might be a modeling error in the CMB-based calculation of the Hubble constant, since that is a model dependent calculation and there are other cracks in the LambdaCDM model of cosmology that was used to make that calculation (such as the appearance of galaxies observed by the JWST sooner than the model predicts).
See also Liu (2024), which makes a similar analysis. (Note that in the sense being discussed, "modeling error" does not mean that the LambdaCDM model was inaccurately translated into math and data analysis by Planck's scientists; instead, the concern is that LambdaCDM is not the correct cosmology model to use because it misstates the astrophysical reality.)
This makes sense, because inaccuracies in the model used to make the calculation won't show up in the Planck measurement error bars, and because this is essentially a single measurement approach in tension with the diverse measurement approaches used in the late time era. If a different model produced a higher value of the Hubble constant from the CMB, the tension would go away, and it might fix other tensions in the cosmology fits too. And this is far from the only tension that LambdaCDM has with observational evidence, so something about that model needs to be fixed in any case.
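One way to see the model dependence concretely: what the CMB pins down with extraordinary precision is the angular scale of the sound horizon, θ* = r_s / D_M(z*), and turning that angle into a value of H0 requires a model for both the sound horizon r_s and the comoving distance D_M to last scattering. The sketch below is a toy calculation, not anything like the actual Planck analysis: it assumes flat LambdaCDM with round-number parameters (Ω_m = 0.315, z* ≈ 1090, r_s ≈ 144.4 Mpc), neglects radiation, and holds Ω_m fixed at roughly the value late time data prefer. It asks how much smaller the sound horizon would have to be for the same measured angle to be consistent with H0 ≈ 73.

```python
# Toy illustration (not the Planck likelihood): the CMB fixes the angle
# theta_* = r_s / D_M(z_*) very precisely, but inferring H0 from it requires a
# model for the sound horizon r_s and the comoving distance D_M(z_*).
# Flat LambdaCDM, radiation neglected, round-number parameters; illustrative only.

import math

C = 299792.458     # speed of light, km/s
Z_STAR = 1090.0    # approximate redshift of last scattering
OMEGA_M = 0.315    # matter density, held fixed (roughly what late time data prefer)

def comoving_distance(h0, omega_m=OMEGA_M, z_max=Z_STAR, steps=20000):
    """Comoving distance in Mpc for flat LambdaCDM, via a simple trapezoid rule."""
    def inv_E(z):
        return 1.0 / math.sqrt(omega_m * (1 + z) ** 3 + (1 - omega_m))
    dz = z_max / steps
    total = 0.5 * (inv_E(0.0) + inv_E(z_max)) + sum(inv_E(i * dz) for i in range(1, steps))
    return (C / h0) * total * dz

R_S_LCDM = 144.4   # approximate LambdaCDM sound horizon at last scattering, Mpc

# The angle implied by the standard fit, and the smaller sound horizon that would
# reproduce the same angle if the expansion rate today were really ~73 km/s/Mpc.
theta_star = R_S_LCDM / comoving_distance(67.4)
r_s_needed = theta_star * comoving_distance(73.0)

print(f"theta_* ~ {theta_star:.5f} rad")
print(f"r_s needed for H0 = 73: {r_s_needed:.1f} Mpc "
      f"({100 * (1 - r_s_needed / R_S_LCDM):.1f}% smaller than the LambdaCDM value)")
```

The required shrinkage of the sound horizon comes out to several percent in this toy setup, which is roughly the kind of change that early-universe modifications of LambdaCDM (early dark energy and similar proposals) aim to produce; the point is simply that the Planck value of H0 is only as good as the model used to compute r_s and D_M.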
LambdaCDM may be a good first order approximation of our universe's cosmology with a small number of parameters. But as observations improve on multiple fronts, it may no longer be good enough to fit all of the data.
Gialamas (2024), on the other hand, argues that the late time measurement may reflect a local effect. This paper argues that the part of the universe where we are measuring the late time Hubble constant may just be randomly non-representative of the late time universe as a whole. In other words, everyone is accurately measuring what they see, but they are failing to take into account a basically random sampling error which causes our little corner of the universe to be weird. This random sampling error may be much bigger than one would naively expect, because these random variations are correlated with each other due to their common cosmological origins in the early universe.
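For a rough sense of the sizes involved, linear perturbation theory gives a standard back-of-the-envelope relation: a coherent density contrast δ in the local volume used for the measurement shifts the locally measured expansion rate by about ΔH/H ≈ -(1/3) f δ, where f ≈ Ω_m^0.55 is the linear growth rate. The numbers below are purely illustrative:

```python
# Rough linear-theory sketch of the "local effect" idea: a coherent over- or
# under-density delta in the calibration volume shifts the locally measured
# expansion rate by delta_H / H ~ -(1/3) * f * delta, with f ~ Omega_m**0.55
# the linear growth rate. Illustrative numbers only.

OMEGA_M = 0.315
H0_GLOBAL = 67.4                    # suppose the "true" global value, km/s/Mpc
f = OMEGA_M ** 0.55                 # linear growth rate, ~0.53

for delta in (-0.1, -0.3, -0.5):    # hypothetical local density contrasts
    h0_local = H0_GLOBAL * (1 - f * delta / 3)
    print(f"delta = {delta:+.1f}  ->  locally measured H0 ~ {h0_local:.1f} km/s/Mpc")

# Closing the whole gap to ~73 km/s/Mpc needs a coherent underdensity of roughly
# delta ~ -0.5 over the calibration volume, which is generally considered very
# unlikely in standard LambdaCDM; the counter-argument above is that correlations
# in large-scale structure can inflate the effective sample variance well beyond
# a naive independent-sampling estimate.
```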
Resolving the source of the tension isn't easy, and there are a variety of proposals out there to gather new kinds of data to figure out why it exists.