Universe could be Bigger and Older - Move over 13.7 Gy

  • Thread starter neutrino
  • Start date
  • Tags
    Universe
In summary, the researchers found that if the Hubble Constant were smaller by 15%, the universe would be older and bigger by that amount.
  • #1
neutrino
2,094
2
This would be very interesting if proved right.

A project aiming to create an easier way to measure cosmic distances has instead turned up surprising evidence that our large and ancient universe might be even bigger and older than previously thought.

A research team led by Alceste Bonanos at the Carnegie Institution of Washington has found that the Triangulum Galaxy, also known as M33, is about 15 percent farther away from our own Milky Way than previously calculated.

Currently, most astronomers agree that the value of the Hubble constant is about 71 kilometers per second per megaparsec (a megaparsec is 3.26 million light-years). If this value were smaller by 15 percent, then the universe would be older and bigger by this amount as well.

The new finding implies that the universe is instead about 15.8 billion years old and about 180 billion light-years wide.

http://www.space.com/scienceastronomy/060807_mm_huble_revise.html
(Please add other links related to this story if you find any.)

http://arxiv.org/abs/astro-ph/0606279 - This may be the paper.
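A quick back-of-envelope check of the quoted numbers: the age scale of the universe is roughly the Hubble time 1/H0, so a 15% smaller H0 stretches it by about 15%. A minimal sketch (the true age depends on the cosmological model; this is only the rough 1/H0 scaling):

```python
# Back-of-envelope: the Hubble time 1/H0 sets the rough age scale, so a
# 15% smaller H0 means a ~15% larger age and distance scale.
MPC_IN_KM = 3.0857e19    # kilometres per megaparsec
SEC_PER_GYR = 3.1557e16  # seconds per gigayear

def hubble_time_gyr(h0_km_s_mpc):
    """Hubble time 1/H0 in Gyr (ignores the O(1) model-dependent factor)."""
    h0_per_sec = h0_km_s_mpc / MPC_IN_KM
    return 1.0 / h0_per_sec / SEC_PER_GYR

print(hubble_time_gyr(71))  # ~13.8 Gyr
print(hubble_time_gyr(61))  # ~16.0 Gyr
```

The quoted 15.8 billion years is close to the Hubble time for H0 = 61; the model-dependent age differs somewhat.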

I don't understand one thing, though (btw, I'm still a layman, quite literally, when it comes to cosmological matters). If H were smaller, then would not the galaxies slow down the farther they got, and wouldn't that in turn imply a smaller universe when compared with current estimates? :confused:
 
Space news on Phys.org
  • #2
neutrino said:
This would be very interesting if proved right.



http://www.space.com/scienceastronomy/060807_mm_huble_revise.html
(Please add other links related to this story if you find any.)

http://arxiv.org/abs/astro-ph/0606279 - This may be the paper.

I don't understand one thing, though (btw, I'm still a layman, quite literally, when it comes to cosmological matters). If H were smaller, then would not the galaxies slow down the farther they got, and wouldn't that in turn imply a smaller universe when compared with current estimates? :confused:


If H is smaller, then they have to be farther away to match the velocities we measure. We don't measure their distances; we measure their redshifts, which give us velocities. H then determines our distance estimate.
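The point can be sketched numerically (a toy illustration; the linear relation v = cz = H0*d only holds at small redshift):

```python
# For nearby galaxies we measure redshift z, convert to a recession
# velocity v ~ c*z, and infer distance as d = v / H0.
# A smaller H0 therefore puts the same galaxy farther away.
C_KM_S = 299792.458  # speed of light, km/s

def hubble_distance_mpc(z, h0):
    """Low-redshift distance estimate d = c*z/H0, in Mpc."""
    return C_KM_S * z / h0

z = 0.01  # a hypothetical nearby galaxy
print(hubble_distance_mpc(z, 71))  # ~42.2 Mpc
print(hubble_distance_mpc(z, 61))  # ~49.1 Mpc: same redshift, ~16% farther
```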
 
  • #3
neutrino said:
This would be very interesting if proved right.

http://arxiv.org/abs/astro-ph/0606279 - This may be the paper.

thanks for the arxiv link. I believe you are right about this being the paper.

One of the main results of the HST was Wendy Freedman's "HUBBLE KEY PROJECT" determining H0 to be 71
(in units of km per second per Mpc)

and these people imply that if their findings were confirmed it would be revised to 61

here is what they actually say in their paper (one can discount what they were quoted as saying in the media)
... when comparing to the HST Key Project (Freedman et al. 2001) distance to M33. They derive a metallicity corrected Cepheid distance of 24.62 ± 0.15 mag, using a high reddening value of E(V - I) = 0.27 and an assumed LMC distance modulus of 18.50 ± 0.10 mag. If we calculate the LMC distance our result would imply, we derive 18.80 ± 0.16 mag, which is not consistent with the eclipsing binary determinations. The error is obtained by adding in quadrature the individual errors in the two distance measurements. Taking this one step further, our LMC distance would imply a 15% decrease in the Hubble constant to H0 = 61 km s-1 Mpc-1. This improbable result brings into question the Key Project metallicity corrections and reddening values not only for M33, but also for the other galaxies in the Key Project. We thus demonstrate the importance of accurately calibrating the distance scale and determining H...
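Side note for anyone wanting to turn the quoted distance moduli into physical distances: the conversion is d = 10^(mu/5 + 1) pc. A minimal sketch using only that standard definition:

```python
def modulus_to_kpc(mu):
    """Convert a distance modulus mu = m - M (in mag) to a distance in kpc."""
    return 10 ** (mu / 5.0 + 1.0) / 1000.0

print(modulus_to_kpc(24.62))  # Key Project M33 distance: ~840 kpc
print(modulus_to_kpc(18.50))  # assumed LMC distance modulus: ~50 kpc
```

Note that a 15% larger distance corresponds to a modulus larger by 5*log10(1.15), about 0.30 mag, which is exactly the 18.80 vs 18.50 gap discussed in the quote.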
Before Wendy Freedman's team "settled" on 71 (around which a substantial consensus then developed), estimates of the Hubble parameter varied widely, from 50 to 100 and even higher: "all over the map."
The variation was actually quite a bit more than the 50-100 range I just gave; I can't remember exactly what it was. Quite scandalous.

but since around 2000 or whenever they settled on 71 there has been a wonderful calm consensus in the field, which people called a new era of "precision cosmology" and a "concordance".

over the long haul, however, cosmology has been accustomed to lots of disagreement about basic parameters of the model. so the years since 2000 have been the exception.

Personally I do not expect that they will give up their 71. Not soon or easily anyway.
they might find something questionable about Bonanos et al.
Or if they do change, it will take quite a few years. It will take a lot of eclipsing binaries to confirm a new distance to M33.
==================

Neutrino, I may be telling you stuff you already know, but I will say it anyway because other people may read the thread.

Anyone who wants to play around with different values of the Hubble parameter and see all the different consequences can easily do it with Ned Wright's calculator. There is a box where you put in what value you want; the default is 71, but you can put in 61 or whatever you want. Everything comes out different: age of the universe, light travel times, distances.
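For anyone curious what the calculator does under the hood, the age comes from integrating the Friedmann equation. A rough sketch for a flat universe with matter fraction Om and dark energy 1 - Om (Om = 0.27 is assumed here to match the calculator's default; this is a simplified stand-in, not Wright's actual code):

```python
import math

def age_gyr(h0, om=0.27):
    """Age of a flat matter + dark-energy universe, in Gyr.
    t0 = (1/H0) * integral_0^1 da / sqrt(om/a + (1 - om) * a^2)."""
    hubble_time = 977.8 / h0  # 1/H0 in Gyr when H0 is in km/s/Mpc
    n = 100000
    total = 0.0
    for i in range(n):
        a = (i + 0.5) / n  # midpoint rule over scale factor a in (0, 1)
        total += 1.0 / math.sqrt(om / a + (1.0 - om) * a * a)
    return hubble_time * total / n

print(age_gyr(71))  # ~13.7 Gyr
print(age_gyr(61))  # ~15.9 Gyr
```

Note the age changes by a bit less than the full 1/H0 ratio once the dark-energy term is included, which foreshadows Ned Wright's point later in the thread.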

However I would not worry about them saying bigger "SIZE OF THE UNIVERSE" because that is a vague idea. It has not yet been excluded that the universe is infinite.
 
Last edited:
  • #4
the more I look at this Bonanos et al result the more suspicious I get.

there are some weak links,
like inferring the brightness from the calculated masses
(where they have so far used just ONE eclipsing binary).

Also, their "180 billion LY" is nothing to worry about; that is just an adjustment of the Cornish et al figure of "156 billion LY" published in 2004, which used Wendy Freedman's 71.

they just adjusted Cornish 156 by 15 percent and got 180. But Cornish et al figure was totally NOT an estimate of the size of the universe. It was a LOWER BOUND. Cornish et al was not saying that the universe is even finite. It could be much much larger than their 156, which was just the best lower bound they could come up with using data available in 2004. So increasing that by 15% and saying 180 billion LY does not seem very important.

And it is so far very speculative. I do not have any assurance that the proposed new figure of 61 will stick.
Notice that the reporter telephoned Larry Krauss (a prominent cosmologist) and he sounded pretty cautious and reserved.
 
Last edited:
  • #5
Thanks, marcus. It's better to get it from someone who has an idea of what's actually happening than from a science writer; I think that's one of the reasons I stopped reading popular-level books a few years ago. :biggrin: I saw the post in another thread where you had mentioned Ned Wright's calculator, but I've yet to have a look at it. I'll do so right now.

Also, thanks to franznietzsche, for clearing up my doubt. :smile:
 
  • #6
Here's the Ohio State press release...
http://researchnews.osu.edu/archive/biguni.htm

It's interesting to note that "the current value [of H0] has been accepted since the 1950s."
 
  • #7
neutrino said:
Here's the Ohio State press release...
http://researchnews.osu.edu/archive/biguni.htm

It's interesting to note that "the current value [of H0] has been accepted since the 1950s."

I suppose the staff public relations writer (Pam Frost Gorder) may not have understood part of what the contact researcher (Krzysztof Stanek) said. If you notice any other things that might need checking or correcting there is always the possibility of sending an email to the contact people:

Contact: Krzysztof Stanek, (614) 292-3433; Stanek.32@osu.edu

Written by Pam Frost Gorder, (614) 292-9475; Gorder.1@osu.edu
 
Last edited:
  • #8
In the mainstream, there is still quite a bit of discomfort with the concordance value of [itex]H_0[/itex]. This is because the primary observations upon which it is based (most notably, the HST Key Project) are potentially subject to large systematic errors. I don't see the concordance value going down due only to one eclipsing binary measurement, but I wouldn't be surprised if we eventually settled on a value several sigma outside the current error bars. CMB measurements, which are generally thought to be the most reliable, cannot give a tight constraint on [itex]H_0[/itex] all by themselves.
 
  • #9
Can someone help me out? I'm not an astrophysicist so I'm not up to speed with most of the models, but...

IF the universe is 156 bLY in diameter, and IF the baryon density is approx 10^-28 kg/m3 (I believe that's a rough accepted figure), then this means the entire universe is close to being a black hole (that's ignoring dark matter - which would only cement the case).

(average density of a black hole with schwarzschild radius 78bLY would be approx 2.9 x 10^-28 kg/m3 according to my calculations)

But if it continues expanding as it seems to be, then it will at some point cease to be a black hole.

Is this reasoning correct?

Best Regards
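For what it's worth, the density figure quoted above checks out; a quick verification using standard constants only (the mean density enclosed by a given Schwarzschild radius):

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s
LY = 9.461e15   # metres per light-year

def schwarzschild_density(radius_m):
    """Mean density of a black hole with Schwarzschild radius radius_m."""
    mass = radius_m * C**2 / (2 * G)            # M = r c^2 / 2G
    volume = 4.0 / 3.0 * math.pi * radius_m**3
    return mass / volume                        # equals 3 c^2 / (8 pi G r^2)

print(schwarzschild_density(78e9 * LY))  # ~2.9e-28 kg/m^3, as stated
```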
 
  • #10
The observable universe can't be a black hole by the standard definition, but it could be a very large white hole. You might find this article interesting:

http://math.ucr.edu/home/baez/physics/Relativity/BlackHoles/universe.html
 
Last edited by a moderator:
  • #11
It is possible Ho is as low as ~61, but doubtful. WMAP3 has pretty much constrained that possibility almost beyond consideration. I'm much less inclined to view any black hole or white hole possibilities as viable. The main problem with those propositions is, IMO, that they confer on us a privileged position [uncomfortably near the 'center' of the observable universe].
 
  • #12
SpaceTiger said:
The observable universe can't be a black hole by the standard definition, but it could be a very large white hole. You might find this article interesting. :

http://math.ucr.edu/home/baez/physics/Relativity/BlackHoles/universe.html
Thanks for that, Space Tiger. According to this link, we cannot apply the simple Schwarzschild equation because it assumes a static universe (I'm not sure why you then say the universe can't be a black hole "by definition", but never mind), and the universe is obviously expanding; thus it's the expansion energy which means the normal Schwarzschild equation doesn't apply.

Nevertheless, isn't it interesting that the present estimate for average baryon density is roughly equal to the estimate of mass density if we were to assume a static black-hole universe 156bLY across? If the universe is not a black hole then that's an extraordinary coincidence.

And Chronos - why does it follow that we would have to assume ourselves at the centre of the universe IF the universe were a black hole? I can't see that follows.

Best Regards
 
Last edited by a moderator:
  • #13
moving finger said:
Nevertheless, isn't it interesting that the present estimate for average baryon density is roughly equal to the estimate of mass density if we were to assume a static black-hole universe 156bLY across? If the universe is not a black hole then that's an extraordinary coincidence.

Is it? Suppose we were to estimate both quantities with dimensional analysis. The event horizon of a black hole is the point beyond which light cannot escape. We would expect this to depend upon the mass of the black hole (the source of the gravitation), the gravitational constant (G, the connection between mass and gravitation), and the speed of light. The only way to get a length scale from these three quantities is:

[tex]d\propto\frac{GM}{c^2}[/tex]

There can, of course, be an arbitrary dimensionless numerical factor out front, but from a classical estimate (setting the escape speed to c), we get:

[tex]r_h=\frac{2GM}{c^2}[/tex]

We wouldn't expect it to be dramatically different in GR and, in fact, it turns out to be exactly the same.

Now, the particle horizon is the distance from which light could have reached us since the Big Bang. In any universe, we would expect this to depend on the speed of light and the expansion rate as a function of time. The latter, however, can be related to some other quantities by the second Friedmann equation -- the expansion rate is controlled by the energy density of the various components of the universe. In a flat, matter-dominated model, this yields:

[tex]H^2=\frac{8\pi G\rho}{3}[/tex]

and we're left again with the same three quantities: c, G, and M. In both cases, we're calculating a horizon in which gravitating matter is determining the propagation of light. I did a quick calculation (someone should check it) of the particle horizon in a flat, matter-dominated universe and got:

[tex]r_h=\frac{GM}{2c^2}[/tex]

The answer will be slightly different in [itex]\Lambda CDM[/itex], but not by a lot. It is only in the very recent past that the universe became dark energy dominated.
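The steps behind that last result can be filled in briefly, assuming the flat, matter-dominated model (a ∝ t^2/3, so H = 2/(3t)):

[tex]r_h = a(t)\int_0^t \frac{c\,dt'}{a(t')} = 3ct = \frac{2c}{H}[/tex]

[tex]M = \frac{4\pi}{3}\rho\, r_h^3 = \frac{4\pi}{3}\cdot\frac{3H^2}{8\pi G}\,r_h^3 = \frac{H^2 r_h^3}{2G} = \frac{2c^2 r_h}{G}\quad\Rightarrow\quad r_h = \frac{GM}{2c^2}[/tex]

which matches the dimensional estimate above up to the numerical factor.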
 
Last edited:
  • #14
New Results on H_0 from Chandra

This is getting very interesting indeed. Just a couple of days after the Bonanos results appeared (in the press), new results from the orbiting Chandra X-ray telescope show that the universe is not that old.
By combining X-ray data from Chandra with radio observations of galaxy clusters, the team determined the distances to 38 galaxy clusters ranging from 1.4 billion to 9.3 billion light years from Earth. These results do not rely on the traditional distance ladder. Bonamente and his colleagues find the Hubble constant to be 77 kilometers per second per megaparsec (a megaparsec is equal to 3.26 million light years), with an uncertainty of about 15%.

This result agrees with the values determined using other techniques. The Hubble constant had previously been found to be 72, give or take 8, kilometers per second per megaparsec based on Hubble Space Telescope observations. The new Chandra result is important because it offers the independent confirmation that scientists have been seeking and fixes the age of the Universe between 12 and 14 billion years.

http://chandra.harvard.edu/photo/2006/clusters/
http://chandra.harvard.edu/press/06_releases/press_080806.html (press release)

http://www.arxiv.org/abs/astro-ph/0604039

I guess measuring the distances to 38 galaxy clusters is better than one binary.
 
  • #15
neutrino said:
This is getting very interesting indeed. Just after a couple of days after the Bonanos results appear (in the press), ...

http://chandra.harvard.edu/photo/2006/clusters/
http://chandra.harvard.edu/press/06_releases/press_080806.html (press release)

http://www.arxiv.org/abs/astro-ph/0604039

...

Neutrino, I have bolded a link which you researched, I think.
Good work. You may have tried Bonamente in the arxiv search engine, as I did.

thanks, it often helps to get the technical version as well as the public media one.

In this case just by looking further down the list of Bonamente papers I found one from December 2005 that may relate more directly to the topic:
http://arxiv.org/abs/astro-ph/0512349
Determination of the Cosmic Distance Scale from Sunyaev-Zel'dovich Effect and Chandra X-ray Measurements of High Redshift Galaxy Clusters
M. Bonamente, M. Joy, S. La Roque, J. Carlstrom, E. Reese, K. Dawson
ApJ submitted (revised version)

"We determine the distance to 38 clusters of galaxies in the redshift range 0.14 < z < 0.89 using X-ray data from Chandra and Sunyaev-Zeldovich Effect data from the Owens Valley Radio Observatory and the Berkeley-Illinois-Maryland Association interferometric arrays. The cluster plasma and dark matter distributions are analyzed using a hydrostatic equilibrium model that accounts for radial variations in density, temperature and abundance, and the statistical and systematic errors of this method are quantified. The analysis is performed via a Markov chain Monte Carlo technique that provides simultaneous estimation of all model parameters. We measure a Hubble constant of 76.9 +3.9-3.4 +10.0-8.0 km/s/Mpc (statistical followed by systematic uncertainty at 68% confidence) for an Omega_M=0.3, Omega_Lambda=0.7 cosmology. We also analyze the data using an isothermal beta model that does not invoke the hydrostatic equilibrium assumption, and find H_0=73.7 +4.6-3.8 +9.5-7.6 km/s/Mpc; to avoid effects from cool cores in clusters, we repeated this analysis excluding the central 100 kpc from the X-ray data, and find H_0=77.6 +4.8-4.3 +10.1-8.2 km/s/Mpc. The consistency between the models illustrates the relative insensitivity of SZE/X-ray determinations of H_0 to the details of the cluster model. Our determination of the Hubble parameter in the distant universe agrees with the recent measurement from the Hubble Space Telescope key project that probes the nearby universe."

This is very impressive. It does not really say a DIFFERENT-AGE UNIVERSE to me. It does not actually say a DIFFERENT VALUE of the HUBBLE PARAMETER either, as I read it.
The error bars are so wide that when Wendy Freedman's HST team says 71 +/- 4 and these people say 77 +/- 10, or whatever they say, what impresses me is that the values are SO CLOSE!

these people have a radically different way to determine distance and they are studying objects much farther away than the HST Key project.

what is so impressive is to get totally independent confirmation with a different methodology used in a different range of distances

I'm glad you flagged this. If someone else spotted it and said something, I didn't notice; it's the first I heard of it.

BTW notice that the Bonamente team give two uncertainties in tandem (like +/-4 and +/-10). One they call a STATISTICAL uncertainty, which probably reflects the spread in the sample of 38 clusters; the other, larger uncertainty they call a "systematic" uncertainty, which I don't entirely understand but think may have to do with this being a new method whose methodology they aren't yet too sure of, as with any new technique. That said, it is great: confirmation of the rough size of the Hubble parameter from TOTALLY OUTSIDE THE USUAL DISTANCE LADDER. Excellent! :biggrin:
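On combining those two error bars: the usual convention (also mentioned in the Bonanos quote earlier in the thread, "adding in quadrature") is sqrt(stat^2 + sys^2). A small sketch applying it to the Bonamente numbers (my own arithmetic, not the paper's):

```python
import math

def combine_quadrature(stat, sys):
    """Combine statistical and systematic uncertainties in quadrature."""
    return math.sqrt(stat**2 + sys**2)

# Bonamente et al.: H0 = 76.9 +3.9/-3.4 (stat) +10.0/-8.0 (sys) km/s/Mpc
print(combine_quadrature(3.9, 10.0))  # total upper error ~10.7
print(combine_quadrature(3.4, 8.0))   # total lower error ~8.7
```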
 
Last edited:
  • #16
Thanks for the other link, marcus. I did actually search for 'Bonamente' in the arxiv, and as soon as I saw the term Sunyaev-Zel'dovich Effect, and that the paper was very recent, I assumed that it was the one.

I never knew about this till I saw the latest entry in http://www.badastronomy.com/bablog/2006/08/08/wait-a-sec-how-big-is-the-universe-again/ this morning.
 
  • #17
Neutrino,
Ned Wright had something to say about the Bonamente et al result

http://www.astro.ucla.edu/~wright/cosmolog.htm

in his "news of the universe" report. He questions how much older (maybe not 15 percent), and he indicates that if one took the new value of the Hubble parameter at face value it might NOT IMPLY A LARGER UNIVERSE, but would tend to favor a closed, finite one. So he says the effect of the new value (if it were accepted) might be to give many people a notion of the universe being SMALLER. Paradoxical. Here is what he says:

==quote Wright==
An Older but Larger Universe?


Ohio State astronomers have measured a new precise distance to the nearby galaxy M33 based on a spectroscopic eclipsing binary. Their value is 15% larger than the old Cepheid based distance. By itself this says nothing about the Hubble constant because M33 is so close to the Milky Way that its radial velocity is dominated by random motions, not the expansion of the Universe. But it could indicate that Cepheid distances are incorrect by 15%.

If so, the Hubble constant would be smaller: about 62 instead of 72 km/sec/Mpc. But the claim in the OSU press release that "the universe could be [...] 15 percent older" is incorrect. If the Hubble constant is lower, then CMB anisotropy data require that OmegaM, the ratio of the matter density to the critical density, be higher, so the vacuum energy is lower, and the change in the age of the universe is considerably smaller, as shown in graph at right [click on the graph to enlarge] which shows the age vs. Ho for CMB consistent models as the solid curve, and the 1/Ho behavior assumed by the OSU press release as the dashed curve. So the Universe would not be 15% older but perhaps 7% older.

The claim that the Universe would be 15% larger is partially incorrect. Even though relatively nearby galaxies would be 15% further away the actual size of the Universe would go from infinite (flat) to finite (closed) but very big, which is a smaller Universe. The distance to distant quasars at redshift z=6 would increase by only 4%, and the distance to the last scattering surface changes less than 0.5% because this is what is fixed by the CMB.

CNN quoting space.com and John Johnson of the LA Times accepted the press release's claims of a 15% older and larger Universe uncritically. The real news is that a new method for precision distance measurements has achieved its first result. It will be averaged in with other methods used to calibrate the Cepheid period-luminosity relation and lead to a few percent decrease in the Hubble constant.

One of the methods to be averaged will be the Sunyaev-Zeldovich effect which gives Ho = 77 +/- 10 km/sec/Mpc according to a recent paper. This agrees with the values of Ho from WMAP and HST quite well.
==endquote==

never as simple as one thinks at first, but that's part of the fun :tongue2:
 
  • #18
moving finger said:
Can someone help me out? I'm not an astrophysicist so I'm not up to speed with most of the models, but...

IF the universe is 156 bLY in diameter, and IF the baryon density is approx 10^-28 kg/m3 (I believe that's a rough accepted figure), then this means the entire universe is close to being a black hole (that's ignoring dark matter - which would only cement the case).

(average density of a black hole with schwarzschild radius 78bLY would be approx 2.9 x 10^-28 kg/m3 according to my calculations)

It's almost surely not a black hole. Saying "it's close to being a black hole" is really just a way of saying that gravitational attraction is playing a really significant role in the dynamics of the universe at the length scale you mention - so significant that we need general relativity to understand what's going on.

It's all about dimensional analysis: take a mass, and you can use G and c to cook up a length scale. This is roughly the Schwarzschild radius of that mass, and what it means is if you've got about that much mass in that big a radius, general relativity is really important for understanding what's going on. You could have a black hole, you could have a big bang...

The diameter you mention is a funny thing. 156 billion light years is the calculated present diameter of that portion of the universe we can see when we look way back in the past.

The most distant stuff we see is the cosmic background radiation. This was emitted when the hot gas cooled down enough to become transparent (see http://math.ucr.edu/home/baez/timeline.html#bang ). But, of course, we can't see what those regions are doing now!

So, 78 billion light years is much further than anything we've ever seen. But, to make it seem even more impressive, people multiply by 2 to get a diameter of 156 billion light years.

But if it continues expanding as it seems to be, then it will at some point cease to be a black hole.

It's probably not a black hole now. But you're right in this: if the universe continues expanding, a portion whose diameter is 156 billion light years will gradually flatten out, and eventually we'll be able to use Newtonian gravity plus special relativity to understand what galaxies are doing in that region. General relativity effects will become insignificant at this length scale.
 
Last edited by a moderator:
  • #19
Hi marcus,
Thanks, I went through that y'day. Phil Plait had linked the page in his post at http://www.badastronomy.com/bablog/2006/08/10/update-the-universe-and-the-volcano/, but I forgot to post it here.

Ned Wright said:
Even though relatively nearby galaxies would be 15% further away the actual size of the Universe would go from infinite (flat) to finite (closed) but very big, which is a smaller Universe.
He is actually referring to what would happen to our estimates, and not what would happen physically, right? :redface:
 
  • #20
neutrino said:
He is actually referring to what would happen to our estimates, and not what would happen physically, right? :redface:

Right! and only to some people's estimate.
 
  • #21
marcus said:
Right! and only to some people's estimate.
And that, too. :biggrin:
 
  • #22
You mentioned that Ned Wright says:

marcus said:
...One of the methods to be averaged will be the Sunyaev-Zeldovich effect which gives Ho = 77 +/- 10 km/sec/Mpc according to a recent paper. This agrees with the values of Ho from WMAP and HST quite well.

I've seen elsewhere (http://www.badastronomy.com/bablog/2006/08/08/wait-a-sec-how-big-is-the-universe-again/ ) the following statement that refers to the recent Chandra result (http://chandra.harvard.edu/press/06_releases/press_080806.html ):

The thing to know is that this method really is independent of redshifts, and they get about the same value for the Hubble constant: 77 km/s/Mpc +/- 15%.

I believe this is a misunderstanding. The distance measurement in the Chandra case is of course new and independent of the usual distance-ladder, but the "recession velocity" of the galaxy clusters remains a redshift measurement.

Please correct me if I'm wrong, but all Hubble estimates rely on a redshift measurement, don't they?
 
  • #23
oldman said:
Please correct me if I'm wrong, but all Hubble estimates rely on a redshift measurement, don't they?

Yeah, I suspect they mean independent of the distance ladder.
 
  • #24
I'd be interested in focusing on this one method (SZ-CMB-X-ray) and trying to understand it better. Here is the Harvard Chandra press release simplification:
====quote====
The astronomers used a phenomenon known as the Sunyaev-Zeldovich effect, where photons in the cosmic microwave background (CMB) interact with electrons in the hot gas that pervades the enormous galaxy clusters. The photons acquire energy from this interaction, which distorts the signal from the microwave background in the direction of the clusters. The magnitude of this distortion depends on the density and temperature of the hot electrons and the physical size of the cluster. Using radio telescopes to measure the distortion of the microwave background and Chandra to measure the properties of the hot gas, the physical size of the cluster can be determined. From this physical size and a simple measurement of the angle subtended by the cluster, the rules of geometry can be used to derive its distance. The Hubble constant is determined by dividing previously measured cluster speeds by these newly derived distances.
===endquote===
anybody want to elaborate, explain, give more links?

I note that out past z = 1.6 the angular size of things INCREASES with distance, so when they say that given the angular size "the rules of geometry can be used to derive its distance," this is not entirely straightforward unless one knows the redshift range. As it happens, in this case it was 0.14 < z < 0.89, where angular size still decreases with distance. OK, not important.

the main thing is that Chandra sees X-rays, so it can TELL THE TEMPERATURE
and that would imply, if you know the quantity and density of gas, how much INTERACTION WITH CMB to expect

and you can measure the interaction with CMB independently using radio telescopes

So once they measured the Xray TEMPERATURE and the amount of interaction with CMB, they could presumably decide on HOW MUCH AND HOW DENSE the gas and they had the option of fitting that to a HYDROSTATIC MODEL.

it sounds like they may have fitted their model of a cloud of hot gas to their data by RANDOMLY VARYING PARAMETERS in a "Monte Carlo" fashion. I am only reading the abstract of the Max Bonamente et al paper, not reading the paper.
Someone else who is better at this could explain with more reliable detail.

I see they give TWO LEVELS OF ERROR BAR. That is interesting. The first range of uncertainty is about variation in the STATISTICAL SAMPLE OF 38 CLUSTERS,
and then there is a second, LARGER range of uncertainty having to do with systematic error: unsureness about the METHOD, basically.
Here again is the Harvard press release link (given by BadAstronomer, Oldman and others):
http://chandra.harvard.edu/press/06_releases/press_080806.html
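The z ≈ 1.6 turnover mentioned above can be checked numerically: in a flat LCDM model the angular diameter distance D_A = D_C/(1+z) rises, peaks near z ≈ 1.6, then falls, so a fixed physical size subtends a larger angle beyond that point. A rough sketch (Om = 0.3, dark energy = 0.7, H0 = 71 assumed here, matching the paper's cosmology):

```python
import math

C = 299792.458  # speed of light, km/s

def comoving_distance_mpc(z, h0=71.0, om=0.3, ol=0.7, n=10000):
    """Comoving distance in Mpc for a flat LCDM model (midpoint rule)."""
    total = 0.0
    for i in range(n):
        zp = (i + 0.5) * z / n
        total += 1.0 / math.sqrt(om * (1.0 + zp)**3 + ol)
    return (C / h0) * (z / n) * total

def angular_diameter_distance_mpc(z):
    """Angular diameter distance D_A = D_C / (1 + z)."""
    return comoving_distance_mpc(z) / (1.0 + z)

# D_A rises to a maximum near z ~ 1.6 and falls beyond it:
for z in (0.5, 1.0, 1.6, 3.0):
    print(z, round(angular_diameter_distance_mpc(z)))
```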
 
Last edited:
  • #25
Here is the arxiv paper about the SZ-X-ray-CMB work.
I note that they ASSUMED dark energy = 0.7 and matter = 0.3.

http://arxiv.org/abs/astro-ph/0512349
Determination of the Cosmic Distance Scale from Sunyaev-Zel'dovich Effect and Chandra X-ray Measurements of High Redshift Galaxy Clusters

M. Bonamente, M. Joy, S. La Roque, J. Carlstrom, E. Reese, K. Dawson
ApJ submitted (revised version)

"We determine the distance to 38 clusters of galaxies in the redshift range 0.14 < z < 0.89 using X-ray data from Chandra and Sunyaev-Zeldovich Effect data from the Owens Valley Radio Observatory and the Berkeley-Illinois-Maryland Association interferometric arrays. The cluster plasma and dark matter distributions are analyzed using a hydrostatic equilibrium model that accounts for radial variations in density, temperature and abundance, and the statistical and systematic errors of this method are quantified. The analysis is performed via a Markov chain Monte Carlo technique that provides simultaneous estimation of all model parameters. We measure a Hubble constant of 76.9 +3.9-3.4 +10.0-8.0 km/s/Mpc (statistical followed by systematic uncertainty at 68% confidence) for an Omega_M=0.3, Omega_Lambda=0.7 cosmology. We also analyze the data using an isothermal beta model that does not invoke the hydrostatic equilibrium assumption, and find H_0=73.7 +4.6-3.8 +9.5-7.6 km/s/Mpc; to avoid effects from cool cores in clusters, we repeated this analysis excluding the central 100 kpc from the X-ray data, and find H_0=77.6 +4.8-4.3 +10.1-8.2 km/s/Mpc. The consistency between the models illustrates the relative insensitivity of SZE/X-ray determinations of H_0 to the details of the cluster model. Our determination of the Hubble parameter in the distant universe agrees with the recent measurement from the Hubble Space Telescope key project that probes the nearby universe."

Max Bonamente and Marshall Joy are at University of Alabama, various of the others are at UC-Berkeley, University of Chicago and suchlike.

It is an accidental banana-peel hazard that we should also be hearing about the ECLIPSING BINARY work of BONANOS et al.,
a name easily confused with Bonamente. Bonanos could have originally meant "Goodyear" and Bonamente possibly meant "Goodwill"; I don't really know, we just have to keep them straight somehow.
 
Last edited:
  • #26
SpaceTiger said:
In both cases, we're calculating a horizon in which gravitating matter is determining the propagation of light.
You're correct.

For an expanding matter-dominated universe the particle horizon scales with t, and (I believe) the total mass enclosed within that particle horizon also scales with t. Since the Schwarzschild radius is proportional to total mass, it follows that the "equivalent Schwarzschild radius" for the total mass within the particle horizon will also scale with t.
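The scaling can be checked explicitly for the flat, matter-dominated case, where a ∝ t^2/3:

[tex]\rho=\frac{3H^2}{8\pi G}\propto t^{-2},\qquad r_h=3ct\propto t,\qquad M=\frac{4\pi}{3}\rho\, r_h^3\propto t,\qquad r_s=\frac{2GM}{c^2}\propto t[/tex]

so the mass within the particle horizon and its equivalent Schwarzschild radius do both grow linearly with t, as stated.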

Thanks
 
Last edited:
  • #27
A project aiming to create an easier way to measure cosmic distances has instead turned up surprising evidence that our large and ancient universe might be even bigger and older than previously thought.

A research team led by Alceste Bonanos at the Carnegie Institution of Washington has found that the Triangulum Galaxy, also known as M33, is about 15 percent farther away from our own Milky Way than previously calculated.

Currently, most astronomers agree that the value of the Hubble constant is about 71 kilometers per second per megaparsec (a megaparsec is 3.26 million light-years). If this value were smaller by 15 percent, then the universe would be older and bigger by this amount as well.

The new finding implies that the universe is instead about 15.8 billion years old and about 180 billion light-years wide.
It seems to me that the value of H calculated from distances to stars in M33, a galaxy located 0.9 Mpc away within our Local Group, need not be the H = 71 km/s/Mpc that one obtains from the assumptions of homogeneity and isotropy. I am surprised that I did not find any discussion about this. I could imagine that the background geometry in the Local Group is expanding at a lower rate than 71 km/s/Mpc, that the value H = 71 km/s/Mpc is reached only at larger scales, and that, therefore, the conclusion of a 15% difference from the accepted value of H cannot be drawn. See for example this paper, which argues that the "universal" value of H is reached at the border of the Local Group.
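As a rough sanity check on the 15% figures in the article quoted above (my own back-of-the-envelope arithmetic, not from the article): to leading order the age of the universe scales like the Hubble time 1/H0, so an H0 smaller by 15% gives an age larger by a factor 1/0.85 ≈ 1.18. The article's 15.8 billion years comes from the full expansion history, not this crude estimate.

```python
# Sketch: the Hubble time 1/H0 sets the scale of the age of the universe,
# so age ~ 1/H0 and a smaller H0 means an older (and bigger) universe.

MPC_IN_KM = 3.0857e19    # kilometres per megaparsec
SEC_PER_GYR = 3.156e16   # seconds per gigayear

def hubble_time_gyr(h0_km_s_mpc):
    """Hubble time 1/H0 in gigayears."""
    h0_per_sec = h0_km_s_mpc / MPC_IN_KM
    return 1.0 / h0_per_sec / SEC_PER_GYR

t_old = hubble_time_gyr(71.0)           # ~13.8 Gyr
t_new = hubble_time_gyr(71.0 * 0.85)    # H0 smaller by 15%
print(t_old, t_new, t_new / t_old)      # ratio is exactly 1/0.85 ~ 1.18
```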
 
  • #28
hellfire said:
It seems to me that the value of H calculated from distances to stars in M33, a galaxy located 0.9 Mpc away within our Local Group, need not be the H = 71 km/s/Mpc that one obtains from the assumptions of homogeneity and isotropy. I am surprised that I did not find any discussion about this. I could imagine that the background geometry in the Local Group is expanding at a lower rate than 71 km/s/Mpc, that the value H = 71 km/s/Mpc is reached only at larger scales, and that, therefore, the conclusion of a 15% difference from the accepted value of H cannot be drawn. See for example this paper, which argues that the "universal" value of H is reached at the border of the Local Group.

I think the paper referred to in the original post claims to measure the distance to M33 more or less directly, and finds a greater distance than previously accepted. M33 is not used directly to find a value for the Hubble constant; it is used as an example of a typical galaxy, with typical features, that can be compared to similar galaxies outside the Local Group. These farther galaxies are then used as part of a scheme that gives a value for the Hubble constant.
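The recalibration logic described above can be sketched numerically (the velocities and distances below are made-up illustrative numbers, not from the paper): if the anchor galaxy turns out to be 15% farther away, every distance calibrated against it stretches by the same factor, and since H0 = v/d, the inferred Hubble constant shrinks by 1/1.15.

```python
# Sketch: how rescaling ladder-calibrated distances rescales H0.

def h0_from_galaxies(velocities_km_s, distances_mpc):
    """Naive H0 estimate: average of v/d over a galaxy sample."""
    ratios = [v / d for v, d in zip(velocities_km_s, distances_mpc)]
    return sum(ratios) / len(ratios)

# Hypothetical sample of recession velocities and ladder distances:
v = [3550.0, 7100.0, 14200.0]   # km/s
d = [50.0, 100.0, 200.0]        # Mpc, tied to the old M33 calibration

h0_old = h0_from_galaxies(v, d)                       # 71.0
h0_new = h0_from_galaxies(v, [x * 1.15 for x in d])   # 71/1.15 ~ 61.7
print(h0_old, h0_new)
```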
 
  • #29
Ok, this would make sense, thanks.
 
  • #30
If the Hubble constant is lower, then CMB anisotropy data require that Omega_M, the ratio of the matter density to the critical density, be higher
why?
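A sketch of the standard answer to the "why?" above (my summary, with a rough fiducial value, not from the thread): the CMB acoustic peaks pin down the physical matter density Omega_M * h^2 almost directly, where h = H0 / (100 km/s/Mpc). Holding that product fixed, a smaller H0 therefore forces a larger Omega_M.

```python
# Sketch: CMB data constrain omega_m = Omega_M * h^2, so Omega_M ~ 1/h^2
# at fixed omega_m, where h = H0 / (100 km/s/Mpc).

OMEGA_M_H2 = 0.143   # rough fiducial value of Omega_M * h^2

def omega_matter(h0_km_s_mpc, omega_m_h2=OMEGA_M_H2):
    h = h0_km_s_mpc / 100.0
    return omega_m_h2 / h ** 2

print(omega_matter(71.0))           # ~0.28
print(omega_matter(71.0 * 0.85))    # ~0.39: lower H0 pushes Omega_M up
```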
 

1. How can the universe be bigger and older than previously thought?

Scientists have recently discovered new evidence that suggests the universe may be larger and older than previously believed. This evidence includes observations of ancient stars and galaxies, as well as measurements of the cosmic microwave background radiation.

2. What does this mean for our understanding of the universe?

If the universe is indeed bigger and older than we thought, it could have significant implications for our understanding of its origins and evolution. It could also challenge some of our current theories and models about the universe.

3. How does this affect the age of the universe?

If the universe is older than previously thought, it would mean that the Big Bang occurred earlier than the estimated 13.7 billion years ago. This could also mean that the universe has been expanding for a longer period of time, leading to a larger and more diverse universe.

4. Could there be other universes beyond our own?

Some scientists believe that the discovery of a larger and older universe could support the idea of a multiverse, where there are multiple universes beyond our own. However, this is still a highly debated and speculative concept in the scientific community.

5. How will this impact future research and exploration of the universe?

The discovery of a potentially bigger and older universe will likely lead to new avenues of research and exploration. Scientists will continue to gather more data and evidence to better understand the universe and its mysteries, and this new information could shape our future understanding and discoveries about the universe.
