
Does any model predict the mass (or total energy) of the observable Universe?

  1. Dec 28, 2009 #1

    BillSaltLake

    User Avatar
    Gold Member

    During the (recently departed) matter-dominated era, the total mass of the Universe, including dark matter, is believed to have been fairly constant at around 10^55 kg. This held from at least the CMB last-scattering surface until nearly the present. The mass content is probably about the same now, though it may be slowly decreasing due to dark energy. In any case, the total mass plus energy of the observable Universe is currently probably within a factor of 10 of that value. This is about 10^63 Planck masses. Where did this very particular number come from?
    Did inflation simply set this as a random initial condition? If this total mass (or total mass plus energy) were derived from fundamental constants, I think the prediction would require a dimensionless factor of about 10^63 somewhere. Does some model contain a number close to 10^63?
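    A quick back-of-the-envelope check of that conversion (constants here are rounded CODATA-style values, not anything specific to a cosmological model):

    ```python
    # Convert the quoted ~10^55 kg matter content into Planck masses.
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    hbar = 1.055e-34     # reduced Planck constant, J s
    c = 2.998e8          # speed of light, m/s

    m_planck = (hbar * c / G) ** 0.5   # Planck mass, ~2.18e-8 kg
    m_universe = 1e55                  # quoted matter content, kg

    ratio = m_universe / m_planck
    print(f"Planck mass ≈ {m_planck:.2e} kg")
    print(f"10^55 kg ≈ {ratio:.1e} Planck masses")  # ~5e62, i.e. of order 10^63
    ```

    So the "10^63" figure is just 10^55 kg expressed in Planck units, to within the stated factor of 10.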
     
  3. Dec 29, 2009 #2

    Chalnoth

    User Avatar
    Science Advisor

    The total mass of the universe? We don't know this. We can't know this because our vision is limited by how our universe has expanded with time. We just can't see beyond a certain distance.

    That said, the current mass density of our universe was likely set by two things:
    1. The temperature at reheating (set by the energy density of the inflaton field).
    2. The physics that determined the relative particle abundances afterwards.
     
  4. Dec 29, 2009 #3

    BillSaltLake

    User Avatar
    Gold Member

    Agreed. The total mass is not known (yet). However, we know something about the mass of the observable Universe. As far as I understand it, only a small minority of the total baryonic matter is not directly visible (hiding in the interval between about 1 microsecond, at matter creation, and about 400,000 yr, at the last-scattering surface). Because the photon temperature falls only as the square root of t during the adiabatic expansion, less than 1% of the baryons are hidden in there. The estimates are around 10^80 baryons in the observable Universe (~10^53 kg, plus dark matter). This number would have been very nearly constant for a long time. I assume inflation assigned this number as a random variable, but at about 10^63 Planck masses, the number is "way out, man".
     
  5. Dec 29, 2009 #4
    The standard cosmological models don't deal with total masses but rather with densities, under the assumption that everything important can be characterized by the density of some type of stuff.

    Once you have a density evolution then you just add up all of the density within a certain volume and you get a total mass, but that's a derived quantity, and not something you input into the theory. One consequence of inflation is that the boundary at which the universe is not smooth is well outside the observable range so that the universe ends up being a lot, lot, lot bigger than the tiny bit that we can see.

    The initial conditions for the standard model are all densities which are calibrated to observations. The fact that these initial conditions seem to be rather random bothers people. One thing that inflation does is to erase initial conditions. Suppose you started out with a universe that is lumpy. If you inflate it, then the lumpiness disappears, and the exact nature of the pre-inflation lumpiness doesn't matter.

    The total mass measurement is something that one calculates as a consequence of observations, but it's not a fundamental number or even much of a prediction in the standard cosmologies. Also it's not a constant since the amount of matter within the observable universe is going to change over time.

    Also, this points out a handy-dandy "alternative cosmology generation machine." What you do is to find some number which is dimensionless and which you assert is fundamental, then you play with a universe in which you keep that number constant. The most famous of those is Dirac's large number hypothesis, but there have been others....

    http://en.wikipedia.org/wiki/Dirac_large_numbers_hypothesis

    There's also a very good page on non-standard cosmologies, which basically summarizes all of the alternative ideas that have been proposed in the last century.

    http://en.wikipedia.org/wiki/Non-standard_cosmology
     
  6. Dec 29, 2009 #5
    That isn't the case. If you add up the masses of all of the baryonic matter that we see, the result is still a lot, lot less than the amount of baryonic matter that LCDM suggests exists (I think it's something like a factor of 1000).

    There are multiple dark matter problems.

    The reason people get the number for baryonic matter is that this is the number that makes nucleosynthesis and galaxy clustering work. If you look at the number that LCDM gives you and the amount of baryonic matter that you see is less than that number, you just say "well, we can't see all of it." If LCDM gave a number for the amount of baryonic matter and you saw *more* than that, then we'd have a big, big problem.

    The estimates are based on theory and not on any observation. It's really important to keep straight what you are observing and what you are calculating otherwise you end up with circular logic. I don't know of any hard numbers that calculate the number of baryons in the observable universe that can set cosmological limits.

    It isn't. If you have an open universe then the number of baryons in the observable universe decreases over time, while in a closed universe it increases as matter moves into your horizon.

    The amount of matter you can see in the observable universe is more a function of what you can see than of the amount of matter. If the standard cosmologies are correct and the universe is open, then the amount of total matter in the universe is a non-calculable quantity that may well be infinite.
     
    Last edited: Dec 29, 2009
  7. Dec 29, 2009 #6

    BillSaltLake

    User Avatar
    Gold Member

    Sometimes I don't state myself very well. There's an apparent contradiction which is almost certainly due to a flawed assumption on my part. I'll number the parts to make it easier to critique.

    1). During the plain-vanilla period from about t = 1 million to 1 billion years, the Universe was matter-dominated according to LCDM (and according to most other models).

    2). In that period, within a factor of 2, the LCDM prediction for matter density was 1/(6 pi G t^2).

    3). We can take that density and multiply it by the observable volume at any time t. (The volume is 4 pi (ct)^3 / 3.) The result is mass m = 2c^3 t/(9 pi G).

    4). The variable m above is not the mass of the observable Universe, but it should scale with time in a similar way. Note that the expression m is proportional to time.

    5). The "mass of the observable Universe" is an independently measurable quantity, to within an order of magnitude, at least for baryonic mass.

    6). Although the matter-dominated period is over, the expression m = 2c^3 t/(9 pi G) is within an order of magnitude of the present observationally estimated baryonic mass (about 80 billion galaxies x 100 billion stars x one solar mass) and also within an order of magnitude of 6x that baryonic mass (accounting for dark matter in the LCDM), so the formula seems to resemble reality at least crudely.

    7). In a flat, matter-dominated Universe at critical density, we expect the total mass of the observable Universe to be essentially constant, not proportional to t over a 1000:1 time ratio (the interval from one million to one billion years).

    The apparent contradiction is that the mass was simultaneously constant and increasing in proportion to t. If the mass was actually m = 2c^3 t/(9 pi G), then there is no weird "10^63 Planck masses" to worry about, but that would mean the mass was not approximately constant at that time. Was matter streaming in from a large reservoir at the very edge of the observable Universe? That might resolve the contradiction.
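    The arithmetic in points (3) and (6) can be checked directly (present age and the galaxy/star counts are the rough round numbers used above, not precise measurements):

    ```python
    import math

    c = 2.998e8           # m/s
    G = 6.674e-11         # m^3 kg^-1 s^-2
    t = 13.8e9 * 3.156e7  # rough present age of the universe, s (~4.4e17 s)

    # Point (3): density 1/(6 pi G t^2) times a naive Euclidean volume 4 pi (ct)^3 / 3
    m = 2 * c**3 * t / (9 * math.pi * G)

    # Point (6): rough observational baryonic estimate
    m_baryon = 80e9 * 100e9 * 2e30   # galaxies x stars/galaxy x solar mass, kg

    print(f"m = 2c^3 t/(9 pi G) ≈ {m:.1e} kg")
    print(f"baryonic estimate   ≈ {m_baryon:.1e} kg")
    ```

    Both come out near 10^52 kg, so the two estimates indeed agree to within an order of magnitude, as claimed.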
     
  8. Dec 30, 2009 #7
    This is problematic. GR changes geometry so that when you calculate volumes you can't use the simple Euclidean formulas. If you assume everything is flat, I think it will work, but even then there could be something weird.

    Why would we expect this? At time zero, the total mass of the observable universe is zero, since nothing is observable.

    I think the resolution is that as time passes, your horizon increases and you are seeing more and more of the universe, so in some sense matter is streaming into your view. One second after the Big Bang, you could only see things within one light-second, and that's not very much matter.
     
  9. Dec 30, 2009 #8

    BillSaltLake

    User Avatar
    Gold Member

    I've heard the opinion expressed before that during most of the history of the observable Universe, the mass within that expanding volume was roughly constant. The reason for this opinion is that signals arriving from t = 0 have an infinite redshift (although in the dark-energy era, this region may eventually no longer be "observable"). Therefore mass cannot(?) enter the observable region, because there is a fixed region with infinite redshift. A coworker, a professor specializing in GR and black holes here at the U of Utah, expressed that opinion to me about a decade ago (I've since lost touch with him).

    Twofish-quant's opinion appears to be similar in saying that in an open universe, the number of baryons in the observable universe decreases over time, while in a closed universe the number increases as matter moves into the horizon. This would also suggest that in a flat matter-dominated Universe, the number of observable baryons is constant.

    I have never gotten the Friedmann equations to yield a constant mass under these circumstances. In fact, in a matter-dominated era, the solution gives a uniform baryon density over all regions, including in the non-Euclidean geometry (i.e., constant density in each dV comprising [c dt] of depth times [r^2 x solid angle], where r is calculated from ct). Thus the mass should in fact be proportional to time in the matter-only Friedmann solution.

    Is there something I'm missing, or is the idea of "roughly constant mass" just a misinterpretation of what current models actually predict?
     
  10. Dec 30, 2009 #9
    I do not follow.
     
  11. Dec 30, 2009 #10
    Ohhh... I think I see what's happening. There are two different horizons here. There is "all the matter we can see now," which is the particle horizon, and "all the matter that we will ever be able to see," which is the event horizon.

    The boundary of the particle horizon is the sphere of light that has already reached us, whereas the event horizon is the limit from which light will *ever* reach us; that's d_e = the integral from t_0 to infinity of c/a(t) dt. The first scales as t because as time goes on you are able to see more and more of the universe. The second gives you something different, but I haven't been able to get a constant value out of it. For a flat universe I get that the amount of observable mass goes as 1/t.
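    A small numerical sketch of the particle-horizon case, under the thread's own assumptions (flat, matter-only universe, so a(t) ∝ t^(2/3) and rho = 1/(6 pi G t^2)); it confirms that the mass inside the particle horizon grows in proportion to t:

    ```python
    import math

    G = 6.674e-11   # m^3 kg^-1 s^-2
    c = 2.998e8     # m/s

    def horizon_mass(t, steps=100_000):
        """Mass inside the particle horizon at time t for a flat,
        matter-only universe: a(t) ∝ t^(2/3), rho = 1/(6 pi G t^2)."""
        # Proper horizon distance: integral of c/a(t') dt' with a(t) = 1,
        # done by midpoint rule (analytically this equals 3ct).
        a = lambda tp: (tp / t) ** (2.0 / 3.0)
        dt = t / steps
        chi = sum(c / a((i + 0.5) * dt) * dt for i in range(steps))
        rho = 1.0 / (6 * math.pi * G * t**2)
        return rho * (4.0 / 3.0) * math.pi * chi**3

    t1 = 1e6 * 3.156e7   # 1 million years, in seconds
    t2 = 1e9 * 3.156e7   # 1 billion years, in seconds
    print(horizon_mass(t2) / horizon_mass(t1))   # ≈ 1000, i.e. t2/t1
    ```

    Since rho ∝ 1/t^2 while the horizon volume grows as t^3, the enclosed mass scales as t over the 1000:1 time ratio discussed earlier in the thread.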
     
  12. Dec 30, 2009 #11

    BillSaltLake

    User Avatar
    Gold Member

    To edpell: dark energy, if it exists, should have a property that is counterintuitive: as it occupies a larger and larger fraction of the total mass-energy density, dark energy causes the average separation between "things" to increase. "Things" here are masses whose mutual separations are not set by interactions. The separation between stars within a given galaxy is set by gravity, so it is not increased by the growth of the dark-energy content. The spacing between superclusters, however, is. A distant supercluster that is visible now may eventually be "pushed" by dark energy outside the horizon of visibility.
     
  13. Dec 30, 2009 #12
    I do not understand how anything can be pushed outside the horizon of visibility. Any push will be slower than the speed of light so how do we get outside?
     
  14. Dec 30, 2009 #13
    Not necessarily; there's nothing in general relativity that keeps two points in an expanding universe from moving away from each other faster than the speed of light. The rule is that information can't travel faster than light; otherwise you could send a message back in time to kill your grandfather. As long as you don't exchange information faster than light, the future stays the future and the past stays the past. In the case of receding galaxies there is no problem, because once something slips into the darkness, it's not coming back.

    Whether that's true with general relativity is an open question. My gut feeling is that it's not possible to build a time machine (and that's partly because I haven't gotten any messages from myself), but no one has managed to mathematically prove that yet.
     
  15. Dec 30, 2009 #14
    Yes... just look at a cell of, e.g., a human being. Then look around at the body of this human being, and you will have an idea of the mass of the universe.

    The cell that you just saw is the observable universe, and the rest is what we do not see.

    Thanks

    M/S
     