What Could Cause a Big Crunch in the Expansion of the Universe?

The discussion centers on the accelerating expansion of the universe and the concept of a potential Big Crunch. It highlights that while the universe's expansion increases its volume and decreases density, gravity acts to decelerate this expansion. However, the presence of dark energy is causing an acceleration in the expansion rate, which complicates the possibility of a Big Crunch. The conversation also clarifies that gravity influences the expansion dynamics, but dark energy behaves differently, contributing to the acceleration. Ultimately, the universe's fate depends on the interplay between matter density and dark energy, with current understanding suggesting a continued expansion rather than a collapse.
  • #121
Thanks for correcting my bad link
marcus said:
Personally I don't think it's smart to make up your mind on the universal time issue before all the returns are in. It comes up in unimodular GR, it comes up in Loll's work. Also in standard cosmology. So I prefer to wait and see on that one.
I am certainly not in a mind-making-up position. Too ignorant. Much better to spectate in hope that such matters will get resolved.
 
  • #122
Is it generally agreed among the people here that the universe is expanding? (I would say "expanding in size," but if the universe is infinite in size then that would make no sense.)

Among those of you who are "expansionists," is it generally agreed that the expansion taking place within any finite part of the universe is taking place uniformly, in much the same way as the expansion of a lump of rising bread dough?

Is the rate of expansion steady or accelerating?
 
  • #123
LtDan said:
Is it generally agreed among the people here that the universe is expanding? (I would say "expanding in size," but if the universe is infinite in size then that would make no sense.)

Among those of you who are "expansionists," is it generally agreed that the expansion taking place within any finite part of the universe is taking place uniformly, in much the same way as the expansion of a lump of rising bread dough?

Is the rate of expansion steady or accelerating?

The observable universe is not infinite. The rest is a philosophical debate and at the moment so is the question of 'acceleration'.
 
  • #124
marcus said:
heh heh. What you suggest would dramatically reduce the number of black holes. Universe collapses in a crunch before star and galaxy formation can get properly started.
Not if it's still flat. You'd have to change the initial conditions if G was larger, but that was part of my premise.

marcus said:
The Devil is in the details, Chal. How exactly would you change which parameters? Glad to see you are thinking about this! Alex Vilenkin is a worldclass cosmologist at Tufts and he has unsuccessfully tried to disprove this optimality. If you come up with an idea that actually works, I'm sure he would like to know.
It's not all that difficult.

Take the following situation:

1. G is larger by some factor (say two, for an example).
2. The average density of each component of the universe is smaller by the same factor.

If you then hold everything else the same, and define the primordial perturbations as a fraction of the density (such that their amplitude is cut in half along with the overall density), then we should have a pretty easy model to work with.
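The reason this setup is easy to work with can be checked in a few lines: in a flat universe the Friedmann equation gives H^2 = 8\pi G \rho / 3, so doubling G while halving every density leaves the Hubble rate, and hence the expansion history, unchanged. A minimal sketch (the numerical values are illustrative, not from the discussion):

```python
import math

def hubble_rate(G, rho):
    """Hubble rate for a flat universe: H = sqrt(8*pi*G*rho/3)."""
    return math.sqrt(8.0 * math.pi * G * rho / 3.0)

G0 = 6.674e-11     # m^3 kg^-1 s^-2, Newton's constant
rho0 = 8.6e-27     # kg/m^3, roughly today's critical density

# Doubling G and halving the total density leaves H untouched,
# so the hypothetical universe has an identical expansion history.
assert math.isclose(hubble_rate(G0, rho0), hubble_rate(2 * G0, rho0 / 2))
```

The same cancellation holds for any scaling factor, which is why the argument below is independent of the size of the change.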

First, this makes some pretty simple predictions. It predicts, first of all, that the large-scale properties of the universe will be identical: it will last just as long. Structure will form in much the same way. There will, to first order, be just as many objects with mass m/2 in this hypothetical universe as there are in the current universe with mass m. Now, we might have to be careful in that the nonlinearities of gravity might create more dense objects, but I doubt they would create fewer of them. So I think just taking the number of objects with mass m/2 as in the current universe with mass m is a conservative assumption.

Then we have to ask: how many of these objects are black holes? Well, I was unable to find a closed form for the Tolman-Oppenheimer-Volkoff limit for neutron stars (I'm not sure one exists), but we can take a look at the Chandrasekhar limit:

m_c = \frac{\omega^0_3\sqrt{3\pi}}{2} \left(\frac{\hbar c}{G}\right)^\frac{3}{2}\frac{1}{\left(\mu_e m_H\right)^2}

So, in this hypothetical scenario where G is twice as large, the Chandrasekhar limit is smaller by a factor of 2^{3/2}. If the number of neutron stars were to stay unchanged, the limit would only need to be half its current value (matching the halved object masses). Instead it is smaller by a further factor of \sqrt{2}, so many objects whose counterparts in our universe are white dwarfs below the limit would become neutron stars, and you'll have many, many more.
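To make the bookkeeping explicit (the arithmetic is mine, following the scaling just described): the limit shrinks as G^{-3/2} while each object's mass shrinks only as 1/G, so with G doubled, any white dwarf above roughly 1/\sqrt{2} of today's Chandrasekhar limit would exceed the new limit.

```python
k = 2.0                        # factor by which G is increased

limit_scale = k ** -1.5        # Chandrasekhar limit shrinks as G^(-3/2): ~0.354
mass_scale = 1.0 / k           # each object's mass is halved: 0.5

# An object at fraction f of today's limit has mass f * mass_scale in units
# of the old limit; it exceeds the new limit when f * mass_scale > limit_scale.
f_threshold = limit_scale / mass_scale   # 2^(-1/2), ~0.707

print(f_threshold)   # ~0.707: objects above ~71% of today's limit collapse
```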

To avoid this you'd have to show that in actuality, the nonlinearity of gravity makes it so that you end up with far fewer small-mass objects than you'd expect from just taking the simple linear approximation. Or you'd have to show that the TOV limit is actually proportional to 1/G. I sincerely doubt that either is the case.

marcus said:
"preferred" is vague. Smolin's statement is a mathematical description of a local max---optimal tuning in other words. "Life" does not enter into the logic.
But it has to if there is to be any relevance of the claim to observational data.

marcus said:
How many "living beings" are there? Can you make that rigorous? Do you know if there are "dramatically few" or "dramatically many"? What you are calling the "correct prediction" is vague, I would say mathematically meaningless. And it is not the correct prediction in any case---to get there you first had to assume we aren't talking about optimal tuning.
Obviously it's a difficult thing to put into numbers. It's certainly beyond the amount of work I've put into it. But it must be done if you're going to try to test claims like this one.
 
  • #125
Rymer said:
The observable universe is not infinite. The rest is a philosophical debate and at the moment so is the question of 'acceleration'.
The accelerated expansion of the universe is hardly a philosophical debate. It's an observational fact.
 
  • #126
LtDan said:
Is it generally agreed among the people here that the universe is expanding? (I would say "expanding in size," but if the universe is infinite in size then that would make no sense.)
Absolutely. There is just no reasonable doubt that the universe is expanding. The evidence is quite conclusive.

LtDan said:
Among those of you who are "expansionists," is it generally agreed that the expansion taking place within any finite part of the universe is taking place uniformly, in much the same way as the expansion of a lump of rising bread dough?
This is unclear. We can only make definitive claims about our region of the universe, and cannot seriously consider very different regions.

LtDan said:
Is the rate of expansion steady or accelerating?
The evidence says accelerating.
 
  • #127
Hi Chal, I suspect your argument fails but I am glad to see you are working on it. To recap here, you want to show that the parameters we have now are not a local optimum for bh production. So you must show a small change that would increase production---I would say a 1% or 2% change would be appropriate, since he is talking about a local optimum in parameter space. Remember he has an evolutionary process in mind, and evolution finds hilltops in the fitness landscape---local maxima---not the highest mountains in sight. But I would be interested if you could think of even a 10% change that would have resulted in greater hole abundance.

Your original plan was to make gravity 100 times stronger.

Chalnoth said:
... Make G larger by a couple of orders of magnitude, and all matter that collapses will collapse into black holes...

That wasn't really very relevant. Now you want to make it twice as strong. That still wouldn't address local optimality (which is really a first order derivative or gradient thing, mathematically) but OK. You would deserve a big congratulations if you could even show an improvement from doubling the strength of gravity.

And to keep the universe from collapsing you say you want to reduce the density of each component to be half of what it is at present. As follows:
Chalnoth said:
It's not all that difficult.

Take the following situation:

1. G is larger by some factor (say two, for an example).
2. The average density of each component of the universe is smaller by the same factor.
...

You don't say what you want the expansion history to be, or what the present value of the Hubble rate is supposed to be. There's not enough detail for me to tell what happens in your universe. I do notice that different components scale differently with expansion. You just now cut the dark energy density in half (it is one component) and that density is not affected by expansion. It will be constant all the way back to the start of expansion. But the matter density is affected (as the inverse cube of the scale factor).
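The different scalings marcus points out can be written down directly. A sketch (densities normalized to illustrative present-day values): matter dilutes as the inverse cube of the scale factor, while a cosmological-constant dark energy does not dilute at all.

```python
def matter_density(rho_m0, a):
    """Matter (ordinary + dark) dilutes as the inverse cube of the scale factor."""
    return rho_m0 / a ** 3

def dark_energy_density(rho_de0, a):
    """A cosmological constant does not dilute with expansion."""
    return rho_de0

# Halving both present-day densities halves the matter density at *every*
# past epoch too, so the ratio of the two components at fixed a is preserved.
assert matter_density(0.5, 0.1) == 0.5 * matter_density(1.0, 0.1)
assert dark_energy_density(0.5, 0.1) == 0.5 * dark_energy_density(1.0, 0.1)
```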

By reducing the density of ordinary and dark matter, you do seem to me to run the risk of getting fewer black holes (when what you want is more.)

Just isn't enough here to decide if your example would obtain more black holes. In any case it would not disprove Smolin's conjecture of local optimality, but keep trying.

It would be nice if you or anybody could find a case that would disprove it! Competent people have tried. Notably the world-famous cosmologist Alex Vilenkin, who published a paper outlining his attempt to shoot the conjecture down. I would be delighted to see another such attempt by someone else---if they could make it stick, of course.

I suspect you do not have a counterexample (a modification that would produce a greater abundance of holes) but I encourage you to work harder on it and show more detail---particularly the expansion history, the Hubble parameter etc.
 
Last edited:
  • #128
marcus said:
Hi Chal, I suspect your argument fails but I am glad to see you are working on it. To recap here, you want to show that the parameters we have now are not a local optimum for bh production. So you must show a small change that would increase production---I would say 1% or 2% change would be appropriate, he is talking about a local optimum in parameter space.
The argument I used is independent of the scale of the change. It works as well for a 0.01% change as a 1000x change.

To work it out in more detail, I suspect you'd need some reasonably-large N-body simulations that include hydrodynamics, combined with better knowledge of the TOV limit, specifically how it scales with G. But I sincerely doubt you'd find anything but an exacerbation of these effects I noted, making even more black holes.

marcus said:
You don't say what you want the expansion history to be, or what the present value of the Hubble rate is supposed to be.
Well, true, I forgot to mention that in this hypothetical scenario, the universe would remain flat. But that should either have been clear from the context or gleaned from my previous post, where I stated it explicitly. In that situation, the expansion history would be identical to our own universe's, a fact chosen explicitly to make it easy to analyze. Linear structure formation should also progress in an identical manner to our own universe, with the differences coming in once you get to non-linear structure formation and the evolution of compact objects.

marcus said:
By reducing the density of ordinary and dark matter, you do seem to me to run the risk of getting fewer black holes (when what you want is more.)
Well, since structure formation runs in the same way as before, just with things having half the mass (due to half the original density), a naive analysis would equate each object with mass m in our universe to an object with mass m/2 in the hypothetical universe. Then we ask the question: Okay, if we take an object with mass m in the present universe that is not a black hole, would the equivalent object with mass m/2 in the hypothetical universe collapse into a black hole?

I can't answer that definitively, but I can show that it seems highly likely for some objects, since the Chandrasekhar limit scales as 1/G^{3/2}. That means objects whose counterparts in our universe have masses down to a factor of \sqrt{2} (about 1.414) below the current Chandrasekhar limit would become neutron stars. Given that the physics governing the Chandrasekhar limit and the TOV limit is very, very similar (both are based on the pressure that degenerate matter can support), it is not unreasonable to suspect that the TOV limit behaves in a similar fashion: even though the masses of the objects are lower, even lower-mass objects become black holes, so you get many more of them.
 
  • #129
Umm, if you decrease the hierarchy between gravity and the other forces, quite obviously you are going to increase black hole production. This is completely general in any local cosmological neighborhood and follows from both a semiclassical treatment and the conjectured (but pretty conservative) asymptotic darkness proposals. You can tweak and escape this conclusion by playing around with cosmological initial conditions and values (say the cc), but then you are right back to putzing around with idiosyncratic probability measures and wondering about how long and how big such and such a baby universe lasts, Poincaré recurrence times, and the like.

Related is the whole question of exactly how do you count black holes anyway, which arguably is just as problematic and vague.
 
Last edited:
  • #130
If Chalnoth's twiddling with G and the average mass density went through, wouldn't it only show that these particular parameters could have had a wide range, and so drop out of the basket of 30 that need to be entrained to maximal black hole production? You would still have the same problem of fine-tuning for the remaining constants?

It is really about how a package of constants hangs together. So playing two scaling factors off against each other is trivial. Getting potentially 20 or 30 to explore a mutual landscape of fitness would be a rather more multi-dimensional and non-linear affair?

Of course mass density does depend on some of those 30 parameters, like mixing angles. Can they then be subsumed into a single thing - average mass density?
 
  • #131
Apeiron, as I recall Smolin's conjecture explicitly involves only the 30 or so dimensionless parameters of the standard particle and cosmology models.
Newton's G is not part of what you get to play with. So you are right in a sense: as you suggested, G is not in the set.

You can think of the Planck units as the units in which the other stuff is expressed; changing G, or hbar, or c just changes the units, not the physics. The real stuff, like the mass of the electron, is expressed as a ratio to the Planck mass (that ratio is one of the dimensionless Thirty).
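As a numerical illustration of that point (constants rounded to a few digits): the electron mass divided by the Planck mass sqrt(hbar*c/G) is dimensionless, so it comes out the same whether you evaluate it in SI or CGS units.

```python
import math

def mass_ratio(hbar, c, G, m):
    """m divided by the Planck mass sqrt(hbar*c/G); a dimensionless number."""
    return m / math.sqrt(hbar * c / G)

# SI units: J s, m/s, m^3 kg^-1 s^-2, kg
r_si = mass_ratio(1.0545718e-34, 2.99792458e8, 6.674e-11, 9.1093837e-31)
# CGS units: erg s, cm/s, cm^3 g^-1 s^-2, g
r_cgs = mass_ratio(1.0545718e-27, 2.99792458e10, 6.674e-8, 9.1093837e-28)

assert math.isclose(r_si, r_cgs)   # ~4.2e-23 either way: the units drop out
```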

John Baez has an online essay on the dimensionless parameters, as I recall. You've probably seen it. If not, say and I could get a link.
 
  • #132
In case anyone else would like to follow Chalnoth's and Haelfix's example and try their hand at disproving Smolin's conjecture of bh optimality, I will link this article by Frank Wilczek, Martin Rees, Max Tegmark, and Tony Aguirre, which lists the 31 basic dimensionless constants you get to play with:
http://arxiv.org/abs/astro-ph/0511774
Dimensionless constants, cosmology and other dark matters
Max Tegmark (MIT), Anthony Aguirre (UCSC), Martin J Rees (Cambridge), Frank Wilczek (MIT)
29 pages, 13 figs; Phys.Rev.D73:023505,2006
(Submitted on 29 Nov 2005)
"We identify 31 dimensionless physical constants required by particle physics and cosmology, and emphasize that both microphysical constraints and selection effects might help elucidate their origin. Axion cosmology provides an instructive example, in which these two kinds of arguments must both be taken into account, and work well together. If a Peccei-Quinn phase transition occurred before or during inflation, then the axion dark matter density will vary from place to place with a probability distribution. By calculating the net dark matter halo formation rate as a function of all four relevant cosmological parameters and assessing other constraints, we find that this probability distribution, computed at stable solar systems, is arguably peaked near the observed dark matter density. If cosmologically relevant WIMP dark matter is discovered, then one naturally expects comparable densities of WIMPs and axions, making it important to follow up with precision measurements to determine whether WIMPs account for all of the dark matter or merely part of it."

It would seem a fairly reliable paper: Wilczek is a Nobel laureate, and Rees is the UK Astronomer Royal.
 
Last edited:
  • #133
You are free of course to redefine G's role as a constant all you want, but the ratio between physics scales is something that is, as they say, tuned for life. So something like the ratio between the gravitational force and the strong force is a physical quantity that can be adjusted in principle.

This actually shows up when you study things like large extra dimensions, where the hierarchy scale is reduced drastically and where you might expect to see microscopic black hole production in accelerators. There are good reasons to believe this is not the case in our world (at least the original versions of ADD) but people do take it seriously in phenomenology and vast literatures exist on the subject.

For CNS, you can then start to ask questions like 'Do those microscopic black holes count in the fecundity measure?" "Do they lead to larger relic densities of stellar black holes over the age of the universe?", and so forth.
 
  • #134
If anyone would like to take the conjecture seriously and try to prove it false, a good source is http://arxiv.org/abs/hep-th/0407213

This specifies black hole formation "from massive stars".
The conjecture concerns the standard dimensionless constants being optimal for bh formation from massive stars.

See pages 31-33.

An earlier post suggested changing the "hierarchy of forces" to make gravity stronger compared to the others---say the strong force. Smolin appears to have thought of that in his first papers on this in the 1990s. This amounts to decreasing \alpha_{strong}.
What he observes is that this would destabilize nuclei that participate in or expedite massive star formation. In other words, it appears to be counterproductive to change the "hierarchy of forces" in the way suggested in the previous post.

You might want to read pages 31-33 rather than rely on hearsay. They comprise sections called "Successes of the Theory", "Previous Objections", and "Why a Single Heavy Pulsar Could Refute [the optimality conjecture]".
 
Last edited:
  • #135
Eh? If the ratio between alpha strong and alpha EM/weak remains the same, nuclei will not be destabilized, nor will nuclear fusion processes be altered. You would have to change gravity's strength by some 15+ orders of magnitude to have any effect on that whatsoever.
 
  • #136
You are acting like you haven't read the paper. He deals with that case as well (where the weak coupling constant is decreased).

I don't have time to transcribe every case here. So please read the paper. If you want to seriously address this conjecture and try to refute it, then I suggest you read and think about what he has to say instead of expecting me to transcribe it line by line.

The paper I cited has references to earlier papers that go over in more detail what seems to go wrong playing with the coupling constants as you suggest, and much else besides.

Trying to disprove the conjecture could be a good research project for someone (with the ability to address it effectively). If anyone could disprove it, the result would presumably be publishable, and would make them well-known.
Smolin cites previous attempts to do the sort of thing you are suggesting---find a way of adjusting some of the 30-odd parameters that would have resulted in more holes.
 
  • #137
Marcus, try to understand that I am not talking about changing the strength of the strong force, the electromagnetic force, or the weak force---merely the size of Newton's constant relative to them (keeping the other three's interrelations fixed), since that was what was brought up earlier in the thread. The paper you keep pointing out is talking about an entirely different scenario (namely varying details of the nuclear forces, say the mass difference between neutrons and protons, and so forth); it explicitly does not deal with gravity's strength at all!

Anyway, I generically disagree with the paper (and in fact with pretty much the entire literature dealing with probability distributions in multiverses, whether they be anthropic, CNS, stringy, or anything else). Incidentally, someone raised this exact point about G to Lee when he gave this lecture several years ago at a conference I attended. He then gave a rather foggy argument that I forget exactly, based on galactic physics and stellar evolution (note this conveniently sidesteps the microscopic bh production point).

In fact, I completely agree with him that stellar black hole production rates are debatable. For instance, the most obvious thing that occurs when you change gravity's strength (fixing everything else) is to change the details of the HR diagram. Stars will tend to live shorter lifetimes. When stars live shorter lifetimes, the dynamics and timescales of molecular cloud formation and the ensuing star formation are altered. Why? Because the clouds rely on ionizing radiation from supernovae to seed the conditions for future star births. Since the dynamics of supernovae are also altered when you mess with G, it requires numerical simulations to handle quantitatively. In short, a complicated mess with lots of competing feedbacks that no one understands sufficiently well. Having said that, you get so much more clumping and so many more seeds that it's a little hard to see the feedbacks competing with a change of a couple of orders of magnitude in G, but anyway I won't debate this part of the hypothesis.

Still, generically you do expect a lot of micro black hole production with a reduced hierarchy, and that's conveniently ignored in the hypothesis; it gets into exactly how you measure the distribution of black holes in the first place. Do mergers count as one or two? What about a single black hole with an event horizon the size of our visible universe---how do we count that? Is there a mass cutoff that we are considering here (keeping in mind that different universes might have different mass hierarchies)? What timescales are we talking about? Do we correct for recurrence times? And so on and so forth. It's essentially the same sort of nonsense that people argue about when dealing with anthropic multiverse measures.
 
Last edited:
  • #138
Haelfix, your speculation and reasoning are interesting but unrelated to Smolin's conjecture as stated, for example, in the paper I just cited.
There he conjectures that the dimensionless parameters are fine-tuned for astrophysical black holes---those produced by the collapse of massive stars.

So varying G is irrelevant (although Chalnoth mentioned it), because G is not dimensionless.

And talking about microscopic black holes is irrelevant.

What somebody needs to do is come to grips with the actual conjecture and find some reason to dismiss it. That is, make a serious effort to refute it, as I take it Vilenkin did, and also apparently Joe Silk in 1997. Silk was at UC Berkeley then and later moved to Oxford. He is one of the most eminent cosmologists in the world. His attack on Smolin's conjecture was apparently published in Science. But it didn't cut the mustard. So Vilenkin tried a few years later, after Smolin's 2004 essay appeared in a book of scholarly papers published by Cambridge U.P.

==============
EDIT to reply Chalnoth.
Dear Chalnoth, the conjecture is the conjecture as stated. It concerns astrophysical black holes resulting from a certain process---the collapse of massive stars. What you are proposing to do (or Haelfix, if he still considers that an appropriate reaction) is not science. In science, you do not alter the hypothesis in order to reject it. :biggrin:

No one is arbitrarily "throwing out" micro holes, because they were never included in the conjecture in the first place. :smile:
 
Last edited:
  • #139
marcus said:
What somebody needs to do is come to grips with the actual conjecture and find some reason to dismiss it.
Actually, Haelfix did just that through reference to microscopic black holes. You can't just arbitrarily throw those out without a reason to do so.
 
  • #140
marcus said:
Haelfix your speculation and reasoning is interesting but unrelated to Smolin's conjecture as stated for example in the paper I just cited.
There he conjectures that the dimensionless parameters are fine tuned for astrophysical black holes----those produced by collapse of massive stars....

What somebody needs to do is come to grips with the actual conjecture and find some reason to dismiss it. ...

==============
EDIT to reply Chalnoth.
Dear Chalnoth, the conjecture is the conjecture as stated. It concerns astrophysical black holes resulting from a certain process---the collapse of massive stars. What you are proposing to do (or Haelfix, if he still considers that an appropriate reaction) is not science. In science, you do not alter the hypothesis in order to reject it. :biggrin:

So far no refutation has been sustained, either in the playpen or outside in the real intellectual world (by competent experts who have tried).

It may very well be wrong. If someone could show that they could gain considerably in reputation. (I for one would be delighted.)

As explicitly stated, this is about the dimensionless parameters being at a local max for astro holes. A good source is the 2004 paper. But there is also a 2006 paper, and a couple of scholarly books with chapters devoted to it:
Bernard Carr's Universe or Multiverse? (Cambridge 2007)
Rudy Vaas Beyond the Big Bang (Springer 2009)
The 2006 paper is
http://arxiv.org/abs/hep-th/0612185
The 2004 paper, in case someone didn't get the link earlier, and hasn't read the relevant parts yet, is
http://arxiv.org/abs/hep-th/0407213

In everything I've seen the conjecture has been explicit and consistent with what I just said (dimensionless, astrophysical). It rests on quite a bit of discussion which anyone can read if they wish.
It is naive to think that "you can't" propose such a conjecture.

Another book is in preparation---Smolin and Unger---about this and related topics. Time, for example.

I see that Springer has finally sent out reviewer copies of "Beyond the Big Bang". Here is the table of contents.
http://www.springer.com/astronomy/general+relativity/book/978-3-540-71422-4?detailsPage=toc
Apparently won't be in the bookstores until December 2009.
 
Last edited by a moderator:
  • #141
marcus said:
It may very well be wrong. If someone could show that they could gain considerably in reputation.
Not very likely. It'd be more of an interesting footnote than something that actually makes a scientist's reputation. Showing something false that is already extremely hypothetical and unlikely isn't particularly interesting.

What is interesting is when a new result shows that old, widely-accepted ideas are false, especially if a better theory is proposed in their stead. Showing this bit of speculation false is nothing of the sort, except to a select few who are interested in tackling such esoteric problems.

Edit: I'd also like to add that as far as I'm concerned, the conjecture that our universe is very near an actual local maximum for the production of stellar black holes is so incredibly unlikely that I see no reason to spend much time investigating it.

Why is my prior probability on this eventuality so low? Well, it just comes down to this: life seems to be pretty special. It seems to be required that a large variety of physical processes be just so for life to even exist, meaning that life traces out a very tiny fraction of this thirty-dimensional parameter space we're talking about.

As a result, it seems rather ludicrous to me that this tiny region of the large parameter space will just happen to also be the optimum of something else, like stellar black hole production.
 
Last edited:
  • #142
Excellent! So you have given up trying to disprove the conjecture.
Instead you declare that it doesn't interest you because according to your subjective judgment it is unlikely to be true.

This is fine. Everybody should be guided by their personal intuition in these matters. :smile:

Chalnoth said:
... as I'm concerned, the conjecture that our universe is very near an actual local maximum for the production of stellar black holes is so incredibly unlikely that I see no reason to spend much time investigating it...
 
