What Could Cause a Big Crunch in the Expansion of the Universe?

  • Thread starter: TheIsland24
  • Tags: Expansion, Universe

Summary
The discussion centers on the accelerating expansion of the universe and the concept of a potential Big Crunch. It highlights that while the universe's expansion increases its volume and decreases density, gravity acts to decelerate this expansion. However, the presence of dark energy is causing an acceleration in the expansion rate, which complicates the possibility of a Big Crunch. The conversation also clarifies that gravity influences the expansion dynamics, but dark energy behaves differently, contributing to the acceleration. Ultimately, the universe's fate depends on the interplay between matter density and dark energy, with current understanding suggesting a continued expansion rather than a collapse.
  • #91
marcus said:
I disagree. The whole point of Smolin's discussion is that it does NOT refer at any point to life.
Then that's a serious problem. Because anything about our universe that is required for the existence of intelligent observers is not something we have a right to be surprised about. And stars certainly appear to be a requirement. So we have no right to be surprised that they exist, which means any attempt to explain why they exist in the first place is solving a problem that doesn't exist.
 
  • #92
apeiron said:
...Davies from http://arxiv.org/ftp/astro-ph/papers/0403/0403047.pdf

"Another multiverse model has been discussed by Smolin14. He proposes that “baby”
universes can sprout from existing ones via the mechanism of gravitational collapse.
According to the classical picture, when a star implodes to form a black hole, a spacetime
singularity results in the interior of the hole. Smolin suggests that a quantum treatment
would lead instead to the nucleation of a tiny new region of inflating space, connected to
our space via a wormhole. Subsequent evaporation of the black hole by the Hawking
process severs the wormhole, thereby spatially disconnecting the baby universe from
ours. Furthermore, following Wheeler15, Smolin proposes that the violence of
gravitational collapse might ‘reprocess’ the laws of physics randomly, producing small
changes in values of parameters such as particle masses and coupling constants. Thus the
baby universe will inherit the physics of its parent, but with small random variations,
similar to genetic drift in biological evolution. This process could continue ad infinitum,
with baby universes going on to produce their own progeny. It would also imply that our
universe is the product of an earlier gravitational collapse episode in another universe.
Those universes whose physical parameters favoured black hole production, for example
by encouraging the formation of large stars, would produce more progeny, implying that
among the ensemble of universes with all possible variations of the laws of physics, those
universes with prolific black hole production would represent the largest volume of
space." ...

I'm aware of Smolin's ideas about cosmic evolution via bouncing black holes. I've never taken them very seriously. Too much Hat and not enough Cattle, as they say in Texas. But in this thread it seems that there are people who do take such stuff seriously. And the Davies quote you kindly contributed puts Smolin's views succinctly.

Perhaps one of the posters in this thread could clarify the following difficulty I have:

The bounce of an old black hole into a new universe is described above specifically as a process. A process is something that takes time. But just whose "time" is this?

That of observers in the old universe outside the black hole? Can't be. For them any process involving the actual formation or growth of a black hole, the actual transfer of mass through its horizon, takes an infinite time (although they don't find the external gravitational field of infalling matter to be static).

In fact it is hard to see how any such observers can claim that any black holes "exist", since existence involves being present "now". And "now" is not a universal instant when there are black holes bouncing around.

That of observers inside an horizon? Lucky them if their infalling is accompanied by enough matter to render tidal forces innocuous!

God's time? Then physics has morphed into theology --- more Hat!

Isn't Smolin (or Davies) being rather sloppy here --- or perhaps just too anthro'centric?
 
  • #93
oldman said:
Perhaps one of the posters in this thread could clarify the following difficulty I have:

The bounce of an old black hole into a new universe is described above specifically as a process. A process is something that takes time. But just whose "time" is this?
I don't think that's a valid complaint. The word "process" in physics just denotes the behavior of some physical system or other.

oldman said:
That of observers in the old universe outside the black hole? Can't be. For them any process involving the actual formation or growth of a black hole, the actual transfer of mass through its horizon, takes an infinite time (although they don't find the external gravitational field of infalling matter to be static).

In fact it is hard to see how any such observers can claim that any black holes "exist", since existence involves being present "now". And "now" is not a universal instant when there are black holes bouncing around.
The thing about general relativity is that it demonstrates that there is no such thing as a global "now". "Now" is only explicitly defined at a single point, and observers at one point can't make any definitive statements about what is happening "now" far away. That's not to say you can't define a "now" far away, rather that the definition is arbitrary.

oldman said:
That of observers inside an horizon? Lucky them if their infalling is accompanied by enough matter to render tidal forces innocuous!
I don't quite see why this is important. The time coordinate is going to get quite messy here anyway, and you really shouldn't expect naive analyses to capture the nature of how time applies to this hypothetical process.
 
  • #94
Chalnoth said:
I don't think that's a valid complaint. The word "process" in physics just denotes the behavior of some physical system or other.

But "behaviour" also implies finite duration --- can you give me an example of behaviour without such duration? Perhaps you are thinking along the lines of a "block universe" such as envisaged by Julian Barbour?

The thing about general relativity is that it demonstrates that there is no such thing as a global "now". "Now" is only explicitly defined at a single point, and observers at one point can't make any definitive statements about what is happening "now" far away. That's not to say you can't define a "now" far away, rather that the definition is arbitrary.

Yes. But when you ordinarily talk about something that "is" or "exists", say a black hole at the centre of our galaxy, you're implying that it does so "now". You can define "now" by choosing a section of spacetime in which you define events to be simultaneous, according to some agreed protocol. E.g. in a FLRW universe you could say "now" everywhere is when the CMB has the same temperature as that you find. Is this what you mean by an arbitrary definition?

What do you do when a Schwarzschild event horizon --- which dilates duration to infinity for external observers -- separates the universe into different domains? How can you then even talk about evolution, such as "black hole formation", or "black hole growth", in ordinary terms which imply duration? As Davies does.

I don't quite see why this is important. The time coordinate is going to get quite messy here anyway, and you really shouldn't expect naive analyses to capture the nature of how time applies to this hypothetical process.

Sounds as if it is too tricky to describe clearly. But thanks.
 
  • #95
oldman said:
But "behaviour" also implies finite duration --- can you give me an example of behaviour without such duration? Perhaps you are thinking along the lines of a "block universe" such as envisaged by Julian Barbour?
That's only a failure of our language. It's easy enough to describe the behavior of physical systems even if you wish to consider a hypothetical system without any time coordinate.

oldman said:
Yes. But when you ordinarily talk about something that "is" or "exists", say a black hole at the centre of our galaxy, you're implying that it does so "now". You can define "now" by choosing a section of spacetime in which you define events to be simultaneous, according to some agreed protocol. E.g. in a FLRW universe you could say "now" everywhere is when the CMB has the same temperature as that you find. Is this what you mean by an arbitrary definition?

What do you do when a Schwarzschild event horizon --- which dilates duration to infinity for external observers -- separates the universe into different domains? How can you then even talk about evolution, such as "black hole formation", or "black hole growth", in ordinary terms which imply duration? As Davies does.
Well, you just have to recognize that reality is a bit more subtle.

oldman said:
Sounds as if it is too tricky to describe clearly. But thanks.
I wouldn't say that. I'd say it's too tricky to describe simply. Any truly accurate description is going to be messy and complex and include lots of mathematics. Now, I suppose it is possible that one could find a contradiction in the math there somewhere (I haven't looked at it in detail myself), but finding an apparent contradiction in a simplified description isn't going to invalidate the hypothesis. It just means the simplified description is likely missing some important details of the physics which it is trying to describe.
 
  • #96
Chalnoth said:
We can never see the photons that leave it after a certain time (determined not simply by its recession velocity, but by the future expansion history of the universe). But this doesn't mean that we cease to see it: we see its after-image forever. It just gets dimmer and dimmer. And, as near as we can tell, time slows and slows for this image as time goes forward, and the apparent age of the object in our after-image asymptotically approaches the age at which the object crossed our horizon.

Note, however, that this is only true in an accelerating universe. If the universe were not accelerating, or stopped accelerating at some point in the future, then there would be no future horizon, and, given infinite time, we would be able to see the full history of all objects in the universe.

Acceleration has nothing to do with it.
 
  • #97
Rymer said:
Acceleration has nothing to do with it.

Boggle. Of course it does. A non-accelerating expansion does not have this horizon. You only get the horizon with an accelerating expansion.
 
  • #98
apeiron said:
Marcus, here is what Paul Davies said about Smolin's multiverse. A good summary.
...
Apeiron, I think Davies' account misses the main point and is not at all a summary, much less a good one. One problem is simply that Davies' article is old (March 2004) and references nothing of Smolin's but a popular 1997 book.

Evidently the discussion has changed over the course of time---maybe your reference to the evolution of viruses is pertinent.

What Smolin has emphasized in recent talks about this, and what interests me about it, is what I consider to be a hard empirical hypothesis. There are thirty-some parameters that go into either the current cosmo model or the standard particle model. Perhaps fewer, but call them the Thirty, for short. Are the Thirty optimal for black hole abundance?

This has no claim in it about the existence of a multiverse, or about what happens down a black hole. It represents an hypothesis to be tested. If it is false then it should be possible to find a parameter which, if it had been slightly different, would have resulted in greater hole abundance.

I have to go, back later today.
 
  • #99
Ah, okay. This is an idea I've heard before. I find it very interesting to think about. Now, I also find it highly speculative and almost certainly wrong. But it is very interesting.

One thing I've been thinking about is that in order to falsify the hypothesis, you don't only need to show that tweaking the parameters in a certain direction would create more black holes, but you also need to show that there is still at least the possibility of life in that direction (a difficult thing, but at least we can put outer bounds on it).

What we expect to see from this hypothesis is that the parameters are right on the outer bounds of what's possible for life, with black hole number being maximized given the constraint of life existing. Not because life has to exist, but because living beings like ourselves can only observe regions where it does: the hypothesis likely predicts that by far most of these "baby universes" will not be conducive to life.
 
  • #100
BTW my nickname for the 30-odd essential parameters that rule the standard models of cosmo and particle physics is a classical joke :redface:
In Greek history the Thirty were a group of oligarchs which governed Athens for a brief period around 404 BC.

The hypothesis is very simple and says nothing about Life or "Multiverses". It is simply the falsifiable conjecture that the Thirty are optimal for black hole abundance.

We are being challenged simply to falsify this. That means to find a small change in the parameters which would have led to more black holes.

This challenge was published in 1992. People (like Alex Vilenkin) have tried to shoot it down, but as yet it has not been shot down. The longer it survives the more likely people are to see it as something requiring explanation.
 
  • #101
marcus said:
BTW my nickname for the 30-odd essential parameters that rule the standard models of cosmo and particle physics is a classical joke :redface:
In Greek history the Thirty were a group of oligarchs which governed Athens for a brief period around 404 BC.

The hypothesis is very simple and says nothing about Life or "Multiverses". It is simply the falsifiable conjecture that the Thirty are optimal for black hole abundance.

We are being challenged simply to falsify this. That means to find a small change in the parameters which would have led to more black holes.

This challenge was published in 1992. People (like Alex Vilenkin) have tried to shoot it down, but as yet it has not been shot down. The longer it survives the more likely people are to see it as something requiring explanation.

I like your Thirty.

Question: all this seems to be assuming there really are 'black holes'. I'm not sure anymore what the term means. At one time it seemed to mean that NOTHING could escape. But my current understanding is that this is not the case. There are 'quantum' level events that do escape (at least these, and maybe more).

The reason I bring up the question is: is this prediction about 'black holes' or just about very super-compacted matter objects? And how would the difference affect such postulated tests?

Not clear what is being 'counted'.
 
  • #102
sylas said:
Boggle. Of course it does. A non-accelerating expansion does not have this horizon. You only get the horizon with an accelerating expansion.
A non-accelerating expansion most certainly does have a horizon, as does a non-expanding universe! All that is required to have a horizon for the "observable" universe is for the universe to be larger in light years than it is old in years. I.e., a universe that is 1 year old and not expanding will have an observable size of 1 ly, and a universe 14 billion years old will have an observable size of 14 billion light years.

For a constantly expanding universe that started at a big bang, that requires an expansion to always exceed the speed of light.
 
  • #103
russ_watters said:
A non-accelerating expansion most certainly does have a horizon, as does a non-expanding universe!
Technically, a non-accelerating universe has a past horizon, meaning there exist objects that have never been in causal contact with us. It doesn't, however, have a future horizon, meaning that eventually everything will be in causal contact with us (or rather, with our position in space-time...we certainly won't be around).
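The past/future distinction Chalnoth draws can be checked with a toy integration. In comoving coordinates, the distance light covers between times t1 and t2 is the integral of dt / a(t): a past (particle) horizon exists when the integral from t = 0 converges, and a future (event) horizon exists when the integral to t = infinity converges. The sketch below is my own illustration, not from the thread; the scale factors and units (c = 1, present time t0 = 1) are assumptions chosen for simplicity.

```python
# A numeric sketch of the two horizons (units chosen so c = 1 and t0 = 1).
# Matter-dominated (decelerating): a(t) = t**(2/3).
# De Sitter (accelerating):        a(t) = exp(t - 1).
import math

def comoving_path(a, t1, t2, steps=200_000):
    """Midpoint-rule estimate of the integral of dt / a(t) from t1 to t2."""
    dt = (t2 - t1) / steps
    return sum(dt / a(t1 + (i + 0.5) * dt) for i in range(steps))

a_matter = lambda t: t ** (2.0 / 3.0)
a_desitter = lambda t: math.exp(t - 1.0)

# Past (particle) horizon: finite in the matter model, so at t0 = 1 there
# exist objects that have never been in causal contact with us.
past = comoving_path(a_matter, 1e-9, 1.0)            # analytic value: ~3
# Future reach grows without bound in the matter model (no event horizon)...
future_matter = [comoving_path(a_matter, 1.0, T) for T in (10, 100, 500)]
# ...but saturates in the accelerating model (an event horizon exists).
future_ds = [comoving_path(a_desitter, 1.0, T) for T in (10, 100, 500)]

print(round(past, 1))                        # 3.0: finite past horizon
print([round(x, 1) for x in future_matter])  # ~3(T**(1/3) - 1): keeps growing
print([round(x, 2) for x in future_ds])      # ~1.0 each: converged
```

The matter-dominated integral to the future diverges like 3(T^(1/3) - 1), which is why "eventually everything will be in causal contact with us" in a non-accelerating model, while the exponential case saturates.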
 
  • #104
Hi Rymer and Chalnoth!

Chalnoth, it occurs to me that Smolin may have confused things by putting the cart before the horse, talking too much about a possible explanation that could be offered if his very interesting conjecture were established.

The conjecture (concisely stated at the bottom of page 29 of hep-th/0407213) is that the Thirty are optimal for black hole abundance.

Nothing said about Life or Multiverses or Bounce Scenarios :-D. There is a clear challenge to disprove the conjecture by finding a small modification of the Thirty that would have increased the count---say by lowering the collapse threshold so that a less massive star could collapse to a hole, and thus a higher percentage of stars could form holes.

Or by making it easier for clouds of gas to condense in the first place so that there would have been more and larger stars to eventually form holes.

It would be great if someone could prove the conjecture false.

On the other hand if it is true, then it is a peculiar instance of fine-tuning and definitely calls for some explanation. But first we should see if it is true or not.

That is my point of view, in any case. Sorry if that wasn't clear at the start.
=======

Rymer, I'm glad you liked the reference to the Thirty Tyrants who ruled Athens for a brief period. I hope that the reign of this Thirty is also curtailed and that our standard models can be simplified so that the number of essential parameters is reduced. Fewer free parameters mean a more elegant, and also more predictive, theory.

Rymer said:
...The reason I bring up the question is: is this prediction about 'black holes' or just about very super-compacted matter objects? And how would the difference affect such postulated tests?

Not clear what is being 'counted'.

Rymer, black holes are a regularly catalogued astronomical object. No one knows what happens inside them, though there are various models. Astronomers know black holes when they see them, and they have their customary ways of distinguishing them from other compact objects like neutron stars. There is a fairly complete picture of what conditions lead to their formation.

So that is what is being counted.

However you might still be wondering what goes on inside---what the competing models are. One can chuck General Relativity out because it suffers from a singularity (it stops computing meaningful numbers, blows up, and says infinite curvature and other unrealistic stuff).
But there are other models and you can find them at arxiv.org.
Some names of researchers:
Leonardo Modesto
Kevin Vandersloot
Christian Boehmer
Dah-wei Chiou

These people model black hole collapse using a formalism that does not blow up and does not develop singularities. It's work in progress. The research papers consider gradually more and more general cases. The effort is to gradually get rid of simplifying assumptions like homogeneity and isotropy. What happens if the collapse is slightly lopsided? And so on.
If I'm not mistaken, both Modesto and Chiou delivered papers on this at a conference in Beijing last week.
(Of course Smolin must be happy with this research because in some cases what they found was a bounce resulting in a new region of expanding space, but it is still early days. One should not count the chickens before the eggs hatch---nobody knows how this line of research will go.)
 
  • #105
I think you're failing to get my point again, so let me see if I can restate it.

If we want to falsify Smolin's hypothesis, we must bear in mind that only a limited parameter space is available for investigation: that part of the parameter space that can be observed.

It is entirely possible that the optimum point in the parameter space is very, very far outside of the region that can possibly be observed. But that doesn't make the hypothesis false, because there might be tons of stuff outside of what can be observed that is more in line with the hypothesis. Only a small fraction will be in the possibly-observable region of the parameter space, but that's okay. There's nothing that says the maximum must be within the observable region.

So, to test the hypothesis, we are required to stick to just the possibly-observable region, and see whether the region we observe is tilted in the direction we would expect within this parameter space, given this hypothesis of the universe being fine-tuned for the production of black holes.

And what defines this possibly-observable region? Well, that's the region in the parameter space where observers can be found: the region where complex life is possible. Nothing else is observable, so it doesn't matter for hypothesis-testing.

------------

With that said, let me make use of the above to argue why I think that Smolin is unlikely to be onto anything of importance here. To do this, I'll simply point out what is required for lots and lots of black holes. You need:

1. A universe that lasts a long time. If it recollapses very quickly, obviously few if any black holes will form.
2. A universe that has structure formation. Stuff has to get clumpy before black holes have a chance to form.
3. At least some of the matter needs to be dissipative. That is, it has to experience friction so that it can collapse more readily.

And I believe that's all of it. Point (1) is met by our universe being nearly flat, and the cosmological constant not being too large. Point (2) is met by a combination of the magnitude of primordial perturbations, the existence of normal + dark matter, and the cosmological constant being very small. Point (3) is met by the existence of the electromagnetic force.

The problem I have with this, however, is that these are all also requirements for life. And if the requirements for life include the requirements for black holes, then clearly any universe with life in it is going to have lots and lots of black holes. So even if there is an appearance of fine tuning, I don't think we can take it seriously.

The only real way to test it would be to demonstrate succinctly that within the limited range of the observable parameter space, this theory predicts that universes on one end of the parameter space are vastly, vastly more likely than universes on the other, such that we should find ourselves strongly to one end of said parameter space.
 
  • #106
russ_watters said:
A non-accelerating expansion most certainly does have a horizon, as does a non-expanding universe! All that is required to have a horizon for the "observable" universe is for the universe to be larger in light years than it is old in years. I.e., a universe that is 1 year old and not expanding will have an observable size of 1 ly, and a universe 14 billion years old will have an observable size of 14 billion light years.

For a constantly expanding universe that started at a big bang, that requires an expansion to always exceed the speed of light.

There are several horizons you can speak of. The one that was being discussed in [post=2291438]msg #16[/post] is often called the "event horizon". It partitions spacetime into (A) events we have "seen" or that we can "see" if we only wait long enough, and (B) events that we can never see, no matter how long we wait.

If someone speaks of a photon never being able to reach us, then that means they are proposing it is past the event horizon.

The event horizon only exists in an accelerating universe.

It doesn't make sense to speak of an expansion exceeding the speed of light. Expansion is not a velocity, in those units. The units of expansion are basically inverse time. Often it is given in km/sec/Mparsec. But you can also give it as sec^-1, and it means the inverse of the time it would take to double in scale factor at the current expansion rate.

There are always regions in the universe that are receding faster than the speed of light. (Using proper time, proper distance co-ordinates.) It is perfectly possible to see a galaxy which is now and always has been receding with a recession velocity greater than c. As long as the expansion is not accelerating, a photon will be passing into new regions with smaller and smaller recession velocities, and eventually into regions where co-moving galaxies are receding at less than light speed, and then finally to our own local region, allowing us to see that distant galaxy.

Cheers -- sylas
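Both of sylas's points lend themselves to a quick numerical check: converting the conventional km/s/Mpc figure to inverse seconds gives the corresponding timescale directly, and in a decelerating toy model a photon emitted from a galaxy receding faster than c does eventually reach us. This sketch is my own, not from the thread; the illustrative H0 value and the matter-dominated scale factor (units with c = 1, t0 = 1) are assumptions.

```python
# (1) Expansion rate as inverse time: convert km/s/Mpc (70 is an
# illustrative value) to 1/s and to a timescale in Gyr.
H0_kms_Mpc = 70.0
Mpc_m = 3.0857e22                          # metres per megaparsec
H0_si = H0_kms_Mpc * 1e3 / Mpc_m           # ~2.27e-18 per second
hubble_time_Gyr = 1.0 / H0_si / 3.156e16   # ~3.156e16 seconds per Gyr
print(round(hubble_time_Gyr, 1))           # 14.0

# (2) Decelerating (matter-dominated) toy model: a(t) = t**(2/3), so
# H(t) = 2/(3t), and a photon's proper distance from us obeys
# dD/dt = H(t)*D - 1. Start it beyond the Hubble distance 1/H(1) = 1.5,
# so the emitting galaxy recedes faster than light, and watch it arrive.
t, dt, D = 1.0, 1e-5, 2.0                  # initial recession speed H*D = 4/3 > c
while D > 0 and t < 100:
    D += (2.0 / (3.0 * t) * D - 1.0) * dt
    t += dt
print(round(t, 2))                         # 4.63 (analytic arrival: 125/27)
```

The photon initially loses ground in proper distance, but because the expansion decelerates it keeps crossing into regions of smaller recession velocity and eventually arrives; in an accelerating model the same setup would see D grow forever.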
 
  • #107
Chalnoth said:
I think you're failing to get my point again, so let me see if I can restate it.
Dear me. As far as I can see from what you have just posted, I have not once failed to get your point, Chalnoth. So I certainly cannot fail "again." You are confusing possibility with optimality, and getting the possibility of life mixed up with actual fine-tuned optimality for hole abundance.

But a lot of what you say is good. Like this:

I'll simply point out what is required for lots and lots of black holes. You need:

1. A universe that lasts a long time. If it recollapses very quickly, obviously few if any black holes will form.
2. A universe that has structure formation. Stuff has to get clumpy before black holes have a chance to form.
3. At least some of the matter needs to be dissipative. That is, it has to experience friction so that it can collapse more readily.

And I believe that's all of it. Point (1) is met by our universe being nearly flat, and the cosmological constant not being too large. Point (2) is met by a combination of the magnitude of primordial perturbations, the existence of normal + dark matter, and the cosmological constant being very small. Point (3) is met by the existence of the electromagnetic force.

This may not be all that we have to fine-tune for so as to maximize hole production. There may be some particle physics allowing neutron stars to collapse more readily etc etc. But you have given a helpful summary of a good many of the factors.

The problem I have with this, however, is that these are all also requirements for life. And if the requirements for life include the requirements for black holes, then clearly any universe with life in it is going to have lots and lots of black holes. So even if there is an appearance of fine tuning, I don't think we can take it seriously.

You have argued that any universe with life in it is going to have lots and lots of black holes. Unfortunately you have not argued that any universe with life in it is going to be optimized for black holes.

That is, incapable of improvement by a small adjustment.

Surely life is not so difficult to provide as a possibility, in a suboptimal universe. E.g. take our own universe and change one of the parameters which is less critical for life and more critical for holes (like the mass of a certain quark not found in normal earthlike environments). Change our Thirty just slightly, keeping life possible but making the universe suboptimal---there you have your example :-D.

However it sounds like you don't want to take seriously the possibility that the conjectured optimality is correct. Alex Vilenkin (one of the top cosmologists) took it seriously enough to publish a paper trying to disprove the conjectured optimality of parameters. But it is perfectly fine if you, Chalnoth, don't feel like taking it seriously. I do. It is part of the Baconian empirical tradition, the scientific method and all. Somebody puts up a falsifiable conjecture that looks like it might be correct, so you try to test it and reject it.
 
  • #108
I'll add this recent paper of Fred Adams to the discussion. He varies the fundamental constants and sees what stars would be like. (It's not directly applicable to what we are talking about, but still kind of related.)

http://arxiv.org/abs/0807.3697
Stars In Other Universes: Stellar structure with different fundamental constants
Fred C. Adams
Accepted for publication by the Journal of Cosmology and Astroparticle Physics, 29 pages, 6 figures
(Submitted on 23 Jul 2008)
"Motivated by the possible existence of other universes, with possible variations in the laws of physics, this paper explores the parameter space of fundamental constants that allows for the existence of stars. To make this problem tractable, we develop a semi-analytical stellar structure model that allows for physical understanding of these stars with unconventional parameters, as well as a means to survey the relevant parameter space. In this work, the most important quantities that determine stellar properties -- and are allowed to vary -- are the gravitational constant G, the fine structure constant alpha, and a composite parameter C that determines nuclear reaction rates. Working within this model, we delineate the portion of parameter space that allows for the existence of stars. Our main finding is that a sizable fraction of the parameter space (roughly one fourth) provides the values necessary for stellar objects to operate through sustained nuclear fusion. As a result, the set of parameters necessary to support stars are not particularly rare. In addition, we briefly consider the possibility that unconventional stars (e.g., black holes, dark matter stars) play the role filled by stars in our universe and constrain the allowed parameter space."

This paper points out what looks to me like a possible avenue to falsifying the conjecture of optimality. Our universe seems to overproduce small stars. What parameters might (if they could be adjusted) have led to a different mix with more abundant massive ones?
What is required to enable a large cloud to condense before the star's own energy blows gas away and halts the process? I think we are familiar with these considerations (e.g. the Eddington Limit) but in a different context.
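The Eddington Limit mentioned above sets the relevant scale: it is the luminosity at which radiation pressure on ionized hydrogen balances gravity, L_Edd = 4*pi*G*M*m_p*c / sigma_T. A quick sketch follows; this is the standard textbook formula, not something derived in the thread, the constants are rounded SI values, and the helper name is my own.

```python
# Eddington luminosity: radiation pressure on ionized hydrogen balances
# gravity for a mass M at L_Edd = 4*pi*G*M*m_p*c / sigma_T.
import math

G       = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
m_p     = 1.6726e-27   # proton mass, kg
c       = 2.998e8      # speed of light, m/s
sigma_T = 6.652e-29    # Thomson scattering cross-section, m^2
M_sun   = 1.989e30     # solar mass, kg
L_sun   = 3.828e26     # solar luminosity, W

def eddington_luminosity(mass_kg):
    """Maximum luminosity (W) before radiation pressure halts accretion."""
    return 4.0 * math.pi * G * mass_kg * m_p * c / sigma_T

# The limit scales linearly with mass (~3.3e4 L_sun per solar mass), which
# is one handle on how massive a condensing cloud can get before its own
# radiation blows the remaining gas away:
for m in (1, 10, 100):                     # masses in units of M_sun
    print(m, "M_sun:", f"{eddington_luminosity(m * M_sun) / L_sun:.2e}", "L_sun")
```

Note the limit depends on G and on the Thomson cross-section (hence on the fine structure constant), so tweaking those parameters shifts the massive-star mix in exactly the way the optimality question asks about.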
 
  • #109
Chalnoth said:
Technically, a non-accelerating universe has a past horizon, meaning there exist objects that have never been in causal contact with us. It doesn't, however, have a future horizon, meaning that eventually everything will be in causal contact with us (or rather, with our position in space-time...we certainly won't be around).

Now that is a safe prediction that no one can call you on in the future.
 
  • #110
Rymer said:
Now that is a safe prediction that no one can call you on in the future.

It is a mathematical theorem concerning a widely used class of models for the universe, and a formal demonstration of the relevance of acceleration to event horizons.
 
  • #111
sylas said:
There are several horizons you can speak of. The one that was being discussed in [post=2291438]msg #16[/post] is often called the "event horizon". It partitions spacetime into (A) events we have "seen" or that we can "see" if we only wait long enough, and (B) events that we can never see, no matter how long we wait.

If someone speaks of a photon never being able to reach us, then that means they are proposing it is past the event horizon.

The event horizon only exists in an accelerating universe.

It doesn't make sense to speak of an expansion exceeding the speed of light. Expansion is not a velocity, in those units. The units of expansion are basically inverse time. Often it is given in km/sec/Mparsec. But you can also give it as sec^-1, and it means the inverse of the time it would take to double in scale factor at the current expansion rate.

There are always regions in the universe that are receding faster than the speed of light. (Using proper time, proper distance co-ordinates.) It is perfectly possible to see a galaxy which is now and always has been receding with a recession velocity greater than c. As long as the expansion is not accelerating, a photon will be passing into new regions with smaller and smaller recession velocities, and eventually into regions where co-moving galaxies are receding at less than light speed, and then finally to our own local region, allowing us to see that distant galaxy.

Cheers -- sylas
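That last point, that a photon leaving a galaxy receding faster than light can still reach us if the expansion is not accelerating, can be illustrated with a toy integration. The matter-dominated scale factor and the c = 1 units are my choices for the sketch, not anything sylas specified:

```python
# Toy model: matter-dominated expansion, a(t) = t**(2/3), so H(t) = 2/(3t).
# An inbound photon's proper distance from us obeys dD/dt = H(t)*D - c:
# it is carried outward by expansion while moving toward us at c = 1.
def H(t):
    return 2.0 / (3.0 * t)

t, dt = 1.0, 1e-4
D = 2.0  # beyond the Hubble distance c/H = 1.5, so recession H*D = 4/3 > c
while D > 0 and t < 100.0:
    D += (H(t) * D - 1.0) * dt  # simple forward-Euler step
    t += dt

print("arrived" if D <= 0 else "did not arrive", "t ~", round(t, 2))
# Analytically D(t) = t**(2/3) * (5 - 3*t**(1/3)), which hits zero at
# t = (5/3)**3 ~ 4.63: the recession velocity at the photon's location
# drops below c, and it then closes the remaining gap.
```

At first D grows (the photon loses ground), but because H falls with time the photon eventually crosses into sub-luminal recession regions and arrives, exactly the behaviour described in the post.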

Problem with this is if the expansion VELOCITY is constant -- no acceleration -- then the obvious options are:

1) All acceleration/deceleration forces just happen to cancel.
2) Or there are no acceleration/deceleration forces.

If 2, then General Relativity does not apply -- and recession velocity is 'real' -- and likely Doppler until someone invents another theory. This would indicate that the VELOCITY would be limited by the speed of light. (No event horizon of the type you mention).

So I gather, that you are adopting the '1' solution -- correct?
 
  • #112
Rymer said:
Problem with this is if the expansion VELOCITY is constant -- no acceleration -- then the obvious options are:

1) All acceleration/deceleration forces just happen to cancel.
2) Or there are no acceleration/deceleration forces.

If 2, then General Relativity does not apply -- and recession velocity is 'real' -- and likely Doppler until someone invents another theory. This would indicate that the VELOCITY would be limited by the speed of light. (No event horizon of the type you mention).

So I gather, that you are adopting the '1' solution -- correct?

Non-sequitur. The velocity isn't constant according to all available measurement, and can't be constant consistent with existing physics.

I have said nothing about constant rates of expansion, so I am not adopting either of these counter factuals. I have simply pointed out that you were incorrect to say that event horizons have nothing to do with acceleration.

Cheers -- sylas
 
  • #113
sylas said:
Non-sequitur. The velocity isn't constant according to all available measurement, and can't be constant consistent with existing physics.

I have said nothing about constant rates of expansion, so I am not adopting either of these counter factuals. I have simply pointed out that you were incorrect to say that event horizons have nothing to do with acceleration.

Cheers -- sylas

I have a model that does exactly that -- constant velocity -- matching the data, with the data matching a theory-derived value. But that is not the point -- what I seem to be finding with this completely different model (be it right or wrong) is again a kind of optimal solution -- it seems to 'suffer' from a form of fine-tuning also. That I find bizarre.

This would seem to imply that there is something very subtle going on at a level we haven't properly identified, which then leads me to an apparent 'off the wall' question:

How confident are we that this 'distance ladder' concept we use for our measurements isn't introducing or masking accelerations/decelerations, etc.?

This might be one explanation for the linear results I'm seeing -- i.e. our measurement system is introducing it by the ladder scaling back to equivalent nearby distance. If so, this might imply a bigger problem than a theory difference.
 
  • #114
Rymer said:
I like your Thirty.

Question: all this seems to be assuming there are really 'black holes'. I'm not sure anymore what the term means. ...
The reason I bring up the question is: does this prediction require 'black holes', or just very super-compacted matter objects? ...

Have a look at this thread in the nearby Astrophysics forum:
https://www.physicsforums.com/showthread.php?p=2304439#post2304439

An up-to-date model of what a black hole is by two first-rate people.
 
  • #115
marcus said:
Have a look at this thread in the nearby Astrophysics forum:
https://www.physicsforums.com/showthread.php?p=2304439#post2304439

An up-to-date model of what a black hole is by two first-rate people.

Thanks, I've downloaded what I could and bookmarked the rest ... it will take some reading and thought.

The Forum this evening is VERY slow -- seems to be 'service-links' (i.e. advert servers).

It's taking over five minutes for a page to load. Giving up for the evening.
 
  • #116
marcus said:
Dear me. As far as I can see from what you have just posted, I have not once failed to get your point, Chalnoth. So I certainly cannot fail "again." You are confusing possibility with optimality and getting the possibility of life mixed up with actual fine-tune optimality for hole abundance.
No, I have done nothing of the sort, so I fear you have misunderstood me once again.

First, let me state that the simple claim that the universe we observe is optimally fine-tuned for black hole abundance is almost certainly completely wrong. I can think of a rather trivial situation, after all, under which black holes will be vastly more abundant: make G larger. Make G larger by a couple of orders of magnitude, and all matter that collapses will collapse into black holes. There won't be any worry about stars or anything of the sort: black holes will be everywhere. Of course, you'd have to change some other parameters to compensate so that the universe expands as much and becomes as large. But other than that it should work pretty well.

But this doesn't actually impact the hypothesis if the hypothesis is not that our universe is optimally-tuned, but instead that the universe behaves in such a way that the production of black holes is preferred. If that is the hypothesis, then we must limit our search to only the search space where life is possible to falsify the hypothesis.

The correct prediction, then, is that this hypothesis predicts that our universe is such that the direction from our current location in parameter space in which there are more black holes is also a direction in which there are dramatically fewer living beings. This is, of course, very difficult to show. But that's what the hypothesis requires.

For the example above of varying G, for instance, what this would predict is that making G larger even by a little bit would make life much, much harder, if not impossible, while making G significantly smaller (which would mean fewer black holes) wouldn't be so bad for life. That's what this hypothesis of black hole optimality predicts.
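One concrete handle on how sensitive collapse is to G (an illustration I am adding, not part of the posts above): the Chandrasekhar mass, the largest mass electron degeneracy pressure can support against gravity, scales as (hbar*c/G)^(3/2) / m_p^2, i.e. as G^(-3/2). An order-of-magnitude sketch, dropping the dimensionless prefactor, so only the scaling is meaningful:

```python
# Order-of-magnitude Chandrasekhar mass, M_ch ~ (hbar*c/G)**1.5 / m_p**2,
# showing the G**-1.5 scaling that sets which stars can avoid collapse.
hbar  = 1.0546e-34  # J*s
c     = 2.998e8     # m/s
G0    = 6.674e-11   # m^3 kg^-1 s^-2
m_p   = 1.673e-27   # kg (proton mass)
M_sun = 1.989e30    # kg

def m_ch(G):
    return (hbar * c / G) ** 1.5 / m_p ** 2

print(m_ch(G0) / M_sun)         # ~2 solar masses: the right order of magnitude
print(m_ch(2 * G0) / m_ch(G0))  # 2**-1.5 ~ 0.35: doubling G cuts the threshold
```

So doubling G cuts the collapse threshold to about 35% of its value, which is the quantitative version of the intuition that larger G pushes more matter over the edge into black holes -- while the counter-argument above is about what larger G does to the universe's expansion history before any stars form.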

Rymer said:
However it sounds like you don't want to take seriously the possibility that the conjectured optimality is correct.
Not really. It's just that I think it's going to be fantastically difficult to either verify or falsify, and it would require a lot of work that I am, frankly, not interested in performing.
 
  • #117
Chalnoth said:
That's only a failure of our language.

Your language, or rather your choice of words, not mine...

It's easy enough to describe the behavior of physical systems even if you wish to consider a hypothetical system without any time coordinate.

You must be referring to configuration and phase spaces ... an often useful abstraction made much of by Julian Barbour, but not resorted to when describing bouncing black holes and suchlike. Smolin's summary of his proposals about cosmological natural selection is full of terms implying processes or behaviours, which imply simultaneity or finite duration. These concepts are inappropriate in this context. Here are examples of his description:

Smolin said:
...The world consists of an ensemble E of universes...Black hole singularities bounce and evolve to initial states of expanding universes...At each such creation event there is a small change ... the ensemble E is updated by a discrete series of steps, in each steps one new universe is created for each black hole in an existing universe...after many steps the ensemble converges...

My difficulty about what kind of time folk like Smolin and Davies are imagining here is not allayed by your reassuring claims that:

Well, you just have to recognize that reality is a bit more subtle...
It just means the simplified description is likely missing some important details of the physics which it is trying to describe.

I suspect that those who think in this way have in mind an armchair view of the multiverse rolling by and evolving in front of them. But I fear that this is just wishful imagining. Time is not simple at all, it seems.
 
  • #118
Well, again, I think you're worried about superficial things like word choice, when the real meat is only found in the mathematics.
 
  • #119
Chalnoth said:
No, I have done nothing of the sort, so I fear you have misunderstood me once again.
heh heh

First, let me state that the simple claim that the universe we observe is optimally fine-tuned for black hole abundance is almost certainly completely wrong. I can think of a rather trivial situation, after all, under which black holes will be vastly more abundant: make G larger. Make G larger by a couple of orders of magnitude, and all matter that collapses will collapse into black holes.

heh heh. What you suggest would dramatically reduce the number of black holes. Universe collapses in a crunch before star and galaxy formation can get properly started.
Of course, you'd have to change some other parameters to compensate so that the universe expands as much and becomes as large. But other than that it should work pretty well.

The Devil is in the details, Chal. How exactly would you change which parameters? Glad to see you are thinking about this! Alex Vilenkin is a world-class cosmologist at Tufts and he has unsuccessfully tried to disprove this optimality. If you come up with an idea that actually works, I'm sure he would like to know.

But this doesn't actually impact the hypothesis if the hypothesis is not that our universe is optimally-tuned,...

If? The statement to be refuted is the one about optimally tuned. Let's stick with that. Let's not fudge things so we can sneak "Life" into the discussion.

but instead that the universe behaves in such a way that the production of black holes is preferred. If that is the hypothesis, then we must limit our search to only the search space where life is possible to falsify the hypothesis.

"preferred" is vague. Smolin's statement is a mathematical description of a local max---optimal tuning in other words. "Life" does not enter into the logic.

The correct prediction, then, is that this hypothesis predicts that our universe is such that the direction from our current location in parameter space in which there are more black holes is also a direction in which there are dramatically fewer living beings.

How many "living beings" are there? Can you make that rigorous? Do you know if there are "dramatically few" or "dramatically many"? What you are calling the "correct prediction" is vague, I would say mathematically meaningless. And it is not the correct prediction in any case---to get there you first had to assume we aren't talking about optimal tuning.

That's what this hypothesis of black hole optimality predicts.

No, Chal. You have just made a mistake. Get over it.

a lot of work that I am, frankly, not interested in performing.

Good! Just forget about it then. :biggrin:
 
  • #120
oldman said:
... Smolin's summary (hep-th/0612185) of his proposals about cosmological natural selection is full of terms implying processes or behaviours, which imply simultaneity or finite duration...

I would say this is a sharp insight. BTW your link needs fixing.
http://arxiv.org/abs/hep-th/0612185

I am personally not as interested in the evolving multiverse explanation as I am in the black hole local optimum conjecture. I would like to see that tested more thoroughly and either disproved or not disproved. After that then the floor is open for people to suggest explanations.

But what you point out is right, namely that the evolving multiverse picture does involve some sort of universal time. That doesn't automatically rule it out, but it's an important issue!
Obviously he has been thinking about that. A recent Smolin talk on Pirsa is about the "reality of time and the evolution of laws".
http://pirsa.org/08100049/
A recent Smolin paper is on unimodular gravity (which gives the same results as ordinary GR but has a universal time feature which GR does not.)
http://arxiv.org/abs/0904.4841
The quantization of unimodular gravity and the cosmological constant problem
Lee Smolin
22 pages
(Submitted on 30 Apr 2009)
"A quantization of unimodular gravity is described, which results in a quantum effective action which is also unimodular, i.e. a function of a metric with fixed determinant. A consequence is that contributions to the energy momentum tensor of the form of the metric times a spacetime constant, whether classical or quantum, are not sources of curvature in the equations of motion derived from the quantum effective action. This solves the first cosmological constant problem, which is suppressing the enormous contributions to the cosmological constant coming from quantum corrections. We discuss several forms of unimodular gravity and put two of them, including one proposed by Henneaux and Teitelboim, in constrained Hamiltonian form. The path integral is constructed from the latter. Furthermore, the second cosmological constant problem, which is why the measured value is so small, is also addressed by this theory. We argue that a mechanism first proposed by Ng and van Dam for suppressing the cosmological constant by quantum effects obtains at the semiclassical level."

Personally I don't think it's smart to make up your mind on the universal time issue before all the returns are in. It comes up in unimodular GR, it comes up in Loll's work. Also in standard cosmology. So I prefer to wait and see on that one.

The black hole optimality question does not involve the time issue.
Are the Thirty fine-tuned for hole abundance, or are they not?
By itself this makes no reference to an evolutionary process.
The process business only comes up if you assume that optimality has been established and say "Well, what if the universe is fine tuned for holes? What then? How could that be rationally explained?"
 
