What caused the physical laws we have?

In summary, Lee Smolin and Roberto Unger are working on a book that asks if the laws of nature can evolve. They believe that, given the universe's age, the answer is yes.
  • #36


Fra said:
One can say that the MaxEnt principle defines a differential structure that is valid until more information is received. This means that the entropy measure of the MaxEnt is deforming during the process, which means that there is no global heat death.

Another way to see this is that I am suggesting a unification of the principle of maximum entropy and the principle of least action. They are in fact one and the same principle. The unification lies in the distinction between information about states vs information about expected change. And if you insist on a transparent information picture, the state space is replaced by a space of differential changes. So the MaxEnt principle implies a kind of principle of least action.
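As a toy sketch of the MaxEnt side of this (just an illustration, not the unified principle itself): constraining a distribution's mean and maximizing entropy yields a Boltzmann-form solution, and the Lagrange multiplier can be solved for numerically. The die example below is made up for illustration.

```python
import math

def maxent_distribution(values, mean_target, tol=1e-10):
    """Maximum-entropy distribution over `values` subject to a fixed mean.

    The MaxEnt solution has Boltzmann form p_i ~ exp(-lam * x_i); we solve
    for the multiplier `lam` by bisection on the implied mean.
    """
    def mean_for(lam):
        weights = [math.exp(-lam * x) for x in values]
        z = sum(weights)
        return sum(w * x for w, x in zip(weights, values)) / z

    lo, hi = -50.0, 50.0  # bracket for the multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # the implied mean decreases as lam increases, so bisect accordingly
        if mean_for(mid) > mean_target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(-lam * x) for x in values]
    z = sum(weights)
    return [w / z for w in weights]

# A die constrained to have mean 4.5 instead of the fair 3.5:
p = maxent_distribution([1, 2, 3, 4, 5, 6], 4.5)
```

The point of the sketch is only that "maximize entropy subject to what you know" is a well-defined computation, which tilts probability toward the constraint while staying as noncommittal as possible.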

/Fredrik
 
  • #37


I've also argued that least action and MEP are the same: one is about the dissipation of histories (as in Feynman's sum over histories) and the other about the dissipation of energy gradients.
 
  • #38


To wrap up - if laws are viewed as emergent regularities of self-organising systems, then we need three parts to the story.

We need the initial conditions, the unformed potential from which the system arose.

We need the selection mechanism or process which acts on this potential to develop it.

Then we need the statistics of the outcome - the regularities that persist because a system has gone to equilibrium.

This is a triadic story - which Peirce as a logician was trying to articulate.

And which part needs to be modeled as the mathematics? It could be just the end state.

You could perhaps throw away the mechanism and the initial conditions as excess metaphysical baggage, just use the emergent laws, or statistical regularities.
 
  • #39


apeiron said:
We need the initial conditions, the unformed potential from which the system arose.

We need the selection mechanism or process which acts on this potential to develop it.

Then we need the statistics of the outcome - the regularities that persist because a system has gone to equilibrium.

This is a triadic story - which Peirce as a logician was trying to articulate.

Philosophically we are close enough here.

I have structured the problem into three parts as well:

1) What is the logic of guessing?
This is the logic of choosing the action based upon the current information. Essentially it is a form of statistical inference.

2) What is the logic of correction?
This is the logic of updating your information in the light of new information. It is essentially Bayesian, but not quite: it is a sort of statistical inference of change, based on the differential structure of the former, AND on the fact that the state space in general can change, expand or shrink. The logic of correction is what restores information conservation.

3) What is the logic of corrective guessing, the synthesis of the two? This is what results in evolution.
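A minimal sketch of the guess/correct cycle, with a crude stand-in for the expanding state space (the coin hypotheses and the reserved probability mass are made up for illustration; this is ordinary Bayes plus one ad hoc expansion step, not the full formalism):

```python
def bayes_update(prior, likelihood):
    """One 'correction' step: reweight each hypothesis by how well it
    predicted the observed datum, then renormalize."""
    posterior = {h: p * likelihood[h] for h, p in prior.items()}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

def expand_state_space(prior, new_hypotheses, mass=0.1):
    """The 'not quite Bayesian' twist: the hypothesis space itself can
    grow, so shave off some probability mass from the existing
    hypotheses and spread it over the newly admitted ones."""
    scaled = {h: p * (1 - mass) for h, p in prior.items()}
    for h in new_hypotheses:
        scaled[h] = mass / len(new_hypotheses)
    return scaled

# One turn of the cycle: guess -> correct -> expand.
prior = {"fair": 0.5, "biased": 0.5}
# Observe heads; a biased coin predicted it with probability 0.9.
posterior = bayes_update(prior, {"fair": 0.5, "biased": 0.9})
# Admit a hypothesis that was not in the original state space at all.
expanded = expand_state_space(posterior, ["two-headed"])
```

Standard Bayesian updating conserves the state space; the expansion step is the part that has no canonical rule, which is exactly where the hard problem sits.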

I think I described parts of it very briefly in https://www.physicsforums.com/showthread.php?t=239414&page=3 some time ago.

apeiron said:
And which part needs to be modeled as the mathematics? It could be just the end state.

IMO, all steps will be modeled quantitatively. Also, all three steps are sort of intermingled; it's sort of a cycle. But it will be a mix of conventional analytic expressions and algorithm-type models. As long as it's computable, it's quantitative. I have low hopes of finding neat analytical solutions to this; it's far too complicated. Computer simulations are more likely, where near certain solutions effective analytical models can be found as approximations.

About the initial conditions, as you say, I have come to the standpoint that in principle the model must work with ANY initial conditions. But that's an ambiguity that isn't possible to handle, so I have chosen to focus on the initial conditions in the low-complexity limit, simply because then the initial value problem becomes trivial, or vague as you would probably call it.

Essentially the state space expands, and thus when the state space (abstractly speaking) is small, there simply is no landscape of initial conditions.

But I sure don't have any answers yet either, though I expect to find them.

Still, we seem to share a similar view on the nature of law in principle, and on it being a subtle concept.

/Fredrik
 
  • #40


I think the deep mathematical foundation here is to be found in "the geometry of symmetry breaking".

This is what underlies the modelling in all the different fields I've cited, from particle physics and cosmology, to hierarchy theory, Peircean semiotics, dissipative structure theory and neural networks.

So the target for mathematical modelling is the trajectory from max symmetry => max asymmetry.

Seeing as you want an information-based view of this, think of the trajectory from noise to signal.

Noise is the max symmetry initial conditions - the vagueness or pleni-potential. Then some operation (a symmetry breaking) isolates a signal. So we have the max asymmetry of an event standing proud of its (discarded) context. A 1 surrounded by a sea of 0s in Shannon's accounting system.
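To make the noise-to-signal contrast concrete in Shannon's terms (a toy calculation, nothing more): the uniform distribution carries maximum entropy, while the lone 1 standing proud of a sea of 0s carries none.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 16
noise = [1 / n] * n               # max symmetry: every symbol equally likely
signal = [1.0] + [0.0] * (n - 1)  # max asymmetry: one 1, fifteen 0s

# noise gives log2(16) = 4 bits; signal gives 0 bits
```

The symmetry-breaking trajectory, in this accounting, runs from the 4-bit end of the scale to the 0-bit end.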

The max symmetry => max asymmetry trajectory would be a reformulation of the second law. It would be a higher level generalisation of the entropy of order => disorder story.

How does this square with cosmology?

In the beginning there was max symmetry, in that the universe was Planck scale. All distances and energies were vanilla: the same and undifferentiated. Then the symmetry broke. At heat death, it will be max broken.

Again the outcome seems vanilla as everywhere (every location) is max cold. All gradients dissipated. But actually the situation is max dichotomised in that the universe is also max large and flat. So the universe is max divided into largeness and smallness with as little as possible existing in-between. A crisp outcome. A phase transition completed. A max ent in terms of a division between macro and micro states.

Dark energy and other issues would have to be reconciled to this picture. But the point is that second law thinking would seem the target of a fruitful generalisation towards concrete maths models. Intermediate stepping stone ideas like disorder, entropy, information, would be generalised to the fully geometric idea of symmetry (and symmetry breaking, and thirdly, asymmetry).

I have already been locked once in these forums for daring to suggest there is indeed a geometry of asymmetry, so I will leave it there.
 
  • #41


apeiron said:
I think the deep mathematical foundation here is to be found in "the geometry of symmetry breaking". [...]

I have already been locked once in these forums for daring to suggest there is indeed a geometry of asymmetry, so I will leave it there.

Yes, we can finish the discussion here. Again, your general reasoning above makes sense to me, so I have no obvious objections, except of course that we both admit the exact mathematical model is still not yet on the table. So we'll have to keep working on that.

So back to the questions of post 1.

- What caused the physical laws we have?
- Have these laws always been in existence?
- Is it true we may never know what the universe was like before the BB?
- Can a theory such as the multi-universe theory be proven?

I think we've now made our contribution to elaborating these. This is also, IMO, partly well in line with Smolin's reasoning that Marcus highlighted in post #4 when quoting his talk on the reality of time and the evolution of laws.

/Fredrik
 