apeiron said:
If you agree thus far in a general fashion, have you come across discussions of the model for emergent regularities?
No, I have not yet come across any mathematical models that satisfy the requirements of my vision.
However, the precursor of a mathematical model is sometimes a line of reasoning, from which a preferred formalism loosely follows. Components of this reasoning are present in the work of several people:
1) Lee Smolin - Evolving law
2) Carlo Rovelli - Relational QM, with the key idea that observers can only compare measurements by means of physical interactions. Unfortunately he doesn't really change QM, which is my disappointment. But there are some brilliant sections in his RQM paper that stand out even if the finish isn't what I hoped.
3) Ariel Caticha - Has the idea that the laws of physics at some level coincide with the rules of inference, i.e. reasoning based upon incomplete information; he works close to various MaxEnt methods (see the sketch after this list).
4) Olaf Dreyer - With his "internal relativity", a point of view which aims to restore the largely neglected perspective of a physical inside observer. As I understand it, his ideas are young and very much a work in progress. Time will show what he comes up with.
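To make point 3 a bit more concrete, here is the standard MaxEnt recipe that Caticha builds on, in my own summary rather than his notation: maximize the Shannon entropy subject to whatever is actually known, e.g.

$$S[p] = -\sum_i p_i \ln p_i, \qquad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \bar{E},$$

whose variation gives the exponential family $p_i = e^{-\lambda E_i}/Z(\lambda)$, with the multiplier $\lambda$ fixed by the constraint. The attraction for me is that the distribution - the "law" - appears as an inference from incomplete information, not as an independent postulate.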
I have read some of what I've found from these people, and all of them have elements of reasoning that I think are extraordinary and key. Yet, at their current stage of development, none of them has what seems to be a satisfactory strategy. There are points in each of their reasoning which I do not share.
And since the reasoning of each of these people naturally leads to a different formalism, clearly if I don't share the founding principles, their formalisms are of little _fundamental interest_ to me.
Now you may think that what these people are doing isn't what you call statistical modelling, but the fashion in which I agree with you is general. To me the key is the physical basis of the statistics, and in general I do not accept continuum probability theory as a basis. Instead, what I have in mind is mathematically a combinatorial starting point, where there are interactions between discrete structures; the continuum should be recovered as an effective description in the large-complexity limit, but the continuum does IMHO not have a place in the starting assumptions. This is why I have difficulty flatly adopting all the standard statistics at this fundamental level.
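A toy version of what I mean, just standard textbook material and certainly not my candidate formalism: for $N$ exchangeable binary events, pure counting gives

$$P(k) = \binom{N}{k} p^k (1-p)^{N-k} \;\approx\; \frac{1}{\sqrt{2\pi N p(1-p)}}\,\exp\!\left(-\frac{(k-Np)^2}{2Np(1-p)}\right), \qquad N \to \infty,$$

the de Moivre-Laplace limit. The Gaussian continuum description appears only when the combinatorics gets large; nothing continuous is assumed at the start. That is the general direction I mean, even if this particular example is far too simple.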
So as for the choice of formalism, I am still looking for it. But I have some reasonably strong guidelines as to what I'm looking for, and in which general direction to find it.
apeiron said:
As I see it, there are two general bodies of statistical modelling here - Gaussian and power law. Gaussian models the "laws" of static worlds (closed-system perspectives like an ideal gas) and power laws model dynamic worlds (open, far-from-equilibrium systems such as criticality, edge of chaos, renormalised, scale-free, fractal, etc).
So linear and non-linear phenomena. Inertial and accelerative frames.
The observer aspect needs to be considered as well. Again a choice. You can reduce the notion of the observer to some localised viewer (point-like within the system) - and then choice of location becomes a big (often relativistic) issue. Or you can expand observerdom to the global bound, which is the event horizon approach, and the constraints approach in thermo/hierarchy theory.
...
Do you think static or dynamic, closed or open?
Given the previous comment, I think I have a more open and radical view; I am not constrained to current statistical modelling. But for sure it cannot be a closed-system model. It would rather be an open, evolving model, and most probably this is also reflected in the mathematics. So it's more likely to be an evolving, algorithm-type model than a conventional differential-equation model with static parameters. I hope there would be a minimum of parameters, ideally none at the fundamental scale. At the effective human scale I think there will be some, but then these parameters would be understood as evolved, and there would be no initial value problem.
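Schematically, and only as a caricature of the contrast rather than a proposal: a conventional model is something like $\dot{x} = f(x;\theta)$ with $\theta$ fixed once and for all, whereas I have in mind something closer to

$$x_{n+1} = f(x_n; \theta_n), \qquad \theta_{n+1} = g(\theta_n, x_n),$$

where the "parameters", and thus effectively the law itself, are updated by the history of the process. At the fundamental scale there is then nothing to specify initially; the effective parameters seen at the human scale are evolved quantities.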
apeiron said:
Do you place your observer at the local or the global scale?
Local or global with respect to what? Spacetime? That is one question I ask.
But even without answering that, my picture is that the observer could be any subsystem of the universe. And from the point of view of the observer itself, it's of course local. But an observer can still be distributed and non-local relative to a second observer. This does not present a contradiction as I see it.
I think of locality in loose information-geometric terms:
Two observers are close if they have the same information.
Thus there is a direct tension in picturing two remote observers having the same information. It doesn't make sense, because it totally ignores the fact that spacetime is part of the information.
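One standard way to quantify "same information" - only a placeholder for whatever the proper measure turns out to be - is the relative entropy between the two observers' states of information,

$$D(p\|q) = \sum_i p_i \ln\frac{p_i}{q_i} \;\geq\; 0, \qquad D(p\|q) = 0 \;\Leftrightarrow\; p = q,$$

so two observers would sit at zero "distance" exactly when their distributions coincide. If spacetime position is itself part of the information, literally identical information for remote observers cannot occur.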
Sometimes the separation of spacetime and internal information is possible, but in the general case I don't see why it is even necessary. The separation is rather, as I see it, related to the simultaneous emergence of spacetime and matter. Here Olaf Dreyer has presented similar arguments. He seems to think that the artificial and ambiguous separation of spacetime and matter at the fundamental level, rather than helping, is part of the problem. I agree there.
Instead, the fundamental starting point is that there is no difference. Thus, the question is NOT how to patch matter models onto pure spacetime models. The question is how the separation of spacetime and matter degrees of freedom can be understood.
I similarly have the idea that the observer's environment and the observed itself evolve simultaneously. It's essentially an analogous problem to me.
/Fredrik