What are the fundamental information-processes in physics?

  • Thread starter ConradDJ

apeiron

Gold Member
Yes, except of course that there are only equilibria at each level; there is no fixed microstructure to which equilibrium refers.


/Fredrik
Glad it is soft bits after all! And soft contexts too! Actually, I should call the difference vague-crisp rather than soft-hard as this would be the more appropriate jargon.

And while I agree we would be talking about equilibria at every level, I think the next crucial point is that there is then only one value for the resulting overall system's equilibrium - its Lyapunov exponent, so to speak.

So across all scales, the universe's information/observerhood must be thermalised. A single temperature rules.

But where this gets tricky is that the universe is of course an expanding space. It is not a static system, closed in scale, but a dynamic system, open in scale.

So the right statistics are not the usual Gaussian model of a closed system but the power-law or fractal statistics of a scale-free system. The "temperature" is not damped around a single mean value but is expressed flatly - log/log fashion - over all scales. Hence the Lyapunov exponent analogy.

Basically, this all needs a modern open systems model of entropy, perhaps like the Rényi or Tsallis non-extensive models. A fractal equilibrium model with an emergent axis of scale, rather than a damped Gaussian equilibrium such as is only possible inside a closed box.
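As a concrete illustration of the non-extensive entropies mentioned (my own sketch, not from the thread): the Rényi and Tsallis entropies both reduce to the ordinary Shannon entropy as the order q approaches 1, but weight the tails of a scale-free (power-law) distribution differently for other q:

```python
import numpy as np

def shannon(p):
    """Shannon entropy (nats): the q -> 1 limit of both families."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, q):
    """Renyi entropy of order q (additive, but not extensive in general)."""
    return np.log(np.sum(p**q)) / (1 - q)

def tsallis(p, q):
    """Tsallis entropy of index q (non-extensive for q != 1)."""
    return (1 - np.sum(p**q)) / (q - 1)

# A normalised power-law ("scale-free") distribution over 1000 states
k = np.arange(1, 1001)
p = k**-2.0
p /= p.sum()

# As q -> 1, both generalised entropies approach the Shannon value;
# away from q = 1 they weight the power-law tail differently.
for q in (0.5, 0.99, 1.01, 2.0):
    print(q, renyi(p, q), tsallis(p, q))
print("Shannon:", shannon(p))
```

The point of the exercise is only that "entropy" is a family of measures, and which member applies depends on whether the system is closed (Gaussian, extensive) or scale-free (power-law, non-extensive).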
 

Fra

So across all scales, the universe's information/observerhood must be thermalised. A single temperature rules.
This would be exact only at global equilibrium, if it exists, yes. But global equilibrium can never be known exactly. In my view it takes more the form of a rational expectation on which actions are based.

But where this gets tricky is that the universe is of course an expanding space. It is not a static system, closed in scale, but a dynamic system, open in scale.
Exactly my point above. In general the spaces of possibilities are changing - not always expanding; sometimes they contract too (at least in principle).

So this is a challenge.

The point, of course, is that if we take this reasoning seriously, the escape here can NOT be as simple as inventing an ad hoc structure - an external container state space in which to describe the expanding state space. This external container doesn't exist. The fact that the inside is the only physical container is what yields the evolving nature of this.

Here more work is needed.

I think the KEY to making predictions here is that the ACTION of a system, which in a certain sense is the container for its own environment, should follow a rational scheme of inference. This is my key conjecture, and it is what makes the idea predictive - it's not just philosophy.

In current models, the actions of a system follow from the initial state AND the laws of physics. In my suggested reasoning here, state and law are treated on the same principal footing; the difference is that they sit at different levels in this hierarchy window. This also implies that the laws of physics usually put into the Lagrangian or Hamiltonian actually follow from the evolution. Not only does this provide a framework for understanding the state spaces, Hilbert spaces etc. - it also provides a potential insight into the ORIGIN of the symmetries/laws encoded in the Hamiltonians or Lagrangians.

Basically, this all needs a modern open systems model of entropy,
I agree. In general we have an open system. So the problem is how, instead of basing actions on the fixed constraints of a closed system, to construct actions based on the softer constraints of an open system - and then, more importantly, what happens when two such systems interact.

/Fredrik
 

ConradDJ

Gold Member
In general the spaces of possibilities are changing... In general we have an open system.

So the problem is how, instead of basing actions on the fixed constraints of a closed system, to construct actions based on the softer constraints of an open system - and then, more importantly, what happens when two such systems interact.

One way to approach this is to think about what's involved in an evolutionary process. When I look at physics from that standpoint, it seems as though the fundamental processes may necessarily involve more than two interacting systems. Here's what I have in mind --


Evolution pertains to a process (like biological reproduction) that happens over and over again, where each iteration depends on the success of previous iterations, and inherits something from them. The process has to preserve certain information from the past, so that it doesn’t have to begin from the same baseline again and again each time.

It seems that each iteration must be able to give rise to more than one further iteration, so there can be a proliferation of variant forms, subject to natural selection -- some variants passing on information better than others. And to allow for variation, the passing on of information from one generation to the next has to be reliable, but not complete and exact.

As to the nature of this information, the one thing that must be passed on, in every generation, is the ability to pass that information on to another generation. In biological evolution, this key functionality is the ability of organisms to reproduce organisms that can reproduce.

I'm thinking that a "measurement event" may represent this kind of process. It involves the gathering of several kinds of information determined in other measurement-events, some of which are local – i.e. information preserved in the state of the "observer" as determined in its prior interactions – and some of which are distant events on the observer's past light-cone.

The sum of these moments constitutes a "measurement set-up" in which an interaction with some other system can happen in a number of ways -- the possibility-space -- and which way it happens makes a difference. The difference it makes gets stored as new information in the observer's state, which will sooner or later be communicated back out into the world in another interaction.

The idea is that what's essentially being passed on, from one measurement-event to the next – not through any one interaction, but in the sum of many interactions that constitutes the "entire measurement situation" – is the functionality of measurement itself, i.e. the ability to create new measurement-situations. Essentially what each measurement does is to make other measurements possible, by preserving past information along with newly determined information in the current state of the observer, and then by contributing new information to other observers.

If such a process could somehow get itself started, then it's reasonable to suppose it could evolve ways to define more and more specific information, both more precisely and reliably (less "guessing"), and yet with more variation. The more that gets determined, the richer the basis is for determining new information.

I'm thinking this kind of evolution might eventually explain why the universe we actually observe looks so much like a "deterministic" system, even though the fundamental information-processes appear to be essentially random.

But it's indeed a generalisation of statistical mechanics and probabilistic inference, but based on a new measure of information construct. Words can't nail this.

I think this must be right. I’m hoping that to think about what’s required to make evolution work might help indicate what kinds of probabilities we’re dealing with, at a basic level.
 

Fra

Here's what I have in mind --

Evolution pertains to a process (like biological reproduction) that happens over and over again, where each iteration depends on the success of previous iterations, and inherits something from them. The process has to preserve certain information from the past, so that it doesn’t have to begin from the same baseline again and again each time.

It seems that each iteration must be able to give rise to more than one further iteration, so there can be a proliferation of variant forms, subject to natural selection -- some variants passing on information better than others. And to allow for variation, the passing on of information from one generation to the next has to be reliable, but not complete and exact.
...
I'm thinking that a "measurement event" may represent this kind of process. It involves the gathering of several kinds of information determined in other measurement-events, some of which are local – i.e. information preserved in the state of the "observer" as determined in its prior interactions – and some of which are distant events on the observer's past light-cone.
Yes. Once we have agreed on the general direction here, the next question is how to satisfy the basic requirements of evolution, i.e. "diversity", "reproduction", "selection" etc.

Smolin's idea in his CNS (cosmological natural selection) is that black holes produce offspring by producing new universes that produce new black holes. This is why his reasoning leads him to suggest that a typical universe should be optimized to produce black holes, and that with each offspring small variations in physical laws appear - enough to get diversity, but not large enough to destroy stability.

My current idea has similarities to yours. In my view the "DNA" (if we put it like that) of physical law is the action which is implicit in the measurement complex that constitutes an observer. Or rather, the DNA is a certain trait of such a complex. And when an observer interacts with its environment, not only does it spread information, it also spreads its way of reasoning - and THIS is the DNA of physical law.

So in my view, a viable system is able to convey its action to the environment, and this ultimately makes the environment gradually more friendly for the emergence of offspring consistent with the same DNA (or action).

I think this is similar to what you describe.

This means that such an observer is both self-preserving and produces offspring INDIRECTLY (not in the direct sense we know from biology, like cellular division and copying of DNA) by means of simple interactions. So the offspring is pretty much produced by "induction", so to speak.

For selection, I see it as negotiation. Those who manage to negotiate with their environment and still maintain a coherent structure are selected. I see this as more or less rational inference: rational inference is selected over random inference.

/Fredrik
 

ConradDJ

Gold Member
Yes, ...the next question is how to satisfy the basic requirements of evolution, i.e. "diversity", "reproduction", "selection" etc.

... when an observer interacts with its environment, not only does it spread information, it also spreads its way of reasoning - and THIS is the DNA of physical law.

... So the offspring is pretty much produced by "induction" so to speak.

For selection, I see it as negotiation. Those who manage to negotiate with their environment and still maintain a coherent structure are selected. I see this as more or less rational inference: rational inference is selected over random inference.

Fredrik -- I'm thinking about this... trying to see how to envision it in terms of familiar physical processes. Say two particles interact and there is a transfer of linear / angular momentum... is "induction" involved here? Or are you thinking of a different sort of process?

Note that "selection" is already built into the basic reproductive process, in biology -- that is, reproduction can succeed or fail, in any instance -- where "success" means that the organism succeeds in producing offspring that also succeed, etc...

"Negotiation with the environment" certainly comes into play here -- you could say, an organism or a species has to "maintain a coherent structure" while coping with its environment. This is true, but the selective "criterion" of success is in a sense already there prior to any issue of "adaptation".

This was an important issue in biology, where Lamarck saw the process of adjusting to the environment as something that happens at the level of individual organisms... in his theory of the "inheritance of acquired characteristics". That blurred the picture, by focusing on adaptation as the basic process rather than differential reproduction of fixed genetic information. (There may not be any analogy here to the evolutionary process in physics, where we're not dealing with literal reproduction. So far we don't have anything like the clarity of Darwin's basic insight.)

Anyhow if the basic process is "inference" or "reasoning" in some physical sense, I'm thinking about what might constitute "success", in terms of what gets passed on. I understand this is all a work-in-progress!...
 
5,598
39
One assumes information to exist at a locale. The other that information is created at a locale. And these are two different views (though in both cases you would appear to find information at a locale).
I have my doubts about those assumptions... here's why. I have yet to read through all that is posted above, but before that I wanted to post a dramatically different idea from the one expressed in the above quote, from Leonard Susskind, The Black Hole War, 2008.

I can't find the exact paragraph I want; the essential idea is the holographic principle (conjecture): that the information in a region of space resides on the enclosing surface. In the case of a black hole, for example, Bekenstein's and Hawking's work shows that if you add a bit to the black hole, the horizon increases by one Planck area. But more generally, every time you describe a volume of space you can pick an ever larger "horizon" - a larger enclosing surface, even to the edge of our universe if one exists - and the information content of the original volume is included on that surface area. But each time it resides at a different location: a different horizon! Information about a location in spacetime appears to have no definite location itself!

Susskind goes on to discuss that the information in a finite region (or equivalently, surface area) of space is itself finite... hence it appears space is discrete. This has been discussed in at least one other thread recently, and it appears to conflict with quantum field theory, which is continuous...
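To make "one bit per Planck area" concrete (my own back-of-the-envelope sketch, not from the book): the Bekenstein-Hawking entropy S = kA/4l_p² assigns one bit per 4 ln 2 Planck areas of horizon, so the holographic information content of a solar-mass black hole works out to roughly 10^77 bits:

```python
import math

# Physical constants (SI units)
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s

def horizon_bits(mass_kg):
    """Holographic information content of a Schwarzschild black hole's
    horizon: one bit per 4*ln(2) Planck areas (Bekenstein-Hawking)."""
    r_s  = 2 * G * mass_kg / c**2      # Schwarzschild radius (m)
    area = 4 * math.pi * r_s**2        # horizon area (m^2)
    l_p2 = hbar * G / c**3             # Planck length squared (m^2)
    return area / (4 * l_p2 * math.log(2))

# A solar-mass black hole: roughly 1.5e77 bits on the horizon
print(f"{horizon_bits(1.989e30):.2e} bits")
```

Note that doubling the mass quadruples the bit count: the information scales with the enclosing area, not the enclosed volume, which is the heart of the holographic conjecture.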

And if that were not enough to support Conrad's assertion that information in the world is not very clear, you can also consider the horizon of a black hole and its information content. Susskind points out
...the experimenter is faced with a choice: remain outside the black hole and record data from the safe side of the horizon, or jump into the hole and do observations from the inside. You can't do both...
he claims.

So it sounds like information resides in different places and your location may determine what information is accessible....

And as a reminder, I want to see what's been made in this thread of information loss in black holes ....

and just for fun, here's the vote taken in 1993 at the theoretical physics conference in Santa Barbara, California:

WHAT HAPPENS TO INFORMATION THAT FALLS INTO A BLACK HOLE? (votes cast)
1. It's lost: 25
2. It comes out with Hawking radiation: 39
3. It remains (accessible) in a black hole remnant: 7
4. Something else: 6

I wonder how such a vote would go today??
 
5,598
39
Conrad, post # 7 says:

But first – reproduction is clearly the fundamental information-process underlying biological evolution.
Via Charles Seife, Decoding the Universe, Chapter 4, "Life":

It is not the individual that is driving reproduction; it is the information in the individual. The information in an organism has a goal of replicating itself. While the organism's body is a by-product, a tool for attaining that goal, it is just the vehicle for carrying that information around, sheltering it, and helping the information reproduce.
(I know that's weird punctuation, but I quoted it as published.)
 

apeiron

Gold Member
In the case of a black hole, for example, Bekenstein's and Hawking's work shows if you add a bit to the black hole the horizon increases by one Planck area....
And so what about reversing the argument? If you instead keep subtracting bits from the event horizon around a locale, eventually you would get down to some minimal amount of information - with QM saying you can never get down to just nothing at a locale.

Discrete points in spacetime would thus be seen as a limit on observation - all about getting down to the least number of bits that can be seen. And so that becomes the event horizon which defines something as a location.
 
5,598
39
And so what about reversing the argument?
No problem..I agree...it's Hawking radiation....

Discrete points in spacetime would thus be seen as a limit on observation
I'd say it differently: Below Planck Scale, nothing exists as we know it...there might be no information....

... when an observer interacts with its environment, not only does it spread information,
or perhaps the information is already everywhere....encapsulated in a boundary/surface condition...

The process has to preserve certain information from the past, so that it doesn’t have to begin from the same baseline again and again each time.
This seems to be different from what Rovelli in RQM says...removing information via new question (postulate #2) eliminates some prior information...to maintain his postulate #1.

I'd be interested if you guys that have been in the thick of the discussion could agree on a list of issues/uncertainties... I suspect that list would be incredibly long when you got done. Then it would be interesting to pare the list down to a manageable number to try to tie together in a coherent theory... Given the relatively narrow scope of Rovelli's RQM paper, and all the things it touches even so, this suggests a tough road ahead... good luck...
 

ConradDJ

Gold Member
This seems to be different from what Rovelli in RQM says...removing information via new question (postulate #2) eliminates some prior information...to maintain his postulate #1.

Well, certainly much prior information is lost. This is also true in biological evolution, of course. To recap from my post above –
I'm thinking that a "measurement event" may represent this kind of [evolutionary] process. It involves the gathering of several kinds of information determined in other measurement-events...

Not all past information needs to be preserved, only what’s “relevant” (Rovelli’s term) to determining what can happen in future.
The idea is that what's essentially being passed on, from one measurement-event to the next – not through any one interaction, but in the sum of many interactions that constitutes the "entire measurement situation" – is the functionality of measurement itself, i.e. the ability to create new measurement-situations.

As to “tough road” – right. But to me, it’s not so much that there’s a long list of issues... it’s that there are so many ways in which information gets physically determined / communicated – all of physics is involved.

As in the black hole issue you raised – we know how to discuss information as if it’s “just there” in the world... we can quantify it, we can break it down into information about particles, information about fields... But to approach information from the standpoint of how it gets to be physically observable, in each case... is like heading into an unexplored jungle.

For his limited purposes, Rovelli could avoid all that. But it also means that he offers no answer to basic questions like – how do all these different observers actually end up agreeing on what’s going on in the world? As he shows, the QM formalism says that indeed they all do... but we get no insight into what makes this work.

Again, I think the reason it’s hard to understand the basic information-processes is that there are quite a few of them, and none are simple, and they’re all interdependent. This is what we would expect, as the result of an evolutionary process... but that doesn’t make this kind of approach less daunting. So "good luck" is needed...thanks.
 

apeiron

Gold Member
I'd be interested if you guys that have been in the thick of the discussion could agree on a list of issues/uncertainties....
I'm not too sure what even the question is here :smile:

But the standard lament in the systems science circles in which I move is that standard issue reductionist modelling - the modern information theoretic approach being its latest form - manages to leave out essential aspects of reality, such as meaning, observers, and other contextual or global factors.

So aim number one would be to provide an alternative model in which these kinds of things get represented again.

In practice, the standard view of information is that bits just exist. They are substantial locales just waiting to be counted. No meanings are implicit in their existence, no observers are required.

The systems view would then be - at least my version of it - that bits can only exist within bit-shaping contexts. So we have a dyadic or dichotomistic story. The existence of a bit implies the existence of a matching context. And the nature of this relationship can then be generalised mathematically in the language of symmetry, symmetry-breaking and asymmetry. Hopefully.
 
