What are the fundamental information-processes in physics?

  • Thread starter ConradDJ

ConradDJ

Gold Member
307
1
I’ve been thinking about comments made by Fra in a number of threads, where he raises questions like – what does an observer “see” at the sub-atomic scale? We could make a long list of more and less reputable ideas about the fundamental information-processes in physics, going back to Wheeler’s slogan “It from Bit”. My own favorite approach is Carlo Rovelli’s “Relational QM”, which tries to derive the quantum formalism from information theory – but there are many variations on this theme.

There’s also a lot of well-established theory directly relevant to the question of physical information-processes. The problem is, this is exactly where theories tend to become counter-intuitive and even contradictory. A couple of obvious examples – the issue in QM about when a physical interaction constitutes a “measurement”, or the problem of how “non-local” quantum correlations can occur, since Relativity limits physical communication to the speed of light.

So despite all the relevant theory and experiment, we have no clear picture of what actually happens with information, in the physical world. As a result, the ideas about fundamental information-processes in physics tend just to be guesses, sometimes almost unrelated to all that’s known about QM and particle physics.

There are three main points I want to make about this line of thought, which I’m putting in separate posts below.

1. Physical observation (communication) is not just recording (transmitting) data. This may be obvious, but it’s important because information-theory was to begin with a theory about data-transmission.

2. The actual information-processes in physics are anything but simple. Logically simple processes (like duplicating data) almost don’t occur in physics, while what does happen in physical interactions has an informational structure that’s profoundly complex, in more than one respect.

3. Observation (communication) in physics could be a process that defines itself recursively. This goes back to Rovelli’s interpretation of quantum measurement, and suggests a way that inherently complex information-processes might nevertheless be fundamental.
 

ConradDJ

1. Physical observation (communication) is not just recording (transmitting) data.

Before QM, it was a nearly unquestioned assumption that information resides in the real, objective properties and states of physical entities. To “observe” something meant to make a more or less accurate copy of the data given in the thing itself – to make a mental or physical record of some kind. To “measure” something meant to copy the given data into a standard numeric format, as when we measure the length of a stick with a ruler.

Likewise standard information-theory was based on the idea that data given at one location can be copied through some data-transmission channel to another location. Since the data was assumed to be inherently determinate, the fundamental issue was only the accuracy of the copying process through a given channel.

But in QM, we no longer have inherently determinate data to begin with. It’s not clear how there get to be definite facts in the world, but all the evidence indicates that this occurs only to the extent that information about things is actually observed, i.e. communicated between one physical system and another.

So however we conceive the fundamental information-processes, it seems that they involve more than duplicating or processing logical data that's "just given" in advance -- they have to do with how information gets defined (determined) in the first place, through physical interaction. And physical communication is more than just transmitting logical data that’s assumed to be well-defined in and of itself.
 

ConradDJ

2. The actual information-processes in physics are anything but simple.

If we ask, how is information determined (observed) through physical interaction? – the problem isn’t that we have no answer. There are way too many answers, since there are multiple ways of measuring any physical parameter. And, measuring any parameter always requires that several other parameters be “known” – so any way of observing anything assumes we have information gained from other observations of other kinds of things. The bottom line is that in physics, nothing is ever observed simply and “directly”.

For example, nearly all observations refer to relative locations in space and time. This seems simple, but if we ask what’s physically required to define a distance in space or an interval of time, we have to go back to atomic structure, since without atoms there would be nothing physically usable as “clocks and rods”. This bothered Einstein quite a bit, that the simple logical elements of Relativity could only be defined, operationally, by reference to the complex structure of atomic matter.

The issue isn’t just that observation is complex – it has a kind of complexity we don’t know very well how to deal with, because it’s essentially contextual. Somehow or other, this web of many different kinds of physical interaction provides contexts in which each kind of interaction can convey definite, observable information. But each type of interaction requires an appropriate type of context, in which other types of interaction are involved.

Fra suggests that to get to the base-level information-processes, we need to “scale down” to the level of very simple observers. That makes sense – but what would be the simplest level at which each item of information can have a context of other information that makes it observationally meaningful?

Now even the simplest interactions – apart from virtual ones – convey more than one distinct type of information. E.g. photons transfer momentum (which is “analog” data) and also spin-orientation (which is “digital” data, with only two possible values in any reference-frame). And in neither case does the data just get copied from one system to another – instead, a change in momentum of the charged particle that emits a photon is balanced by an opposite change in the momentum of the receiving particle.
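As a toy sketch only (the class, numbers, and function names are illustrative devices, not physics), the point that momentum is balanced rather than copied, while helicity is a two-valued quantity, can be pictured like this:

```python
from dataclasses import dataclass
import random

@dataclass
class Particle:
    momentum: float  # the "analog" degree of freedom

def exchange_photon(emitter: Particle, absorber: Particle,
                    photon_momentum: float) -> int:
    """Illustrative only: the momentum data is not copied from one
    system to the other -- the emitter's loss is balanced by the
    absorber's gain -- while the photon's helicity is a two-valued
    ("digital") quantity in any reference frame."""
    emitter.momentum -= photon_momentum   # emitter recoils
    absorber.momentum += photon_momentum  # absorber is pushed
    return random.choice([+1, -1])        # one of two helicity values

a, b = Particle(5.0), Particle(0.0)
helicity = exchange_photon(a, b, 2.0)
# total momentum is conserved: a.momentum + b.momentum == 5.0
```

Even this caricature shows two distinct kinds of information riding on one interaction, with neither simply duplicated.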

We know that the interaction-structure of physics – taken as a whole – supports the communication of observable information. This is in a sense the most basic and most obvious fact about our world. But each type of information gets communicated along with other types, and gets defined in terms of other types. If we “scale down” to the simplest, most fundamental information-structures, how much of this complexity do we have to preserve, to maintain this sort of functionality?
 

Fra

3,073
142
Hello Conrad,

I largely agree with what you say. From what I recall of previous discussions, we are reasonably close in reasoning, but if I remember your viewpoint correctly, you – like Rovelli – don't seek to change QM as such, just reinterpret it?

This is consistent with your being led to your key question...

"but what would be the simplest level at which each item of information can have a context of other information that makes it observationally meaningful?"

My own favorite approach is Carlo Rovelli’s “Relational QM”, which tries to derive the quantum formalism from information theory – but there are many variations on this theme.
A note: I'm not sure which of his writings you specifically refer to, but my impression is that Rovelli is not so ambitious as to actually try to "derive the formalism" from a deeper principle. That would, IMHO at least, imply different turns than those taken by Rovelli, though it might well be done reasonably in the spirit of the RQM paper. I think that paper is yet to be written.

1. Physical observation (communication) is not just recording (transmitting) data. This may be obvious, but it’s important because information-theory was to begin with a theory about data-transmission.
Indeed. The information view I'm consistently referring to cannot be treated by standard Shannon-type information theory. There ARE books on "quantum information theory" that apply to STANDARD QM and more or less use standard information theory, but this is of course not what I am talking about. The intrinsic information view is something deeper, not yet existing.

We actually also need a new development of mathematical information theory that fits the intrinsic, evolving idea here.

I'm not sure how to put it, but maybe one could say that Shannon's theory relates to what we are looking for a bit like SR relates to GR.

but what would be the simplest level at which each item of information can have a context of other information that makes it observationally meaningful?
Assuming we more or less are on the same page about the general ideas, it's true that the biggest problems to solve are

i. How can these ideas be turned into something that can make predictions and thus tell us something about the structure of physical law?

ii. Where do we start?

(i) The basic conjecture is that the laws of physics are inferrable constraints on actions, and that physical interactions, with actions and reactions from the environment, have the traits of an inference process. The self-reference suggests that the inference processes are also constrained by the current context, and in turn the current context is affected by the inference processes (see the analogy with GR? dynamics in spacetime vs. dynamics of spacetime).

Now this suggests that the inference processes and the context are evolving. If we can describe this process, and find a reasonable "initial condition", then predictions of the processes that are most likely to emerge might be possible. The population of inference systems would correspond to the population of laws and properties of matter (standard model).

(ii) Where to start, the initial condition? I've given this some thought, and the current starting point I consider is simply the notion of distinguishability, which is basically a bit if you like. But since there is no certainty, you can say that the starting point is an unreliable bit.

So the process is pictured like a game. The action is based on the premise at hand, reliable or not, because it's the only choice.

The other major issue is to describe how one can play with one uncertain bit and, by means of compression, get say TWO unreliable bits – and thus grow memory (mass). I picture the origin of mass like a game: the attraction of bits is only possible if the action system has the right compression.

Note that the entire view here is from the point of view of the simplest possible observer.

The point is that when you start there, you can use combinatorics to build expectations and possibilities. There is no such thing as the continuum at this point.

It might be tempting to think that an unreliable bit is better described by a continuum probability, but unfortunately I see no a priori physical basis for this continuum. Instead, as the complexity of the observer grows, time-history sequences can be stored, and discrete probability spaces can be built. So the continuum probability and continuum information models are emergent only in the "large mass/complexity" limit.

But if you were to start with a limit case in a fundamental construction you have totally lost track of things. The limit is a limit, not a starting point.
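The step from one unreliable bit to a discrete probability space built from a stored time history can be caricatured in a few lines (the flip probability and memory size are arbitrary illustrative choices, not claims about physics):

```python
import random

def unreliable_bit(p_flip: float = 0.2) -> int:
    """An 'unreliable bit': the intended value 1 arrives flipped
    with probability p_flip -- there is no certainty at the start."""
    return 0 if random.random() < p_flip else 1

def estimated_distribution(history: list) -> dict:
    """A discrete probability space built only from a stored time
    history; its resolution is limited by the memory size, so a
    continuum could appear only in the large-memory limit."""
    n = len(history)
    return {v: history.count(v) / n for v in (0, 1)}

# a small "observer" stores a finite history and infers discrete frequencies
history = [unreliable_bit() for _ in range(1000)]
estimate = estimated_distribution(history)
```

The estimate takes only values k/1000 – the continuum is the limit of growing memory, not the starting point, which is the thrust of the paragraph above.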

3. Observation (communication) in physics could be a process that defines itself recursively. This goes back to Rovelli’s interpretation of quantum measurement, and suggests a way that inherently complex information-processes might nevertheless be fundamental.
Recursion or evolution is the only solution I have to these ideas. Non-recursive static solutions are not consistent with my reasoning, as far as I can see. Again, compare to Einstein's expanding universe. But recursion is not a problem. It's only a problem for the mind if you insist on nailing down a static realist view of the world, when in fact there is none.

Instead I try to understand the effective laws of evolution; that's as close to the truth as I think we'll come.

/Fredrik
 

apeiron

Gold Member
1,971
1
There would be two ways of looking at information here.

One assumes information to exist at a locale. The other that information is created at a locale. And these are two different views (though in both cases you would appear to find information at a locale).

The idea that bits actually could just exist would be the standard view - the information theoretic approach. The static view.

The idea that information gets created, gets shaped up, would be the semiotic approach.
The active or process view.
 

ConradDJ

Thanks for your comments... I'll consider and respond below. Meanwhile here's my 3rd installment, trying to make the connection with biological evolution.


3. Observation (communication) in physics could be a process that defines itself recursively.
If we ask, how is information determined (observed) through physical interaction? – the problem isn’t that we have no answer. There are way too many answers, since there are multiple ways of measuring any physical parameter. And, measuring any parameter always requires that several other parameters be “known” – so any way of observing anything assumes we have information gained from other observations of other kinds of things.

Since we want the foundations of physics to be ultimately simple, it hardly seems promising to struggle with untangling this complicated “semantic web” of different types of physical information, communicated by many different kinds of interaction in different physical contexts. In what way could that lead to any fundamental insight?

Let’s go back to the thought that in QM, information is only well-defined to the extent that it gets physically communicated from one system to another. First of all, that’s not really right – it glosses over the whole issue of entanglement. What QM actually says is that when two systems interact and exchange information, their states get correlated. But these “states” are still superpositions of all their possible correlations.

The big issue in QM is that information-transfer as such doesn’t “collapse” the wave-function to give a definite fact about a system’s state. It only creates an entangled superposition of the states of the two systems. And yet, any time an actual observer (whatever that is) looks at a system, it never sees a superposition – it always sees definite information.

Rovelli’s solution to this paradox in his “Relational QM” is simple: everything counts as an “observer” and every interaction produces definite information (collapsing the superposition) – but only for the two systems that interact. For any other system, the systems remain in an entangled superposition of their possible states – until they interact with that third system. That interaction also conveys definite information... but all three systems are still entangled in superposition so far as any other system is concerned, until a further interaction happens ... and so on. In other words, definite information only exists in the web of communications between individual systems, not in any overall “objective reality” common to all systems at once.
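This relational bookkeeping can be caricatured in a few lines of Python (a toy model, not the quantum formalism: outcomes here are plain random bits, and "superposition" is just the absence of a recorded fact for a given observer):

```python
import random

class RelationalWorld:
    """Toy model: a fact is definite only relative to an observer
    that has interacted with the system; for every other system there
    is simply no fact yet (standing in for 'still in superposition')."""
    def __init__(self):
        self.facts = {}  # (observer, system) -> definite outcome

    def interact(self, observer: str, system: str) -> int:
        # an interaction produces a definite outcome for this pair only
        if (observer, system) not in self.facts:
            self.facts[(observer, system)] = random.choice([0, 1])
        return self.facts[(observer, system)]

    def is_definite(self, observer: str, system: str) -> bool:
        return (observer, system) in self.facts

w = RelationalWorld()
w.interact("A", "S")     # now definite for A...
w.is_definite("C", "S")  # ...but still False for a third system C
```

There is no global table of facts here at all – only the per-relation entries, which is the structural point being borrowed from Relational QM.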

What I want to take from this is that the fundamental information-process in physics may be recursive – i.e. defined in terms of itself. That is, information about a system becomes a determinate fact insofar as it’s successfully “observed” by another system – which means that the information gets passed on and is “observed” by another system, and so on. Information is “communicated” insofar as it results in information that’s communicated.

This is interesting because of the parallel with “reproduction” in biology, which is also defined recursively. I’ll explain that statement in a moment. But first – reproduction is clearly the fundamental information-process underlying biological evolution. And, it’s an inherently complex process, because molecules don’t just make copies of themselves. Every complex organic molecule gets created in a catalytic process that requires other types of molecules, which require other types of molecules to make them. It’s complex in many ways – including that to reproduce any organism, many distinct replication processes are involved that all have to be as exact and reliable as possible – and yet the failure of exact replication (mutation) is also critical to the evolutionary process.

So biological reproduction does not mean the exact duplication of an organism’s structure. An organism reproduces itself successfully insofar as it produces offspring that reproduce successfully, which means their offspring also reproduce, and so on. At any given point in evolutionary history, all the ancestors of all living organisms have actually met this criterion of continuous successful reproduction – so this is more than a theoretical definition.
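The recursive criterion can be written down directly (a schematic sketch; the `Organism` class and the depth cutoff are illustrative devices, with depth 0 standing for "alive today"):

```python
class Organism:
    """Illustrative stand-in: an organism is just its list of offspring."""
    def __init__(self, offspring=None):
        self.offspring = offspring or []

def reproduced_successfully(organism: Organism, generations: int) -> bool:
    """Recursive definition: an organism reproduced successfully if it
    produced at least one offspring that itself reproduced successfully,
    down to the present day (generations == 0)."""
    if generations == 0:
        return True  # base case: the lineage has reached the present
    return any(reproduced_successfully(child, generations - 1)
               for child in organism.offspring)

grandchild = Organism()
founder = Organism([Organism([grandchild])])
dead_end = Organism([Organism()])  # its only child left no offspring
```

Here `reproduced_successfully(founder, 2)` holds while `reproduced_successfully(dead_end, 2)` fails, mirroring the point that the criterion concerns the continuation of the process, not any single act of copying.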

In Rovelli’s Relational QM, from the standpoint of any observer, all the information about past systems that’s relevant to currently available information is factually determinate for that observer... even though this “determining” happens only for one observer at a time, in each interaction.

Now I realize Relational QM has its own unresolved issues – I’m not trying to propose any specific answer here. My point is that what’s most paradoxical about the issue of quantum measurement – the lack of any clear difference between interactions that “entangle” the wave-functions of systems and interactions that “collapse” them – could well be pointing to a recursive information-process in physics. And we know of at least one other field in which this kind of process plays a fundamental role, despite its inherent complexity.


Is there any conclusion from all this? First, I think there are strong reasons for trying to sort out what’s actually involved in the many different ways that information gets physically defined and communicated – even though in no case is this a logically simple process. But even how to approach this kind of analysis is a big open question.

Second, identifying basic information-processes in physics may not mean hunting for a single logical or mathematical principle from which we can derive the whole variety of observable phenomena. It could mean getting a clearer notion of the functionality involved in a system where information isn’t just “given” in the reality of “the things themselves,” independent of any context, but instead actually has to be defined in the context of other information in the web of real-time interaction.
 

ConradDJ

I'm not sure what of his writings you specifically refer to but my impression is that Rovelli is not as ambitious to actually try to "derive the formalism" from a deeper principle.

Actually his original Relational QM paper did try this... based on two main postulates. He needed a third to complete the derivation, but the meaning of that one wasn't very clear.

1. There's a maximum amount of information that can be extracted from a system.
2. It's always possible to acquire new information about a system.

What's interesting is that these postulates seem to be directly contradictory... if we’ve gotten the maximum information from a system, how can we then get more?

Well, at any given time and place, a system carries finite information. (This distinguishes QM from classical physics, where the location, momentum etc. of any particle is supposed to be precisely determinate, and so requires an infinite quantity of information to specify it.)

But if we then interact with the system, putting it in a new physical context – as Rovelli says, “ask a new question” of the system – we get new information. The “answer” is inherently unpredictable, because this information wasn’t already there in advance. In QM answers come into being only in response to physical questions.
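A toy single-qubit simulation (real amplitudes only; "z" and "x" stand in for incompatible "questions") illustrates both postulates: an answer, once obtained, is repeatable – one bit is the maximum the system carries – yet an incompatible new question always yields fresh, unpredictable information:

```python
import math, random

def measure(state, basis: str):
    """Toy qubit: state = (amp0, amp1) in the z-basis, real amplitudes.
    Measuring 'asks a question'; the answer is one bit, and asking a new
    (incompatible) question invalidates the old answer's predictive value."""
    a0, a1 = state
    if basis == "x":  # rotate into the x-basis
        a0, a1 = (a0 + a1) / math.sqrt(2), (a0 - a1) / math.sqrt(2)
    outcome = 0 if random.random() < a0 * a0 else 1
    # collapse: the system now carries exactly one bit relative to this basis
    post = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
    if basis == "x":  # express the post-measurement state in the z-basis
        post = ((post[0] + post[1]) / math.sqrt(2),
                (post[0] - post[1]) / math.sqrt(2))
    return outcome, post

out, post = measure((1.0, 0.0), "z")   # old question: definite, repeatable
x_out, x_post = measure(post, "x")     # new question: unpredictable answer
```

Repeating the same question on `x_post` returns `x_out` again, so at any moment the system answers one question with certainty and everything incompatible at random – finite information, yet always more to be had.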

Now this is just the dichotomy apeiron pointed out above:
There would be two ways of looking at information here. One assumes information to exist at a locale. The other that information is created at a locale.

Rovelli says that in the quantum world we have both finite pre-existing information (defined by past contexts/questions) and the creation of new information (defined in the context of a new question). Apparently just putting these two principles together goes a long way toward giving us the structure of QM.

I don’t know whether this has anything at all to do with Rovelli’s quantum gravity work. He’s recently been emphasizing the timelessness of fundamental physics, which seems to me to go in a different direction.

But I think the RQM approach definitely relates to the issue we’re thinking about. Instead of conceiving the universe as a body of given fact – that could in principle be described by an omniscient observer outside the universe, outside of time – Rovelli seemed to be imagining it as a kind of information-process that creates new facts “in real time,” based on a selected set of past facts carried forward through the web of interaction.

So there's a kind of "Q&A" model of physics here. I'm envisioning it as something like a web of "measurement events" that define new information in a context given by past events, and pass that information on to other events, as part of their context.

If anyone wants the link to Rovelli's RQM paper -- the postulates are on pg 11-12:

http://arxiv.org/abs/quant-ph/9609002v2
 

ConradDJ

Where to start, the initial condition? I've given this some thought, and the current starting point I consider is simply the notion of distinguishability, which is basically a bit if you like. But since there is no certainty, you can say that the starting point is an unreliable bit.

So the process is pictured like a game. The action is based on the premise at hand, reliable or not, because it's the only choice.
Hi Fredrik –

I tried to describe possible “initial conditions” in this recent thread –

https://www.physicsforums.com/showthread.php?t=331008

I do think we’re working in a very similar direction. And I think that at some point, the basic evolutionary mechanism we’re looking for will become clear, though it certainly isn't yet.

Before Darwin, it was obvious to everyone that living organisms reproduce themselves. And in pointing out that “natural selection” would inevitably effect the development of species, just the same way that human breeders develop new strains of plants and animals through selective reproduction, he was pointing out something also quite obvious. Even so, it took a long time before people’s mental focus could adjust to the evolutionary concept. Even today it’s remarkable how people can completely misunderstand how biological evolution works.

The thing is, with biology the information that’s evolving is objectively there, built into the structure of molecules and organisms. We can deal with it entirely in terms of “classical” determinate facts about things-in-themselves.

But in physics, the information we’re talking about isn’t just there as a set of given, context-independent facts about real objects. What “evolves” is information being communicated through the web of real-time interaction between things. And even though this kind of information is actually all anyone ever experiences of the physical world, as we interact with things in the moment, we’re so used to thinking of the world in terms of given facts about real objects that it may take a long time to make the mental adjustment.

But like reproduction in biology, what we’re trying to describe would be something that happens everywhere in physics, in every interaction that communicates information. So it's whatever makes the difference between "actual" measurement-events and "virtual" events that don't convey definite information.

As to “initial conditions” and the “starting-point”... If we have an “unreliable bit” we also need a way for that to make a difference to something else, which could then make a difference to something else, and so on. So it seems we need to be thinking of something like an interaction-web from the start. It would have to support some kind of random decision or agreement – I think this is what you must mean by your process of “inference” – but something much more “primitive” than the precise causal determining of one event by another, in classical physics.

It’s hard to imagine what very primitive measurement-events might be like – just as it’s hard to imagine what the earliest self-replicating systems may have been like in biology. But for me the key point is to keep in mind – there’s no such thing as information without a context to which it makes a difference in some way – where “making a difference” means passing on information that makes a further difference, in some way... The "game" has to reproduce itself.

Thanks again to you and apeiron... -- Conrad
 

Fra

Actually his original Relational QM paper did try this... based on two main postulates. He needed a third to complete the derivation, but the meaning of that one wasn't very clear.

1. There's a maximum amount of information that can be extracted from a system.
2. It's always possible to acquire new information about a system.
Ok I see what you mean. I have read his paper before.

Several people, including Ariel Caticha in other papers, have tried to derive QM from plausible axioms. But I have not yet found one that doesn't add nontrivial axioms. They are just alternative ways to axiomatically introduce standard QM.

I really do think that you can do away with some of the postulates, but it will also deform QM.

Postulate 3 is more or less the key to the QM logic, which he just postulates. I didn't like that at all.

Other than that, I've said before that parts of that RQM paper are excellent. So it's worth reading at any rate.

I might comment more later.

/Fredrik
 

Fra

Conrad, it seems you have some issues with trying to make sense of Rovelli's reasoning in light of these radical ideas we are discussing. If that's the point, I agree, and I think it's because Rovelli's ambition is not (IMHO at least) really to implement the full version of the reasoning we discuss here.

As I see it, Rovelli doesn't acknowledge that the framework for the information about the state of a system, which can only be communicated, cannot be distinguished from the framework of the information about the evolutionary laws (laws of physics, symmetries, etc.) that a system obeys.

Rovelli, in other papers, makes clear that he believes in fundamental and objective symmetries, like diffeomorphism symmetry in GR. I think that consistency of reasoning here – if we really take what we are discussing seriously – doesn't allow any fundamentally objective (non-inferred) laws. Simply because there is no perfect inference/measurement process, there can be no perfectly certain laws.

This last thing indeed makes everything much more complex, but it also opens up more possibilities for unification of spacetime AND matter that IMO Rovelli's reasoning won't easily allow.

About the recursive nature of nature, I also agree. The reason for the recursion, as I see it, lies in the fact that it's not possible, by physical inference, to arrive at a static conclusion. Every conclusion you make is constantly evolving due to a self-reference. But this is not circular reasoning, because the self-reference is sufficiently constrained to warrant stability. Which is pretty much exactly like in biology: controlled imperfections to allow diversity, combined with a selection principle.

Lee Smolin is closer to this reasoning than Rovelli is, if you read any of his books – for example "The Life of the Cosmos", which is a popular-style book without math. In it he suggests his Cosmological Natural Selection: there is diversity and reproduction of universes with different laws. The variation is small, to ensure stability. His main idea with CNS is that this takes place in black holes (a black hole gives rise to a new universe with slightly varied laws).
(https://www.amazon.com/dp/0195126645/?tag=pfamazon01-20)

I think there are different ways to picture this though, so I am not tuned in specifically to Smolin's CNS, but from a general standpoint Smolin has written more in line with this than Rovelli seems to have.

Smolin also has a talk about the "reality of law":
(http://pirsa.org/08100049/)

/Fredrik
 

Fra

Conrad, it seems you have some issues with trying to make sense of Rovelli's reasoning in light of these radical ideas we are discussing. If that's the point, I agree, and I think it's because Rovelli's ambition is not (IMHO at least) really to implement the full version of the reasoning we discuss here.
See page 16 in the RQM paper.

"My effort here is not to modify quantum mechanics to make it consistent with my view of the world, but to modify my view of the world to make it consistent with quantum mechanics"

Thus, he really does not even question QM. His derivation is "invented/constructed" to fit precisely.

I think Rovelli's paper would be great to read in your first QM course, because it lays out a "proper way" to understand quantum theory, as opposed to classical thinking. But I have long since left realist thinking; this is not at all the problem I have with QM. I fully accept the lack of realism and determinism of outcomes. My objections are deeper, and these objections are not addressed by Rovelli at all.

So I think Rovelli is not the answer to the evolving constraint/symmetry and physical inside-inference views we were discussing.

/Fredrik
 

ConradDJ

I looked up Ariel Caticha on arXiv and found a bunch of interesting papers more recent than the one I’d seen before. Have you looked at this? What do you think?

From Inference to Physics -- http://xxx.tau.ac.il/abs/0808.1260

As to Rovelli --
See page 16 in the RQM paper.

"My effort here is not to modify quantum mechanics to make it consistent with my view of the world, but to modify my view of the world to make it consistent with quantum mechanics"

Thus, he really does not even question QM... So I think Rovelli is not the answer to the evolving constraint/symmetry and physical inside inference views we were discussing.

I agree, he does not provide an answer. He doesn’t try to reach any conclusion about what happens in the measurement process. And I agree that the 3rd “supplementary” postulate he uses to derive QM isn’t trivial, so we don’t know what it might be telling us about QM. Most important, Rovelli doesn’t try to explain how it is that different observers end up agreeing on a common, “objective reality” – given that there are only “facts” about things for each observer, not in an absolute sense. He just says, this is the way QM describes the world, and experiments show that’s correct.

And I agree that in his other work he seems to be on a different track – though I don’t pretend to understand what’s going on with Quantum Gravity theories. But I think both Rovelli and Smolin still tend to think about the world as a structure of relationships “seen from outside” rather than only from a point of view inside the interaction-web.

The reason I keep coming back to the Relational QM paper is that it puts forward this basic idea so clearly – that whatever a measurement is, it happens for one observer at a time, not as a once-and-for-all, objective “collapse” of a quantum superposition. To me this is key to seeing the web of one-on-one relationships “from inside”. I think that if more people took this point of view in thinking about QM, we’d start to make real progress at last with the measurement issue.

But I don’t really get why you want to go beyond QM, or to modify it in some way. Maybe that will ultimately be necessary. But we’re still only part-way to understanding what the basic weirdness and simplicity of QM are telling us about the world.

So I like your quote from Rovelli: Let’s not try to change QM to make it look “sensible” in terms of an old view of things that’s clearly inadequate – let’s find a new way of thinking about the world that makes QM make sense, as it is.

That isn’t to say we shouldn’t try to derive QM from more basic principles. Rovelli’s paper begins by saying we’ll only know we have a good understanding of QM when we can derive the formalism from clearly stated principles – the way Relativity is derived from the postulate about the speed of light.

Rovelli's first two postulates do seem pretty clear to me... so if you have ideas on what the third one says, I’m definitely interested. Does this relate to Caticha’s work in any way? In the earlier paper I read of his, he thought that the linearity of QM was essentially required by consistency conditions -- so I don't think he was proposing any change to QM.
 

Fra

3,073
142
I looked up Ariel Caticha on arXiv and found a bunch of interesting papers more recent than the one I’d seen before. Have you looked at this? What do you think?

From Inference to Physics -- http://xxx.tau.ac.il/abs/0808.1260
I think I've seen that. I like and share a good part of his overall spirit and reasoning. Relative to some other programs I'm fairly close to his reasoning, but I object to a few quite important fundamental things in it; I think I've elaborated on that before. Ariel continues along a line of reasoning from E.T. Jaynes, who has written a nice book, "Probability Theory: The Logic of Science"

https://www.amazon.com/dp/0521592712/?tag=pfamazon01-20

ET Jaynes even has a line of reasoning early in that book where he kind of "derives" the formalism of standard probability theory from a sequence of plausible reasoning and plausible postulates.

One key postulate is to associate degree of belief with a real number.

For reasons that are implicit in the way I've reasoned several times here, this does not quite make sense to me and is not satisfactory.

Ariel builds on Jaynes' foundation. I am suggesting that the very reconstruction (which Jaynes did by reconstructing probability theory as an extension of logic, as a type of reasoning) should be done differently. The introduction of the concept of the continuum must not be made so easily.

What I want to do is similar to what Ariel and Jaynes do, but with at least two key differences. First, in the reconstruction of probability as an extension of logic, and thus as an abstraction of inference, I reject the real numbers as physically distinguishable; instead I am working on a discrete reconstruction of probability, so we get a discretized version of probability theory. Second, I don't like the way Ariel reasons about which entropy measure is "right". In my view, there is no universal entropy measure. Instead the "right" measure is a result of evolving the reasoning sufficiently many times.

This is basically the objection some have to maxent methods - that the choice of entropy is ambiguous, because the available entropy measures result from requiring (for ambiguous reasons) that the entropy measure "must have" certain properties. This is how Shannon entropy is often "derived" - sure, it's a deduction, but one made from ambiguous premises. That's roughly what I'm trying to cure.
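The ambiguity Fredrik describes is easy to see concretely: the same distribution gets a different "information content" depending on which axioms you privilege. A minimal sketch in Python, using three standard textbook entropy definitions (nothing here is specific to anyone's program):

```python
import math

def shannon(p):
    """Shannon entropy in nats: H = -sum p_i ln p_i."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1."""
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

def tsallis(p, q):
    """Tsallis entropy of index q (q != 1), a non-additive alternative."""
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
print(shannon(p))       # ~1.0397 nats
print(renyi(p, 2.0))    # ~0.9808 -- different axioms, different number
print(tsallis(p, 2.0))  # 0.625  -- and yet another
```

All three satisfy reasonable-looking postulates (continuity, maximum at the uniform distribution); they differ in which additivity axiom is imposed, which is exactly the ambiguity in the premises.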

There are some key components IMO that are not present at all in their reasoning. Other than that, I love their reasoning.

/Fredrik
 

Fra

3,073
142
So I like your quote from Rovelli: Let’s not try to change QM to make it look “sensible” in terms of an old view of things that’s clearly inadequate – let’s find a new way of thinking about the world that makes QM make sense, as it is.
This way you gave the statement an extra twist - it sounds like Rovelli is not representing an old view, but he does to me :)

He is not old relative to pre-QM thinking, but QM has been around for 100 years now, so by the old thinking I'm referring to the mainstream post-QM thinking. Note that rejecting QM can be for two reasons - you want to go back to deterministic realism and pre-QM reasoning (which I don't), or you want to push the current framework into the next one and get a unified understanding that includes gravity.

The view of things where I think QM is objectionable is definitely not the pre-QM old views that I agree are inadequate (Einstein etc).

/Fredrik
 

ConradDJ

Gold Member
307
1
One key postulate is to associate degree of belief with a real number.

For reasons that are implicit in the way I've reasoned several times here, this does not quite make sense and is not satisfactory... The introduction of the concept of the continuum must not be made so easily.

Fredrik --

I'm certainly inclined to agree with you on this. One of the most basic facts in physics seems to be that all interaction that communicates information between things takes the form of discrete, momentary one-on-one connection-events. This seems to be the meaning of Planck's "quantum of action" and to underlie the uncertainty principle. So if the world is fundamentally a structure of information, the continuum must be an approximation that evolves in the interaction-web, not a structural a priori.
 

Fra

3,073
142
So if the world is fundamentally a structure of information, the continuum must be an approximation that evolves in the interaction-web, not a structural a priori.
Yes, I don't think it's a structural a priori either.

This seems to be the meaning of Planck's "quantum of action" and to underlie the uncertainty principle.
In my view it's clear that action measures and entropy are related: an action measure is a measure of how probable a given change is, and entropy is a measure of how likely a given state is a priori.

As I see it, a generally uncertain structure - given that we have an information measure - implicitly defines a space of differential changes, on which you can define a related natural information divergence measure, closely related to an action measure. If we are only talking about one single microstructure, then what we get is something simple such as classical thermodynamics, where the actions are basically of diffusion type.
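For readers who want the standard object behind phrases like "information divergence measure": the usual textbook example is the Kullback-Leibler divergence between two discrete distributions. A minimal sketch (this is just the standard definition, not Fredrik's construction):

```python
import math

def kl_divergence(p, q):
    """D(p||q) = sum_i p_i ln(p_i / q_i): the information 'cost' of using
    model q when p is actually the case. Non-negative, zero iff p == q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.0253 nats
print(kl_divergence(q, p))  # ~0.0258 -- note the asymmetry: not a metric
print(kl_divergence(p, p))  # 0.0
```

The asymmetry is the point: a divergence ranks how "far" one distribution is from another without being a distance in the geometric sense.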

But if, as I am trying to work out, you think that the microstructure of an observer is more like several different microstructures related to one another, a more non-trivial dynamics appears - in particular, more persistent cyclic phenomena that aren't the simple dissipative types that end up in heat death.

In this case the relations will define a new logic, in the sense that when you try to compute the transition probability, the OR and AND operators behave differently when x and y come from different sample spaces - new stuff happens. In particular, if one transformation is the Fourier transform we get position-momentum spaces. But that is merely a special case in this view. I'm exploring the case of general transformations, and how the transformations themselves are subject to evolution.

So the answer to why the Fourier transform and the special relation between position and momentum seem to be around is probably that this transform, in this context, has especially desirable properties. But I'm still trying to figure some things out.
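For concreteness, the position-momentum reciprocity of the Fourier transform mentioned above can be checked numerically: a Gaussian of width s transforms to a Gaussian of width 1/(2s), so the width product is fixed at 1/2 - a toy version of the uncertainty relation. A sketch using NumPy's FFT (a generic numerical illustration, not anyone's specific model):

```python
import numpy as np

n = 4096
x = np.linspace(-50, 50, n)
dx = x[1] - x[0]

def width(values, grid):
    """RMS width of |values|^2 regarded as a (normalised) density on grid."""
    p = np.abs(values) ** 2
    p /= p.sum()
    mean = (grid * p).sum()
    return np.sqrt(((grid - mean) ** 2 * p).sum())

for s in (0.5, 1.0, 2.0):
    psi = np.exp(-x ** 2 / (4 * s ** 2))    # Gaussian with position width s
    phi = np.fft.fftshift(np.fft.fft(psi))  # its discrete Fourier transform
    k = np.fft.fftshift(np.fft.fftfreq(n, d=dx)) * 2 * np.pi
    print(s, width(psi, x), width(phi, k), width(psi, x) * width(phi, k))
```

The printed product is ~0.5 for every s (the minimum-uncertainty value for Gaussians): narrowing in one space necessarily widens the other.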

The trick seems to be to find out what the structures are and how they are related. Such a system - a set of microstructures with defined transformations - is in my view a kind of data compression. Decomposition and spawning of new structures and transformations are driven by evolution for best fitness.

So IMO, there is indeed a deep relation between states and actions, information and information divergences.

But this view still contains a lot of unresolved issues. Standard formalism such as QM is IMHO simply too static to harbour this vision.

/Fredrik
 

Fra

3,073
142
Some source of inspiration: ponder how the brain handles the analogous tasks. Certainly the human brain does not store raw time-history data of all sensory input at infinite sampling rate, for very good reasons! First of all it would run out of memory quite soon, and the processing capacity needed to make use of the stored info would stall daily business.

Brain research suggests that the brain actually recodes (transforms) and stores information in the form the brain itself judges to be most useful for the future. So in short, the brain doesn't seem designed to accurately remember the past as a time history; it rather remembers the past in the way/form that is likely to be of highest utility for the future. The characteristics of the original state of information that are expected to be useless are discarded by lossy compression.

There is a very strong analogy between the probable function of the human brain and the way I think physical law and nature are organised. Nature learns, and only the fittest persists. This logic can be applied down to the notion of physical law.

/Fredrik
 

ConradDJ

Gold Member
307
1
But if, as I am trying to work out, you think that the microstructure of an observer is more like several different microstructures related to one another, a more non-trivial dynamics appears.

The trick seems to be to find out what the structures are and how they are related. Such a system - a set of microstructures with defined transformations - is in my view a kind of data compression.

Fredrik -- So much in your point of view makes sense to me. But "data compression" I don't get, as a fundamental process... because it seems to assume the data is just "out there" and doesn't need to be defined or determined through interaction.

Certainly the human brain does not store raw time history data with infinite sampling rate of all sensory input, for very good reasons! First of all it would run out of memory quite soon, and the processing capacity to make use of the stored info would stall daily business.

So in short the brain doesn't seem designed to accurately remember the past as a time history; it rather remembers the past in the way/form that is likely to be of highest utility for the future. The characteristics of the original state of information that are expected to be useless are discarded by lossy compression.

I get the relevance of this, but I don't see the brain as recording the time-history of sensations and compressing it. Rather it seems as though there are many levels of neural networks sensitive to certain kinds of patterns in the data, and then to patterns in those patterns, etc. -- filtering the data up to a level where the organism can respond. Seems more like selection than compression.

But the issue I'm raising is really -- what is the "data" in the first place, and how is it physically defined? Your point of view seems so right to me, though I'm over my head with probability theory, entropy measures and so on. But unless you have an idea what form the information takes -- specifically, in physical interaction -- how do you find a starting-point for the mathematical theory?
 
5,598
39
Very interesting discussion on information...thanks guys....

One question: What do you guys think about information redundancy??

Charles Seife has some interesting views on information in his 2006 book DECODING THE UNIVERSE, an information-based perspective on physics. His book is interesting because it looks at things like QM and relativity from that perspective.

He says in the introduction:
Information theory is so powerful because information is physical. It is a concrete property of matter and energy that is quantifiable and measurable... just like mass and energy, information is subject to a set of physical laws that dictate how it can behave... how it can be manipulated, transferred, created, erased, or destroyed.
...no one really knows what entropy is... information is intimately related to entropy and energy... in a sense thermodynamics is just a special case of information theory.
A measurement is an extraction of information from a particle. The information does not come for free... Something about that information - either extracting it or processing it - would increase the entropy of the universe.
He also discusses how information theory vanquished Maxwell's Demon (introduced in Maxwell's 1871 Theory of Heat) - first by Leon Brillouin around 1951, and finally by an IBM researcher in 1982 - and Landauer's Principle (bit erasure is the one activity that costs energy and increases entropy); irreversible operations increase entropy...
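Landauer's bound itself is a one-line formula: erasing n bits at temperature T dissipates at least n·k_B·T·ln 2 of energy. A minimal sketch (standard constants; the function name is just illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_cost(n_bits, temperature_k):
    """Minimum energy (joules) dissipated when erasing n_bits at the given
    temperature, per Landauer's principle: E >= n * k_B * T * ln 2."""
    return n_bits * K_B * temperature_k * math.log(2)

# Erasing one bit at room temperature (300 K):
print(landauer_cost(1, 300.0))    # ~2.87e-21 J
# Erasing a gigabyte (8e9 bits):
print(landauer_cost(8e9, 300.0))  # ~2.3e-11 J -- still tiny
```

The numbers make the point that the bound is physically real but far below the dissipation of any practical computer.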

There is a current reference cited in another thread, one by Marcus I believe, that says essentially that discrete sampling is equivalent to continuous information... I forgot to bookmark it; if anyone can find it, I'd like to read it from the viewpoint of quantum discreteness vs. continuous relativity. I could not find it just now... will try again...

Seife also points out that Erwin Schrödinger noticed living things fight off decay... and likely sensed that "the essential function of living beings is the consumption, processing, preservation and duplication of information..." and also notes that "the principle of superposition... is information based....."
 

ConradDJ

Gold Member
307
1
One question: What do you guys think about information redundancy??
Zurek took the idea of decoherence in an interesting direction that relates to redundancy of information... he calls it "Quantum Darwinism". You might be interested -- this is one of several papers.

http://arxiv.org/PS_cache/quant-ph/pdf/0308/0308163v1.pdf

I like this work, to the extent I understand it -- but it seems to me that it doesn't address the key issue, i.e. how physical information gets defined / determined / measured / communicated in the first place.

By the way, I'm starting a separate thread in the Philosophy forum having nothing to do with physics, but focusing from another point of view on the question of the structure of communication-systems -- "What do human beings do when they communicate?"

https://www.physicsforums.com/showthread.php?t=334249

Thanks -- Conrad
 

Fra

3,073
142
Fredrik -- So much in your point of view makes sense to me. But "data compression" I don't get, as a fundamental process... because it seems to assume the data is just "out there" and doesn't need to be defined or determined through interaction.
No, that's not what I mean, but I understand your objection. It's just problematic to give an accurate description of this.

Information requires a context, yes. And there is a context. However it's not a fundamental, fixed context; it's a dynamical context. So there IS a self-reference between information relative to a context and the structure of the context itself, because the context itself also contains information, in the sense that it comes with an ergodic hypothesis etc.

This means that all there is, in my view, is a hierarchy of information. One state of information can define a context for a more finely structured level of information.

The solution I envision is that a given observer only sees a "window" in this hierarchy of information structures - this effectively truncates the world of all possibilities to a window of inside-DISTINGUISHABLE possibilities. At one end the structure is indistinguishable from a fixed context (but note that indistinguishability is a relative notion :) and at the other end, we have unpredictability. The size of the window, so to speak, depends on the observer's information capacity.

But this is still under reflection; while I have a reasonably clear vision and idea here, the details are under evolution.

I get the relevance of this, but I don't see the brain as recording the time-history of sensations and compressing it. Rather it seems as though there are many levels of neural networks sensitive to certain kinds of patterns in the data, and then to patterns in those patterns, etc. -- filtering the data up to a level where the organism can respond. Seems more like selection than compression.
In my view, the point is that there is a strong analogy between the CHOICE of WHICH lossy compression to implement, and selection. The neural networks and the actual biological makeup of the brain ARE of course one level of context for the information in the brain.

But the issue I'm raising is really -- what is the "data" in the first place, and how is it physically defined? Your point of view seems so right to me, though I'm over my head with probability theory, entropy measures and so on. But unless you have an idea what form the information takes -- specifically, in physical interaction -- how do you find a starting-point for the mathematical theory?
If you are asking for an explicit explanation, that will have to wait. I'm still working slowly on this; nothing is published. Also, since this is quite radical, it does not make sense to publish partial results that make sense only within my program. I need to motivate the program itself, and to do that I need to make a lot of progress.

My starting point is an abstraction in terms of a microstructure of distinguishable states, constructed so that there is an inside and an interface (communication channel). The communication channel has certain distinguishable states (like the surface of a sphere) that define the distinguishable events; inside the sphere, then, the events are processed.

Conceptually my starting point is an imagined "inside view"; this inside view constrains the possibilities of both states and actions. (This is the basis of the reconstruction of the "probability theory".) It's a reconstruction of the information hierarchy I mentioned, starting from the smallest complexity. So the mathematical starting points are combinatorics and various dynamical sets, corresponding to sets of related discrete probability spaces. The fact that there are relations between the possibility spaces is the key to nontrivial actions.

These structures are then assumed to respond rationally to disturbances. The rational response predicts an action, and also a selection for the perturbed action itself. So the starting point is basically combinatorics; the continuum is emergent as the sets grow large, but since it's a hierarchy, the continuum hypothesis is not valid at all levels. So there might be strongly quantised, or strongly continuous, phenomena depending on the level.
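The claim that the continuum is emergent as the sets grow large has a standard elementary prototype: the de Moivre-Laplace theorem, where discrete binomial counts approach a continuous Gaussian density. A small sketch (this illustrates the general idea only, not Fredrik's specific hierarchy):

```python
import math

def binom_pmf(n, k, p=0.5):
    """Exact discrete probability of k successes in n fair trials."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def gauss_pdf(x, mean, var):
    """Continuous normal density with matching mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# de Moivre-Laplace: as n grows, the discrete pmf hugs the continuous
# density ever more closely -- a "continuum" emerging from combinatorics.
errs = []
for n in (10, 100, 1000):
    mean, var = n / 2, n / 4
    err = max(abs(binom_pmf(n, k) - gauss_pdf(k, mean, var)) for k in range(n + 1))
    errs.append(err)
print(errs)  # strictly decreasing with n
```

The discrete and continuous descriptions agree only in the large-n limit, which is exactly the sense in which a continuum approximation can hold at one level of a hierarchy and fail at another.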

/Fredrik
 

ConradDJ

Gold Member
307
1
Information requires a context, yes. And there is a context. However it's not a fundamental, fixed context; it's a dynamical context. So there IS a self-reference between information relative to a context, and the structure of the context itself.

This means that all there is, in my view, is a hierarchy of information. One state of information can define a context for a more finely structured level of information.

My starting point is an abstraction in terms of a microstructure of distinguishable states, constructed so that there is an inside and an interface (communication channel).

Conceptually my starting point is an imagined "inside view", this inside view constrains the possibilities of both states and actions... The fact that there are relations between the possibility spaces is the key to nontrivial actions.

This all makes sense to me, and seems very interesting. Is your sense of the basic information-process that an "observing" system has at any given point a "space" of possible outcomes, in relation to another "observed" system? And the "action" then would consist of choosing a particular outcome as what gets communicated to another system -- i.e. as a constraint on its set of possible outcomes?

I'm wondering whether you envision something like my suggestion in post #6 above --
What I want to take from this is that the fundamental information-process in physics may be recursive – i.e. defined in terms of itself. That is, information about a system becomes a determinate fact insofar as it’s successfully “observed” by another system – which means that the information gets passed on and is “observed” by another system, and so on. Information is “communicated” insofar as it results in information that’s communicated.

... identifying basic information-processes in physics ...could mean getting a clearer notion of the functionality involved in a system where information isn’t just “given” in the reality of “the things themselves,” independent of any context, but instead actually has to be defined in the context of other information in the web of real-time interaction.

You commented on this --
About the recursive nature of nature, I also agree. The reason for the recursion, as I see it, lies in the fact that it's not possible, by physical inference, to arrive at a static conclusion. Every conclusion you make is constantly evolving due to a self-reference. But this is not circular reasoning, because the self-reference is sufficiently constrained to warrant stability - which is pretty much exactly like in biology: controlled imperfections to allow diversity, combined with a selection principle.

It seems that you focus on modeling an "inference" process internal to the observing system, and that the "evolution" takes place in the observer's possibility-space, while I tend to think of it taking place in the structure of the communications channels and how they provide a context for each other. But it does seem that we may be looking at the same "game" from different viewpoints.
It’s hard to imagine what very primitive measurement-events might be like – just as it’s hard to imagine what the earliest self-replicating systems may have been like in biology. But for me the key point is to keep in mind – there’s no such thing as information without a context to which it makes a difference in some way – where “making a difference” means passing on information that makes a further difference, in some way... The "game" has to reproduce itself.
 

Fra

3,073
142
As a general comment on my impression of your last post, I would like to say that on the philosophical side we seem quite close in our reasoning, even if there might be differences in the details.

Note that I think I said this already in reply to your very first post here. The similarity in our reasoning was clear from your first post. :)

But at another level, we are also discussing problems that are not yet solved, so there are unavoidably uncertain things there, which might be attributed to the fact that these are still open issues rather than clean disagreements, even though the symptoms are sometimes disagreements, due to the fuzzy nature of this. This is the kind of disagreement you easily achieve even when discussing with yourself ;-)

So relative to a lot of other major views represented here, I think we are fairly tuned. Perfect tuning might be impossible simply because we're discussing open issues.

This all makes sense to me, and seems very interesting. Is your sense of the basic information-process that an "observing" system has at any given point a "space" of possible outcomes, in relation to another "observed" system? And the "action" then would consist of choosing a particular outcome as what gets communicated to another system -- i.e. as a constraint on its set of possible outcomes?
With the reservation for the difficulty of being precise, yes, that sounds reasonably close!

In relation to the ENVIRONMENT. I.e., from my strict point of view, to take a part of the environment and think that I am observing that is ambiguous and an idealisation. More properly, I think that every observation/interaction is simply with your environment.

I'm wondering whether you envision something like my suggestion in post #6 above --
Yes, it makes quite good sense to me - probably because our reasoning is so close, your words actually make sense to me :)

It seems that you focus on modeling an "inference" process internal to the observing system, and that the "evolution" takes place in the observer's possibility-space, while I tend to think of it taking place in the structure of the communications channels and how they provide a context for each other. But it does seem that we may be looking at the same "game" from different viewpoints.
Actually I see no contradiction here. I see what you mean, and indeed the communication channels are in my view identified, in a certain sense, with the boundary of the observer's state space. I.e. there is a relation between the "possibility-space" and the communication channel even in my view! The capacity of the communication channel can NEVER exceed the complexity of the possibility space. This is why remodulation of the inside influences the communication channels too.

What you say makes sense to me, and the two views are not, I think, in contradiction.

The state space and communication channels support each other. One without the other makes no sense, and it's mutual - by the same token you can't have an ordinary object with non-zero volume and zero boundary area.

/Fredrik
 

apeiron

Gold Member
1,971
1
This means that all there is, in my view, is a hierarchy of information. One state of information can define a context for a more finely structured level of information.

Have you tried thinking about this as an issue of thermodynamic-like equilibration? So observation is, as you seem to be describing, a semiotic process. A relationship between locales and contexts, figures and their grounds.

In terms of "where is the information?", I would suggest this then leads to a view that the "data-compression" is about the minimisation of information represented in this dynamic interaction between locales and contexts. There is a least-mean-path, sum-over-histories story going on. So locales and contexts become equilibrated as far as possible. Some information may be left - an irreducible configuration energy - if locales are knotted up in some particle- or soliton-like fashion. But generally the dynamic is about minimising all visible differences. And that is how data gets compressed: by the flattening effect of interactions.

In QM, this would be the decoherence approach. The observer exists evenly over all scales. Hence QM collapse follows a power law over physical spatiotemporal scale - a Poisson distribution for "QM events" like atomic decay.
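Whatever one makes of the power-law claim, the Poisson character of events like atomic decay is standard and easy to simulate: memoryless (exponential) waiting times between events yield Poisson-distributed counts per time window, whose signature is mean equal to variance. A quick sketch:

```python
import random

random.seed(1)
rate = 2.0            # mean number of decay events per unit time
t_total = 100000.0    # total simulated time, i.e. ~100000 unit windows

# Memoryless decay: draw exponential waiting times between events,
# then count how many events land in each unit-length time window.
counts = []
t, window_end, n_in_window = 0.0, 1.0, 0
while t < t_total:
    t += random.expovariate(rate)
    while t >= window_end:          # close finished windows (possibly empty)
        counts.append(n_in_window)
        n_in_window = 0
        window_end += 1.0
    n_in_window += 1

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)  # both ~rate: mean == variance is the Poisson signature
```

The memorylessness of the waiting times is what makes the counting statistics Poisson, which is exactly the observed behaviour of radioactive decay counts.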

It can also be likened to a tensegrity approach - http://en.wikipedia.org/wiki/Tensegrity

So a system is disordered and then becomes ordered along an axis of fractal scale symmetry as "observerhood" - interaction measured in some information theoretic coin - is equilibrated over all possible scales.

You really ought to check out hierarchy theory - Salthe's scalar hierarchy in particular - for good insights on this. I would also link it to the "pansemiotic" approach that derives from Peirce.

The question I am unsure of is whether you are thinking "information" has some particular scale for the universe as a system - especially, for instance, being a planckscale phenomenon? So the bit is fundamentally the size of the planck limit.

In the scalefree description of information I have given - and which you seem to be alluding to - information would exist for observers over all scales. So it would be holographic in this way. A whole stack of horizons stretching out to "infinity". There is a local planck scale limit of course that anchors things. But also the global context of the visible cosmic horizon. And the two complementary limits are what are in dynamic interaction, equilibrating their mutual information to produce the final "maximally compressed" - or better yet, dissipated - state of irreducible information. Flat and even observerhood.

The notion of observerhood - a located point of view - has become an issue because of GR and, even more so, QM. But this was because in the mathematical models, information has become so divorced from meaning - the local bits so divorced from their interactions with a contextual frame. Thermodynamic approaches - of the dissipative structure kind - seem now a very natural framework for going back and making some sense of this. Of reuniting what has been broken apart - the idea of the observer and the observed.

These structures are then assumed to respond rationally to disturbances. The rational response predicts an action, and also a selection for the perturbed action itself. So the starting point is basically combinatorics; the continuum is emergent as the sets grow large, but since it's a hierarchy, the continuum hypothesis is not valid at all levels. So there might be strongly quantised, or strongly continuous, phenomena depending on the level.

/Fredrik

As far as I can follow your ideas, you are indeed making the mistake (sorry) of thinking of information as hard, located bits, which then must create ontological issues with the emergence of the global GR continuum. But if you instead think of "soft bits" that are formed as the local limit, then your modelling looks like it is heading towards the same hierarchical outcome.

Salthe, for example, started his scale hierarchy by assuming the existence of "entities" over all spatiotemporal scales and then pointing out how QM-like discreteness would appear holographic-like at the smallest scales, and GR-like continuity would appear at the largest scales. As an observer-based effect. So there are models out there that may help shape your thinking, even if just because you disagree with them.

The key question here is: should information have a fundamental scale? (Or equivalently, should an observer?). Then what happens to your modelling if you instead assume that information is free to arise, to exist, over all scales. And then this existence must equilibrate. And then some kind of holographic, self-organised, limits will be observable. There will be a smallest scale and a largest scale as emergent effects.

Information theory is based on the atomistic notion that bits are fundamentally small. So the local scale is fixed and not free-floating - self-organised through the equilibration of "observation" - dynamic interaction. This is one way of looking at things for sure. But then there is a second completely different approach which abandons the fixed atoms and starts instead with free (vague) possibility. Recovering the local atoms, along with the global continuum, in the holographic event horizon limits.
 

Fra

3,073
142
Have you tried thinking about this as an issue of thermodynamic-like equilibration?
Yes, except of course that there are only equilibriums at each level; there is no fixed microstructure to which equilibrium refers.

As far as I can follow your ideas, you are indeed making the mistake (sorry) of thinking of information as hard, located bits, which then must create ontological issues with the emergence of the global GR continuum. But if you instead think of "soft bits" that are formed as the local limit, then your modelling looks like it is heading towards the same hierarchical outcome.
Nooo, definitely not hard objective bits!! :cry: I think your impression arises because this is hard to describe.

That's indeed the whole point: there are bits in each view, but these bits or degrees of freedom are subject to evolution, and can thus be said to be soft.

This is completely analogous to the concept of background independence, the special meaning of metrics, or the general meaning of any context. There HAS to be a background and a context; this is why it makes no sense to talk of complete background independence. It's just that the background (the background here is the microstructure defining the degrees of freedom) is subject to change - it is uncertain! To suggest objective hard bits would be in complete contradiction with my main point! This is not what I think, even if you interpret fragments of what I tried to convey in that sense.

But the difficulty here is how to turn a framework of soft bits into quantitative mathematical predictions. This is still an open technical problem, I think. The problem for me at this point is not the conceptual part; I have a pretty good vision. I am working on translating this vision into quantitative predictions. All attempts so far indicate, without exception, that this process itself is part of the point, and that the quantitative prediction IS evolving or iterative by nature. It's important, then, to capture the mathematical nature of this evolution process.

Apeiron, we are also in reasonable tune, I think. Your and Conrad's views have a lot in common, and I share large parts of them.

I think the repeated confusions, like the hard bits, arise because of the difficulty of describing this. There ARE open problems here - work in progress - which is why I think the main discussion is about the main spirit that sets the direction of research. Here I perceive that we are quite close.

Now that we are converging on the same spirit, the discussion must change from the general spirit to the real open questions that seek quantification here (which we understand more or less, due to tuning), and this of course gets harder. I don't have all the answers. What I do have is too immature and not ready to be published out of context.

But it's indeed a generalisation of statistical mechanics and probabilistic inference, based on a new information-measure construct. Words can't nail this.

/Fredrik
 
