Evolving law, Smolin and others

In summary, the conversation discusses Smolin's Cosmological Natural Selection, which proposes that the laws of physics are evolving through the reproduction of universes and the introduction of variation at each generation of a new black hole or universe. This is related to the idea of evolution, but it is questioned why it is necessary to limit the concept to black holes. The conversation also touches on the possibility of extending this idea to all observers, not just black holes. The concept is being tested and can be falsified by finding a way to make our universe more reproductively efficient. The conversation also briefly mentions an article about a quantum threat to special relativity.
  • #1
Fra
Smolin's Cosmological Natural Selection - the idea that the laws, or at least some parameters of physical law, evolve through reproducing universes, with some variation introduced at each generation of a new black hole/new universe - contains a basic evolutionary idea: after some evolution, a randomly picked universe is likely to be somewhat selected for its reproductive fitness.

This is closely related to some reasoning of my own, but I don't see why it is necessary to constrain the concept to black holes. In a certain sense, a black hole can be seen as an observer, who continually learns and consumes information. But what about all other observers? I had come to the conclusion that the same idea Smolin argues for should be even more natural when applied to a general observer, not only black holes.

This way, the logic of the normal flow of time is the same as the logic of the "flow of time" in the universe population. Their origins lie in the same principle. If we can extend the logic here to normal observers, then perhaps it's easier to try to fill in the major missing points: to understand exactly what happens during the bounce, and exactly how the variation of laws is described.

The idea here is that the "DNA of the laws of physics" should be encoded in the microstructure of its population, and thus variation among the population should in principle be thought of as variation of physical law. Then, instead of arguing as Smolin does, the uniformity of physical law as we see it could be explained by the fact that equilibration has taken place for so long. The oddball particles and systems have long since dissolved.

I honestly don't see why insisting on the bounce as the only way to introduce variation is necessary. From my point of view, a black hole is just a very special (extreme) observer, but as I see it the logic must be present at all levels. This also has the advantage that you need not worry about hypothetical collections of "other universes", because the logic outlined may be playing out in front of our eyes in this same universe.

Anyway, as Smolin notes, I think there could be plenty of ways to falsify this once it's more developed.

Has anyone, Smolin or anyone else, taken the idea in that direction? I.e., extended the reasoning beyond the black hole bounce, and brought the principle into unification with the ordinary flow of time within a single universe?

/Fredrik
 
  • #3
Fra said:
Smolin's Cosmological Natural Selection - the idea that the laws, or at least some parameters of physical law, evolve through reproducing universes, with some variation introduced at each generation of a new black hole/new universe - contains a basic evolutionary idea: after some evolution, a randomly picked universe is likely to be somewhat selected for its reproductive fitness...


I honestly don't see why insisting on the bounce as the only way to introduce variation is necessary...

Your paraphrase omits an important point about the black hole bounce reproduction mechanism---its essential role in making the conjecture scientific.

For the conjecture to be science, it must be empirically falsifiable. The conjecture that universe regions reproduce via black holes is clearly testable, and is being tested as we speak.

To discredit the hypothesis, all one would need to do is figure out some small mutation of the standard parameters of physics and cosmology which would have made the universe more reproductively efficient---that is, which would have caused more stars to form and collapse into black holes.

Suppose one could imagine, for example, changing particle parameters slightly in a way that makes neutron stars less stable and more subject to collapse. And suppose one could do that without some undesirable side effect (like eliminating an element from the periodic table that is essential for efficient star formation); then one would have shown that our region is sub-optimal. (Not at a fixed point of the evolutionary flow.)

Smolin didn't propose the bounce-evolution hypothesis because he liked black holes, or because he liked the bounce idea---those are secondary. The point is we can observe neutron stars and black holes and estimate masses and abundances. And we understand their history and formation well enough to get a handle on what change in fundamental constants might make them more abundant.

If you want to think up an alternate reproduction mechanism, and propose an alternative hypothesis, that's excellent as long as you can meet a basic requirement:
You have to be able to analyze how the parameters of the standard models of physics and cosmology affect the abundance of that alternative mechanism.

Because that's how you would test the hypothesis. By seeing whether the standard parameters are optimal for a prolific region---and thus at a fixed point of the evolution flow---or whether they are suboptimal.
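A minimal toy sketch of that fixed-point test (my own illustration; the fitness function and its peak location are invented, not Smolin's actual astrophysical calculation): define a stand-in F(p) for "black holes produced per universe with parameters p" and ask whether any small mutation of the observed parameters increases it.

```python
import numpy as np

# Toy version of the CNS fixed-point test. The fitness function F(p) is
# entirely made up for illustration; in the real argument it would be the
# expected number of black holes produced by a universe with parameters p,
# computed from stellar astrophysics.
def F(p):
    # A smooth bump peaked at p = (1.0, 2.0), standing in for
    # "black holes produced per universe with parameters p".
    return np.exp(-((p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2))

def at_fixed_point(p, eps=1e-3):
    """Return False (plus the counterexample) if any small parameter
    mutation increases fitness, i.e. if our region is sub-optimal."""
    base = F(p)
    for i in range(len(p)):
        for sign in (+1.0, -1.0):
            q = np.array(p, dtype=float)
            q[i] += sign * eps
            if F(q) > base:
                return False, q  # a fitter small mutation exists
    return True, None

# If the observed parameters sit at a local fitness maximum, CNS survives
# the test; if a fitter neighbour exists, the hypothesis is discredited.
ok, counterexample = at_fixed_point([1.0, 2.0])
print("at a fixed point of the evolutionary flow:", ok)
```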
 
  • #4
grosquet said:
Fredrik:
It would be interesting to read your comments on the Scientific American article, "Was Einstein Wrong?: A Quantum Threat to Special Relativity" at http://www.sciam.com/article.cfm?id=was-einstein-wrong-about-relativity

The title "was einstein wrong" sounds like something you've heard before ;) but i'll skim this and see what it is, got a tub-slot coming up!

/Fredrik
 
  • #5
marcus said:
Your paraphrase omits an important point about the black hole bounce reproduction mechanism---its essential role in making the conjecture scientific.

For the conjecture to be science, it must be empirically falsifiable.
...

Suppose one could imagine, for example, changing particle parameters slightly in a way that makes neutron stars less stable and more subject to collapse. And suppose one could do that without some undesirable side effect (like eliminating an element from the periodic table that is essential for efficient star formation); then one would have shown that our region is sub-optimal. (Not at a fixed point of the evolutionary flow.)
...

Because that's how you would test the hypothesis. By seeing whether the standard parameters are optimal for a prolific region---and thus at a fixed point of the evolution flow---or whether they are suboptimal.

Yes, you're absolutely right of course. I didn't mean to make a proper summary, because I didn't mean to attack _that logic_, rather his application of it - that he didn't take it far enough :)

Anyway, I thought I captured the falsification point with

"after some evolution the a randomly picked universe are likely to be somewhat selected for it's reproductive fitness"

"reprodctive fitness" is in smolins idea the ability to produce a lot of black holes, so by the conventional understanding of how black holes are formed, and how that process depend on the parameters one can estimate wether significant improvement in black hole formation would be possible with a slight chage of parameters - which itself would make the CNS more unlikely.

I don't have much to say about that. My idea does not compete with this; it is not either/or. I just see it as taking the idea one step further, because I think the basic logic is good. I think selection and evolution can take place even in between the bounces, by a similar reasoning.

As I see it, in the analogy Smolin presents, one question is exactly how and where the actual "DNA" of physical law is stored, and how does it "copy"?

/Fredrik
 
  • #6
I think the motivation for Smolin's natural selection cosmology is best understood when you contrast it with other multiverse scenarios, such as eternal inflation. Most multiverse scenarios conclude that universes are being created all the time with random properties. If this is true, then we would expect our universe to have random properties. This is not really a prediction. What Smolin was trying to do was modify the multiverse scenario such that some kind of universe would be preferred over some kind of other universe in the distribution of produced universes, so that our expectations if we assume a multiverse will be in some way different than our expectations if we do not.

So given this goal, the function of the black hole "bounces" is to provide a mechanism for certain kinds of universes outnumbering the other universes. If new universes come from black hole bounces, then the new universes will have some predictable set of values (i.e. values close to those of the parent universe) and not just random values.

In other words, there is no particular reason to expect new universe creation to occur through black holes; it is just that we hope it does, because if that is true it would be convenient for us (it would make our scientific theories testable)...

As I see it, in the analogy Smolin presents, one question is exactly how and where the actual "DNA" of physical law is stored, and how does it "copy"?
So if you look on the arXiv, this is the most recent thing Smolin's published about the natural selection idea. If you look at the overview he offers, he explains that the "DNA", if you want to call it that, of the universe would simply be its "dimensionless parameters", that is, things like the 20 parameters of the standard model.

The dimensionless parameters p_new of each new universe differ, on average, by a small random change from those of its immediate ancestor. Small here means small with respect to the change that would be required to significantly change F(p).
[F(p) is the universal 'fitness function']

He does not explain what mechanism causes the parameters on the other side of a black hole bounce to vary, or what constrains that variance to be "small". The above paragraph is offered as one of three "hypotheses" that the cosmological NS theory depends on; in other words, he does not include an explanation for how the "copying with errors" of the dimensionless constants occurs. He simply asserts it as an assumption of the theory, for which some other, external theory would have to provide an explanation. He later discusses the question of finding that explanation:

The hypothesis that the parameters p change, on average by small random amounts, should be ultimately grounded in fundamental physics. We note that this is compatible with string theory, in the sense that there are a great many string vacua, which likely populate the space of low energy parameters well. It is plausible that when a region of the universe is squeezed to Planck densities and heated to Planck temperatures, phase transitions may occur leading to a transition from one string vacua to another. But there have so far been no detailed studies of these processes which would check the hypothesis that the change in each generation is small.

One study of a bouncing cosmology, in quantum gravity, also lends support to the hypothesis that the parameters change in each bounce[48].
[48] is this paper.
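To make the two quoted hypotheses concrete, here is a minimal simulation of my own construction (not from Smolin's paper; the fitness function is invented): universes reproduce in proportion to F(p), and each offspring's parameter mutates by a small random amount.

```python
import numpy as np

rng = np.random.default_rng(0)

def F(p):
    # Toy fitness: black holes produced per universe, peaked at p = 0.5.
    # Invented for illustration; it is not Smolin's actual F(p).
    return np.exp(-50 * (p - 0.5) ** 2)

# A population of universes with (initially random) dimensionless parameters.
pop = rng.uniform(0, 1, size=200)

for generation in range(100):
    # Hypothesis 1: universes reproduce in proportion to their fitness F(p).
    weights = F(pop)
    parents = rng.choice(pop, size=pop.size, p=weights / weights.sum())
    # Hypothesis 2: each offspring differs by a small random change.
    pop = parents + rng.normal(0.0, 0.01, size=pop.size)

# After some evolution, a randomly picked universe is likely to sit near a
# fitness peak - the selection effect the whole conjecture rests on.
print("mean p:", pop.mean(), " spread:", pop.std())
```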
 
  • #7
grosquet said:
Fredrik:
It would be interesting to read your comments on the Scientific American article, "Was Einstein Wrong?: A Quantum Threat to Special Relativity" at http://www.sciam.com/article.cfm?id=was-einstein-wrong-about-relativity

That seems only very remotely connected to the thread - it discusses SR and QM, and doesn't relate to the evolution of law?

I can see some remote links to the physical basis of the wavefunction and evolution, but that's not discussed in that paper. I expect SR and GR to be emergent from a not-yet-known deeper principle. Those ideas are "radical" enough that a parallel discussion of the problems within the original EPR context is moderately interesting IMO.

Did you see any connections between that paper and evolution?

/Fredrik
 
  • #8
Fra said:
Anyway, I thought I captured the falsification point with
"after some evolution, a randomly picked universe is likely to be somewhat selected for its reproductive fitness"

You are right. Your paraphrase was clear on that point. I didn't read carefully enough.
I always want to emphasize that point, because empirical testing is so essential, as the contrast that Coin draws illustrates.
 
  • #9
Thanks for your comments Coin.

Coin said:
I think the motivation for Smolin's natural selection cosmology is best understood when you contrast it with other multiverse scenarios, such as eternal inflation. Most multiverse scenarios conclude that universes are being created all the time with random properties. If this is true, then we would expect our universe to have random properties. This is not really a prediction. What Smolin was trying to do was modify the multiverse scenario such that some kind of universe would be preferred over some kind of other universe in the distribution of produced universes, so that our expectations if we assume a multiverse will be in some way different than our expectations if we do not.

Perhaps I'm just coming from a different line of reasoning. I haven't ever considered choosing between different multiverse theories. I was attracted to the evolution idea, not the multiverse part (that's the part I don't like). The notion of a multiverse often implies some kind of strange external view. I'm trying to find an intrinsic view.

(My point/hope here was that I saw a way to implement an evolutionary idea, but toss the multiverse talk, and combine it with a more explicit idea of how the DNA itself evolved.)

To me the evolution concept has a specific purpose - it's the solution to the problem of infinite regress appearing in the problem of induction. Or rather, the infinite regress can be thought of as an ongoing computation, and rather than seeing this as an obstacle to analysis, I give it a physical interpretation and conjecture that it's simply the evolution we also call time evolution in the short perspective.

The rough idea I was after is that observers' "opinions" are manifestations of physical law, and that their evolution - selection for the microstructure of observers in the population - is a sort of evolving DNA of physical law that goes on all the time, with the selection mechanism provided by the local environment. A system with a totally twisted "opinion" of physical law would not survive; it would have a very short lifetime.

So there would be a collective self-stabilisation, and once this is worked out, falsification should be possible in that this logic alone should have some preferred first emergent structures. Ideally these structures and their interactions would correspond to the standard model we know. If different structures were predicted, the idea would be unlikely to be true.

Coin said:
So given this goal, the function of the black hole "bounces" is to provide a mechanism for certain kinds of universes outnumbering the other universes. If new universes come from black hole bounces, then the new universes will have some predictable set of values (i.e. values close to those of the parent universe) and not just random values.

As I picture the alternative, the source of possible variation is simply uncertainty. Variation is made around the expectation (the "parent"), so that very large mutations are unlikely. So it would not be random. Any structure that isn't fit to maintain stability in its interaction with the environment will eventually be destroyed.

Instead of spawning baby universes, I picture that populating the universe with observers that share one's opinion of physical law is the way to "propagate the DNA". In a way, one system induces its DNA of physical law into its own environment simply by acting as per a particular logic (coded by the DNA), and this puts a selective pressure on the environment to negotiate. So the selection is a mutual pressure between two interacting systems.

Another analogy is that, simply by interacting and talking to you right now, we are exerting a selective pressure on each other. As I see it, each communication is a kind of subtle negotiation, whether we see it or not.

This could be combined with ideas about the origin of inertia, since as I like to see it, the information capacity of an observer is closely related to its inertia. If a "gene", if we put it like that, has a certain "count" in its favour, that count is the inertia of the gene. But since there is limited capacity, it competes with other genes: a gene that isn't reinforced will eventually be lost. New genes appear by unpredictable mutations, and once a fortunate mutation appears, it's likely to preserve itself.
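A crude toy of that picture (entirely my own sketch of the informal idea; the capacity, rates, and gene names are arbitrary): a fixed-capacity memory holds per-gene counts, reinforced genes accumulate "inertia", unreinforced genes decay and vanish, and rare mutations introduce new genes.

```python
import random

random.seed(1)
CAPACITY = 100  # total count the memory can hold: limited information capacity

# gene -> count; the count is the "inertia" of the gene.
genes = {"A": 50, "B": 50}

def reinforce(genes, observed):
    """Feedback from the environment: the observed gene gains a count."""
    genes[observed] = genes.get(observed, 0) + 1
    # Enforce the capacity: genes compete for the limited counts.
    while sum(genes.values()) > CAPACITY:
        # Weakly supported (low-inertia) genes are the likeliest to lose counts.
        loser = random.choices(list(genes), weights=[1 / c for c in genes.values()])[0]
        genes[loser] -= 1
        if genes[loser] == 0:
            del genes[loser]  # a gene that isn't reinforced is eventually lost

for step in range(500):
    if random.random() < 0.02:
        reinforce(genes, "C")  # rare mutation: a new gene appears
    else:
        reinforce(genes, "A")  # the environment keeps reinforcing A

# A should dominate, B should have decayed, and C survives only if its
# occasional reinforcement outpaces the competition.
print(genes)
```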

All that might not be readable, but I was hoping that Smolin himself, or someone else inspired by the evolutionary idea (not by the multiverse thing), had developed this. This is pretty much what I am slowly trying to work on myself, and the general reasoning with evolution - variation and selection - is so close to what I think is the right way, for other reasons, that I couldn't help hoping for more.

I think Smolin's reasoning is interesting. But in order to take it further, I think the origin of what I'd like to call the microstructure of the DNA (the parameter space itself of the standard model) should also be described by the same logic. Otherwise the variation and selection only take place on a fixed DNA placeholder structure/parameter space, so to speak, which shouldn't be necessary if the essence of the evolutionary idea is taken all the way.

Perhaps I'll have to keep an eye out for this in upcoming papers.

/Fredrik
 
  • #10
Fra said:
selection for the microstructure of observers in the population is a sort of evolving DNA of physical law that goes on all the time, with the selection mechanism provided by the local environment. A system with a totally twisted "opinion" of physical law would not survive; it would have a very short lifetime.

In this view, the problem is then of course how to make a prediction. What prevents any environment from simply preserving itself and thus being equally likely, as per some anthropic reasoning?

Here I pictured the trick being to consider how our current state of the universe evolved from a state with no observers (we might as well call it the big bang; whether that state was unique I am not sure, but I am not even sure it matters - different initial conditions might possibly evolve into the same equilibrium state). Then all the intrinsic pictures are very simple, simply because every intrinsic picture is constrained by the complexity of the observers, and with no observers yet, the complexity of observed physical law is strongly constrained (this is why I talked about constructing intrinsic measures by scaling the complexity of the view in some past threads).

I have not worked this out yet, because many problems are related, but the VISION is that this trick of the SIMPLE, strongly intrinsic view that evolves into more complex views will come with a probability measure of what emergent structures are possible. The whole structure formation would be guided by this relational selective pressure driven by self-preservation. Maybe one can also call these resonance structures, as they would correspond to structures that, when duplicating themselves into the environment, produce a stable/consistent state that will not collapse. Then there would probably be a hierarchy of such structures, ordered by increasing complexity (probably corresponding to system inertia).

This should also have built in an idea of the origin of inertia, as a self-organising, self-preserving measure of its own environment.

If this works, it might very well be that Smolin's black hole picture is still right; this doesn't per se contradict that, but hopefully it would be a more constructive angle. I think there must be a deeper place to attach the evolutionary reasoning, along with the bounce idea.

/Fredrik
 
  • #11
Coin, the writings you dug up answer the questions about Smolin's reasoning. It seems he is himself looking for that deeper fundamental physics. Then it makes sense. Now we just need to find it.

Coin said:
the "DNA", if you want to call it that, of the universe would simply be its "dimensionless parameters", that is things like the 20 parameters of the standard model.
...

But then he basically has a "microstructure of the DNA" and ponders evolution of its microstate (parameter settings); but if you really take the evolutionary idea seriously, one would expect the microstructure itself to also be explained.

I guess a very good motivation for this is still that doing so seems very difficult, and meanwhile you at least get started. I think it's just that when you get a part, you want it all :)


Coin said:
He does not explain what mechanism causes the parameters on the other side of a black hole bounce to vary, or what constrains that variance to be "small".

My expectation, and a personal hypothesis that I think is in line with the logic, is that the mechanism keeping the variance small is related to inertia. Each parameter has an inertia. But for that to even make sense, a model is needed for the physical basis of this. Variation is unavoidable, due to uncertainty, but at the same time a certain stability should be guaranteed by the inertia of the entire construct.

To understand this, I expect no lesser prerequisite than understanding the origin of inertia.

Coin said:
The hypothesis that the parameters p change, on average by small random amounts, should be ultimately grounded in fundamental physics. We note that this is compatible with string theory, in the sense that there are a great many string vacua, which likely populate the space of low energy parameters well.

Even though I see this, the prime problem of string theory, from the point of view of an evolutionary reasoning, is that the strings themselves are a fixed microstructure that is put in as a conjecture. I think this leap in reasoning is possibly why you end up with a landscape. There is no way I can see how a string qualifies as a fundamental starting point. Strings may still play a role as I see it; there is a decent chance that string-like structures will emerge as primordial self-preserving structures way down the complexity scale. The reason why I fail to consider strings fundamental is that they are pictured as continuum structures, relating to an external spacetime (part of the microstructure).

If instead strings could be understood to emerge from a deeper theory, I think the landscape would also narrow.

/Fredrik
 
  • #12
grosquet said:
Fredrik:
It would be interesting to read your comments on the Scientific American article, "Was Einstein Wrong?: A Quantum Threat to Special Relativity" at http://www.sciam.com/article.cfm?id=was-einstein-wrong-about-relativity

Very good article.

In my humble opinion, one can forget about the last two proposals (Tumulka and Albert). But this is, of course, not a criticism of the article, just a personal preference. And my preference is quite a simple one: I prefer the conceptually simplest theory, which is clearly the de Broglie-Bohm pilot wave theory. That's not because I'm very much prejudiced in favour of classical determinism - if there were good evidence, I would throw it away without bothering much. But there simply is no good evidence.

Thank you for the link.

Ilja
 
  • #13
Ilja said:
And my preference is quite a simple one: I prefer the conceptually simplest theory

I have a similar preference. But it seems to me that simplicity is relative. I have never yet seen a universal measure of simplicity :)

To me, simplicity means a low degree of speculation. The simplest possible theory, to me, is the one which adds a minimum of assumptions and speculations. This relates simplicity to measures of the certainty of information.

In that setting, classical physics is fairly speculative. The whole assumption of the existence of absolute references is to me quite speculative. When you remove the speculations, you also lose definiteness, but the theory gets simpler in the sense of being less speculative. This is how probabilistic models could be less definite, but also "simpler" in my mind.

Edit: this is the sense in which I personally think the various quantum weirdness, and actions based on expectations and the information at hand rather than some realism, are simple. It is somehow more natural. A system's "simplest or smallest action" seems to be the one that adds a minimum of speculation, and is thus by construction the most probable one.

/Fredrik
 
  • #14
Fredrik –

I think what you’re suggesting makes sense, or at least it sounds a bit like a train of thought I’ve been pursuing, in connection with Carlo Rovelli’s Relational Quantum Mechanics. (I think Lee Smolin might also have been involved in developing that approach to QM, but he doesn’t seem to have made a connection between that and his idea of an evolutionary explanation for fundamental physics.)

I think the key issue is whether we can understand the basic functionality of “observing” or “measuring” as the kind of thing that can evolve. In biology, the basic functionality is that of organisms making multiple copies of themselves. Since we know evolution is possible on that basis, Smolin supposes that universes might also evolve by making copies of themselves. But I agree with you that we can imagine an evolutionary process that’s much closer to home, connected with “the ordinary flow of time” in this universe.

In Relational QM, as you know, any local physical system counts as an “observer”. The fundamental physical process is the measurement of one system S by another system O, or equivalently, the communication of information between the two systems. In any case, the result is an agreement between S and O on certain information (“the state of S”) that both S and O can then potentially communicate on to other systems.

The thing is, for something like this to happen, it’s not enough that S and O just interact. Most interaction between things remains merely “virtual”, and communicates no definite information. For O to determine something about S, O has to be “set up” in a certain way – so its interaction with S can have a definite outcome, i.e. make some specific difference to O’s subsequent interactions with the world. Likewise S has to be ”prepared” in such a way that an interaction with O can leave it in a specific state, that makes a difference to S’s subsequent interactions.

So we can envision the world as something like a web of interactions that communicate information between “measurement situations”. In order for a system to be “set up” to measure something, or “prepared” to be measured, there has to be other background information in its environment, communicated to it from previous measurement events.

The information communicated in each interaction would be merely random. But as you suggest, only information that turns out to be consistent with other information could help make a coherent background-context for future measurements. What we call “the flow of time” would have to do with the way information determined in one situation gets passed on as background information to “set up” other measurement situations.

Is that the sort of thing you have in mind? If we’re thinking about a web of interactions that define information in the context of other information, it seems reasonable that certain basic structures might evolve as a kind of “DNA” – i.e. base-level information that has to be agreed on in every interaction, in order for it to constitute a measurement that contributes to the coherent body of “the real world.”

Conrad
 
  • #15
Hello Conrad, and welcome to the forum! It looks like you connect unexpectedly well, given that this seems to be your first post :)

ConradDJ said:
In any case, the result is an agreement between S and O on certain information (“the state of S”) that both S and O can then potentially communicate on to other systems.

Emergent agreements are exactly how I think of it too, and the very process of negotiation is the physical interaction itself. So by trying to understand, in the general case, the logic of negotiation and feedback due to mutual selection pressure arising from disagreement, we can hopefully also understand the deep information basis of physical interactions.

The early parts of the reasoning in Rovelli's RQM are brilliant IMHO. However, I do not like how he develops it, in particular how he explicitly avoids the physical basis of probability; nevertheless his initial reasoning makes it one of the papers most worth reading. A brilliant philosophical beginning that IMO still awaits an equally brilliant completion.

ConradDJ said:
The thing is, for something like this to happen, it’s not enough that S and O just interact. Most interaction between things remains merely “virtual”, and communicates no definite information. For O to determine something about S, O has to be “set up” in a certain way – so its interaction with S can have a definite outcome, i.e. make some specific difference to O’s subsequent interactions with the world. Likewise S has to be ”prepared” in such a way that an interaction with O can leave it in a specific state, that makes a difference to S’s subsequent interactions.

So we can envision the world as something like a web of interactions that communicate information between “measurement situations”. In order for a system to be “set up” to measure something, or “prepared” to be measured, there has to be other background information in its environment, communicated to it from previous measurement events.

The information communicated in each interaction would be merely random. But as you suggest, only information that turns out to be consistent with other information could help make a coherent background-context for future measurements. What we call “the flow of time” would have to do with the way information determined in one situation gets passed on as background information to “set up” other measurement situations.

I think your reasoning is remarkably well tuned to mine, given that this is your first post here. What you write goes well in line with my own thinking.

Smolin's black hole stuff seems, in light of the reasoning you appear to share, to be a special case at best. I think time evolution itself must be seen in the same evolutionary sense. I don't doubt for a second that it's a viable way forward; it's so plausible and presents the most consistent way of reasoning I'm aware of. But it still contains many complex problems.

Are you aware of anyone who has published anything along this line of reasoning? I am somewhat puzzled why more progress hasn't been made along these lines (or why I haven't found it). Perhaps not enough people have paid attention to this in the past.

ConradDJ said:
Is that the sort of thing you have in mind? If we’re thinking about a web of interactions that define information in the context of other information, it seems reasonable that certain basic structures might evolve as a kind of “DNA” – i.e. base-level information that has to be agreed on in every interaction, in order for it to constitute a measurement that contributes to the coherent body of “the real world.”

Yes, something like that is definitely what I mean! Contextual information without universal references. The only common references existing are emergent from mutual interactions. It may seem mad, but as you seem to agree, there is a good chance that even such very background-independent logic may produce locally definite patterns of self-consistent structures.

/Fredrik
 
  • #16
ConradDJ said:
The information communicated in each interaction would be merely random. But as you suggest, only information that turns out to be consistent with other information could help make a coherent background-context for future measurements. What we call “the flow of time” would have to do with the way information determined in one situation gets passed on as background information to “set up” other measurement situations.

Is that the sort of thing you have in mind? If we’re thinking about a web of interactions that define information in the context of other information, it seems reasonable that certain basic structures might evolve as a kind of “DNA” – i.e. base-level information that has to be agreed on in every interaction, in order for it to constitute a measurement that contributes to the coherent body of “the real world.”

The problem used to be that in this fully contextual and relative thinking, you have nothing to start with. So where do you even start?

I've come to the conclusion that in principle any starting point would be valid, and one just applies the reasoning forward. But that gives me an infinite number of starting points, and it also probably makes the discovery process unnecessarily complicated. So I've decided to start at the zero end of the complexity scale. I.e., how does a very light observer interact? The exploitable point is that a simple observer cannot even encode arbitrarily complex interactions. So I start like that. I use as a qualifier of complexity the number of distinguishable states. There are two such state spaces: the internal space and the communication channel state. If I can find a logic for that: how does that entire logic scale as the observer acquires higher mass (complexity), and what does the mass acquisition process look like?

So I have at least 3 problems.

1. Understand the basic logic in the simplest cases.
2. Understand how this "logic" scales with mass; certainly new complex interactions and structures will emerge.
3. Understand how mass is formed (or as I think of it, how confidence or inertia is formed; how can a light observer grow massive?).

I have some hints of ideas on all of these, but the problem is that they are related, so I am trying to make parallel progress. The exact scaling of the logic is intimately related to the origin (growth) of mass. In fact I think those two points are almost one and the same.

About time: I picture time as a random walk, guided by a constantly evolving internal map in the observer. Mass acquisition corresponds to a more "massive" and thus more confident map. A light map has poor resolution and is more frequently wrong (less reliable).

So far my starting points have been combinatorial, and the selective pressure and evolution are to be understood through how such constructed structures are either supported or destroyed in interactions; structures that encode the wrong logic or picture will not be stable in the environment. That's the logic of the selection. So I think of the DNA as the "logic" that constructs the microstructure and its evolution.
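A minimal sketch of the "random walk guided by an evolving internal map" picture (my own toy, under the assumption that the map is just a frequency count over distinguishable states; the environment's bias is invented): the observer records what it distinguishes, and its guesses grow more reliable as "mass" accumulates.

```python
import random

random.seed(2)

class Observer:
    """Toy observer: a bounded internal map over distinguishable states."""

    def __init__(self, n_states):
        self.counts = [1] * n_states  # internal map: one count per state

    @property
    def mass(self):
        # "Mass" as total accumulated count: the confidence/inertia of the map.
        return sum(self.counts)

    def step(self):
        # Random walk guided by the map: guess the next state from the counts.
        return random.choices(range(len(self.counts)), weights=self.counts)[0]

    def update(self, observed):
        self.counts[observed] += 1  # reinforce what was actually distinguished

env = lambda: random.choices([0, 1, 2], weights=[6, 3, 1])[0]  # biased environment

obs = Observer(n_states=3)
hits = []
for _ in range(2000):
    e = env()
    hits.append(obs.step() == e)  # did the guided walk match the world?
    obs.update(e)

# A light map has poor resolution and is frequently wrong; as mass grows,
# the map becomes more confident and more reliable.
print("early hit rate:", sum(hits[:200]) / 200)
print("late hit rate: ", sum(hits[-200:]) / 200)
print("final mass:", obs.mass, "map:", [round(c / obs.mass, 3) for c in obs.counts])
```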

/Fredrik
 
  • #17
Fra said:
I have a similar preference. But it seems to me that simplicity is relative. I have never yet seen a universal measure of simplicity :)

To me, simplicity means a low degree of speculation. The simplest possible theory, to me, is the one which adds a minimum of assumptions and speculations. This relates simplicity to measures of the certainty of information.

I would clearly disagree. The certainty of a theory has nothing to do with simplicity. These are simply very, very different things.
 
  • #18
My guess:
LQG is wrong, because spin networks are a correct idea, but the 'quantisation' at the Planck scale is wrong and superficial.
Superficial because one unexplained phenomenon is explained in terms of another unexplained phenomenon.
Relativity doesn't work with 'nodes' of spacetime. This is due to certain angles that shift timelike and spacelike behaviour into each other. That doesn't work with nodes, because an angle in spacetime is a velocity in respect to an observer. Since we can't have 'steps in velocity', we can't have nodes of spacetime.
And: the observed sky is crystal-clear for billions of light years; any discrete structure of spacetime would blur the images of distant stars a bit, which isn't observed.
 
  • #19
thomheg said:
My guess:
LQG is wrong, because spin networks are a correct idea, but the 'quantisation' at the Planck scale is wrong and superficial.
Superficial because one unexplained phenomenon is explained in terms of another unexplained phenomenon.
Relativity doesn't work with 'nodes' of spacetime. This is due to certain angles that shift timelike and spacelike behaviour into each other. That doesn't work with nodes, because an...

I started to read up on Rovelli's LQG thinking, and even though I am very impressed by his initial analysis and reflections, when he later moves on to formulating the new ideas, I don't think they follow the same reasoning I smelled in some of his more reflective papers.

In my mind, the relativity transformations should be emergent along with spacetime, as systems interact and evolve. The most serious objection is that while he has interesting interpretations in the RQM paper, in the end it effectively still ends up being the same quantum formalism. I personally felt that his view should imply and suggest a modification of the formalism itself.

These are some very basic reasons why I think LQG, as I understand it, isn't ambitious enough. I sense a higher ambition in the early parts of Rovelli's reasoning; that's what was surprising to me.

At first I thought his spin networks were best interpreted as the same "microstructures" I have been pondering, but I soon realized that while that might be a possibility, it's not what Rovelli is doing; thus I had no help from his formalism. I had to go back to where, in my opinion, it went wrong.

His reasoning that the only way to compare observations is by observer-observer communication is great, and that such interactions boil down to nothing but normal physical interactions is also great. But then he concludes that "these interactions must be treated quantum mechanically" and throws in the same old QM. That's the step that's a gigantic leap to me. My conclusion from that point would instead be that the quantum mechanical formalism we know is also just emergent. And the emergence process is an evolutionary one.

Somehow that's where my reasoning starts off; I was expecting a different turn to the good reasoning he started.

/Fredrik
 
  • #20
Fra said:
At first I thought his spin networks were best interpreted as the same "microstructures" I have been pondering, but I soon realized that while that might be a possibility, it's not what Rovelli is doing; thus I had no help from his formalism. I had to go back to where, in my opinion, it went wrong.

His reasoning that the only way to compare observations is by observer-observer communication is great, and that such interactions boil down to nothing but normal physical interactions is also great. But then he concludes that "these interactions must be treated quantum mechanically" and throws in the same old QM.

/Fredrik
In my own model I use something that you could call a spin network. But it's built on a smooth spacetime with a very simple model. Since I think this is possible, I would think LQG is wrong. QM is based on linear algebra, observations and real space. Now think about a kind of space with geometric relations of a multiplicative kind, based on imaginary numbers. That's called geometric algebra and is related to quaternions.
What we call matter consists of timelike stable structures within that. A field is the connection over a timelike hypersheet. That looks static, because that sheet is defined this way.
An electron in this model is a full turn. Since it is anti-symmetric, it requires two rounds for a return. Now left and right are different, too, so a structure emerges with two types of electrons.
This structure could be shifted, which makes electrons radiate, since they expose the aspect of rotation to a distant observer.
 
  • #21
thomheg said:
In my own model I use something that you could call a spin network. But it's built on a smooth spacetime with a very simple model. Since I think this is possible, I would think LQG is wrong. QM is based on linear algebra, observations and real space. Now think about a kind of space with geometric relations of a multiplicative kind, based on imaginary numbers. That's called geometric algebra and is related to quaternions.
What we call matter consists of timelike stable structures within that. A field is the connection over a timelike hypersheet. That looks static, because that sheet is defined this way.
An electron in this model is a full turn. Since it is anti-symmetric, it requires two rounds for a return. Now left and right are different, too, so a structure emerges with two types of electrons.
This structure could be shifted, which makes electrons radiate, since they expose the aspect of rotation to a distant observer.

I don't know anything about your model, but I wonder whether your model is expected to predict, for example, particle masses from your choice of first principles?

I personally consider mass generation to be one major challenge: to understand how and why there is a preferred spectrum of frequently observed small structures (particles), and why they have their specific masses.

/Fredrik
 
  • #22
Fra said:
I don't know anything about your model, but I wonder whether your model is expected to predict, for example, particle masses from your choice of first principles?
Look at this https://www.physicsforums.com/showthread.php?t=293626
What do we call 'mass'? It is related to the tendency of matter to keep a certain direction of motion (inertia), and to gravity. Matter is what is more or less stable. Now think about a gyroscope, which keeps its orientation. Think about a rotation in a spacetime with three imaginary axes and time as a scalar. Then any object has this property of rotation that it tries to keep. A velocity is an angle in this model. Any such object tries to keep its velocity in respect to a distant observer.
Mass is related to the 'speed' of this rotation. The more massive it is, the more it stays with the observer and the less it wants to leave its orientation. Such a structure is a very narrow light-cone, which I call 'the mass term'. That is related to a very flat 'radiation term'.
For a massless particle, spacelike and timelike intervals are equal, and that generates a light-cone for empty space. Since it is light that defines our usual euclidean space, that space is empty, too.
 
  • #23
Fra said:
I started to read up on Rovelli's LQG thinking, and even though I am very impressed by his initial analysis and reflections, when he later moves on to formulating the new ideas, I don't think they follow the same reasoning I smelled in some of his more reflective papers.

...The most serious objection is that while he has interesting interpretations in the RQM paper, in the end it effectively still ends up being the same quantum formalism. I personally felt that his view should imply and suggest a modification of the formalism itself.

Fredrik -- Many thanks for the warm welcome!

I also have the sense that there are deep implications to Relational QM that Rovelli hasn’t tried to work out. But I like it that he doesn’t want to modify or extend the quantum formalism. I agree with him that the difficulty is to see what QM is saying about the world, because we’re still coming from a point of view where it makes sense to talk about the “state” or properties of a system without thinking about the context of physical interaction in which those things make a difference and can be observed.

For me, the limitation of Rovelli’s RQM is that he doesn’t try to analyze what’s involved in “physical systems giving descriptions of other physical systems.” He says, “Information is exchanged via physical interactions” – clearly true – but then he goes on, “The actual process through which information is collected and perhaps stored is not of particular interest here, but can be physically described in any specific instance.” So he stays at the level of abstraction of the QM formalism, and doesn’t dig down to the specifics of what it takes to make this measurement / communication business work.

The thing is, asking how measurement works in physics is like asking how reproduction works in biology. It happens in a lot of different ways, and none of them are simple. And even though we know that self–replicating systems must have evolved out of some very primitive original form, envisioning what that may have looked like is really hard. But, we don’t have to have a good picture of how life began in order to understand how it evolved. So maybe the situation could be similar in physics.

Obviously Darwin wasn’t the first to realize that biological organisms reproduce themselves. But he was the first to realize what that simple fact means. If something can make copies of itself, and the copies can make copies, you can fill up a planet with millions of species of extremely complex life-forms, just by accident. Even the first self-replicating systems of molecules must have been fairly complicated. But by your definition of simplicity as “low degree of speculation”, Darwinian evolution is amazingly simple.

In physics, we all know that things are observable and measurable - i.e. that the physical world communicates information about itself. The problem is, this is so obvious we take it for granted, and hardly think about what it means. Has anyone ever tried to invent a model where all the parameters are "measured" by other parameters of the model?

Anyhow, though I’m fascinated by models of the “basic logic” of physics, my understanding of QM and Relativity is pretty rudimentary... so I’m not in a position to grasp the pros and cons of quantum gravity theory. I’m just trying to understand what kind of structure could provide “measurement situations” that actually define and determine all its own structural elements. It seems as though that might be what the QM formalism is telling us – i.e. that there are no determinate elements of physical reality that are not actually being measured and communicated through the web of interaction-events. It seems as though a number of different basic structures might be needed, that provide measurement-contexts for each other.

Conrad
 
  • #24
ConradDJ said:
But I like it that he doesn’t want to modify or extend the quantum formalism.

Here we disagree on one point :) My personal opinion and view is that the extension of his reasoning (or rather my reasoning, which I assume is almost the same as his up to the point where I lose him) demands a modification of the QM formalism.

I think he takes GR too seriously. This requires him to compromise the logic he previously held high.

I rather see the GR structure and actions as emergent, not fundamental. And what I find so paradoxical is that I think the extension of his own reasoning (with some twists) can produce it.

As I see it, in general, the various symmetries that define the observer-observer transformations we know from various theories, for example GR, are the result of evolving observers' negotiations. If we can understand the evolutionary mechanism which selects these symmetries, then I think we are close to home. And hopefully not only GR, but also the symmetries of the standard model, should be explained fully analogously to that. There is no need to appeal to the "mathematical beauty" of certain symmetry groups; I think the groups are there for very special reasons, not by coincidence. It's this logic I'm after. I think we see hints of it.

ConradDJ said:
He says, “Information is exchanged via physical interactions” – clearly true – but then he goes on, “The actual process through which information is collected and perhaps stored is not of particular interest here

Your points of objection are very similar to mine. Unless you've studied physics, I'm curious what your source of intuition is - biology? I have to say, though, that I do not know whether he, with that statement, refers to the somewhat limited scope of the RQM paper - in which case he does have a point. But clearly, in the general full QG case, the actual process whereby information is collected and stored is of key interest IMHO.

/Fredrik
 
  • #25
Fra said:
There is no need to appeal to the "mathematical beauty" of certain symmetry groups; I think the groups are there for very special reasons, not by coincidence. It's this logic I'm after. I think we see hints of it.

Unless you've studied physics, I'm curious what your source of intuition is - biology? I have to say, though, that I do not know whether he, with that statement, refers to the somewhat limited scope of the RQM paper - in which case he does have a point. But clearly, in the general full QG case, the actual process whereby information is collected and stored is of key interest IMHO.

/Fredrik

My background is in philosophy, but I’ve been thinking about foundational issues in physics for longer than I care to admit. And I agree with you about the scope of the RQM paper. Rovelli was trying to stay as close as possible to the established framework, to get his main point across, and I think he did a fine job. But it seems that to go further and see where this orientation leads, we need to think about what’s involved in the functionality of physical measurement / communication.

Your post points to one reason why this hasn’t been much explored – it’s been taken for granted for centuries that physics has a mathematical basis, and that if we can even ask the question why that structure is whatever it is, the answer can only be mathematical. Back in 1933 Einstein said “Our experience hitherto justifies us in the belief that nature is the realization of the simplest conceivable mathematical ideas.” There’s an amazing irony in that, given how vastly complicated base-level physics has since become.

The alternative is to understand the universe as some sort of functional system – it is the way it is, because that’s what works. Clearly this makes sense to you, but of course it’s very different from the traditional orientation of physics. That’s what made Smolin’s book on the Life of the Cosmos so remarkable, that he was actually taking this thought seriously.

Since I can’t contribute at the level of technical discussions, I’ve tried to focus on the question about what kinds of functionality can conceivably evolve. Smolin still thinks in terms of self-replication. What got me excited about Rovelli’s paper – besides his basic premise about giving up on “the state of a system” as real in and of itself – was that in RQM, the measurement of S by O and the communication of the result from O to O’ are treated as the same process. To determine something (“collapse of the wave function”) is to communicate it, and vice-versa.

Just copying information almost doesn’t happen in the physical world – though we humans have figured out lots of ways to make it happen. But physical things don’t make copies of themselves, in general, which is why the origin of life is so hard to envision – and why Smolin’s idea about black holes seems so speculative. But this thing of measuring / communicating information occurs constantly, in every type of physical interaction.

We used to think of measurement as just copying data about something into a standard format, as when we measure the length of a stick with a ruler. And we still tend to think of communication as a matter of copying data from one person’s brain to another. But I think QM shows us that something much more complex and interesting is going on... because there’s always the question of how the information gets defined, how it gets to mean something specific, on both sides of the communication. Just to know whether or not the information got across, always takes further communication.

Anyway, I agree with you that the mathematical structures of QM and Relativity aren’t just “given” but reflect whatever’s going on at a deeper level.

Conrad
 
  • #26
Fredrik --

In answer to your original question... I came across this paper by Zurek on Quantum Darwinism -- relates to his work on decoherence. Maybe this overlaps with your ideas? He seems to be thinking about evolution through replicating information, rather than (as I was trying to suggest above) an evolution of communication between "measurement situations." Maybe it comes to the same thing after all.

http://arxiv.org/PS_cache/quant-ph/pdf/0308/0308163v1.pdf

He has more recent papers but this one seems to be most to the point. He says:

"This view of the emergence of the classical can be regarded as (a Darwinian) natural selection of the preferred states. Thus, (evolutionary) fitness of the state is defined both by its ability to survive intact in spite of the immersion in the environment (i.e., environment-induced superselection is still important) but also by its propensity to create offspring – copies of the information describing the state of the system in that environment."

Conrad
 
  • #27
ConradDJ said:
Fredrik --

In answer to your original question... I came across this paper by Zurek on Quantum Darwinism -- relates to his work on decoherence. Maybe this overlaps with your ideas? He seems to be thinking about evolution through replicating information, rather than (as I was trying to suggest above) an evolution of communication between "measurement situations." Maybe it comes to the same thing after all.

http://arxiv.org/PS_cache/quant-ph/pdf/0308/0308163v1.pdf

He has more recent papers but this one seems to be most to the point. He says:

"This view of the emergence of the classical can be regarded as (a Darwinian) natural selection of the preferred states. Thus, (evolutionary) fitness of the state is defined both by its ability to survive intact in spite of the immersion in the environment (i.e., environment-induced superselection is still important) but also by its propensity to create offspring – copies of the information describing the state of the system in that environment."

Conrad


It would seem to me that information about a physical system is secondary to the foundational physics. A distribution which contains the information can only be foundational if it is introduced in only one form for fundamental reasons. What is the most fundamental distribution and why would it be introduced?
 
  • #28
ConradDJ said:
Fredrik --

In answer to your original question... I came across this paper by Zurek on Quantum Darwinism -- relates to his work on decoherence. Maybe this overlaps with your ideas? He seems to be thinking about evolution through replicating information, rather than (as I was trying to suggest above) an evolution of communication between "measurement situations." Maybe it comes to the same thing after all.

http://arxiv.org/PS_cache/quant-ph/pdf/0308/0308163v1.pdf

He has more recent papers but this one seems to be most to the point. He says:

"This view of the emergence of the classical can be regarded as (a Darwinian) natural selection of the preferred states. Thus, (evolutionary) fitness of the state is defined both by its ability to survive intact in spite of the immersion in the environment (i.e., environment-induced superselection is still important) but also by its propensity to create offspring – copies of the information describing the state of the system in that environment."

Conrad

I've read some of Zurek's papers, and yes, there are some aspects of his reasoning that I share. One of my favourite Zurek quotes, and one of my overall favourites of all time, is

From his paper
"Decoherence, einselection, and the quantum origins of the classical"
"...What the observer knows is inseparable from what the observer is: The physical state of his memory implies his information about the Universe..."
-- http://arxiv.org/abs/quant-ph/0105127

The way I choose to interpret it, that is a very deep statement about the connection between the nature of information (whose exact notion is generally somewhat of an enigma) and the physical nature of reality. A kind of relation between ontology and epistemology.

However, in Zurek's analysis, which is in the context of decoherence, he still has an apparently different view of information than I do. Although there is no doubt that the environment exerts a selection on any system, the difference lies in the frog's vs. the bird's view.

I am all in on this idea of environmental selection, but there are different ways to picture it. From the inside view, the environment, as in the remainder of the universe, is simply the unknown; it's not a structured, definable "thing". Thus it cannot by itself be used to explain the emergence of stable subsystems. I think it's probably part of the explanation, but the biggest keys are still missing as I see it.

But I am not up to date on what his very latest papers treat.

/Fredrik
 
  • #29
friend said:
It would seem to me that information about a physical system is secondary to the foundational physics.

I personally disagree. I think they are strongly connected, though it's true that a satisfactory understanding of that connection is still missing. But I see enough bits and pieces to have confidence in that road forward.

friend said:
A distribution which contains the information can only be foundational if it is introduced in only one form for fundamental reasons. What is the most fundamental distribution and why would it be introduced?

One of the most fundamental starting points I have adopted is an observer-dependent notion of distinguishability. This I consider to be the basis of boolean states and of counting.

Then, if we can find a mechanism for emergent memory (which I think is possible), we can construct integers (countable state spaces), and in the large-N case we have effective continuum models. To me, the generation of memory structure is key to understanding the generation of mass.

This can be implemented as a kind of combinatorial basis for information, rather than continuum probability.
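To make the combinatorial point concrete, here is a toy sketch (just my own illustration, in Python, with made-up numbers; not anyone's published formalism): counting the distinguishable sequences over a finite, countable state space recovers the continuum Shannon measure in the large-N limit.

```python
import math

def log_multiplicity(counts):
    """Log of the multinomial coefficient N!/(n1!...nk!): the number of
    distinguishable sequences ("distinctions") with these counts."""
    N = sum(counts)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

def shannon_entropy(probs):
    """Continuum Shannon entropy in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A finite record of boolean distinctions held in "memory":
# n1 outcomes of one kind, N - n1 of the other.
for N in (10, 100, 10_000):
    n1 = (3 * N) // 4
    counts = [n1, N - n1]
    per_record = log_multiplicity(counts) / N
    print(N, round(per_record, 4), round(shannon_entropy([0.75, 0.25]), 4))

# As N grows, (1/N) * log(multiplicity) approaches H(0.75, 0.25): pure
# counting on a countable state space recovers the continuum measure.
```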

As I see it, ANY structure can be used to encode information. But the structure itself is the manifestation of the "prior" to which the information relates. So we really have information on information.

This circularity is why I think the evolutionary selection idea is the best. Any alternative implies relating to a fixed microstructure to define information, and that is as arbitrary as finding a unique prior probability distribution in Bayesian reasoning. To me, the observer is the physical manifestation of the prior, and it is evolving and subject to selection.

So there is no fundamental information, just evolving information on information, and I think the world we see is, from the point of view of evolved observers, selected for its stability.

To me, the deepest level of physical law would be to understand as large a part as possible of this logic of evolving information, manifested as interacting matter systems. To understand what matter is, and how it's built, would be the same as to understand what information is.

The usual definitions of information, such as Shannon entropy, are clearly not even in the ballpark of being sufficient here. They are very simple-minded as I see it. A truly evolving measure of information cannot make use of a fixed background microstructure, because the microstructure is itself hidden prior information. The key we need to understand, IMHO at least, is how the difficulty of finding a universal definition or measure of information IMPLIES interactions and evolution. But this does not mean we are on the wrong track, because look around us: we DO see interactions.
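To illustrate what I mean by the microstructure being hidden prior information (again, a toy of my own with made-up data, not a standard argument): the very same raw record gets a different Shannon value depending on which microstructure you presuppose when counting.

```python
import math
from collections import Counter

def shannon(labels):
    """Shannon entropy (nats) of a sequence of discrete labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log(c / total) for c in counts.values())

# The same raw record, "measured" against two different microstructures
# (discretizations). The data values are arbitrary, for illustration only.
data = [0.12, 0.48, 0.51, 0.55, 0.90, 0.95, 0.97, 0.99]
coarse = [int(x * 2) for x in data]    # a 2-cell microstructure
fine = [int(x * 10) for x in data]     # a 10-cell microstructure

print(shannon(coarse), shannon(fine))  # two different "information contents"
# The number is not a property of the record alone: it is relative to the
# presupposed state space, i.e. the fixed background microstructure.
```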

So the trick, IMO, is not to find the ultimate entropy formula. No such formula could make sense. I think we instead need to see the measure of information as evolving. The question then is to understand how a measure (a subsystem; an observer; a particle) responds to feedback from its environment, and how not only its state of information is revised, but also its memory hardware.
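As a speculative toy of what I mean here (entirely my own sketch, with made-up numbers; not a worked-out formalism): model the observer's state of information as a Bayesian posterior, and its memory hardware as the hypothesis space itself, revised much more rarely than the state.

```python
import random

def normalize(weights):
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

def revise_state(posterior, outcome):
    """Fast revision: ordinary Bayes rule for a Bernoulli outcome
    (1 with probability h, 0 with probability 1 - h)."""
    return normalize({h: p * (h if outcome else 1 - h)
                      for h, p in posterior.items()})

def revise_hardware(posterior):
    """Slow revision: split every hypothesis in two, enlarging the
    state space the observer can distinguish."""
    new = {}
    for h, p in posterior.items():
        for child in (round(h - 0.01, 4), round(h + 0.01, 4)):
            new[child] = new.get(child, 0.0) + p / 2
    return normalize(new)

random.seed(0)
true_p = 0.73                       # the environment's actual bias
posterior = normalize({0.25: 1.0, 0.50: 1.0, 0.75: 1.0})  # crude hardware

for step in range(1, 501):
    outcome = 1 if random.random() < true_p else 0
    posterior = revise_state(posterior, outcome)
    if step % 100 == 0:             # hardware has higher inertia: rare updates
        posterior = revise_hardware(posterior)

print(max(posterior, key=posterior.get))  # drifts toward true_p over time
```

The point of the toy is only the two timescales: the state is revised at every feedback event, the hardware only occasionally, and the hardware constrains what the state can even express.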

/Fredrik
 
  • #30
friend said:
It would seem to me that information about a physical system is secondary to the foundational physics.

In my picture, the information one system has about another system, and how that is described, is absolutely critical to the foundational physics of how those two systems interact. I think any system acts according to a logic implied by its information about its own environment. This information comprises both the explicit information and the implicit prior information manifested by the memory hardware.

However, I personally don't think the standard entanglement considerations are a satisfactory description of this. The problem is that they make use of arbitrarily chosen structures, so those measures are not intrinsic.

Ideally, the memory hardware should emerge from these evolutionary processes, and if this is correct, something interesting should pop out that matches what we have in the standard model.

/Fredrik
 
  • #31
Fra said:
In my picture, the information one system has about another system, and how that is described, is absolutely critical to the foundational physics of how those two systems interact. I think any system acts according to a logic implied by its information about its own environment. This information comprises both the explicit information and the implicit prior information manifested by the memory hardware.

However, I personally don't think the standard entanglement considerations are a satisfactory description of this. The problem is that they make use of arbitrarily chosen structures, so those measures are not intrinsic.

Ideally, the memory hardware should emerge from these evolutionary processes, and if this is correct, something interesting should pop out that matches what we have in the standard model.

/Fredrik

Can you hear yourself talk? You say "information about its own environment". When you say the word "about", you automatically imply that what it is "about" exists prior to calculating information about it. In this case you are talking about information about the memory hardware. This would make the memory hardware more fundamental than the information calculated from it.
 
  • #32
friend said:
Can you hear yourself talk? You say "information about its own environment". When you say the word "about", you automatically imply that what it is "about" exists prior to calculating information about it. In this case you are talking about information about the memory hardware. This would make the memory hardware more fundamental than the information calculated from it.

It's not easy to word this. My point is that the memory hardware and the information its microstructure encodes evolve together. One without the other doesn't make sense.

In a sense, though, I think it's fair to say that the memory hardware is more fundamental. But in my picture the memory hardware is only a condensed form of information that can evaporate, so it is not really fundamental.

The difficulty is to understand how the background (the memory structure) evolves along with the state living on the background.

The main distinguishing factor between the two phases of information (the microstate of a microstructure, and the microstructure itself) is that the inertia of the microstructure is much higher; that's why the background evolves much more slowly. In a way, I think of the microstructure (i.e. the memory hardware) as a very "massive microstate".

But it's true that whatever I say is still only relative to me and my personal reasoning. This is something we can never get around, but consensus in science still emerges; it has done so in the past and I'm sure it will keep doing so. After all, my reasoning is also continuously formed by interacting with my environment. So the flaws in my wordings (in the above sense) don't really contradict what I'm trying to convey. There are no foolproof arguments, and life is a game. I think the same applies to fundamental physics. But there seems to be a logic to why this wild game self-organizes and stable structures appear.

/Fredrik
 
  • #33
friend said:
When you say the word "about", you automatically imply that what it is "about" exists prior to calculating information about it.

But I mean that to find out what this "about" is, and to "calculate" information about it, are in my picture one and the same process. There are no fixed points. Only evolving and interrelated structures can be described, and only by similarly evolving descriptions.

But there is ALWAYS an uncertainty in the statement itself, and in the *measure* of information. It is not possible to make deductive calculations of information. All information is, IMO, to be seen as speculation, as in a game, and the game itself creates a selection among the players; this is how the memory structure is formed. The individual player doesn't know WHY their hardware is what it is; it is, however, the reference from which they play.

I think the uncertainty in this, which you probably identify, can be formalized and turned into an evolution, where the drive is to minimize the uncertainty and thus increase the stability of the observer. The difficulty of reaching consensus in this discussion is the flip side of the difficulty of finding universal static observers: they also evolve.
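As a minimal sketch of what I mean by the game creating a selection among the players (again entirely my own toy, with arbitrary parameters): let each player simply BE a prior about its environment, score players by how surprised they are, and let the least-surprised reproduce with variation.

```python
import math
import random

def surprise(prior, outcome):
    """Negative log-likelihood: the information cost a player pays
    for this outcome under its prior."""
    p = prior if outcome else 1.0 - prior
    return -math.log(p)

random.seed(1)
true_p = 0.8                                   # the environment's hidden bias
players = [random.uniform(0.01, 0.99) for _ in range(50)]  # priors = players

for generation in range(200):
    outcomes = [1 if random.random() < true_p else 0 for _ in range(20)]
    # The game: rank players by total surprise; the least surprised survive.
    scored = sorted(players,
                    key=lambda q: sum(surprise(q, o) for o in outcomes))
    survivors = scored[:25]
    # Offspring copy a survivor with a small variation, kept inside (0, 1).
    children = [min(max(random.choice(survivors) + random.gauss(0, 0.02),
                        0.01), 0.99) for _ in range(25)]
    players = survivors + children

print(sum(players) / len(players))  # the population drifts toward true_p
```

No individual player is told WHY its prior is what it is; the selection is implicit in the game, which is the point I'm after.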

I picture that there are some degrees of freedom of "distinguishability-bits" from which each picture is constructed. But unfortunately I don't think it makes sense to regard these degrees of freedom as fundamental; I think of them as observer-dependent. And the symmetry relating the different observers is probably bound to be 1) emergent, not fundamental, and 2) generally non-information-preserving, where the transformations are not to be seen as mathematical transformations but rather as physical processes involving time, themselves subject to selection.

/Fredrik
 
  • #34
friend said:
It would seem to me that information about a physical system is secondary to the foundational physics.

In physics as in ordinary language, we generally assume that if information is meaningful, it's because it accurately represents some given reality. So reality is always more fundamental.

But QM leads a lot of us to question whether this continues to be reasonable for the foundations of physics, because it's easier to understand QM as describing a structure of information than a structure that's real and well-defined "in itself."

To justify this -- well, none of us ever experiences reality "in itself". That doesn't mean there is no reality, but it probably does mean that information can be meaningful without reference to reality... i.e. by referring to other meaningful information (that refers to other information...). Because this is the fundamental nature of the world we actually experience.

That point of view could only make sense in physics, though, if we could show how and why a structure of inter-referential information might evolve to resemble a well-defined reality -- a tall order.

friend said:
A distribution which contains the information can only be foundational if it is introduced in only one form for fundamental reasons. What is the most fundamental distribution and why would it be introduced?

Again, a fundamental principle in physics is that whatever's foundational needs to be "in only one form" and hopefully fairly simple. I think Fra would agree, but I'm not sure I do, because I'm not sure any information can be meaningfully defined unless there are different kinds of information out there in the environment.

Conrad
 
  • #35
Fra said:
So there is no fundamental information, just evolving information on information, and I think the world we see is, from the point of view of evolved observers, selected for its stability.

To me, the deepest level of physical law would be to understand as large a part as possible of this logic of evolving information, manifested as interacting matter systems. To understand what matter is, and how it's built, would be the same as to understand what information is.

The usual definitions of information, such as Shannon entropy, are clearly not even in the ballpark of being sufficient here.

Well, I've already made clear I think you're on the right track, for whatever that's worth. But "selected for stability" doesn't sound quite right, maybe because to me that seems to imply a background time structure (even if not a time-metric). In biology, the evolutionary game is all about how to get complex systems to last through time (by copying them before they inevitably break down due to their complexity). But in physics, the problem seems to be different, namely how to define any information in a meaningful way in the first place -- how to get any information to make a difference to anything. Time maybe comes about only in this process.

The Shannon theory is of course important to physics because it's merely quantitative -- it abstracts from all questions about what information "means" (how it affects things, how it's measured), and that's good if physics doesn't yet have ways of dealing with those questions. But I agree with you entirely that we need an information theory that explains why and how information does what it does -- i.e. gets defined / determined and in turn provides a context that defines / determines other information.

On the other hand, the connection between stability of information and mass/inertia is intriguing...

Conrad
 
