Evolving Causality: Heuristic Point of View of Quantum Physics

  • Thread starter: ConradDJ
  • Tags: Causality
In summary, the thread suggests that the principle of causality is not fundamental to the way the world works: the "laws" that govern physical systems are a complex set of stochastic guidelines, within which causal, law-like behavior evolves over time as systems interact with one another.
  • #1
ConradDJ
This is an attempt at a “heuristic point of view” for quantum physics that treats it as an evolutionary process. I hope it’s clear that I’m not proposing any new theory, but only trying to bring out some implications that are as well-established as anything can be, in this murky field. Please let me know if any of this makes sense to you... but I hope this doesn’t turn into a philosophical debate about the relationship between physics and mathematics. I’m not trying to prove anything one way or the other about that – just using the question to open up what might be a fruitful way of looking at the strange combination of causality and indeterminacy in QM.


Before the arrival of QM, the one thing that seemed completely certain in physics was the principle of causality. By that I mean, everything that happens in the world has to happen exactly the way it does, because everything “obeys” precise mathematical equations.

This idea of a “deterministic” universe had at least one very basic problem, that I’ll discuss in a moment. But the first point I want to make is that this idea works. All of physics amounts to finding the equations that describe how things happen, and this is still true in QM. Its equations still describe the “causal evolution” of the wave function, even though the wave function itself gives only a probability distribution for the actual results of an observation.

Before QM, this notion that things “obey” precise mathematical laws could just be taken for granted. But if we take the evidence of QM at face value – if we accept that at the most basic level interactions don’t “obey” equations, but happen at random – then this opens up a new question. It’s not just a matter of “giving up causality” – because we know that to an extremely good approximation, the world is causally predictable. The question is – where does this causality come from? How and why does a world based on random interaction evolve into something that looks as deterministic as it possibly can?

This is not a philosophical question, in my opinion; we’re not going to get the answer from logic or metaphysics. But if we take this seriously as a question for physics, we can find some fairly clear indications in QM about where to look for the answer.


First – the very basic problem with determinism in physics is that mathematics is nowhere near powerful enough to run a universe. As I’m sure we all know, the equations for even quite simple hypothetical systems – take three point-masses interacting via Newtonian gravity, in flat Euclidean space – generally have no analytic solution. So we may talk about things “obeying equations” in the real world, but that’s just shorthand for a situation we haven’t really figured out how to describe. The evolution over time of any actual physical system is far more precisely “lawful” than anything we can ever expect from mathematics.
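To make this concrete, here is a minimal numerical sketch (my own toy example, with arbitrary units, masses and initial conditions): the Newtonian three-body equations are easy to write down, but in general all we can do is step them forward approximately.

```python
# Toy Newtonian three-body integrator (illustrative only; arbitrary units and
# made-up initial conditions). The equations are exact, but there is no general
# closed-form solution -- we can only approximate, step by step.
import numpy as np

G = 1.0                                    # gravitational constant (arbitrary units)
m = np.array([1.0, 1.0, 1.0])              # three point masses (assumed values)
pos = np.array([[ 1.0, 0.0],
                [-1.0, 0.0],
                [ 0.0, 0.5]])              # initial positions in a plane
vel = np.array([[ 0.0, 0.3],
                [ 0.0,-0.3],
                [ 0.3, 0.0]])              # initial velocities

def accelerations(pos):
    """Newtonian gravitational acceleration on each body from the other two."""
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * m[j] * r / np.linalg.norm(r)**3
    return acc

dt = 1e-3
for step in range(10_000):                 # leapfrog (velocity Verlet) stepping
    vel += 0.5 * dt * accelerations(pos)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos)

print(pos)   # approximate positions after a finite time, accurate only to the step size
```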

In any actual physical situation, there are many different kinds of systems “obeying” several different kinds of “laws” all at once, as they interact with each other. At most, this kind of situation can only be approximated mathematically. So while our equations are obviously important for understanding how physical systems evolve over time, it’s not at all clear that “obeying equations” can be a complete or basic explanation of what’s happening here.


Now QM comes along and says, quite clearly – causal evolution according to equations is only part of the story. At bottom, the “laws” represented in the equations are a complex set of stochastic “guidelines” within which systems “choose” at random how to behave. More precisely, the equations of QM represent the set of possibilities available to a system at a given time. The system evolves by interacting with other systems in such a way that a choice gets made and communicated between the systems (i.e. in a “measurement”), so that the choice then creates a new set of possibilities available to each system.
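As a toy illustration of this two-sided picture (just my own sketch, with an arbitrary Hamiltonian and evolution time, and not tied to any particular interpretation): a single two-state system evolves deterministically according to the equations, while a "measurement" selects one outcome at random with the Born-rule probabilities, and that choice becomes the starting point for the next round of possibilities.

```python
# Toy qubit: deterministic evolution of the "possibility structure" (the state
# vector), random selection at measurement (Born rule). Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)

H = np.array([[0, 1], [1, 0]], dtype=complex)   # Hamiltonian (Pauli-X, arbitrary choice)
t = 0.7                                         # evolution time between measurements

def evolve(psi, H, t):
    """Deterministic (unitary) evolution: psi -> exp(-iHt) psi."""
    eigvals, eigvecs = np.linalg.eigh(H)
    U = eigvecs @ np.diag(np.exp(-1j * eigvals * t)) @ eigvecs.conj().T
    return U @ psi

def measure_z(psi):
    """Random selection: outcome 0 or 1 with Born probabilities |psi_i|^2,
    after which the state becomes the chosen basis vector."""
    probs = np.abs(psi) ** 2
    outcome = rng.choice([0, 1], p=probs / probs.sum())
    new_psi = np.zeros(2, dtype=complex)
    new_psi[outcome] = 1.0
    return outcome, new_psi

psi = np.array([1.0, 0.0], dtype=complex)       # start in |0>
for step in range(5):
    psi = evolve(psi, H, t)                     # lawful, equation-governed part
    outcome, psi = measure_z(psi)               # random, selected part
    print(f"measurement {step}: outcome {outcome}")
```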

What I want to suggest is that the word “evolution” in this context means something closely related to “evolution” in the biological sense. In both cases we are dealing with situations that are both highly controlled and open to random input. And in biology, we have a clear idea of where the complicated controlling structures come from, because we understand how the whole process has evolved. If you look at the molecular mechanics that goes on all the time in living cells, you see an extremely complex and quite chaotic system based on random interaction, that nonetheless manages with amazing reliability all the operations involved in sustaining and reproducing the organism. This seems very similar to the way physical systems manage to behave in amazingly “lawful” ways, even though the base-level processes are essentially random.


In physics, the nature of the evolutionary process is not well understood, to say the least. But we have a lot of evidence for it, in QM, so we know a lot about how it works... and we know what it has ultimately accomplished – namely, causality.

Each quantum interaction is a selection (measurement) that determines certain information and passes that on to other such events. That is, each event narrows down and makes more specific the range of possibilities for what can happen in the future. This happens in many different ways that define many different kinds of physical information. But the long-term result of the process is that the possibilities for what can happen next, in any given situation, get narrowed down as much as they can be... given the base-level indeterminacy on which the whole business operates.

In other words, there’s an evolving selection process that continues until – except at the very basic level – randomness is nearly eliminated, and everything that happens looks like it has to happen just that way. It comes to appear as though the outcome of any physical situation is completely predictable, even though in nearly all cases the situation is too complex for an actual prediction to be computed mathematically – even when our equations are correct. That’s because what’s basically going on in physical interaction is not computation. It’s random selection within a complex set of evolved constraints. Therefore the behavior of physical systems can be “causally determined” at a much higher level of complexity than we can imitate in mathematics.
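A crude statistical analogy for that last point (again only my own illustration): individually random micro-events can add up to macroscopic behavior that is reproducible to high precision, and so looks causally determined.

```python
# Many random micro-outcomes, nearly deterministic macro-average (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

for trial in range(3):
    micro = rng.random(1_000_000)        # a million "random interactions" in [0, 1)
    print(f"trial {trial}: macro average = {micro.mean():.5f}")
# Each individual value is unpredictable, but the average is ~0.5 every time,
# with fluctuations of order 1/sqrt(N) -- at this level it *looks* determined.
```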
 
  • #2
The above is true only for stochastic interpretations.
All other interpretations (after the discovery of quantum decoherence and the death of the 'collapse' – so I consider CI and TI dead) are deterministic.
 
  • #3
ConradDJ said:
This is an attempt at a “heuristic point of view” for quantum physics that treats it as an evolutionary process. I hope it’s clear that I’m not proposing any new theory, but only trying to bring out some implications that are as well-established as anything can be, in this murky field. [...] Therefore the behavior of physical systems can be “causally determined” at a much higher level of complexity than we can imitate in mathematics.

Hello Conrad, I recall from previous threads that we have some views in common, although as I recall you tried to avoid thinking that QM needs reformulation.

I also think evolving causality, as in evolving constraints or law, is an interesting aspect, but I don't fully understand exactly what you proposed above; I think it was more a set of reflections than a precise suggestion.

I also think of dynamical evolution (i.e. lawful evolution, IN time) as a special case of more general (less decidable) evolutions that are more like Darwinian-style evolutions. And I fully agree that this is apparently not very well understood in physics. Most people seem to have a hard time letting go of the concept of eternal (non-evolving) law, because they see no other way of making physics constructive.

I think of causality, in effect, as constraints that guide the action of the observer. So to me, the only way to make sense out of this is to take the action of the observer into the picture. And the observer is evolving, but not according to dynamical laws; I think of it evolving in a true Darwinian style.

I think of it like this: at each information update, there is also a possibility of "mutation" in the coding structure of the observer. The origin of the Heisenberg relation in QM is then the result of evolved non-commutative internal codes of the observer. In my picture, we don't need a classical picture of this code; the code is simply expected from the point of view of evolutionary fitness. The non-commutative structures are more FIT than classical ones.
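To show the kind of non-commutative structure I have in mind in the simplest standard-QM setting (this is only a textbook toy, not the coding formalism itself): two observables that do not commute give different outcome statistics depending on the order in which they are measured.

```python
# Minimal illustration of non-commutativity: Pauli Z and X do not commute,
# so the order of the two measurements changes the statistics of the results.
# Toy example only, not the "coding structures" described above.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

print("ZX - XZ =\n", Z @ X - X @ Z)   # nonzero commutator

rng = np.random.default_rng(2)

def measure(psi, op):
    """Projective measurement of a Hermitian operator; returns (eigenvalue, new state)."""
    vals, vecs = np.linalg.eigh(op)
    probs = np.abs(vecs.conj().T @ psi) ** 2
    k = rng.choice(len(vals), p=probs / probs.sum())
    return vals[k], vecs[:, k]

def run(order, n=10_000):
    """Measure the two operators in the given order, many times, starting from |0>."""
    results = []
    for _ in range(n):
        psi = np.array([1.0, 0.0], dtype=complex)
        out = []
        for op in order:
            val, psi = measure(psi, op)
            out.append(val.real)
        results.append(out)
    return np.mean(results, axis=0)

print("mean outcomes, Z then X:", run([Z, X]))   # Z outcome is always +1 here
print("mean outcomes, X then Z:", run([X, Z]))   # now the Z statistics change
```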

So in my picture, I think of the observers as manifestations of these "coding systems" that are constantly evolving. There is then a SELECTION among coding systems, so that only the most FIT coding systems stay stable. "Reproduction" can be pictured as the action of the system that gives feedback to its environment and so "induces" similar codes. This is how I see it. So it's a group dynamics, or group democracy, that does the selection. There is no clear beginning nor end.
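As a very rough cartoon of that selection idea (purely illustrative, with made-up "fitness" numbers; this is not my proposed formalism, just a toy): copying in proportion to stability, plus a small mutation rate, quickly leaves only the fittest code dominating the population.

```python
# Toy selection dynamics over a population of "coding systems" (illustrative only;
# the fitness values are assumed for the example).
import numpy as np

n_codes = 5
fitness = np.array([0.2, 0.5, 0.8, 1.0, 1.3])   # assumed "stability" of each code
population = np.ones(n_codes) / n_codes          # initial fraction of each code
mutation = 0.01                                  # small chance of switching codes

for generation in range(200):
    population = population * fitness            # reproduction proportional to fitness
    population /= population.sum()               # renormalize to fractions
    population = (1 - mutation) * population + mutation / n_codes   # uniform mutation

print(np.round(population, 3))   # the fittest code ends up dominating
```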

This suggests that current QM is a special case, where the observer is NOT evolving, and where the system of observation is a semi-closed system (a small subsystem).

I think the general case is what we need to understand QG and cosmological measurement models.

/Fredrik
 
  • #4
Fra said:
So in my picture, I think of the observers as manifestations of these "coding systems" that are constantly evolving. There is then a SELECTION among coding systems, so that only the most FIT coding systems stay stable. "Reproduction" can be pictured as the action of the system that gives feedback to its environment and so "induces" similar codes. This is how I see it. So it's a group dynamics, or group democracy, that does the selection. There is no clear beginning nor end.

One analogy is to sociological systems, or games, where each "player" can only have an expectation of the rules he should/must comply with in order to stay alive. So this takes the idea, true in QM, that all we can know about reality is inferred from measurements, so that we arrive at a state of information about reality. I suggest we need to extend this also to the state of the laws, so that the objectivity of law is emergent, analogous to how social laws evolve.

Social laws are not something that controls the individuals; they are the result of negotiation, and an individual cannot too aggressively violate the current laws if it wants to stay in the game. Social laws are more like constraints on the action and free choices of each individual player. But there is a mutual influence here: each player, by its actions, can actually influence the social laws, just like the social laws influence the players. In the overall game it's about inertia and democracy.

This is why I think the extension to physics needs to consider the evolution of observers (the evolution of law comes along once you see that expectations of law are coded into each observer, and that objective law is just a negotiation), where objective, stable law represents an equilibrium condition in which all observers, as a result of evolution, interaction and mutual influence, coexist in a steady state.

The idea would be that the standard model might be one such equilibrium. Now if we can find a way to argue that the standard model represents maximum stability in this sense, it would be a great support for this idea.

The connection is also that in sociology you can imagine that "simple minds", or non-intelligent players, can only encode "simple rules". The complexity of the player/observer physically constrains the complexity of the constraints it can encode.

This is the obvious exploit we could use to extrapolate to the unification scale of things.

/Fredrik
 
  • #5
The last analogy I made is pretty much the line of reasoning that I EXPECT Roberto Unger to contribute, along with Smolin's evolving law, in the book I hope they will release. R. Unger works, among other things, on social theory and law, and it seems his thinking has inspired Smolin; I can see why.

I expect Unger's ideas to be quite in line with how I see it, but what I look forward to in the book is whether Smolin – once he has picked up more of Unger's thinking – has been able to come up with a more constructive mathematical formalism out of this that can apply to physics.

But these analogies are still a big source of inspiration, as social games are, I'd say, extremely intuitive to most people, since we all play them, even if not always consciously.

/Fredrik
 
  • #6
Fra said:
I think of causality, in effect, as constraints that guide the action of the observer... And the observer is evolving, but not according to dynamical laws; I think of it evolving in a true Darwinian style.


Hi Fredrik – I’ll respond to some of your comments in a separate post, hopefully tomorrow... I’m sorry I don’t have more time to pursue this, and my mind doesn't seem to work so fast anymore.

This is just an overview of where I think our thoughts overlap – in the idea that “measurement” or “observation” as it appears in QM is an evolutionary process. This contrasts with Smolin’s proposal in The Life of the Cosmos – he was thinking of a process in which the entire universe reproduces itself in many new variant universes, each giving rise to new variant universes... so the laws of physics are still “eternally given” and static within each universe.

Here we’re starting from the quantum description of measurement-interactions – assuming this describes something that happens everywhere, not just where “conscious” observers are involved. In this picture, all physical interaction both defines new information – by selecting certain outcomes as “actual” from a structure of possibilities represented by the wave-function – and also passes that information on, reshaping the possibility-structure for subsequent interactions.

If this can be understood as an evolutionary process, reproducing and selecting information, then it’s something that’s going on all the time, and would presumably be the main process shaping the history of our universe.

Of course measurements can happen in many different ways – there are many different parameters in physics, and for each parameter there are various interaction-contexts that can determine them. But that makes sense in an evolutionary interpretation. In biology also, life depends on many different kinds of processes operating at many levels – within the organism, between the organism and the environment, between organisms and between species. What’s basic to biology is not any one such process – e.g. the replication of DNA – but the underlying functionality of reproduction. Whatever the first self-replicating systems may have been, they would have been extremely simple, involving relatively few distinct physical processes. But we know they somehow succeeded in making copies of something that succeeded in making more copies. Once that got underway, it could eventually evolve many kinds of structure and incorporate many kinds of physical processes.

So if “measurement” in physics is an evolutionary process, then it has gradually come to involve many different interdependent kinds of measurements. But what’s needed is to clarify the basic functionality, analogous to replication in biology – so we can start to see which aspects of the complex structure of physical interaction are more “primitive” and which are more highly evolved forms.

The main point I was making above is that in QM, every kind of measurement amounts to a “reduction” of a wider range of possibilities to a narrower one, eliminating possibilities for what can happen among systems in the future. So it seems sensible that such a process would eventually find ways to eliminate nearly all future possibility – giving rise to a world where everything looks “causally determined”... at least at the macroscopic level, where the web of different kinds of measurement-interactions is very tight and "finely woven".

Thanks -- Conrad
 
  • #7
I have some comments about that idea.
Randomness can be associated with entities evolving under consistent constraints, or randomness can be associated with the constraints, given somewhat already evolved entities.
In the first case we can describe the constraints mathematically (approximately), and in the second case we can describe the entities mathematically (again approximately).
The next, more complex case is a systematic alternation between the two previous cases.

I think this third case is closer to what QM tries to describe.
 
  • #8
ConradDJ said:
Hi Fredrik – I’ll respond to some of your comments in a separate post, hopefully tomorrow... I’m sorry I don’t have more time to pursue this, and my mind doesn't seem to work so fast anymore.

No worries Conrad, some of these things are truly mind-boggling indeed, and the entire discussion contains unsolved problems; it's easy to enter loops when thinking about this, and I'm happy if I can see the way forward, rather than the final destination.

I find your posts more easily digested, though, since it's clear that we are reasonably on the same track; it makes it a lot easier to read one another's minds.

ConradDJ said:
This is just an overview of where I think our thoughts overlap – in the idea that “measurement” or “observation” as it appears in QM is an evolutionary process. This contrasts with Smolin’s proposal in The Life of the Cosmos – he was thinking of a process in which the entire universe reproduces itself in many new variant universes, each giving rise to new variant universes... so the laws of physics are still “eternally given” and static within each universe.

I think that sounds right, and I agree with your assessment here. On this point I am more radical than Smolin. But Smolin's view still makes sense to me, as a special case of a generalization, connecting to below...

ConradDJ said:
But what’s needed is to clarify the basic functionality, analogous to replication in biology – so we can start to see which aspects of the complex structure of physical interaction are more “primitive” and which are more highly evolved forms.

I fully agree. Here Smolin pictures black holes spawning new universes with permuted parameters. I think this may still be sort of sensible, but I see a possibility also of more basic ways of "reproduction", and this is what we seek.

In effect, a system can by its own action "induce" copies of similar action forms in its environment, through the selective pressure it exerts; exactly like a gigantic environment, with all its characteristics, more or less determines what's fit and what's not fit. And thus there is a preferred selection. It's in this sense that I think there is a more general way of seeing reproduction.

The mechanism I imagine is that the backreaction from the environment selects for a particular evolution. Thus the situation is mutual, and which side deforms more than the other is all about inertia.

ConradDJ said:
The main point I was making above is that in QM, every kind of measurement amounts to a “reduction” of a wider range of possibilities to a narrower one, eliminating possibilities for what can happen among systems in the future. So it seems sensible that such a process would eventually find ways to eliminate nearly all future possibility – giving rise to a world where everything looks “causally determined”... at least at the macroscopic level, where the web of different kinds of measurement-interactions is very tight and "finely woven".

Hmmm if I understand you right here, I think we disagree on this point.

It seems you think of the measurement process as a monotonic narrowing down of possibilities?

That's not how I see it, because the "set of possibilities" is ALSO changing, so it's fully possible that the space of possibilities increases faster than the corresponding prior narrows them down. The latter situation, taken to its extreme, is how I'd imagine the decomposition and destabilisation of a system, in the sense that the structure loses inertia; but that's also part of evolution – actions that fail to keep themselves together will not stay around and therefore will not reproduce.

I can agree that a "successful" structure/system will, in some abstract sense, narrow down the possibilities to the point where a stable condensate forms. This could perhaps be particles. But I think that in order to understand how these systems form, the idea needs to be relaxed. It should be possible to destabilise a system, so that the narrowing down and concentration of inertia over a small state space is reversed and the system is hijacked and loses complexity (and thus energy/mass) to its environment.

I think there are game analogies to that process, where one player can, by clever strategies, effectively gain control or "power" over other players and "use them". Similarly, in economic models, you start with a speculation and, by either wits or luck, manage to grow more money that you can use for further control of the game itself.

To me the challenge is to turn these "ideas" into a new mathematical formalism, one which solves several open problems in the current formalisms.

I think that Unger's input to Smolin will be healthy in this respect; that's why I look forward to seeing whether Smolin makes any advances in this direction.

/Fredrik
 
  • #9
Fra said:
It seems you think of the measurement process as a monotonic narrowing down of possibilities?

That's not how I see it, because the "set of possibilities" is ALSO changing, so it's fully possible that the space of possibilities increases faster than the corresponding prior narrows them down.


No, I agree with you – the picture of starting with a huge set of possibilities and eventually selecting out a certain subset is somewhat metaphorical. For one thing, anytime we do a measurement that determines the position of a system within a certain range, the range of possibilities for its momentum widens. So the evolution from a less determinate to a more determinate state of the universe is not literally a gradual elimination of given possibilities.
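Here's a quick numerical check of that trade-off (my own sketch, using Gaussian wave packets on a grid, in units with hbar = 1): the sharper the position distribution, the broader the momentum distribution, with the product of the spreads staying near the hbar/2 bound.

```python
# Narrowing the position spread widens the momentum spread (illustrative sketch,
# using Gaussian wave packets and a discrete Fourier transform; hbar = 1).
import numpy as np

hbar = 1.0
N = 8192
x = np.linspace(-50, 50, N)
dx = x[1] - x[0]
p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dx)    # momentum values matching FFT bins

def spreads(sigma_x):
    """Standard deviations of position and momentum for a Gaussian packet of width sigma_x."""
    psi = np.exp(-x**2 / (4 * sigma_x**2))
    prob_x = np.abs(psi)**2 / np.sum(np.abs(psi)**2)
    dx_spread = np.sqrt(np.sum(prob_x * x**2) - np.sum(prob_x * x)**2)

    phi = np.fft.fft(psi)                          # momentum-space amplitude
    prob_p = np.abs(phi)**2 / np.sum(np.abs(phi)**2)
    dp_spread = np.sqrt(np.sum(prob_p * p**2) - np.sum(prob_p * p)**2)
    return dx_spread, dp_spread

for sigma in [2.0, 0.5, 0.1]:
    dxs, dps = spreads(sigma)
    print(f"sigma={sigma:4}:  dx={dxs:.3f}  dp={dps:.3f}  dx*dp={dxs*dps:.3f}")
# The sharper the position, the broader the momentum; dx*dp stays near hbar/2.
```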

However, making a position measurement does create new information (taking a standard “stochastic” view of QM). And the future evolution of the system has to be consistent with its having that position at that time.

There’s no paradox here, it’s just how “possibility” evolves. When you measure something, you do get a specific result (within a range of “uncertainty”), and that does eliminate all other possible results. But the “total possibility” of the system isn’t smaller, it’s just more determinate – in that there’s more past information about the system that its future possibilities have to be consistent with.

So if in the early universe, everything is possible – no rules or constraints at all – then effectively almost nothing is possible... i.e. there is no way that any state of any system can be meaningfully defined, or have any observable effect on anything. If somehow systems arise that can “observe” something about each other, within some context of constraints that they also “observe” – then possibility has been drastically “narrowed down” within the universe of their interaction... but in another sense there is now much more possibility.

(It reminds me of Stravinsky’s comment about learning to write music within the constraints of the 12-tone system – a terrible idea that probably only worked well for Stravinsky – i.e. that the more constraint you impose, the more artistic freedom you have. It’s what others have said about writing poetry within the sonnet form.)

Anyhow the evolution I’m thinking of is “monotonic” in the sense of becoming always more and more specific and determinate – choosing certain paths, so to speak, and so making other alternative paths forever impossible – but also opening up new kinds of possibilities that weren’t there before. Thinking for example of how life evolved on Earth.

So in this sense I think it’s reasonable to imagine more and more measurement-interactions creating more and more specific, determinate information, that contributes to the context for making more measurements... and that such a process would naturally evolve a universe that looks as though it were strictly “deterministic”. Except at the base level, where what can be determined is limited by the physical resolution of measurement-interactions.
 
  • #10
zonde said:
Randomness can be associated with entities evolving under consistent constraints, or randomness can be associated with the constraints, given somewhat already evolved entities...

The next, more complex case is a systematic alternation between the two previous cases. I think this third case is closer to what QM tries to describe.


If what is evolving is the possibility of making measurements, then it's true that the context of measurement has to evolve along with what actually gets measured. The specific information determined in one measurement becomes part of the given context for other measurements.

Also, measurements are possible only if there are certain "rules" that the results of each measurement "observe". So even a very simple, "primitive" system of measurement has to involve a number of different kinds of information that provide a context for defining each other -- including both universal "laws" and the particular "data" that "obeys" those laws.
 
  • #11
Fra said:
At each information update, there is also a possibility of mutation in the coding structure of the observer... I think of the observers as manifestations of these coding systems that are constantly evolving. There is then a SELECTION among coding systems, so that only the most FIT coding systems stay stable. Reproduction can be pictured as the action of the system that gives feedback to its environment and so induces similar codes.

This suggests that current QM is a special case, where the observer is NOT evolving, and where the system of observation is a semi-closed system (a small subsystem). I think the general case is what we need to understand QG and cosmological measurement models.

Fra said:
This is why I think the extension to physics needs to consider evolution of observers... where objective stable law represents an equilibrium condition where all observers, as a result of evolution, interaction and mutual influence coexist in a steady state.

The idea would be that the standard model might be one such equilibrium.


There’s a lot in this picture that makes sense to me. But I’m thinking it may reflect a fairly advanced stage in the evolution of measurable information... rather than describing the underlying “functionality”.

To take a parallel – it would be as though someone is thinking of the evolution of life as based on the reproduction of DNA molecules. That’s certainly true, in a sense... but the DNA coding system was almost certainly not there from the beginning. It’s part of a very complex interaction-system that evolved as a way to ensure very precisely controlled reproduction, by centralizing and protecting the information needed for the process in a small set of highly stable molecules.

In your picture, the observer’s “coding system” seems to be the individual’s best “guess” at the rules all interaction must follow. As the observer gains more experience, the coding system evolves, as more and more interactions either do or don’t fit the expected pattern.

Then the observer’s own behavior – as observed by others – communicates something about its own coding system, which may or may not conflict with the expectation-systems guessed by the other observers. So the universe of many systems observing each other may tend toward a state in which all observers share pretty much the same expectations and behave accordingly.

If you’re working on a game-theoretic formulation of this sort of system, that does seem very interesting to me. My problem with it is only the question of how to connect it with current physical theory. We need something that holds the observer’s “coding system” over time, and something that corresponds to the observer’s process of comparing the coding system with new data input. We need a way for the coding system to reflect itself in the observer’s behavior, so it gets communicated to other observers.

It doesn’t seem impossible to me that this could work. Thinking about “primitive” biological systems is similarly difficult, because even the earliest self-replicating entities may have been fairly complex – presumably some unusual system of molecules capable of catalyzing each other’s formation, in an environment that supplied raw materials, and also allowed these systems to break up and form versions separated from each other.

But in biology there is a clear and very simple “underlying functionality” – i.e. making copies of systems that make copies. In your picture, the basic functionality seems to be making a comparison between a hypothetical law and actual data, so as to update the hypothesis.

My guess is that how observers communicate with each other is more basic – in that the observable universe is essentially made of information that makes a difference to the context in which other information can make a difference, in some other context.

But this is also complicated – since for each observer, the context to which the incoming information makes a definable difference must be supported both by some sort of “memory” belonging to the observer, and by other kinds of communications with other observers. So my picture is not that different from yours.

But intuitively it seems right to me that communication plays a similar role in physics to reproduction in biology. The thing is, in biology we’re dealing with something entirely “objective” and easy to grasp – i.e. objects literally making more objects. In the physical realm what we’re talking about is much less easy to conceptualize – i.e. how information “experienced” in the world of one system becomes “meaningful” in the world of another system.

Again though, of course – in order to formulate something like this as a real hypothesis, we’ll need to be able to map it onto what’s actually known in physics, in a way that sheds some light on the findings of QM and Relativity, etc. Not something I expect to accomplish on my own!... so I’m grateful someone else is working in a similar direction.
 
  • #12
ConradDJ said:
My problem with it is only the question of how to connect it with current physical theory. We need something that holds the observer’s “coding system” over time, and something that corresponds to the observer’s process of comparing the coding system with new data input. We need a way for the coding system to reflect itself in the observer’s behavior, so it gets communicated to other observers.

Yes, these are questions that need to be solved. My starting point, which should ultimately connect to physical theory, is the abstract interaction of these information-coding action systems. This abstraction contains natural parameters, such as complexity, and within this complexity both commutative and non-commutative structures emerge. The first connection I seek is a connection between inertia and this complexity. Since my thinking is based on a combinatorial picture, my starting point is discrete, and all continuum models are merely limits of the high-complexity case.

Also, since the action codes I picture are pretty much combinatorial expressions, the codable configuration spaces are only discrete subsets of the continuum, so I'm not starting with continuum probability.

Another very fundamental, low-level connection to physics I picture is how increases and decreases in the complexity of the coding structure are expected to be related to the origin (creation as well as destruction) of mass and inertia.

As soon as structures with large inertia appear, spontaneously formed complex structures will also unavoidably emerge, along with interaction patterns between them.

Here I picture the emergence of spacetime and the simplest unified interactions. These interactions are then broken down into the well-known standard model as complexity increases – or so is my idea. In these basic evolutions of complex structure, the coded relations probably also contain non-commutativity as a preferred, efficient coding method for low-complexity systems. This should (I hope) encode quantum logic, and maybe even provide a deeper understanding of why quantum logic, as opposed to classical logic, seems to be right (or more fit) for the description of small (i.e. not so complex) systems.

Still, relative to the unification scale, the quantum scale is VERY complex and massive, and I think the explanation starts far lower. About the possible relation to Planck scales etc., I really don't have an opinion. I personally find a lot of the information-theoretic views that treat the Planck scale as some obvious natural scale to be somewhat of a toying with units. I'd have to await far more understanding before connecting to a particular scale.

That's a massive project indeed, but what do you do? There is no other option but to dig on, no matter how many others mysteriously seem to be digging in the wrong direction ;-)

My own work is still very early: I am thinking and elaborating a lot with discrete expressions for combinatorial "actions", how these can decompose into systems of non-commuting substructures that conserve complexity, how such structures self-evolve (as per the naturally induced flow), and further how a population of such systems evolves and allows preferred system properties. So far I'm sniffing at connections to basic measures, such as mass, and some geometrical measures, but the framework needs much more work before any real predictions or even post-dictions are remotely possible.

ConradDJ said:
for each observer, the context to which the incoming information makes a definable difference must be supported both by some sort of “memory” belonging to the observer, and by other kinds of communications with other observers. So my picture is not that different from yours.

Yes, I agree this sounds close to what I'm envisioning. The "memory" is in my view the same system of microstructures that encodes the "laws". I expect a duality between the state of the microstructure and the state of the laws, since the "laws" are implicit in the evolution of the microstructure; or, put differently, the state of the memory will define a natural flow – this is the expected flow of time, implicitly encoding the dynamical "laws".

/Fredrik
 

1. What is "Evolving Causality"?

"Evolving Causality" is a concept that explores the idea of causality in the context of quantum physics. It suggests that causality is not a fixed or linear concept, but rather a dynamic and evolving one that is influenced by quantum phenomena.

2. What is the Heuristic Point of View in relation to Quantum Physics?

The heuristic point of view is a way of approaching and understanding complex systems, such as quantum physics, by using simplified models and principles. It allows scientists to gain insights into the behavior of these systems without fully understanding all of their intricacies.

3. How does "Evolving Causality" relate to the uncertainty principle?

"Evolving Causality" suggests that the uncertainty principle, which states that the position and momentum of a particle cannot be known simultaneously, is a result of the dynamic nature of causality in the quantum world. It challenges the traditional view that causality is always fixed and deterministic.

4. What implications does "Evolving Causality" have for our understanding of the universe?

This concept challenges our traditional understanding of causality and suggests that the universe may not operate in a linear cause and effect manner. It opens up the possibility for new explanations for quantum phenomena and may lead to a deeper understanding of the fundamental principles that govern our universe.

5. How can the concept of "Evolving Causality" be applied in practical terms?

In the field of quantum computing, "Evolving Causality" can be used to develop new algorithms that take advantage of the dynamic nature of causality in quantum systems. It may also have implications for fields such as artificial intelligence and machine learning, where traditional notions of cause and effect may not apply.
