Individuation as Brute Assumption

  • Thread starter ZachHerbert
In summary, the author questions the viability of treating dynamical qualities and particles as primitive concepts in a unified theory of physics, arguing that they must instead be treated as emergent phenomena. This argument draws support from research in fields like black hole thermodynamics and the holographic principle. If accepted, it would mean that particles and dynamical qualities cannot be considered fundamental, and must instead be understood within a model of the universe that uses information rather than “stuff” to identify and track changes.
  • #1
ZachHerbert
In https://www.physicsforums.com/showthread.php?t=467045, I question whether dynamical qualities (like energy and charge) can survive as primitive concepts in physics while contemporary theory undergoes the transformation from a mechanical to an informational model of the universe. There, I argue that dynamical qualities – and even the particles they are assigned to – must be treated as emergent phenomena.

The purpose of this post is to highlight problems with taking individuation for granted, particularly in the context of a unified theory of physics. This will serve as a second, complementary argument for treating particles and dynamical properties as emergent.

To clarify my use of the term “individuation,” I am referring to the process of identifying a body – both in space (to differentiate a body from an environment) and over time (to assign and track changes in state and location). The most common way to meet this requirement is to simply assume it from the outset by adopting a philosophy rooted in ontological duality. Whether this duality manifests as particle and field, matter and void, strings and spacetime, ones and zeroes, or something-ness and nothingness, the same essential rift is present from the outset.

Of course, this situation is more damaging to philosophers than to experimentalists. But it is important to understand the magnitude of the assumption. (The classical debates on the ontological status of space and the existence of absolute or relative velocities are famous examples which the experimentalist may safely embrace or ignore – but without assuming the individuation of a distinct body from its environment we cannot even define motion or velocity in the first place! Let’s see the experimentalist do without that!) If bodies are fundamentally distinct from the space (or void, or vacuum, or field, or nothingness, or whatever) that they inhabit, then individuation is a non-issue. We simply assume that bodies are somehow “different” from their surroundings, and that’s that. I jokingly refer to this line of thinking as “Things-in-Space!” (which is properly pronounced using a booming, cinematic voice full of imaginary reverb.)

Einstein avoided the assumption of ontological duality when conceiving of a unified field by relying on fundamental quality alone. In The Evolution of Physics he writes, "A thrown stone is, from this point of view, a changing field, where the states of greatest field intensity travel through space with the velocity of the stone." In this scenario, field intensity (i.e. dynamic quality) provides a path to individuation. But what if we don't have access to dynamical quality at a primitive level either?

In my post on https://www.physicsforums.com/showthread.php?t=467045, I argue that these qualities must be treated as emergent. Areas of research like black hole thermodynamics and the holographic principle are beginning to paint a picture of the universe using bits of information, rather than bits of “stuff” that float through space. If correct, this idea does not allow dynamical qualities to be treated as primitive aspects of physics. Very briefly, the argument is this: if information is proportional to area, and dynamical qualities must be embodied by information, then these qualities cannot be assigned to individual events. This banishes the old idea of point particles. However, if particles have extension, we are presented with a new issue. If particles are considered to be an objective part of the universe, we must have a way to uniquely associate the events which comprise them. But this cannot be done dynamically, since any new properties we assign to the events would also require extension and could not be embodied by the events in question. This leaves us to use brute assumption to uniquely associate some events with one another, while uniquely disassociating others. Obviously, this gets ugly in a hurry. On the other hand, if we do not assume these unique associations, then particles and their properties cannot be considered fundamental.
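(Editor's note: for readers who want the quantitative version of "information is proportional to area," the standard statement comes from black hole thermodynamics, where the Bekenstein–Hawking entropy of a horizon scales with its area rather than the enclosed volume. These are textbook formulas, added here for reference; they are not part of the argument above.)

```latex
% Bekenstein–Hawking entropy: proportional to horizon area A
S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2},
\qquad \ell_P = \sqrt{\frac{\hbar G}{c^3}}

% Holographic bound: the information (in bits) associated with a region
% is bounded by a quarter of its boundary area in Planck units
I_{\max} = \frac{S_{BH}}{k_B \ln 2} = \frac{A}{4 \ell_P^2 \ln 2}
```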

Now, assuming that we want to keep our primitive dynamical qualities by ignoring the problem entirely, we may resort to invoking Things-in-Space. But this now precludes any possibility of a truly unified model of physics – or at least one with a unified ontology. Worse, all of physics remains critically dependent upon unique associations (and dissociations!) between events which have not been classified or defined, and cannot, by definition, be explained by any dynamical model of interaction. And if we can live with that, then why not the aether or phlogiston?

The way out, of course, is to treat both dynamical qualities and individuation as emergent. Since a model of dynamics requires more than just the concepts of space and time (see references to Jammer in my post at https://www.physicsforums.com/showthread.php?t=467045) – and since we are already forced to draw relations between events which cannot be classified using those concepts alone – my proposal is to define a third primitive relation and place it on equal footing with space and time. These relations could then be used both to individuate a body from an environment (though not uniquely!) and to define the dynamical qualities that were previously considered to be primitive aspects of reality.

(I’ve submitted a paper to the Independent Research forum here on PF that presents this idea in more detail. But since that forum is moderated, and since the reviewers are volunteers and are extremely busy, I get the impression that it could be a very long time before the submission is approved… assuming it gets approved at all. In the meantime, I thought I’d initiate some targeted discussion here in the philosophy forum.)

So, in keeping with the new rules of this forum, I ask the following questions:

Has anyone ever seen any publications that address issues of individuation without relying on either assumptions of ontological duality or primitive dynamical qualities? If not, are there any issues with my approach?
 
Last edited by a moderator:
  • #2
ZachHerbert said:
Has anyone ever seen any publications that address issues of individuation without relying on either assumptions of ontological duality or primitive dynamical qualities?

Are you saying anything different from the soliton or topological knot approaches of people like Laughlin, Volovik, Xiao-Gang Wen, Olaf Dreyer?

You probably are as it sounds like you want to treat the dynamics as an ontic entity - an "atom of motion" perhaps.

However I think the emergent argument works fine. If a "particle" becomes topologically resolved in soliton fashion to the point that it has definite location, then it equally also has a definite motion with regards to its context. Location and momentum are just two sides to the same coin.
 
  • #3
Yes, I'm trying to start a level lower. Essentially, "how do you define a soliton in the first place without invoking an 'energy' or an independent background as a reference?"

Basically, if all we have is a collection of events, and each event is allowed a relative phase value and nothing more, how do we individuate stable forms and build a system of dynamics? At this stage, even if we are allowed to "step outside" our toy universe and draw spatiotemporal relations between the events, there is no way to discern variation from change. Without access to dynamical concepts like energy or amplitude, any waveforms or spacetime structures that we draw on our block universe are completely arbitrary.

Since space and time (as primitive relations between events) are not enough to get our toy universe off the ground, and since adding dynamical qualities like energy into the picture at a primitive level causes problems (both conceptually and practically), my approach was to admit a third primitive relation and treat it as a full sibling to space and time. This approach doesn't provide any unique theoretical answers in and of itself, but it does change the conceptual context of some long standing problems. (For example, since we would now have 3 "legs" to work with, a singularity would no longer have to be viewed as a point where a worldline comes to an end, but the point at which the intervals of a worldline turn perpendicular to both space and time.)
 
  • #4
ZachHerbert said:
my approach was to admit a third primitive relation and treat it as a full sibling to space and time.

Of course, I've yet to see this third relation and so have no intuitive sense of what you mean. But to me this seems to be multiplying the ontological difficulties rather than dissolving them. It gives you three unexplained entities rather than just two.

The better route seems to be the one that unites space and time as geometry. Then to dig deeper by seeking out the pre-geometry that can self-organise and develop into the geometry we see. ie: the general approach people are taking with loop quantum gravity or condensed matter approaches. Spacetime (and the "massive objects" that are knots in its fabric) would all be developmentally emergent in this view. It seems the natural way to go.
 
  • #5
apeiron said:
The better route seems to be the one that unites space and time as geometry. Then to dig deeper by seeking out the pre-geometry that can self-organise and develop into the geometry we see. Spacetime (and the "massive objects" that are knots in its fabric) would all be developmentally emergent in this view. It seems the natural way to go.

Yikes! I should have been more specific. I didn't mean to suggest that we should divorce space and time, and then add a third relation. I agree, that's multiplying the problem. The point is to do exactly what you propose above, while using the third primitive (which I call 'Perspective') as the means to define self organization. In the end, we eliminate the paradoxes that come from assuming primitive dynamical qualities, while simultaneously reducing the number of ontological entities from two down to one.

So, following Jammer (see https://www.physicsforums.com/showthread.php?t=467045), if we characterize physics as being built upon three primitive concepts - space, time and mass - the goal is to kick mass up the evolutionary ladder, replace it with perspective, and then unify perspective with spacetime. So you end up with this:
 

Attachment: STP.jpg (27.5 KB)
  • #6
ZachHerbert said:
Yikes! I should have been more specific.

OK, I'll have to wait until you can be more specific about what this "perspective" is.

I'm not familiar with Jammer but the way I would remove mass (or at least matter particles) from the story is to add expansion (and cooling) to spacetime. So the baseline state of the universe is the dynamic view of a spreading and cooling bath of radiation. A relativistic gas. Then mass condenses out due to some mechanism like the Higgs and becomes an additional level of action. In this way, mass is not fundamental. What is fundamental is a spacetime running down its thermal gradient by dynamically cooling~expanding.

After all, the interesting thing about particles with mass is not that they move, but that they move at sub-light speed. They fall locally behind the baseline rate of thermalisation.
 
  • #7
ZachHerbert said:
The purpose of this post is to highlight problems with taking individuation for granted, particularly in the context of a unified theory of physics.

The way out, of course, is to treat both dynamical qualities and individuation as emergent... my proposal is to define a third primitive relation and place it on equal footing with space and time.

ZachHerbert said:
The point is ... using the third primitive (which I call 'Perspective') as the means to define self organization.

So, following Jammer, if we characterize physics as being built upon three primitive concepts - space, time and mass - the goal is to kick mass up the evolutionary ladder, replace it with perspective, and then unify perspective with spacetime.


I too am interested in hearing how you conceive “Perspective” as an aspect of spacetime. But I’m not sure your philosophical critique is on point – arguing against an ontological dualism that takes mass or individuation for granted. 50 years ago Jammer’s book was a pioneering effort in philosophy to catch up with what physicists were doing, but I never found it very insightful... and of course it's way out of date.

If you look at the two main paradigms discussed in the https://www.physicsforums.com/forumdisplay.php?f=66 forum, String theory does seem to take the individuation of strings for granted, but not their mass, which is supposed to emerge from their interaction. Loop Quantum Gravity is entirely a spacetime construction, and individual particles are expected to emerge in a way not yet known. Even in the Standard Model, mass is thought to be generated through interaction with the Higgs field. I think for many decades now, most theoreticians would say that the concept of individual “particles” is just convenient shorthand for certain invariants of dynamic field-interaction.

Even so, I would agree it’s a fundamental problem that when we try to understand what our theories are telling us about the world, we have to revert to the old “things in space” picture. I think it’s a very important conceptual task to evolve a different way of imagining the world, that finally makes some intuitive sense out of Relativity and Quantum theory.

What’s interesting to me about “perspective” in this connection – I suspect that the main obstacle to a deeper understanding of physics is our very strong tendency to imagine the world from an “objective”, God’s-eye point of view – i.e. from no actual point of view... despite the fact that no person or atom will ever actually experience the world that way. And the basic breakthroughs made in both Relativity and Quantum Mechanics had to do with incorporating “the observer” into the theory in a fundamental way. Even so, physical theory has stuck with the classical non-perspective, treating the world as a reality that can be described (at least abstractly, in terms of spacetime structure and quantum superpositions) without reference to any actual perspective inside it.

When we eliminate the observer’s point of view, we take for granted the single most obvious and most important empirical fact about our world – that it is in fact physically observable. No theory I’m aware of has tried to account for the fact that for every physical parameter and every aspect of determinate structure in the world, there are other parameters and other kinds of structure that are perfectly adapted to define and measure it. We live in a world where things exist and have determinate properties only insofar as those properties are communicated through interaction and are physically “observed” from some particular point of view.

So I’d be happy to talk about the meaning of “perspective” in physics. But I’m not so happy about assuming conceptual “unification” as an ultimate goal. If the world were just a vast body of Fact to be analyzed, then understanding it would be a matter of finding the most efficient and economical description, involving the fewest fundamental parameters. Even under those circumstances it’s not obvious to me that the world has to be resolved into One fundamental principle. But the real point is that the world is not just a set of facts to be resolved into a formal, mathematical order. It’s also a vast collection of viewpoints from which facts are constantly being “measured” and communicated to other points of view. This is a highly functional structure, not just a formal one.
 
  • #8
apeiron said:
the way I would remove mass (or at least matter particles) from the story is to add expansion (and cooling) to spacetime.

The idea is that no dynamical qualities should be available at a primitive level. I'm using mass as a quintessential example, but the same applies to "energy" or "temperature" or even "entropy." All of those things require information to model, and therefore cannot be assigned to individual events without selectively associating some events and selectively dissociating others. (It's the "selective" part that's the problem.) So, while we can talk about an interval between events, without a way to individuate a location over time, we can't talk about expansion or cooling. (And even talk of "distance" becomes arbitrary, because without energy or a light cone structure, we can't even uniquely classify an interval as space-like.)

Basically, I'm not disagreeing with the details of how to treat the emergent particles and properties. In fact, I'm completely agnostic to them.

ConradDJ said:
I too am interested in hearing how you conceive “Perspective” as an aspect of spacetime.

The idea is that both perspective and spacetime should be conceived of as aspects of "SpaceTimePerspective."

ConradDJ said:
50 years ago Jammer’s book was ...
Jammer has two books on mass. One about historical concepts and one about contemporary concepts that was published in 2000. I'm referencing the latter.

ConradDJ said:
I think for many decades, now, most theoreticians would say that the concept of individual “particles” is just a convenient short-hand for certain invariants of dynamic field-interaction.

This is sort of crossing over with my other thread (https://www.physicsforums.com/showthread.php?t=467045) now. I agree, the ontological duality argument is more about aesthetics than anything else. Basically, what I am proposing is that it isn't the details of the field interactions that are the big issue (or at least not the first issue). My argument is that we are taking something for granted that we shouldn't be - and that is dooming all subsequent efforts from the outset.

ConradDJ said:
What’s interesting to me about “perspective” in this connection – I suspect that the main obstacle to a deeper understanding of physics is our very strong tendency to imagine the world from an “objective”, God’s-eye point of view – i.e. from no actual point of view...
ConradDJ said:
No theory I’m aware of has tried to account for the fact that for every physical parameter and every aspect of determinate structure in the world, there are other parameters and other kinds of structure that are perfectly adapted to define and measure it.

Yes! This is exactly the point. So given that we need to account for this; and given even an aesthetic desire for a unified ontology; and given that primitive dynamical qualities cause problems and are incompatible with an informational paradigm - my point is that we need something new. The dynamical qualities that we are using now are too specialized to do everything that needs to be done - hence, kicking them up the evolutionary ladder. By admitting a third primitive relationship, we can define perspective in a way that can accomplish all of these things in one swoop.

Don't get me wrong. "Perspective" has to be assumed by fiat - just as "space" and "time" do. But this is significantly better than assuming "energy" by fiat - both in terms of ontological economy and in terms of eliminating paradox. (Again, I'm not proposing a new falsifiable theory. I'm proposing a new way to create falsifiable theories.)

Believe me, I'm not being secretive. I'm very anxious to get feedback on my paper. But my understanding is that I'm not allowed to post it outside of the independent research forum... and unfortunately I have no idea when (or if) it will ever appear there. (Mentors correct me if I'm wrong, and I'll attach it here straight away.) But if anyone would like a copy, just PM me and I'll forward it to you.
 
  • #9
ZachHerbert said:
In https://www.physicsforums.com/showthread.php?t=467045, I question whether dynamical qualities (like energy and charge) can survive as primitive concepts in physics while contemporary theory undergoes the transformation from a mechanical to an informational model of the universe. There, I argue that dynamical qualities – and even the particles they are assigned to – must be treated as emergent phenomena.

The purpose of this post is to highlight problems with taking individuation for granted, particularly in the context of a unified theory of physics. This will serve as a second, complementary argument for treating particles and dynamical properties as emergent.

To clarify my use of the term “individuation,” I am referring to the process of identifying a body – both in space (to differentiate a body from an environment) and over time (to assign and track changes in state and location).



Just a few quick comments, i am not arguing with the general direction of your thoughts.

There hasn't been much talk about a crucial and relevant aspect of reality: namely, that "body", space, time, motion, perhaps even dimensionality, are approximate, scale-specific concepts. 'Scale-specific' has proven to be an indispensable trait of nature. The fact that below certain scales the HUP and fuzziness (indeterminacy) are the dominant and defining characteristics of nature is VITAL to understanding what and how nature is. In this regard, individuation appears to be scale-specific and even intuitive. The deeper problem lies with what defines 'scale' and the appropriate limit (and the elimination of infinities).

ZachHerbert said:
The most common way to meet this requirement is to simply assume it from the outset by adopting a philosophy rooted in ontological duality. Whether this duality manifests as particle and field, matter and void, strings and spacetime, ones and zeroes, or something-ness and nothingness, the same essential rift is present from the outset.



Primitive ontologies are most certainly wrong. Old philosophical ideas (Kant, Whitehead, Berkeley, etc.) seem to be somewhat off too. I would guess that at this point everyone is wrong, at least to some extent, in their worldview.

ZachHerbert said:
Einstein avoided the assumption of ontological duality when conceiving of a unified field by relying on fundamental quality alone. In The Evolution of Physics he writes, "A thrown stone is, from this point of view, a changing field, where the states of greatest field intensity travel through space with the velocity of the stone." In this scenario, field intensity (i.e. dynamic quality) provides a path to individuation. But what if we don't have access to dynamical quality at a primitive level either?



Something big and crucial is missing in the quoted text. Field intensity itself is quite smeared out, so there has to be something unaccounted for.

ZachHerbert said:
Areas of research like black hole thermodynamics and the holographic principle are beginning to paint a picture of the universe using bits of information, rather than bits of “stuff” that float through space. If correct, this idea does not allow dynamical qualities to be treated as primitive aspects of physics. Very briefly, the argument is this: if information is proportional to area, and dynamical qualities must be embodied by information, then these qualities cannot be assigned to individual events. This banishes the old idea of point particles.




Information cannot be fundamental, as it is itself an artifact of the inner workings of mind. All this stripping down of nature in search of fundamental constituents either reaches a stage where there is only mind left (we can't reduce mind to anything simpler; we don't even understand what it is), or the conclusion that reductionism is dead and new philosophical ideas are needed to move on - holism, biocentrism, or even more radical ideas (deficient as it seems).
 
  • #10
Maui said:
Information cannot be fundamental, as it is itself an artifact of the inner workings of mind.

I come at this from a different direction, but I suspect you may like the approach I take in my paper. It's too far off topic to go into all the details here, but maybe I'll start another post that addresses the phenomenological angle.

Very briefly, the idea is to stop drawing harsh distinctions between mind and matter (or mind and information), and to treat all phenomena as part of a single reality - without reducing either side to the other. (The point is to stop having "sides" at all.) True unification is about treating all aspects of experience as part of one universe - not struggling to tip the balance of power to one side or another of an artificial polarity.

Information may reside in the awareness of a conscious observer, but that information is also molded by the cognitive contours of that observer. This means that both must be treated as part of a single system. (i.e. no more Cartesian-style dualisms and no more God's eye, non-perspective frames of reference.)

But unless we are willing to confine ourselves to a first person view of reality, we must honor other perspectives as well. (And if we choose to embrace solipsism, then what's the point in anything?) Once we start allowing everyone's perspective to be a part of reality, the trick is to honor all of them, and to define their relations in a way that avoids both irreversible reductionism and polarizing divisions - and doesn't end up in a useless, hyper-egalitarian mush.

Maui said:
new philosophical ideas are needed to move on
That's absolutely true. PM me if you'd like a copy. :)
 
  • #11
Maui said:
'Scale-specific' has proven to be an indispensable trait of nature. The fact that below certain scales the HUP and fuzziness (indeterminacy) are the dominant and defining characteristics of nature is VITAL to understanding what and how nature is.

I would say that scale is the BIG thing missing from most people's thinking. If we are talking about fundamental ontological ingredients that are being overlooked, then scale is tops.

The Planck scale in fact bounds our existence to either side of scale (locally and globally). It is an asymptotic limit on both size and temperature. For instance, a particle can't go any faster than the speed of light, but it also can't go any slower than "absolute rest" (uncertainty creates an irreducible jitter).
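(Editor's note: the Planck-scale limits invoked here have standard expressions; the formulas below are added for reference and are not apeiron's own.)

```latex
% Planck length and Planck temperature: the asymptotic bounds on
% size and temperature referred to above
\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\ \text{m},
\qquad
T_P = \sqrt{\frac{\hbar c^5}{G k_B^2}} \approx 1.4 \times 10^{32}\ \text{K}
```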

And then what characterises the universe is that it is scale invariant - it has an axis of scale symmetry - over the realm in between these limits. Size and energy scales make no difference, so long as you are within the bounding limits.

So spacetime is a phase transition from one boundary state (its local limit of smallest/hottest) to its other (its global limit of largest/coldest). The rate of that transition is scaled by the speed of light (the speed of thermal equilibration). And because this is an equilibrium transition, it appears scale-invariant from the inside.

Of course, mass and its associated gravity fields then mess up this picture, making a flat (flatly expanding) equilibrium state a little lumpy. But they only change the essential picture by a percent or two.

Lineweaver gives a good summary of the facts here.
http://www.mso.anu.edu.au/~charley/papers/LineweaverChap_6.pdf
http://www.mso.anu.edu.au/~charley/papers/LineweaverEgan2008v2.pdf

Maui said:
Information cannot be fundamental, as it is itself an artifact of the inner workings of mind. All this stripping down of nature in search of fundamental constituents either reaches a stage where there is only mind left (we can't reduce mind to anything simpler; we don't even understand what it is), or the conclusion that reductionism is dead and new philosophical ideas are needed to move on - holism, biocentrism, or even more radical ideas (deficient as it seems).

No, information is a way to model constraints, a limit on entropy. It is a way to atomise form.

Minds are concerned with meaning or semiosis - some specific state of constraint. Information theory is a way to measure reality at a more general level, stripped of definite meaning. It is actually taking the "mind" out of the measuring and so allowing specific states of constraint to be broken down into generalised descriptions.
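(Editor's note: a toy illustration of this point - Shannon's measure quantifies how constrained a distribution is while being entirely indifferent to what the states mean. The code and the example distributions below are an editorial sketch, not something from the thread.)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), with 0*log(0) := 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An unconstrained (uniform) distribution over 8 states: maximal entropy.
uniform = [1 / 8] * 8

# A constrained distribution (nearly all weight on one state): far less entropy.
# The numbers are arbitrary; only the comparison matters.
constrained = [0.93] + [0.01] * 7

print(shannon_entropy(uniform))      # 3.0 bits
print(shannon_entropy(constrained))  # roughly 0.56 bits

# "Information as constraint": the more constrained the distribution,
# the lower its entropy - independent of what the states mean.
```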

But you are right that radical ideas are needed. In current holistic approaches such as infodynamics, you have the necessary generalisation of minds/observers to information/constraints.

http://www.harmeny.com/twiki/pub/Main/SaltheResearchOnline/ssaltheinfodynamics_update.pdf
http://mdpi.org/entropy/papers/e6030327.pdf
 
  • #12
ConradDJ said:
But the real point is that the world is not just a set of facts to be resolved into a formal, mathematical order. It’s also a vast collection of viewpoints from which facts are constantly being “measured” and communicated to other points of view. This is a highly functional structure, not just a formal one.

Yes, and what is the general nature of those observations or measurements? It is about equilibration or thermalisation. Every interaction results in further entropification.

This is why I argue it is so important to focus on the baseline action of the universe (its slithering down the Planckian gradient from a hot point to a cold void). It is easy to get distracted by the issue of human consciousness, or even lagging mass. These are localised accelerations and decelerations of the baseline global rate of entropification.

This is what the theory of dissipative structure is about. You can get pockets of order - localised build-ups of constraints/information - that pay for their existence with a matching local acceleration of disordering, entropification. Conscious humans are quite exceptional in this regard (we can warm whole planets! unlock millennia of accumulated fossil fuels!).

But we are special (in the sense of a highly particular and extremely localised feature of the universe). So our "consciousness" is not a baseline description of observation, or the infodynamic development, of the universe in general.

From an ontological point of view, we do need to include observers in the most general possible sense into physics. Which means breaking down what humans do into a framework such as dissipative structure theory and infodynamics, where we are modelling a system in terms of its global constraints and local degrees of freedom (its information or context, and its dynamics or inherent potential to produce "spontaneous" events).
 
  • #13
I'd like to expand on the central issue that everyone rightfully seems to be responding to - how do you model a reality where nothing seems fixed?

To make a specific measurement (of something that is a change, an individuation, or other localised mark), you have to make it from some general baseline. This is what we mean by the God's-eye or objective stance. We seek out a place that is "outside" the change or mark we want to describe. Yet as physics and science have progressed, that has become ever harder in practice.

Clearly we are also part of the system. The baseline is part of what has to be described and accounted for (hence the call for a background-independent model of physics, a proper theory of human consciousness). So how do you stand (with God, so to speak) at some unchanging baseline, while also modelling the baseline itself? Especially if you feel that the baseline itself is subject to change, development, creation, difference.

So modelling has a long history of seeking out the static in order to measure the dynamic - and so leaving dynamics out as an ontological basic. The idea of observation is just one of the dynamical things that get squeezed out of the picture. Just like the static notion of information created a fixed and universal baseline for measurement by squeezing out the highly variable issue of meaning.

But there is in fact a way to have fixed baselines that are also inherently dynamic. And that is to invoke the notion of equilibrium (a state where everything changes in a way that no longer produces a change). Change (real, measurable and meaningful change) now becomes a detectable departure from equilibrium.

This is why I have argued that we should seek out the equilibrium description of the universe (as a conformal de Sitter space, as an expanding~cooling relativistic gas) and begin with that as the objective baseline that does not change (even though it is changing in the sense that it slides down its entropic, dissipative gradient at a single globally general rate).
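As a purely illustrative toy of this "dynamic baseline" idea - change measured as departure from a trend that is itself lawfully sliding - here is a minimal Python sketch (the cooling signal, the linear trend model, and the injected event are all my own assumptions, not anything from the physics):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "universe" signal: a steady cooling trend (a baseline that changes,
# but at a single globally general rate) plus noise, with one local event.
t = np.arange(300)
signal = 10.0 - 0.01 * t + 0.05 * rng.standard_normal(300)
signal[150] += 1.0  # a localised departure from equilibrium

# "Change" is measured against the sliding trend, not a static value:
trend = np.polyval(np.polyfit(t, signal, 1), t)
residual = signal - trend
event = int(np.argmax(np.abs(residual)))
print(event)  # the departure is found at t = 150
```

The point of the sketch is only that the baseline itself need not be static for deviations from it to be well defined.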

So a smoothly self-equilibrating action along a gradient is our primitive notion. And then the question becomes what is the natural unit of measurement if we want to model departures from this equilibrium? Is it information (which has proved a pretty good yardstick for dealing with entropy)? Is it instead geometry (which is what GR and some quantum gravity approaches would seem to prefer)? Or are information and geometry two faces of the same coin here (the discrete and the continuous versions of the same measurement)?

Regardless of how the question is eventually answered, I think this is in fact the deep ontological issue. Reductionism has generally tried to find an unmoving baseline from which to then measure/observe all motion, change, local difference. That is "too simple" an ontology once we realize that the universe itself is not fixed (the big bang/heat death has proved that, along with GR and QM).

So we are now in the new game of modelling reality at a whole-systems level. The baseline gets to be included in what gets made, what can change. And we have an understanding of this kind of "static" baseline from the thermodynamic notion of closed-system equilibrium states. We have in turn an understanding of equilibrium states that globally evolve at a steady rate from dissipative structure theory (ie: open systems) and the phase transition literature.

This kind of thinking is at least implicit in much of modern physical theorising (such as spacetimes emerging from quantum foams, causal triangulations, and other "sum over histories" global equilibrium-seeking mechanisms).

I think it ought to be more explicit - then for example people would be more familiar with ur-equilibrium concepts such as ontic vagueness. But anyway, again, the core issue everyone seems to be concerned with is the need to have a baseline to make a measurement...and the problems that arise when the demand for so long has been that this baseline is actually static, unchanging, rather than just unchanging in the sense of a dynamic equilibrium balance.
 
  • #14
ZachHerbert said:
... At this stage, even if we are allowed to "step outside" our toy universe and draw spatiotemporal relations between the events, there is no way to discern variation from change. Without access to dynamical concepts like energy or amplitude, any waveforms or spacetime structures that we draw on our block universe are completely arbitrary. ...

ZachHerbert, I find this a very fascinating comment. What if we somehow draw up our toy universe with patterns in the manifold geometry that themselves are equivalent to the physical properties and interactions that give us physics? I could in 3-dimensions, for example, weave a fabric using yarn (with random microscopic fuzziness), using a set of rules about certain fundamental relationships among the yarn "worldlines" -- rules analogous to the relatedness found among worldlines of the physics Standard Model. I'm thinking that I should be able to, in principle, create analogous charges (electric, weak, color, etc.) and relatedness among the charges that are associated mathematically with analogous physical laws.

Having done that, I'm not sure how to put things like time, past, present, future, and consciousness into the setup. Perhaps all it would really need is consciousness, since the ordering of the patterns could create the impression of passage of time. I assume the consciousness would rely on the time owned by us guys in the real 3-D world weaving the toy model. If the entire fabric was conscious, then the psychological impression (experience) of "NOW" (along with memory of past and dreams of future, etc.) would be present at all positions of the yarn world lines. Maybe just shining a very bright flood light on the entire fabric would evoke the consciousness. If I was clever enough with the creation of the patterns, they could be consistent with Lorentz transformations.

I actually went through all of this just to evoke from you a little more clarity about your above comments. I just set this up as a strawman to see if this is exactly the kind of thing that you would think not possible--thus, motivating your third leg.

I'm not very good at this kind of stuff, so you will have to be a little patient with me and bring me along with your thinking.
 
Last edited:
  • #15
Apeiron,

You've given me a lot of homework here. But thanks (I guess). And that Maui guy is beginning to amaze me.
 
  • #16
At this stage, I'll just go ahead and post http://dl.dropbox.com/u/8804875/Primitives%2BIntro.pdf as a reference for anyone who may be interested. Otherwise, I'm just going to end up recreating it post by post anyway.
apeiron said:
I'd like to expand on the central issue that everyone rightfully seems to be responding to - how do you model a reality where nothing seems fixed?

*SNIP*

So a smoothly self-equilibrating action along a gradient is our primitive notion. And then the question becomes what is the natural unit of measurement if we want to model departures from this equilibrium? Is it information (which has proved a pretty good yardstick for dealing with entropy)? Is it instead geometry (which is what GR and some quantum gravity approaches would seem to prefer)? Or are information and geometry two faces of the same coin here (the discrete and the continuous versions of the same measurement)?

Regardless of how the question is eventually answered, I think this is in fact the deep ontological issue.

I won't quote your entire post, but I'm in agreement with all of it (and tend to imagine things as per the bolded section). What I'm proposing is that there is value in abstracting that idea one step lower – and then continuing in the same direction.

I’ll start by characterizing your position as being based on a Spacetime+Order ontology. (I know this isn’t exactly right, but it misses the “emergent” mark in the same way as starting from a SpaceTimePerspective ontology does, so any deviation washes out in the end.) Basically, we have a collection of events which form patterns of information that are constantly shifting, but have enough overall equilibrium that we can establish a semi-static baseline as a point of departure to make meaningful measurements – even though the baseline isn’t really static.

For ease of illustration, let’s pretend that we can step outside the system, and study it from a truly godlike, dissociated point of view. If we imagine the system as a 3D spacetime block (2 space, 1 time), we can model the “Order” dimension by coloring each event in some shade of gray. Further, we’ll say that the “arrow of time” points in whichever direction shows the greatest overall trend from light to dark – that way our system slides to greater and greater entropy (darkness) as time passes.
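This grayscale block can even be sketched numerically. A minimal toy, assuming a random 16x16x16 grid with a built-in darkening trend (the grid size, noise level, and `arrow_of_time` helper are my own illustrative choices, not anything from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy block universe: 16x16x16 events, each coloured a shade of gray
# in [0, 1], darkening along axis 0 (plus noise on every event).
block = np.linspace(0.0, 1.0, 16)[:, None, None] + 0.1 * rng.random((16, 16, 16))

def arrow_of_time(block):
    """Choose the axis (and direction) with the strongest light-to-dark trend."""
    trends = [np.mean(np.diff(block, axis=ax)) for ax in range(block.ndim)]
    ax = int(np.argmax(np.abs(trends)))
    return ax, 1 if trends[ax] > 0 else -1

ax, sign = arrow_of_time(block)
print(ax, sign)  # axis 0, in the positive (darkening) direction
```

Before any "coloring" is applied (a flat `block`), all three trends vanish and the choice of arrow is indeed arbitrary, which is exactly the point being made above.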

Now, before we start coloring events (i.e. assigning relative locations in the “order” dimension) all we have is empty spacetime. We don’t even have a standard by which to establish an arrow of time. It could point up, or left, or back – it wouldn’t matter at all. In fact, since the whole block is featureless, we can’t even really call it a spacetime. It’s just a bunch of events. It isn’t until we add some variable component that we get enough contrast to say anything about it at all. So even though we may never have any such thing as “absolute black” or “absolute white,” we at least have contrast, and contrast lets us say something about the system.

(To use Bobc2’s analogy, if we don’t have any yarn, we don’t have any worldlines, or interactions, and therefore no way to define a set of light cones. The light cones get introduced with the yarn, and with the rules we establish about the way the threads interact.)

Likewise, if we focus on a single event within the gradated system, we really can’t say anything about that either. A single event simply “is.” (Any grayscale value only has meaning in context.) So, in order to start building anything useful, we need to start establishing the relations between multiple events. We need intervals. Some of these intervals we’ll call space-like, some we’ll call time-like (and the ones that fall smack in the middle of these two we’ll call light-like, but I’ll ignore those because they just complicate the discussion unnecessarily). We can also classify the intervals between shades of gray as “potentials” and create rules about the way they "interact" and change over time.

But now we’ve introduced something different. The space-like and time-like intervals are used passively. We use them to describe the system. But the varying potentials aren’t used to merely describe – they are used to explain. So we have three fundamental units of measure, but one of those three is not like the others. And that prevents us from truly unifying them.

What I’m suggesting is that we need a third unit of measure (class of relation) that merely describes. That way, instead of Spacetime+Order, we can get rid of that pesky + sign and have SpaceTimePerspective. Any dynamical qualities we then assign to the system (like order) would be defined as complex relations (knots) in STP, and would decompose to those relations. In fact, we can even make a point to define “Perspective” in a way that the newly emergent quality “order” appears and behaves exactly as it did before.

Now, at the level we are discussing, this change appears completely arbitrary. "So we have STP instead of ST+Order, so what?" The conceptual shift doesn't really seem to accomplish anything at all – after a short detour, we seem to be right back where we started - and at this level, that's absolutely true. But it's like compound interest. There may not be any appreciable conceptual gain in the short term, but in the long term, the difference can be staggering.

Current approaches to unification focus on tactics like increasing the dimensionality of the symmetry groups that define interactions. Or increasing the number of spatial dimensions in the universe. But we don’t need more dimensions of space. (Well, at least that's not all we need.) We need a new class of dimension. New dimensions of space have to be treated just like the old ones (inverse square law, etc.). But having a dimension of Perspective gives us something really new – a free parameter that isn’t bound up in the same rules. And if we define it properly, it gives us a new way to approach some sticky situations that we’ve been mired in for a long time.

For example, the term “locality” would no longer be synonymous with “spatiotemporal locality.” So rather than trying to avoid any conflict between quantum entanglement and relativity using subtle semantics and wordplay – e.g. making careful distinctions between “correlation” and “causation” – we can confront the issue head on. In this case, if we adopt a position that interprets relativity as forbidding space-like causal interactions – while simultaneously interpreting entanglement as demonstrating causal interactions between patterns at space-like separation – we don’t actually have any conflict, since the phrases “can’t be space-like” and “must be time-like (or light-like)” would no longer be synonymous.

Perspective then isn’t a new charge or potential or dynamical component to physical theory that attempts to explain the outcome of interactions. By itself, it is simply a passive relation that can be drawn between events. And the point that I began with in this thread (and in https://www.physicsforums.com/showthread.php?t=467045), is that we are already making implicit associations between events that cannot be classified using the relations of space or time! Any time we speak of “information,” we are associating a collection of events with one another. But we dismiss that as "just being part of the process of observation." If we define perspective in a way that requires any associated events to have a perspectival (perspective-like) interval in common with our point of reference (either intrinsically or extrinsically), then we aren’t even adding anything new. We’re just formally acknowledging the associations that we are already making anyway!
 
Last edited by a moderator:
  • #17
ZachHerbert said:
Perspective then isn’t a new charge or potential or dynamical component to physical theory that attempts to explain the outcome of interactions. By itself, it is simply a passive relation that can be drawn between events.

I sort of agree with you on "perspective". But I can't then really evaluate the calculational apparatus you develop from it. It certainly sounds crank because you claim to explain everything in the end from dark energy, to inflation, to nonlocality. :smile:

However, sticking just with the notion of perspective, I think what you are talking about here is the point - made in hierarchy theory and Peircean semiotics - that scale is a third "primitive".

A hierarchy is where you have a global or higher level context acting as a downward set of constraints, focusing the identity of (individuating) the local events (which equally are the constructive acts that in bottom-up fashion create the global context). So there is a two-way causality or interaction or equilibrating that acts across scale (and makes scale the meaningful third "primitive").

Now in your scheme, you treat perspective as an axis that measures some distance (or angle, as you are using spherical co-ordinates). To me, this seems like the distance between the general context (the global ambient state of the system that is its baseline) and some local event (the measurable extent of deviation from that global state).

Which would fit a thermal view. The observing context in practice for the universe is the average temperature of the vacuum. The CMB gives you a baseline. Then local events (in the shape of massive particles and the fields/energies associated with them) each have some distance from this universal average that determines their dynamics.

Now you will note that spacetime in fact exists at both the local and global scale in this view. The global scale is a thermally coherent spacetime, just a large enough light cone to act as a stable (stable enough) context. And the events happen at spacetime instances within this spacetime context. Perspective would then be the thermal gap represented by individual particles - the distance between them and the general background.

As I say, this idea of a triadic hierarchical system is quite general. It does indeed apply to modelling consciousness - but in a really complex way. So for the purposes of modelling the physical situation, thermodynamics gives you a straightforward way to talk about energy gradients using (local) information and (global) event horizons as your yardstick.

Thus if I interpret your approach correctly, what you are doing is zeroing the local spacetime scale to some ambient average (your space and time axes mark out all the potential locales in a cold void), then the third axis of perspective measures how hot and energetic some actual locale is by comparison.

Most of the universe's locales would in fact have no perspective as they would be generic. There would be no measurable phase difference for points of the void that shared the same temperature. But you are of course concerned with modelling the localised departures from the generic state represented by locales that are massive or otherwise energetic and so have a distinguishable local dynamics.

Hope this is a reasonable interpretation anyway.
 
  • #18
First of all, thank you Apeiron for actually reading the paper and making the attempt to understand what I’m saying. The effort is very much appreciated.

apeiron said:
I sort of agree with you on "perspective". But I can't then really evaluate the calculational apparatus you develop from it. It certainly sounds crank because you claim to explain everything in the end from dark energy, to inflation, to nonlocality. :smile:
I know it sounds crank, but I do want to clarify (for those who haven’t read it) that I’m not claiming that Perspective provides a unique explanation for those phenomena. Only that it provides a fundamentally different context in which to examine the topics. My belief is that that in itself is worth something. Any calculational apparatus would obviously have to be built on top of the idea (which is why I keep reinforcing that I’m presenting a philosophy, not a theory). Perspective, as a general concept, could be modeled in different ways in different theories - just like "time."

apeiron said:
Thus if I interpret your approach correctly, what you are doing is zeroing the local spacetime scale to some ambient average (your space and time axes mark out all the potential locales in a cold void), then the third axis of perspective measures how hot and energetic some actual locale is by comparison.
I think you’ve mostly got what I’m saying, but there seems to be one crucial difference. (And help me if I’m not following you.) The idea is that Perspective works as a conceptual complement to space and time - adopting the causal “involvement” of space-like intervals, without spreading that involvement over time-like intervals. The distances in perspective – the actual intervals – are then used to model the geometry of the hierarchy, but not (by themselves) to model entropy (which would still vary over time). Perspective itself remains a (semi) independent variable (just like space and time).

So, rather than relating entropy to distances in perspective, we want to relate it to the number of phase values at each point. Since a frame of reference has to be found inside the system, the number of phase values assigned to an event is partially dependent on the point of origin. The idea here is to build epistemological limitations into the model. (Am I viewing a single phase from two perspectives, or two from a single perspective? Do the values share a deeper point in common (thus modeling a portion of an intrinsically complex hierarchical system)? Or is it just noise? Or is my perspective compromising my view of the system?)

apeiron said:
Most of the universe's locales would in fact have no perspective as they would be generic.
Not quite. Each event would simply be a location in STP. We can't have an event in STP with no P coordinate, just like we can't have an event in spacetime with no T coordinate. Again, we want to abstract perspective away from the explanation of the system. (We don't want to explain a system by merely citing its location in perspective, just like we wouldn't want to explain a system by merely citing its location in space.)
 
Last edited:
  • #19
apeiron said:
No, information is a way to model constraints, a limit on entropy. It is a way to atomise form.


Information is most definitely NOT a way to model constraints. Information is MEANING, which in turn is as vague as it could ever be. How is information a limit on entropy?? There could be information about near-infinitely high entropy (i.e. just before the presumed heat death of the universe).



apeiron said:
Minds are concerned with meaning or semiosis - some specific state of constraint. Information theory is a way to measure reality at a more general level, stripped of definite meaning.


No, information and meaning are one and the same. If information is not meaningful, it's not information. It's a USELESS sequence of symbols that carries NO information whatsoever.



apeiron said:
It is actually taking the "mind" out of the measuring and so allowing specific states of constraint to be broken down into generalised descriptions.


It might be more difficult to remove mind from the measuring processes than is obvious. There is no fundamental difference between information (meaning) about reality and reality. You can reconstruct that statement in the form:

"There is no fundamental difference between information (meaning) about the Universe and the Universe."
 
Last edited:
  • #20
Maui said:
Information is most definitely NOT a way to model constraints.

See for example http://arxiv.org/PS_cache/arxiv/pdf/0906/0906.3507v1.pdf to understand what I meant.

"The key probability distributions often arise as the most random pattern consistent
the information expressed by a few constraints (Jaynes, 2003)."
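The Jaynes point can be illustrated with a toy calculation. Assuming, purely for illustration, outcomes {0, 1, 2} with the single constraint that the mean be 1, the "most random pattern consistent with the constraints" turns out to be the uniform distribution:

```python
import math

def entropy_bits(ps):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Distributions on {0, 1, 2} with mean fixed at 1 form the one-parameter
# family (a, 1 - 2a, a). Grid-search that family for maximum entropy.
best_h, best_a = max(
    (entropy_bits((a, 1 - 2 * a, a)), a) for a in (i / 1000 for i in range(1, 500))
)
print(round(best_a, 3))  # maximised at a = 1/3: the uniform distribution
```

So the constraint (the information) limits the entropy; within that limit, maximum randomness picks out the distribution.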

Maui said:
No, information and meaning are one and the same. If information is not meaningful, it's not information. It's a USELESS sequence of symbols that carries NO information whatsoever.

This is confusing syntax with semantics. Information theory is a general measure of bits so that a nonsense string and a meaningfully ordered string can be counted as being equal in length.

See for instance http://en.wikipedia.org/wiki/Algorithmic_information_theory

"...people also sometimes attempt to distinguish between "information" and "useful information" and attempt to provide rigorous definitions for the latter, with the idea that the random letters may have more information than the encyclopedia, but the encyclopedia has more "useful" information."

So you are talking about information as if it is always useful. Information theory is a more general measure that breaks everything down into useless bits (shedding all particular meanings along the way).
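As a concrete sketch of that last point (the pangram and the random string are my own examples): Shannon's measure counts symbols from their statistics alone, so a meaningless string registers just as much "information" as a meaningful one of the same length.

```python
import math
import random
import string
from collections import Counter

def shannon_entropy_bits(s):
    """Average bits per symbol, computed from the string's own character frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "the quick brown fox jumps over the lazy dog"
random.seed(1)
nonsense = "".join(
    random.choice(string.ascii_lowercase + " ") for _ in range(len(meaningful))
)

# Same length, same alphabet: to the information-theoretic yardstick the
# two strings are peers, regardless of which one "means" anything.
print(len(meaningful), round(shannon_entropy_bits(meaningful), 2))
print(len(nonsense), round(shannon_entropy_bits(nonsense), 2))
```

The yardstick never consults a dictionary; meaning has been generalised away, which is exactly what makes it universal.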

Maui said:
It might be more difficult to remove mind from the measuring processes than is obvious.

Or vice versa - to find mind/meaning in the descriptions. Hence issues like the symbol grounding problem.

http://en.wikipedia.org/wiki/Symbol_grounding

The problem is that meanings are particular. They flow from a particular state of relations, a particular point of view. Information theory was derived by generalising points of view to a raw state of meaningless-ness - from which of course the intention was to reconstruct models of meaningful (ie: constrained in a particular way) states.

If you still have a different understanding of the relation between information theory, constraints and meaning, perhaps you can come back with some references that explain your position?
 
  • #21
As a newcomer to the Forum, let me jump into this interesting discussion, where I would tend to support what apeiron proposes, and perhaps add a few things.
First, the word “information” is used in telecommunication engineering in order to evaluate the transmission capacity of communication channels [Shannon]. The possible meaning of information is not taken into account (“semantic aspects of communication are irrelevant to the engineering aspects.”).
Also, it is to be noted that meaningful information does not exist by itself, but is generated or used by a system for some given reason related to the nature of the system. An animal sensing the presence of a predator will generate the meaningful information “presence not compatible with survival constraints”, and consequently implement an escape action.
Meaningful information is system related. Different systems receiving the same information can generate different meanings (think about different persons reading the same paper).
As given information can produce different meanings (or produce no meaning), I feel we should keep to the wording “meaningful information”.
I would also favor a systemic approach to meaning generation, where a system submitted to a constraint generates meaningful information that will be used to implement an action in order to satisfy the constraint (http://www.mdpi.com/1099-4300/5/2/193/pdf).
 
  • #22
CMCdL said:
Also, it is to be noted that meaningful information does not exist by itself, but is generated or used by a system for some given reason related to the nature of the system. An animal sensing the presence of a predator will generate the meaningful information “presence not compatible with survival constraints”, and consequently implement an escape action.

Hi Christophe - what your paper highlights for me is the dynamical aspect of meaning. So a chemical gradient is just static or passive information until it is "read" for a purpose.

My favourite example of this at the primitive level is E. coli. It has membrane receptors that sample the chemical gradients of its environment. But there is no meaning unless the bacterium is moving. The meaningful information is then the simple binary distinction of either "more" or "less".

So information theory deals with presence/absence. Meaning deals with a dynamical situation where there is an orientation to change - increasing/decreasing.

And the resulting behaviour is also beautifully simple. The signal detected only has to reverse the direction of the cork-screwing flagella. Rotating anti-clockwise, the flagella bundle into one and drive the E. coli forward along a gradient. But lose the scent and the flagella reverse, untangle, and send the E. coli into a tumbling, random swimming motion - until the right direction is found again.
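That two-word repertoire is simple enough to simulate. A minimal run-and-tumble sketch in Python (the 1-D gradient, step size, and step count are my own illustrative assumptions):

```python
import random

random.seed(42)

def concentration(x):
    """A simple attractant gradient increasing to the right."""
    return x

def run_and_tumble(steps=2000):
    """Keep heading while the signal reads 'more'; tumble to a random
    heading the moment it reads 'less'."""
    x = 0.0
    heading = random.choice([-1.0, 1.0])
    last = concentration(x)
    for _ in range(steps):
        x += heading * 0.1
        now = concentration(x)
        if now < last:  # lost the scent: tumble
            heading = random.choice([-1.0, 1.0])
        last = now
    return x

final = run_and_tumble()
print(final > 0)  # the cell drifts up the gradient
```

Note that the agent never measures the gradient itself, only the binary more/less comparison over time - meaning here is inseparable from the motion.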

Coming back to "perspective", I think meaning is indeed a further axis of reality - and it would be an axis of hierarchical scale. You have space and time (all the information about what is static or located PLUS all the information that is dynamic and global) then the third thing of all the ways that this spacetime can be "read" off.

That is, all the possible scales of interaction, communication. And here, as we know from relativity, there is a baseline for such meaning-making, which is the constraint of the speed of light. There is a basic lightcone structure to the universe which in fact makes its spacetime meaningful.

Just space and time gives you only a view of the information that constitutes the universe. But space, time and scale gives you a world with now-meaningful information. There is a gradient which can be "read".
 
  • #23
apeiron said:
If you still have a different understanding of the relation between information theory, constraints and meaning, perhaps you can come back with some references that explain your position?


No, we are talking about different things. I was pointing out that there is no definition of "information" that doesn't require the use of ambiguous terms like 'minds'. Since we mean many different things by "minds", we mean different things by information. We all know what it (information) does, but ultimately we don't know what it is and how it does what it does. We need a theory of mind mechanics that is still hundreds of years into the future in the best possible scenario.

Speculation is good when it rests on more than someone's inclination to believe some idea, or someone's confused thoughts laid out in paper format (of which there are hundreds on the web, and peer review is necessary so that the wildest ones get thrown in the trash bin). You can literally find a paper on the web supporting any idea you can think of these days (it must be real hard for the moderation staff here to weed out all the junk).

For the record, I had no intention to discuss information theory, which has a very narrow focus and doesn't concern what I initially stated in the thread.
 
Last edited:
  • #24
Maui said:
No, we are talking about different things. I was pointing out that there is no definition of "information" that doesn't require the use of ambiguous terms like 'minds'. Since we mean many different things by "minds", we mean different things by information. We all know what it (information) does, but ultimately we don't know what it is and how it does what it does. We need a theory of mind mechanics that is still hundreds of years into the future in the best possible scenario.

Well, I don't know how that helps. Sure there are a huge varieties of views out there on all the issues. But I also feel that it is possible to separate the signal from the noise if you devote yourself to the effort. Everybody's talking, but it turns out some know what they are talking about :wink:.
 

What is "Individuation as Brute Assumption"?

"Individuation as Brute Assumption" refers to the idea that every individual entity has a unique identity or essence that sets it apart from all other entities.

What is the purpose of "Individuation as Brute Assumption" in science?

The concept of "Individuation as Brute Assumption" is used in science to help explain the diversity and complexity of the natural world. It allows scientists to categorize and study individual organisms or objects based on their distinct characteristics.

How does "Individuation as Brute Assumption" relate to the concept of evolution?

"Individuation as Brute Assumption" and evolution are closely related as they both recognize the unique identities of individual organisms. Evolution explains how these identities and characteristics have changed over time through natural selection and adaptation.

What is the role of genetics in "Individuation as Brute Assumption"?

Genetics plays a crucial role in "Individuation as Brute Assumption" as it helps to determine an organism's unique traits and characteristics. The study of genetics allows scientists to understand how and why individuals within a species may differ from one another.

How does "Individuation as Brute Assumption" impact our understanding of the world?

"Individuation as Brute Assumption" allows us to better understand and appreciate the diversity and complexity of the natural world. It also helps us to recognize and appreciate the unique identities of individuals within a species, and how they contribute to the overall balance and functioning of ecosystems.
