Dynamical Qualities and the Informational Paradigm

  • Thread starter ZachHerbert
In summary, the author suggests that if dynamical qualities cannot be embodied by a singular event, and if multiple events must be associated with one another in a way that cannot be classified using spatiotemporal relationships alone, then both particles and their dynamical qualities must be treated as emergent.
  • #36
ZachHerbert said:
I know that they aren't fundamental in the same way, but it seems that Jammer's critique still applies. If "quark color" can't be meaningfully isolated from "the symmetry of rotations in SU(3)" - and it seems that it can't - then doesn't that fall into the same circular mess as force/mass? The strong force wasn't invented arbitrarily. It was introduced to explain the behavior of physical systems - just like force/mass was used to explain why some things "weigh more" than others (I know, oversimplified again). I agree that the symmetry route is much cleaner. But it still doesn't make it through the field of play unscathed.

Well, yes, I noticed that you meant charge/mass in general, which is why I thought the other paper was relevant (speaking about structure on the sub-Planck scale).

I actually agree with ZapperZ that it's not so circular really. For me, it's merely a matter of infinite regress. If you keep breaking matter up into smaller and smaller parts, how can you ever be confident that you've reached the "bottom"?

We seem to have found a bottom limit with quantum systems, and dispersive quantum systems (those that can exchange with their environment) are still an open problem; this leaves open an informational approach. For many, the 2nd law of thermodynamics and quantum chaos seem like good ways to approach the problem (i.e. you trace the phase-space state of a classical system back to its original state with such high precision that you must suddenly consider the Heisenberg uncertainty principle). But since it's a classical system, it's potentially chaotic (so small changes to the initial condition will evolve into large changes), and since we can't define the system's original state with infinite precision, this puts significant weight on the inherent loss of information in the universe: entropy.
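A minimal sketch of that sensitivity, using the logistic map purely as a stand-in for a chaotic classical system (the specific map and parameter are illustrative assumptions, not anything tied to the physics under discussion):

```python
# Illustrative only: two trajectories of the chaotic logistic map x -> r*x*(1-x),
# started a tiny distance apart, diverge until they are effectively unrelated.
r = 4.0                                  # parameter value in the fully chaotic regime
x_a, x_b = 0.400000000, 0.400000001      # "same" initial state up to a tiny measurement error

for step in range(60):
    x_a = r * x_a * (1.0 - x_a)
    x_b = r * x_b * (1.0 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: |x_a - x_b| = {abs(x_a - x_b):.3e}")

# The separation grows roughly exponentially (positive Lyapunov exponent) until it
# saturates at order one, so any finite-precision specification of the initial state
# is quickly "forgotten".
```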

ZachHerbert said:
My point is that if the value must demonstrate variation initially (to individuate it from the environment), and then change (to be considered dynamic) - and do both from a relative context (and not from a static absolute) - then even the initial value requires information to model. And if that value never changes, then by what criteria do we claim that time has passed? (Unless, like Newton, we assign time to a static, absolute point of reference. But then we're back to unobservable metaphysics.)

I'm not sure what difference you assign between "demonstrating variation" and "changing".

So then, you would say we can't individuate c or G from the environment, since they don't change?

ZachHerbert said:
This is where I depart a bit. I don't believe there is "definiteness" outside of a metaphysical (unobservable) ideal. Merely citing "macroscopic objects" is not sufficient. The observing system (as frame of reference, means of interaction, and consciousness interpreting the result) cannot be ignored in any philosophically viable description of reality. (We don't experience true "definiteness," we experience "information" that has been pre-molded by unseen cognitive structures.) And I don't see that we have any established mechanism to adequately integrate an "observer" into a "system" without succumbing to Jammer's trap and preserving the gap between kinematics and dynamics - and thus (at least to my mind) failing to establish a truly adequate model of unification. (Yes, Jammer was partly a ruse to initiate the conversation, but the reference is still both valid and applicable to the topic of discussion. In case anyone forgot, it was to talk about the role of dynamical qualities in a potentially unified, informational model of reality. :tongue:)

I don't see how this is a departure; this seemed to be precisely the point to me. I can see how you might take the one sentence you quoted out of context if you didn't read the rest of what I provided...
 
  • #37
ZachHerbert said:
We can classify the intervals between events as “space-like” or “time-like.” But how does a space-like interval between “associated” events differ from a space-like interval between “non-associated” events?

Or thirdly, as lightcone-like. People trying to develop more fundamental notions of co-ordinates post-GR, like Penrose with his twistors, would talk about integrated spacetime rather than separated space + time. So the gaps between events would be spacetime-like, that is lightcone-like.
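For concreteness, the standard three-way classification being referred to, by the sign of the Minkowski interval, can be sketched as follows (a minimal illustration assuming flat spacetime and units with c = 1):

```python
def classify_interval(dt, dx, dy=0.0, dz=0.0):
    """Classify the separation between two events in flat spacetime (c = 1).

    Uses the (-, +, +, +) signature convention, which is just a choice:
      s2 < 0 : time-like  (a slower-than-light signal can connect the events)
      s2 = 0 : light-like (only light can connect them)
      s2 > 0 : space-like (no causal signal can connect them)
    """
    s2 = -dt**2 + dx**2 + dy**2 + dz**2
    if s2 < 0:
        return "time-like"
    if s2 == 0:  # exact comparison is fine for these hand-picked illustrative values
        return "light-like"
    return "space-like"

print(classify_interval(dt=2.0, dx=1.0))  # time-like
print(classify_interval(dt=1.0, dx=1.0))  # light-like
print(classify_interval(dt=1.0, dx=2.0))  # space-like
```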

There are other things going on, like QM non-locality (which implies a limited or local form of retrocausality a la Cramer IMHO). And then the thermodynamic arrow of time (which is the bigger, fatter, global arrow of causality pointing the other way).

To break this complexity down into metaphysical primitives is indeed an interesting challenge. But so far I am not getting any sense that you are going the right way about it (even though it is always useful to try to see things from other people's POV).

As an aside, if you want the hierarchy theory perspective on this, it might be worth reading Stan Salthe's Evolving Hierarchical Systems, especially the bits on "cogent moments". It is a far more rigorous approach than Wilber's holons and integral theory.
 
  • #38
Pythagorean said:
Well, yes, I noticed that you meant charge/mass in general, which is why I thought the other paper was relevant (speaking about structure on the sub-Planck scale).

I actually agree with ZapperZ that it's not so circular really. For me, it's merely a matter of infinite regress. If you keep breaking matter up into smaller and smaller parts, how can you ever be confident that you've reached the "bottom"?

Unless infinite regress hits bottom at the Planckian limits of scale and "beyond" is "indefinite". ie: ontically indeterminate. ie: vague.

Pythagorean said:
For many, the 2nd law of thermodynamics and quantum chaos seem like good ways to approach the problem (i.e. you trace the phase-space state of a classical system back to its original state with such high precision that you must suddenly consider the Heisenberg uncertainty principle).

It is an interesting line of thought, but I think probably fallacious: if the initial conditions are traced back to a QM level, then we would be trying to make measurements in a place where all is indefinite/indeterminate/vague. We could not actually make any kind of exact measurement in such a place.

Chaos theory says if you could describe exact initial conditions, then you could have exact deterministic predictability. And QM then says well you can't when it comes to measuring the actual world. You can only start from a coarser grain of semi-classical measurements. So this gives you statistical predictions, not fully-deterministic ones.
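A toy way to see the "statistical rather than fully deterministic" point: evolve an ensemble of initial conditions that are all consistent with a single coarse-grained measurement (again, the logistic map is just an illustrative stand-in):

```python
import random

# Illustrative only: 10,000 initial conditions, all consistent with one coarse
# "measurement" x0 = 0.4 +/- 1e-6, evolved under the same deterministic chaotic map.
r = 4.0
ensemble = [0.4 + random.uniform(-1e-6, 1e-6) for _ in range(10_000)]

for _ in range(50):
    ensemble = [r * x * (1.0 - x) for x in ensemble]

# Each individual trajectory is deterministic, but the ensemble has spread across
# the whole interval, so only statistical statements about the outcome survive.
mean = sum(ensemble) / len(ensemble)
var = sum((x - mean) ** 2 for x in ensemble) / len(ensemble)
print(f"ensemble mean = {mean:.3f}, variance = {var:.3f}")
```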
 
  • #39
Pythagorean said:
I'm not sure what difference you assign between "demonstrating variation" and "changing".

So then, you would say we can't individuate c or G from the environment, since they don't change?
No difference other than how we label them. (eg. variation over space, change over time.) My point was simply that any value robust enough to serve as an explanation to some physical phenomenon couldn't be self-contained within a single event. (The same with the values for c and G. You don't need the values to change, but you do need a collection of events before the values are meaningful.)

Pythagorean said:
I don't see how this is a departure; this seemed to be precisely the point to me. I can see how you might take the one sentence you quoted out of context if you didn't read the rest of what I provided...
Sorry, I should have quoted more than that. I only meant that I differed in whether the program of decoherence actually met the full requirement quoted below...
D’Espagnat said:
In fact, scientists most rightly claim that the purpose of science is to describe human experience, not to describe “what really is”; and as long as we only want to describe human experience, that is, as long as we are content with being able to predict what will be observed in all possible circumstances (. . . )
I don't think that demonstrating the existence of macroscopic objects and their spatiotemporal relations is sufficient to describe human experience. The most omnipresent aspect of human experience is our ability to associate multiple, distinct bits of information in our cognitive awareness simultaneously. Per my post above, defining the nature of this "association" is far more fundamental to describing human experience than just explaining why macroscopic objects demonstrate classical rather than quantum behavior or predicting the outcome of an experiment.

apeiron said:
Or thirdly, as lightcone-like.
Sorry, I've been leaving out "light-like" intervals for simplicity. I've been assuming an integrated spacetime, but didn't say so explicitly. (The idea of decomposing spacetime back down into space+time seems very kludgy to me, so I tend to forget to clarify.)

apeiron said:
As an aside, if you want the hierarchy theory perspective on this, it might be worth reading Stan Salthe's Evolving Hierarchical Systems, especially the bits on "cogent moments". It is a far more rigorous approach than Wilber's holons and integral theory.
Thanks, I'll check that out.
 
  • #40
apeiron said:
Unless infinite regress hits bottom at the Planckian limits of scale and "beyond" is "indefinite". ie: ontically indeterminate. ie: vague.



It is an interesting line of thought, but I think probably fallacious: if the initial conditions are traced back to a QM level, then we would be trying to make measurements in a place where all is indefinite/indeterminate/vague. We could not actually make any kind of exact measurement in such a place.

Chaos theory says if you could describe exact initial conditions, then you could have exact deterministic predictability. And QM then says well you can't when it comes to measuring the actual world. You can only start from a coarser grain of semi-classical measurements. So this gives you statistical predictions, not fully-deterministic ones.

I think we're actually in agreement because that's the whole point of the discipline: quantum chaos. Here, I am presenting the problem, not a solution. Quantum Chaos, as a field, is essentially the search for a solution: it's not hinging on any currently proposed solution.

ZachHerbert said:
I don't think that demonstrating the existence of macroscopic objects and their spatiotemporal relations is sufficient to describe human experience.

That's not the claim though, is it? The claim is that macroscopic observations are insufficient to completely describe reality because they are limited by our perception of reality. That we have a fundamentally flawed notion of what reality actually is, and it's a notion that's not explicitly measured/observed/stated (as you pointed out in your first reply to this idea). So we don't really know where our flaw is: it's either a collection of assumptions we're not conscious of or a very fundamental flaw in the whole way we're able to perceive (the former being recoverable information, the latter being irrecoverable information).

Thermodynamics (properly described by quantum mechanics) implies it's the latter; for example, particles don't have a definite position or momentum. The notion of "being" that humans experience is being projected onto particles to which it does not apply, and this (the general uncertainty principle) is a result of particle indistinguishability (there is fundamentally no difference between this electron and that electron).
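For reference, the "general uncertainty principle" mentioned here is usually written as the Robertson relation (a standard textbook statement, quoted only as background):

```latex
\sigma_A \, \sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat{A}, \hat{B}] \rangle \bigr|,
\qquad\text{which for position and momentum gives}\qquad
\sigma_x \, \sigma_p \;\ge\; \tfrac{\hbar}{2}.
```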
 
  • #41
Pythagorean said:
That's not the claim though, is it? The claim is that macroscopic observations are insufficient to completely describe reality because they are limited by our perception of reality.
That may be true, but it doesn't excuse the omission. If physics is comfortable with taking the concepts of space and time seriously - both of which have origins in phenomenal human perception - then this third relation should be taken just as seriously. Embracing variation and change, while simultaneously ignoring the integrated perspectives which give us the ability to define those concepts in the first place, seems like dashing out of the restaurant before paying for the meal.
 
  • #42
ZachHerbert said:
That may be true, but it doesn't excuse the omission. If physics is comfortable with taking the concepts of space and time seriously - both of which have origins in phenomenal human perception - then this third relation should be taken just as seriously. Embracing variation and change, while simultaneously ignoring the integrated perspectives which give us the ability to define those concepts in the first place, seems like dashing out of the restaurant before paying for the meal.

It's taken seriously... that's the obvious direction of development (which is why Zurek ends on a note of information theory).

But why would you expect it in a physics paper? You're talking about neuroscience, psychology, and information theory now... not just physics. From a physics point of view, the paper is already suggestive enough as it is.

Also, recognize that utilizing space and time is not the same as studying their nature. Not all physicists study the nature of space and time; some just use it. And that is sufficient for physics, yes, as long as the behavior they're describing is predicted reliably with those assumptions (which it is: e.g. planetary motion, cannonball trajectories).

Your ideal version of a scientist would agree with you, of course; he would want to know everything about everything and how it all ties together (this is called a first-year physics undergraduate), but your ideal is not the reality by the time the student has a PhD. At least, not if they want to remain sane and working in physics.
 
  • #43
Pythagorean said:
But why would you expect it in a physics paper? You're talking about neuroscience, psychology, and information theory now... not just physics.
Well that's the part that is new in what I'm proposing. I agree that the complex structures that form human cognitive states are emergent. But what I'm suggesting is that the bare, fundamental relation that is at work in the observer/information dynamic should be treated as primitive.

Since physics is deeply rooted in the idea that spatiotemporal relations play some objective role in describing the universe - and aren't just first-person phantoms of human experience - I'm simply proposing that "perspective" be treated the same way. The mere introduction of a third class of relation at that level has a powerful impact on several long-standing debates (both inside and outside of physics). Basically, the purpose of http://dl.dropbox.com/u/8804875/Primitives%2BIntro.pdf [Broken] is to show how even a simple toy model of the idea changes the fundamental nature of some deeply rooted paradoxes. And, in my opinion, a different way to look at a problem is always valuable.
 
  • #44
ZachHerbert said:
But how does a space-like interval between “associated” events differ from a space-like interval between “non-associated” events? There is a demonstrably different relation at work. But the relation has never been formally classified. We can’t “explain” it using the charge/force dynamic (in whatever clothes we choose to dress it in). And we can't account for it using space and time alone. We've run out of variables.
My guess would be that the spacelike intervals between associated and nonassociated events aren't different in any important way.

However, the relationships that allow us to distinguish associated (e.g., entangled) events are embodied in, i.e., ultimately traceable to, the preparation procedures (e.g., emission by the same atom during the same transition process, direct interaction, the common application of an identical torque, etc.). This is how entanglement is understood and reliably and repeatedly produced. So, I don't think there's any great mystery as to what the relationships are due to.

ZachHerbert said:
My argument is that by classifying the relation axiomatically, and relating this "new primitive” to space and time ...
The exact physical nature of the relationship(s) can be different. But the relationships are also nonvariable from trial to trial and run to run wrt a given preparation. (Hence their resistance to being modeled as hidden variables.) So, how would you classify these relationships axiomatically in the form of a single new primitive?

My guess is that things will continue pretty much as they have been. For experimental preparation and measurement (i.e., at the level of our sensory perception), individuation can continue to be taken for granted, and the more or less de facto operational interpretation and practice of standard qm would remain the preferred formalism (as a set of rules) for calculating outcome probabilities wrt certain experimental procedures. Meanwhile, less practical but entirely deterministic foundational formalisms, possibly incorporating a fundamental dynamical principle to tie together the apparent behavioral connections between the various particulate media in the emergent hierarchy, might be developed in addition to that.

ZachHerbert said:
... what I'm suggesting is that the bare, fundamental relation that is at work in the observer/information dynamic should be treated as primitive.
This seems the same as taking individuation and suitably collected and processed observational data at the level of our sensory perception for granted and using it to evaluate competing statements about the world. If so (and even if not), then how is what you're proposing going to change anything about the way scientists do science or one's worldview that isn't already possible without it? How might what you're proposing clarify anything?
 
  • #45
ThomasT said:
However, the relationships that allow us to distinguish associated (e.g., entangled) events are embodied in, i.e., ultimately traceable to, the preparation procedures (e.g., emission by the same atom during the same transition process, direct interaction, the common application of an identical torque, etc.). This is how entanglement is understood and reliably and repeatedly produced. So, I don't think there's any great mystery as to what the relationships are due to.
But this isn't correct. If all of the relevant information could be found at time-like or light-like intervals to the result, then the terms "entanglement" and "nonlocality" would never have been invented in the first place.

ThomasT said:
But the relationships are also nonvariable from trial to trial and run to run wrt a given preparation. (Hence their resistance to being modeled as hidden variables.)
This isn't the whole picture though. Just because hidden variables can't preserve locality doesn't mean that locality can be preserved without them. From "Quantum Non-Locality and Relativity" by Tim Maudlin...
Maudlin said:
Bell himself derived the result as part of an examination of so-called local hidden-variables theories. Such theories attempt to eliminate the stochastic element of orthodox quantum theory by adding extra parameters to the usual quantum formalism, parameters whose values determine the results of the experiments. Bell's results are therefore sometimes portrayed as a proof that local deterministic hidden-variables theories are not possible.
This is a misleading claim. It suggests that the violation of the inequalities may be recovered if one just gives up determinism or hidden variables. But as we have seen, the only assumption needed to derive the inequalities is that the result of observing one particle is unaffected by the experiment carried out on the other. Subject to this restraint, no deterministic or stochastic theory can give the right predictions, no matter how many or how few variables are invoked.
Since a natural method of attempting to ensure the isolation of the two particles from one another is to carry out the relevant observations in distantly separated places, the isolation condition is generally called “locality”. The assumption involved is that observations made on one photon can in no way alter the dispositions of the other photon to pass or be absorbed by its polarizer. Adopting this terminology uncritically for a moment, we have shown that Bell’s inequality must be obeyed by any local theory of any sort. Adding stochastic elements does not help the situation at all, as noted above. So experiments verifying the violation of Bell’s inequality would doom locality tout court.
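To put a number on that, here is a minimal sketch comparing the quantum-mechanical prediction for the CHSH combination of correlations with the bound of 2 that any local theory must satisfy (assuming the ideal singlet correlation E(a,b) = -cos(a - b) and the usual optimal analyzer angles):

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for spin measurements on an ideal singlet pair.
    return -math.cos(a - b)

# The usual optimal analyzer angles for the CHSH combination.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"CHSH value |S| = {abs(S):.3f}  (any local theory requires |S| <= 2)")
# Prints about 2.828 = 2*sqrt(2): the quantum prediction exceeds the local bound.
```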
 
  • #46
ZachHerbert said:
But this isn't correct. If all of the relevant information could be found at time-like or light-like intervals to the result, then the terms "entanglement" and "nonlocality" would never have been invented in the first place.
All the relevant information is in the preparation procedures, and is represented in the qm formalism.

The term entanglement was used by Schroedinger to refer to the relationship between particles that had interacted. The interesting thing about it was that after an interaction then more was known about the biparticle system, from the assumed relationship, than could be inferred from maximum knowledge of the individual subsystems without the relationship.

The term nonlocality, wrt its usual or historical meaning, is simply a misnomer when applied to this stuff. Conservation laws allow precise values wrt, say, entangled pair observables (ie., correlations, or coincidence stats), while individual stats remain totally random. This is what's being referred to when entangled biparticle states are called nonlocal.

Qm completely describes the correlations of spacelike separated, entangled, subsystems, while at the same time offering only an incomplete description of the physical reality of the subsystems. This is what's being referred to when the qm description of entanglement is called nonlocal.
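As a toy illustration of that last point (coincidence statistics precise, individual statistics completely random), assuming an idealized singlet pair measured along the same axis at both wings:

```python
import random

# Illustrative only: N idealized singlet pairs measured along a common axis.
# Each single-wing outcome is +/-1 with 50/50 probability; the pair is perfectly
# anti-correlated, as conservation of angular momentum requires.
N = 100_000
a_outcomes = [random.choice((+1, -1)) for _ in range(N)]
b_outcomes = [-a for a in a_outcomes]

mean_a = sum(a_outcomes) / N                                   # ~0: individually random
mean_b = sum(b_outcomes) / N                                   # ~0: individually random
corr = sum(a * b for a, b in zip(a_outcomes, b_outcomes)) / N  # exactly -1

print(f"<A> = {mean_a:+.3f}, <B> = {mean_b:+.3f}, <A*B> = {corr:+.3f}")
# Note: this same-axis case by itself is reproducible by a local model; the Bell-type
# tension only appears across multiple analyzer settings, as in the CHSH sketch above.
```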

ThomasT said:
But the relationships are also nonvariable from trial to trial and run to run wrt a given preparation. (Hence their resistance to being modeled as hidden variables.)

ZachHerbert said:
This isn't the whole picture though.
Of course not. There isn't any "whole picture". But it does point to why the qm account is sufficient and hidden variable accounts aren't.

ZachHerbert said:
Just because hidden variables can't preserve locality doesn't mean that locality can be preserved without them.
The principle of local causality is a fundamental working hypothesis of modern physics. There is, to date, absolutely no physical evidence to the contrary. Hence, preserving locality isn't a problem -- while demonstrating that nature is nonlocal is a problem.

Anyway, the important question posed to you had to do with how you think your approach might clarify anything. I don't think that it does. I think that at best it's just superfluous, and at worst confusing. But I invite you to convince me otherwise.
 
  • #47
ZachHerbert said:
Well that's the part that is new in what I'm proposing. I agree that the complex structures that form human cognitive states are emergent. But what I'm suggesting is that the bare, fundamental relation that is at work in the observer/information dynamic should be treated as primitive.

Since physics is deeply rooted in the idea that spatiotemporal relations play some objective role in describing the universe - and aren't just first-person phantoms of human experience - I'm simply proposing that "perspective" be treated the same way. The mere introduction of a third class of relation at that level has a powerful impact on several long-standing debates (both inside and outside of physics). Basically, the purpose of http://dl.dropbox.com/u/8804875/Primitives%2BIntro.pdf [Broken] is to show how even a simple toy model of the idea changes the fundamental nature of some deeply rooted paradoxes. And, in my opinion, a different way to look at a problem is always valuable.

I think ZapperZ's comments would begin to apply once you formalize these thoughts in a paper like this. You have an interesting perspective, but you really need to ground it in the established physics and spend more time developing your ideas.

Here's a good lecture series from a very well-spoken instructor. He touches on many of our points just in this first lecture (plus I always love India's intros to their lecture videos):



Though, I think the best thing you can do is enroll in academia and get a degree in physics.
 
  • #48
ThomasT said:
But it does point to why the qm account is sufficient and hidden variable accounts aren't.

The principle of local causality is a fundamental working hypothesis of modern physics. There is, to date, absolutely no physical evidence to the contrary. Hence, preserving locality isn't a problem -- while demonstrating that nature is nonlocal is a problem.
The qm account may be sufficient to describe the behavior, but it certainly is not local. I recommend reading Maudlin's “Quantum Non-Locality and Relativity” (https://www.amazon.com/dp/0631232214/?tag=pfamazon01-20), which I quoted above. It gives a thorough presentation of Bell's theorem and the Aspect experiment, and then examines whether the observed phenomena meet the standards of superluminal causation, energy transfer, information transfer and/or signaling. It follows with several alternative ways of interpreting special relativity, and how each of these fits with the earlier analysis (and then proceeds to how the ideas of general relativity muddy the waters even more).

Another fascinating book is “Quantum Dialogue” (https://www.amazon.com/dp/0226041824/?tag=pfamazon01-20) by Mara Beller. Beller essentially gives a “political commentary” on the various arguments and tactics employed by the two primary camps involved in the early quantum theory debates (Einstein, Schroedinger, et al. vs. the Copenhagen physicists). It is interesting to see how the arguments that gave rise to the currently established positions (including the one you present above) weren't exactly above board and employed many of the same tactics used in political debate. (For example, avoiding the critical attacks on implicit nonlocality by redirecting the response to the weaker attack on hidden variables and complementarity.)

Certainly I can see how my ideas would appear superfluous without a familiarity with the problems they are meant to address. But if you truly believe that quantum theory conforms to the idea of local causal determinism, then I’m having a hard time imagining what you must think of the last 85 years of debate over how to interpret the model? Or even the importance of the Aspect experiment (since the whole point was to randomly determine the alignment of the detectors outside of the future light cone of the photon emission from the source)? Unless, I suppose, you take the position that all events were strictly determined at the moment of the big bang?
Pythagorean said:
You have an interesting perspective, but you really need to ground it in the established physics and spend more time developing your ideas.
Thank you for the feedback (and for participating in the discussion with an “outsider” in the first place). :) Obviously my focus is more on the underlying philosophy than on the physics itself. But it's been fascinating to see which points have been more readily accepted, and which points resulted in pushback – particularly since the points of contention were not at all where I expected! It's been a real help!
 
  • #49
Pythagorean said:
Though, I think the best thing you can do is enroll in academia and get a degree in physics.

ZachHerbert said:
Thank you for the feedback (and for participating in the discussion with an “outsider” in the first place). :)

Congrats to Zach for not getting riled here. And can Pythagorean please stop nicking my best lines. :rofl:

But anyway, I'd like to hear more about how Zach generalises from consciousness to perspective. His model of consciousness so far is sketchy and unconvincing. And I am left with no clear impression of what "perspective" is actually a measure of. It could be complexity, or integration, or locality, or a lot of things. So it would be interesting to refine the definition.

As I said earlier, it would also be a help, and more scholarly, to put his own definition in the context of other such efforts.

For instance, there is the semiotic approach based on CS Peirce's metaphysics. There must be 30 or 40 academics currently working on bio-semiosis and pan-semiosis as a way to generalise the notion of meaningful relationships (between observers and what is observed). You also have second-order cybernetics, relational biology, autopoietic systems - a lot of approaches that could be felt to have something to say here.

Reading Zach's paper, I don't think he gets how brains actually interact with the world. That does not necessarily matter, as consciousness might be just the "analogy that inspires" here. He could be defining a metaphysical primitive that is useful and only very loosely like subjective awareness. Or he could instead be relying on that sketchy understanding being exact, as the primary motivation of his argument. In which case I would say he is in trouble. :smile:
 

1. What are dynamical qualities?

Dynamical qualities refer to the characteristics or properties of a system that can change or evolve over time. These qualities are often used to describe the behavior or dynamics of complex systems, such as biological or social systems.

2. How are dynamical qualities related to the informational paradigm?

The informational paradigm is a theoretical framework that views the universe as a network of information processing systems. Dynamical qualities are important in this paradigm because they are seen as the fundamental units of information that shape the behavior and evolution of these systems.

3. Can dynamical qualities be measured or quantified?

Yes, dynamical qualities can be measured and quantified using various methods and techniques, such as mathematical models, simulations, and empirical data. These measurements can provide insights into the behavior and functioning of complex systems.

4. How do dynamical qualities contribute to our understanding of complex systems?

Dynamical qualities are essential in understanding how complex systems function and evolve. By studying the dynamics of these qualities, we can gain a deeper understanding of the underlying mechanisms and processes that drive complex systems, such as biological systems, ecosystems, and social systems.

5. Are there any practical applications of studying dynamical qualities?

Yes, there are many practical applications of studying dynamical qualities. For example, understanding the dynamics of biological systems can help in developing new treatments for diseases, while studying the dynamics of social systems can aid in creating more effective policies and interventions. Additionally, the study of dynamical qualities has also been applied in fields such as engineering, economics, and psychology.
