Quantization isn't fundamental

In summary, the paper "Are Particles Self-Organized Systems?" by Manasson V. discusses the idea that elementary particles can be described as self-organized dynamical systems, and that their properties such as charge and action quantization, SU(2) symmetry, and the coupling constants for strong, weak, and electromagnetic interactions can be derived from first principles. The author also suggests that quantum theory may be a quasi-linear approximation to a deeper theory describing the nonlinear world of elementary particles. While the specific model presented in the paper may have some flaws, the approach of reformulating the axioms of quantum theory based on identifying its mathematical properties is thought-provoking and warrants further exploration.
  • #1
Auto-Didact
This thread is a direct offshoot of this post from the thread Atiyah's arithmetic physics.

Manasson V. 2008, Are Particles Self-Organized Systems?
Abstract said:
Elementary particles possess quantized values of charge and internal angular momentum or spin. These characteristics do not change when the particles interact with other particles or fields as long as they preserve their entities. Quantum theory does not explain this quantization. It is introduced into the theory a priori. An interacting particle is an open system and thus does not obey conservation laws. However, an open system may create dynamically stable states with unchanged dynamical variables via self-organization. In self-organized systems stability is achieved through the interplay of nonlinearity and dissipation. Can self-organization be responsible for particle formation? In this paper we develop and analyze a particle model based on qualitative dynamics and the Feigenbaum universality. This model demonstrates that elementary particles can be described as self-organized dynamical systems belonging to a wide class of systems characterized by a hierarchy of period-doubling bifurcations. This semi-qualitative heuristic model gives possible explanations for charge and action quantization, and the origination and interrelation between the strong, weak, and electromagnetic forces, as well as SU(2) symmetry. It also provides a basis for particle taxonomy endorsed by the Standard Model. The key result is the discovery that the Planck constant is intimately related to elementary charge.

The author convincingly demonstrates that practically everything known about particle physics, including the SM itself, can be derived from first principles by treating the electron as an evolved self-organized open system in the context of dissipative nonlinear systems. Moreover, the dissipative structure gives rise to discontinuities within the equations and so unintentionally also gives an actual prediction/explanation of state vector reduction, i.e. it offers an actual resolution of the measurement problem of QT.

However, this paper goes much further: quantization itself, which is usually assumed a priori as fundamental, is here on page 6 shown to originate naturally as a dissipative phenomenon emerging from the underlying nonlinear dynamics of the system being near stable superattractors, i.e. the origin of quantization is a limiting case of interplay between nonlinearity and dissipation.
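For anyone who hasn't played with the Feigenbaum cascade before, here is a small self-contained sketch I put together (standard textbook logistic-map material, my own code, not taken from Manasson's paper): it locates the superstable "supercycle" parameters ##R_k## of the logistic map, i.e. the parameter values at which the critical point ##x=1/2## lies on the attracting ##2^k##-cycle, and estimates ##\delta## from the ratios of their spacings.

```python
def iterate(r, x, n):
    """Apply the logistic map x -> r*x*(1-x) n times."""
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

def superstable(k, r_hi=3.56994, r_lo=1.9, steps=20000):
    """Largest r in [r_lo, r_hi] with f_r^(2**k)(1/2) = 1/2, i.e. the
    supercycle parameter R_k, found by a coarse downward scan for a
    sign change followed by bisection."""
    n = 2 ** k
    g = lambda r: iterate(r, 0.5, n) - 0.5
    dr = (r_hi - r_lo) / steps
    r1, g1 = r_hi, g(r_hi)
    for i in range(1, steps + 1):
        r0 = r_hi - i * dr
        g0 = g(r0)
        if g0 * g1 <= 0.0:                 # root bracketed in [r0, r1]
            for _ in range(60):            # refine by plain bisection
                rm = 0.5 * (r0 + r1)
                gm = g(rm)
                if g0 * gm <= 0.0:
                    r1 = rm
                else:
                    r0, g0 = rm, gm
            return 0.5 * (r0 + r1)
        r1, g1 = r0, g0
    raise RuntimeError("no supercycle parameter found; widen the scan range")

R = [superstable(k) for k in range(7)]      # R_0 ... R_6
for k in range(1, 6):
    print(f"k={k}: delta estimate = {(R[k] - R[k-1]) / (R[k+1] - R[k]):.5f}")
# The estimates approach Feigenbaum's delta = 4.6692016...
```

The ratios converge towards ##\delta \approx 4.6692##, the non-quantum constant that enters the relations quoted below.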

Furthermore, using standard tools from nonlinear dynamics and chaos theory, in particular period doubling bifurcations and the Feigenbaum constant ##\delta##, the author then goes on to derive:
- the origin of spin half and other SU(2) symmetries
- the origin of the quantization of action and charge
- the coupling constants for strong, weak and EM interactions
- the number and types of fields
- an explanation of the fine structure constant ##\alpha##:
$$ \alpha = (2\pi\delta^2)^{-1} \cong \frac {1} {137}$$
- a relationship between ##\hbar## and ##e##:$$ \hbar = \frac {\delta^2 e^2} {2} \sqrt {\frac {\mu_0} {\epsilon_0}}$$
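As a quick numerical sanity check of the two relations above (my own back-of-the-envelope script using CODATA values; the script is mine, not the paper's):

```python
import math

delta = 4.669201609102990      # Feigenbaum constant (non-quantum!)
e     = 1.602176634e-19        # elementary charge [C]
mu0   = 1.25663706212e-6       # vacuum permeability [H/m]
eps0  = 8.8541878128e-12       # vacuum permittivity [F/m]
hbar  = 1.054571817e-34        # reduced Planck constant [J s]
alpha = 7.2973525693e-3        # fine-structure constant

alpha_pred = 1.0 / (2.0 * math.pi * delta**2)
hbar_pred  = 0.5 * delta**2 * e**2 * math.sqrt(mu0 / eps0)

print(f"alpha: predicted {alpha_pred:.6e}, measured {alpha:.6e} "
      f"({100 * (alpha_pred / alpha - 1):+.3f}%)")
print(f"hbar : predicted {hbar_pred:.6e}, measured {hbar:.6e} "
      f"({100 * (hbar_pred / hbar - 1):+.3f}%)")
```

Both come out within roughly 0.05% of the measured values, which is the numerical agreement the paper points to.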
In particular, the relation between ##\hbar## and ##e## suggests a great irony about the supposed fundamentality of quantum theory itself; as the author himself puts it:
page 10 said:
Ironically, the two most fundamental quantum constants, ##\hbar## and ##e##, are linked through the Feigenbaum ##\delta##, a constant that belongs to the physics of deterministic chaos and is thus exclusively non-quantum.

Our results are assonant with ’t Hooft’s proposal that the theory underlying quantum mechanics may be dissipative [15]. They also suggest that quantum theory, albeit being both powerful and beautiful, may be just a quasi-linear approximation to a deeper theory describing the non-linear world of elementary particles. As one of the founders of quantum theory, Werner Heisenberg once stated, “. . . it may be that. . . the actual treatment of nonlinear equations can be replaced by the study of infinite processes concerning systems of linear differential equations with an arbitrary number of variables, and the solution of the nonlinear equation can be obtained by a limiting process from the solutions of linear equations. This situation resembles the other one. . . where by an infinite process one can approach the nonlinear three-body problem in classical mechanics from the linear three-body problem of quantum mechanics.”[11]
Suffice it to say, this paper is a must-read. Many thanks to @mitchell porter for linking it and to Sir Michael Atiyah for reigniting the entire discussion in the first place.
 
  • #2
Btw, if there is any doubt, it should be clear that I realize Manasson's specific model isn't necessarily correct, and I am in no way promulgating his views here as being absolutely true. I do however firmly believe that Manasson's theory isn't just your run-of-the-mill pseudoscientific mumbo jumbo. Instead I believe that what he is saying is just so highly non-traditional that most readers - especially those deeply familiar with QT but relatively unfamiliar with either the practice of or the literature on dynamical systems analysis - are extremely likely to simply call it heresy with regard to established contemporary physics. Manasson is after all literally proposing that conservation laws in QT might be an emergent phenomenon, and that therefore everything physicists think and claim to know about the fundamental nature of symmetries, Noether's theorem, gauge theory and group theory is hopelessly misguided; if this doesn't strike one as heresy in the contemporary practice of physics, I don't know what does!

There are other things which should be noted as well. Reading his paper critically, for example, it is obvious his specific equations may be off by a small numerical factor; this however is almost always the case when constructing a preliminary model based on dimensional analysis, and therefore shouldn't be cause for outright dismissal. His equations as yet have no known interpretation as geometric quantities or as some known dimensionless group; this already shows that his specific equations are tentative rather than definitive. This should also be obvious from the fact that he hasn't actually published this paper in a journal despite having submitted it to the arXiv ten years ago. Curiously, he doesn't have any other publications on the arXiv either, and only one recent publication in a Chinese journal.

Regardless of the exact numerics however, Manasson's approach in this paper - reformulating the axioms of QT by identifying QT's a priori mathematical properties as the limiting case of an exactly linearizable dissipative nonlinear dynamical system - is a very promising theoretical direction, novel yet distinctly within mainstream science (see e.g. Dieter Schuch, 2018, "Quantum Theory from a Nonlinear Perspective: Riccati Equations in Fundamental Physics"). Moreover, this particular qualitative theoretical method is used constantly with other theories, both within mainstream physics (e.g. soft matter and biophysics) and, especially, outside of physics; importantly, in the practice of dynamical systems this is literally the method of empirically establishing the qualitative mathematical properties of some extremely complicated system, after which a particular nonlinear differential equation (NDE), or some class of NDEs, may be guessed.

The reason I am posting Manasson's theory is that I couldn't find any earlier threads on it, and I believe this approach definitely warrants further investigation, whether or not it will turn out to be correct in the end. Lastly, other very prominent theorists have suggested very similar lines of reasoning to Manasson's without actually giving a mathematical model, in particular Roger Penrose (1. that a nonlinear reformulation of the linear structure of QM is necessary in order to unite it with GR, and 2. that the fundamental laws of physics should be less symmetrical instead of more symmetrical) and Gerard 't Hooft (that a dissipative theory might underlie QM). Moreover, as both Richard Feynman and Lee Smolin have remarked, what may turn out to be wrong with the practice of theoretical physics is the assumption of the timelessness/ahistorical nature of (some) physical laws; dynamically evolved laws, e.g. in the form Manasson is proposing here, would address these points as well. These coincidences only make me more curious about Manasson's proposal. Incidentally, it also reminds me of something Feynman said about symmetry:
Feynman Lectures said:
We have, in our minds, a tendency to accept symmetry as some kind of perfection. In fact it is like the old idea of the Greeks that circles were perfect, and it was rather horrible to believe that the planetary orbits were not circles, but only nearly circles. The difference between being a circle and being nearly a circle is not a small difference, it is a fundamental change so far as the mind is concerned.

There is a sign of perfection and symmetry in a circle that is not there the moment the circle is slightly off—that is the end of it—it is no longer symmetrical. Then the question is why it is only nearly a circle—that is a much more difficult question. The actual motion of the planets, in general, should be ellipses, but during the ages, because of tidal forces, and so on, they have been made almost symmetrical.

Now the question is whether we have a similar problem here. The problem from the point of view of the circles is if they were perfect circles there would be nothing to explain, that is clearly simple. But since they are only nearly circles, there is a lot to explain, and the result turned out to be a big dynamical problem, and now our problem is to explain why they are nearly symmetrical by looking at tidal forces and so on.

So our problem is to explain where symmetry comes from. Why is nature so nearly symmetrical? No one has any idea why.
The Character of Physical Law said:
Another problem we have is the meaning of the partial symmetries. These symmetries, like the statement that neutrons and protons are nearly the same but are not the same for electricity, or the fact that the law of reflection symmetry is perfect except for one kind of reaction, are very annoying. The thing is almost symmetrical but not completely.

Now two schools of thought exist. One will say that it is really simple, that they are really symmetrical but that there is a little complication which knocks it a bit cock-eyed [NB: symmetry breaking]. Then there is another school of thought, which has only one representative, myself, which says no, the thing may be complicated and become simple only through the complications.

The Greeks believed that the orbits of the planets were circles. Actually they are ellipses. They are not quite symmetrical, but they are very close to circles. The question is, why are they very close to circles? Why are they nearly symmetrical? Because of a long complicated effect of tidal friction, a very complicated idea.

It is possible that nature in her heart is completely unsymmetrical in these things, but in the complexities of reality it gets to look approximately as if it is symmetrical, and the ellipses look almost like circles. That is another possibility; but nobody knows, it is just guesswork.
As we can see from Feynman's points, it would not be the first time in the history of physics that an ideal such as symmetry would end up having to be replaced; not simply by some small fix-up like symmetry breaking, but more radically, by finding some underlying dynamical theory. It goes without saying that chaos theory and nonlinear dynamics only really came into their own as large, highly interdisciplinary fields of science after Feynman had stopped doing physics; suffice it to say it would have been incredibly interesting to know what he would have thought of them.
 
Last edited:
  • Like
Likes Fra
  • #3
It's a bit depressing to always have too little time for things. I didn't yet read the paper in detail and I have no opinion of the author, but your description here makes me bite.

First a general comment: if questioning the constructing principles that follow from timeless symmetries, on which most of the modern understanding of physics rests, is heresy, then I would say the next genius that physics needs to trigger the paradigm shift required to solve its open questions is likely a heretic by definition. So no shame per se in being labeled a heretic.
Auto-Didact said:
Regardless of the exact numerics however, Manasson's approach in this paper - reformulating the axioms of QT by identifying QT's a priori mathematical properties as the limiting case of an exactly linearizable dissipative nonlinear dynamical system - is a very promising theoretical direction, novel yet distinctly within mainstream science (see e.g. Dieter Schuch, 2018, "Quantum Theory from a Nonlinear Perspective: Riccati Equations in Fundamental Physics"). Moreover, this particular qualitative theoretical method is used constantly with other theories, both within mainstream physics (e.g. soft matter and biophysics) and, especially, outside of physics; importantly, in the practice of dynamical systems this is literally the method of empirically establishing the qualitative mathematical properties of some extremely complicated system, after which a particular nonlinear differential equation (NDE), or some class of NDEs, may be guessed.
...
The reason I am posting Manasson's theory is that I couldn't find any earlier threads on it, and I believe this approach definitely warrants further investigation, whether or not it will turn out to be correct in the end. Lastly, other very prominent theorists have suggested very similar lines of reasoning to Manasson's without actually giving a mathematical model, in particular Roger Penrose (1. that a nonlinear reformulation of the linear structure of QM is necessary in order to unite it with GR, and 2. that the fundamental laws of physics should be less symmetrical instead of more symmetrical) and Gerard 't Hooft (that a dissipative theory might underlie QM). Moreover, as both Richard Feynman and Lee Smolin have remarked, what may turn out to be wrong with the practice of theoretical physics is the assumption of the timelessness/ahistorical nature of (some) physical laws; dynamically evolved laws, e.g. in the form Manasson is proposing here, would address these points as well. These coincidences only make me more curious about Manasson's proposal. Incidentally, it also reminds me of something Feynman said about symmetry:
...
As we can see from Feynman's points, it would not be the first time in the history of physics that an ideal such as symmetry would end up having to be replaced; not simply by some small fix-up like symmetry breaking, but more radically, by finding some underlying dynamical theory. It goes without saying that chaos theory and nonlinear dynamics only really came into their own as large, highly interdisciplinary fields of science after Feynman had stopped doing physics; suffice it to say it would have been incredibly interesting to know what he would have thought of them.

I have a different quantitative starting point, but some of the core conceptual ideas of the paper are fully in line with my thinking. And indeed it is very hard to convince different thinkers of the plausibility of these ideas, because they are indeed heresy with respect to many of the constructing principles of modern physics. Even discussing this unavoidably gets into philosophy-of-science territory, which immediately makes some people stop listening. This is why what it takes is for some of the bold heretics to make progress in silence, and then publish it in a state mature enough to knock the doubters off their chairs - although that is an almost superhuman task to accomplish for a single person.

1) The idea that elementary particles (or any stable system for that matter) are the result of self-organisation in a chaotic environment is exactly what evolution of law also implies. In essence the population of elementary particles and their properties implicitly encode the physical laws. So the stability of laws in time is just the other side of the coin of determining the particle zoo.

The challenge for this program is first of all to explain the stability of physical law if we claim it is fundamentally chaotic - this is where the superattractors come in. Attractor is a better word than equilibrium, but they are related. I.e. one needs to relax the laws without introducing chaos at the macro level, and here self-organisation is the key. I fully share this view. But the mathematical details are a different story.

One way to understand the stabilising mechanism by which elementary particles encode reduced information about their environment is to consider compressed sensing, a commonly used technique in signal processing (Fourier analysis) and also how neuroscientists believe the brain works: the brain is not a data recorder, it re-encodes information and stores it in a way that increases the chance of survival given the expected future.
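For readers who have not come across the term before, here is a minimal toy illustration of what compressed sensing means in the signal-processing sense (an orthogonal matching pursuit demo I wrote purely for illustration; it has nothing to do with the physics itself): a sparse vector is recovered from far fewer random linear measurements than its length.

```python
import numpy as np

rng = np.random.default_rng(0)

# A length-n signal that is k-sparse (only k nonzero entries).
n, m, k = 128, 40, 5
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)

# We only get to see m << n random linear combinations of it.
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    residual, chosen = y.copy(), []
    for _ in range(k):
        corr = np.abs(A.T @ residual)
        corr[chosen] = 0.0                     # don't pick a column twice
        chosen.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
        residual = y - A[:, chosen] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[chosen] = coef
    return x_hat

x_hat = omp(A, y, k)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
# should be essentially zero: exact recovery from 40 of 128 samples
```

The point of the analogy is only this: a short encoding can still reconstruct what matters, provided the underlying structure is sparse/compressible.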

This is exactly analogous to how, conceptually, an elementary particle's internal structure, mass and charges are tuned to be maximally stable against a hostile (noisy, challenging) environment. Once you really understand the depth of this, I find it hard not to be committed.

This is of course also another way to integrate the renormalization process with physical interactions, and with that the formation of the stable, self-organised information-processing agents that we later label elementary particles.

My personal approach here is that I have pretty much abandoned the idea of trying to convince those who do not want to understand. Instead I have taken on the impossible task of trying to work this out on my own. And likely there are a bunch of other heretics out there who have similar thinking, and hopefully someone will get the time to mature things into something worth publishing. The risk of publishing something too early is obvious.

What is needed is to take ideas from toy models and toy dimensions, make contact with contemporary physics, and make explicit pre- or postdictions about relations between some of the parameters of the Standard Model, or find the consistent framework of QG that is still lacking, etc. Anything less is likely to be shot down immediately by those who defend the conventional paradigm. The first impression from just skimming the paper is that it is a qualitative toy model more than anything else. But like Auto-Didact says, there is no reason to let incomplete details haze our view of a clear conceptual vision.

/Fredrik
 
  • Like
Likes Andrea Panza, akvadrako and Auto-Didact
  • #4
Auto-Didact said:
Regardless of exact numerics however, Manasson's approach in this paper i.e. an approach to reformulating the axioms of QT based on identifying QT's a priori mathematical properties, as resulting from a limiting case of an exactly linearizable dissipitave nonlinear dynamical system is a very promising theoretical direction which is very much a novel but distinctly mainstream science approach (see e.g. Dieter Schuh, 2018 "Quantum Theory from a Nonlinear Perspective: Riccati Equations in Fundamental Physics").

I will throw in some conceptual things here in how i understand this:

I think of the "dissipation" of a small system in a chaotic environment as a consequence of the fact that a small information-processing agent cannot encode and retain all information about its own "observations"; rather - just as we think human brains do - it performs a kind of compressed sensing, retains what is most important for survival, and dissipates (discards) the rest. What is discarded is what this observer considers "random noise". This has deep implications for understanding micro black holes as well, because the "randomness" of the radiation is observer-dependent. There is no such thing as objective randomness, only the inability of particular observers to decode it.

During this process the information-processing agent is evolving, and either gets destabilised or stabilised. Over time the surviving structures will, in some sense, be in a steady state where the information-processing agents are in a kind of agreement with the chaotic environment, relative to the existing communication channel. In this sense one can consider the "theory" encoded by the small elementary particle a naked version of the renormalized theory that is encoded in the environment. The question of unification, where we think the laws will be unified at TOE energies, is the analog of this: the TOE energy scale is where the energy is so high that the evolved information-processing agents are disintegrated. So evolution of law is what happens as the universe cools down, with the caveat that it is probably wrong to think of this cooling process in terms of thermodynamics, because there is no outside observer. The trouble is that we have only an inside view of this, and this is where one often sees fallacious reasoning, as one tries to describe it per a Newtonian schema (to use Smolin's words). That, at least, is how I think of it.

The "formation" of the first "proto-observers" in this process, as the complexity of observers increases, is where one eventually needs to make contact with elementary particles. So there should be a one-to-one relation between the particle zoo and the laws of physics. My personal twist here is that I also associate the "particle zoo" with the "observer zoo". This is how you get the intertwining mix of quantum foundations and self-organisation.

These ideas do not by themselves rule out, say, string theory; they could be compatible with string theory as well if you understand the comments in the context of evolution in the landscape, and maybe even as a pre-string era.

This way of thinking leads to a number of "problems" though, such as circular reasoning and the problem of meta-law; see Smolin's writings on this to see what he means by meta-law. How do you get a grip on this? This is a problem, and not an easy one. This is why it is easier to put things in by hand, so that at least you have a starting point. In an evolutionary picture, what is the natural starting point?

/Fredrik
 
  • #5
It seems to me that any attempt to explain quantum mechanics in terms of something more fundamental would run into the problem of accounting for Bell's Theorem. How can self-organizing systems explain EPR-type nonlocal correlations?
 
  • #6
stevendaryl said:
It seems to me that any attempt to explain quantum mechanics in terms of something more fundamental would run into the problem of accounting for Bell's Theorem. How can self-organizing systems explain EPR-type nonlocal correlations?
Glad you asked. Back in 2016 there was an experiment by Neill et al. which seemed to show an experimentally based mathematical correspondence between the entanglement entropy of a few superconducting qubits on the one hand and chaotic phase-space dynamics in the classical limit on the other. This implies that entanglement and deterministic chaos are somehow linked through ergodicity, and it offers interesting insights into non-equilibrium thermodynamics, relating it directly to the notion of an open dissipative nonlinear dynamical system as proposed here.

I posted a thread on that particular experiment a year ago, but unfortunately got no replies (link is here), and I haven't done any follow-up reading since then to see whether there have been any new experimental developments. If you want to read the paper describing that particular experiment, I'd love to hear your thoughts. As is, I think it might already go a fairly long way towards 'explaining' entanglement purely on chaotic and ergodic grounds.
 
  • #7
If you go through Bell's argument leading to his inequality, it seems that the class of theories ruled out would include the sort of self-organizing systems that you're talking about, as long as the interactions are all local. Whether the dynamics is chaotic or not doesn't seem to come into play.
 
  • Like
Likes DrClaude and anorlunda
  • #8
That simply isn't necessarily true if:
a) there is a nonlinear reformulation of QM which is inherently chaotic in some spacetime reformulation like 2-spinors or some holomorphic extension thereof like twistor theory, which after geometric quantization can reproduce the phenomenon of entanglement,
b) there exists a mathematical correspondence between entanglement and chaos,
c) some special combination of the above.

The only way one can argue against this point is to assume that linearity and unitarity are unique, necessary axioms of QT and then to view QT as an unalterable, completed theory - something which it almost certainly isn't. This is of course the standard argument most contemporary physicists do make, i.e. they rely on a premature axiomatization of physics based on unitarity and then elevate symmetry to a fundamental notion.

The problem with such an axiomatic stance w.r.t. unitarity is that physical theory is incomplete and - because physics is an experimental science wherein the outcomes of future experiments are unknown - there can never truly be a point where one may validly conclude that physical theory has actually become complete. Therefore such an in-principle axiomatization will in practice almost always be invalid reasoning in the context of (fundamental) physics; this is extremely confusing because the exact same argument is valid reasoning in the context of mathematics, precisely because mathematics, in stark contrast to physics, is completely unempirical.
 
  • #9
Auto-Didact said:
That simply isn't necessarily true if:
a) there is a nonlinear reformulation of QM which is inherently chaotic in some spacetime reformulation like 2-spinors or some holomorphic extension thereof like twistor theory, which after geometric quantization can reproduce the phenomenon of entanglement,
b) there exists a mathematical correspondence between entanglement and chaos,
c) some special combination of the above.

I don't see how Bell's proof is affected by chaos.
 
  • #10
stevendaryl said:
I don't see how Bell's proof is affected by chaos.
You misunderstand my point: entanglement wouldn't be affected by chaos; rather, entanglement would itself be a chaotic phenomenon instead of a quantum phenomenon, because all quantum mathematics would actually be a linear approximation of chaotic mathematics. Bell's theorem doesn't exclude non-local hidden variables; in essence, some kind of Lorentzian, i.e. conformal, spacetime formulation like projective twistor space would enable exactly such non-locality consistently with GR.

This is entirely possible given that dynamical systems theory/chaos theory is still an immensely large open field of mathematics in which new mathematical objects are being discovered almost daily, unifying widely different branches of mathematics ranging from complex analysis, to fractal geometry, to non-equilibrium statistical mechanics, to modern network theory, to bifurcation theory, to universality theory, to renormalization group theory, to conformal geometry, and so on.

Manasson's point is precisely that the actual underlying origin of quantization, i.e. of the quantumness of phenomena in nature, is really at bottom a very particular kind of chaotic phenomenon based on supercycle attractors, and that standard linear quantum mechanics is simply a limiting linear case of this underlying nonlinear theory.
 
  • #11
I only share some traits of the idea mentioned in the paper, but at least what I had in mind does not involve restoring realism at all. Actually, relaxing the deductive structure (which I consider to follow from what I wrote before) suggests removing even more realism. QM removes some realism, but we still have realism left in the form of the objectivity of physical law, which is assumed to be a timeless, eternal truth. The "realism" condition in the Bell argument is a deductive rule from a hidden variable to an outcome. What I have in mind means that this deductive link itself is necessarily uncertain, not to mention what a totally chaotic dependence would imply: it would imply that the function representing realism in the proof would not be inferrable, due to chaos. To just assume that it exists, while it is obvious that it is not inferrable from experiments, is to me an invalid assumption.

Anyway, in my eyes this was not the interesting part of the idea. I do not see lack of realism as a problem. I rather see the realism of physical law as we know it as a problem, because it is put in there in an ad hoc way by theorists ;-) Emergence aims to increase explanatory power by finding evolutionary and self-organisational reasons for why the laws are what they are.

/Fredrik
 
  • #12
Auto-Didact said:
You misunderstand my point: entanglement wouldn't be affected by chaos; rather, entanglement would itself be a chaotic phenomenon instead of a quantum phenomenon, because all quantum mathematics would actually be a linear approximation of chaotic mathematics.
I agree with @stevendaryl , Bell's theorem doesn't really use QM, just the assumptions of:
  1. Single Outcomes
  2. Lack of super-determinism
  3. Lack of retrocausality
  4. Presence of common causes
  5. Decorrelating Explanations (combination of 4. and 5. normally called Reichenbach's Common Cause principle)
  6. Relativistic causation (no interactions beyond light cone)
I don't see how chaotic dynamics gets around this.
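To make this concrete, here is a small toy simulation (my own sketch, standard Bell/CHSH material, nothing from the paper under discussion): the shared hidden variable ##\lambda## is generated by the fully chaotic logistic map, yet for every single value of ##\lambda## the CHSH combination of local ##\pm 1## responses equals ##\pm 2##, so its average can never exceed Bell's bound of 2, whereas QM reaches ##2\sqrt{2}##.

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared hidden variable: iterates of the fully chaotic logistic map
# (r = 4), reinterpreted as an angle in [0, 2*pi).
lam = rng.random(500_000)
for _ in range(200):
    lam = 4.0 * lam * (1.0 - lam)
lam = 2.0 * np.pi * lam

def respond(setting, lam):
    """Local deterministic response: each wing sees only its own setting
    and the shared lambda, and outputs +/-1."""
    return np.where(np.cos(lam - setting) >= 0.0, 1, -1)

a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

A1, A2 = respond(a1, lam), respond(a2, lam)
B1, B2 = respond(b1, lam), respond(b2, lam)

# Bell's algebraic core: for each individual lambda the combination
# A1*B1 + A1*B2 + A2*B1 - A2*B2 can only be +2 or -2, so its average
# (the CHSH value S) can never exceed 2 in magnitude.
chsh_per_lambda = A1 * B1 + A1 * B2 + A2 * B1 - A2 * B2
print("values taken per lambda:", np.unique(chsh_per_lambda))
print("local-model CHSH average S =", chsh_per_lambda.mean())
print("quantum-mechanical maximum  =", 2 * np.sqrt(2))
```

Making the hidden dynamics more chaotic only changes the distribution of ##\lambda##; it never touches the per-##\lambda## algebraic bound, which is the whole content of the theorem.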
 
  • Like
Likes no-ir and Demystifier
  • #13
I don't see that the purpose is to attack Bell's theorem.

As I see it, the idea of particles - and thus physical law - that evolve to represent a kind of maximally compressed encoding of their chaotic environment leads to the insight that there can be observer-dependent information that is fully decoupled and effectively isolated from the environment, due to the mechanism of compressed encoding. But this does not mean that one can draw the conclusion that hidden variable theories are useful; it rather says the opposite, that they are indistinguishable from non-existing hidden variables from the point of view of inference, and part of the evolved compressed-sensing paradigm means that any inference is "truncated", respecting the limited computational capacity.

This rather has the potential to EXPLAIN quantum weirdness, but NOT in terms of a regular hidden variable theory of the kind forbidden by Bell's theorem; rather, in terms of a picture where subsystems are self-organised compressed-sensing structures, which means that information can be genuinely hidden, as in observer-dependent.

To FULLY explain this will require nothing less than solving the whole problem, of course, which contains a lot of hard subproblems. But I personally think it is easy to see the conceptual vision here, though I also understand that different people have different, incompatible visions.

/Fredrik
 
  • #14
Fra said:
This rather has the potential to EXPLAIN quantum weirdness, but NOT in terms of a regular hidden variable theory of the kind forbidden by Bell's theorem; rather, in terms of a picture where subsystems are self-organised compressed-sensing structures, which means that information can be genuinely hidden, as in observer-dependent.

I guess I'm repeating myself, but it seems to me that self-organizing systems IS a regular hidden variable theory. So it's already ruled out by Bell. Unless it's nonlocal (or superdeterministic).
 
  • #15
Fra said:
I don't see that the purpose is to attack Bell's theorem.
I definitely don't think that he is attacking Bell's theorem; it's just that, in a sense, Bell-inequality-violating correlations are the most perplexing feature of QM. We know other aspects of quantum mechanics, e.g. superposition, interference, teleportation, superdense coding, indistinguishability of non-orthogonal states, non-commutativity of measurements, measurements unavoidably disturbing the system, etc., can be replicated by a local hidden variable theory. However, post-classical correlations cannot be.

So anything claiming to replicate QM should first explain how it achieves these post-classical correlations. Replicating anything else is known to pose no problems.
 
Last edited:
  • Like
Likes no-ir, Demystifier and Auto-Didact
  • #16
stevendaryl said:
I guess I'm repeating myself, but it seems to me that self-organizing systems IS a regular hidden variable theory. So it's already ruled out by Bell. Unless it's nonlocal (or superdeterministic).
If this is your view, I understand your comments; so at this level we have no disagreement.

But what I have in mind with evolved particles is NOT a regular hidden variable theory. Let me think about how I can briefly explain this better.

/Fredrik
 
Last edited:
  • #17
Fra said:
But what I have in mind with evolved particles is NOT a regular hidden variable theory. It is rather something extremely non-linear.

But "regular hidden variable" theory INCLUDES "extremely non-linear" systems. Bell's notion of a hidden-variables theory allows arbitrary interactions, as long as they are local. Nonlinearity is not ruled out.

(Actually, "nonlinear" by itself doesn't mean anything. You have to say what is linear in what.)
 
  • #18
DarMM said:
I agree with @stevendaryl , Bell's theorem doesn't really use QM, just the assumptions of:
  1. Single Outcomes
  2. Lack of super-determinism
  3. Lack of retrocausality
  4. Presence of common causes
  5. Decorrelating Explanations (combination of 4. and 5. normally called Reichenbach's Common Cause principle)
  6. Relativistic causation (no interactions beyond light cone)
I don't see how chaotic dynamics gets around this.

Again, chaotic dynamics doesn't have to get around any of that. I will try to answer each point by stating what I am assuming/hypothesizing:

1) Analyticity/holomorphicity making single outcomes a necessity per the conventional uniqueness and existence arguments.
2) No assumption of superdeterminism.
3) No assumption of retrocausality of matter or information. What may however 'travel' in either direction in time is quantum information; this is a very unfortunate misnomer because quantum information is not a form of information!
4 & 5) The physical common cause is the specific spacetime pathway connecting some EPR pair: quantum information (which is not information!) 'travels' on this path. Actually, 'traveling' is an incorrect term; the quantum information merely exists non-locally on this spacetime path. An existing mathematical model capable of capturing this concept is spin network theory. For those who need reminding, spin network theory is a fully spaceless and timeless description wherein quantum information exists on the edges of the network.
6) All matter and information can only follow timelike and lightlike curves respectively, traveling within or on the light cones. Quantum information existing non-locally across the entire spacetime path connecting any entangled EPR pair doesn't violate this, being neither a form of matter nor of information. A relativistic generalization of spin network theory capable of describing this - in which this non-locality is inherently explicit - is twistor theory. Twistor theory moreover utilizes a symmetry group which is both compatible with relativity theory and has a semi-simple representation theory (in contrast to the Poincaré group), namely the conformal group, which is an extension of the Poincaré group.

An important point to make is that Manasson's theory, being dissipative in nature, automatically provides an underlying theory capable of explaining state vector reduction, i.e. wave function collapse, implying that it is a physically real phenomenon as well. This would automatically imply a psi-ontic interpretation, i.e. that the wave function is a really existing phenomenon in nature.

Notice that I am not saying that the above arguments are correct per se, but merely that they are a logically valid possibility which is, mathematically speaking, completely conceivable and possibly even directly constructible using existing mathematics.
 
Last edited:
  • #19
stevendaryl said:
I guess I'm repeating myself, but it seems to me that self-organizing systems IS a regular hidden variable theory. So it's already ruled out by Bell. Unless it's nonlocal (or superdeterministic).
It seems you are prematurely viewing the concept of self-organization from a very narrow viewpoint.

Moreover, entanglement is or at least can be understood fully as a non-local phenomenon; this isn't inconsistent with GR either.
stevendaryl said:
(Actually, "nonlinear" by itself doesn't mean anything. You have to say what is linear in what.)
In order to make this argument one doesn't actually need to specify linear in what per se, but merely to assume that the correct equation(s) are nonlinear maps, e.g. (some class of coupled) nonlinear PDEs, instead of a linear PDE like the Schrödinger equation or coupled linear PDEs like the Dirac equation.
 
  • #20
Auto-Didact said:
It seems you are viewing the concept of self-organization from a very narrow view.

I haven't made any assumptions about self-organization, so I'm viewing it in the very broadest way - it could mean anything at all. Bell's theorem doesn't make any assumptions about whether the dynamics is self-organizing or not.
 
  • #21
Auto-Didact said:
Again chaotic dynamics doesn't have to get around any of that. I will try to answer each point, by stating what I am assuming/hypothesizing:

1) Analyticity/holomorphicity making single outcomes a necessity per the conventional uniqueness and existence arguments.
2) No assumption of superdeterminism.
3) No assumption of retrocausality of matter or information.
4 & 5) The physical common cause is the specific spacetime pathway connecting some EPR pair
6) All matter and information can only follow timelike and lightlike curves...

Notice that I am not saying that the above arguments are correct per se, but merely a logically valid possibility which is mathematically speaking completely conceivable and possibly even already directly constructable using existing mathematics.
This will not simulate non-classical correlations then. You either need superdeterminism, retrocausality (i.e. an actual information-carrying physical interaction moving back in time), nonlocality (actual interaction at a distance), rejection of Reichenbach's principle (specifically the no-decorrelating-explanation part), rejection of single outcomes, or rejection of the concept of the fundamental dynamics being mathematical (i.e. a rejection of scientific realism).

It doesn't matter if the dynamics is chaotic, dissipative and more nonlinear than anything ever conceived; unless one of these things is true, Bell's theorem guarantees it will fail to replicate non-classical correlations.
 
  • #22
If that is your point of view then this doesn't follow:
stevendaryl said:
it seems to me that self-organizing systems IS a regular hidden variable theory
'Regular' strange attractors are already infinitely complicated due to topological mixing. Supercycle attractors, on the other hand, seem to increase the degree of complexity of this topological mixing by some arbitrarily high amount, such that the entire space taken up by the dense orbits of the original strange attractor - after bifurcating in this particular way - 'collapses' onto a very particular set of discrete orbitals, becoming in the context of QM indistinguishable from discrete quantum orbits.
 
Last edited:
  • #23
DarMM said:
This will not simulate non-classical correlations then. You either need to have superdeterminism, retrocausality (i.e. actual information carrying physical interaction moving back in time), nonlocality (actual interaction at a distance), rejection of Reichenbach's principle (specifically the no decorrelating explanation part), rejection of single outcomes or reject the concept of the fundamental dynamics being mathematical (i.e. anti scientific realism).

It doesn't matter if the dynamics is chaotic, dissipative and more nonlinear than anything ever conceived, unless one of these things is true Bell's theorem guarantees it will fail to replicate non-classical correlations.
Did you miss that I specifically said that the entire scheme can consistently be made non-local using spin network theory or (the mathematics of) twistor theory?

Manasson's theory only explains quantisation; it isn't a theory of everything. Just adding spin networks to Manasson's preliminary model already seems to solve all the problems regarding reproducing QM in its entirety.
 
  • #24
Auto-Didact said:
If that is your point of view then this doesn't follow:
'Regular' strange attractors are already infinitely complicated due to topological mixing. Supercycle attractors, on the other hand, seem to increase the degree of complexity of this topological mixing by some arbitrarily high amount, such that the entire space taken up by the dense orbits of the original strange attractor - after bifurcating in this particular way - 'collapses' onto a very particular set of discrete orbitals, becoming in the context of QM indistinguishable from discrete quantum orbits.

I think that you are misunderstanding my point. I don't care how complicated the dynamics are because Bell's theorem doesn't make any assumptions about complexity.
 
  • #25
As I have stated multiple times now, consistently adding something like spin networks or twistor theory to Manasson's theory immediately makes the resulting theory non-local, thereby removing the complaints you have regarding Bell's theorem. I see no reason why this cannot be done.
 
  • #26
Auto-Didact said:
Did you miss that I specifically said that the entire scheme can consistently be made non-local using spin network theory or (the mathematics of) twistor theory?

Manasson's theory only explains quantisation; it isn't a theory of everything. Just adding spin networks to Manasson's preliminary model already seems to solve all the problems regarding reproducing QM in its entirety.
I saw it, but I was confining the discussion to Manasson's theory explicitly; possible modifications are hard to discuss if they are not developed.

However, to discuss that: I get that it might seem that spin network theory will solve the problem, but I would suggest reading up on current work in Quantum Foundations. All realist models (nonlocal, Many-Worlds, retrocausal or superdeterministic) display fine-tuning problems, as shown in the Wood-Spekkens and Pusey-Leifer theorems for example. It's not enough that something seems to solve the problem; if you try it, an unnatural fine-tuning will emerge.
 
  • #27
Manasson's theory is clearly preliminary; just because it has not yet reproduced entanglement or Bell inequalities doesn't mean that it is wrong or of no value whatsoever. It is way too early to expect that from the theory.

The fact that it - in its very preliminary form - seems to be able to directly reproduce so much (quantisation, spinors, the coupling constants of the strong/weak/EM interactions, a resolution of the measurement problem) using so little is what one should be focusing on.

No one ever said it would, as is, immediately reproduce QM fully, but rather that it gives an explanation for where quantization itself comes from, which implies QM is not the fundamental theory of nature.

Complaining that a preliminary model which explains the origin of some phenomenon, without also fully reproducing that phenomenon, is wrong or not worth considering because it doesn't immediately reproduce the entire phenomenon is making a serious category error. That would be analogous to a contemporary of Newton dismissing Newton and his work because Newton didn't invent a full theory of relativistic gravity and curved spacetime in one go.
DarMM said:
However, to discuss that: I get that it might seem that spin network theory will solve the problem, but I would suggest reading up on current work in Quantum Foundations. All realist models (nonlocal, Many-Worlds, retrocausal or superdeterministic) display fine-tuning problems, as shown in the Wood-Spekkens and Pusey-Leifer theorems for example. It's not enough that something seems to solve the problem; if you try it, an unnatural fine-tuning will emerge.
Apart from the possible issue with fine-tuning, this part sounds thoroughly confused. QM itself can essentially be viewed as a non-local theory; this is what Bell's theorem shows. What I understood from Pusey and Leifer's paper was that QM may not just be non-local but may have an element of retrocausality as well, i.e. quantum information can, through entanglement, travel backwards in time while not being a form of signalling (i.e. quantum information not being information). How is this any different from what I am arguing for?
 
  • #28
Auto-Didact said:
this is a very unfortunate misnomer because quantum information is not a form of information!
Hi Auto-Didact:

I would appreciate it if you would elaborate on this concept. Wikipedia
says
In physics and computer science, quantum information is information that is held in the state of a quantum system. Quantum information is the basic entity of study in quantum information theory, and can be manipulated using engineering techniques known as quantum information processing. Much like classical information can be processed with digital computers, transmitted from place to place, manipulated with algorithms, and analyzed with the mathematics of computer science, so also analogous concepts apply to quantum information. While the fundamental unit of classical information is the bit, in quantum information it is the qubit.
Is your difference with Wikipedia simply a vocabulary matter, or is there some deeper meaning?

Regards,
Buzz
 
  • #29
DarMM said:
It's not enough that something seems to solve the problem. If you try an unnatural fine tuning will emerge.
I'm pretty sure you are aware that Sabine Hossenfelder wrote an entire book about the irrelevance of numbers seeming unnatural, i.e. that naturalness arguments have no proper scientific basis and that holding to them blindly is actively counter-productive for the progress of theoretical physics.

Moreover, I'm not entirely convinced by it, but I recently read a paper by Strumia et al. (yes, that Strumia) which argues quite convincingly that demonstrating near-criticality can make anthropic arguments and arguments based on naturalness practically obsolete.
Buzz Bloom said:
Is your difference with Wikipedia simply a vocabulary matter, or is there some deeper meaning?
Read this book.
Quantum information is a horrible misnomer: it is not a form of information in the Shannon, information-theoretic/signal-processing sense, i.e. the known and universally accepted definition of information in mathematics and computer science.

This fully explains why entanglement doesn't work by faster-than-light signalling: it isn't transmitting information in the first place, but something else. It is unfortunate that this something else can so easily be referred to colloquially as information as well, which is exactly what happened when someone came up with the term.
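To illustrate the no-signalling point with a few lines of linear algebra (a toy check of textbook QM that I wrote myself, nothing specific to Manasson): whatever measurement angle Alice chooses on her half of a Bell pair, Bob's local outcome statistics stay exactly 50/50, so no Shannon information is transmitted by her choice.

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2); qubit A is Alice's, B is Bob's.
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def alice_projector(theta, outcome):
    """Projector for Alice measuring along angle theta in the x-z plane
    (outcome 0 or 1), tensored with the identity on Bob's qubit."""
    up = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    vec = up if outcome == 0 else np.array([-up[1], up[0]])
    return np.kron(np.outer(vec, vec), np.eye(2))

bob_reads_0 = np.kron(np.eye(2), np.diag([1.0, 0.0]))   # |0><0| on Bob's side

def bob_marginal(theta):
    """P(Bob reads 0), summed over Alice's outcomes, after Alice measures
    at angle theta."""
    p = 0.0
    for outcome in (0, 1):
        post = alice_projector(theta, outcome) @ phi_plus
        p += post @ bob_reads_0 @ post
    return p

for theta in (0.0, np.pi / 3, np.pi / 2, 1.2345):
    print(f"Alice's angle {theta:.4f}:  P(Bob reads 0) = {bob_marginal(theta):.6f}")
# Always 0.500000: Bob's statistics carry no imprint of Alice's setting,
# so entanglement by itself cannot be used to send a signal.
```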

The continued usage is as bad as, if not worse than, laymen confusing the concept of velocity with that of force, especially because computer scientists/physicists actually came up with the name!
 
  • #30
Auto-Didact said:
Read this book.
Hi Auto-Didact:

Thanks for the citation.
Quantum (Un)speakables
Editors: Bertlmann, R.A., Zeilinger, A.
Publication date: 01 Sep 2002
Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
List Price: US $129​
Neither my local library, nor the network of libraries it belongs to, has the book.
I did download the Table of Contents, 10 pages. Can you cite a particular part (or parts) of the book that deals with the question I asked about "quantum information vs. information"? The local reference librarian may be able to get me a copy of just the part(s) I need.

Regards,
Buzz
 
Last edited:
  • #31
Auto-Didact said:
I'm pretty sure you are aware that Sabine Hossenfelder wrote an entire book about the complete irrelevance of numbers seeming unnatural i.e. that naturalness arguments have no proper scientific basis and holding to them blindly are actively counter-productive for the progress of theoretical physics.

Moreover, I'm not entirely convinced by it, but I recently read a paper by Strumia et al. (yes, that Strumia) which argues quite convincingly that demonstrating near-criticality can make anthropic arguments and arguments based on naturalness practically obsolete
Well, these aren't just numbers; unless fine-tuned, realist models will have their unusual features become noticeable, i.e. in retrocausal theories, if you don't fine-tune them, the retrocausal signals become noticeable and usable macroscopically, and similarly for nonlocal theories. This could be correct, but it's something to keep in mind. It isn't fine-tuning in the sense you are thinking of (special parameter values), but the presence of superluminal (etc.) signalling in these theories outside very specific initial conditions.

Auto-Didact said:
No one ever said it would, as is, immediately reproduce QM fully, but rather that it gives an explanation for where quantization itself comes from, which implies QM is not the fundamental theory of nature.
There are a few models that do that.
Auto-Didact said:
Complaining that a preliminary model which explains the origin of some phenomenon, without also fully reproducing that phenomenon, is wrong or not worth considering because it doesn't immediately reproduce the entire phenomenon is making a serious category error.
I think this is overblown. I'm not saying it shouldn't be considered; I'm just saying that the features of QM it does solve (e.g. the measurement problem, quantisation) are easily done, even in toy models. It is the details of how it explains entanglement that would need to be seen, and in advance we know this will involve fine-tuning in its initial conditions. Whether that is okay/worth it could then be judged in light of all the other features it may have. What I was discussing is that "solving" entanglement is known to take much more than this and to have unpleasant features.
 
Last edited:
  • Like
Likes Auto-Didact
  • #32
Buzz Bloom said:
Hi Auto-Didact:

Thanks for the citation.
Quantum (Un)speakables
Editors: Bertlmann, R.A., Zeilinger, A.
Publication date 01 Sep 2002
Publisher Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
List Price: US $129​
Neither my local library, nor the network of libraries it belongs to, has the book.
I did download the Table of Contents, 10 pages. Can you cite a particular part (or parts) of the book that deals with the question I asked about "quantum information vs. information"? The local reference librarian may be able to get me a copy of just the part(s) I need.

Regards,
Buzz
It's been a while; I can't remember exactly. What I do remember however is that the book is definitely worth reading. It isn't merely some book on QM foundations, but a book on quantum information theory and a partial biography of John Bell as well. Just check the list of authors if you feel you need any convincing. In any case, check your conversations.
DarMM said:
It isn't fine-tuning in the sense you are thinking of (special parameter values), but the presence of superluminal (etc) signalling for these theories outside very specific initial conditions.
You might as well just say superluminal signalling etc.; referring to these problems as fine-tuning is another very unfortunate misnomer, especially given the far more familiar fine-tuning arguments for life/the Earth/the universe etc.

Btw, I am actively keeping in mind what you are calling fine-tuning problems, in so far as I'm aware of them. This is my current main go-to text for seeing what a new theory needs both to solve and to take into account w.r.t. the known issues in the foundations of QM, and this is the text which in my opinion best explains how the "nonlinear reformulation of QM" programme is trying to solve the above problem, and which moreover uses a specific kind of preliminary prototype model illustrating the required mathematical properties.
DarMM said:
There are a few models that do that.
Some references would be nice; pretty much every other model/theory I have ever seen besides this one was obviously wrong or completely unbelievable (in the bad sense of the word).
 
  • #33
stevendaryl said:
But "regular hidden variable" theory INCLUDES "extremely non-linear" systems. Bell's notion of a hidden-variables theory allows arbitrary interactions, as long as they are local. Nonlinearity is not ruled out.

(Actually, "nonlinear" by itself doesn't mean anything. You have to say what is linear in what.)
You are right, non-linear was the wrong phrase (which I realized and changed, but too late). I was trying to give a quick answer.

Bell's theorem is about probabilities, and my view is that any P-measure, or system of P-measures, is necessarily conditional upon, or even identified with, an observer; I of course take an observer-dependent Bayesian view of P-measures (with "observer" here read as a particle, i.e. a generalisation of a measurement device, not the human scientist). In my view the generalized notion of observer is NOT necessarily a classical device; that is the twist. And the P-measures are hidden in the sense that no other observer can observe the naked expectations of another observer, and there is no simple renormalization scheme you can use either: such a comparison is simply indistinguishable from the normal physical interaction. One observer can only try to abduce the naked expectations of another system from that system's actions, as seen from the first observer's own perspective.

This is loosely analogous (given that analogies are never perfect) to how geometry guides matter, and matter evolves geometry. What we have here is an evolutionary process where theory (as encoded in a particle's internal structure) guides the action of the particles, while the actions of the population of particles similarly evolve the theory. If you complain that this is not precise enough mathematically, that's correct, but I am trying to save the vision here, in spite of the admittedly incomplete and even confusing, almost contradictory, details.

It is this evolution of law, identified with the tuning of elementary particles, that can informally be thought of as a random walk in a similarly evolving theory space that is self-organising. The task is to make this explicit, and to show that there are stable preferred attractors and that these correspond to the Standard Model. IF this totally fails, then we can dismiss this crazy idea, but not sooner, I think.
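Just to make the attractor picture slightly less hand-wavy, here is a deliberately crude toy sketch (my own illustration, nothing from Manasson's paper or an actual theory space): a noisy walk on a hypothetical double-well landscape. Wherever it starts, the walker ends up parked near one of the two stable minima, which is the kind of "preferred attractor" behaviour I have in mind:

Python:
import random

# Toy sketch only (my own illustration): a hypothetical double-well landscape
# V(x) = (x^2 - 1)^2 with stable attractors at x = -1 and x = +1.  A noisy
# walker, a crude stand-in for a "random walk in theory space", settles into
# one of them regardless of where it starts.

def grad_V(x):
    # derivative of V(x) = (x**2 - 1)**2
    return 4.0 * x * (x * x - 1.0)

def run(x0, steps=5000, step_size=0.01, noise=0.05):
    x = x0
    for _ in range(steps):
        x += -step_size * grad_V(x) + noise * random.gauss(0.0, 1.0)
    return x

random.seed(0)
finals = [run(random.uniform(-2.0, 2.0)) for _ in range(20)]
print([round(x, 2) for x in finals])  # every run ends up clustered near -1 or +1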

Once we are at the attractor, it is business as usual with symmetries etc. I am not suggesting we restore realism, nor am I suggesting that simple self-organising classical chaos explains QM! That is not enough, agreed, but it is also not what I mean.

/Fredrik
 
  • #34
stevendaryl said:
Bell's theorem doesn't make any assumptions about complexity.

I agree that what will not work is any underlying observer-invariant classical probability model, however crazy and nonlinear its chaotic dynamics, in which transitions follow some simple conditional probability. This will not work because the whole idea of an observer-independent probability space is deeply confused.

That is my opinion: each interacting subsystem implicitly encodes its own version of the P-spaces. Such models are, to my knowledge, not excluded by Bell's theorem, because the P-measures used in the theorem are not fixed but evolving, and one has to define which observer is making the Bell inferences.

So the conjecture is not to explain QM as a classical HV model (no matter how chaotic) where the experimenter is simply ignorant of the hidden variables. The conjecture would be to explain QM in terms of interacting information-processing agents (elementary particles, to refer to the paper) that self-organize their "P-spaces" so as to maximise stability. Any interaction between two systems takes place at two levels: a regular residual interaction, in which the observers have evolved an agreement to disagree that nevertheless leaves them both stable, and a more destructive level, which evolves the P-measures themselves. QM as we know it should emerge from the residual interactions, but the evolutionary mechanisms are what is needed to understand unification. I.e. the KEY is to include the "observer", the encoder of the expectations, in the actual interactions.

With this said, the connection I made to the original paper was that, in an approximate sense, one can probably "explain" an elementary particle as an evolved information-processing agent in a chaotic environment. Here the chaos is relevant because it exposes the particle's insufficient computational capacity to decode its environment, and this fact determines its properties, or so goes the conjecture. There is no actual model for this yet.
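As a crude illustration of that last point (again just a toy of my own, not a model of a particle): an agent that can only store the state of a chaotic environment to finite precision loses all predictive power after a handful of steps, even though the environment is fully deterministic:

Python:
# Toy illustration (my own, not from the paper): an agent that records the
# state of a chaotic environment (logistic map at r = 4) with only finite
# precision quickly loses the ability to predict it, even though the
# dynamics are fully deterministic.

def logistic(x):
    return 4.0 * x * (1.0 - x)

env = 0.3712345           # the true environmental state
model = round(env, 3)     # the agent's finite-precision record of it

for step in range(1, 21):
    env, model = logistic(env), logistic(model)
    if step % 5 == 0:
        print(f"step {step:2d}: true={env:.6f}  predicted={model:.6f}  error={abs(env - model):.6f}")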

I feel I may be drifting a bit here, but my only point in this thread was to support a kind of "hidden variable" model, one in which the hidden variables are really just the observer-dependent information, so that it does not have the structure of classical realism rejected by Bell's theorem. Such a model would then have generic traits such as being evolved, and the exact symmetries we are used to would correspond to attractors, not attractors in a simple fixed theory space but attractors in an evolving theory space. This last point is key, as otherwise we run into all the kinds of fine-tuning problems well known to any Newtonian schema.

Sorry for the rambling; I am going off air for some time, so I will not interfere any more over the next few days.

/Fredrik
 
  • #35
Auto-Didact said:
Might as well just say superluminal signalling, etc.; referring to these problems as fine-tuning is another very unfortunate misnomer, especially given the far more familiar fine-tuning arguments for life/Earth/the universe, etc.
Fine-tuning has long been used for both initial-condition tuning and parameter tuning; I don't think parameter tuning has any special claim on the phrase. Besides, it is standard usage in quantum foundations to refer to this as "fine-tuning", and I prefer to use terms as they are used in the relevant fields.

It couldn't be called "superluminal signalling", as the fine-tuning is the solution to why we don't observe superluminal (or retrocausal, etc.) signalling at macroscopic scales in realist models.

Auto-Didact said:
Some references would be nice; pretty much every other model/theory I have ever seen besides this one was obviously wrong or completely unbelievable (in the bad sense of the word).
Well, a simple toy model showing that a large number of quantum mechanical features result purely from a fundamental epistemic limit is here:
https://arxiv.org/abs/quant-ph/0401052

It's just a toy model and there are much more developed ones, but you can see how easy it is to replicate a large part of QM, except for entanglement, which is why entanglement is the key feature one has to explain.
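For concreteness, here is a rough sketch of the single elementary system of that toy model as I read it: four ontic states, epistemic states of maximal knowledge given by two-element subsets (the "knowledge balance principle"), and three reproducible measurements given by the pair-partitions of the ontic states. Preparing the analogue of |0>, measuring in the "X-like" basis and then re-measuring "Z-like" gives a 50/50 result, the toy analogue of measurement disturbance. (The labels and the code are mine, not Spekkens' notation.)

Python:
import random

# Rough sketch of one elementary system in Spekkens' toy theory
# (arXiv:quant-ph/0401052), as I read it: 4 ontic states, epistemic states of
# maximal knowledge are 2-element subsets, and the three reproducible
# measurements are the three ways of partitioning {1,2,3,4} into two pairs.

MEASUREMENTS = {
    "Z-like": [frozenset({1, 2}), frozenset({3, 4})],
    "X-like": [frozenset({1, 3}), frozenset({2, 4})],
    "Y-like": [frozenset({1, 4}), frozenset({2, 3})],
}

def measure(ontic, name):
    """Return (outcome cell, post-measurement ontic state)."""
    outcome = next(cell for cell in MEASUREMENTS[name] if ontic in cell)
    # Measurement disturbs the system: the ontic state is re-randomised within
    # the outcome cell, which is what keeps the knowledge balance intact.
    return outcome, random.choice(sorted(outcome))

random.seed(1)
counts = {"same": 0, "flipped": 0}
for _ in range(10_000):
    ontic = random.choice([1, 2])            # prepare the analogue of |0>: epistemic state {1,2}
    _, ontic = measure(ontic, "X-like")      # intermediate "X" measurement
    z_outcome, _ = measure(ontic, "Z-like")  # then re-measure "Z"
    counts["same" if z_outcome == frozenset({1, 2}) else "flipped"] += 1

print(counts)  # roughly 50/50: the intermediate measurement erased the "Z" information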
 

1. What is quantization and why is it not considered fundamental?

Quantization is the process of discretizing a continuous variable into distinct values. It is not considered fundamental because it is a mathematical tool used to simplify complex systems and does not reflect the true nature of reality.
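As a minimal generic sketch of discretization in this sense (not tied to any particular physical model), a continuous value can be mapped onto a fixed grid of levels:

Python:
# Minimal sketch of uniform quantization: map a continuous value onto a grid
# of discrete levels (the step size here is chosen arbitrarily for illustration).

def quantize(value, step=0.25):
    return round(value / step) * step

samples = [0.07, 0.33, 0.61, 0.98]
print([quantize(s) for s in samples])  # [0.0, 0.25, 0.5, 1.0]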

2. If quantization is not fundamental, what is the underlying reality?

The underlying reality is described by quantum mechanics, which explains the behavior of particles at the smallest scales. In this framework, particles do not have a definite position or momentum, but rather exist in a state of superposition until measured.

3. How does the concept of quantization relate to the uncertainty principle?

The uncertainty principle states that it is impossible to know both the exact position and momentum of a particle at the same time. It arises because the corresponding observables do not commute: the more sharply a particle's position is defined, the wider the spread of its possible momentum values, and vice versa.

4. Can we ever truly understand the fundamental nature of reality?

It is currently unknown if we can ever fully understand the fundamental nature of reality. Some theories, such as string theory, attempt to unify quantum mechanics and general relativity to provide a more complete understanding, but it is still an ongoing area of research.

5. How does the concept of quantization impact our daily lives?

Quantization has a significant impact on our daily lives through various technologies, such as computers and smartphones, which rely on the principles of quantum mechanics. However, in our macroscopic world, the effects of quantization are often negligible and do not greatly affect our daily experiences.
