Summing not over configurations, but over theories

Thread starter: jordi · Tags: Theories
  • #1
jordi
A not very well defined question:

Path integrals (and generalizations) are sums over configurations. A logical extension of that process would be to sum not over configurations, but over theories (configurations are possible solutions of a single theory).

Renormalization already moves around the "space of theories" (by changing parameters), but AFAIK this path is not pursued much.

My (very basic) thinking runs along the lines of the central limit theorem: summing many realizations of uncertain distributions gives something very definite (close to a Gaussian distribution). So could the summation over many realizations of uncertain distributions (all possible theories) give something definite (the theory we live in, given some measurements)?
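To make the central-limit intuition concrete, here is a minimal numerical sketch (Python with NumPy; the particular mix of distributions is an arbitrary illustration): even when the summands are drawn from many different distributions, the standardized sum comes out close to a standard Gaussian.

Code:
import numpy as np

rng = np.random.default_rng(0)
n_terms, n_samples = 300, 100_000

# Sum draws from many *different* distributions (Lindeberg-type CLT):
# each term has its own shape, yet the standardized sum looks Gaussian.
total = np.zeros(n_samples)
mean_sum, var_sum = 0.0, 0.0
for k in range(n_terms):
    if k % 3 == 0:
        x = rng.uniform(-1.0, 1.0, n_samples); m, v = 0.0, 1.0 / 3.0
    elif k % 3 == 1:
        x = rng.exponential(1.0, n_samples);   m, v = 1.0, 1.0
    else:
        x = rng.binomial(1, 0.3, n_samples);   m, v = 0.3, 0.21
    total += x
    mean_sum += m
    var_sum += v

z = (total - mean_sum) / np.sqrt(var_sum)
# A standard Gaussian has mean 0, variance 1, skewness 0, excess kurtosis 0.
print("mean:", z.mean(), " var:", z.var())
print("skew:", (z**3).mean(), " excess kurtosis:", (z**4).mean() - 3.0)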
 
  • #2
jordi said:
My (very basic) thinking runs along the lines of the central limit theorem: summing many realizations of uncertain distributions gives something very definite (close to a Gaussian distribution).
It isn't clear what an "uncertain distribution" would be. Perhaps you are thinking of a random variable whose values are themselves probability distributions, i.e., a probability distribution over a space of probability distributions.

So could the summation over many realizations of uncertain distributions (all possible theories) give something definite (the theory we live in, given some measurements)?

Try giving a specific example of that concept - an example using specific probability spaces.
 
  • #3
  • #4
I was thinking of the analogous case of spin glasses or the random-field Ising model: in addition to the probability space of the spins, there is a probability space for the parameters (either the coupling constants J or the magnetic fields H).

The phase transitions of these models are richer than those of the plain Ising model (which has only the probability space of the spins).
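As a concrete toy version of these two probability spaces, here is a minimal Monte Carlo sketch (Python with NumPy; all parameter values are arbitrary illustrations) for a 1D random-field Ising chain: the inner loop samples spin configurations for a fixed realization of the random fields, and the outer loop then averages the result over realizations of the fields themselves.

Code:
import numpy as np

rng = np.random.default_rng(1)
N, J, T, sigma = 64, 1.0, 1.5, 0.5   # chain length, coupling, temperature, field width
n_disorder, n_steps = 20, 20_000

def magnetization(h):
    """Metropolis sampling of the spins for FIXED random fields h_i."""
    s = rng.choice([-1, 1], size=N)
    for _ in range(n_steps):
        i = rng.integers(N)
        # Energy cost of flipping spin i in E = -J sum s_i s_{i+1} - sum h_i s_i
        dE = 2.0 * s[i] * (J * (s[(i - 1) % N] + s[(i + 1) % N]) + h[i])
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
    return abs(s.mean())

# Outer average over the SECOND probability space: realizations of the fields.
ms = [magnetization(rng.normal(0.0, sigma, N)) for _ in range(n_disorder)]
print("disorder-averaged |m| =", np.mean(ms), "+/-", np.std(ms) / np.sqrt(n_disorder))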

In the same way that Newton's second law is recovered from quantum mechanics as the first-order behavior of expectation values under the Schrödinger equation (Ehrenfest's theorem), couldn't it be that the fundamental laws we have so far are only "first-order averages" over many (maybe all) equations?
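For reference, the statement being invoked is Ehrenfest's theorem: expectation values obey a Newton-like equation to leading order,

$$m\,\frac{d^2\langle x\rangle}{dt^2} = -\left\langle \frac{\partial V}{\partial x} \right\rangle \approx -\left.\frac{\partial V}{\partial x}\right|_{x=\langle x\rangle},$$

where the last step (evaluating the average force at the average position) is exact only for potentials at most quadratic in ##x##; the corrections involve the spread of the wave packet, which is the sense in which Newton's second law is a "first-order average".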

I know what I say is highly speculative. But it is the only answer I can find to the question of why there is a law describing the universe. This is analogous to the question of why the distribution of heights in a population is close to a normal distribution. One possibility is "magic" (GR and the Standard Model are "magic"). Another possibility is that there is some dynamical reason for it (the central limit theorem).
 
  • #5
Another (speculative) comment: if the parameters of a Lagrangian were not fixed numbers but probability distributions (and the theory summed not only over the fields, but also over the parameters according to those distributions), maybe that could be an additional method to regularize a theory.

If the probability distribution were a Dirac delta, we would recover a standard theory. But if the probability distribution were a narrow normal distribution, might this result in a regularized theory?

In fact, this leads to another question: if the probability distribution is a narrow normal, and we sum over "everything", can the resulting "theory" be understood as a standard theory (i.e. a Lagrangian with fixed parameters)?
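Here is a minimal numerical sketch of that limit (Python with NumPy; the toy Euclidean propagator ##1/(p^2+m^2)## and the Gaussian distribution for ##m^2## are hypothetical illustrations, not a worked-out regularization scheme): as the width of the parameter distribution shrinks toward a Dirac delta, the parameter-averaged propagator converges to the fixed-parameter one.

Code:
import numpy as np

rng = np.random.default_rng(2)
m0_sq, p_sq = 1.0, 4.0   # central mass-squared and a fixed external momentum-squared

def averaged_propagator(width, n_draws=200_000):
    """Monte Carlo average of 1/(p^2 + m^2) with m^2 ~ Normal(m0_sq, width^2)."""
    m_sq = rng.normal(m0_sq, width, n_draws)
    m_sq = m_sq[m_sq > 0]            # keep the toy propagator nonsingular
    return np.mean(1.0 / (p_sq + m_sq))

for w in [0.5, 0.1, 0.01]:
    print(f"width={w}: averaged G = {averaged_propagator(w):.6f}")
print("delta limit (fixed m0^2):", 1.0 / (p_sq + m0_sq))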
 
  • #6
I don't know if this is related to what you really have in mind, but your posts reminded me of Dyson's interpretation of random matrix ensembles, outlined here: https://aip.scitation.org/doi/abs/10.1063/1.1703773

Basically, he considers random matrix ensembles as ensembles of some unknown objects which are related to different Hamiltonians by certain conditions. That is, he has some kind of ensemble of different systems, each possibly described by a different Hamiltonian, and he then tries to compute things from this interpretation.
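A minimal sketch of this ensemble-of-Hamiltonians idea (Python with NumPy; the matrix sizes are arbitrary, and normalizing spacings by their local mean is a crude stand-in for proper spectral unfolding): draw many Hamiltonians from the Gaussian Orthogonal Ensemble and average a spectral observable over the ensemble, rather than computing it for any single system.

Code:
import numpy as np

rng = np.random.default_rng(3)
N, n_matrices = 200, 100

spacings = []
for _ in range(n_matrices):
    A = rng.normal(size=(N, N))
    H = (A + A.T) / 2.0                  # one member of the GOE ensemble
    ev = np.sort(np.linalg.eigvalsh(H))
    bulk = ev[N // 4 : 3 * N // 4]       # stay in the bulk of the spectrum
    s = np.diff(bulk)
    spacings.append(s / s.mean())        # crude local unfolding

s = np.concatenate(spacings)
# The Wigner surmise for GOE predicts mean spacing 1 and variance 4/pi - 1 ~ 0.27.
print("mean spacing:", s.mean(), " variance:", s.var())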
 
  • #7
The Lagrangian parameters are already not fixed in the RG context: you have your coupling constants as functions of the energy scale. The difficulty seems to arise when moving to a region where the real world would require terms that we didn't consider in the Lagrangian. So if your theory starts diverging, you add the terms that would fix that, for example. But if we are moving between theories we should consider the possibility that there are models that can't be described by Lagrangians, regardless of how many terms you have. My point is, any method for moving between theories seems highly dependent on the theories in question.
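For concreteness, this is the standard one-loop picture: with a beta function ##\beta(g) = b\,g^3## the coupling runs with the scale as

$$\mu\,\frac{dg}{d\mu} = b\,g^3 \quad\Longrightarrow\quad g^2(\mu) = \frac{g^2(\mu_0)}{1 - 2\,b\,g^2(\mu_0)\,\ln(\mu/\mu_0)},$$

and for ##b > 0## the coupling diverges at a finite scale (a Landau pole), which is exactly the kind of breakdown that signals missing terms, or a theory outside the assumed family.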

This kind of speculative discussion is fun but I honestly don't know if what I just said even makes sense.
 
  • #8
diegzumillo said:
But if we are moving between theories we should consider the possibility that there are models that can't be described by Lagrangians, regardless of how many terms you have.
This kind of speculative discussion is fun but I honestly don't know if what I just said even makes sense.

Good point - does it make sense to discuss phenomena only in terms of the motion of particles?

Discussion of everything in terms of a Lagrangian or Hamiltonian brings to mind classical mechanics, where all there is to Nature is matter and all there is to matter is position and mass. When a physical theory attributes some other properties to matter besides position and mass, how does it formulate the behavior of those attributes in terms of a Lagrangian? I mean: what is the general procedure? Incorporate those attributes as variables in a Lagrangian and write an equation involving those attributes and their time derivatives?
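For the record, the textbook procedure is essentially the one guessed at above: promote each attribute to a generalized coordinate ##q_i##, write a Lagrangian ##L(q_i, \dot{q}_i, t)##, and obtain the dynamics from the Euler-Lagrange equations,

$$\frac{d}{dt}\frac{\partial L}{\partial \dot{q}_i} - \frac{\partial L}{\partial q_i} = 0.$$

The nontrivial physics lies in choosing which attributes enter ##L## and how; whether every attribute of matter admits such a description is precisely the question being raised here.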

----

In pondering some recent threads on the many worlds interpretation of quantum mechanics, it occurs to me that all physics (of various kinds) assumes there is a self-similarity to Nature. For example, the typical "physical law" is an equation with both variables and "constants". The constants are present to adapt the law to different situations. Presuming the law is useful, there must be many different situations in Nature that are at least approximately described by the law using appropriate constants. One can call this "discovering patterns in Nature". However, taking the physical law as the authority, one could also call it "Requiring Nature to have certain self-similarities".
 
  • #9
jordi said:
But it is the only answer I can find to the question of why there is a law describing the universe. This is analogous to the question of why the distribution of heights in a population is close to a normal distribution. One possibility is "magic" (GR and the Standard Model are "magic"). Another possibility is that there is some dynamical reason for it (the central limit theorem).
So you are seeking an "explanation" of the lawfulness of the universe? I.e., out of the landscape of all possible crazy theories, what is the attractor mechanism that selects the actual theories that seem to describe our observations?

That is a good question, and one I share with you!

Your idea seems to be to average over all possible theories, to see if we can statistically find a unique average?

IMHO, one complication here is that to even describe a space of possible theories, you need ANOTHER, higher theory, and at some point this higher theory becomes an arbitrary random choice that can never be reduced to a rational choice as per reductionist thinking, because the higher theory gets more and more complex. Instead we first need to find a way to physically regulate this diverging tower of turtles, and to assign physical meaning to the regularisation. My approach is to use the observing system's capacity to store and process information as the natural regulator.

This way, for a given observer, some theories in the landscape are simply too complex to be encoded and computed, and this naturally regulates the picture. Then, in the usual block-based energy scaling, one can understand the decoupling of high-energy DETAILS at lower energies, from a causal perspective, simply as a throttling of the communication implicit in the "interaction".

So I think there is no truly observer-independent way to make the summation you envision; that is, IMO, a fallacy. Instead I think we need to see it from an evolutionary perspective, and ask ourselves how two theories interact. If we, as I strongly suggest we must, associate one theory with one observer, then two interacting observers are the same thing as two interacting theories. And you can only describe two interacting theories by a third theory.

So a KEY to making progress in this mess is to get a grip on the computational regularisation required here. Theories that are not computable by the host system are, IMO, simply useless, and thus necessarily "wrong".

One can read the implications of this into string theory as well as other theories. So I think these questions are somewhat theory-neutral and fundamental.

/Fredrik
 
  • #10
mitchell porter said:

This is one way to give the question a good meaning.

Perhaps I may use the occasion to point out that, while Witten did look into the issue of background independence of open string field theory (OSFT), the question remained open, as he highlighted a few years later in section 3.3 of

Edward Witten, Overview of K-theory applied to Strings (arXiv:hep-th/0007175)

Namely, to properly account for brane/antibrane annihilation and nucleation in the background of the OSFT, captured by the K-theory classification of D-branes, a rough plausibility argument suggests that one would need to formulate OSFT on ##N## D-branes for "##N = \infty##". This is open.

But I wonder whether one should not try to turn this around: instead of trying to make large-##N## OSFT reproduce the K-theory classification of D-branes, maybe their K-theoretic nature needs to be built in at a more fundamental level, with the description in terms of stacks of ##N## D-branes derived from that in special cases and under special conditions. That's where we are headed in Gauge enhancement of Super M-Branes (arXiv:1806.01115); see the exposition in the introduction.

In any case, this points to what seems like a blind spot in essentially all of the contemporary discussion of brane engineering of gauge theories: it is known that the popular picture of a stack of ##N## D-branes carrying an ##SU(N)## gauge field is not actually an invariant description of string-theoretic reality; such data only serves as a kind of coordinatization of K-theory cocycles.

I guess the reason for this omission is that for the traditionally trained researcher, K-theory seems like an opaque or even contrived concept compared to traditional Chan-Paton style arguments. But if mathematics is to have a say in the foundations of string theory, then it is likely the other way around.
 
  • #11
jordi said:
A not very well defined question:

Path integrals (and generalizations) are sums over configurations. A logical extension of that process would be to sum not over configurations, but over theories (configurations are possible solutions of a single theory).

Renormalization already moves around the "space of theories" (by changing parameters), but AFAIK this path is not pursued much.

My (very basic) thinking runs along the lines of the central limit theorem: summing many realizations of uncertain distributions gives something very definite (close to a Gaussian distribution). So could the summation over many realizations of uncertain distributions (all possible theories) give something definite (the theory we live in, given some measurements)?
Why would summing over theories (an infinite number of theories?) give you a theory?

I mean, some theories could contradict each other, so why would it be logical to sum over both of them?!
 
  • #12
MathematicalPhysicist said:
Why would summing over theories (an infinite number of theories?) give you a theory?

I mean, some theories could contradict each other, so why would it be logical to sum over both of them?!

If one takes the perspective implicit in my post #9 (that you need a third theory to compare other theories, and that you associate a theory with an observer), the rational answer to this question is, IMHO:

Contradictory theories imply that there is a physical interaction between the theories, and the generalisation of "adding contradictions" simply means "allowing them to interact"; the outcome of that negotiation is the "sum". The "sum" is then the "expected theory of interactions", but one that is conditional upon the third observer.

This is, IMO, analogous to how we "predict" an interaction between two subatomic particles (identified by gauge choices) via the third observer's expectation of gauge equivalence between the first two. The only problem is that the third observer isn't at the top of the tower; it is just an internal information-processing agent like the others.

What I lack is the exact mathematical and algorithmic machinery to describe this process in a way that is self-consistent. To describe this "negotiation process" is the same thing as to describe the evolution of physical law, which is the selection process for the de facto physical laws we observe today in the universe. It's just that this is of course not a direct, naive summation like the path integral! It has to be far more complex, and in particular it has to be seen as part of an evolutionary process, not reductionist thinking.

/Fredrik
 
  • #13
MathematicalPhysicist said:
Why would summing over theories (an infinite number of theories?) give you a theory?

I mean, some theories could contradict each other, so why would it be logical to sum over both of them?!

I will give a non-scientific answer (from a "sofa philosopher", as Feynman would say):

Why is there something rather than nothing? I think the answer is the opposite: "nothing" is just one possibility. Then there are many other possibilities. A more "logical" way of thinking is that ALL possibilities happen at the same time.

Then, if we believe that Lagrangians span the whole set of theories (possibly wrong), and if we believe that summing is the right way to aggregate (a big if), summing over all Lagrangians could give "something". For spin glasses or the random-field Ising model, this way of calculating works (of course, the way of thinking there is completely different).

Why should there be a single Lagrangian which is the theory of everything? I think it is more "logical" that there is no preferred Lagrangian, but that all Lagrangians are summed over, and the resulting sum is what we see.

In a sense, it is a generalization of the sum over histories in quantum mechanics: in QM, we sum over solutions of a Lagrangian; here, we would sum over Lagrangians.
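Schematically, and purely formally (##\lambda## labels Lagrangians and ##\mu(\lambda)## is a hypothetical measure on that label space, neither of which is defined here), the proposal would read

$$Z = \int d\mu(\lambda) \int \mathcal{D}\phi \; e^{\,i S_\lambda[\phi]},$$

a double sum: the inner path integral over configurations of a fixed theory ##S_\lambda##, and the outer integral over the theories themselves. Whether the label space, the measure, and the convergence of the outer integral can be made sense of is exactly the open question.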
 
  • #14
jordi said:
I will give a non-scientific answer (from a "sofa philosopher", as Feynman would say):

Why is there something rather than nothing? I think the answer is the opposite: "nothing" is just one possibility. Then there are many other possibilities. A more "logical" way of thinking is that ALL possibilities happen at the same time.

Then, if we believe that Lagrangians span the whole set of theories (possibly wrong), and if we believe that summing is the right way to aggregate (a big if), summing over all Lagrangians could give "something". For spin glasses or the random-field Ising model, this way of calculating works (of course, the way of thinking there is completely different).

Why should there be a single Lagrangian which is the theory of everything? I think it is more "logical" that there is no preferred Lagrangian, but that all Lagrangians are summed over, and the resulting sum is what we see.

In a sense, it is a generalization of the sum over histories in quantum mechanics: in QM, we sum over solutions of a Lagrangian; here, we would sum over Lagrangians.
Who guarantees that this sum over all Lagrangians even converges? There are many possible, conceivable Lagrangians.
 
  • #15
MathematicalPhysicist said:
Who guarantees that this sum over all Lagrangians even converges? There are many possible, conceivable Lagrangians.

Nobody. I agree with you. For this reason, I said earlier that my statements were not scientific. For sure, I should work out a specific proposal. Let me put it differently, though: when Feynman first explained his sum over paths to others, I am sure they asked questions similar to yours. Feynman's genius was to work out the details, and to give a specific (and correct) answer.

Maybe the answer must come from mathematics. Are there mathematicians working with concepts like "the space of all Lagrangians", or something similar?
 
  • #16
@jordi your concept needs to be part of the professional scientific literature before it can be discussed here. The thread is closed for now.
 

What does "summing not over configurations, but over theories" mean?

Summing not over configurations, but over theories is a concept in theoretical physics where instead of summing over all possible configurations (or ways particles can arrange themselves), we sum over all possible theories. This means considering different mathematical models or frameworks to describe a physical system and seeing which one best fits the data.

Why do scientists sum over theories instead of configurations?

Summing over theories can provide a more comprehensive understanding of a physical system. It allows for the consideration of different theoretical frameworks and can potentially lead to new insights or solutions to problems that cannot be solved by summing over configurations alone.

How do scientists choose which theories to include in the sum?

There is no set method for choosing which theories to include in the sum. It often involves a combination of theoretical considerations, experimental data, and computational constraints. Scientists may also appeal to principles such as Occam's razor, choosing the simplest theory that adequately explains the data.

What are the advantages of summing over theories?

Summing over theories allows scientists to explore a wider range of possibilities and potentially find more accurate or complete descriptions of physical systems. It also allows for the incorporation of new theoretical frameworks or ideas, which can lead to breakthroughs in understanding and predicting the behavior of physical systems.

Are there any limitations to summing over theories?

Summing over theories can be a computationally intensive process and may not always lead to a unique or definitive solution. It also relies on the assumption that the correct theory is included in the sum, which may not always be the case. Additionally, the inclusion or exclusion of certain theories in the sum may be subjective and could potentially lead to bias or incorrect conclusions.
