Inelegant, Unnatural, Ugly BSM theme books

  • Thread starter: star apple
  • Tags: Books, BSM
  • #31
ohwilleke said:
The laws of Nature should fit together exactly perfectly and lo and behold, they do. If one wants to have an a priori assumption, the suggestion, I think it was in one of Lubos's blog posts, that the universe likes to be as extreme and "unnatural" as possible without breaking is probably a better hypothesis than "naturalness".
Lubos uses a very vivid metaphor, arguing that Nature is like James Bond and unlike the European Union. :biggrin:
 
  • #32
ohwilleke said:
If one wants to have an a priori assumption, the suggestion, I think it was in one of Lubos's blog posts, that the universe likes to be as extreme and "unnatural" as possible without breaking is probably a better hypothesis than "naturalness". (To be clear, he himself is a strong proponent of the idea that "Naturalness" is a valid and useful idea.)

But that is only true of the true ultimate theory. If one additionally considers that our present theories are not the true ultimate theory, then naturalness is a very natural idea. If one thinks our current theories are already close to the final theory, then naturalness is a less important consideration. So no, I don't think Sabine Hossenfelder is making an important point that many are ignorant of.
 
  • Like
Likes MrRobotoToo
  • #33
atyy said:
But that is only true of the true ultimate theory. If one additionally considers that our present theories are not the true ultimate theory, then naturalness is a very natural idea. If one thinks our current theories are already close to the final theory, then naturalness is a less important consideration. So no, I don't think Sabine Hossenfelder is making an important point that many are ignorant of.

Why is naturalness less important in the final theory? Any reference?

By the way, is a scale-symmetry approach, like Agravity or the Higgs dark-sector portal version, considered naturalness? Or is naturalness only valid if there is a pre-existing scale?
 
  • #34
Demystifier said:
Lubos uses a very vivid metaphor, arguing that Nature is like James Bond and unlike European Union. :biggrin:

I can't help commenting on Lubos's critiques in https://motls.blogspot.com/2017/04/like-james-bond-nature-loves-to-walk.html, especially now that it's Halloween eve. So let's sit back, have fun, and not be too serious (at least for this evening).

Lubos stated:

"At the end, it's very natural for Nature to be courageous in this sense – to exploit all the possibilities that are still compatible with the survival. When something is possible and/or compatible with a logically consistent theory of Nature, it will almost certainly be exploited by Nature. Cowardliness is anthropomorphic and it's just silly to assume that Nature is afraid of the same things as beginners who start to learn modern physics."

Let me emphasize Lubos's statement: "When something is possible and/or compatible with a logically consistent theory of Nature, it will almost certainly be exploited by Nature."

Couldn't ghosts exist? Some of us spend our entire lives dealing with ghosts, and we pretend they are separate from physics. But what if the LHC simply can no longer find anything new? In the 1960s this would have been a very silly question, because quarks and the electroweak theory had nothing to do with ghosts, so we accepted that they were separate. But now, what if we need to integrate them for the final theory?

Anyway, talk is cheap. Do they hold yearly Halloween costume parties at the LHC too? All right: if we don't get new physics in the next five years, then I really suggest we make the LHC haunted. To conjure other forces we wouldn't use electrons or accelerators but spells (or sentient programming to initiate the Hamiltonian and bring down other dynamics), because science has forcibly excluded this from any studies. The public uses very medieval terms for it; in this CNN article they used the medieval language of exorcism, but the point and the effect are the same. Summon the beyond-Standard-Model extra Hamiltonian forces, especially the poltergeist, and make them focus on the Large Hadron Collider so scientists have something new to work with (it would be fun to watch the dials in the control room moving on their own and the ALICE detector recording a massive anomaly):

http://edition.cnn.com/2017/08/04/health/exorcism-doctor/index.html

http://edition.cnn.com/2011/09/23/living/crisis-apparitions/index.html

All right, I promise I won't talk about this after Halloween tonight (and if I talk about it again, then I'm welcome to be banned, but not now; I'm just using Lubos's statement against himself, and Sabine's statement against herself). These two love to critique; it's time they also got critiqued by those of us who can see their weaknesses so plainly (they both swear there are no ghosts or anything like them, but they are dead wrong). Again, don't forget Lubos's golden statement:

"When something is possible and/or compatible with a logically consistent theory of Nature, it will almost certainly be exploited by Nature"
If Nature can exploit fitting the entire 200 billion galaxies into a marble-sized ball, why not ghosts?

Happy Halloween!

(Tomorrow let's not talk about this anymore, lest Greg get angry. Thank you.)

If the mods don't agree with Lubos's statement, just delete this thread (even though it's just a Halloween mood-lightening message), but please let atyy and ohwilleke first answer my questions to them. Thanks!
 
  • #35
atyy said:
I think what Pelaggi and colleagues mean by lower energy is an energy below the Landau pole, which is ##10^{40}## GeV, so they don't mean an energy scale near the LHC's energy scale of about ##10^{4}## GeV. So this new physics at "lower energy" includes what you mean by all new physics being at very high energies.

I think what is interesting about both papers linked to in post #22 is that they consider that there may be no new physics, even at very high energies, i.e. the standard model is asymptotically safe. Asymptotic safety of some form is not a new idea, and researchers such as Weinberg have studied both supersymmetry and asymptotic safety. Polchinski's famous string theory textbook also mentions asymptotic safety as an alternative approach. However, Weinberg and Polchinski were referring to asymptotic safety of gravity, rather than of the standard model, so "asymptotic safety" refers to a group of ideas rather than a single idea.

You mean our search for new physics is because the standard model is not asymptotically safe? And if it is safe, there's no need for new physics even at the general-relativistic level? No need for superstrings, loop quantum gravity, etc.? Hmm...
 
  • #36
star apple said:
You mean our search for new physics is because the standard model is not asymptotically safe? And if it is safe, there's no need for new physics even at the general-relativistic level? No need for superstrings, loop quantum gravity, etc.? Hmm...

I'm wondering what controls whether a field theory is asymptotically safe or not at a certain energy. Or, more technically, as Sabine put it:
http://backreaction.blogspot.com/2014/03/what-is-asymptotically-safe-gravity-and.html

"But how theories in general depend on the energy scale has only been really understood within the last two decades or so. It has been a silent development that almost entirely passed by the popular science press and goes under the name renormalization group flow. The renormalization group flow encodes how a theory depends on the energy scale, and it is at the basis of the idea of effective field theory."

So what controls the renormalization group flow?
 
  • #37
star apple said:
Why is naturalness less important in the final theory, any reference?

The arguments for naturalness are most natural in the context of considering our current theories as effective theories, ie. low energy theories that are useful at low energies, but incomplete at high energies. The renormalization group is the tool which allows us to understand why we can have useful theories at low energies, even though we are ignorant of the true high energy theory.
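The renormalization-group language in the post above can be made concrete with a small numerical toy. The sketch below (my own illustration, not from the thread; it uses the one-loop QED beta function with a single charged lepton and ignores particle thresholds) shows a coupling measured at low energy running slowly with scale, which is one way of seeing why a low-energy effective theory stays predictive while remaining ignorant of the UV:

```python
import math

def run_alpha(alpha0, mu0, mu, steps=10000):
    """Numerically integrate the one-loop QED beta function
    d(alpha)/d(ln mu) = 2*alpha^2 / (3*pi)  (one charged lepton),
    from scale mu0 up to mu (both in the same units)."""
    t0, t1 = math.log(mu0), math.log(mu)
    dt = (t1 - t0) / steps
    alpha = alpha0
    for _ in range(steps):
        alpha += dt * 2.0 * alpha ** 2 / (3.0 * math.pi)
    return alpha

# Start from the low-energy value alpha ~ 1/137 near the electron mass
# scale (0.000511 TeV) and run up to 100 TeV: the coupling grows only
# logarithmically, so the low-energy theory is barely disturbed.
alpha_low = 1 / 137.035999
alpha_high = run_alpha(alpha_low, 0.000511, 100.0)

# Closed-form one-loop solution, included only as a cross-check:
# alpha(mu) = alpha0 / (1 - (2*alpha0/(3*pi)) * ln(mu/mu0))
closed = alpha_low / (1 - 2 * alpha_low / (3 * math.pi) * math.log(100.0 / 0.000511))
print(alpha_high, closed)
```

Real running includes every charged particle in the spectrum and higher loops; this is only the qualitative shape of the Wilsonian picture being described.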
 
  • Like
Likes star apple
  • #38
star apple said:
You mean our search for new physics is because the standard model is not asymptotically safe? And if it is safe, there's no need for new physics even at the general-relativistic level? No need for superstrings, loop quantum gravity, etc.? Hmm...

If gravity and matter are asymptotically safe, then that means that quantum general relativity is valid to infinitely high energies, and there is no need for superstrings. The relationship between loop quantum gravity and asymptotic safety is unknown - heuristic arguments suggest that if loop quantum gravity does work, then it will be a form of asymptotic safety - however, this is only a very rough argument.
 
  • Like
Likes star apple
  • #39
star apple said:
I'm wondering what controls whether a field theory is asymptotically safe or not at a certain energy. Or, more technically, as Sabine put it:
http://backreaction.blogspot.com/2014/03/what-is-asymptotically-safe-gravity-and.html

"But how theories in general depend on the energy scale has only been really understood within the last two decades or so. It has been a silent development that almost entirely passed by the popular science press and goes under the name renormalization group flow. The renormalization group flow encodes how a theory depends on the energy scale, and it is at the basis of the idea of effective field theory."

So what controls the renormalization group flow?

If a quantum field theory is asymptotically safe, that means that it is valid up to infinitely high energy.
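To unpack that one-liner: "asymptotically safe" means the renormalization-group flow runs into a nontrivial ultraviolet fixed point, so the dimensionless coupling stops growing instead of diverging. Here is a toy sketch of mine (the beta function is purely illustrative, not the actual gravitational one; the linear term mimics the canonical scaling of Newton's coupling in four dimensions):

```python
def flow(g0, t_max, c=1.0, dt=1e-3):
    """Integrate a toy beta function dg/dt = 2g - c*g^2, with t = ln(mu).
    The quadratic term generates a nontrivial UV fixed point at g* = 2/c:
    flowing toward the UV, the coupling saturates instead of blowing up,
    which is the qualitative picture behind asymptotic safety."""
    g, t = g0, 0.0
    while t < t_max:
        g += dt * (2.0 * g - c * g * g)
        t += dt
    return g

# The coupling approaches the fixed point g* = 2 whether it starts
# below or above it, so the theory stays finite at arbitrarily high t.
print(flow(0.1, 20.0))  # approaches 2 from below
print(flow(5.0, 20.0))  # approaches 2 from above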
 
  • Like
Likes star apple
  • #40
atyy said:
If a quantum field theory is asymptotically safe, that means that it is valid up to infinitely high energy.

But what makes the QFT asymptotically safe in the first place at infinitely high energy? Doesn't it require new physics for that to happen? When we say we don't need new physics if the standard model is asymptotically safe, isn't whatever makes it asymptotically safe in the first place itself some new physics?
 
  • #41
I'm still wondering about my earlier question, which I'll repeat here since it seems relevant for this topic:

To what extent is fine-tuning (and hence naturalness) an artefact of doing perturbation theory? Are there exactly soluble QFTs which suffer from naturalness/fine-tuning problems?

I mean, how would fine-tuning of the Higgs mass show up in a non-perturbative formulation of the SM?

I thought the question is appropriate here, so I don't start a new topic. Without wanting to hijack this topic of course ;)
 
  • #42
haushofer said:
I'm still wondering about my earlier question, which I'll repeat here since it seems relevant for this topic:

To what extent is fine-tuning (and hence naturalness) an artefact of doing perturbation theory? Are there exactly soluble QFTs which suffer from naturalness/fine-tuning problems?

I mean, how would fine-tuning of the Higgs mass show up in a non-perturbative formulation of the SM?

I thought the question is appropriate here, so I don't start a new topic. Without wanting to hijack this topic of course ;)

There is an interesting discussion in https://www.google.com.sg/url?sa=t&...0D0sQFgg6MAY&usg=AOvVaw2-LVf2T6qnYUCeZ5kTGnKa.
 
Last edited:
  • Like
Likes haushofer
  • #43
atyy said:
Yes. Nonperturbatively, naturalness relates to the sensitivity of the theory to small changes in a dimensionless parameter: https://www.google.com.sg/url?sa=t&...0D0sQFgg6MAY&usg=AOvVaw2-LVf2T6qnYUCeZ5kTGnKa

Great article. The first thing that came to my mind was: how come physicists didn't focus more on a nonperturbative scheme instead of proposing supersymmetry to handle the quadratic divergences? Supersymmetric particles wouldn't exist in a nonperturbative scheme, just as virtual particles are a side effect of perturbation theory that isn't there in lattice QFT. Unless they think the perturbation method could be intrinsically chosen by nature?
I think it is analogous to the criteria for well-posedness: http://reference.wolfram.com/language/tutorial/DSolveWellPosedness.html
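The nonperturbative reading of naturalness quoted above, sensitivity of the theory to small changes in a parameter, can itself be illustrated numerically. This is a toy model of my own (a Barbieri–Giudice-style logarithmic-derivative measure; the cutoff is kept at a modest 10 TeV so the near-cancellation stays well within double precision, but the measure scales like ##\Lambda^2/m_h^2##):

```python
def higgs_mass_sq(m_bare_sq, c, lam_sq):
    """Toy model of the hierarchy problem: the physical mass-squared is a
    near-cancellation between a bare term and a cutoff-sized correction."""
    return m_bare_sq - c * lam_sq

def sensitivity(p, f, eps=1e-8):
    """Barbieri-Giudice-style fine-tuning measure |d ln f / d ln p|:
    fractional change of the output per fractional change of the input,
    estimated by a finite difference."""
    f0 = f(p)
    f1 = f(p * (1 + eps))
    return abs((f1 - f0) / f0) / eps

# Illustrative numbers: cutoff Lambda = 10 TeV, physical Higgs mass
# 125 GeV.  The bare parameter must be dialed to reproduce the small
# physical value, and the measure comes out ~ Lambda^2 / m_phys^2.
lam_sq = (1.0e4) ** 2           # (10 TeV)^2 in GeV^2
m_phys_sq = 125.0 ** 2          # (125 GeV)^2
m_bare_sq = m_phys_sq + lam_sq  # the fine-tuned bare value (c = 1)

delta = sensitivity(m_bare_sq, lambda p: higgs_mass_sq(p, 1.0, lam_sq))
print(f"fine-tuning measure: {delta:.1f}")
```

For a Planck-scale or GUT-scale cutoff the same ratio is of order ##10^{30}## or more, which is the usual statement of the hierarchy problem in this language; no perturbative expansion was invoked anywhere.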
 
  • #45
haushofer said:
Thanks, I'll check it out!

Please share how you understand the paper. There is a passage on page 3 that puzzled me: "In brief, the quadratic divergences are completely irrelevant for the naturalness and fine-tuning problems involving the physical parameters."

How do you interpret that statement? Does it mean the nonperturbative approach does or does not remove the Higgs hierarchy problem? And when it mentioned the "gauge hierarchy problem", did it mean the Higgs?

Also, the paper was written in 1983, a time when we still didn't have cellphones, so it's ancient. Now, after more than 30 years, is there any update to it, or new jargon in use? For example, "relativistic mass" is no longer used today; has something similar happened to the terms used in the paper? atyy? Anyone?

Thank you.
 
  • #47
atyy said:
I don't understand the paper well, but Wetterich has written more recent papers that do mention naturalness etc., so those could help us understand whether his thinking has changed or not.
https://arxiv.org/abs/0912.0208
https://arxiv.org/abs/1612.03069

Haushofer's mention of the nonperturbative approach yesterday reminded me of the electron's gyromagnetic ratio, for which perturbation theory can produce a value accurate to better than one part in 10^10, or about three parts in 100 billion. I was supposed to mention this yesterday, so let me ask about it now. After reading the archives about the nonperturbative approach, I found this message of yours, written on April 4, 2011, in message 78 of https://www.physicsforums.com/threads/non-perturbative-qft-without-virtual-particles.485597/page-4

rogerl asked: "In Hierarchy Problem, the Higgs can have Planck mass because of quantum contributions. So what they do is propose that the virtual particles of Supersymmetric particles can cancel the very large quantum contributions in the Hierarchy Problem. Why do they have to take drastic measure and radical idea just to get rid of the large contribution if virtual particles are just multivariate integrals. Why didn't they just go to lattice methods to solve it?

atyy replied: "That's an interesting question. I don't know. My understanding is that the underlying theory is given by special relativity, quantum mechanics, Wilsonian renormalization, and the standard model Lagrangian. I would guess that the fine tuning problem is a heuristic argument based on Wilsonian renormalization, so it should have a counterpart in a lattice language.

Also, is there such a thing as non-perturbative QED? Unless a QFT is asymptotically free or safe, isn't it by definition only perturbatively defined? According to http://www.sciencedirect.com/scienc...02d57ae15e181b9774e884147a99780a&searchtype=a , QED is likely not asymptotically safe. The only question then is how we choose to name the terms in a particular perturbation expansion."

atyy, it's been six long years since you wrote the above. Please update us on your understanding now. Do you think the Higgs hierarchy problem has a counterpart in lattice language? And after so many results from the LHC and half a dozen years of pondering it, is there such a thing as a non-perturbative QED? What do you think? What's new in your thinking now compared to 2011?
 
  • #48
According to an expert/professor (Demystifier), fine-tuning and naturalness are not artifacts of doing perturbation theory. For instance, if you study the SM on the lattice, you have to choose some UV cutoff on the lattice. The physical quantities may strongly depend on that choice, which can lead to a fine-tuning problem.

So with the nonperturbative approach crossed out as a solution to the Higgs hierarchy problem, we are back to:

1. Supersymmetry (example of Naturalness)
2. Extra Dimensions (Randall RS1, RS2)
3. Natural Finetuning (Lubos')
4. Multiverse Anthropic principle
5. Scale Symmetry (is this an example of Naturalness?)

Let me ask you: when a grenade explodes on the ground, does anyone ever ask whether it's caused by naturalness (simply by formula) or by the multiverse? It may sound silly, does it not? So if we eliminate those three, we are left with:

1. Extra Dimensions (Randall RS1, RS2)
2. Scale Symmetry (is this an example of Naturalness?)

If we don't have extra dimensions, we are left with scale symmetry.

But is an exploding grenade caused by scale symmetry, where the distances to the ground and the size of the grenade were created on impact?

What seems to be missing from the choices is the anthropic principle without a multiverse, in other words intelligent design. But let's not use those words, since they carry too much baggage; let's use the word "programming" instead. That's right: the Standard Model parameters could be programmed that way instead of coming from naturalness or extra dimensions or a multiverse, could they not?

What could still solve the Higgs hierarchy problem is the Higgs being composite. Is this still possible?

Again, can someone please say whether scale symmetry is an example of naturalness? I can't decide. Thanks.
 
  • #49
@star apple, regarding your original question: you can find a list of books in the same spirit as Woit's and Smolin's here, and essays written in a similar spirit here.
 
  • Like
Likes star apple
  • #50
ohwilleke said:
the suggestion, I think it was in one of Lubos's blog posts, that the universe likes to be as extreme and "unnatural" as possible without breaking is probably a better hypothesis than "naturalness".

A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.
 
  • Like
Likes ohwilleke
  • #51
Urs Schreiber said:
A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.

It is always good to learn new terms.
 
  • Like
Likes Urs Schreiber
  • #52
Urs Schreiber said:
A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.

Lubos is as knowledgeable as Witten and one of the most powerful defenders of superstring theory and supersymmetry. But I thought superstrings and supersymmetry were about naturalness, where one looks for the particular Calabi–Yau manifold configuration that explains the constants of nature (is this not the goal of superstring theory?). So in his article https://motls.blogspot.com/2017/04/like-james-bond-nature-loves-to-walk.html, why was Lubos supporting unnaturalness? Was he saying that we were just lucky that in one shot (without any multiverse scenario) all the constants of nature, in the form of Calabi–Yau manifold configurations, lined up to produce our universe, as if we had won a one-in-a-billion-raised-to-the-tens lotto?
 
  • #53
ohwilleke said:
So what. Standard Model parameters aren't random variables and the claim that we have any plausible basis upon which to expect that they have any particular value a priori is nothing but disciplinary folk myth. The laws of Nature should fit together exactly perfectly and lo and behold, they do.
Which laws fit perfectly together? ;-) This is the problem with the thinking reflected in your comment.

I think sometimes there is confusion between understanding the process of learning and understanding knowledge. For some of us who think otherwise this is not a myth; it's just the modest requirement of putting things into an evolutionary perspective.

The task at hand is to find these laws, and to choose which guiding principles we use.

Things seem unnatural and unexplainable only because we do not yet see the evolutionary development. For example, human existence may seem unnatural to some, but understood from an evolutionary perspective it is rather natural. Evolution is as natural as anything gets.

Sabine put it clearly on her blog against naturalness though:

"But note that changing the parameters of a theory is not a physical process. The parameters are whatever they are."
-- http://backreaction.blogspot.se/2017/11/naturalness-is-dead-long-live.html

This was, I think, a clear statement, which is why I like it; however, I disagree with it.

If we look at "theory" as human science knows it, changing a theory unquestionably IS a physical process in theory space. We can call it learning, inference, or abduction to the best explanation, etc.

Then step 2 is to ask: how does an atomic nucleus "know" which theory to obey? You might think that it must have obeyed the same laws even 100 years ago, when human science had not yet understood them. Yes, of course, this is true. But things are more subtle. If we think the laws of physics are universal, they apply also to complex systems, and to the BEHAVIOUR of complex systems. And if you also think about how microcausality can be implemented with any reasonable soundness, then it seems absurd to think that atomic structures "OBEY" rules written in the sky; that, if anything, is an irrational idea. Instead, it seems the only way to have some causality is for these rules to be literally encoded in the microstructure. This all leads to the idea of evolution of law, if you add a principle of equivalence stating that the "laws of physics" (or, more correctly, the rules for self-organisation) must be the same on all complexity scales. The problem, though, is to understand what the real core of physical law IS. Maybe it is NOT a fixed mathematical structure? Maybe the core of the law is relations between structures? And it may also be a fallacy to think of these as existing in a gigantic space of possible structures.

It's not fair to say this is a myth; it is rather a fairly new and unexplored idea.

Urs Schreiber said:
A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.
I do not see any conceptually sound reason behind those ideas. To me it sounds like some version of the old "mathematical beauty" argument or similar.

Obviously, if string theory could simply PICK the RIGHT solution out of the landscape, one that describes our universe and unifies all forces, then the critique of the landscape would fade. But right now the insight seems to be that the existence of this problem tells us something about our strategy for navigating theory space. In short, we seem to be lost according to the map, but not in reality, so the way we charted the map seems wrong.

/Fredrik
 
  • #54
Fra said:
Obviously, if string theory could simply PICK the RIGHT solution out of the landscape, one that describes our universe and unifies all forces, then the critique of the landscape would fade. But right now the insight seems to be that the existence of this problem tells us something about our strategy for navigating theory space. In short, we seem to be lost according to the map, but not in reality, so the way we charted the map seems wrong.

/Fredrik

The Microsoft Windows and macOS operating systems don't uniquely pick out a certain company's data, because Windows and macOS are operating systems and are programmable. If superstring theory is likewise an operating system, programmable, doesn't that make superstring theory a success even now? It's only counted a failure because string theory couldn't simply PICK the RIGHT solution out of the landscape, to use your words. But do Windows and macOS pick out a certain company's data (like the company profile and data of Mercedes-Benz)? They don't, so could superstring theory be similar?
 
  • #55
star apple said:
I thought superstrings and supersymmetry were about naturalness where they were looking for certain Calabi–Yau manifold configuration to explain the constants of nature

There is no known mechanism in string theory that would dynamically prefer Calabi-Yau compactifications over other compactifications. The interest in CY-compactifications was entirely driven by the prejudice that nature should feature one unbroken supersymmetry at low (here: weak breaking scale) energy. For more see the string theory FAQ: "Does string theory predict supersymmetry?"
 
Last edited:
  • Like
Likes Demystifier
  • #56
Fra said:
I do not see any conceptually sound reason behind those ideas.

The entry starts out with the words "The philosophical sentiment..."
 
  • #57
star apple said:
The Microsoft Windows and macOS operating systems don't uniquely pick out a certain company's data, because Windows and macOS are operating systems and are programmable. If superstring theory is likewise an operating system, programmable, doesn't that make superstring theory a success even now? It's only counted a failure because string theory couldn't simply PICK the RIGHT solution out of the landscape, to use your words. But do Windows and macOS pick out a certain company's data (like the company profile and data of Mercedes-Benz)? They don't, so could superstring theory be similar?
Assuming I get your analogy, that is probably what some string theorists hope, but the problem I see is this...

There is nothing wrong with a "hypothesis space" per se, because that is how action and inference under uncertainty work.

The pathology is that a rational inference system would be unable to generate more hypotheses than we can manage to test or even handle. In an intrinsic inference, bounded resources for computing and encoding will always keep the map of hypothesis space manageable. Anything else should intuitively be an evolutionary disadvantage. This always ensures naturality.

In my eyes this merely shows that string theory, with all its nice features, unfortunately is not the right system. Finding rational measures on the landscape that, after scaling, naturally explain the standard model probably requires some extra constructing principles.

Maybe these will be found and added to string theory to tame the landscape, though. Then, in retrospect, we will understand the fine-tuning issue and the landscape in a new light.

/Fredrik
 
  • #58
Fra said:
Maybe these are found and added to string theory to tame the landscape though.

Are you aware that the space of solutions to all other theories of nature that we know is much larger than the landscape of type II flux compactifications? The solution spaces of standard theories, such as general relativity or Yang-Mills theory, are continuous spaces of infinite dimension, hence of cardinality at least that of the real numbers. So a claim that the space of IIB flux compactifications is a finite number such as ##10^{500}## implies that it is tiny compared to what is to be expected from a space of solutions to a theory of nature. Even if finite numbers of this form "feel large", they are negligible compared to the cardinality of the continuum.

It is worthwhile to soberly think about what it would really mean if there were a unique string vacuum, or maybe two or three. It would be the Hegelian dream of all of nature derived uniquely from pure thought become real. While it is (or was) fun to hope that this is the case with string theory, it makes no sense to speak of a "problem" if it turns out not to be the case. That would be like speaking of a problem if you don't win the billion dollar lottery. It would have been great if you did, but now that you didn't this just means the situation is normal as it was before you bought the lottery ticket.
 
  • Like
Likes atyy
  • #59
Urs Schreiber said:
While it is (or was) fun to hope that this is the case with string theory, it makes no sense to speak of a "problem" if it turns out not to be the case.
I think that for people like some of us on here, this is exactly the kind of "problem" that motivates us. So for me it IS a problem, although we can agree to put the "problem" in an appropriately geeky context that only a fraction of us care about. We surely have bigger, but less intriguing, problems on Earth.
Urs Schreiber said:
That would be like speaking of a problem if you don't win the billion dollar lottery. It would have been great if you did, but now that you didn't this just means the situation is normal as it was before you bought the lottery ticket.
I agree with your odds comparison; it isn't the first time I've heard that exact analogy. But there is only one way, and that is forward.
Urs Schreiber said:
Are you aware that the space of solutions to all other theories of nature that we know is much larger than the landscape of type II flux compactifications? The solution spaces of standard theories, such as general relativity or Yang-Mills theory, are continuous spaces of infinite dimension, hence of cardinality at least that of the real numbers. So a claim that the space of IIB flux compactifications is a finite number such as ##10^{500}## implies that it is tiny compared to what is to be expected from a space of solutions to a theory of nature. Even if finite numbers of this form "feel large", they are negligible compared to the cardinality of the continuum.
I am glad you bring up cardinality and measures; you are indeed touching on the core of the problems here. In fact I have been thinking a lot about this, and the problem of how to compare "evidence" in an inferential abstraction is one of the things that has led me to my current stance on all this.

Many problems are rooted in the fact that comparing infinities in a formal expression is ambiguous. Infinities are really defined by means of limits, and in continuum mathematics I feel one has very often lost track of the original limiting procedure, its order, and its "rate". You can of course fix this, but there are many degrees of freedom in these models that are nonphysical, to the point where we confuse ourselves about what we are doing. You have similar problems in the foundations of probability theory and inference: when you build inferences, you have to be quite careful about counting, because if you want to compare two sets of evidence and both sets are infinite, something is wrong. You then have to find integration measures on the spaces that are tuned to comply with the underlying limiting procedures. One of the problems with infinities, in my opinion, is that we have lost physical track of the real degrees of freedom and are LOST among the huge mathematical degrees of freedom. Especially if you start from a classical system (a Lagrangian), you carry this baggage of uncountable sets, and in a disastrous mess at that! My goal is to make a reconstruction starting not from classical mechanics but from an abstraction of inference. Continuum models will obviously still be preferred in the large-complexity limit, but it is just a gigantic mess to start with uncountable sets from square one.

/Fredrik
 
  • #60
Urs Schreiber said:
Are you aware that the space of solutions to all other theories of nature that we know is much larger than the landscape of type II flux compactifications? The solution spaces of standard theories, such as general relativity or Yang-Mills theory, are continuous spaces of infinite dimension, hence of cardinality at least that of the real numbers. So a claim that the space of IIB flux compactifications is a finite number such as ##10^{500}## implies that it is tiny compared to what is to be expected from a space of solutions to a theory of nature. Even if finite numbers of this form "feel large", they are negligible compared to the cardinality of the continuum.

It is worthwhile to soberly think about what it would really mean if there were a unique string vacuum, or maybe two or three. It would be the Hegelian dream of all of nature derived uniquely from pure thought become real. While it is (or was) fun to hope that this is the case with string theory, it makes no sense to speak of a "problem" if it turns out not to be the case. That would be like speaking of a problem if you don't win the billion dollar lottery. It would have been great if you did, but now that you didn't this just means the situation is normal as it was before you bought the lottery ticket.

How is this supposed to work? Is string theory supposed to contain the other theories, like GR?
 
