
I What is the best theory of why our vacuum may be on the edge of metastability?

  1. Apr 4, 2018 #21

    There seem to be many meanings of "naturalness", so it's good to sort out the subtleties of each. In the paper with the vacuum metastability illustration in the initial message, the conclusion opens with the sentence "One of the most important questions addressed by the LHC is naturalness.". I found three good references about naturalness after you emphasized that the opposite of naturalness is having a theory (I had thought naturalness meant exclusively having equations that give the values or relationships; I mention these references so I can refer back to them if I forget):

    http://backreaction.blogspot.com/2018/02/what-does-it-mean-for-string-theory.html

    "That the mass be natural means, roughly speaking, that getting masses from a calculation should not require the input of finely tuned numbers".

    https://profmattstrassler.com/artic...ics-basics/the-hierarchy-problem/naturalness/ (about unknown physics and its effect on the Higgs field, which I asked about)

    and https://arxiv.org/pdf/1501.01035.pdf

    "In implementing ’t Hooft’s notion of naturalness, we have so far considered symmetries of a sort familiar from quantum mechanics, generated by a charge operator which is a scalar under rotations. But there is another type of symmetry, allowed by general principles of quantum mechanics and relativity, where the symmetry generators are spinors. This symmetry is known as supersymmetry. We will consider it, first, as a global symmetry, but the symmetry can be elevated to a local, gauge symmetry."
    [...]
    "it is still possible that nature is “natural”, in the sense of ’t Hooft. Future runs of the LHC might provide evidence for supersymmetry, warped extra dimensions, or some variant of technicolor. But the current experimental situation raises the unsettling possibility that naturalness may not be a good guiding principle. Indeed, naturalness is in tension with another principle: simplicity. Simplicity has a technical meaning: the simplest theory is the one with the smallest number of degrees of freedom consistent with known facts. Contrast, for example, the minimal Standard Model, with its single Higgs doublet, with supersymmetric theories, with their many additional fields and couplings. So far, the experimental evidence suggests that simplicity is winning. The observed Higgs mass is in tension with expectations from supersymmetric theories, but also technicolor and other proposals."

    ----


    If the masses of the superpartners are very high, what mechanism in superstring theory besides Kane's (his 0.5 TeV bino has already been excluded) can solve the Hierarchy Problem without 't Hooft's notion of naturalness or natural supersymmetry (in the sense of the paragraph above, at low masses)?
    For now I'm interested in a theory that can solve this, rather than the multiverse or genuine fine-tuning between the quadratic radiative corrections and the bare mass, put there on purpose by design. (If those were the mechanisms chosen by nature, then we would need new fields that can maintain the constants' values without any formulas.)


    And is the solution to the Hierarchy Problem independent of the vacuum metastability issue (to what extent can the solution of each be a solution of the other)? Thank you.

     
  2. Apr 4, 2018 #22

    Urs Schreiber

    Science Advisor
    Gold Member


    No, naturalness is a principle imposed in the absence of any known equation for the actual numbers. In the absence of an explanation, cancellations of large numbers against each other seem a weird coincidence, and naturalness means that weird coincidences should not happen. It's a qualitative principle akin to the principles of the times of "natural philosophy", before physics became a maths-based science.

    What you call "Kane's theory" is called the ##G_2##-MSSM, which became a serious contender for realistic model building with the result of Acharya-Witten 01.
    This model does not predict explicit numbers without feeding in some concrete assumptions on the precise nature of the compactification (if it did, we would be living in a Douglas Adams novel), but -- and that's really the point that Kane et al. have been driving home over the years -- it does make generic predictions that hold irrespective of the detailed numbers that describe the choice of compactification.

    One of these generic predictions is that there is an exponential hierarchy between the Planck scale and the gravitino mass scale, hence the susy breaking scale, hence the electroweak scale. This argument is due to Acharya-Kane-Kumar 12, section V.A.2 (pages 10-11), and it is not hard to follow:

    First, the higher gauge symmetry of the supergravity C-field implies "shift symmetry" of its KK-modes ##\Phi_j##. But the perturbative superpotential must be a holomorphic function of the ##\Phi_j##, and under shift symmetry this is only possible if perturbatively it vanishes identically. As a result, one deduces that in these models the superpotential consists entirely of non-perturbative contributions, such as membrane instantons, which are known to break the shift symmetry. But these non-perturbative contributions are negative exponentials in the instanton action, and hence imply that the gravitino mass (hence the susy breaking scale) is exponentially smaller than the Planck mass. This is the required exponential hierarchy.
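    Schematically (my own rough sketch of the logic, not the precise formulas of the paper): the shift symmetry acts as ##\Phi_j \to \Phi_j + i c_j##, and a holomorphic superpotential invariant under all such shifts cannot depend on any of the ##\Phi_j##, so perturbatively ##W = 0##. Membrane instantons break the shift symmetry and contribute terms of the form
    $$W \;\sim\; \sum_k A_k\, e^{-b_k \Phi_k}\,,$$
    so that the gravitino mass comes out as
    $$m_{3/2} \;=\; e^{K/2}\,\frac{|W|}{M_{\rm Pl}^2} \;\sim\; M_{\rm Pl}\, e^{-b\langle\Phi\rangle}\,,$$
    exponentially suppressed relative to the Planck scale already for moderate values of ##b\langle\Phi\rangle##.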

    This is not a mathematical proof, but it is a decent scientific argument based on an actual theory, and it makes clear that exponential hierarchies between the susy breaking scale and the Planck scale have a good scientific explanation from first principles, just as the exponential scale between the power of a supernova and that of an ordinary star does, both of which may look surprisingly "unnatural" to the mathematically unaided observer.

     
    Last edited: Apr 4, 2018
  3. Apr 4, 2018 #23
    What I was asking about in my reply was the following, especially the unknown contributions (glad to find this today, as it vocalizes my concern):

    [Image: Fig. 5 from Strassler's naturalness article; its caption is quoted below]

    https://profmattstrassler.com/artic...ics-basics/the-hierarchy-problem/naturalness/
    "Fig. 5: Summing up the energy from the quantum fluctuations of known fields (schematically shown, upper row) up to the maximum energy scale vmax(down to the minimum distance scale) where the Standard Model still applies, and adding to this contributions from unknown effects from still higher energies and shorter distances (schematically shown, middle row), we must somehow find what experiment tells us is true: that the Higgs field’s average value is 246 GeV and the Higgs particle’s mass is 125 GeV/c2. If vmax is much larger than 500 GeV, this requires a very precise cancellation between the known and unknown sources of energy, one that is highly atypical of quantum theories."

    Do you have a more mathematical version of this argument on your site, in terms of renormalization group flow, the Higgs self-coupling, etc., describing the same thing? If vmax is just 500 GeV instead of the Planck scale, does that mean the Higgs self-coupling runs only up to 500 GeV instead of up to the Planck scale? What would the LHC have to produce for vmax to be only 500 GeV? Matt wrote in the commentary after the article:

    "The obvious solution is indeed that vmax is near 500 GeV. And if that is true, the LHC will discover as-yet unknown particles, and other predictions of the Standard Model will fail as well. The strongest evidence against it — inconclusive at this time — is that the LHC has not yet discovered any such non-Standard-Model particles, and there are no known deviations from the Standard Model at the current time. Arguably we should have seen subtle deviations already by now. But I will get to this issue soon."
     
  4. Apr 4, 2018 #24

    Urs Schreiber

    Science Advisor
    Gold Member

    The problem with this folklore story is that it is trying to nail jelly to the wall by arguing about the size of quantum corrections in the presence of renormalization freedom.

    The fact of the matter is that the space of renormalization choices is an affine space (i.e. like a vector space, but with no origin singled out) which means first of all that there is no absolute concept of "size" of a quantum correction. This only appears once one fixes a renormalization scheme, which is like a choice of coordinate chart. It has no physical meaning. Even if we fix a renormalization scheme (which is implicitly assumed in discussions such as you quote) then it still remains a fact that there is an arbitrary freedom in choosing renormalization constants, large or not.
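    As a toy illustration of that point (my paraphrase, not a formula from the thread): if two renormalization schemes assign counterterm constants ##c_k## and ##c'_k## at order ##k##, their difference ##c'_k - c_k## is well-defined, but neither set by itself is "the size of the correction", since
    $$c_k \;\longmapsto\; c_k + \delta_k$$
    is an allowed change of scheme for arbitrary ##\delta_k##. Saying "the quantum correction is large" is like saying "this point of an affine space is far from the origin": it only acquires meaning after an origin (a scheme) has been chosen, and that choice carries no physical information.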

    In conclusion, to make progress on these kinds of matters, one needs more theoretical input than just low energy effective perturbative quantum field theory with its arbitrary renormalization freedom, or otherwise one is going in circles forever. As Kane 17 points out, "we should look harder for a theory that does provide a UV-completion".

    Notice how the solution of the hierarchy problem observed by Acharya-Kane-Kumar 12, section V.A.2 (p. 10-11) deals with this issue: They invoke a UV-completion that goes beyond perturbation theory. In that theory one knows 1) that the superpotential is protected against perturbative renormalization freedom and 2) the form of the non-perturbative corrections is known. Namely these are exponentials in the inverse coupling. This yields the exponential hierarchy that is to be explained.

    You see, this works not by long story-telling and analogies and showing colorful pictures, but by a logical deduction from a theoretical framework.
    (Not fully mathematically rigorous, but fairly solid by the standards of phenomenology.)
     
  5. Apr 5, 2018 #25
    To come back to the original question, here is another very recent calculation regarding meta-stability of our universe:

    Scale-invariant instantons and the complete lifetime of the standard model
    Anders Andreassen, William Frost, and Matthew D. Schwartz
    Phys. Rev. D 97, 056006 – Published 12 March 2018
    https://journals.aps.org/prd/abstract/10.1103/PhysRevD.97.056006
     
  6. Apr 5, 2018 #26
    I read and reread Matt's arguments, trying to understand his viewpoint. I think he got discouraged by the LHC null results in 2013, and on a separate page in August 2013 (https://profmattstrassler.com/2013/08/27/a-first-stab-at-explaining-naturalness/) he wrote:

    "This in turn is why so many particle physicists have long expected the LHC to discover more than just a single Higgs particle and nothing else… more than just the Standard Model’s one and only missing piece… and why it will be a profound discovery with far-reaching implications if, during the next five years or so, the LHC experts sweep the floor clean and find nothing more in the LHC’s data than the Higgs particle that was found in 2012."

    It's happening now: it's been 5 years since his pronouncement, and it's only the Higgs and nothing else, the "profound discovery with far-reaching implications". There is the possibility that there may not even be a UV-complete theory. To stay on topic: where is Matt's page about vacuum metastability? He should emphasize it too.

    Anyway: if AdS/CFT is UV complete but doesn't describe our spacetime, is this duality just a toy model, or does it describe a hidden sector that can act as a holographic surface? Is there any reference about this search for an actual surface, or will it remain a toy model for centuries to come? And can AdS/CFT explain the why of the vacuum metastability?
     
  7. Apr 5, 2018 #27
    Here's Matt page about vacuum metastability:

    https://profmattstrassler.com/articles-and-posts/particle-physics-basics/theories-and-vacua/

    Metastability relies on the Higgs mass of ~125 GeV.
    Equally important to know is why the Higgs vev is 246 GeV. Can this be computed from first principles?
    And also why the Higgs is turned on at all.

    I guess pondering them and solving them simultaneously may be more encompassing. What I want to know is: if 125 GeV is metastable, what range of the Higgs field vev (it's 246 GeV, but what if it were, say, 3 TeV) would still affect the metastability? (Or is there no direct relationship, and they are independent?)
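    For a concrete feel for where the "~125 GeV implies metastable" statement comes from, here is a rough one-loop sketch (my own illustration, not taken from this thread or from the precision papers): run the SM couplings up in scale and watch where the Higgs quartic ##\lambda## turns negative. Everything below uses tree-level matching at ##m_t## with one-loop running, so the crossing scale is only indicative; the serious 2-3-loop treatments place the zero of ##\lambda## around ##10^{10}##-##10^{11}## GeV.

```python
# One-loop running of the SM couplings (illustrative sketch only:
# tree-level matching at m_t, no threshold corrections, one loop throughout).
import numpy as np
from scipy.integrate import solve_ivp

def rges(t, y):
    """One-loop SM beta functions; t = ln(mu/m_t), y = (g1, g2, g3, yt, lam).
    Here g1 is the hypercharge coupling g' (non-GUT normalization)."""
    g1, g2, g3, yt, lam = y
    k = 1.0 / (16.0 * np.pi**2)
    dg1 = k * (41.0 / 6.0) * g1**3
    dg2 = k * (-19.0 / 6.0) * g2**3
    dg3 = k * (-7.0) * g3**3
    dyt = k * yt * (4.5 * yt**2 - 8.0 * g3**2 - 2.25 * g2**2 - (17.0 / 12.0) * g1**2)
    dlam = k * (24.0 * lam**2 + 12.0 * lam * yt**2 - 9.0 * lam * g2**2
                - 3.0 * lam * g1**2 - 6.0 * yt**4
                + (3.0 / 8.0) * (2.0 * g2**4 + (g2**2 + g1**2)**2))
    return [dg1, dg2, dg3, dyt, dlam]

mt, v, mh = 173.0, 246.0, 125.0
y0 = [0.36, 0.65, 1.16,          # rough g', g2, g3 at mu = m_t
      np.sqrt(2.0) * mt / v,     # yt from m_t = yt * v / sqrt(2)
      mh**2 / (2.0 * v**2)]      # lam from m_H^2 = 2 * lam * v^2

t_max = np.log(1e19 / mt)        # run up to roughly the Planck scale
sol = solve_ivp(rges, [0.0, t_max], y0, dense_output=True, rtol=1e-8)

ts = np.linspace(0.0, t_max, 4000)
lam = sol.sol(ts)[4]
if (lam < 0).any():
    mu_zero = mt * np.exp(ts[np.argmax(lam < 0)])
    print(f"lambda turns negative near mu ~ {mu_zero:.1e} GeV")
else:
    print("lambda stays positive up to the Planck scale")
```

    Note how the vev enters only through the matching conditions ##m_H^2 = 2\lambda v^2## and ##m_t = y_t v/\sqrt{2}##: changing ##v## while holding the masses fixed changes the starting values of ##\lambda## and ##y_t##, which is one handle on your question of how the vev feeds into the stability verdict.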
     
  8. Apr 5, 2018 #28

    Urs Schreiber

    Science Advisor
    Gold Member

    We have given you pointers to detailed discussion of this. If you want to go beyond scratching the surface, you'll need to eventually pass from reading blogs to reading real scientific documents. That will also allow you to incrementally improve your questions.

    From AFS18, the best present data on the stability issue seems to be the following:

    [Image: Higgs vacuum stability phase diagram from AFS18]
     
  9. Apr 5, 2018 #29
    [Image: table from the video; its left-hand column, the "Correct Fix", is discussed below]

    After watching this video, it seems more likely that the Correct Fix (above) would shed light on the vacuum metastability too. Yes, I'll read the actual scientific papers from now on, having seen the bird's-eye view. Thanks a lot for helping.
     
  10. Apr 6, 2018 #30

    Urs Schreiber

    Science Advisor
    Gold Member

    Sure. Notice that the "Correct Fix" on the left is "having a theory".

    Concerning the remainder of the video: maybe our times serve to raise the subconscious archetype of crisis, but really, in natural science it is not a "crisis" to discover a phenomenon that looks non-random (non-generic). On the contrary, this is what drives science, and the left column of that table is a triumphal testimony of that.

    One part of the hep community says that the LHC results reveal a "nightmare scenario" where nothing interesting is discovered; but when something interesting is discovered after all, the other part calls this a "crisis". Worse, these two parts of the community seem to have non-empty intersection. This smells of hysteria more than of sober scientific discourse.

    The truth is that the LHC results both confirm established theory and at the same time show clearly and unambiguously a new physical effect that is not explained by established theory. That's really the dream scenario of natural science. Instead of rambling on about chance and likeliness, this should make the scientific community turn to its core task, which is to produce theories and iteratively improve on them.

    Here we need a theory that explains why the vacuum would sit on the verge of instability, but not beyond. I am aware of two good contenders:

    One is I) asymptotic safety; the other is II) any theory in which a) field moduli such as the Higgs potential are themselves dynamical, and b) they are prevented from crossing into the realm of instability by some principle or symmetry.

    About I): I wish the analysis of theory and data had been presented more intelligibly, see #8, #20 above.

    About II): This is the theory that people like Gordon Kane are pointing out (Kane 18, "Clue 4"). It may turn out wrong, but at least it is a theory. The sensible thing to do in science is to investigate this theory further and check if it holds water.

    Or else come up with a better theory. But we do need to talk about theories and not get lost in informal handwaving about probabilities, likeliness, genericity. The universe is neither likely nor generic, instead it is exceptional in its existence and its properties. In the past natural science, in the modern guise of maths-based theoretical physics, has managed to understand to a large extent why this is so. There is no reason to give up on this success story now.
     
  11. Apr 6, 2018 #31
    It stands to reason that the following need to be solved at the same time (or at least pondered simultaneously):

    1. Why are the Higgs mass and vev low for the SM coupled to gravity? Matt mentioned that this requires tuning to roughly one part in 10^30 (Naturalness Problem).
    2. Hierarchy Problem (why don't all particles have Planck-scale masses?)
    3. Vacuum Metastability (why is the Higgs mass 125 GeV, bordering on the edge of metastability?)
    4. Cosmological Constant Problem (disagreement between the observed value of the vacuum energy density and the large theoretical value of the ZPE suggested by QFT; a rough estimate of the mismatch is sketched below)
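    For orientation, the standard back-of-envelope version of the mismatch in item 4 (textbook numbers, not from this thread): the observed dark energy density is
    $$\rho_\Lambda^{\rm obs} \;\sim\; \left(10^{-3}\,\text{eV}\right)^4 \;\sim\; 10^{-47}\,\text{GeV}^4\,,$$
    while cutting off the zero-point energies of the SM fields at the Planck scale suggests
    $$\rho_{\rm ZPE} \;\sim\; M_{\rm Pl}^4 \;\sim\; 10^{76}\,\text{GeV}^4\,,$$
    a discrepancy of roughly ##10^{123}##, the origin of the famous "worst prediction in physics" figure.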

    You mentioned solutions to the first 3 in this thread. May I know your take on the 4th problem (the cosmological constant problem), a possible solution, and whether there is any connection to the first 3 problems, especially vacuum metastability?

    Do you have a single model that can explain all 4? At minimum, how many separate models would be needed to explain them? 2? 4?

    Wikipedia's last paragraphs on the cosmological constant problem are:
    "In the case of the Higgs mechanism, the usual Higgs vacuum expectation value in the instant-form vacuum is replaced by a constant scalar background field - a "zero mode" with kμ=0. The phenomenological predictions are unchanged using the LF formalism. Since the Higgs zero mode has no energy or momentum density, it does not contribute to the cosmological constant.

    The small non-zero value of the cosmological constant must then be attributed to other mechanisms; for example a slight curvature of the shape of the universe (which is not excluded within 0.4% (as of 2017)[14][15][16]) could modify the Higgs field zero-mode, thereby possibly producing a non-zero contribution to the cosmological constant."

    Could there be one model or solution for all of them at the same time? It's weird that they come in company; maybe they have a common source or origin.
     
  12. Apr 7, 2018 #32

    Urs Schreiber

    Science Advisor
    Gold Member

    On the one hand:

    A) In perturbative quantum field theory, the cosmological constant, too, is subject to renormalization freedom (a basic point often neglected; detailed discussion is referenced here), so that all we can do is measure it.

    On the other hand:

    B) The approach of appealing to non-perturbative effects thought to be known in string theory, which has become a small industry since KKLT 03, has been argued to have been carried out too carelessly; see the well-written recent article
    • Ulf Danielsson, Thomas Van Riet,
      "What if string theory has no de Sitter vacua?"
      (arXiv:1804.01120)
    These authors argue that the "several hundred papers on the topic", of which "unfortunately there is no review" (p. 7), are all inconclusive, because none of them take care to really check the assumptions being made, and are in fact likely all wrong, since closer inspection shows that these assumptions are quite dubious.

    (Sometimes physicists question the point of rigorous mathematical argument. This is a good example for why it's useful: It simply means not to leave big gaps in your argument that later make the whole conclusion collapse. Large swaths of contemporary physics arguments are bound to eventually collapse this way, until the general attitude of the community changes.)

    The conclusion is (p. 26) "that string theory has not made much progress on the problem of the cosmological constant during the last 15 years [since] string theorists have not been up to the challenge."

    So we don't know how B) pans out. Until we do, we are stuck with A).
     
    Last edited: Apr 7, 2018
  13. Apr 7, 2018 #33
    Thanks. So the CC is not related to vacuum metastability; let's get back to the latter exclusively.

    Do you have models where, instead of just the ground state and one false vacuum, there are many false vacua? This makes better sense to me. You see, the universe is so exceptional in its existence and properties, so incredibly designed for life and all that, that it doesn't make a lot of sense for the ground state to be its abyss. To illustrate: it's as if our universe is Disneyland. It doesn't make sense to have tactical nukes in the middle of Disneyland, able to destroy it at any moment (the possibility of the false vacuum tunneling to a destructive true vacuum). Instead, there should be many false vacua, each with properties as incredible as our universe's, or compatible with life in ascended forms. Or imagine all the Disneylands on Earth located in one place, with only one of them able to manifest at a time, so that tunneling to another false vacuum means tunneling to a different Disneyland scenery and design.

    I heard that multiple false vacua are not refuted yet. What is the best model of this?
     
  14. Apr 7, 2018 #34
    This one sentence caused me an hour of reading of your references. About your statement: "This means that apart from the freedom of choosing a classical cosmological constant in the Einstein-Hilbert action as above, its perturbative quantization (perturbative quantum gravity) introduces renormalization freedom to the value of the cosmological constant."

    and on your Quantum Gravity page reference
    "Thus, this idea seems to be very unlikely. At the same time, it is clear that present experiments cannot directly probe the effects of that supposed UV-completion of Einstein-gravity.

    Approaches to a full quantization of gravity therefore roughly fall into two different strategies:
    1. One assumes that the Einstein-Hilbert action is indeed the effective QFT that approximates a "UV-completion", a more fundamental theory valid at all energies. This is the approach taken for instance in string theory.
    2. One assumes that by some other fact that has been overlooked, one can make sense of a non-perturbative quantization of the EH action at arbitrary energies after all. This is for instance the case in speculations that EH-gravity has a UV fixed point."

    Which of the two above carries more weight (if they are not equal) for the renormalization freedom of the CC? I mean, what if gravity and QFT were just emergent from a third theory (this was mentioned by Smolin and others)? Would the renormalization freedom of the CC remain valid?

     
  15. Apr 8, 2018 #35
    My understanding is that in the G2-MSSM, the scale of the Higgs vev is generated and protected by specific mechanisms, such as you describe. But protection of the Higgs mass from large corrections still relies on low-scale supersymmetry (just like other susy models). And so in that regard, the G2-MSSM conforms to Strassler's reasoning. Strassler says that the scale "vmax" at which the SM ceases to be valid can't be too high, or else there will be unacceptable finetuning; and in these G2 compactifications of M theory, the MSSM is indeed supposed to replace the SM, not too far above the weak scale.

    As for the cosmological constant, Bobkov argued that a version of the Bousso-Polchinski mechanism can work within the G2-MSSM.
     
  16. Apr 9, 2018 #36

    To tame vmax without fine-tuning, it seems supersymmetry (like the MSSM) really is required not far from the weak scale. But then I wonder what else could do it without supersymmetry; anyone got any idea? Perhaps some quantum gravity effect that can act far below the Planck scale too? Remember, beyond 10^-18 m are scales where no known particles live; maybe space is composed of something that can tame it, solving this and the CC at the same time, and even the Hierarchy Problem?

    Also, what's weird besides the mass being so low is that it sits on the edge of metastability. Maybe the 4 problems are related somehow, if there is some mechanism of quantum gravity that can act far from the Planck scale.
     
  17. Apr 13, 2018 #37
    These comments have been bothering me for over a week now. To really address them authoritatively, I would have to review a lot of renormalization theory, but for now I'll just state my thoughts and see what comes of it.

    There is a kind of field theory that is renormalizable up to arbitrarily high scales. The Lagrangian only contains renormalizable terms. This is the original concept of a renormalizable field theory.

    Then there are field theories which also contain non-renormalizable terms with coefficients that have a dependence on a physical cutoff scale, above which the theory is not defined. This is called an effective field theory.

    I feel like Urs's comments apply most directly to a field theory that is renormalizable in the original, unrestricted sense. This actually includes the standard model, because, as explained here, all the corrections can be absorbed into the renormalization.

    However, Strassler is, by hypothesis, treating the standard model as an effective theory, valid only up to a physical cutoff scale. And he's saying that in that case, the higher the cutoff scale, the greater the finetuning required in order to keep the Higgs mass low.

    Surely Strassler's reasoning can be made as rigorous as anything else in quantum field theory. We might have to use the "standard model effective field theory", in which the non-renormalizable terms are added to the SM lagrangian. And we might quantify the finetuning by looking at how restricted are the coefficients of the non-renormalizable terms (which codify the effects of unknown BSM physics), if we want the Higgs to stay light.
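    For instance (a standard textbook estimate, not Strassler's exact formulation): with a hard cutoff ##\Lambda## on the effective theory, the top loop alone shifts the Higgs mass-squared by roughly
    $$\delta m_H^2 \;\simeq\; -\frac{3 y_t^2}{8\pi^2}\,\Lambda^2\,,$$
    so keeping ##m_H \approx 125## GeV requires the bare parameter (or, in SMEFT language, the unknown UV contributions) to be adjusted against this to about one part in ##|\delta m_H^2|/m_H^2##, a measure of tuning that grows quadratically with the cutoff.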

    My complaint is that Urs treats strictly renormalizable field theory, and a UV-complete theory like string theory, as the only useful forms of reasoning, when in fact effective field theory is enormously useful. And a technical question would be, just how great are the "freedoms", the ambiguities, in effective field theory? I think they would be far more reasonable in size.

    As evidence, I would point to the discussion of renormalon-induced uncertainties in the top quark pole mass, in QCD. No-one there is saying that the top quark pole mass can be anything at all; the uncertainty is pretty small. And I think that would be far more characteristic of the ambiguities in EFT quantities.
     
  18. Apr 14, 2018 #38

    Urs Schreiber

    Science Advisor
    Gold Member

    I don't see which distinction you are meaning to invoke. The perspective of effective field theory is one way of several to look at the issue of renormalization freedom. All these are different perspectives on the same thing. This is explained in the Subsection titled Wilson-Polchinski effective QFT flow within the PF-Insights on Mathematical QFT -- Renormalization.

    Independently of that, it may be worthwhile to recall the following subtlety in the terminology "renormalizable":

    The technical term "renormalizable" just means that, of the a priori infinitely many renormalization constants, it so happens that only a finite number appear. If a theory is non-renormalizable in this sense, it just means that there are infinitely many choices to be made in computing any quantity at arbitrary energy. That may be physically disappointing, but it is not mathematically inconsistent.
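    A standard power-counting illustration (my addition, to make the counting concrete): in ##d = 4## the interaction
    $$\mathcal{L}_{\rm int} = -\frac{\lambda}{4!}\,\varphi^4$$
    needs only the field-strength, mass and coupling counterterms at every order, i.e. finitely many renormalization constants; whereas a dimension-6 interaction such as
    $$\mathcal{L}_{\rm int} = -\frac{c_6}{M^2}\,\varphi^6$$
    generates divergences requiring ##\varphi^8, \varphi^{10}, \dots## counterterms order by order, each coming with its own free constant: an infinite list of choices, but no inconsistency.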

    This point was established way back by Epstein-Glaser 73, but somehow the community goes through cycles of forgetting and rediscovering it. One place where it is rediscovered is

    • J. Gomis, Steven Weinberg,
      "Are nonrenormalizable gauge theories renormalizable?",
      Nucl. Phys. B 469 (1996) 473
      (arXiv:hep-th/9510087)

    Namely, we can just keep making choices of renormalization constants on and on. In Epstein-Glaser 73 this infinite list of choices is organized by loop order, while in effective field theory it is organized by energy scale, but the principle is the same in both cases.

    What is good about the Epstein-Glaser perspective of renormalization is that it gives one crystal clear picture of what the space of choices is.
     
  19. Apr 16, 2018 #39
    The distinction between a renormalizable field theory, whose cutoff can be taken to infinity, and a nonrenormalizable field theory that is still predictive, but only up to a finite cutoff. As a reference, let me cite Mark Srednicki's QFT text, which introduces the distinction at the start of chapter 18 and develops effective field theory in chapter 29. The fine-tuning problem is mentioned halfway through chapter 29. The physical interpretation of a nonrenormalizable theory as an effective action valid only up to a finite cutoff begins just before eqn 29.37 ("The Wilson scheme also allows us...").

    I'm just saying that this is the context in which Strassler's argument is made, and it is a valid argument given its premises: if there is an EFT with a light scalar, the higher the cutoff, the more tuning is required. (Maybe it could be formalized with Kevin Costello's framework.)
     