What is the best theory for why our vacuum may be on the edge of metastability?

In summary, the best theory for why our vacuum may be on the edge of metastability is that this makes it easier to cross between the different false vacua (not necessarily to the true vacuum). This is weighed against the premise that slight tweaks to the renormalization group running of the Higgs could shift the universe from metastable to stable. However, this is still an uncertain proposition and has yet to be proven.
  • #1
jtlz
[Attached plot: 14myE5.jpg, a vacuum stability phase diagram; identified in #5 as taken from Degrassi et al. 2012]


What is the best theory for why our vacuum may be on the edge of metastability?

Is it possible that there are many false vacua separated by energy barriers, and that the reason our vacuum may be metastable is that this makes it easier to cross between the different false vacua (not necessarily to the true vacuum)?
 

  • #2
jtlz said:
What is the best theory for why our vacuum may be on the edge of metastability?

It's got to be somewhere.
 
  • #3
Here may be a simpler question related to it. I read in Deep Down Things that:

"What is the physical origin of the external Higgs potential? What is the physical basis of the peer pressure–like internal organizing principle that causes the Higgs field symmetry to be hidden in a uniform manner throughout space-time? These questions, unanswerable within the Standard Model, suggest that, beyond the unearthing of the Higgs boson, there may well be some radically new facet of the natural world awaiting our discovery."

So far, what are the existing proposals for the physical origin of the external Higgs potential? How is it related to the vacuum metastability?
 
  • #4
The underlying premise of the question, it is worth noting, isn't on entirely solid ground.

It would take only the slightest of tweaks to the renormalization group running of the Higgs (or the inclusion of some slight, usually overlooked nuance of the calculation that the metastable result relies upon) from BSM physics to nudge the universe from merely metastable to stable.

For example, the SM plus quantum gravity, or a singlet dark matter particle with a Higgs portal, might cause such a slight tweak. The now-famous asymptotic-safety prediction of the Higgs boson mass, for example, included a quantum-gravity adjustment to the renormalization of the Higgs with energy scale.

There is also a fair amount of imprecision in the top quark mass (in absolute terms, even though the percentage uncertainty isn't that large) that could give the stability-metastability question a nudge one way or the other. Probably not enough to cross the line by itself, but enough to make the necessary BSM tweak even more subtle.
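To make the top-mass sensitivity concrete, here is a minimal numerical sketch (my own illustration, one-loop only, nothing like the NNLO machinery of the papers cited below; the matching values at ##M_t## are rough tree-level estimates, so the crossing scale is only indicative):

[CODE=python]
# One-loop SM running of the Higgs quartic coupling: where does lambda turn
# negative, and how does that scale move as the assumed top mass is varied?
import numpy as np
from scipy.integrate import solve_ivp

V = 246.0  # Higgs vev in GeV

def rge(t, y):
    # One-loop SM beta functions; t = ln(mu/GeV), y = (lam, yt, g3, g2, g1).
    lam, yt, g3, g2, g1 = y
    k = 1.0 / (16 * np.pi**2)
    beta_lam = k * (24*lam**2 - 6*yt**4
                    + 0.375*(2*g2**4 + (g2**2 + g1**2)**2)
                    + lam*(12*yt**2 - 9*g2**2 - 3*g1**2))
    beta_yt = k * yt * (4.5*yt**2 - 8*g3**2 - 2.25*g2**2 - (17.0/12)*g1**2)
    return [beta_lam, beta_yt, k*(-7)*g3**3, k*(-19.0/6)*g2**3, k*(41.0/6)*g1**3]

def crossing(t, y):
    # Event: lambda crosses zero from above.
    return y[0]
crossing.terminal = True
crossing.direction = -1

def instability_scale(mt, mh=125.1):
    # Tree-level matching at mu = mt: m_H^2 = 2*lam*v^2, yt = sqrt(2)*mt/v.
    y0 = [mh**2 / (2*V**2), np.sqrt(2)*mt/V, 1.17, 0.64, 0.36]  # rough couplings at M_t
    sol = solve_ivp(rge, [np.log(mt), np.log(1.2e19)], y0,
                    events=crossing, rtol=1e-8)
    return np.exp(sol.t_events[0][0]) if sol.t_events[0].size else None

for mt in (170.0, 172.5, 175.0):
    s = instability_scale(mt)
    print(f"M_t = {mt:5.1f} GeV -> " +
          (f"lambda turns negative near {s:.1e} GeV" if s
           else "lambda stays positive up to the Planck scale"))
[/CODE]

Shifting ##M_t## by a couple of GeV moves the scale at which ##\lambda## turns negative by orders of magnitude, which is the sense in which the top-mass uncertainty nudges the verdict.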
 
  • #5
You all should provide citations for claims you make or refer to, lest the discussion becomes arbitrary.

For the record: The plot from the original question in #1 is taken from

Giuseppe Degrassi, Stefano Di Vita, Joan Elias-Miró, José R. Espinosa, Gian Giudice, Gino Isidori,
"Higgs mass and vacuum stability in the Standard Model at NNLO",
J. High Energ. Phys. (2012) 2012: 98 (arXiv:1205.6497, doi:10.1007/JHEP08(2012)098)

This computation has meanwhile been refined in

A. V. Bednyakov, B. A. Kniehl, A. F. Pikelner, and O. L. Veretin,
"Stability of the Electroweak Vacuum: Gauge Independence and Advanced Precision",
Phys. Rev. Lett. 115, 201802 (arXiv:1507.08833, doi:10.1103/PhysRevLett.115.201802)

where the result is plotted like so:

[Plot: HiggsVacuumStabilityII.png, the stability/metastability phase diagram from Bednyakov et al. 2015]


An expository summary of this result appeared in

Alexander Kusenko,
"Are We on the Brink of the Higgs Abyss?",
Physics 8, 108 (2015)

which concludes as follows:

"[The] conclusion is that the best theoretical fit to measured parameters, including the Higgs and top-quark masses, points to a metastable Universe. However, their analysis also concludes that values of parameters are closer to a region of absolute stability than suggested by previous studies: it is possible for the Universe to be fully stable (and for the standard model to work all the way up to the Planck scale), if the true values of measured parameters are only 1.3 standard deviations away from the current best estimates."

In view of this, it seems to me that the question in #1 as phrased does make good sense: according to the available data, the observed Higgs vacuum is indeed very close to the edge of metastability. This necessarily implies that it is uncertain on which side of this edge we are, but that was not the question.
 

  • #6
jtlz said:
What is the best theory for why our vacuum may be on the edge of metastability?

One argument that I am aware of is the following:

A number of references argue that Higgs vacuum instability is in fact incompatible with cosmic evolution, since, due to vacuum fluctuations during inflation, the vacuum decay would not have been avoided (EGR 07, EEGHR 09, HKSZ 14, EGMRSST 15, EKSYZ 16), unless a further mechanism served to prevent it. That supersymmetry provides such a mechanism has been argued in (BDGGSSS 13, section 7, Kane 18, "Clue 4").
 
  • #7
As Philip Gibbs kindly points out to me, he was maybe the first to highlight the core of these points and arguments way back in 2011, right before the official announcement of the Higgs detection:

Philip Gibbs,
"What would a Higgs at 125 GeV tell us?",
in "Seminar Watch (Higgs Special), Rumoured Higgs at 125 GeV and What Would a Higgs at 125 GeV Tell Us?",
Prespacetime Journal, December 2011, Vol. 2 Issue 12 pp. 1899-1905 (web)
 
  • #8
It is also interesting to note how the precision analysis of Bednyakov et al. 15 seems to invalidate the now-famous suggestion by Shaposhnikov-Wetterich 09 (which was motivated by the principle of "asymptotic safety") that the beta function ##\beta_\lambda## for the quartic Higgs self-coupling ##\lambda## should vanish asymptotically. According to their precision plot (pp. 17-18), it does not asymptotically vanish at all:

[Plot: HiggsQuarticBetaFunctionRelative.png, the running of ##\beta_\lambda## from Bednyakov et al. 2015]
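For orientation (my addition; this is standard one-loop material, not something specific to these papers): in the convention where ##m_H^2 = 2\lambda v^2##, the one-loop beta function reads

$$16\pi^2\,\beta_\lambda^{(1)} \;=\; 24\lambda^2 \;+\; 12\lambda y_t^2 \;-\; 6 y_t^4 \;-\; 3\lambda\left(3g_2^2 + g_1^2\right) \;+\; \frac{3}{8}\left(2g_2^4 + \left(g_2^2 + g_1^2\right)^2\right),$$

and the ##-6y_t^4## top-quark term is what drags ##\lambda## negative at high scales. An asymptotically vanishing ##\beta_\lambda## would require these terms, plus their higher-loop corrections, to cancel ever more precisely, which is what the plot above tests.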
 

  • #9
Urs Schreiber said:
You all should provide citations for claims you make or refer to, lest the discussion becomes arbitrary.

For the record: The plot from the original question in #1 is taken from

Giuseppe Degrassi, Stefano Di Vita, Joan Elias-Miró, José R. Espinosa, Gian Giudice, Gino Isidori,
"Higgs mass and vacuum stability in the Standard Model at NNLO",
J. High Energ. Phys. (2012) 2012: 98 (arXiv:1205.6497, doi:10.1007/JHEP08(2012)098)

Thanks for the paper. It explained a lot.
However, is there no mathematical or theoretical possibility for new physics that doesn't affect the Higgs coupling or the Higgs potential? Or are they all categorically related? Perhaps some new force of nature has no virtual-particle contribution at all? With regards to this paragraph:

"If the LHC finds Higgs couplings deviating from the SM prediction and new degrees of freedom at the TeV scale, then the most important question will be to see if a consistent and natural (in the technical sense) explanation of EW breaking emerges from experimental data. But if the LHC discovers that the Higgs boson is not accompanied by any new physics, then it will be much harder for theorists to unveil the underlying organizing principles of nature. The multiverse, although being a stimulating physical concept, is discouragingly difficult to test from an empirical point of view. The measurement of the Higgs mass may provide a precious handle to gather some indirect information"

Up to what scale must we probe before we can say that there is no new physics at the TeV scale? Is it 1 TeV? 10 TeV? 100?

And does it always have to involve energy? For example, general relativity was new physics last century that didn't involve TeV scales.
 
  • #10
jtlz said:
Thanks for the paper. It explained a lot.

Just out of interest: I take that to mean that you did not get the plot in your message #1 from that paper? Where did you get it from?
 
  • #11
Urs Schreiber said:
Just out of interest: I take that to mean that you did not get the plot in your message #1 from that paper? Where did you get it from?

I used yahoo.com: I typed "vacuum metastability" and several dozen images came up. I chose the clearest picture.
 
  • #12
jtlz said:
...possibility for there to be new physics that doesn't affect the Higgs coupling or the Higgs potential? ... Perhaps some new force of nature has no virtual-particle contribution at all? ... Up to what scale must we probe before we can say that there is no new physics at the TeV scale? ...
And does it always have to involve energy? For example, general relativity was new physics last century that didn't involve TeV scales.

I find it hard to tell what you are asking here, as it sounds a little garbled. You should be asking whether the renormalization group flow of the Higgs self-coupling up to the Planck scale, as shown here, is computed under the assumption that there is no new physics between the weak scale and the Planck scale (the "great desert" assumption). And the answer to that is: yes.

And then the logic turns around: Since under the "great desert"-assumption the RG flow is seen to run at least extremely close to the point where the Higgs vacuum would become unstable, one may argue that this assumption must be wrong, and that there should be some new physics kicking in, which prevents that. Such as, possibly, supersymmetry or something else. This is not a solid proof, of course, but that's the kind of plausibility argument usual in phenomenology.
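In formulas (my paraphrase of the setup in the papers cited above): one integrates the SM renormalization group equations from the weak scale up, with no new thresholds in between, and asks where the quartic coupling first turns negative,

$$\mu \frac{d\lambda}{d\mu} \;=\; \beta_\lambda^{\rm SM}(\lambda, y_t, g_i), \qquad \lambda(\Lambda_I) = 0,$$

and under the desert assumption the NNLO analyses cited above find an instability scale ##\Lambda_I## of very roughly ##10^{10}##-##10^{11}## GeV, far below the Planck scale. New physics anywhere below ##\Lambda_I## can change the verdict.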
 
  • #13
Urs Schreiber said:
I find it hard to tell what you are asking here, as it sounds a little garbled. You should be asking whether the renormalization group flow of the Higgs self-coupling to the Planck scale as shown here is computed with the assumption that there is no new physics in between the weak scale and the Planck scale (called the "great desert" assumption). And the answer to that is: Yes.

In the same paper, regarding this "new scalar field, non-minimally coupled to gravity, that slows down the inflationary phase": is it possible that this is new physics which doesn't sit between the weak scale and the Planck scale? In other words, can a new scalar field exist that doesn't have a corresponding particle? Remember, we are not even sure gravity has gravitons. So generally, could any new physics occur that doesn't sit between the weak and Planck scales? Nature may use very exotic organizing principles that are not naturalness, the multiverse, Sabine's cosmic lotto, etc.

"If the Higgs field is trapped in the false vacuum during the early universe, it can cause inflation. The normalization of the spectrum of primordial perturbations, which is determined by Vmin, can be appropriately selected by tuning the ratio λ∗/b. The main difficulty of this scenario is to achieve a graceful exit from the inflationary phase. Two mechanisms have been proposed. The first one [24] employs a new scalar field, non-minimally coupled to gravity, that slows down the expansion rate, thus allowing for quantum tunneling of the Higgs out of the false vacuum. The second mechanism [25] uses a scalar field weakly coupled to the Higgs which, during the cosmological evolution, removes the barrier in the Higgs potential in a process analogous to hybrid inflation. So, in practice, the minimality of the SM is lost and one may wonder if there is any conceptual gain with respect to adding a new scalar playing the role of the inflaton. Nevertheless, it is interesting to investigate whether the Higgs and top masses are compatible with the intriguing possibility of a false vacuum at large field value."

 
  • #14
jtlz, you are rambling now in a non-educated way. What is it that you actually want to know?
 
  • #15
Urs Schreiber said:
You should be asking whether the renormalization group flow of the Higgs self-coupling up to the Planck scale ... is computed under the "great desert" assumption. ... Since under that assumption the RG flow is seen to run at least extremely close to the point where the Higgs vacuum would become unstable, one may argue that this assumption must be wrong ...

In more technical terms: new physics can affect the computation of the renormalization group flow of the Higgs self-coupling to the Planck scale, but what if that new physics is more fundamental than QFT, something from which both QFT and gravity are emergent? So my inquiry was whether it is possible to have new physics that is not in between the weak scale and the Planck scale but outside of them.

In other words: there may be a simpler alternative to the multiverse, by Occam's razor, namely that the organizing principles of nature simply were made that way. So the new physics may be related to the causal mechanism behind the scenes, behind our Standard Model, meaning any new physics could be independent of the Higgs coupling, the Higgs potential, and the virtual particles. For example, a Microsoft programmer can rewrite a program in different ways (using C or Fortran or a newer version) without our commercial Windows software being affected. May I know the official term for this kind of alternative route to new physics in current theoretical physics?
 
  • #16
jtlz said:
... So my inquiry was whether it is possible to have new physics that is not in between the weak scale and the Planck scale but outside of them. ... May I know the official term for this kind of alternative route to new physics in current theoretical physics?

What I meant is: our constants of nature are so incredibly finely tuned. The following are the possibilities:

1. Naturalness (like Calabi-Yau, where the constants could be in the form of moduli, etc.)
2. Multiverse
3. One universe, but constants chosen like a lotto win

Right now we have utter division in the field, with Sabine and company favoring the third, many the second, and the others the first possibility. Isn't it legitimate to consider other possibilities, such as our constants of nature being designed, with only one universe? So either the design used specific Calabi-Yau shapes chosen at the Big Bang, or some other new physics guides them. I was asking whether there is an official name for this new physics that guides them. Something along the lines of an actual implementation of AdS/CFT, perhaps? Right now it's just a toy model because it is not our spacetime. Could there be an implementation that uses our spacetime, or something similar in concept? This is the context of my question. Our constants of nature are so incredibly finely tuned that we need to think of all possibilities. Hence my questions are not rambling in a non-educated way, but reflect the great division and confusion now going on in theoretical physics (this is the theme of Sabine's new book, coming soon).
 
  • #17
jtlz said:
What I meant is: our constants of nature are so incredibly finely tuned. ... Isn't it legitimate to consider other possibilities, such as our constants of nature being designed, with only one universe? ...

I just read http://backreaction.blogspot.com/ where Sabine reviews the book "Farewell to Reality: How Modern Physics Has Betrayed the Search for Scientific Truth".

Sabine doesn't believe in naturalness or supersymmetry or even the multiverse. She reasons that seemingly fine-tuned things like the vacuum metastability just happen (like winning the lotto), without a formula, because there is no Bayesian background. But I find her position even weirder. So what if all of them are wrong and there are other options? That's all. The hierarchy problem and vacuum metastability, if confirmed, are just too strange, especially if there is no naturalness, no supersymmetry, no multiverse, and Sabine's cosmic-lotto model doesn't work (someone correct me if I got Sabine's position wrong, but I think that's the content of her book "Lost in Math").
 
  • #18
jtlz said:
Our constants of nature are so incredibly finely tuned. The following are the possibilities:

1. Naturalness (like Calabi-Yau, where the constants could be in the form of moduli, etc.)
2. Multiverse
3. One universe, but constants chosen like a lotto win

One option is missing:
  • Having a theory that explains the numbers.
"The arguments based on ‘naturalness’ are basically like saying the weather tomorrow should be the same as today. The opposite of naturalness is having a theory. […] It would have been nice if the naturalness arguments had worked, but they did not. Since they were not predictions from a theory it is not clear how to interpret that. [...] The failure of naïve naturalness to describe the world tells us we should look harder for a theory that does [work at high energies], an ‘ultraviolet completion’. [...] The alternative to naturalness, often neglected as an alternative, is having a theory." (Kane 17)

If you just think about it, you will realize that the universe is full of extreme ratios and hierarchical scales. Say the radius of the hydrogen atom over that of its nucleus, or the power emitted in a supernova over that of an ordinary burning star, to name just two. Without a theory of nature, these large ratios may seem mystifying. With a theory of nature they follow from basic laws and the mystery disappears.
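To illustrate with rough numbers (my own arithmetic, just to make the point concrete): the Bohr radius is fixed by the electron mass and the fine-structure constant, while the proton radius is set by the QCD scale, so the "mystifying" ratio

$$\frac{a_0}{r_p} \;=\; \frac{\hbar/(\alpha\, m_e c)}{r_p} \;\approx\; \frac{5.3\times 10^{-11}\,\text{m}}{0.9\times 10^{-15}\,\text{m}} \;\approx\; 6\times 10^{4}$$

follows entirely from basic laws once ##\alpha##, ##m_e## and ##\Lambda_{\rm QCD}## are given; nothing about it is finely tuned.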
 
  • #19
Urs Schreiber said:
It is also interesting to note how the precision analysis of Bednyakov et al 15. seems to invalidate the now famous suggestion by Shaposhnikov-Wetterich 09 (which was motivated from the principle of "asymptotic safety") that the beta function ##\beta_\lambda## for the quartic Higgs self-coupling ##\lambda## should vanish asymptotically.

It is worth noting that the beta function used by Shaposhnikov-Wetterich 09 is not the pure Standard Model beta function of the Higgs. It is modified to reflect the beta function that would hold in the Standard Model plus quantum gravity, in the form of an asymptotically safe massless spin-2 graviton whose contribution modifies the Higgs beta function from its Standard Model form. The asymptotic behavior of the tweaked beta function presumably reflects this modification. So the Shaposhnikov-Wetterich 09 beta function should not be expected to behave the same way as the different (pure-SM) Higgs beta function modeled in Bednyakov et al. 15.
 
  • #20
ohwilleke said:
It is worth noting that the beta function used by Shaposhnikov-Wetterich 09 is not the pure Standard Model beta function of the Higgs. ...

True, I was wondering about this. But Bednyakov et al. cite Shaposhnikov-Wetterich on their p. 17, right before they point out the failure of the beta function to vanish asymptotically, as if in reply to that claim. But it's true, it's not clear (to me) whether they really operate on the same assumptions. It would be good to sort this out...
 
  • #21
Urs Schreiber said:
One option is missing:
  • Having a theory that explains the numbers.
... Without a theory of nature, these large ratios may seem mystifying. With a theory of nature they follow from basic laws and the mystery disappears.

There seem to be many meanings of naturalness, so it's good to sort out the subtleties of each. In the paper containing the vacuum-metastability illustration in the initial message, the conclusion starts with the sentence "One of the most important questions addressed by the LHC is naturalness." I found three good references about naturalness after you emphasized that the opposite of naturalness is having a theory (I had thought naturalness meant exclusively having equations that give the values or relationships; I mention these references so that in the future I can refer back to them if I forget):

http://backreaction.blogspot.com/2018/02/what-does-it-mean-for-string-theory.html

"That the mass be natural means, roughly speaking, that getting masses from a calculation should not require the input of finely tuned numbers".

https://profmattstrassler.com/artic...ics-basics/the-hierarchy-problem/naturalness/ (about unknown physics and effect on the Higgs field I asked about)

and https://arxiv.org/pdf/1501.01035.pdf

"In implementing ’t Hooft’s notion of naturalness, we have so far considered symmetries of a sort familiar from quantum mechanics, generated by a charge operator which is a scalar under rotations. But there is another type of symmetry, allowed by general principles of quantum mechanics and relativity, where the symmetry generators are spinors. This symmetry is known as supersymmetry. We will consider it, first, as a global symmetry, but the symmetry can be elevated to a local, gauge symmetry."
[...]
"it is still possible that nature is “natural”, in the sense of ’t Hooft. Future runs of the LHC might provide evidence for supersymmetry, warped extra dimensions, or some variant of technicolor. But the current experimental situation raises the unsettling possibility that naturalness may not be a good guiding principle. Indeed, naturalness is in tension with another principle: simplicity. Simplicity has a technical meaning: the simplest theory is the one with the smallest number of degrees of freedom consistent with known facts. Contrast, for example, the minimal Standard Model, with its single Higgs doublet, with supersymmetric theories, with their many additional fields and couplings. So far, the experimental evidence suggests that simplicity is winning. The observed Higgs mass is in tension with expectations from supersymmetric theories, but also technicolor and other proposals."

----


If the masses of the superpartners are very high, what mechanism in superstring theory besides Kane's (his 0.5 TeV bino was already excluded) can solve the hierarchy problem without 't Hooft's notion of naturalness or natural supersymmetry (in the sense of the above paragraph, at low masses)?
I'm interested for now in a theory that can solve for them, instead of the multiverse, or genuine fine tuning between the quadratic radiative corrections and the bare mass being put there on purpose by design (if those were the mechanisms chosen by nature, then we would need new fields that can maintain the constants' values, possibly without any formulas).

And is the solution to the hierarchy problem independent of the vacuum metastability issue (to what extent can the solution of each be a solution of the other)? Thank you.

 
  • #22
jtlz said:
I had thought naturalness meant exclusively having equations that give the values or relationships

No, naturalness is a principle imposed in the absence of any known equation for the actual numbers. In the absence of an explanation, cancellations of large numbers against each other seems a weird coincidence, and naturalness means that weird coincidences should not happen. It's a qualitative principle akin to the principles of the times of "natural philosophy" before physics became a maths-based science.
jtlz said:
what mechanism in superstring theory besides Kane's (his 0.5 TeV bino was already excluded) can solve the hierarchy problem

What you call "Kane's theory" is called the ##G_2##-MSSM, which became a serious contender for realistic model building with the result of Acharya-Witten 01.
This model does not predict explicit numbers without feeding in some concrete assumptions on the precise nature of the compactification (if it did, we would be living in a Douglas Adams novel), but -- and that's really the point that Kane et al. have been driving home over the years -- it does make generic predictions that hold irrespective of the detailed numbers that describe the choice of compactification.

One of these generic predictions is that there is an exponential hierarchy between the Planck scale and the gravitino mass scale, hence the susy breaking scale, hence the electroweak scale. This argument is due to Acharya-Kane-Kumar 12, section V.A.2 (pages 10-11), and it is not hard to follow:

First, the higher gauge symmetry of the supergravity C-field implies "shift symmetry" of its KK-modes ##\Phi_j##. But the perturbative superpotential must be a holomorphic function of the ##\Phi_j##, and under shift symmetry this is only possible if it vanishes identically in perturbation theory. As a result, one deduces that in these models the superpotential consists entirely of non-perturbative contributions, such as membrane instantons, which are known to break the shift symmetry. But these non-perturbative contributions are negative exponentials of the instanton action, and hence imply that the gravitino mass (hence the susy breaking scale) is exponentially smaller than the Planck mass. This is the required exponential hierarchy.

This is not a mathematical proof, but it is a decent scientific argument based on an actual theory, and it makes clear that exponential hierarchies between the susy breaking scale and the Planck scale have a good scientific explanation from first principles, just as the exponential scale between the power of a supernova and that of an ordinary star does, both of which may look surprisingly "unnatural" to the mathematically unaided observer.
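In symbols, the shape of the argument is roughly the following (my schematic paraphrase, with purely illustrative numbers):

$$W \;\sim\; M_{\rm Pl}^3\, e^{-c/\alpha}, \qquad m_{3/2} \;\sim\; \frac{|W|}{M_{\rm Pl}^2} \;\sim\; M_{\rm Pl}\, e^{-c/\alpha},$$

so an order-one instanton coefficient ##c## and a small unified coupling give an exponential suppression for free: for instance ##c/\alpha \approx 35## yields ##e^{-35} \approx 10^{-15}##, turning ##M_{\rm Pl} \sim 10^{18}## GeV into a susy breaking scale of order a TeV without tuning any input.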

 
  • #23
Urs Schreiber said:
... the renormalization group flow of the Higgs self-coupling up to the Planck scale ... is computed under the assumption that there is no new physics between the weak scale and the Planck scale (the "great desert" assumption). ...

What I was asking about in your reply was the following, especially the unknown contribution (glad to find this today, as it vocalizes my concern):

[Image: HunFhn.jpg, Fig. 5 from Strassler's naturalness article; its caption is quoted below]


https://profmattstrassler.com/artic...ics-basics/the-hierarchy-problem/naturalness/
"Fig. 5: Summing up the energy from the quantum fluctuations of known fields (schematically shown, upper row) up to the maximum energy scale vmax(down to the minimum distance scale) where the Standard Model still applies, and adding to this contributions from unknown effects from still higher energies and shorter distances (schematically shown, middle row), we must somehow find what experiment tells us is true: that the Higgs field’s average value is 246 GeV and the Higgs particle’s mass is 125 GeV/c2. If vmax is much larger than 500 GeV, this requires a very precise cancellation between the known and unknown sources of energy, one that is highly atypical of quantum theories."

Do you have a more mathematical version of this on your site, in terms of the renormalization group flow, the Higgs self-coupling, etc., that describes the same thing? If vmax is just 500 GeV instead of the Planck scale, does that mean the Higgs self-coupling runs only up to 500 GeV instead of up to the Planck scale? What would the LHC have to produce for vmax to be only about 500 GeV? Matt wrote in the commentary after the article:

"The obvious solution is indeed that vmax is near 500 GeV. And if that is true, the LHC will discover as-yet unknown particles, and other predictions of the Standard Model will fail as well. The strongest evidence against it — inconclusive at this time — is that the LHC has not yet discovered any such non-Standard-Model particles, and there are no known deviations from the Standard Model at the current time. Arguably we should have seen subtle deviations already by now. But I will get to this issue soon."
 

  • #24
jtlz said:
Do you have a more mathematical version of this on your site, in terms of the renormalization group flow, the Higgs self-coupling, etc., that describes the same thing?

The problem with this folklore story is that it is trying to nail jelly to the wall by arguing about the size of quantum corrections in the presence of renormalization freedom.

The fact of the matter is that the space of renormalization choices is an affine space (i.e., like a vector space, but with no origin singled out), which means first of all that there is no absolute concept of the "size" of a quantum correction. This only appears once one fixes a renormalization scheme, which is like a choice of coordinate chart. It has no physical meaning. Even if we fix a renormalization scheme (which is implicitly assumed in discussions such as the one you quote), it remains a fact that there is arbitrary freedom in choosing the renormalization constants, large or not.
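A minimal way to put this point in formulas (my paraphrase): splitting the bare coupling into a renormalized piece and a counterterm,

$$\lambda_0 \;=\; \lambda(\mu) + \delta\lambda(\mu), \qquad \delta\lambda \;\longmapsto\; \delta\lambda + c \quad \text{(finite scheme change)},$$

any admissible finite shift ##c## moves "the quantum correction" ##\delta\lambda## around at will, so its size carries no invariant meaning; only the scale dependence of ##\lambda(\mu)## itself is physical.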

In conclusion, to make progress on these kinds of matters, one needs more theoretical input than just low-energy effective perturbative quantum field theory with its arbitrary renormalization freedom; otherwise one is going in circles forever. As Kane 17 points out, we should look harder for a theory that does provide a UV-completion.

Notice how the solution of the hierarchy problem observed by Acharya-Kane-Kumar 12, section V.A.2 (p. 10-11) deals with this issue: They invoke a UV-completion that goes beyond perturbation theory. In that theory one knows 1) that the superpotential is protected against perturbative renormalization freedom and 2) the form of the non-perturbative corrections is known. Namely these are exponentials in the inverse coupling. This yields the exponential hierarchy that is to be explained.

You see, this works not by long story-telling and analogies and showing colorful pictures, but by a logical deduction from a theoretical framework.
(Not fully mathematically rigorous, but fairly solid by the standards of phenomenology.)
 
  • #25
To come back to the original question, here is another very recent calculation regarding meta-stability of our universe:

Scale-invariant instantons and the complete lifetime of the standard model
Anders Andreassen, William Frost, and Matthew D. Schwartz
Phys. Rev. D 97, 056006 – Published 12 March 2018
https://journals.aps.org/prd/abstract/10.1103/PhysRevD.97.056006
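For readers new to such calculations, the quantity being computed is a decay rate per unit volume controlled by a Euclidean "bounce" solution (standard false-vacuum-decay technology; the following is only schematic):

$$\frac{\Gamma}{V} \;\sim\; \Lambda_B^4\, e^{-S[\phi_B]},$$

where ##\phi_B## is the bounce configuration interpolating toward the true vacuum, ##S## its Euclidean action, and ##\Lambda_B## a characteristic scale from the fluctuations around it. Because the action sits in an exponent, modest shifts in the input couplings translate into enormous swings in the predicted lifetime, which is why refinements like the above paper matter.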
 
  • #26
Urs Schreiber said:
The problem with this folklore story is that it is trying to nail jelly to the wall by arguing about the size of quantum corrections in the presence of renormalization freedom. ... one needs more theoretical input than just low-energy effective perturbative quantum field theory with its arbitrary renormalization freedom ...

I read and reread Matt's arguments trying to understand his viewpoint. I think he got discouraged by the LHC null results in 2013, and he wrote on a separate page in August 2013 https://profmattstrassler.com/2013/08/27/a-first-stab-at-explaining-naturalness/

"This in turn is why so many particle physicists have long expected the LHC to discover more than just a single Higgs particle and nothing else… more than just the Standard Model's one and only missing piece… and why it will be a profound discovery with far-reaching implications if, during the next five years or so, the LHC experts sweep the floor clean and find nothing more in the LHC's data than the Higgs particle that was found in 2012."

It's happening now: it's 5 years since his pronouncement. Only the Higgs and nothing else, a "profound discovery with far-reaching implications". There is a possibility there may not even be a UV-complete theory. To stay on topic, where is Matt's page about vacuum metastability? He should emphasize it too.

Anyway: if AdS/CFT is UV complete but doesn't describe our spacetime, is this duality just a toy model, or does it describe a hidden sector that can act as a holographic surface? Is there any reference about the search for an actual such surface, or will it remain a toy model for centuries to come? And can AdS/CFT explain the why of the vacuum metastability?
 
  • #27
jtlz said:
... To stay on topic, where is Matt's page about vacuum metastability? He should emphasize it too.

Here's Matt's page about vacuum metastability:

https://profmattstrassler.com/articles-and-posts/particle-physics-basics/theories-and-vacua/

Metastability relies on the Higgs mass of ~125 GeV.
Equally important to know is why the Higgs vev is 246 GeV. Can this be computed from first principles?
And also, why is the Higgs field turned on at all?

I guess pondering them and solving them simultaneously may be more encompassing. What I want to know is: if 125 GeV is metastable, what range of the Higgs field vev (it's 246 GeV, but what if it were, say, 3 TeV) would still affect the metastability? (Or is there no direct relationship?)
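For reference (standard SM relations, not specific to any paper in this thread): the vev is fixed empirically by the Fermi constant rather than computed from first principles, and together with the Higgs mass it sets the starting value of the quartic coupling whose running decides the stability question,

$$v = \left(\sqrt{2}\,G_F\right)^{-1/2} \approx 246\ \text{GeV}, \qquad m_H^2 = 2\lambda v^2 \;\Rightarrow\; \lambda \approx \frac{(125)^2}{2\,(246)^2} \approx 0.13,$$

so changing ##v## at fixed ##m_H## would change the initial ##\lambda## (and ##y_t = \sqrt{2}\,m_t/v##), and with it the metastability verdict; the two questions are not independent.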
 
  • #28
jtlz said:
What I want to know is: if 125 GeV is metastable, what range of the Higgs field vev (it's 246 GeV, but what if it were, say, 3 TeV) would still affect the metastability?

We have given you pointers to detailed discussion of this. If you want to go beyond scratching the surface, you'll need to eventually pass from reading blogs to reading real scientific documents. That will also allow you to incrementally improve your questions.

From AFS18, the best present data on the stability issue seems to be the following:

[Plot: HiggsVacuumStability4.png, the stability phase diagram from AFS18]
 

  • #29
Urs Schreiber said:
We have given you pointers to detailed discussion of this. If you want to go beyond scratching the surface, you'll need to eventually pass from reading blogs to reading real scientific documents. ...

[Image: XvbFjP.jpg, a table from the video with a "Correct Fix" column; see #30]


After watching this video, it seems more likely the Correct Fix (above) would shed light on the vacuum metastability too. Yes, I'll read the actual scientific papers from now on, having seen the bird's-eye view. Thanks a lot for helping.
 

  • #30
jtlz said:
the Correct Fix (above) would shed light on the vacuum metastability too.

Sure. Notice that the "Correct Fix" on the left is "having a theory".

Concerning the remainder of the video: Maybe our times serve to raise the subconscious archetype of crisis, but really in natural science it is not a "crisis" to discover a phenomenon that looks non-random (non-generic). On the contrary, this is what drives science, and the left column of that table is a triumphal testimony of that.

One part of the hep community says that the LHC results reveal a "nightmare scenario" where nothing interesting is discovered; but when something interesting is discovered after all, the other part calls this a "crisis". Worse, these two parts of the community seem to have non-empty intersection. This smells of hysteria more than of sober scientific discourse.

The truth is that the LHC results both confirm established theory and at the same time show clearly and unambiguously a new physical effect that is not explained by established theory. That's really the dream scenario of natural science. Instead of rambling on about chance and likeliness, this should make the scientific community turn to its core task, which is to produce theories and iteratively improve on them.

Here we need a theory that explains why the vacuum would sit on the verge of instability, but not beyond. I am aware of two good contestants:

One is maybe I) asymptotic safety; the other is II) any theory in which a) field moduli, such as the Higgs potential, are themselves dynamical, and b) they are prevented from crossing into the realm of instability by some principle or symmetry.

About I) I wish the analysis of theory and data had been made more intelligibly, see #8, #20 above.

About II): This is the theory that people like Gordon Kane are pointing out (Kane 18, "Clue 4"). It may turn out wrong, but at least it is a theory. The sensible thing to do in science is to investigate this theory further and check if it holds water.

Or else come up with a better theory. But we do need to talk about theories and not get lost in informal handwaving about probabilities, likeliness, genericity. The universe is neither likely nor generic, instead it is exceptional in its existence and its properties. In the past natural science, in the modern guise of maths-based theoretical physics, has managed to understand to a large extent why this is so. There is no reason to give up on this success story now.
 
  • #31
Urs Schreiber said:
Here we need a theory that explains why the vacuum would sit on the verge of instability, but not beyond. ... But we do need to talk about theories and not get lost in informal handwaving about probabilities, likeliness, genericity. ...

It is not far from logic that the following need to be solved at the same time (or at least pondered simultaneously):

1. Why the Higgs mass and vev are so low for the SM coupled to gravity; Matt mentioned the required tuning is about one part in 10^30 (naturalness problem)
2. Hierarchy problem (why all particles don't have Planck-scale masses)
3. Vacuum metastability (why the Higgs mass of 125 GeV borders on the edge of metastability)
4. Cosmological constant problem (the disagreement between the observed value of the vacuum energy density and the theoretically large value of the ZPE suggested by QFT)

You mentioned solutions to the first 3 in this thread. May I know your take on the 4th problem (the cosmological constant problem) and its possible solution, and whether there is any connection to the first 3 problems, especially vacuum metastability?

Do you have a single model that can explain all 4? How many separate models are needed to explain them? 2? 4?

Wiki's last paragraphs on the cosmological constant problem read:
"In the case of the Higgs mechanism, the usual Higgs vacuum expectation value in the instant-form vacuum is replaced by a constant scalar background field - a "zero mode" with kμ=0. The phenomenological predictions are unchanged using the LF formalism. Since the Higgs zero mode has no energy or momentum density, it does not contribute to the cosmological constant.

The small non-zero value of the cosmological constant must then be attributed to other mechanisms; for example a slight curvature of the shape of the universe (which is not excluded within 0.4% (as of 2017)[14][15][16]) could modify the Higgs field zero-mode, thereby possibly producing a non-zero contribution to the cosmological constant."

Could there be one model or solution for all of them at the same time? It's weird that they come in company; maybe they have a common source or origin.
 
  • #32
jtlz said:
4. Cosmological constant problem (the disagreement between the observed value of the vacuum energy density and the theoretically large value of the ZPE suggested by QFT)

You mentioned solutions to the first 3 in this thread. May I know your take on the 4th problem (the cosmological constant problem) and its possible solution, and whether there is any connection to the first 3 problems, especially vacuum metastability?

On the one hand:

A) In perturbative quantum field theory, the cosmological constant, too, is subject to renormalization freedom (a basic point often neglected; detailed discussion is referenced here), so that all we can do is measure it.

On the other hand:

B) The approach of appealing to non-perturbative effects thought to be known in string theory, which has become a small industry since KKLT 03, has been argued to be performed too carelessly; see the well-written recent article
  • Ulf Danielsson, Thomas Van Riet,
    "What if string theory has no de Sitter vacua?"
    (arXiv:1804.01120)
These authors argue that the "several hundred papers on the topic", of which "unfortunately there is no review" (p. 7), are all inconclusive, because none of them takes care to really check the assumptions being made, and that they are in fact likely all wrong, since closer inspection shows that these assumptions are quite dubious.

(Sometimes physicists question the point of rigorous mathematical argument. This is a good example for why it's useful: It simply means not to leave big gaps in your argument that later make the whole conclusion collapse. Large swaths of contemporary physics arguments are bound to eventually collapse this way, until the general attitude of the community changes.)

The conclusion is (p. 26) "that string theory has not made much progress on the problem of the cosmological constant during the last 15 years [since] string theorists have not been up to the challenge."

So we don't know how B) pans out. Until we do, we are stuck with A).
 
  • #33
Urs Schreiber said:
A) In perturbative quantum field theory, the cosmological constant, too, is subject to renormalization freedom ... so that all we can do is measure it. ... So we don't know how B) pans out. Until we do, we are stuck with A).

Thanks. So the CC is not related to vacuum metastability. Let's get back to the latter exclusively.

Do you have models where, instead of just the ground state and one false vacuum, there are many false vacua? That makes better sense to me. You see, the universe is so exceptional in its existence and properties, so incredibly suited for life and all that, that it doesn't make a lot of sense for the ground state to be its abyss. To illustrate: it's like our universe is Disneyland. It doesn't make sense to have tactical nukes in the middle of Disneyland, able to destroy it at any moment (i.e., the possibility for the false vacuum to tunnel to a destructive true vacuum). Instead, there should be many false vacua, each with properties as incredible as our universe's, or compatible with life in ascended forms. Or imagine all the Disneylands on Earth located in one place, where only one of them can manifest at a time, and tunneling to another false vacuum means tunneling to a different Disneyland scenery and design.

I heard multiple false vacua are not refuted yet. What is the best model for this?
 
  • #34
Urs Schreiber said:
A) In perturbative quantum field theory, the cosmological constant, too, is subject to renormalization freedom (a basic point often neglected; detailed discussion is referenced here), so that all we can do is measure it.

This one sentence cost me an hour of reading your references. About your statement "This means that apart from the freedom of choosing a classical cosmological constant in the Einstein-Hilbert action as above, its perturbative quantization (perturbative quantum gravity) introduces renormalization freedom to the value of the cosmological constant."

and, from your Quantum Gravity page:
"Thus, this idea seems to be very unlikely. At the same time, it is clear that present experiments cannot directly probe the effects of that supposed UV-completion of Einstein-gravity.

Approaches to a full quantization of gravity therefore roughly fall into two different strategies
1.One assumes that the Einstein-Hilbert action is indeed the effective QFT that approximates a “UV-completion”, a more fundamental theory valid at all energies. This is the approach taken for instance in string theory.
2.One assumes that by some other fact that has been overlooked, one can make sense of a non-perturbative quantization of the EH action at arbitrary energies after all. This is for instance the case in speculations that EH-gravity has a UV fixed point?."

Which of the two above carries more weight (if they are not equal) for the renormalization freedom of the CC? I mean, what if gravity and QFT are just emergent from a third theory (this was mentioned by Smolin and others)? Does the renormalization freedom of the CC stay valid?

 
  • #35
Urs Schreiber said:
this works not by long story-telling and analogies and showing colorful pictures, but by a logical deduction from a theoretical framework
My understanding is that in the G2-MSSM, the scale of the Higgs vev is generated and protected by specific mechanisms, such as you describe. But protection of the Higgs mass from large corrections still relies on low-scale supersymmetry (just like in other susy models). And so, in that regard, the G2-MSSM conforms to Strassler's reasoning. Strassler says that the scale "vmax" at which the SM ceases to be valid can't be too high, or else there will be unacceptable fine-tuning; and in these G2 compactifications of M-theory, the MSSM is indeed supposed to replace the SM not too far above the weak scale.

As for the cosmological constant, Bobkov argued that a version of the Bousso-Polchinski mechanism can work within the G2-MSSM.
 

What is metastability?

Metastability refers to a physical state in which a system is not in its lowest energy state, but is also not in an unstable state. This means that the system can remain in this state for a long period of time before eventually transitioning to a more stable state.
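As a toy illustration (a generic textbook-style example, not the actual Higgs potential): a potential with two unequal minima,

$$V(\phi) \;=\; \frac{\lambda}{4}\left(\phi^2 - v^2\right)^2 \;+\; \epsilon\,\phi, \qquad 0 < \epsilon \ll \lambda v^3,$$

has its lower ("true") minimum near ##\phi = -v## and a higher, metastable ("false") minimum near ##\phi = +v##, separated by a barrier; a system sitting in the false minimum can persist for a very long time before tunneling through the barrier.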

Why is our vacuum in a state of metastability?

The current leading theory is that our vacuum is in a state of metastability because of the Higgs field. The Higgs field gives particles their mass and is believed to have a non-zero value throughout the universe. This non-zero value causes our vacuum to be in a state of metastability.

What is the best theory for why our vacuum is on the edge of metastability?

The best theory for why our vacuum is on the edge of metastability is the Standard Model of particle physics. This theory has been extensively tested and has accurately predicted many experimental results. It includes the Higgs field and explains why our vacuum is in a state of metastability.

What could happen if our vacuum transitions to a more stable state?

If our vacuum were to transition to a more stable state, it could have significant consequences for the universe. It could lead to a change in the fundamental constants of nature, which could affect the behavior of particles and the formation of structures in the universe. It could also potentially lead to the destruction of our current universe and the creation of a new one.

Is there a way to prevent our vacuum from transitioning to a more stable state?

Currently, there is no known way to prevent our vacuum from transitioning to a more stable state. However, scientists are actively researching and studying this phenomenon in order to better understand it and potentially find ways to manipulate it. This could potentially lead to new technologies and advancements in our understanding of the universe.
