Higgs Boson Particle: Doom for the Universe?

  • Thread starter: DiracPool
AI Thread Summary
The Higgs boson, with a mass of approximately 126 GeV, may indicate a potential instability of the vacuum, suggesting a catastrophic end for the universe in the far future. This instability is linked to the shape of the Higgs potential, particularly the sign of its quartic term. If that term is negative, the potential is unbounded from below, leading to a scenario where the Higgs field could roll off to infinity after a tunneling event. Current analyses suggest that if the tunneling time exceeds the age of the universe, the vacuum can be considered metastable. Overall, while the Higgs boson raises concerns about cosmic stability, life on Earth will likely end long before any such catastrophic event occurs.
From the article:
For example, the mass of the new particle is about 126 billion electron volts, or about 126 times the mass of the proton. If that particle really is the Higgs, its mass turns out to be just about what's needed to make the universe fundamentally unstable, in a way that would cause it to end catastrophically in the far future.
Ostensibly, our sun will burn out, and all life on Earth will be extinguished, well before then.

No worries.
 
Astronuc said:
From the article: Ostensibly, our sun will burn out, and all life on Earth will be extinguished, well before then.

No worries.

Phew! Thanks, I was ready to cash out my Vanguard fund.
 
From a different thread:

fzero said:
The question of stability of the Higgs vacuum has to do with the shape of the potential, in particular, with the sign of the quartic term, ##\lambda h^4##. If ##\lambda## is positive, then the potential is bounded from below, so that there is a true vacuum at a finite value of the Higgs field, ##\langle h \rangle = v##. This is the case that is usually drawn when people discuss the Higgs mechanism. If ##\lambda## is negative, then the potential is unbounded from below and there is no global minimum. Depending on the quadratic and other terms, there can be a local minimum, which is the false vacuum. Given a long enough period of time, we can have tunneling through the barrier, after which the Higgs field value rolls off to infinity. If the tunneling time is sufficiently long (the age of the universe is the relevant scale here), we can call the vacuum metastable.

At lowest order (tree level), the coefficients in the Higgs potential can be determined from the weak coupling constant and the W/Z and Higgs masses. However, because of quantum effects, the coefficients actually "run" with energy scale according to the renormalization group. A large Higgs field value corresponds to a large energy scale (the Higgs field h has units of mass), so the renormalization corrections can become large compared to the tree-level terms. The corrections can be determined in terms of parameters like the masses of all of the other particles participating in the SM interactions, especially the top quark. A detailed formula for the running of ##\lambda## is eq. (52) in http://arxiv.org/abs/1205.6497, which is cited in the Alekhin et al. paper you linked to. This formula depends on technical details like Yukawa couplings and anomalous dimensions. Eq. (63) is a simpler-looking formula that boils experimental data down into a value for ##\lambda(M_t)## at the scale set by the top quark mass.

The relevant analysis is then to take the value of ##\lambda(M_t)## given by (63) and use (52) to run it to large scales. If ##\lambda## goes negative before reaching the Planck scale, then the vacuum is not stable. If the tunneling time is longer than the present age of the universe, we call the vacuum metastable. I'm not 100% certain what Alekhin et al. have done differently other than extract a value for the top quark pole mass that has a much larger uncertainty than the one considered by Degrassi et al. Both papers find the same stability bound, but of course the Alekhin et al. result has a larger error bar.


That was from the HEP forum, so if there are some unfamiliar concepts or otherwise missing explanation that I might elaborate, ask away.
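To make the running in the quoted post a bit more concrete, here is a toy numerical sketch (Python). It keeps only the dominant one-loop terms in the beta functions for ##\lambda##, the top Yukawa ##y_t##, and the strong coupling ##g_3##, drops the electroweak contributions and all higher loops, and uses ballpark starting values at the top mass scale, so the scale at which ##\lambda## crosses zero here is only indicative (the full analyses in the papers above put it roughly around ##10^{10}##–##10^{11}## GeV).

```python
# Toy one-loop running of the Higgs quartic coupling lambda.
# Illustration only: electroweak and higher-loop terms are dropped,
# and the starting values are ballpark numbers, not fitted inputs.
import numpy as np

M_t = 173.0   # top quark mass in GeV; sets the starting scale
lam = 0.13    # lambda(M_t) ~ m_H^2 / (2 v^2), ballpark value
y_t = 0.94    # MS-bar top Yukawa at M_t, ballpark value
g3  = 1.17    # strong coupling at M_t, ballpark value

def betas(lam, y_t, g3):
    """Dominant one-loop pieces of d(coupling)/d ln(mu); EW terms dropped."""
    k = 1.0 / (16.0 * np.pi**2)
    b_lam = k * (24*lam**2 + 12*lam*y_t**2 - 6*y_t**4)
    b_yt  = k * y_t * (4.5*y_t**2 - 8*g3**2)
    b_g3  = k * (-7.0) * g3**3
    return b_lam, b_yt, b_g3

t, dt = 0.0, 1e-3   # t = ln(mu / M_t); simple Euler steps
while lam > 0 and t < 39:   # ~39 e-folds takes us from M_t to the Planck scale
    b_lam, b_yt, b_g3 = betas(lam, y_t, g3)
    lam += b_lam * dt
    y_t += b_yt * dt
    g3  += b_g3 * dt
    t   += dt

if lam <= 0:
    print(f"toy lambda crosses zero near mu ~ {M_t * np.exp(t):.1e} GeV")
else:
    print("toy lambda stays positive up to ~ the Planck scale")
```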
 
https://en.wikipedia.org/wiki/Recombination_(cosmology) Was the matter density right after decoupling low enough to treat the vacuum as an actual vacuum, rather than as a medium through which light propagates at a speed lower than ##({\epsilon_0\mu_0})^{-1/2}##? I'm asking this in the context of calculating the radius of the observable universe, where the time integral of the inverse of the scale factor is multiplied by the constant speed of light ##c##.
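On the last part of that question, here is a minimal numerical sketch (Python) of that integral for a flat ΛCDM model, using the equivalent redshift form ##c\int dt/a = c\int dz/H(z)##. The density parameters are illustrative round numbers, not values taken from any particular paper.

```python
# Comoving radius of the observable universe in flat LambdaCDM:
# D = c * integral_0^t0 dt / a(t) = c * integral_0^inf dz / H(z).
# Parameter values are illustrative (roughly Planck-like).
import numpy as np
from scipy.integrate import quad

c    = 299_792.458         # speed of light in km/s
H0   = 67.7                # Hubble constant in km/s/Mpc
Om_r = 9.2e-5              # radiation (photons + neutrinos)
Om_m = 0.31                # matter
Om_L = 1.0 - Om_m - Om_r   # dark energy (flatness assumed)

def H(z):
    """Hubble rate in km/s/Mpc as a function of redshift."""
    return H0 * np.sqrt(Om_r*(1+z)**4 + Om_m*(1+z)**3 + Om_L)

# Integrate c/H(z) from z = 0 out to z -> infinity.
D_Mpc, _ = quad(lambda z: c / H(z), 0, np.inf, limit=200)

# 1 Mpc ~ 3.2616 million light-years; result comes out near 46 Gly.
print(f"comoving radius ~ {D_Mpc:.0f} Mpc ~ {D_Mpc * 3.2616 / 1000:.1f} Gly")
```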
Why was the Hubble constant assumed to be decreasing, and thus to be slowing down (decelerating) the expansion rate of the Universe, while at the same time dark energy is presumably accelerating the expansion? And to thicken the plot, recent news from NASA indicates that the Hubble constant is now increasing. Can you clarify this enigma? Also, if the Hubble constant eventually decreases, why is there a lower limit to its value?
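A compact way to see how a decreasing Hubble parameter can coexist with accelerating expansion, sketched here with the standard flat ΛCDM Friedmann relations (nothing specific to this thread is assumed):

$$H \equiv \frac{\dot a}{a}, \qquad H^2(a) = H_0^2\left(\Omega_m a^{-3} + \Omega_\Lambda\right) \to H_0^2\,\Omega_\Lambda \quad \text{as } a \to \infty,$$
$$\frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + 3p\right) > 0 \quad \text{once the dark-energy component (with } p \simeq -\rho\text{) dominates}.$$

So ##H(a)## decreases monotonically toward the floor ##H_0\sqrt{\Omega_\Lambda}## (the lower limit asked about) even while ##\ddot a > 0##; the recent news about the Hubble constant "increasing" presumably refers to the tension between different present-day measurements of ##H_0##, not to ##H## growing with time.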
