nikkkom said:
It seems you do not understand what "Higgs potential" is.
Each allowed field configuration has some potential energy. In other words, the potential does not "change"; it is a fixed value for every allowed field configuration.
If the plasma temperature is above, say, 200 GeV, space is filled with particles of mean energy > 200 GeV. Particles at these energies readily create all other known SM particles, so the energy of the plasma ends up distributed equally among all possible particle species (more precisely, among all degrees of freedom) — _including_ Higgs bosons.
This, in turn, means that the Higgs vacuum expectation value no longer matters: the Higgs field "jumps around" so much that the fact that it would settle to a nonzero value if cooled is unimportant. The behavior of the system is not much different from a theory in which the Higgs field has its minimum at zero.
Imagine a hot frying pan with an uneven bottom. If you heat it to 400 °C and drip some water on it, the droplets will ignore the small minima in the bottom; they will skitter all over the pan. The water collects in those minima only when the pan is cold.
Are you not familiar with the details of the Higgs phase transition? See
https://physics.stackexchange.com/questions/205607/how-do-symmetries-break-in-cosmology
"Symmetry breaking in the early Universe occurs through finite temperature effects. To study this one must look at the finite temperature effective action for the scalar field, involving a thermal field theory calculation. A scalar field receives contributions to its effective potential from the relativistic degrees of freedom it couples to. So as the Higgs field couples to fermions and gauge bosons, which are light and relativistic in the early Universe, this modifies the potential.
Typically the potential has two terms:
V = V(T=0) + V(T)
At leading order V(T) ~ M^2 T^2 where T is the temperature of the thermal bath and M is the mass term of the fields the Higgs field couples to.
The key point is that at large T the thermal corrections dominate and the potential has a single minimum at the origin. As the temperature cools, the thermal corrections fall off and new minima appear. Eventually the Higgs field is free to evolve away from the origin towards the new minima, during which it breaks the symmetry."
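The leading-order picture in that quote can be sketched numerically. Below is a minimal Python illustration, assuming the simplest mean-field form V(φ, T) = −½μ²φ² + ¼λφ⁴ + ½cT²φ². The thermal coefficient c here is an illustrative stand-in for the actual combination of gauge and Yukawa couplings, which this sketch does not compute:

```python
import math

# Illustrative Standard-Model-like inputs (c is a hypothetical value).
v0 = 246.0          # zero-temperature Higgs vev, in GeV
lam = 0.13          # quartic coupling, from m_H^2 = 2*lam*v0^2 with m_H ~ 125 GeV
mu2 = lam * v0**2   # mu^2 term of the zero-T potential, in GeV^2
c = 0.4             # assumed thermal coefficient (lumps the couplings together)

def vev(T):
    """Location of the minimum of V(phi, T) for temperature T in GeV."""
    m2_eff = mu2 - c * T**2           # effective mass^2 at the origin
    if m2_eff <= 0:
        return 0.0                    # symmetric phase: single minimum at phi = 0
    return math.sqrt(m2_eff / lam)    # broken phase: minimum at nonzero phi

# Temperature at which the origin stops being a minimum.
Tc = math.sqrt(mu2 / c)

print(f"T_c ~ {Tc:.0f} GeV")
print(f"vev at T = 300 GeV: {vev(300.0):.1f} GeV")   # hot plasma: vev is zero
print(f"vev at T = 0:       {vev(0.0):.1f} GeV")     # cold: vev returns to 246
```

With these illustrative numbers the crossover lands near 140 GeV, broadly consistent with the ~100 GeV scale quoted in the excerpts below; the real transition temperature depends on the full one-loop thermal potential, not this toy form.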
I just want confirmation or elaboration of it. I ask because I got these contradictory statements from:
https://physics.stackexchange.com/questions/205607/how-do-symmetries-break-in-cosmology
"I don't think that an unambiguous justification can be given because the dynamic of the electroweak symmetry breaking (EWSB) is still unknown. We don't have a well established theory describing how the Higgs scalar potential evolves with the temperature. When people talk about the scale of the EWSB, they usually refer to two possible things:
1. Before EWSB, the weak bosons are massless. After EWSB, they get a mass (91 GeV for the Z0 and 80 GeV for the W+-). The scale is therefore of the order of the mass of the weak bosons, roughly 100 GeV.
2. Before EWSB, the Higgs vacuum expectation value (v.e.v.) is 0, the field is symmetric. After EWSB, the v.e.v. is about 246 GeV. So again, the v.e.v. is representative of the scale of the EWSB, still of the order of 100 GeV."
What are your thoughts, guys?
I want to know whether the dynamics of electroweak symmetry breaking (EWSB) are really unknown or known.
And whether we DO or DON'T have a well-established theory describing how the Higgs scalar potential evolves with temperature.