Preon quark models excluded by LHC

  • Context: High School
  • Thread starter: cube137
  • Tags: LHC, Models, Quark
SUMMARY

The discussion centers on the exclusion of Preon quark models by the Large Hadron Collider (LHC) findings. Preon models, first proposed by Pati and Salam in 1974, suggest that fundamental particles are composite entities. However, LHC data has not shown evidence of compositeness in leptons and quarks, establishing strict energy scale constraints on these models. Notably, the conversation highlights the lack of comprehensive preon models that account for multiple generations of fermions or explain the mass generation of Standard Model particles.

PREREQUISITES
  • Understanding of Standard Model of particle physics
  • Familiarity with composite particle theories
  • Knowledge of gauge symmetries, particularly SU(3)xSU(2)xU(1)
  • Awareness of the Large Hadron Collider's role in particle physics research
NEXT STEPS
  • Research the implications of LHC findings on particle compositeness
  • Explore the original 1974 paper by Pati and Salam on preon models
  • Investigate the concept of composite weak bosons as proposed by Harald Fritzsch
  • Examine alternative theories to the Standard Model, including Grand Unified Theories (GUTs)
USEFUL FOR

Particle physicists, theoretical physicists, and researchers interested in the fundamental structure of matter and the implications of collider experiments on particle theory.

cube137
from https://www.physicsforums.com/threads/world-made-up-of-2nd-3rd-gen-particles.883560/#post-5555878

Quoting ohwilleke:
I'll address this only briefly, as it really belongs in the Beyond the Standard Model forum, while the original question, which is really just a fun way to elicit the Standard Model properties of higher-generation fermions, does not.

Models in which some or all of the Standard Model fundamental particles are actually composite particles made up of something more fundamental are generically called "preon" models, after the terminology which, if I recall correctly, Pati and Salam used in their 1974 paper, one of the earliest preon model proposals ("technicolor" models similarly propose a composite substitute for the Higgs boson). I've contributed to, and in several cases originally authored, many of the Wikipedia articles related to preon models, and a number of notable preon papers are cited in the footnotes to the Preon article at Wikipedia and in other articles linked to it. So far, the LHC and prior colliders have not seen any sign of compositeness in the fundamental leptons and quarks, and have placed extremely strict bounds on the energy scales at which such compositeness could arise in fairly naive and straightforward versions of preon models. Nobody has found any evidence distinguishing the fundamental particles from the point-particle representation they have in the Standard Model at any scale we can probe.

Also, only a small minority of preon models explain more than one generation of fundamental fermions or provide any insight into how preons give rise to the masses of the Standard Model's fundamental particles, which in that analysis are actually composite.

There are also quite a few papers exploring the idea that leptons and quarks are more similar than they seem, in a scheme in which leptons are possible because there are really four colors rather than three, with one of those colors, or certain combinations of them, giving rise to leptons that don't interact via the strong force (in much the same way that noble gases are chemically inert despite being made up of constituents that do interact chemically in other configurations). Indeed, Pati and Salam's original 1974 preon paper advanced this hypothesis: Pati, J.C.; Salam, A. (1974). "Lepton number as the fourth 'color'". Physical Review D 10: 275-289.

I am not aware of any preon models that specifically look at binding in the form of "string-like excitations of the superconducting Higgs vacuum", and the connection between string excitation modes and particular fundamental particles in the Standard Model is much less direct and determinate than popularizations of string theory have implied.

The nuclear force holding nucleons together is established to be a residual force derived from the strong force, carried by gluons, that binds quarks together, and no one is publishing alternatives to this (i.e., basically QCD, which is part of the Standard Model), although some physicists who have proposed preon models have considered the possibility that gluons may actually be composite bosons carrying a residual force of the force that binds preons together.

Have you come across this peer-reviewed paper...

http://www.sciencedirect.com/science/article/pii/0370269379906671

or

http://feynman.phy.ulaval.ca/marleau/pp/10preons/user/image/subquark-model-of-leptons-and-quarks.pdf

The former paper, written in 1979, mentioned the "string-like excitations of the superconducting Higgs vacuum". So, 37 years later, has the LHC, or have other papers, put constraints on all it said? Quoting the relevant passages:

"Unlike in the Pati-Salam scheme, there are no exotic gluons strongly coupling fundamental hadrons with leptons. The proton is predicted to be absolutely stable, therefore. Also, baryon number and lepton number L = n_e + n_μ + n_τ + n_ω + n_κ are separately conserved. Two new heavy leptons are predicted: ω⁻ and κ⁻. They should appear in the electron-positron annihilation reactions e⁺ + e⁻ → ω⁺ + ω⁻ and e⁺ + e⁻ → κ⁺ + κ⁻.
Non-abelian generalizations of the Nielsen-Olesen model [4], providing permanent confinement of quarks and saturation of quark forces in zero-triality states, are very attractive in the context of a composite quark theory. The condition for topologically distinct Nielsen-Olesen vortices and Dirac monopoles to exist, such that the former cannot be transformed into one another by continuous gauge transformations, is that the global gauge group should be multiply connected [5]. If this group is SU(9)/Z₉, where the cyclic group

Z₉ = {I₉, vI₉, v²I₉, ..., v⁸I₉}   (v = exp(2πi/9))

is the centre of SU(9), then nine distinct vortices exist, since SU(9)/Z₉ is 9-fold connected. One corresponds to the ground state of the vacuum and consists of no vortex; the other eight correspond to non-equivalent magnetic monopoles of monopole moment g₀, 2g₀, ..., 8g₀ (g₀ = 1/2e). This is because, with the Higgs field belonging to an adjoint representation of SU(9), single-valuedness modulo 2π/9 of its phase implies that vortex flux (and magnetic charge, therefore) is defined only modulo 9 (in Dirac units). Thus nine and only nine SU(9) monopoles form a magnetically neutral system when embedded in a superconducting Higgs vacuum and bound by Nielsen-Olesen vortices; similarly for three SU(3) monopoles in a vacuum with broken SU(3) symmetry. Omegons are bound in quarks by three supergluon octets of SU(3). With omegons as SU(9) monopoles, a neutral system of nine monopoles can cluster in three groups of three, bound internally and externally by Y-shaped strings that are connected via vortex bifurcations. In this way, the Meissner effect would confine both omegons inside quarks and quarks in hadrons. Identification of quarks as clusters of three SU(9) monopoles guarantees zero triality for bound states of quarks. This empirical rule is a dynamical consequence of an SU(9) vacuum, which confines fermions with magnetic charge (omegons) but not fermions with zero magnetic charge (leptons). In this scheme, mesons are quark-antiquark pairs, joined by three strings, not by one."
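The group-theoretic bookkeeping in the quoted passage reduces to arithmetic modulo 9, which can be illustrated numerically: v = exp(2πi/9) generates the cyclic group Z₉, and a collection of monopoles is magnetically neutral exactly when its total charge vanishes modulo 9 in Dirac units. A minimal sketch (the helper function name is hypothetical, not from the paper):

```python
import cmath

# v = exp(2*pi*i/9) generates Z_9 = {1, v, v^2, ..., v^8},
# the centre of SU(9) in the quoted model.
v = cmath.exp(2j * cmath.pi / 9)
center = [v ** k for k in range(9)]

# v has order 9: v^9 = 1 (up to floating-point rounding).
assert abs(v ** 9 - 1) < 1e-12


def neutral_mod9(charges):
    """True if the total monopole charge (in units of g0 = 1/2e) is 0 mod 9.

    Hypothetical helper illustrating the quoted rule that vortex flux,
    and hence magnetic charge, is defined only modulo 9 in this scheme.
    """
    return sum(charges) % 9 == 0


# Nine unit-charge monopoles form a magnetically neutral system,
print(neutral_mod9([1] * 9))    # True
# as do three clusters of charge 3 (the quark-like grouping),
print(neutral_mod9([3, 3, 3]))  # True
# but a single cluster of three is not neutral on its own.
print(neutral_mod9([3]))        # False
```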
 
I had not come across that paper. Another fairly recent paper proposing a preon model is found at http://www.worldscientific.com/doi/abs/10.1142/S0217732311036802 , which the author has self-cited in subsequent papers through 2016:

COMPOSITE WEAK BOSONS, LEPTONS AND QUARKS
Harald Fritzsch, Mod. Phys. Lett. A 26, 2305 (2011). DOI: http://dx.doi.org/10.1142/S0217732311036802
University of Munich, Faculty of Physics, Arnold Sommerfeld Center for Theoretical Physics, Germany
Received: 23 August 2011

The weak bosons are interpreted as composite particles. They consist of two fundamental fermions, bound by a confining gauge interaction. The mass scale of this new interaction is determined. A new neutral isoscalar weak boson X must exist. Its mass is expected to be less than 1 TeV. It will decay mainly into quark and lepton pairs and into two or three weak bosons. Above 1 TeV excitations of the weak bosons must exist, which mainly decay into pairs of weak bosons. Leptons and quarks consist of a fermion and a scalar. Two scalars form bound states, which are either neutral in charge or have charge 2/3 e.

CITATIONS:

Harald Fritzsch (2016). "Composite weak bosons at the large hadronic collider". Modern Physics Letters A 31:20. http://www.worldscientific.com/doi/pdf/10.1142/S0217732316300196
Harald Fritzsch (2014). "Composite weak bosons". International Journal of Modern Physics A 29:21. http://www.worldscientific.com/doi/pdf/10.1142/S0217751X14440175
Herbert W. Hamber, Reiko Toriumi (2014). "Composite leptons at the LHC". Modern Physics Letters A 29:07. http://www.worldscientific.com/doi/pdf/10.1142/S0217732314500345
Harald Fritzsch, Joan Solà (2014). "Quantum Haplodynamics, Dark Matter, and Dark Energy". Advances in High Energy Physics 2014, 1-6.
Harald Fritzsch (2013). "Composite Weak Bosons and Dark Matter". Modern Physics Letters A 28:23. http://www.worldscientific.com/doi/pdf/10.1142/S021773231350106X
Harald Fritzsch (2012). "The size of the weak bosons". Physics Letters B 712:3, 231-232. http://dx.doi.org/10.1016/j.physletb.2012.04.067
 
ohwilleke,

Do you have any idea why our forces of nature have the pattern SU(3)xSU(2)xU(1)? Is it just a coincidence? I know about gauge bosons and how these are just counterterms added to the wave function to make it obey local gauge invariance. In other anthropic-multiverse scenarios, could the strong force be based on SU(2), or even SU(4)? Could all forces share the same SU(2), or some other combination like SU(4)xSU(4)xU(1)? Why 3-2-1? It could make sense if it were part of a GUT with SU(5), but there is no proton decay. What else could explain SU(3)xSU(2)xU(1)? How are the factors really related? Are there any missing components that could unify them without using a larger symmetry group?

Also, are there other ways to create universes without using gauge symmetry and gauge bosons? Could you use other building blocks to recreate our same universe? Is SU(3)xSU(2)xU(1) telling us something, like perhaps that we are inside a computation machine and part of its computations? Any papers about alternative ways to create universes?
 
I have no credible idea why our forces of nature have the pattern SU(3)xSU(2)xU(1), and I'm not sure that I could understand an answer if someone presented me with one. My wild-ass guess is that it might relate to the number of dimensions, or that every possible degree of freedom up to some highest number is involved, under the motto that everything that is possible is mandatory. I've seen interesting speculations about gravity as a sort of QCD squared, and about gravi-weak unification.

I suspect that any progress requires some new insight that takes us away from the tinkering with group representations approach that very smart people have been trying for several decades without success.

A "coincidence" is an odd way to describe something that is a singular defining fact of the way the universe is. I think it is a category error to think of it in probability terms.

I think the anthropic multiverse approach doesn't deserve to be called science and has no interesting insights to provide us, and I don't put any thought into how one could go about creating universes.

I think that the SM is almost complete (apart from quantum gravity) and that no new non-composite particles or forces are likely to be discovered ever (although there is an outside chance of some minor tweaks - perhaps a boson that plays a part in neutrino oscillation for example).
 
ohwilleke said:
I have no credible idea why our forces of nature have the pattern SU(3)xSU(2)xU(1), and I'm not sure that I could understand an answer if someone presented me with one. My wild-ass guess is that it might relate to the number of dimensions, or that every possible degree of freedom up to some highest number is involved, under the motto that everything that is possible is mandatory. I've seen interesting speculations about gravity as a sort of QCD squared, and about gravi-weak unification.

I suspect that any progress requires some new insight that takes us away from the tinkering with group representations approach that very smart people have been trying for several decades without success.

A "coincidence" is an odd way to describe something that is a singular defining fact of the way the universe is. I think it is a category error to think of it in probability terms.

I think the anthropic multiverse approach doesn't deserve to be called science and has no interesting insights to provide us, and I don't put any thought into how one could go about creating universes.

I think that the SM is almost complete (apart from quantum gravity) and that no new non-composite particles or forces are likely to be discovered ever (although there is an outside chance of some minor tweaks - perhaps a boson that plays a part in neutrino oscillation for example).

I see. Anyway, in my first message I mentioned the paper that predicts new leptons. Has this been seen at the LHC, or are these heavier than the accelerator's reach? Again:

"Unlike in the Pati-Salam scheme, there are no exotic gluons strongly coupling fundamental hadrons with leptons. The proton is predicted to be absolutely stable, therefore. Also, baryon number and lepton number L = n_e + n_μ + n_τ + n_ω + n_κ are separately conserved. Two new heavy leptons are predicted: ω⁻ and κ⁻. They should appear in the electron-positron annihilation reactions e⁺ + e⁻ → ω⁺ + ω⁻ and e⁺ + e⁻ → κ⁺ + κ⁻."
 
Nobody has seen any new leptons. In fact, no one has seen any new beyond-the-Standard-Model particles of any kind. Nor has anyone seen any beyond-the-Standard-Model forces, or composite particles prohibited by QCD. And there are no fundamental particles or forces predicted by the Standard Model which have not been observed.
 
ohwilleke said:
Nobody has seen any new leptons. In fact, no one has seen any new beyond-the-Standard-Model particles of any kind. Nor has anyone seen any beyond-the-Standard-Model forces, or composite particles prohibited by QCD. And there are no fundamental particles or forces predicted by the Standard Model which have not been observed.

They say (especially Lubos) that we only have 1% of the LHC data. Does that mean that once we see the other 99%, there is a possibility the 750 GeV diphoton bump could still reappear? Is the data similar to, say, the resolution of a picture, where 1% means a resolution of 120x80 and scouting the entire 99% means the resolution is really 12000x8000, but you can already make out the gross picture at 120x80? Or is scouting the rest of the LHC data specifically related to higher TeV, meaning signals already excluded are forever excluded?
 
Getting four times more data on the same experimental apparatus doubles the signal-to-noise ratio; image stacking in astronomy is a common example. For the LHC, more data means the same thing: better SNR, so smaller peaks can be discerned.
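The square-root scaling behind that statement can be checked with a quick standard-library simulation: the standard error of the mean of n noisy measurements falls like 1/√n, so quadrupling the sample size roughly doubles the signal-to-noise ratio. This is an illustrative sketch with made-up numbers, not an LHC analysis:

```python
import random
import statistics

random.seed(1)  # reproducible illustration


def measured_snr(n_events, signal=1.0, noise_sigma=1.0):
    """Estimate the SNR of the sample mean over n_events noisy measurements.

    The mean of n samples has standard error noise_sigma / sqrt(n),
    so the SNR of the mean grows like sqrt(n).
    """
    samples = [signal + random.gauss(0.0, noise_sigma) for _ in range(n_events)]
    return statistics.fmean(samples) / (noise_sigma / n_events ** 0.5)


snr_1x = measured_snr(10_000)
snr_4x = measured_snr(40_000)
print(snr_4x / snr_1x)  # roughly 2, within statistical scatter
```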
 
