Death to background independence

In summary: the paper introduces a new framework for unification of all the fundamental interactions, including gravity. There are some predictions, but more importantly the framework provides a way to quantitatively evaluate higher-order corrections to the particle masses in the model.
  • #1
atyy
Science Advisor
:smile:

What do you guys think of the new non-background independent Chamseddine and Connes paper?
 
  • #3
I'm glad to see Connes back in action!
I'm not going to pick nits about some doctrinal detail right now, I want to get the overall picture. Here's the paper, in case anyone else wants to have a look:

http://arxiv.org/abs/1004.0464
Noncommutative Geometry as a Framework for Unification of all Fundamental Interactions including Gravity. Part I
Ali H. Chamseddine, Alain Connes
56 pages
(Submitted on 3 Apr 2010)
"We examine the hypothesis that space-time is a product of a continuous four-dimensional manifold times a finite space. A new tensorial notation is developed to present the various constructs of noncommutative geometry. In particular, this notation is used to determine the spectral data of the standard model. The particle spectrum with all of its symmetries is derived, almost uniquely, under the assumption of irreducibility and of dimension 6 modulo 8 for the finite space. The reduction from the natural symmetry group SU(2)xSU(2)xSU(4) to U(1)xSU(2)xSU(3) is a consequence of the hypothesis that the two layers of space-time are finite distance apart but is non-dynamical. The square of the Dirac operator, and all geometrical invariants that appear in the calculation of the heat kernel expansion are evaluated. We re-derive the leading order terms in the spectral action. The geometrical action yields unification of all fundamental interactions including gravity at very high energies. We make the following predictions:

(i) The number of fermions per family is 16.

(ii) The symmetry group is U(1)xSU(2)xSU(3).

(iii) There are quarks and leptons in the correct representations.

(iv) There is a doublet Higgs that breaks the electroweak symmetry to U(1).

(v) Top quark mass of 170-175 Gev.

(vi) There is a right-handed neutrino with a see-saw mechanism.


Moreover, the zeroth order spectral action obtained with a cut-off function is consistent with experimental data up to few percent. We discuss a number of open issues. We prepare the ground for computing higher order corrections since the predicted mass of the Higgs field is quite sensitive to the higher order corrections. We speculate on the nature of the noncommutative space at Planckian energies and the possible role of the fundamental group for the problem of generations."

The first thing I notice is that, however he goes about building his model, he gets some pre(post)dictions.
 
  • #4
I'm resigned to not being able to understand more than the first few sentences of the actual paper, but I'd be interested in understanding more about what this really means in terms of experimental tests.

"Top quark mass of 170-175 Gev." Flipping through the article casually, I didn't have much luck finding where this was discussed. You clearly can't predict a dimensionful quantity directly unless your theory has some scale built into it already. Are they really retrodicting the dimensionless ratio of the top quark mass to the Planck mass?

What's so special about the top quark in their model? If they have 16 fermions per family, then isn't the top just one particular member chosen out of the middle of one of their lists of 16? The really exciting thing would have been a prediction of the masses of the previously undetected members of their families of 16.

What is a "doublet Higgs?"
 
  • #5
Hmm,
what they have here looks very much like what they came out with in 2006-2007. Not different in kind; more an adapted, refined version.

That model was afterwards reformulated to achieve a degree of background independence by two Danes, Aastrup and Grimstrup.

The A&G approach uses LQG techniques. A&G have taken on one or two more co-authors and are still developing this kind of merger between LQG and Connes' NCG standard model.

The idea is to do LQG for the overall spacetime (quantum) geometry and have the whole Standard Model circus of particles available everywhere as well.

I don't know how successful the A&G program has been, but I see that Jesper Grimstrup was one of those who gave a paper last month at Zakopane, at this year's main Loop/Foam workshop.

Anyway, there are various possible ways things could develop. I wouldn't ignore the possibility of Connes' standard model getting an implementation that is independent of its 4D background geometry. I could be wrong, of course. Have to go; back later.
============================

I just got back home and had another look at the new Connes paper. It presents a framework for unification. For the moment I see no reason to imagine that this framework is incompatible with background independence at the level of the 4d manifold. Perhaps one will show up, but at the moment (especially in light of several papers by Jesper Grimstrup) I see no reason to suppose incompatibility.

On the other hand what is probably more striking is that Connes' package of the Standard Model particles does not involve supersymmetry.

Did anyone else notice that little detail? :biggrin: Or am I wrong? The package is quite elegant: simple and economical. It is testable in ways that give a fair chance of proving it wrong, if it is wrong.
And if it is right, then it holds out to very high energy (the 'big desert' scenario: a big stretch of energies with no new physics once we get past the EW breaking/Higgs business). This is just my first impression. Does anyone get a similar impression from the paper?
 
  • #6
I guess background independence is still alive - it is not a theory of quantum gravity.

"Since the model we developed contains both gravity and the Standard Model it is clear that this problem is the problem of quantizing gravity. We refer the reader to [57] for interesting suggestions concerning the role of the ghost fields. One challenging problem at this point is to compute the bosonic propagator for the inner fluctuations of the metric using the spectral action and functional derivatives of tracial functions. One may hope that the techniques developed in the context of renormalization of QFT on noncommutative spaces will be useful in the building of the quantum theory of the spectral action."
 
  • #7
marcus said:
The symmetry group is U(1)xSU(2)xSU(3).
I think this prediction was already in "Why the Standard Model" and "A Dress for SM the Beggar", but it is disputed. It was not a strict requirement, but rather the simplest ansatz. I do not expect that has improved, but I have not studied the paper yet. With this word of caution, I too am very glad he is (they are) back in business!

Also a new post on the blog
http://noncommutativegeometry.blogspot.com/
 
  • #8
atyy said:
I guess background independence is still alive...

It would seem so. And the new Connes Chamseddine paper will put additional energy into the LQG program. The synergy between the approaches is strong. You may recall that Chamseddine gave a plenary talk at the main Loops conference of 2008--organized by John Barrett at Nottingham.

Then this year we had the Zakopane Loop/Foam workshop in March
http://www.fuw.edu.pl/~jpa/qgqg3/schedule.html
and the Tuesday 2 March line-up was just these 5 talks:
Carlo Rovelli: SF give a new dynamics to Canonical LQG
Bianca Dittrich: Diffeomorphism in discrete gravity
James Ryan: Phase space of discrete BF theory and LQG
Jesper Grimstrup: Coupling fermions to LQG by using the non-commutative spectral action
Yongge Ma: Emergence of scalar matter from spinfoam model

Grimstrup's slides PDF are online. He uses the Connes Chamseddine way of realizing the Standard Model to get particles into LQG.

The liaison between research fields goes way back. Alain Connes and Carlo Rovelli have co-authored a paper, as I recall.
======================

Crowell, we should really get links to the 2007 papers of Connes Chamseddine. They could make better expository sense, since they read more like an introduction; what was just posted now in 2010 is more of a refinement/continuation.

Humanino in his post just now also referred to the two main 2007 papers.
 
  • #9
It may also be helpful to look at John Barrett's paper on realizing the Standard Model in Noncommutative Geometry. Barrett is one of a handful of top Loop/Foam people. He and Connes brought out essentially the same result within a week of each other in 2006. The exposition may in some cases actually be simpler in Barrett's paper, or at least having two different viewpoints can be helpful:

http://arxiv.org/abs/hep-th/0608221
A Lorentzian version of the non-commutative geometry of the standard model of particle physics
John W. Barrett
14 pages, J.Math.Phys.48:012303,2007
(Submitted on 31 Aug 2006)
"A formulation of the non-commutative geometry for the standard model of particle physics with a Lorentzian signature metric is presented. The elimination of the fermion doubling in the Lorentzian case is achieved by a modification of Connes' internal space geometry so that it has signature 6 (mod 8) rather than 0. The fermionic part of the Connes-Chamseddine spectral action can be formulated, and it is shown that it allows an extension with right-handed neutrinos and the correct mass terms for the see-saw mechanism of neutrino mass generation."

Here are the two Connes Chamseddine papers from 2007 that Humanino mentioned:

http://arxiv.org/abs/0706.3688
Why the Standard Model
Ali H. Chamseddine, Alain Connes
13 pages, J.Geom.Phys.58:38-47,2008
(Submitted on 25 Jun 2007)
"The Standard Model is based on the gauge invariance principle with gauge group U(1)xSU(2)xSU(3) and suitable representations for fermions and bosons, which are begging for a conceptual understanding. We propose a purely gravitational explanation: space-time has a fine structure given as a product of a four dimensional continuum by a finite noncommutative geometry F. The raison d'etre for F is to correct the K-theoretic dimension from four to ten (modulo eight). We classify the irreducible finite noncommutative geometries of K-theoretic dimension six and show that the dimension (per generation) is a square of an integer k. Under an additional hypothesis of quaternion linearity, the geometry which reproduces the Standard Model is singled out (and one gets k=4)with the correct quantum numbers for all fields. The spectral action applied to the product MxF delivers the full Standard Model,with neutrino mixing, coupled to gravity, and makes predictions(the number of generations is still an input)."

http://arxiv.org/abs/0706.3690
Conceptual Explanation for the Algebra in the Noncommutative Approach to the Standard Model
Ali H. Chamseddine, Alain Connes
The title "A Dress for SM the Beggar" was changed by the Editor of Physical Review Letters
(Submitted on 25 Jun 2007)
"The purpose of this letter is to remove the arbitrariness of the ad hoc choice of the algebra and its representation in the noncommutative approach to the Standard Model, which was begging for a conceptual explanation. We assume as before that space-time is the product of a four-dimensional manifold by a finite noncommmutative space F. The spectral action is the pure gravitational action for the product space. To remove the above arbitrariness, we classify the irreducibe geometries F consistent with imposing reality and chiral conditions on spinors, to avoid the fermion doubling problem, which amounts to have total dimension 10 (in the K-theoretic sense). It gives, almost uniquely, the Standard Model with all its details, predicting the number of fermions per generation to be 16, their representations and the Higgs breaking mechanism, with very little input. The geometrical model is valid at the unification scale, and has relations connecting the gauge couplings to each other and to the Higgs coupling. This gives a prediction of the Higgs mass of around 170 GeV and a mass relation connecting the sum of the square of the masses of the fermions to the W mass square, which enables us to predict the top quark mass compatible with the measured experimental value. We thus manage to have the advantages of both SO(10) and Kaluza-Klein unification, without paying the price of plethora of Higgs fields or the infinite tower of states."

Regarding the Higgs prediction: I think this particular version of the Connes Standard Model ran into trouble because experiment seemed to rule out a Higgs around 170 GeV. Please correct me if I am mistaken.
However, that was 2007, and now they have a different version.

Regarding the top quark mass: it sounds like a post-diction. Is the experimentally measured mass also around 170-175 GeV? Is that what the current 2010 version post-dicts? Is it clearer in the 2007 paper how they derive this value of the top quark mass?
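My guess at the derivation, reading the 2007 abstract above (just a sketch; I have not checked it against the paper's actual computation): the spectral action imposes, at the unification scale, the mass relation

[tex]\sum_{\text{fermions}} m_f^2 = 8\, M_W^2 ,[/tex]

with the sum running over all fermions, counting color. Since the top Yukawa coupling dominates that sum, the relation essentially fixes [tex]m_t[/tex] in terms of [tex]M_W[/tex] at the unification scale, and renormalization-group running down to accessible energies then lands the low-energy top mass in the quoted range.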
 
  • #10
bcrowell said:
"Top quark mass of 170-175 Gev." ... You clearly can't predict a dimensionful quantity directly unless your theory has some scale built into it already.
That's not correct. Look at QCD: its classical action is scale-free, with no dimensionful parameter. But through renormalization group effects, the effective quantum action generates a new scale, usually called [tex]\Lambda_{QCD}[/tex]. Otherwise you would have no preferred mass or energy scale for nucleons, and you could find protons and neutrons of all sizes and masses.
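Concretely, at one loop (a standard textbook formula, nothing specific to this paper), the running coupling is

[tex]\alpha_s(\mu) = \frac{2\pi}{b_0 \ln(\mu/\Lambda_{QCD})}, \qquad b_0 = 11 - \tfrac{2}{3} n_f ,[/tex]

so [tex]\Lambda_{QCD} = \mu \exp\left(-2\pi/(b_0\, \alpha_s(\mu))\right)[/tex]. A dimensionful scale appears even though the classical action contains none; this is dimensional transmutation.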

bcrowell said:
What's so special about the top quark in their model?
The top quark is just too heavy in the standard model. It is expected that there is something special about it, simply because it does not fit into the overall quark mass scale.

bcrowell said:
What is a "doublet Higgs?"
In the abstract, a "doublet Higgs" just means a Higgs field transforming as a weak SU(2) doublet, as in the minimal Standard Model. More generally, there are several possibilities for how one can build the Higgs sector. It is not required that there is just one Higgs boson. Especially in SUSY-based models there can be families of Higgs bosons; the simplest possibility in the MSSM is two Higgs doublets: http://en.wikipedia.org/wiki/Minimal_Supersymmetric_Standard_Model
 
  • #11
That's not correct. Look at QCD: its classical action is scale-free, with no dimensionful parameter. But through renormalization group effects, the effective quantum action generates a new scale, usually called [tex]\Lambda_{QCD}[/tex]. Otherwise you would have no preferred mass or energy scale for nucleons, and you could find protons and neutrons of all sizes and masses.

You can't do renormalization group calculations without fixing the value of coupling at a certain value of energy.

Once you do that, you're no longer scale free.

Flipping through the article casually, I didn't have much luck finding where this was discussed. You clearly can't predict a dimensionful quantity directly unless your theory has some scale built into it already. Are they really retrodicting the dimensionless ratio of the top quark mass to the Planck mass?

Page 25, last two paragraphs. Looks like they are picking values for two parameters out of thin air and then saying "look, if this parameter has this value and this parameter has this value, our theory predicts a Higgs of 170 GeV and a top quark of 179 GeV".

Whether it really does predict that, I can't say because they lost me halfway through the first paragraph of section 2. All words are familiar but they don't add up to a meaning.
 
  • #12
hamster143 said:
You can't do renormalization group calculations without fixing the value of coupling at a certain value of energy.

Once you do that, you're no longer scale free.
I agree. This is essentially the way quantization and renormalization generate a mass scale which is absent in classical physics. The scale is not "built in" but emergent.
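To make both points concrete, here is a quick numerical sketch (one loop only, so the resulting value is smaller than the usual higher-order MS-bar figure; the inputs [tex]\alpha_s(M_Z) = 0.118[/tex] and [tex]n_f = 5[/tex] are my assumptions, not from the paper):

[code]
# One-loop dimensional transmutation in QCD: fixing the coupling at one
# reference scale (here alpha_s at the Z mass) determines Lambda_QCD,
# even though the classical action contains no dimensionful parameter.
import math

alpha_s_MZ = 0.118            # measured input: the coupling must be fixed somewhere
MZ = 91.19                    # GeV, the reference scale
nf = 5                        # active quark flavors around the Z mass
b0 = 11.0 - 2.0 * nf / 3.0    # one-loop beta-function coefficient

# Lambda_QCD is the scale where the one-loop coupling blows up:
Lambda = MZ * math.exp(-2.0 * math.pi / (b0 * alpha_s_MZ))
print(f"Lambda_QCD (one loop, nf=5): {1000 * Lambda:.0f} MeV")  # ~88 MeV

def alpha_s(mu):
    """One-loop running coupling, valid for mu well above Lambda."""
    return 2.0 * math.pi / (b0 * math.log(mu / Lambda))

# The coupling runs with scale; at MZ it reproduces the input by construction.
for mu in (10.0, 91.19, 1000.0):
    print(f"alpha_s({mu:7.2f} GeV) = {alpha_s(mu):.4f}")
[/code]

Changing the input value of [tex]\alpha_s(M_Z)[/tex] changes [tex]\Lambda_{QCD}[/tex]: that is hamster143's point that fixing the coupling at some scale is what breaks scale freedom.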
 
  • #13
marcus said:
Blue: I think this particular version of Connes Standard Model ran into trouble because experiment seemed to rule out Higgs around 170. Please correct me if I am mistaken.
I think the Tevatron is in trouble, because they tried to get more out of their data than the data can support :smile:
Predictions for Higgs production at the Tevatron and the associated uncertainties
We update the theoretical predictions for the production cross sections of the Standard Model Higgs boson at the Fermilab Tevatron collider, focusing on the two main search channels, the gluon-gluon fusion mechanism [tex]gg \to H[/tex] and the Higgs-strahlung processes [tex]q \bar q \to VH[/tex] with [tex]V=W/Z[/tex], including all relevant higher order QCD and electroweak corrections in perturbation theory. We then estimate the various uncertainties affecting these predictions: the scale uncertainties which are viewed as a measure of the unknown higher order effects, the uncertainties from the parton distribution functions and the related errors on the strong coupling constant, as well as the uncertainties due to the use of an effective theory approach in the determination of the radiative corrections in the [tex]gg \to H[/tex] process at next-to-next-to-leading order. We find that while the cross sections are well under control in the Higgs--strahlung processes, the theoretical uncertainties are rather large in the case of the gluon-gluon fusion channel, possibly shifting the central values of the next-to-next-to-leading order cross sections by more than [tex]\approx 40[/tex]%. These uncertainties are thus significantly larger than the [tex]\approx 10[/tex]% error assumed by the CDF and D0 experiments in their recent analysis that has excluded the Higgs mass range [tex]M_H=[/tex]162-166 GeV at the 95% confidence level. These exclusion limits should be, therefore, reconsidered in the light of these large theoretical uncertainties.
However, Connes' model relies on the "big desert" hypothesis.
 
  • #14
I think the new paper is remarkable: they clearly made an effort towards the physics, in two related aspects. First, the calculations are presented in detail using a tensorial notation more readable for physicists. But the use of this notation is also related to an effort towards quantitative predictions.

I find it quite exciting. Those efforts might trigger more inspiration for model building, as there is enormous potential to harvest; in particular, it hints again at the relevance of twistor methods.

I have a specific question however. Like Marcus, I did not get the "background dependence" part. I do not think there is any issue here. For instance:
A Note on Background Independence in Noncommutative Gauge Theories, Matrix Model and Tachyon Condensation
 
  • #15
Can anybody explain this?

"Before the reduction to the subgroup U(1) × SU(2) × SU(3) (coming from the order one condition and the hypothesis of finite distance between the two copies of the four-dimensional manifold
 

What is "Death to background independence"?

"Death to background independence" is a phrase used in the field of theoretical physics, specifically in the study of quantum gravity. It refers to the idea that the concept of a fixed background space-time, as described by general relativity, should be discarded in favor of a more fundamental, background-independent theory.

Why is background independence important in physics?

Background independence is important because it allows for a more complete and unified understanding of the universe. In traditional theories, such as general relativity, the background space-time is treated as fixed and unchanging. However, in reality, space-time is constantly evolving and interacting with matter and energy. A background-independent theory would be able to account for these interactions and provide a more accurate understanding of the universe.

What are some potential challenges in developing a background-independent theory?

One of the main challenges in developing a background-independent theory is the mathematical complexity involved. The concept of a fixed background space-time is often used as a simplifying assumption in theories, and removing it can make the mathematics significantly more difficult. Additionally, there is currently no consensus on what a background-independent theory would look like, so researchers are still exploring different approaches and ideas.

Has there been any progress towards a background-independent theory?

Yes, there have been various approaches proposed that aim to be background-independent, such as loop quantum gravity (built on spin networks) and causal set theory. However, these are still in the early stages of development and have not been fully tested or accepted by the scientific community.

How could a background-independent theory impact our understanding of the universe?

A successful background-independent theory could have a significant impact on our understanding of the universe by providing a unified framework for understanding gravity and other fundamental forces. It could also potentially resolve some of the current conflicts and limitations of existing theories, such as the incompatibility of general relativity and quantum mechanics. Additionally, a background-independent theory could lead to new insights and discoveries about the nature of space and time.
