
Lorentz violation severely restricted: Mqg/Mplanck > 1200

  1. Aug 14, 2009 #1


    Gold Member


    Testing Einstein's special relativity with Fermi's short hard gamma-ray burst GRB090510
    Authors: Fermi GBM/LAT Collaborations
    (Submitted on 13 Aug 2009)

    Abstract: Gamma-ray bursts (GRBs) are the most powerful explosions in the universe and probe physics under extreme conditions. GRBs divide into two classes, of short and long duration, thought to originate from different types of progenitor systems. The physics of their gamma-ray emission is still poorly known, over 40 years after their discovery, but may be probed by their highest-energy photons. Here we report the first detection of high-energy emission from a short GRB with measured redshift, GRB 090510, using the Fermi Gamma-ray Space Telescope. We detect for the first time a GRB prompt spectrum with a significant deviation from the Band function. This can be interpreted as two distinct spectral components, which challenge the prevailing gamma-ray emission mechanism: synchrotron - synchrotron self-Compton. The detection of a 31 GeV photon during the first second sets the highest lower limit on a GRB outflow Lorentz factor, of >1200, suggesting that the outflows powering short GRBs are at least as highly relativistic as those powering long GRBs. Even more importantly, this photon sets limits on a possible linear energy dependence of the propagation speed of photons (Lorentz-invariance violation) requiring for the first time a quantum-gravity mass scale significantly above the Planck mass.


    As I said elsewhere, the violation might be statistical, not something naively eliminated by a single photon.
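    For anyone who wants to see roughly where bounds like these come from: in the simplest linear-dispersion scenario, a photon of energy E accumulates a delay of about (E/Mqg) times the light-travel time. A back-of-the-envelope sketch; the energy and travel-time values below are illustrative assumptions (the paper uses a proper cosmological distance integral and several choices of emission time), so treat the output as order-of-magnitude only:

```python
# Rough order-of-magnitude sketch of the linear Lorentz-violation bound
# from GRB 090510. All inputs are illustrative, not the paper's exact values.

E_photon_GeV = 31.0      # energy of the highest-energy photon
t_delay_max_s = 1.0      # assume it left within ~1 s of the low-energy photons
M_planck_GeV = 1.22e19   # Planck mass in GeV/c^2

D_meters = 1.8e26        # effective light-travel distance for z ~ 0.9 (assumed)
c = 3.0e8                # speed of light, m/s
t_travel_s = D_meters / c

# Linear dispersion: delta_t ~ (E / M_qg) * t_travel
# => lower bound M_qg > E * t_travel / delta_t_max
M_qg_min_GeV = E_photon_GeV * t_travel_s / t_delay_max_s

ratio = M_qg_min_GeV / M_planck_GeV
print(f"M_qg / M_planck > {ratio:.2f}")
```

    With these rough inputs the bound lands near the paper's "most conservative" value of order one, which is reassuring for a one-line estimate.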
  3. Aug 14, 2009 #2


    Science Advisor
    Gold Member
    Dearly Missed

    Great find! The number 1200 in your headline may be somewhat inaccurate however.
    See Table 4 on page 23 of the supporting material here:
    http://gammaray.nsstc.nasa.gov/gbm/grb/GRB090510/supporting_material.pdf

    And also Table 2 in the main paper, which is essentially the same as that in the supporting material, but gives less explanation.

    They give several lower bounds for the Mqg/Mplanck ratio, which are based on different reasoning. None of the estimates say > 1200.

    What they call their "most conservative" estimate says > 1.19
    Their "least conservative" or most risky estimate says > 102.
    Last edited by a moderator: May 4, 2017
  4. Aug 14, 2009 #3


    Gold Member

    Alright, then please correct the title to

    "Lorentz violation severely restricted: 1.19 < Mqg/Mplanck < 102"

    I didn't find this article myself, even though I check astrophysics every day. Someone sent it to LM, and he posted the link on his blog. He used a line from the TV comedy "The Big Bang Theory" to argue that LQG is ruled out by this article. No kidding.
  5. Aug 14, 2009 #4


    Science Advisor
    Gold Member
    Dearly Missed

    I can't edit other people's titles, but you could PM a request to a Mentor.
    I would suggest saying

    Mqg/Mplanck > 1.2

    That would be a correct interpretation of their result. It is good to use conservative language in a headline; you can always say later in your post that one possible interpretation of the data leads to a more stringent conclusion, namely
    Mqg/Mplanck > 102

    Unfortunately no one has yet been able to derive Lorentz violation from the main LQG or Spinfoam models (in the 4D case). So this result is very interesting but does not disfavor LQG.

    There have been both string and LQG papers which suggested there might be some finite Mqg, but even in the string case I know only of suggestion and speculation. So in neither case does anything get falsified.

    This kind of data from Fermi-LAT is a valuable guide to LQG researchers. The Fermi mission looks like it is going to make a big contribution to beyond-standard physics and the topics discussed in this forum.

    In July there was this paper by Doug Finkbeiner interpreting some Fermi-LAT data relating to the possible make-up of dark matter. They are helping to figure out what a possible WIMP could be like. Another great Fermi development.

    Under no conditions would you want to say Mqg/Mplanck < 102
    They did not show this.
    It is better to simply say Mqg/Mplanck > 1.2

    It could easily be that Mqg is infinite, that is equivalent to saying there is no violation or modification of Lorentz invariance at all (at least to first order).
    No one has shown an upper bound; we only have lower bounds. The higher you can push them, the closer you come to saying the ratio is infinite and there is no modification, no dispersion.
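    To see this point numerically: in the linear-dispersion model the maximal delay over a given travel time scales as 1/(Mqg/Mplanck), so pushing the lower bound up squeezes the allowed dispersion toward zero. A small sketch with illustrative numbers (the travel time and photon energy are assumptions, not the paper's exact inputs):

```python
# How the maximal allowed first-order dispersion delay shrinks as the
# lower bound on M_qg / M_planck grows. All inputs are illustrative.

M_planck_GeV = 1.22e19   # Planck mass in GeV/c^2
E_photon_GeV = 31.0      # highest-energy photon from GRB 090510
t_travel_s = 6.0e17      # ~ light-travel time from z ~ 0.9 (assumed)

delays = []
for bound in (1.2, 102.0, 1e4):
    M_qg_GeV = bound * M_planck_GeV
    max_delay_s = (E_photon_GeV / M_qg_GeV) * t_travel_s
    delays.append(max_delay_s)
    print(f"M_qg/M_planck > {bound:>7}: max linear delay < {max_delay_s:.2e} s")
```

    With the conservative bound the 31 GeV photon could still lag by about a second; with the riskier bound only by hundredths of a second, i.e. ever closer to "no dispersion at all."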
    Last edited: Aug 14, 2009
  6. Aug 14, 2009 #5


    Gold Member

    Or Mqg may not make sense at all, and whatever delays exist may be a statistical effect of photon/space-time fluctuations. Try looking for a peak that moves as you go to higher energies.
  7. Aug 15, 2009 #6
    Don't be completely silly, Marcus. Every single model marketed as loop quantum gravity, spinfoam, causal dynamical triangulation, Hořava-Lifshitz gravity, and dozens of other names violates the Lorentz symmetry by first-order terms, with a coefficient of order one, and is simply safely dead after this paper.

    The "only" way the paper may be useful to researchers in LQG or any other field mentioned above is to show them that they have wasted their professional lives, because their whole reasoning was based on a fundamentally wrong assumption, namely a complete denial of Einstein's 1905 theory of relativity. There's no way to revive a hypothesis that has been as cleanly falsified as Fermi has falsified all the discrete models of spacetime at the Planck scale.

    You're also completely deluded when you say that there are doubts that Lorentz symmetry at the Planck scale has to be respected by string theory.

    It is a fundamental law that holds everywhere in string theory. If you read at least one section of any textbook on string theory, you will see that string theory is first motivated by the Lorentz-invariant Nambu-Goto action - the proper area of the worldsheet - and this Lorentz invariance is preserved by all interactions, objects, and known vacua in string theory. It may be at most spontaneously broken, by the configuration of spacetime (e.g. B-field), but it surely holds at the fundamental scale.

    If you're unable to comprehend that this game and debate about LQG and similar stupidities is simply over, you're just unteachable crackpots.
  8. Aug 15, 2009 #7
    I'll comment on what I feel qualified to comment on. Models that use causal dynamical triangulations do not suggest Lorentz invariance violations in nature, simply because the continuum limit is always taken; the approach does not suppose that spacetime is discrete.

    Obviously Horava gravity violates Lorentz invariance. As for Spin Foams/Loops, I'm not sure whether these do or not. They certainly quantize areas and volumes in Loops, but I don't think this necessarily means that Lorentz invariance is violated.

    Also, calling people silly and stupid because they disagree with you is a bit off. You then go on to lump CDT, loops, spin foams, and Horava gravity together, which clearly shows your ignorance of these different approaches.
  9. Aug 15, 2009 #8


    Gold Member

  10. Aug 15, 2009 #9
  11. Aug 15, 2009 #10
    Dear Finbar,

    except that to respect the Lorentz symmetry, it's not enough not to be discrete. Even if you take the continuum limit (but work in Minkowski space), the "triangles" in the triangulation inevitably pick a privileged reference frame (another version of an aether!) and therefore break the Lorentz symmetry. Only the continuum limit of lattice-like structures in the Euclidean signature would have a chance to reproduce the Euclidean version of the Lorentz symmetry.

    Every theory where areas are quantized has to violate the Lorentz symmetry at the Planck scale (or the scale of the quanta). This is easy to see with a big boost. Almost-null surfaces must have a very small proper area, but whenever the area is calculated as a sum over intersections with anything resembling a spinfoam or spin network, it inevitably comes out as large as that of comparably large (in coordinate space) spacelike surfaces. Moreover, if one counts it from the spinfoam, the areas can never become imaginary, i.e. the formalism cannot distinguish timelike from spacelike areas. The conclusion is that theories with discrete area spectra can't possibly respect the Lorentz symmetry.

    The violation of the Lorentz symmetry is actually huge at all distance scales, but these people were sticking to a lot of wishful thinking, hoping that the symmetry would only be broken at the Planck scale and restored at low energies. Even this very unlikely wishful thinking has now been ruled out, because the Lorentz violation doesn't exist even at the Planck scale.

    For papers showing that loop quantum gravity - and all other non-stringy theories of quantum gravity, for that matter - have to violate the Lorentz symmetry (and contradict the GZK cutoff), see, for example:


    MtD2, your statement about a "statistical violation" is completely meaningless. It doesn't matter that the conclusion rests primarily on one highest-energy photon, unless there is a risk that the photon didn't come from the burst, which is extremely unlikely. Assuming the photon has something to do with the burst, one can reconstruct the statistical distribution of the times when such photons should arrive, and the probability that it would arrive at the observed time, under the now-excluded assumption that the journey creates delays corresponding to a Planckian Lorentz-violating mass scale, is de facto zero. This is the relevant statistics here, and it shows that at a very high confidence level the coefficient of the violating term must be much smaller than the inverse Planck scale.

    I distinguish all the approaches and know all the critical differences between them. But that doesn't change one common feature of all of them: they have been proved wrong, and I think that only stupid people will continue to work on them after this result. Sorry, but this follows from my detailed understanding of physics and of the term stupidity.
  12. Aug 15, 2009 #11
    I have to admit that Smolin has done a terrible job by claiming Lorentz violation in LQG for sure. You have to admit that publishing LQG "predictions" in Nucl. Phys. B is (to say the least) suspicious.

    The rest of your references shoot you in the foot. Interestingly, both references are commented on in your own first reference. Your manners are quite arrogant, I should say. There is no need to be so aggressive.
  13. Aug 15, 2009 #12


    Science Advisor

    Sotiriou, Visser and Weinfurtner, http://arxiv.org/abs/0905.2798: "we can drive the Lorentz breaking scale arbitrarily high by suitable adjustment … Since the ultraviolet dominant part of the Lorentz breaking is sixth order in momenta, it neatly evades all current bounds on Lorentz symmetry breaking."

    I wonder whether this comment would still hold. I remember from one of marcus's posts that Visser was soon to give a conference talk titled something like "Who's afraid of Lorentz symmetry breaking?"
    Last edited: Aug 15, 2009
  14. Aug 15, 2009 #13
    Dear humanino,
    all the papers are "blurred" because the whole of loop quantum gravity and all similar theories are ill-defined, vague piles of nonsensical, unphysical formalisms, so no calculation based on these theories can ever be trusted about anything; the authors merely state that very same fact while giving it a positive spin.

    But what's important is that there is no candidate calculation based on these theories in which the Lorentz violations would cancel. There can't be any, because these approaches "fundamentally" contradict the Lorentz symmetry at the Planck scale, by their very philosophy. For example, proper areas are taken to be sums of real numbers such as sqrt(j(j+1)), which can't go imaginary, as needed for timelike two-surfaces. For relativity to hold, areas, whenever they can be defined, must be allowed to be continuous and must be allowed to go imaginary. This is a simple way to see that all possible "revivals" of any of these discrete pictures will fail in the future, too.
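    Spelling out the spectrum being referred to (this is the standard LQG formula, with γ the Immirzi parameter and l_P the Planck length; the classical bivector identity below uses one common signature convention):

```latex
% Discrete LQG area spectrum (standard form):
A_j = 8\pi \gamma\, l_P^2 \sqrt{j(j+1)}\,, \qquad j = 0,\ \tfrac{1}{2},\ 1,\ \tfrac{3}{2},\ \dots
% Classically, a flat two-surface with area bivector \Sigma^{\mu\nu} has
%   A^2 = \tfrac{1}{2}\,\Sigma_{\mu\nu}\Sigma^{\mu\nu},
% which changes sign between spacelike and timelike surfaces (A becomes
% imaginary) and can be boosted arbitrarily close to zero for almost-null
% surfaces, which is the tension with a fixed, real, discrete spectrum.
```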

    Give me a break with the arrogance. I am just alarmed that some people want to dilute this experimental result and its consequences on physics. But physics is all about direct and indirect comparisons of observations with theories. And this observation happens to be extremely clean and settles the question. It proves that people like me have always been right and people around loop quantum gravity have always been wrong, using their poor education, weak intelligence, and lacking intuition to study questions that go well beyond their abilities. The result proves that all sponsors and foundations who have funded theories building on the assumption that Lorentz symmetry will have to be broken have wasted the money, and as soon as they care about the empirical data, they should learn a lesson and fire all these people.

    I will not allow anyone to create fog about this very clear situation.
  15. Aug 16, 2009 #14


    Science Advisor

    But if the LQG formalism is ill-defined in the first place, how can it make predictions? If it doesn't make predictions, how can it be falsified by experiment?
  16. Aug 16, 2009 #15
    I can only say that I am glad not to work with you around. I have never seen so little respect among professionals. As a matter of fact, the people I meet at conferences who disagree about philosophical approaches still tend to be interested in each other's technical constructions and talk in a civilized manner. That you can even think that being wrong about a theory implies being fired from one's position is beyond all credible discussion. Did you never make a mistake? How can one come up with such lines? Even if you happened to be right about the science, the way you present it makes it hard to swallow.

    Let me put the theory aside, since you made it clear you do not want to hear from the (quite significant) part of the LQG community that disagrees with you. You are ready to put all your eggs in one basket, on a single observation of a single event? How long have you been following historical developments in science?
  17. Aug 16, 2009 #16
    Dear atty,
    it may be remarkable. But while the formalism is not well-defined enough to calculate precise numbers, it implies enough specific qualitative facts that one can determine the results can't be Lorentz-invariant, even if one can't calculate what the results are. So LQG successfully fails on both counts: it is ill-defined, and despite its being ill-defined, one can show that it is wrong.

    It's like a theory of angels pushing the planets from the rear side, to orbit around the Sun. It's not good enough to make quantitative predictions, because the behavior of the angels is not determined, but it is specific enough to prove that it is wrong. Observations show that the angels push the planets from the outer side ;-), like in Newton's attractive gravitational force.

    Of course, the goal in physics is just the opposite. We want theories that allow us to calculate things quantitatively, and we want theories that make correct predictions, not wrong predictions. We want theories such as QCD or string theory.

  18. Aug 16, 2009 #17
    I don't have any respect because they don't deserve it. Academia and professional science have been literally flooded by low-quality people who justify their existence (and funding) by brainwashing, lies, victimhood, and whining. Most of this stuff is paid for by the taxpayer. Science has lost many of its standards and is becoming unworthy of respect as a whole, and I just think it is a very bad evolution.

    So what I want is a return to standards. People who predict correct things must get advantages, while people who predict wrong things, and people who are generally incompetent, should never get the same treatment, regardless of the amount of demagogy and disgusting pathetic whining like yours. They must be eliminated, otherwise science and mankind will face real trouble soon.

    I was always interested in all the approaches, and I know all of them in more detail than most of those you call "specialists", but that doesn't mean I think it is correct to fill science with zombies, or right for science to be overwhelmed by theories and approaches that have already been falsified. This approach was really falsified in 1905, by Einstein's special relativity, and I think that 104 years of tests speaking such a clear language is long enough for people who reject relativity itself to be called crackpots.

    Although there's no doubt that this is the real situation, many people even on the "correct side" fail to say things that clearly, because what they're really after in science is money, and it is useful for them to team up with the crackpots. Sorry, I find it immoral and I will never join in such behavior.
  19. Aug 16, 2009 #18
    I have to take back what I said. I know of one instance among professionals.
    Last edited by a moderator: Aug 17, 2009
  20. Aug 16, 2009 #19
    Oh, so this is how science works. Someone writes down a theory, then we gather evidence that supports that theory for an arbitrary amount of time, say, I dunno, 104 years, and then we conclude that the theory is correct and unquestionable.
  21. Aug 16, 2009 #20

    What about condensed-matter analogue approaches like Volovik's and Wen's?

    Perhaps the "atoms" of spacetime are discrete but give rise, via collective emergent properties, to a superfluid spacetime that appears continuous and Lorentz-invariant to particles (in 4D, SUSY optional).
  22. Aug 17, 2009 #21
    Dear ensabah6,

    I don't think you quite understand the observation. The observation implies that the Lorentz symmetry not only "appears" to be there but actually "is" there, up to 100 times the Planck scale. If the Lorentz symmetry were only an artifact of emergent or collective or blah blah features of many degrees of freedom, it would be violated at the Planck scale, but it is demonstrably not violated.

    All these condensed-matter-like theories of spacetime were obviously falsified, too. Sorry I didn't include them in the list, but I thought it was obvious that they were dead as well.

  23. Aug 17, 2009 #22

    In the supporting material document to that paper (Fermi collab.), the authors mention on page 24:

    "A specific model of particular interest that has been proposed is a space-time foam scenario inspired by string theory that predicts a small retardation of photon velocity to first order in Eph/MQG(...)"

    and cite this paper:

    SI39 - Ellis, J., Mavromatos, N. E., & Nanopoulos, D. V. “Derivation of a vacuum refractive index in a stringy space time foam model”, Phys. Lett. B 665, 412–417 (2008), and references therein.

    Do you have any particular comments on that paper (Ellis et al 2008)?


  24. Aug 17, 2009 #23
    Dear Christine, the most important fact about the paper is that their predictions have been falsified as cleanly as the predictions of any other kind of fundamental Lorentz-violating theories on the market. What they call the "most conservative" scenario has been proved right and it is not relevant for anything they want to speculate about in the paper.

    Not even the word "stringy" could have saved them.

    I respect at least some of the co-authors of this paper, but I have always found such models dumb. By the way, they may have called it "stringy," but the model has nothing to do with string theory. The closest this model comes to "string theory" is that it cites a paper or two co-written by people who are otherwise "string theorists" (like Myers, coincidentally at the Perimeter Institute), but those papers don't build on string theory and usually don't even pretend to (unlike the paper you cite): Myers et al. just write some effective field theories. And Ellis et al. here cite many "anti-stringy" people (Amelino-Camelia, Jacobson, Gambini, Pullin, Magueijo, Smolin, etc.) and essentially call their work "stringy," even though it's demonstrably not stringy; they probably do this trick in order to increase the credibility of the authors who are the real background of the paper by Ellis et al.

    String theory doesn't allow any kind of "foamy" violations of the Lorentz symmetry near the Planck scale. The latter is fundamentally incorporated into the theory, and it can only be broken by configurations (e.g. B-fields) of matter, and such breaking normally starts at low energies, while the violation is *smaller* at very high energies, much like in all other kinds of spontaneous symmetry breaking. Every well-known string theorist, and every grad student who is on her way to learn string theory from the textbooks, knows this much.

    I don't really believe that e.g. Ellis doesn't know this, but if he doesn't, he may be getting too old. Still, this question, stringy or not, is less important than the basic adjective describing the paper: it is wrong. So while the superficial label could perhaps be compatible, because string theory predicts no lags here, none of the details is compatible with reality, so the paper's model sits at exactly the same level of falsification as any model that deliberately started with a "non-stringy" vocabulary.

    Best wishes
  25. Aug 17, 2009 #24


    Gold Member

    No, it wasn't. They predict a distribution of delay values for a given photon energy. That one photon that arrived too early is just a lucky one that wasn't significantly delayed by the quantum foam.
  26. Aug 17, 2009 #25
    I have already explained that this may only be the interpretation of a downright crackpot.

    The probability that a multi-hour delay would be erased by "chance" is effectively zero, because it is the value of the probability distribution 10 sigma away from the central value, etc. The photon would have to be created a long time after (or before) the actual burst, and that's just negligibly unlikely.
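    For readers unfamiliar with sigma language, here is how fast Gaussian tail probabilities fall off. The 10-sigma figure mentioned here is being used loosely; the paper quotes its own confidence levels, so this is only meant to illustrate the scale of "de facto zero":

```python
# Two-sided Gaussian tail probability for an n-sigma fluctuation,
# illustrating why a ~10-sigma discrepancy is treated as an exclusion.
from math import erfc, sqrt

p_values = []
for n_sigma in (3, 5, 10):
    p = erfc(n_sigma / sqrt(2))   # P(|X| > n_sigma) for standard normal X
    p_values.append(p)
    print(f"{n_sigma:>2} sigma: p ~ {p:.1e}")
```

    By 10 sigma the probability is smaller than one part in 10^20, far below anything attributable to chance.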

    At any rate, your new, increasingly awkward hypothesis will be easily, if gradually, falsified by further bursts in the future. When Fermi sees another burst of the same kind with a 30+ GeV photon, when do you think it will probably arrive: (a) together with the others, like in the May 2009 case, or (b) two hours or two weeks later? This is a test of basic intelligence, and if you answer (b), you should seek medical help.