"Black Hole Masses are Quantized," Gia Dvali, Cesar Gomez, Slava Mukhanov, http://arxiv.org/abs/1106.5894

We give a simple argument showing that in any sensible quantum field theory the masses of black holes cannot assume continuous values and must be quantized. Our proof solely relies on Poincare-invariance of the asymptotic background, and is insensitive to geometric characteristics of black holes or other peculiarities of the short distance physics. Therefore, our results are equally-applicable to any other localized objects on asymptotically Poincare-invariant space, such as classicalons. By adding a requirement that in large mass limit the quantization must approximately account for classical results, we derive an universal quantization rule applicable to all classicalons (including black holes) in arbitrary number of dimensions. In particular, this implies, that black holes cannot emit/absorb arbitrarily soft quanta. The effect has phenomenological model-independent implications for black holes and other classicalons that may be created at LHC. We predict, that contrary to naive intuition, the black holes and/or classicalons, will be produced in form of fully-fledged quantum resonances of discrete masses, with the level-spacing controlled by the inverse square-root of cross-section.

There is a nontechnical summary on the arxiv blog: http://www.technologyreview.com/blog/arxiv/ , along with some inflammatory and uninformed speculation about safety at the LHC, including "This is a debate that particle physicists are strangely reluctant to engage in, having ignored most of the questions marks over safety." In fact, particle physicists have analyzed the issue in great detail: Giddings and Mangano, "Comments on claimed risk from metastable black holes," http://arxiv.org/abs/0808.4087

Anyway, getting back to the actual physics of the paper, the quantization rule they propose is [itex]m=\sqrt{N}\,m_P[/itex], where [itex]m_P[/itex] is the Planck mass. The Planck mass is about [itex]10^{19}[/itex] GeV in 3+1 dimensions, but it can be much lower in models with large extra dimensions. IIRC, recent LHC results are putting tough constraints on large extra dimensions, so it is probably not likely that the ideas in this paper can be confirmed experimentally. The horizon area would then be quantized in integer multiples of the Planck area, which I guess sounds nice in relation to LQG...?
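To see why this rule forbids arbitrarily soft emission, it helps to work out the level spacing implied by [itex]m=\sqrt{N}\,m_P[/itex] (my own back-of-the-envelope estimate, not a calculation from the paper):

[tex]\Delta m = m_{N+1}-m_N = \left(\sqrt{N+1}-\sqrt{N}\right)m_P = \frac{m_P}{\sqrt{N+1}+\sqrt{N}} \approx \frac{m_P}{2\sqrt{N}} = \frac{m_P^2}{2m}[/tex]

So the spacing shrinks like 1/m for heavy black holes but never becomes a continuum: a black hole of mass m can't emit a quantum much softer than [itex]m_P^2/2m[/itex]. And since the geometric cross-section scales as [itex]\sigma \sim r_s^2 \propto m^2[/itex], this gives [itex]\Delta m \propto 1/\sqrt{\sigma}[/itex] up to numerical factors, which seems to match the abstract's claim that the level spacing is controlled by the inverse square root of the cross-section.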