jcap
$$\Delta x \Delta p \ge \hbar$$

If we assume a massless quantum object, then we have the relation ##\Delta E = c\,\Delta p##, so that the above uncertainty relation becomes

$$\Delta E \ge \frac{\hbar c}{\Delta x}.\tag{1}$$
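As a quick numerical sanity check (my own sketch, not part of the original argument), evaluating Eq. (1) at the Planck length reproduces the Planck energy, since ##\hbar c/l_P = \sqrt{\hbar c^5/G}##:

```python
# Numerical check of Eq. (1): Delta E = hbar*c / Delta x at Delta x = l_P.
# Constants are CODATA values in SI units.
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2

# Planck length: l_P = sqrt(hbar*G/c^3) ~ 1.6e-35 m
l_P = math.sqrt(hbar * G / c**3)

# Energy fluctuation from Eq. (1) at Delta x = l_P
delta_E = hbar * c / l_P                 # joules
delta_E_GeV = delta_E / 1.602176634e-10  # convert J -> GeV

print(f"l_P     = {l_P:.3e} m")
print(f"Delta E = {delta_E:.3e} J ~ {delta_E_GeV:.2e} GeV")
```

This gives ##\Delta E \sim 10^{19}\,\mathrm{GeV}##, the usual Planck-scale energy.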

I understand that if we have a real system with size ##\Delta x##, then there is necessarily an uncertainty in its real energy, ##\Delta E##, given by Equation (1).

But this relation is commonly applied to otherwise empty space in order to argue that there is a fluctuation of energy ##\Delta E## in each interval of space of size ##\Delta x##. As ##\Delta x## is taken arbitrarily small, ##\Delta E## becomes arbitrarily large. If we impose a cutoff at the Planck scale, the resulting vacuum energy density, fed into general relativity, implies that space should expand exponentially at an enormous rate, which is not observed (the cosmological constant problem).
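To make the size of the mismatch concrete, here is a rough order-of-magnitude sketch of my own (the observed dark-energy density used below, about ##6\times10^{-10}\,\mathrm{J/m^3}##, is an assumed approximate value, not a figure from the argument above):

```python
# Rough estimate: vacuum energy density with a Planck-scale cutoff,
# i.e. one fluctuation of Planck energy per Planck volume, compared
# with an assumed observed dark-energy density (~6e-10 J/m^3).
import math

hbar = 1.054571817e-34  # J*s
c = 2.99792458e8        # m/s
G = 6.67430e-11         # m^3 kg^-1 s^-2

l_P = math.sqrt(hbar * G / c**3)  # Planck length, m
E_P = hbar * c / l_P              # Planck energy, J

rho_planck = E_P / l_P**3         # cutoff vacuum energy density, J/m^3
rho_obs = 6e-10                   # assumed observed value, J/m^3

ratio = rho_planck / rho_obs
print(f"rho_planck ~ {rho_planck:.1e} J/m^3, "
      f"discrepancy ~ 10^{math.log10(ratio):.0f}")
```

The discrepancy comes out at roughly 120 orders of magnitude, which is why this is often called the worst prediction in physics.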

But surely, in the analysis of empty space, there is no real object of size ##\Delta x##, and therefore no real energy fluctuation ##\Delta E##?

Instead, it seems to me that the arbitrary interval of space of size ##\Delta x## is a virtual construct, so that by the uncertainty principle it should only lead to a virtual energy fluctuation ##\Delta E##.

Surely such a virtual energy fluctuation cannot be expected to lead to any real effects like exponential space expansion?

P.S. However, I could imagine *differences* in virtual energy leading to real effects.