Virtual particles and Heisenberg


LuisBabboni

TL;DR Summary
If ΔE Δt ≥ h/(4π), why is Δt ≤ h/(4π ΔE) for virtual particles?
Hi people.

Given the uncertainty principle ΔE Δt ≥ h/(4π), why must Δt ≤ h/(4π ΔE) hold to allow the existence of a virtual particle?

How does ≥ become ≤?

I think... real particles must obey ≥, so any particle that does not obey it is a virtual particle, and that is why a virtual particle needs to obey Δt ≤ h/(4π ΔE)?

Thanks.

OK... thanks Phinds! But... I found in a lot of places the uncertainty principle used to give the time a virtual particle could live. What I do not understand is why a ≥ turns into a ≤.
I do not have the necessary math to understand it rigorously. I am just hoping for some kind of easy explanation.
Sorry.

What do you want us to do? Write the same answer again? Try and write a wrong answer that you will like?

Sorry Vanadium, but yes, the answer was too complicated for me. Maybe Phinds thought I would be able to handle it. Well, sadly for me, that is not the case.
I could have just said: "Thanks Phinds! I understood your answer"... but, well, I'm not a liar.
I still hope someone can help me with an easier answer... maybe it's not possible, maybe it is. ;-)

Virtual particles are tools in calculations. They do not have a lifetime, in the same way the pi in that equation doesn't have a lifetime.

And yes, popular science descriptions get that wrong with nearly 100% probability because they prefer nice-sounding myths over more realistic descriptions.

LuisBabboni said:
OK... thanks Phinds! But... I found in a lot of places the uncertainty principle used to give the time a virtual particle could live. What I do not understand is why a ≥ turns into a ≤.
I do not have the necessary math to understand it rigorously. I am just hoping for some kind of easy explanation.
Sorry.
Yes, and those calculations are at best heuristic, based on a myth. You should treat them with a ton of salt. Your question is the result of the robustness of this myth.

Lots of places can be wrong, apparently. Even Nobel-prize winners have exploited this myth.

Thanks guys!

mfb said:
Virtual particles are tools in calculations. They do not have a lifetime, in the same way the pi in that equation doesn't have a lifetime.

And yes, popular science descriptions get that wrong with nearly 100% probability because they prefer nice-sounding myths over more realistic descriptions.
Actually, I spent a great deal of time explaining it in my own (Dutch) book on fundamental physics just because of that. Ditto with Hawking radiation. It's my way of making the world a slightly better place, I guess :P

The technique of using "virtual particles" entails allowing a Markov-like evolution process beneath the scale of the Fourier components. Thus, if the frequency yields a time scale T, we may want to adopt an artificial, simulated or "fantasy" approach to system evolution by breaking T up into a number of time steps, say. This leads to an evolution of the predictions for measurement that allows a "composite" system to be introduced (below the level of measurement).

Take a Fourier component k associated with a variable x (k can be E and x can be t, it doesn't matter: we are simply utilizing a Hilbert/Banach space approach to Fourier analysis of data). Since the frequency k sets the time scale T (kT = 1), we have k(x + 2πT) = kx + 2π, so exp(i k x) = exp(i k (x + 2πT)). This is all you are REALLY addressing: a statistical approach to data analysis of physics experiments, and the convenience of introducing Markov pathways to get additional flexibility.

Thus ΔE Δt ≥ h/(4π) simply refers to the minimum size of a wave packet under Fourier analysis. We are undermining that assumption to allow additional Markov pathways: this artificially breaks the conditions of Fourier analysis. The reversal of the inequality means only that. Subtract the "colorful" language of the physicists, and it's mostly just basic Fourier analysis and statistics with a functional-analysis Hilbert/Banach space perspective thrown in.

Von Neumann, in his characteristically obscure way, pointed all this out in the late 1920s and 1930s. He was, in fact, one of the top creators of the field of functional analysis, which arose in parallel with the new physics, but in pure math. Feynman is worth consulting about this too, especially as he was mainstream and not a pure-math guy.
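As a small numerical sketch of that "minimum size of a wave packet" statement (my addition, assuming NumPy; there is no code in the thread itself): for a Gaussian pulse, the RMS spreads in time and angular frequency satisfy Δt · Δω ≈ 1/2, which is the Fourier bound that becomes ΔE Δt ≥ h/(4π) once you set E = ħω.

```python
import numpy as np

# Gaussian wave packet psi(t) = exp(-t^2 / (2 sigma^2)) on a uniform grid.
sigma = 1.0
t = np.linspace(-40.0, 40.0, 16384)
dt = t[1] - t[0]
psi = np.exp(-t**2 / (2.0 * sigma**2))

def rms_width(x, amp):
    """RMS width of the normalized density |amp|^2 on the uniform grid x."""
    p = np.abs(amp) ** 2
    p = p / p.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean) ** 2 * p).sum())

# Frequency-domain amplitude via FFT; omega is the angular-frequency grid.
omega = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
phi = np.fft.fftshift(np.fft.fft(psi))

dt_rms = rms_width(t, psi)      # spread in time
dw_rms = rms_width(omega, phi)  # spread in angular frequency

# The Gaussian is the equality case of the Fourier bound: product -> 1/2.
print(dt_rms * dw_rms)
```

The Gaussian saturates the bound; any other pulse shape gives a strictly larger time–bandwidth product, which is why ≥ (not ≤) is the correct direction for real wave packets.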