The technique of using "virtual particles" entails allowing a Markov-like evolution process beneath the scale of the Fourier components. Thus, if the frequency yields a time scale T, we may want to adopt an artificial, simulated or "fantasy" approach to system evolution by breaking T up into a number of time steps, say. This gives an evolution of the predictions for measurement that lets a "composite" system be introduced (below the level of measurement).

For a Fourier component k associated with a variable x (k can be E and x can be t; it doesn't matter: we are simply taking a Hilbert/Banach space approach to Fourier analysis of data), the time scale is T = 1/k, and we have k(x + 2 Pi T) = kx + 2 Pi, so exp(i k x) = exp(i k (x + 2 Pi T)). This is all you are REALLY addressing: a statistical approach to data analysis of physics experiments, and the convenience of introducing Markov pathways to get additional flexibility. Thus delta(E) delta(t) >= h/(4 Pi) simply states the minimum size of a wave packet under Fourier analysis. We undermine that assumption in order to allow additional Markov pathways: this artificially breaks the conditions of Fourier analysis, and the reversal of the inequality means only that.

Subtract the "colorful" language of the physicists, and it's mostly basic Fourier analysis and statistics, with a functional-analysis Hilbert/Banach space perspective thrown in. von Neumann, in his characteristically obscure way, pointed all this out in the late 1920s and 1930s. He was, in fact, one of the top creators of the field of functional analysis, which arose in parallel with the new physics, but in pure math. Feynman is worth consulting on this too, especially as he was mainstream and not a pure math guy.
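To make the "breaking T up into sub-steps" idea concrete, here is a minimal sketch in Python. It is my own illustration, not a formalism from the discussion above: the 3-state generator Q, the step count N, and all the names are assumptions chosen for the example. A coarse evolution over the time scale T is factored into N small Markov sub-steps, and only the composite product of those sub-steps corresponds to anything at the level of measurement.

```python
# Minimal sketch (illustrative assumptions throughout): factor a coarse
# transition over the time scale T into N Markov sub-steps. Only the
# composite (product) matrix is tied to measurement; the sub-steps are
# the artificial, "fantasy" evolution beneath it.
import numpy as np

def substep_matrix(rate_matrix, dt):
    """First-order stochastic matrix for one sub-step of length dt."""
    P = np.eye(rate_matrix.shape[0]) + dt * rate_matrix
    # rows of I + dt*Q already sum to 1 for a generator Q; this only
    # guards against rounding
    return P / P.sum(axis=1, keepdims=True)

T = 1.0          # coarse time scale set by the Fourier component
N = 100          # number of artificial sub-steps
dt = T / N

# toy generator (rows sum to zero) for a 3-state system -- an assumption
Q = np.array([[-1.0, 0.7, 0.3],
              [ 0.5, -0.9, 0.4],
              [ 0.2, 0.6, -0.8]])

P_sub = substep_matrix(Q, dt)
P_T = np.linalg.matrix_power(P_sub, N)   # composite evolution over T

p0 = np.array([1.0, 0.0, 0.0])           # start in state 0
print("distribution after time T:", p0 @ P_T)
```

The point of the sketch is only that the decomposition into sub-steps adds pathways between states that never appear individually in the composite prediction over T.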
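And for the inequality itself, here is a quick numerical check of the bandwidth theorem that delta(E) delta(t) >= h/(4 Pi) rests on; the Gaussian packet, its width sigma, and the grid are arbitrary choices of mine. For any signal the spread in t and the spread in angular frequency satisfy sigma_t * sigma_omega >= 1/2, with equality for a Gaussian, and with E = hbar * omega this is exactly the energy-time relation.

```python
# Numerical check of the Fourier bandwidth bound behind
# delta(E) delta(t) >= h/(4 Pi): sigma_t * sigma_omega >= 1/2,
# with equality for a Gaussian wave packet.
import numpy as np

n = 2**14
t = np.linspace(-50.0, 50.0, n)
dt = t[1] - t[0]

sigma = 2.0                                  # chosen width -- an assumption
signal = np.exp(-t**2 / (4 * sigma**2))      # Gaussian packet, sigma_t = sigma

def spread(x, weight):
    """Standard deviation of x under the (normalized) weight."""
    w = weight / weight.sum()
    mean = np.sum(x * w)
    return np.sqrt(np.sum((x - mean)**2 * w))

sigma_t = spread(t, np.abs(signal)**2)

spectrum = np.fft.fftshift(np.fft.fft(signal))
omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, d=dt))
sigma_w = spread(omega, np.abs(spectrum)**2)

print(f"sigma_t * sigma_omega = {sigma_t * sigma_w:.4f}  (lower bound 0.5)")
```

Running this prints a product of about 0.5, the minimum-size wave packet case; any non-Gaussian shape gives a larger product, and the "reversal" of the bound is precisely what ordinary Fourier analysis does not permit.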