Product of two propagators U(-t)U(t) in coord representation

  1. Sep 8, 2015 #1
    Here is a mystery I'm trying to understand. Let ##\hat{U}(t) = \exp[-i\hat{H}t]## be the evolution operator (propagator) in atomic units (##\hbar=1##). I think I'm not crazy in assuming that ##\hat{U}(-t)\hat{U}(t)=\hat{I}## (the unit operator). Then I would think that the following should hold:
    [tex]\left\langle x\right|\hat{U}(-t)\hat{U}(t)\left|x^\prime\right\rangle = \left\langle x\right|\hat{I}\left|x^\prime\right\rangle = \left\langle x|x^\prime\right\rangle= \delta(x-x^\prime)[/tex]
    However, when using the resolution of the identity in the coordinate representation, ##\hat{I}=\int\left|y\rangle\langle y\right|dy##, and the well-known coordinate-representation expressions for the propagator of a free particle or a harmonic oscillator, I get a coordinate-dependent phase factor in front of the delta function:
    [tex]\left\langle x \right | \hat{U}(-t)\hat{U}(t) \left | x^\prime\right\rangle = \int dy\left\langle x\right|\hat{U}(-t)\left|y\right\rangle\left\langle y\right|\hat{U}(t)\left|x^\prime\right\rangle = e^{i\phi(x,x^\prime)}\delta(x-x^\prime)[/tex]

    Could someone, please, comment on that? I'm lost although something tells me the explanation must be trivial.

    THANKS!

    P. S. The integral over y becomes a Fourier integral, so you can easily verify all this.
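    For concreteness, here is a minimal SymPy sketch (my own check, not part of the original post) of the free-particle case, assuming the standard kernel ##\left\langle a\right|\hat{U}(t)\left|b\right\rangle=\sqrt{m/(2\pi i t)}\,e^{im(a-b)^2/(2t)}## with ##\hbar=1##. It splits the combined exponent into the ##y##-dependent part, which produces the Fourier integral and hence ##\delta(x-x^\prime)##, and the ##y##-independent remainder, which is the phase ##\phi(x,x^\prime)## (up to an overall sign fixed by the ordering convention):
[code]
import sympy as sp

# Symbol names are my own; xp stands for x'.
x, xp, y, t, m = sp.symbols("x xp y t m", real=True)

# Combined exponent (divided by i) of <x|U(-t)|y><y|U(t)|x'> for the free particle,
# assuming the standard kernel with hbar = 1:
#   <x|U(-t)|y>  contributes  -m*(x - y)**2 / (2*t)
#   <y|U(t)|x'>  contributes  +m*(y - x')**2 / (2*t)
total = sp.expand(m*((y - xp)**2 - (x - y)**2) / (2*t))

# The part linear in y drives the Fourier integral over y -> delta(x - x');
# the y-independent remainder is the phase left in front of the delta.
print("y-dependent part:", sp.simplify(total.coeff(y, 1)) * y)  # m*y*(x - xp)/t
print("leftover phase  :", sp.simplify(total.coeff(y, 0)))      # m*(xp**2 - x**2)/(2*t)
[/code]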
     
    Last edited: Sep 8, 2015
  3. Sep 8, 2015 #2

    Orodruin


    You do not explain how you get the phase factor or what its expression is. I also suggest you think about how you can reexpress ##f(x)\delta(x-a)## in general.
     
  4. Sep 8, 2015 #3
    I get the following phase factors (##e^{i\phi(x,x^\prime)}##):

    Free particle with mass ##m##: $$\phi(x,x^\prime) = \dfrac{m\left(x^2-x^{\prime 2}\right)}{2t}$$

    Harmonic oscillator with frequency ##\omega##: $$\phi(x,x^\prime) = \dfrac{m\omega\left(x^2-x^{\prime 2}\right)\cot\omega t}{2}$$
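    As a cross-check (again my own sketch, not part of the original post), the same SymPy manipulation for the oscillator, assuming the standard Mehler kernel ##\left\langle a\right|\hat{U}(t)\left|b\right\rangle\propto\exp\left[\frac{im\omega}{2\sin\omega t}\left((a^2+b^2)\cos\omega t-2ab\right)\right]##, reproduces this phase up to the same overall sign:
[code]
import sympy as sp

# Symbol names are my own; xp stands for x', w for omega.
x, xp, y, t, m, w = sp.symbols("x xp y t m w", real=True)

# Combined exponent (divided by i) of <x|U(-t)|y><y|U(t)|x'> for the oscillator,
# assuming the Mehler-kernel exponent m*w*((a**2 + b**2)*cos(w*t) - 2*a*b)/(2*sin(w*t)).
total = sp.expand(m*w*(((y**2 + xp**2)*sp.cos(w*t) - 2*y*xp)
                       - ((x**2 + y**2)*sp.cos(w*t) - 2*x*y)) / (2*sp.sin(w*t)))

print("y-dependent part:", sp.simplify(total.coeff(y, 1)) * y)  # m*w*y*(x - xp)/sin(w*t)
print("leftover phase  :", sp.simplify(total.coeff(y, 0)))      # (m*w/2)*(xp**2 - x**2)*cot(w*t)
[/code]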
     
  5. Sep 8, 2015 #4

    Orodruin


    So why do you think that this gives a phase factor? Did you read the second part of my post?
     
  6. Sep 8, 2015 #5
    Thanks a lot for your input! I'm still confused though... What do you mean by "reexpressing ##f(x)\delta(x-a)##"? The product ##e^{i\phi(x,x^\prime)}\delta(x-x^\prime)## is still NOT a delta function, even if ##\phi(x,x)=0##. What am I missing?
     
  7. Sep 8, 2015 #6

    Orodruin


    Yes it is. It is a delta function regardless of what ##\phi(x,x)## is (although with a different phase if it is non-zero). You are missing a very basic property of the delta function.
     
  8. Sep 8, 2015 #7

    Orodruin


    To give you a hint: what is the defining property of the delta function (or more correctly, the delta distribution)?
     
  9. Sep 8, 2015 #8
    Okay, I see now what you mean. You mean that as long as $$\int\limits_{-\infty}^{\infty} f(x) \psi(x-a)dx = f(a)$$ for any ##f(x)##, the distribution ##\psi(x-a)## is a delta function (distribution). So, you mean that a delta function multiplied by an arbitrary phase factor ##e^{i\phi(x)}## is still a delta function. Well, then what about the function (distribution) ##\tilde{\delta}(x-a) = (x-a+1)\delta(x-a)##? Obviously, this function still has the sifting property and its integral over the real axis is one. Is it also a delta function (distribution)?
     
  10. Sep 8, 2015 #9
    I guess I'm taking back the last argument about ##(x+1)\delta(x)##, simply because ##(x+1)\delta(x) = x\delta(x) + \delta(x) = \delta(x)##, since ##x\delta(x) = 0##. So the Dirac delta function actually absorbs any phase factor ##e^{i\phi(x)}## with ##\phi(0)=0##. Right? As to the coordinate representations of the propagators, are they well known? And if so, where can I read about them? Thanks a lot for your replies!
     
  11. Sep 8, 2015 #10

    Orodruin


    Yes, or more generally: ##f(x)\delta(x-a) = f(a) \delta(x-a)##.
     
  12. Sep 8, 2015 #11
    Really? I always thought it's true only if you integrate over ##x## (sifting property)... If that is always true, that resolves the problem with coordinate representation of the propagators. But I still feel a bit uncomfortable about the whole thing. Thanks though!
     
  13. Sep 9, 2015 #12

    blue_leaf77


    Remember that a delta function centered at ##x=a## is zero everywhere outside this point; this means that in an expression like ##f(x)\delta(x-a)##, ##f(x\neq a)## is superfluous, as it will be multiplied by zero anyway.
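    To see this numerically, here is a small sketch (my own illustration; the width ##\epsilon##, the phase ##\phi## and the test function ##\psi## below are arbitrary choices) that uses a narrow normalized Gaussian as a stand-in for ##\delta(x-a)##: the ##x##-dependent phase factor only enters through its value at ##x=a##.
[code]
import numpy as np

# Arbitrary choices for the illustration (not from the thread).
a, eps = 0.7, 1e-3
x = np.linspace(a - 0.1, a + 0.1, 200001)
dx = x[1] - x[0]

# Narrow normalized Gaussian as a nascent delta function centered at x = a.
delta_eps = np.exp(-(x - a)**2 / (2*eps**2)) / (eps*np.sqrt(2*np.pi))

phi = lambda u: u**2 - a**2           # arbitrary phase with phi(a) = 0
psi = lambda u: np.cos(3*u) + 0.5*u   # arbitrary smooth test function

# Riemann sum of exp(i*phi(x)) * delta_eps(x) * psi(x): approaches psi(a) as eps -> 0,
# i.e. the x-dependent phase factor is absorbed because phi(a) = 0.
approx = np.sum(np.exp(1j*phi(x)) * delta_eps * psi(x)) * dx
print(approx, "vs", psi(a))
[/code]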
     
  14. Sep 9, 2015 #13
    Thanks! That makes sense... However, if ##\delta(x)## is understood as a distribution, then the sifting property should apply only when calculating an integral with the distribution, since that is part of the definition of the ##\delta(x)## distribution. Otherwise, what if ##f(x)## in your example has a discontinuity at some ##x\neq a##? Will ##f(x)\delta(x-a) = f(a)\delta(x-a)## still hold then?

    Anyways, I feel much better now thanks to you. So, thank you for your time!
     
  15. Sep 9, 2015 #14
    Here is something I wanted to point out. Your suggestion that ##f(x)\delta(x-a)=f(a)\delta(x-a)## would imply that $$\frac{d}{dx}\left[f(x)\delta(x-a)\right]=f(a)\delta^\prime(x-a),$$ which is not true for an arbitrary (even smooth) function ##f(x)##. A simple example is ##f(x)=\sqrt{x-a}+1##: $$\frac{d}{dx}\left[\left(\sqrt{x-a}+1\right)\delta(x-a)\right] = \frac{1}{2}(x-a)^{-\frac{1}{2}}\delta(x-a) + \left(\sqrt{x-a}+1\right)\delta^\prime(x-a) \neq \delta^\prime(x-a).$$ I'm sure I can come up with many other examples with smooth, and maybe even infinitely smooth, functions.
     
  16. Sep 9, 2015 #15

    strangerep


    Heh, this is an example of how (when working with distributions) one must keep in mind the particular rigged Hilbert space structure involved. E.g., in your example, f(x) is not a Schwartz function.
     
  17. Sep 9, 2015 #16

    rubi


    Distributions are only defined on smooth functions, so this problem cannot occur.

    Your function is not smooth at ##x=a##. If you really used a smooth function, then these would actually be the same distribution, although it wouldn't be apparent from the expression. If you evaluated them on a test function (a smooth function with compact support), you would get exactly the same result.

    One is allowed to multiply any distribution by more general functions, since the multiplication is defined by ##(hD)(f) = D(hf)##, so we only need to make sure that ##hf## is again a test function (for any test function ##f##). In fact, one is allowed to multiply by any smooth function, but the result might happen to be just a distribution and not a tempered distribution anymore.
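    For example, applied to the phase factor from post #1 (with ##\delta_a## the delta distribution centred at ##a## and ##\varphi## a test function), this definition gives directly
    $$\left(e^{i\phi}\,\delta_a\right)[\varphi] = \delta_a\!\left[e^{i\phi}\varphi\right] = e^{i\phi(a)}\varphi(a) = e^{i\phi(a)}\,\delta_a[\varphi], \qquad \text{i.e.} \qquad e^{i\phi(x)}\delta(x-a) = e^{i\phi(a)}\,\delta(x-a),$$
    since ##e^{i\phi}\varphi## is again a test function whenever ##\phi## is smooth.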
     
    Last edited: Sep 9, 2015
  18. Sep 10, 2015 #17

    Orodruin


    In addition to what has already been said, you cannot just take a function which is not differentiable and act as if its distributional derivative is given by the derivative away from the points where it is not. With this logic, the derivative of the Heaviside function would be zero, which is not the case.

    If you are not sure of how to differentiate something, go back to the definition of the derivative of a distribution: ##f'[\varphi] = - f[\varphi']##.
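    For example, for the Heaviside function ##\theta## this definition gives
    $$\theta'[\varphi] = -\theta[\varphi'] = -\int_0^\infty \varphi'(x)\,\mathrm{d}x = \varphi(0) = \delta[\varphi],$$
    i.e. ##\theta' = \delta## in the sense of distributions, not zero.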
     
  19. Sep 11, 2015 #18

    vanhees71


    Well, let's clarify the issue with the derivative. Of course, all equations have to be read as equations in the sense of distributions. So let's do it carefully.

    First we start with the naive application of the rules and pretend that we can treat the ##\delta## distribution like a function. Of course, as already stated, the function ##f## must be differentiable at ##x=a##. Then we have, in the naive way,
    $$\frac{\mathrm{d}}{\mathrm{d} x} [f(x) \delta(x-a)]=f'(x) \delta(x-a)+f(x) \delta'(x-a)=f'(a) \delta(x-a) + f(x) \delta'(x-a).$$
    Now let's prove that this is justified. To that end we have to apply the distributions on the left and the right side of the equation to a test function (which is, e.g., ##\phi \in C_0^{\infty}(\mathbb{R},\mathbb{C})##). So the left-hand side is
    $$\int_{\mathbb{R}} \mathrm{d}x\, \frac{\mathrm{d}}{\mathrm{d} x} [f(x)\delta(x-a)]\phi(x):=-\int_{\mathbb{R}} \mathrm{d} x\, f(x) \delta(x-a) \phi'(x) = -f(a) \phi'(a).$$
    Now let's evaluate the right-hand side:
    $$\int_{\mathbb{R}} \mathrm{d} x\, [f'(a) \delta(x-a) + f(x) \delta'(x-a)] \phi(x)=f'(a) \phi(a) -\int_{\mathbb{R}} \mathrm{d} x\, \frac{\mathrm{d}}{\mathrm{d} x} [f(x) \phi(x)] \delta(x-a)=f'(a) \phi(a) - f'(a) \phi(a)-f(a) \phi'(a)=-f(a) \phi'(a).$$
    So the above naive calculation is correct.

    The funny thing is that the other suggested way is right too:
    $$\frac{\mathrm{d}}{\mathrm{d} x} [f(x) \delta(x-a)] = \frac{\mathrm{d}}{\mathrm{d} x} [f(a) \delta(x-a)] =f(a) \delta'(x-a),$$
    because evaluating the right-hand side of this, applied to a test function, also gives the right result:
    $$\int_{\mathbb{R}} \mathrm{d} x\, f(a) \delta'(x-a) \phi(x)=-f(a) \phi'(a).$$
    Of course, it is crucial to realize that always
    $$f(x) \delta(x-a)=f(a) \delta(x-a)$$
    but that on the other hand
    $$f(x) \delta'(x-a) \neq f(a) \delta'(x-a).$$
     
  20. Sep 11, 2015 #19

    rubi


    The product rule even holds for all distributions, not just ##\delta##. A proof can be found in Gelfand & Shilov, Generalized Functions, Vol. 1.
     
  21. Sep 11, 2015 #20
    Well, the phase factor ##e^{i\phi(x)}## is not a Schwartz function either. I think that the rest of this conversation (thanks to all contributors) resolves the "problem" with the relation ##f(x)\delta(x-a)=f(a)\delta(x-a)##, which is valid for distributions in general, with the caveat that multiplying a tempered distribution by a smooth function that is not Schwartz may give a distribution that is no longer tempered. The key is the definition of the multiplication given in post #16.
    So, keeping in mind that definition of the multiplication of a distribution by a function, one can multiply the distribution by a function which is not itself a test function (not compactly supported, not Schwartz); as long as its product with every test function is again a test function (smooth with compact support), everything is fine and the relation ##f(x)\delta(x)=f(0)\delta(x)## holds. In that sense, the phase factor ##e^{i\phi(x)}## is smooth but not compactly supported (and not Schwartz), yet it works because its product with any smooth, compactly supported function is again smooth and compactly supported. So, to summarize:
    $$e^{i\phi(x)}\delta(x) = \delta(x),\,\,\,\,\,\,\text{if}\,\,\phi(0)=0$$
    And finally, going back to the original question, I conclude that the above is nice and clean because for both the free-particle and harmonic-oscillator propagators one has ##\phi(x,x^\prime) = 0## when ##x=x^\prime##.

    I hate to be annoying, but what is the deal with the Boltzmann operators (obtained from the propagators by the replacement ##t\rightarrow-i\beta\hbar##), then:
    $$
    \left\langle x\right|\hat{U}(i\beta\hbar)\hat{U}(-i\beta\hbar)\left|x^\prime\right\rangle = \int\limits_{-\infty}^{\infty}dy\left\langle x\right|\hat{U}(i\beta\hbar)\left|y\right\rangle\left\langle y\right|\hat{U}(-i\beta\hbar)\left|x^\prime\right\rangle = \text{(divergent)}
    $$
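    For the free particle, for instance, a sketch with ##\hbar=1## shows why the intermediate integral diverges: the analytic continuation turns the oscillatory exponent into a real one, so the ##y## integral is no longer a Fourier integral,
    $$\int_{-\infty}^{\infty}\mathrm{d}y\,\exp\left[\frac{m}{2\beta}\left((x-y)^2-(y-x^\prime)^2\right)\right]= e^{\frac{m}{2\beta}\left(x^2-x^{\prime 2}\right)}\int_{-\infty}^{\infty}\mathrm{d}y\,e^{\frac{m}{\beta}(x^\prime-x)y},$$
    and ##\int e^{cy}\,\mathrm{d}y## diverges for every real ##c## (including ##c=0##), instead of producing ##2\pi\delta##.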
     
    Last edited: Sep 11, 2015