Product of Dirac delta distributions

SUMMARY

The discussion centers on the mathematical implications of multiplying Dirac delta distributions, specifically questioning the validity of recursion relations derived from such products. Participants reference the definition of the Dirac delta function and its properties, including the integral representation and Gaussian approximation. The consensus acknowledges that while products of distributions are generally undefined, certain manipulations involving limits of delta functions can yield valid results, suggesting a nuanced understanding of distribution theory is necessary.

PREREQUISITES
  • Understanding of Dirac delta function properties
  • Familiarity with distribution theory in mathematics
  • Knowledge of integral calculus and limits
  • Experience with Gaussian functions and their applications
NEXT STEPS
  • Research the properties of distributions and their multiplication
  • Study the implications of the Gaussian approximation of the Dirac delta function
  • Explore advanced topics in functional analysis related to distributions
  • Examine the role of test functions in defining products of distributions
USEFUL FOR

Mathematicians, physicists, and students studying advanced calculus or functional analysis, particularly those interested in distribution theory and its applications in theoretical physics.

friend
I'm told that a product of distributions is undefined. See,

http://en.wikipedia.org/wiki/Distribution_(mathematics)#Problem_of_multiplication

where the Dirac delta function is considered a distribution.

Now the Dirac delta function is defined such that,

\[ \int_{-\infty}^{+\infty} f(x_1)\, \delta(x_1 - x_0)\, dx_1 = f(x_0) \]

for all continuous compactly supported functions f. See,

http://en.wikipedia.org/wiki/Dirac_delta_function

But the question is: can we set \(f(x_1) = \delta(x - x_1)\), in order to get

\[ \int_{-\infty}^{+\infty} \delta(x - x_1)\, \delta(x_1 - x_0)\, dx_1 = \delta(x - x_0), \]

which is a very convenient recursion relation?

But then we are faced with the product of distributions inside the integral. So does the recursion relation actually exist?

We are told that the delta function is not continuous, so it is not allowed to serve as \(f(x_1)\).

Nevertheless, it seems obvious that if we consider the limits of the delta functions individually, then of course the recursion relation is allowed. For if we use the Gaussian form of the delta function, we have

\[ \delta(x - x_1) = \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \]

and

\[ \delta(x_1 - x_0) = \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \]

Then,

\[ \int_{-\infty}^{+\infty} \delta(x - x_1)\, \delta(x_1 - x_0)\, dx_1 = \int_{-\infty}^{+\infty} \left( \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \right) \left( \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \right) dx_1 \]

\[ = \lim_{\Delta_1 \to 0} \int_{-\infty}^{+\infty} \left( \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \right) \delta(x_1 - x_0)\, dx_1 = \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_0)^2/\Delta_1^2} = \delta(x - x_0) \]

For if we let \(\Delta_1\) remain a fixed non-zero number until after the integration, then the Gaussian is a continuous function (rapidly decaying, though not strictly compactly supported) and qualifies as \(f(x_1)\). Or

\[ = \lim_{\Delta_0 \to 0} \int_{-\infty}^{+\infty} \delta(x - x_1) \left( \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \right) dx_1 = \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x - x_0)^2/\Delta_0^2} = \delta(x - x_0) \]

if we let \(\Delta_0\) remain a fixed non-zero number until after the integration, so that \(f(x_1)\) is again a continuous (rapidly decaying) function as before.

Since the result is \(\delta(x - x_0)\) for either order in which we take the limits, does this prove that the limit is valid and the recursion relation holds? Thank you.
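As a numerical sanity check of this claim, here is a small sketch in Python (the grid, widths, and evaluation point are arbitrary choices, not part of the argument): with both regulator widths held non-zero, the \(x_1\)-integral is an ordinary convolution of Gaussians, and the result is a single Gaussian centered at \(x_0\) whose squared widths add.

```python
import numpy as np

# Keep both regulator widths Delta_1, Delta_0 finite and do the x1-integral
# numerically; the result is a Gaussian in (x - x0) whose squared widths add.
def gauss(x, width):
    return np.exp(-x**2 / width**2) / np.sqrt(np.pi * width**2)

d1, d0 = 0.05, 0.03            # fixed non-zero widths (arbitrary choices)
x0, x = 0.40, 0.45             # second peak location and evaluation point

x1 = np.linspace(-5.0, 5.0, 200_001)
dx1 = x1[1] - x1[0]

lhs = np.sum(gauss(x - x1, d1) * gauss(x1 - x0, d0)) * dx1
rhs = gauss(x - x0, np.sqrt(d1**2 + d0**2))    # single Gaussian, widths added

print(lhs, rhs)   # the two numbers agree to high precision
```

Letting either width go to zero afterwards then reproduces the familiar Gaussian family converging to \(\delta(x - x_0)\), consistent with the order-independence described above.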
 
friend said:
I'm told that a product of distributions is undefined.

This claim is usually correct...

It will of course become incorrect if somebody comes up with a definition :smile:

But the question is: can we set \(f(x_1) = \delta(x - x_1)\), in order to get

\[ \int_{-\infty}^{+\infty} \delta(x - x_1)\, \delta(x_1 - x_0)\, dx_1 = \delta(x - x_0), \]

which is a very convenient recursion relation?

But then we are faced with the product of distributions inside the integral. So does the recursion relation actually exist?

In my opinion this is fine. You can get correct results when you calculate like this, and that can also guide you towards a rigorous proof in some situations.

Nevertheless, it seems obvious that if we consider the limits of the delta functions individually, then of course the recursion relation is allowed.

It seems to be a common phenomenon in human behavior that the more unclear, ambiguous, and uncertain some claim is, the more likely a human is to emphasize its obviousness :-p

For if we use the Gaussian form of the delta function, we have

\[ \delta(x - x_1) = \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \]

and

\[ \delta(x_1 - x_0) = \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \]

Then,

\[ \int_{-\infty}^{+\infty} \delta(x - x_1)\, \delta(x_1 - x_0)\, dx_1 = \int_{-\infty}^{+\infty} \left( \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \right) \left( \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \right) dx_1 \]

\[ = \lim_{\Delta_1 \to 0} \int_{-\infty}^{+\infty} \left( \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \right) \delta(x_1 - x_0)\, dx_1 = \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_0)^2/\Delta_1^2} = \delta(x - x_0) \]

For if we let \(\Delta_1\) remain a fixed non-zero number until after the integration, then the Gaussian is a continuous function (rapidly decaying, though not strictly compactly supported) and qualifies as \(f(x_1)\). Or

\[ = \lim_{\Delta_0 \to 0} \int_{-\infty}^{+\infty} \delta(x - x_1) \left( \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \right) dx_1 = \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x - x_0)^2/\Delta_0^2} = \delta(x - x_0) \]

if we let \(\Delta_0\) remain a fixed non-zero number until after the integration, so that \(f(x_1)\) is again a continuous (rapidly decaying) function as before.

Since the result is \(\delta(x - x_0)\) for either order in which we take the limits, does this prove that the limit is valid and the recursion relation holds? Thank you.

I hope you understand that when you write the equality signs "=" like that, it is not really an equality in the sense that you have numbers on the left and right sides which are the same.

For example, if I define a function \(\delta_{\Delta_0}(x)\) like this:

\[ \delta_{\Delta_0}(x) = \lim_{\Delta_1 \to 0^+} \int_{-\infty}^{\infty} \left( \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \right) \left( \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-x_1^2/\Delta_0^2} \right) dx_1 \]

then the following is true:

\[ \lim_{\Delta_0 \to 0^+} \int_{-\infty}^{\infty} \delta_{\Delta_0}(x - x_0)\, f(x_0)\, dx_0 = f(x) \]

Unlike your heuristic equations, these two equations which I wrote are actually real equations, with equal numbers on the left and right sides. If you understand when an equation is heuristic and when it is a real one, then IMO you are fine.
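For concreteness, the inner limit can be carried out explicitly (an added step, using the fact that the \(\Delta_1\)-Gaussian acts as an approximate identity on the fixed, continuous \(\Delta_0\)-Gaussian):

\[ \delta_{\Delta_0}(x) = \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-x^2/\Delta_0^2}, \]

so the second equation is just the standard statement that this Gaussian family converges to the delta distribution as \(\Delta_0 \to 0^+\).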

Now when I started to think of this...

Suppose \(\delta^{n}_{x_0}\) is defined as a mapping \(C_0(\mathbb{R}^n) \to \mathbb{C}\), \(f \mapsto f(x_0)\). Wouldn't it make sense to define a product of \(\delta^{n}_{x_0}\) and \(\delta^{m}_{x_1}\) simply as

\[ \delta^{n}_{x_0}\, \delta^{m}_{x_1} := \delta^{n+m}_{(x_0, x_1)}, \]

which is a mapping \(C_0(\mathbb{R}^{n+m}) \to \mathbb{C}\), \(f \mapsto f(x_0, x_1)\)? Can anyone say what would be a problem with this?

It could be that one problem is that the definition is not particularly useful, but on the other hand I've been left slightly sceptical about the usefulness of distributions anyway... and repeating the sentence "a product of distributions does not exist" is not very useful either.
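In code, this proposal is just point evaluation (a minimal sketch; the names `delta` and `delta_product` are made up for illustration):

```python
# Deltas as point-evaluation functionals on test functions.
def delta(x0):
    """delta_{x0}: maps a test function f to the number f(x0)."""
    return lambda f: f(x0)

def delta_product(x0, x1):
    """The proposed product delta_{x0} * delta_{x1} := delta_{(x0, x1)},
    a functional on test functions of the combined variables."""
    return lambda f: f(x0, x1)

print(delta(2.0)(lambda x: x**2))                  # 4.0
print(delta_product(2.0, 3.0)(lambda x, y: x * y)) # 6.0
```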
 
friend said:
I'm told that a product of distributions is undefined.
The difference here is that you're not really multiplying them -- this is more like a tensor product.

Given any two univariate distributions f and g, the expression \(f(x)\, g(y)\) makes sense because they are distributional in different variables, and its defining property is that

\[ \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} f(x)\, g(y)\, \varphi(x)\, \psi(y)\, dx\, dy = \int_{-\infty}^{+\infty} f(x)\, \varphi(x)\, dx \int_{-\infty}^{+\infty} g(y)\, \psi(y)\, dy \]

(any bivariate test function is a limit of sums of products of univariate test functions)


There's another subtlety here. Normally, \(\delta(x - y)\, \delta(x - z)\) would only make sense used in a double integral, so it's a bit of good fortune that we can express it as an iterated integral as you did!
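To spell that out with a concrete instance of the tensor-product reading (an added example): acting on a product of univariate test functions,

\[ \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} \delta(x - y)\, \delta(x - z)\, \varphi(y)\, \psi(z)\, dy\, dz = \varphi(x)\, \psi(x). \]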
 
jostpuur said:
Now when I started to think of this...

Suppose \(\delta^{n}_{x_0}\) is defined as a mapping \(C_0(\mathbb{R}^n) \to \mathbb{C}\), \(f \mapsto f(x_0)\). Wouldn't it make sense to define a product of \(\delta^{n}_{x_0}\) and \(\delta^{m}_{x_1}\) simply as

\[ \delta^{n}_{x_0}\, \delta^{m}_{x_1} := \delta^{n+m}_{(x_0, x_1)}, \]

which is a mapping \(C_0(\mathbb{R}^{n+m}) \to \mathbb{C}\), \(f \mapsto f(x_0, x_1)\)? Can anyone say what would be a problem with this?

Okay, this has some problems in it. It works for situations like

\[ \delta(x - x_0)\, \delta(y - y_0)\, dx\, dy \]

but not for situations like

\[ \delta(x - y)\, \delta(y - y_0)\, dx\, dy \]
 
However, Hurkyl, could we do this?

Given distributions S and T with

\[ g\!\left( \frac{x}{\epsilon} \right) = S(x) \quad \text{and} \quad h\!\left( \frac{x}{\epsilon} \right) = T(x) \]

in the limit as \(\epsilon\) tends to infinity, my idea is to define the product of distributions, with respect to a certain analytic test function \(\phi(x)\), to be

\[ (ST, \phi) = \left( g\!\left( \tfrac{x}{\epsilon} \right) T,\ \phi \right) + \left( S\, h\!\left( \tfrac{x}{\epsilon} \right),\ \phi \right) \]
 
Why would there exist a test function g with the property that

\[ \lim_{y \to +\infty} g\!\left( \frac{x}{y} \right) = S(x)\,? \]

I think that might even require S to be a constant.

But even if it does exist, can you show that your definition of the product doesn't depend on your choice of g and h? That's the real killer for multiplying distributions.


Every distribution is a limit of test functions; i.e.

\[ S(x) = \lim_{n \to +\infty} g_n(x). \]

Similarly, we can write T(x) as a limit of \(h_n(x)\). The limit of \(g_n(x)\, h_n(x)\) (if it exists) is going to be a distribution -- but that depends crucially on your choice of g and h: it is not determined simply by S and T.

Here are four interesting sequences of functions that converge to the delta function. (They aren't test functions, but it's easy to smooth out these examples)

  • \( r_n(x) = \begin{cases} n & x \in \left[ -\frac{1}{2n}, \frac{1}{2n} \right] \\ 0 & \text{otherwise} \end{cases} \)
  • \( s_n(x) = \begin{cases} n & x \in \left[ 0, \frac{1}{n} \right] \\ 0 & \text{otherwise} \end{cases} \)
  • \( t_n(x) = \begin{cases} 2n & x \in \left[ -\frac{1}{2n}, 0 \right] \\ 0 & \text{otherwise} \end{cases} \)
  • \( u_n(x) = \begin{cases} n & x \in \left[ -\frac{1}{n}, -\frac{1}{2n} \right] \cup \left[ \frac{1}{2n}, \frac{1}{n} \right] \\ 0 & \text{otherwise} \end{cases} \)

What do the various products of these sequences converge to?
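For concreteness, here is a numerical look at two of these products (a sketch; the step functions are sampled on a uniform grid, so exact endpoint values don't matter):

```python
import numpy as np

# Two delta sequences whose pointwise products behave completely differently:
# r_n * s_n blows up, while r_n * u_n vanishes (essentially disjoint supports).
def r(n, x):   # n on [-1/(2n), 1/(2n)]
    return np.where((x >= -0.5 / n) & (x <= 0.5 / n), float(n), 0.0)

def s(n, x):   # n on [0, 1/n]
    return np.where((x >= 0.0) & (x <= 1.0 / n), float(n), 0.0)

def u(n, x):   # n on [-1/n, -1/(2n)] and [1/(2n), 1/n]
    left = (x >= -1.0 / n) & (x <= -0.5 / n)
    right = (x >= 0.5 / n) & (x <= 1.0 / n)
    return np.where(left | right, float(n), 0.0)

x = np.linspace(-1.0, 1.0, 2_000_001)
dx = x[1] - x[0]
for n in (10, 100, 1000):
    print(n,
          np.sum(r(n, x) * s(n, x)) * dx,   # ~ n/2: diverges with n
          np.sum(r(n, x) * u(n, x)) * dx)   # ~ 0: supports barely overlap
```

So \(\int r_n s_n\, dx \approx n/2\) while \(\int r_n u_n\, dx = 0\): the limit of the product is not determined by the delta function alone.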
 
Hurkyl said:
Here are four interesting sequences of functions that converge to the delta function. (They aren't test functions, but it's easy to smooth out these examples)

  • \( r_n(x) = \begin{cases} n & x \in \left[ -\frac{1}{2n}, \frac{1}{2n} \right] \\ 0 & \text{otherwise} \end{cases} \)
  • \( s_n(x) = \begin{cases} n & x \in \left[ 0, \frac{1}{n} \right] \\ 0 & \text{otherwise} \end{cases} \)
  • \( t_n(x) = \begin{cases} 2n & x \in \left[ -\frac{1}{2n}, 0 \right] \\ 0 & \text{otherwise} \end{cases} \)
  • \( u_n(x) = \begin{cases} n & x \in \left[ -\frac{1}{n}, -\frac{1}{2n} \right] \cup \left[ \frac{1}{2n}, \frac{1}{n} \right] \\ 0 & \text{otherwise} \end{cases} \)

What do the various products of these sequences converge to?

It may depend on which limit you take first. It seems there are three separate limits involved in taking the product and then integrating. Do we take the limit of one of the sequences first, then do the limits involved in the integration, then do the limit of the other sequence?

And there are situations in which it matters which limit you take first. For example, consider the following:

\[ \lim_{y \to 0}\, \lim_{x \to 0}\, \frac{x - y}{x + y} = -1, \qquad \lim_{x \to 0}\, \lim_{y \to 0}\, \frac{x - y}{x + y} = +1. \]

It matters which limit we do first: if we take the limit as x approaches zero first, leaving y a fixed non-zero value, the result is -1; but if we take the limit as y approaches zero first, we get +1. So here is an example of an undefined limiting process. But I think that if it doesn't matter which limit you do first, because you get the same result either way, then the limiting process is defined. Does this sound right? Have you seen anything in functional analysis that considers more than one limiting process and gives rules for which limit is done first?
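This order dependence is easy to check symbolically (a quick illustration, assuming SymPy is available):

```python
import sympy as sp

x, y = sp.symbols('x y')
expr = (x - y) / (x + y)

# x -> 0 first (y held fixed), then y -> 0: gives -1
print(sp.limit(sp.limit(expr, x, 0), y, 0))
# y -> 0 first (x held fixed), then x -> 0: gives +1
print(sp.limit(sp.limit(expr, y, 0), x, 0))
```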

And it does seem that with the Dirac delta there are limiting processes that are done before others. Part of the definition of the Dirac delta is that it integrates to 1 no matter what value the width parameter (which goes to zero) takes. So here the integration limit is taken first, before considering the other.
 
The \(\int\) symbol here isn't an integral. At least, it isn't like what you learned in elementary calculus. When used here, it's just a symbol denoting the evaluation of a distribution at a test function... \(\int\) is used as a suggestive analogy, and also because when the arguments are both test functions, it does turn out to give the same answers as ordinary integration.

Other notations for this operation include:
  • Functional notation: something like \(\delta[\varphi] = \varphi(0)\), or maybe even \(\delta(\varphi) = \varphi(0)\)
  • Matrix-like notation: we would just write \(\delta \varphi = \varphi(0)\)
  • Inner product notation: \((\delta, \varphi) = \varphi(0)\)
  • Bra-ket notation: \(\langle \delta | \varphi \rangle = \varphi(0)\)

In any case, this operation is jointly continuous in both of its arguments. In inner-product-like notation:

\[ \lim_{n \to \infty} (S_n, \varphi_n) = \left( \lim_{n \to \infty} S_n,\ \lim_{n \to \infty} \varphi_n \right) \]


In integral-like notation, where we write a distribution as a limit of test functions (really, as a limit of the distributions those test functions represent), this becomes the "always take the integral first" rule:

\[ \int_{-\infty}^{+\infty} S(x)\, \varphi(x)\, dx = \int_{-\infty}^{+\infty} \left( \lim_{n \to \infty} \hat{s}_n(x) \right) \varphi(x)\, dx = \lim_{n \to \infty} \int_{-\infty}^{+\infty} \hat{s}_n(x)\, \varphi(x)\, dx = \lim_{n \to \infty} \int_{-\infty}^{+\infty} s_n(x)\, \varphi(x)\, dx \]

where the last integrand is a distribution corresponding to a test function, evaluated at a test function, and so can be computed as an ordinary Riemann integral.

I've added an extra feature to the above calculation: I put a hat (^) over the test function when I'm treating it as a distribution, so you can see more clearly where distributional things are happening and where ordinary calculus is happening.
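As a small numerical illustration of the "take the integral first" rule (a sketch; the Gaussian sequence and test function are arbitrary choices):

```python
import numpy as np

# With s_n an ordinary function, the pairing is a plain Riemann integral;
# the distributional limit is taken only after integrating.
x = np.linspace(-10.0, 10.0, 400_001)
dx = x[1] - x[0]
phi = np.cos(x) * np.exp(-x**2 / 50.0)   # a smooth, rapidly decaying test function

for n in (1, 10, 100, 1000):
    s_n = np.sqrt(n / np.pi) * np.exp(-n * x**2)   # integrates to 1, narrows with n
    print(n, np.sum(s_n * phi) * dx)               # -> phi(0) = 1
```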
 
Hurkyl said:
Other notations for this operation include:
  • Functional notation: something like \(\delta[\varphi] = \varphi(0)\), or maybe even \(\delta(\varphi) = \varphi(0)\)
  • Matrix-like notation: we would just write \(\delta \varphi = \varphi(0)\)
  • Inner product notation: \((\delta, \varphi) = \varphi(0)\)
  • Bra-ket notation: \(\langle \delta | \varphi \rangle = \varphi(0)\)

However, I think the details of actually doing the calculation would be exactly the integration process. The functional would equate to an integral equation as before. The matrix form would require multiplying and adding components, which would look exactly like integration. And the inner product and bra-ket notations are just notational differences.

Hurkyl said:
In any case, this operation is jointly continuous in both of its arguments. In inner-product-like notation:

\[ \lim_{n \to \infty} (S_n, \varphi_n) = \left( \lim_{n \to \infty} S_n,\ \lim_{n \to \infty} \varphi_n \right) \]

What is "jointly continuous in both of its arguments"? Are you sure this shouldn't be two individual limiting processes, \(\lim_{n \to \infty}\) and \(\lim_{m \to \infty}\), so that you'd get

\[ \left( \lim_{n \to \infty} S_n,\ \lim_{m \to \infty} \varphi_m \right)? \]

Hurkyl said:
In integral-like notation, where we write a distribution as a limit of test functions (really, as a limit of the distributions those test functions represent), this becomes the "always take the integral first" rule:

\[ \int_{-\infty}^{+\infty} S(x)\, \varphi(x)\, dx = \int_{-\infty}^{+\infty} \left( \lim_{n \to \infty} \hat{s}_n(x) \right) \varphi(x)\, dx = \lim_{n \to \infty} \int_{-\infty}^{+\infty} \hat{s}_n(x)\, \varphi(x)\, dx = \lim_{n \to \infty} \int_{-\infty}^{+\infty} s_n(x)\, \varphi(x)\, dx \]

Yes, I suppose it would not make sense to integrate after you take the limit of the delta function, for then the area under the curve would not be 1 as required.


But my broader question has to do with the path integral. Some say that the measure of the path integral is not defined, but I'm still not sure what they mean. I think it has to do with the product of distributions.

What can "not defined" mean if not that the evaluation could have more than one value or is infinite? So I think the problem may be one of competing limits: which one you do first may result in different answers. I've not yet seen such competing-limit concerns in any of the functional analysis books I've browsed through. I searched the Web for "multiple limit processes" and found a few webpages that acknowledge the problem without giving any guidance. There also seem to be references to Advanced Calculus books that may have more information. Maybe you've seen this issue addressed in some book somewhere.

It is important to me that this issue is addressed. In fact, EVERYTHING depends on it. For it seems the path integral of physics, and perhaps all of physics, can be derived from this recursion relation of the Dirac delta function, if only it is valid. I can easily show this here if there is interest.

It seems that this problem of the measure of the path integral probably came about because the path integral was derived from physics concepts. But I've come to the path integral from a purely mathematical perspective. Assuming the recursion relation of the delta holds, the path integral measure problem might be resolved by resolving the product-of-distributions problem.

I think I've shown that the integral of the product of two delta functions results in the same answer no matter which limit is done first (see the original post). Have I actually solved the product-of-distributions problem (and by extension the path integral measure problem) by addressing the competing limits involved?
 
  • #10
friend said:
Then,

\[ \int_{-\infty}^{+\infty} \delta(x - x_1)\, \delta(x_1 - x_0)\, dx_1 = \int_{-\infty}^{+\infty} \left( \lim_{\Delta_1 \to 0} \frac{1}{(\pi \Delta_1^2)^{1/2}}\, e^{-(x - x_1)^2/\Delta_1^2} \right) \left( \lim_{\Delta_0 \to 0} \frac{1}{(\pi \Delta_0^2)^{1/2}}\, e^{-(x_1 - x_0)^2/\Delta_0^2} \right) dx_1 \]

I found a similar equation in "The Feynman Integral and Feynman's Operational Calculus" by Gerald W. Johnson and Michel L. Lapidus, page 37, called a Chapman-Kolmogorov equation:

\[ \int_{-\infty}^{+\infty} \left( \frac{\lambda}{2\pi (t - s)} \right)^{1/2} e^{-\frac{\lambda (\omega - \upsilon)^2}{2(t - s)}} \left( \frac{\lambda}{2\pi (s - r)} \right)^{1/2} e^{-\frac{\lambda (\upsilon - u)^2}{2(s - r)}}\, d\upsilon = \left( \frac{\lambda}{2\pi (t - r)} \right)^{1/2} e^{-\frac{\lambda (\omega - u)^2}{2(t - r)}} \]

This equation does not involve limits, but it is easy to see that taking the limits \(t - s \to 0\) and \(s - r \to 0\) would lead to the Gaussian form of the Dirac delta function. The book does not spell out how this equation is derived. Does anyone know how they got it?
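One standard route (a sketch added here, not from the book) is completing the square together with the Gaussian integral \(\int_{-\infty}^{+\infty} e^{-a v^2}\, dv = \sqrt{\pi/a}\): for \(a, b > 0\),

\[ \int_{-\infty}^{+\infty} e^{-a(\omega - \upsilon)^2}\, e^{-b(\upsilon - u)^2}\, d\upsilon = \sqrt{\frac{\pi}{a + b}}\; e^{-\frac{ab}{a + b}(\omega - u)^2}. \]

Setting \(a = \frac{\lambda}{2(t - s)}\) and \(b = \frac{\lambda}{2(s - r)}\) gives \(\frac{ab}{a + b} = \frac{\lambda}{2(t - r)}\), and the prefactors combine to \(\left( \frac{\lambda}{2\pi (t - r)} \right)^{1/2}\), which is exactly the displayed equation.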

This equation is also confirmed in the book review at:

http://books.google.com/books?id=yp...n-kolmogorov equation Brownian motion&f=false
 
  • #11
friend said:
It is important to me that this issue is addressed. In fact, EVERYTHING depends on it. For it seems the path integral of physics, and perhaps all of physics, can be derived from this recursion relation of the Dirac delta function, if only it is valid. I can easily show this here if there is interest.

I found this equation in the book "Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets" by Hagen Kleinert, page 91. You can also see it at:

http://users.physik.fu-berlin.de/~kleinert/public_html/kleiner_reb3/psfiles/pthic04.pdf

It shows how a quantum transition amplitude can be interpreted as a Dirac delta function equal to the integral of a great number of products of delta functions.

\[ \left( x_b t_b | x_a t_a \right) = \prod_{n = 1}^{N} \left[ \int_{-\infty}^{+\infty} dx_n \right] \prod_{n = 1}^{N + 1} \left\langle x_n | x_{n - 1} \right\rangle = \prod_{n = 1}^{N} \left[ \int_{-\infty}^{+\infty} dx_n \right] \prod_{n = 1}^{N + 1} \delta\!\left( x_n - x_{n - 1} \right) = \delta\!\left( x_b - x_a \right) \]

The last two equalities can be obtained by iterating a recursion relation for the Dirac delta function. So you can see here that QM can be derived from this recursion relation, assuming it is valid.
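To see the iteration concretely, here is a numerical sketch (the grid, regulator width, and number of steps are arbitrary choices) chaining regularized delta kernels by repeated integration over the intermediate points; the squared widths add at each step, and the chain remains a single Gaussian centered at the starting point:

```python
import numpy as np

# Chain N+1 Gaussian kernels by integrating over intermediate points x_n.
x = np.linspace(-4.0, 4.0, 1601)
dx = x[1] - x[0]
w = 0.1                                    # regulator width of each kernel

# K[i, j] ~ regularized delta(x_i - x_j)
K = np.exp(-(x[:, None] - x[None, :])**2 / w**2) / np.sqrt(np.pi * w**2)

chain = K.copy()
N = 9
for _ in range(N):
    chain = chain @ K * dx                 # one more intermediate integration

# After N+1 kernels the chain is a single Gaussian of width sqrt(N+1)*w.
mid = len(x) // 2                          # the row corresponding to x_a = 0
predicted = np.exp(-x**2 / ((N + 1) * w**2)) / np.sqrt(np.pi * (N + 1) * w**2)
print(np.max(np.abs(chain[mid] - predicted)))    # small
```

Sending \(w \to 0\) after all the integrations then reproduces \(\delta(x_b - x_a)\), which is the content of the displayed identity.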
 
  • #12
Why is the product undefined?

Using the "convolution theorem", I can get the product of two Dirac delta functions

\[ D^{m}\delta(u)\, D^{n}\delta(u) \]

as the Fourier transform of the convolution of the two functions,

\[ A\, (x^{m} * x^{n}), \]

so this convolution would define the product. Here A is a constant that can be a real or pure imaginary number.
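For reference (an added remark, using the convention \(\mathcal{F}[f](k) = \int f(x)\, e^{-ikx}\, dx\), under which pointwise products map to convolutions up to a constant): since \(\mathcal{F}[D^{m}\delta](k) = (ik)^{m}\), the convolution that would have to define this product is

\[ (x^{m} * x^{n})(k) = \int_{-\infty}^{+\infty} s^{m}\, (k - s)^{n}\, ds, \]

which diverges for every \(k\), so this route runs into the same obstruction.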
 
