
Continuous signals as sums of weighted delta functions

  1. Feb 5, 2017 #1
    So, continuous signals can be represented as sums of weighted delta functions like this:
    $$ x(t) = \int_{-\infty}^{\infty} x(\tau)\, \delta(t-\tau)\, d\tau $$
    If you switch the order of the variables in the argument you get ∫x(τ)δ(-τ+t)dτ, and since, I presume, the Dirac delta "function" is even, I can write it as ∫x(τ)δ(-(-τ+t))dτ = ∫x(τ)δ(τ-t)dτ = x(t), and we've got ourselves the "sifting property". Can we visualize it like this:
    x(τ) is a function in the "τ domain": for every τ there is a value x(τ), and every x(τ) value represents some function x(t) which can be "extracted" using δ(τ-t). Or, generally speaking, the function x(τ) in the τ domain represents some "family of functions" x(t), and by using the sifting property described earlier we can sift one particular function x(t) out of that family?
    If this is right, can the described sifting property be some kind of "transformation" like the Laplace one (you go from one domain to another, do some things, and go back to the original domain)?

    thanks!
     
  3. Feb 8, 2017 #2

    jasonRF

    Science Advisor
    Gold Member

    Your variable manipulations make no difference to the meaning or interpretation of your expression. With your notation
    $$ x(t) = \int_{-\infty}^\infty x(\tau) \delta(t-\tau)\, d\tau \, = \, \int_{-\infty}^\infty x(\tau) \delta(\tau-t)\, d\tau$$
    is the definition of the delta function. It says that convolving a function ##x## with the delta function gives you the same function ##x##.
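
    Just to make that concrete, here is a quick numerical sanity check (only a sketch: it approximates the delta with a narrow normalized Gaussian and the integral with a Riemann sum; the test signal x and the width eps are arbitrary choices):

    Code (Python):
    import numpy as np

    # grid over tau, and a narrow normalized Gaussian standing in for the delta "function"
    tau = np.linspace(-10, 10, 20001)
    dtau = tau[1] - tau[0]
    eps = 0.01  # width of the approximate delta

    def x(s):
        # arbitrary smooth test signal
        return np.exp(-s**2) * np.cos(3 * s)

    def delta_eps(s):
        # normalized Gaussian of standard deviation eps; tends to the Dirac delta as eps -> 0
        return np.exp(-(s / eps)**2 / 2) / (eps * np.sqrt(2 * np.pi))

    # approximate x(t) = integral of x(tau) * delta(t - tau) dtau at a few values of t
    for t in [-1.0, 0.0, 0.5, 2.0]:
        approx = np.sum(x(tau) * delta_eps(t - tau)) * dtau  # Riemann sum
        print(t, approx, x(t))  # the two numbers agree closely

    The smaller you make eps (with a correspondingly finer grid), the closer this gets to returning x(t) exactly.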

    Nope. There is no difference between ##x(\tau)## and ##x(t)##; they are both the same function. The argument is a dummy variable.

    Jason
     
  4. Feb 9, 2017 #3
    So, basically, the sifting property $$ \int_{-\infty}^\infty x(\tau) \delta(\tau-a)\, d\tau = x(a) $$ would give us one particular value ##x(a)## of the function, since ##a## is a constant, but
    $$ \int_{-\infty}^\infty x(\tau) \delta(\tau-t)\, d\tau = x(t) $$ is like picking out every value of the function (for every ##t##, since ##t## isn't constant [##t## = time]) using the Dirac impulse ("walking the Dirac impulse underneath the function" from -inf to +inf), and at the same time summing all of the infinitely narrow weighted Dirac impulses (integration) gives you the function ##x(t)##? So that's the idea behind convolution? And the dummy variable gives you the freedom to step away from the variable ##t## for a moment and the opportunity to manipulate (slide) functions at will? Okay, I think I understand to some extent the significance and meaning of recreating a function using Dirac delta impulses, but what is the significance behind the convolution of any two randomly chosen signals (functions)? I don't understand what convolution, even as a plain English word, really means, so maybe it would be helpful if someone gave me a proper non-mathematical definition (with some analogies) of convolution for a moment. Thanks!
     
  5. Feb 9, 2017 #4

    FactChecker

    Science Advisor
    Gold Member

    This just uses the delta function to pick out the single value of the function ##x## at ##a## while ignoring all other values of ##x##. Your rewritten version is such a degenerate example of convolution that I am not sure it illustrates anything significant about convolution in general. (Maybe I just don't have enough imagination to see more significance.)

    I wouldn't over-think it if I were you.
     
  6. Feb 9, 2017 #5

    jasonRF

    Science Advisor
    Gold Member

    I'm sorry I used the concept of convolution to answer your question; it was not the best way to explain what you are trying to learn and it is causing more confusion than help.

    In all of the equations above, just think of ##x(t)## as the value of the function ##x## at a particular ##t##. Indeed, that is how a mathematician would typically interpret the notation. So all of the integrals above are simply examples of the sifting property of the delta function.

    Jason
     
  7. Feb 9, 2017 #6
    Well, for me it illustrates an understanding of how it works. Instead of the Dirac function, write any function and you get the generalised convolution integral: instead of $$ \int_{-\infty}^\infty x(\tau) \delta(t-\tau)\, d\tau = x(t), $$ write $$ \int_{-\infty}^\infty x(\tau)\, g(t-\tau)\, d\tau = (x*g)(t) $$ where ##g## can be any function (including the Dirac delta "function"). Based on the Wikipedia illustration of convolution I think my version makes at least some sense: you choose one function, ##x(\tau)##, to be "fixed" and you choose the other one, ##g(t-\tau)##, to "walk" from -inf to +inf.
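
    In discrete time the same two cases can be checked in a couple of lines with numpy.convolve (again only a sketch; the signal x, the kernel g and the unit impulse below are made-up examples):

    Code (Python):
    import numpy as np

    x = np.array([1.0, 3.0, -2.0, 0.5, 4.0])  # arbitrary signal
    delta = np.array([1.0])                   # discrete unit impulse
    g = np.array([0.25, 0.5, 0.25])           # arbitrary kernel (a little smoothing filter)

    # convolving with the impulse gives the signal back unchanged (the identity)
    print(np.convolve(x, delta))  # [ 1.   3.  -2.   0.5  4. ]

    # convolving with a general g slides the flipped kernel along x and sums the
    # overlapping products at every shift
    print(np.convolve(x, g))      # length len(x) + len(g) - 1 = 7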
     
  8. Feb 9, 2017 #7

    FactChecker

    Science Advisor
    Gold Member

    Yes, that is a correct understanding of convolution (although saying that one function is "fixed" may be misleading). And the delta function is the identity element for convolution: ##f*\delta = \delta*f = f##.
     
    Last edited: Feb 9, 2017
  9. Feb 9, 2017 #8
    Nice! What is the idea behind taking the convolution of two functions? What is convolution, generally speaking (in plain English)? Thanks!
     
  10. Feb 9, 2017 #9

    FactChecker

    Science Advisor
    Gold Member

    An application of convolution that I like to think of is when you have a sequence of two operations, ##f## and ##g##, done in series. The operations ##f## and ##g## have different responses (delays, gains, etc.) to an input at delayed times ##\tau##.

    Here is a rough, informal description of what I mean. Suppose an input signal ##I(t)## is sent into operation ##f## and then into ##g## to get the output signal ##O(t)##: I => f => g => O.
    Then the output at time ##t## is the integral of all the effects of ##f## and ##g## where the total time through is ##t##. So at any time ##t## you want the cumulative response of ##f## and ##g## where the delays add up to ##t##: ##t = \tau + (t-\tau)##. That is, you want to determine the combined effect of ##f## and ##g##: $$C(t) = \int f(\tau)\, g(t-\tau)\, d\tau.$$ Then ##O = C * I##.

    PS. This could have been a simpler example:
    Consider I => f => O. The output ##O(t_O)## at time ##t_O## is the cumulative response of ##f## to the input ##I## over all combinations of input time ##t_I## and delay time ##\tau_f## that combine to ##t_O##, that is, all ##\tau_f## where ##t_O = t_I + \tau_f##. The cumulative response is $$O(t_O) = \int f(\tau_f)\, I(t_O - \tau_f)\, d\tau_f.$$
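
    Here is a discrete sketch of that cascade idea (the input I and the impulse responses f and g below are made-up numbers, and numpy.convolve stands in for the integrals): sending the input through f and then through g gives the same output as convolving the input once with the combined response C = f*g.

    Code (Python):
    import numpy as np

    I = np.array([1.0, 0.0, 2.0, -1.0, 0.5])  # arbitrary input signal
    f = np.array([0.5, 0.3, 0.2])             # impulse response of the first operation
    g = np.array([1.0, -0.5])                 # impulse response of the second operation

    # send the input through f, then through g
    O_series = np.convolve(np.convolve(I, f), g)

    # or combine the two operations first, C = f*g, and apply that to the input
    C = np.convolve(f, g)
    O_combined = np.convolve(I, C)

    print(np.allclose(O_series, O_combined))  # True: convolution is associative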
     
    Last edited: Feb 9, 2017