
Functional Derivatives

  1. Jan 31, 2015 #1

    BiGyElLoWhAt

    User Avatar
    Gold Member

    If I understand what's going on (quite possibly I don't), I think my book is using bad (confusing) notation.
    1. The problem statement, all variables and given/known data
    As written: "Calculate ##\frac{\delta H[f]}{\delta f(z)} \ \text{where} \ H=\int G(x,y)f(y)dy##"

    and ##\frac{\delta H[f]}{\delta f(z)}## is the functional derivative of H[f] with respect to f(y) at z (I think that's what it is).

    2. Relevant equations
    ...

    3. The attempt at a solution
    So, if I understand correctly, what's happening here is that we're concerned with how the value of H[f] changes with f(y) when y = z.

    This gives rise to (changing notation slightly) ##\frac{dF[f]}{d f(y)} = \lim_{\epsilon \to 0} \frac{F[f(y) + \epsilon \delta (x-y)] - F[f(y)] }{\epsilon}##, where ##\delta(x-y)## is the Dirac delta function centered at ##y = x## (I believe this is so, please correct me; my book is not explicit). So this gives us our F[f + epsilon] - F[f] when y = x, and F[f] - F[f] = 0 when y doesn't equal x. If this is right, going through the example:

    ##\frac{d H[f]}{d f(z)} ##
    ##= \lim_{\epsilon \to 0}\frac{\int\left[G(x,y)\big(f(z) + \epsilon \delta (x-y)\big) - G(x,y)f(z)\right]dy}{\epsilon}##
    ##= \left\{
    \begin{array}{ll}
    -\int G(x,y)\,dy & \text{if } y=z \\
    0 & \text{otherwise}
    \end{array} \right.##

    I tried to show my train of thought; please correct any misconceptions I have about this. I would also appreciate it if someone would check my answers. Apparently my book doesn't have the answers to its exercises.
     
  3. Feb 1, 2015 #2

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    Your y is a dummy variable and your answer cannot depend on it. The functional ##F[f]## itself does not take ##f(x)## as an argument (i.e., the value of a function at a point); it takes ##f##, the function, as an argument. It is therefore more instructive to write ##\delta(x-z) \equiv \delta_z(x)## and give ##f + \epsilon \delta_z## as an argument to the functional. The definition of the functional derivative would then be
    $$
    \frac{\delta F}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{F[f+\epsilon \delta_z] - F[f]}{\epsilon}.
    $$
    You will obtain
    $$
    F[f +\epsilon \delta_z] = \int G(x,y) (f(y) + \epsilon \delta_z(y)) dy.
    $$
    I will let you take it from there, but in the end it is not much stranger than having a sum
    $$
    S = \sum_m G_m x_m
    $$
    and asking for ##\partial S/\partial x_k##.
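
    For that discrete analogue, the corresponding computation is short (a sketch, using only the notation already introduced above, with ##\delta_{mk}## denoting the Kronecker delta):
    $$
    \frac{\partial S}{\partial x_k} = \sum_m G_m \frac{\partial x_m}{\partial x_k} = \sum_m G_m \delta_{mk} = G_k.
    $$
    The Dirac delta plays the same role in the continuous case that the Kronecker delta plays here.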
     
  4. Feb 1, 2015 #3

    BiGyElLoWhAt

    User Avatar
    Gold Member

    I guess I'm missing what this delta is. Is it not the dirac delta? If so what is it?
     
  5. Feb 1, 2015 #4

    BiGyElLoWhAt

    User Avatar
    Gold Member

    Is it perhaps just to represent a 2 dimensional epsilon? The product of epsilon and delta, that is. So small changes in f over a range of small changes in y?
     
  6. Feb 1, 2015 #5

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper

    What is the definition of ##\delta H(f) / \delta f## that your book uses?
     
  7. Feb 1, 2015 #6

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    It is the Dirac delta. But it must be seen as a function of an argument, just as ##f## is a function of an argument, and the functional takes functions as arguments, not function values. When you write ##\delta_z(x)## instead of ##\delta(z-x)##, it simply becomes more evident which of the variables should be used as the function parameter (not that it matters in this case, since the delta is symmetric). The value ##F[f+\epsilon \delta_z]## is simply the value of the functional when its argument is the function ##f + \epsilon\delta_z##. Now, this function has a parameter ##z##, which is the reason that the resulting functional derivative is a function of ##z##.
     
  8. Feb 1, 2015 #7

    BiGyElLoWhAt

    User Avatar
    Gold Member

    It shows the limit definition of a run-of-the-mill Calc 1 derivative and then says:
    "The derivative of the function tells you how the number returned by the function f(x) changes as you slightly change the number x that you feed into the 'machine'. In the same way, we can define a functional derivative of a functional F[f] as follows:
    (insert limit definition from my first post, I don't feel like re-latexing it)
    The functional derivative tells you how the number returned by the functional F[f(x)] changes as you slightly change the function f(x) that you feed into the 'machine' "

    So, the Dirac delta is 1 at one point and 0 everywhere else, no? So unless y = z, delta(y-z) is 0, which gives you a value of 0 for the derivative (0/epsilon would, I think, be zero). Are we not looking at numerical values of f when we're looking at how F changes with respect to f? I see that f is definitely a function, and F takes a function as an argument; but epsilon is a number, so multiplying it by the Dirac delta gives us epsilon at the point where ##\delta## is centered and 0 everywhere else. So aren't we really only looking at changes of our function f at the point z (or wherever we're centering around)? Maybe I don't see the purpose of multiplying epsilon by the Dirac delta otherwise.
     
  9. Feb 1, 2015 #8

    BiGyElLoWhAt

    User Avatar
    Gold Member

    Actually, not that I think it really matters, but the argument in the Dirac delta per my book would be y-x, not x-y as I have in my first post.
     
  10. Feb 1, 2015 #9

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    No. It has the properties that ##\delta(x-y) = 0## if ##x \neq y## and that ##\int \delta(x-y) dx = 1## as long as ##y## is in the integration domain.
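
    In particular, those two properties combine into the "sifting" identity (stated here as a sketch, for a sufficiently well-behaved function ##g##):
    $$
    \int g(x)\,\delta(x-y)\,dx = g(y),
    $$
    which is what will eventually remove the delta whenever the functional contains an integral over the perturbed function.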

    Yes, but the thing you need to understand is that a functional is essentially a map from a function space to the real (or complex) numbers. As such, it makes no sense to write ##F[f(x)]##, since ##f(x)## is a number (the value of the function ##f## at the point ##x##) and not a function, and the functional ##F## needs a function as its argument. Now, in sloppy notation, you might write out the ##x## as a dummy variable, but your result is not dependent on this dummy variable.

    Let us say you have functions on the interval ##I = [0,L]## and that ##f## is a function defined on that interval. The function ##g = f + \epsilon\delta_y## is also a function on the same interval, which for ##x \in I## takes the value ##g(x) = f(x) + \epsilon \delta(x-y)##.

    If the functional ##F## is simply
    $$
    F[f] = \int_0^L f(x) dx,
    $$
    then it is quite apparent that this does not depend on any parameter ##x## - you might just as well have called the dummy variable ##x## something else.
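
    Applying the definition above to this simple functional, the computation goes through directly (a short sketch, for ##z## in the interior of ##I##):
    $$
    \frac{\delta F}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{1}{\epsilon}\left[\int_0^L \big(f(x) + \epsilon\delta_z(x)\big)\, dx - \int_0^L f(x)\, dx\right] = \int_0^L \delta(x-z)\, dx = 1.
    $$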

    If you compare with the discrete case that I also mentioned, ##x## corresponds to the summation variable ##m##, while ##f(x)## corresponds to the value of ##x_m##.

    The Dirac delta is symmetric so this is indeed irrelevant.
     
  11. Feb 1, 2015 #10

    BiGyElLoWhAt

    User Avatar
    Gold Member

    So I guess my question now is this: how does f(x) (the argument taken by the functional) change? I see what you're saying about mapping from my f space to my F space.

    If f(x) were to equal cos(x), are we looking at changes in f(x) such as epsilon = sin(x) - cos(x) or something similar, but for all possible "small changes" to our function?
    So this would also include epsilon = constant, so that we look at the change from cos(x) to cos(x) + .001.

    Perhaps this is my area of confusion. Is epsilon a number or a function itself?
     
  12. Feb 1, 2015 #11

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    ##\epsilon## is just a small number (in fact you send it to zero in the limit), just as usual. When considering the functional derivative, you are considering small (as dictated by ##\epsilon \to 0##) changes to the function, just as you consider small deviations from the function argument when you do partial derivatives of a normal function. If f(x) = cos(x), the functional F[f] will be a number, just as it will be a number if f(x) = cos(x) - sin(x) or f(x) = cos(x) + 100000. However, these are not small perturbations of the function, just as evaluating f(x) at x = 0 and at ##x = 10^{65}## is not a small change of the argument x.
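
    Concretely, continuing the ##f(x) = \cos(x)## example above (a sketch, writing ##f_\epsilon## for the perturbed function), the perturbation used in the functional derivative is
    $$
    f_\epsilon(x) = \cos(x) + \epsilon\,\delta(x - z),
    $$
    a change concentrated at the single point ##z##, rather than a constant shift such as ##\cos(x) + 0.001## applied everywhere.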
     
  13. Feb 1, 2015 #12

    BiGyElLoWhAt

    User Avatar
    Gold Member

    Ok.
    I appreciate you bearing with me, by the way.

    I am, however, still missing something. I'm not sure what that is, though.
     
  14. Feb 1, 2015 #13

    BiGyElLoWhAt

    User Avatar
    Gold Member

    So epsilon is a number; however, when we alter our function by epsilon, epsilon must be multiplied by a function for our functional to take the result as an argument. Am I right so far?

    I see why the delta function would be the go-to function, as in if someone said to pick the most obvious function to multiply epsilon by. Is there any sort of derivation for this? Or was it just the most useful function anyone could come up with?
     
  15. Feb 1, 2015 #14

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    I would rather say that it is the most useful definition you can come up with. In a similar fashion you could ask why partial derivatives are defined in the way they are with respect to one coordinate at a time only and not like a more general directional derivative. The answer is that you can build the variation of the functional for any variation of the argument using the functional derivative, just as you can build the directional derivative using partial derivatives and the direction vector.
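
    Schematically, the analogy can be written out as follows (a sketch, with ##\eta## denoting an arbitrary small variation of the function and ##\vec v## a direction vector):
    $$
    \delta F = \int \frac{\delta F}{\delta f(z)}\,\eta(z)\,dz
    \qquad \text{compared with} \qquad
    D_{\vec v}\, g = \sum_k \frac{\partial g}{\partial x_k} v_k.
    $$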
     
  16. Feb 1, 2015 #15

    BiGyElLoWhAt

    User Avatar
    Gold Member

    Actually, if I remember correctly, in Calc 3 we went through and very rigorously derived partial derivatives by seeing what happens for small changes w.r.t. each variable, and the limit definition for partials came out of it. I'm curious if there is something like this for functional derivatives.
     
  17. Feb 2, 2015 #16

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    But this is my point exactly. You define partial derivatives using what happens when you change one of the coordinates and you define the functional derivative as what happens when you change one of the function values.

    Again, it helps seeing the function values as the coordinates and their argument as simply a label denoting which coordinate is intended.

    In a separable space, it might have been an idea to define the functional derivative using a countable basis, but still I think it is perfectly fine as it is.
     
  18. Feb 2, 2015 #17

    BiGyElLoWhAt

    User Avatar
    Gold Member

    Let me offer up a worked example to perhaps show my confusion.
    ##J [f] = \int [f (y)]^p \phi (y)dy##
    The derivative w.r.t f(x) is given as
    ##\lim_{\epsilon \to 0} \frac{1}{\epsilon}\left[\int [f(y) + \epsilon \delta (y-x)]^p \phi(y)\,dy - \int [f(y)]^p\phi(y)\,dy\right]##
    And they skip straight to the answer ##p[f]^{p-1}\phi(y)##.
    If we actually expand this out and let epsilon get close to zero, the ##f^p## terms cancel as expected, and all of the terms except the one containing ##p f(y)^{p-1}\epsilon\,\delta(y-x)\phi(y)## go to zero, because they have a degree of epsilon^2 or higher and thus vanish when we cancel the epsilons and let epsilon go to zero. I guess the integral actually saves this one, because the delta function has area 1, but what if J wasn't defined as an integral? Literally every worked example is an integrated functional. So the answer to my problem would be the case where p = 1, and so we get rid of the piecewise conditions, and it's just G(x,y) everywhere. I still don't feel like I really understand exactly what's going on, though.
     
  19. Feb 2, 2015 #18

    BiGyElLoWhAt

    User Avatar
    Gold Member

    I hope that rendered properly. My app won't display LaTeX for some reason; I don't know if I need a third-party app for LaTeX code or what. Just got it yesterday.
     
  20. Feb 2, 2015 #19

    BiGyElLoWhAt

    User Avatar
    Gold Member

    Actually, the integral is irrelevant, as I forgot we're looking at the derivative w.r.t. f(x), and the derivative doesn't include the integral (the integral is an operand).
    So if we have, after canceling all the terms out in the numerator:

    ##\lim_{\epsilon\to 0} \frac{pf(x)^{p-1}\epsilon \delta(y-x)\phi(x)}{\epsilon}##
    How does that delta function just go away? Unless we're only looking at how our functional J changes with small changes to the function f at the point our delta function is centered around? I hope that is a clear question.
     
  21. Feb 2, 2015 #20

    Orodruin

    User Avatar
    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    You had it right from the beginning. The integral is part of the functional and is used to perform the integration over the delta.
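
    Explicitly, for the example in post #17, the last step is just the delta doing its job inside that integral (a sketch of the final integration):
    $$
    \lim_{\epsilon \to 0}\frac{1}{\epsilon}\int p\,f(y)^{p-1}\,\epsilon\,\delta(y-x)\,\phi(y)\,dy
    = \int p\,f(y)^{p-1}\,\delta(y-x)\,\phi(y)\,dy
    = p\,f(x)^{p-1}\,\phi(x),
    $$
    so the delta disappears precisely because the integral in the functional integrates over it.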
     