
Functional Derivatives

  • #1
BiGyElLoWhAt
Gold Member
1,560
113
If I understand what's going on (quite possibly I don't), I think my book is using bad (confusing) notation.

Homework Statement


As written: "Calculate ##\frac{\delta H[f]}{\delta f(z)} \ \text{where} \ H=\int G(x,y)f(y)dy##"

and ##\frac{\delta H[f]}{\delta f(z)}## is the functional derivative of H[f] with respect to f(y) at z (I think that's what it is).

Homework Equations


...

The Attempt at a Solution


So, if I understand correctly, we're concerned with how the value of H[f] changes with f(y) when y = z.

This gives rise to (changing notation slightly) ##\frac{dF[f]}{d f(y)} = \lim_{\epsilon \to 0} \frac{F[f(y) + \epsilon \delta (x-y)] - F[f(y)] }{\epsilon}##, where ##\delta(x-y)## is the Dirac delta function centered at y = x (I believe this is so; please correct me, my book is not explicit). So this gives us F[f + epsilon] - F[f] when y = x, and F[f] - F[f] = 0 when y doesn't equal x. If this is right, going through the example:

##\frac{d H[f]}{d f(z)} ##
##= \lim_{\epsilon \to 0}\frac{\int \left[ G(x,y)\left( f(z) + \epsilon \delta (x-y) \right) - G(x,y)f(z) \right] dy}{\epsilon}##
##= \left\{
\begin{array}{lr}
-\int G(x,y)\,dy & \text{if } y=z \\
0 & \text{otherwise}
\end{array} \right. ##

I tried to show my train of thought; please correct any misconceptions I have about this. I would also appreciate it if someone would check my answers. Apparently my book doesn't have the answers to its exercises.
 

Answers and Replies

  • #2
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
Your y is a dummy variable and your answer cannot depend on it. The functional ##F[f]## itself does not take ##f(x)## as an argument (i.e., the value of a function at a point); it takes ##f##, the function, as an argument. It is therefore more instructive to write ##\delta(x-z) \equiv \delta_z(x)## and give ##f + \epsilon \delta_z## as an argument to the functional. The definition of the functional derivative would then be
$$
\frac{\delta F}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{F[f+\epsilon \delta_z] - F[f]}{\epsilon}.
$$
You will obtain
$$
F[f +\epsilon \delta_z] = \int G(x,y) (f(y) + \epsilon \delta_z(y)) dy.
$$
I will let you take it from there, but in the end it is not much stranger than having a sum
$$
S = \sum_m G_m x_m
$$
and asking for ##\partial S/\partial x_k##.
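Working that discrete case out explicitly (with ##\delta_{mk}## denoting the Kronecker delta, the discrete counterpart of the Dirac delta), the same pattern appears:
$$
\frac{\partial S}{\partial x_k} = \sum_m G_m \frac{\partial x_m}{\partial x_k} = \sum_m G_m \delta_{mk} = G_k.
$$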
 
  • #3
BiGyElLoWhAt
Gold Member
1,560
113
I guess I'm missing what this delta is. Is it not the dirac delta? If so what is it?
 
  • #4
BiGyElLoWhAt
Gold Member
1,560
113
Is it perhaps just to represent a 2 dimensional epsilon? The product of epsilon and delta, that is. So small changes in f over a range of small changes in y?
 
  • #5
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
10,706
1,728
What is the definition of ##\delta H(f) / \delta f## that your book uses?
 
  • #6
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
I guess I'm missing what this delta is. Is it not the dirac delta? If so what is it?
It is the Dirac delta. But it must be seen as a function of an argument just as ##f## is a function of an argument and the functional takes functions as arguments, not function values. When you write ##\delta_z(x)## instead of ##\delta(z-x)##, it simply becomes more evident which of the variables should be used as the function parameter (not that it matters in this case, the delta is symmetric). The value ##F[f+\epsilon \delta_z]## is simply the value of the functional when its argument is the function ##f + \epsilon\delta_z##. Now, this function has a parameter ##z##, which is the reason that the resulting functional derivative is a function of ##z##.
 
  • #7
BiGyElLoWhAt
Gold Member
1,560
113
What is the definition of ##\delta H(f) / \delta f## that your book uses?
It shows the limit definition of a run-of-the-mill Calc 1 derivative, then says:
"The derivative of the function tells you how the number returned by the function f(x) changes as you slightly change the number x that you feed into the 'machine'. In the same way, we can define a functional derivative of a functional F[f] as follows:
(insert limit definition from my first post, I don't feel like re-latexing it)
The functional derivative tells you how the number returned by the functional F[f(x)] changes as you slightly change the function f(x) that you feed into the 'machine'."

It is the Dirac delta. But it must be seen as a function of an argument just as ##f## is a function of an argument and the functional takes functions as arguments, not function values. When you write ##\delta_z(x)## instead of ##\delta(z-x)##, it simply becomes more evident which of the variables should be used as the function parameter (not that it matters in this case, the delta is symmetric). The value ##F[f+\epsilon \delta_z]## is simply the value of the functional when its argument is the function ##f + \epsilon\delta_z##. Now, this function has a parameter ##z##, which is the reason that the resulting functional derivative is a function of ##z##.
So, the Dirac delta is 1 at one point and 0 everywhere else, no? So unless y = z, delta(y-z) is 0, which gives you a value of 0 for the derivative (0/epsilon, I would think, would be zero). Are we not looking at numerical values of f when we're looking at how F changes with respect to f? I see that f is definitely a function, and F takes a function as an argument; but epsilon is a number, so multiplying it by the Dirac delta gives us epsilon at the point ##\delta## is centered around and 0 everywhere else. So aren't we really only looking at changes of our function f when y = z (or wherever we're centering around)? Maybe I don't see the purpose of multiplying epsilon by the Dirac delta otherwise.
 
  • #8
BiGyElLoWhAt
Gold Member
1,560
113
Actually, not that I think it really matters, but the argument in the Dirac delta per my book would be y - x, not x - y as I have in my first post.
 
  • #9
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
So, the Dirac delta is 1 at one point and 0 everywhere else, no?
No. It has the properties that ##\delta(x-y) = 0## if ##x \neq y## and that ##\int \delta(x-y) dx = 1## as long as ##y## is in the integration domain.

Are we not looking at numerical values of f when we're looking at how F changes with respect to f?
Yes, but the thing you need to understand is that a functional is essentially a map from a space of functions to the real (or complex) numbers. As such, it makes no sense to write ##F[f(x)]##, since ##f(x)## is a number (the value of the function ##f## at the point ##x##) and not a function, and the functional ##F## needs a function as its argument. Now, in sloppy notation, you might write out the ##x## as a dummy variable, but your result is not dependent on this dummy variable.

Let us say you have functions on the interval ##I = [0,L]## and that ##f## is a function defined on that interval. The function ##g = f + \epsilon\delta_y## is also a function on the same interval, which for ##x \in I## takes the value ##g(x) = f(x) + \epsilon \delta(x-y)##.

If the functional ##F## is simply
$$
F[f] = \int_0^L f(x) dx,
$$
then it is quite apparent that this does not depend on any parameter ##x## - you might just as well have called the dummy variable ##x## something else.

If you compare with the discrete case that I also mentioned, ##x## corresponds to the summation variable ##m##, while ##f(x)## corresponds to the value of ##x_m##.

not that I think it really matters, but the argument in the Dirac delta per my book would be y - x, not x - y as I have in my first post.
The Dirac delta is symmetric so this is indeed irrelevant.
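For reference, the property that does the actual work in these computations is the sifting identity,
$$
\int f(x)\,\delta(x-y)\,dx = f(y),
$$
which holds whenever ##y## lies in the integration domain.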
 
  • #10
BiGyElLoWhAt
Gold Member
1,560
113
So I guess my question now is this: how does f(x) (the argument taken by the functional) change? I see what you're saying about mapping from my f space to my F space.

If f(x) were to equal cos(x), are we looking at changes in f(x) such as epsilon = sin(x) - cos(x) or something similar, but for all possible "small changes" to our function?
So this would also include epsilon = constant, so that we look at the change of cos(x) to cos(x) + 0.001.

Perhaps this is my area of confusion. Is epsilon a number or a function itself?
 
  • #11
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
##\epsilon## is just a small number (in fact you send it to zero in the limit), just as usual. When considering the functional derivative, you are considering small (as dictated by ##\epsilon \to 0##) changes to the function, just as you consider small deviations from the function argument when you do partial derivatives of a normal function. If f(x) = cos(x), the functional F[f] will be a number, just as it will be a number if f(x) = cos(x) - sin(x) or f(x) = cos(x) + 100000. However, these are not small perturbations of the function, just as evaluating f(x) at x = 0 and at x = 10^65 is not a small change of the argument x.
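To make that concrete with the cosine example: a small perturbation localized at a point ##z## (using ##f_\epsilon## merely as shorthand for the perturbed function) would be
$$
f_\epsilon(x) = \cos(x) + \epsilon\,\delta(x-z),
$$
and the functional derivative at ##z## measures how ##F[f_\epsilon]## differs from ##F[f]## as ##\epsilon \to 0##.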
 
  • #12
BiGyElLoWhAt
Gold Member
1,560
113
Ok.
I appreciate you bearing with me, by the way.

I am, however, still missing something. I'm not sure what that is, though.
 
  • #13
BiGyElLoWhAt
Gold Member
1,560
113
So epsilon is a number; however, when we alter our function by a factor of epsilon, epsilon must be multiplied by a function for our functional to take it as an argument. Am I right so far?

I see where the delta function would be the go-to function, as in if someone said to pick the most obvious function to multiply epsilon by. Is there any sort of derivation for this? Or was it just the most useful function anyone could come up with?
 
  • #14
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
I would rather say that it is the most useful definition you can come up with. In a similar fashion you could ask why partial derivatives are defined in the way they are with respect to one coordinate at a time only and not like a more general directional derivative. The answer is that you can build the variation of the functional for any variation of the argument using the functional derivative, just as you can build the directional derivative using partial derivatives and the direction vector.
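To spell out the parallel: for an ordinary function ##g## of several variables, the directional derivative along ##v## is assembled from partial derivatives, and in the same way the variation of ##F## under an arbitrary small change ##\eta## of the function is assembled from the functional derivative (assuming the usual smoothness so that both expressions exist):
$$
D_v g = \sum_k v_k \frac{\partial g}{\partial x_k},
\qquad
\left.\frac{d}{d\epsilon} F[f + \epsilon\,\eta]\right|_{\epsilon = 0} = \int \eta(x)\,\frac{\delta F}{\delta f(x)}\,dx.
$$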
 
  • #15
BiGyElLoWhAt
Gold Member
1,560
113
Actually, if I remember correctly, in Calc 3 we went through and very rigorously derived partial derivatives by seeing what happens for small changes w.r.t. each variable, and the limit definition for partials came out of it. I'm curious if there is something like this for functional derivatives.
 
  • #16
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
But this is my point exactly. You define partial derivatives using what happens when you change one of the coordinates and you define the functional derivative as what happens when you change one of the function values.

Again, it helps to see the function values as the coordinates and their argument as simply a label denoting which coordinate is intended.

In a separable space, it might have been an idea to define the functional derivative using a countable basis, but I still think it is perfectly fine as it is.
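In that picture, the elementary functional derivative mirrors the discrete relation ##\partial x_m/\partial x_k = \delta_{mk}##:
$$
\frac{\delta f(x)}{\delta f(y)} = \delta(x-y).
$$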
 
  • #17
BiGyElLoWhAt
Gold Member
1,560
113
Let me offer up a worked example to perhaps show my confusion.
##J [f] = \int [f (y)]^p \phi (y)dy##
The derivative w.r.t. f(x) is given as
## \lim_{\epsilon \to 0} \frac{1}{\epsilon}\left[\int [f (y) + \epsilon \delta (y-x)]^p \phi (y)dy-\int [f (y)]^p\phi (y)dy\right]##
and they skip straight to the answer ##p [f(x)]^{p-1}\phi (x)##.
If we actually expand this out and let epsilon get close to zero, the ##[f(y)]^p## terms cancel as expected, and all of the terms except the ##p[f(y)]^{p-1}\epsilon\,\delta(y-x)\phi(y)## term have a degree of epsilon^2 or higher, and thus go to zero when we cancel the epsilons and let epsilon go to zero. I guess the integral actually saves this one, because of the delta function having area 1, but what if J wasn't defined as an integral? Literally every worked example is an integrated functional. So the answer to my problem would be the case where p = 1 (with ##\phi = G##), which gets rid of the piecewise conditions, and it's just G(x,y) everywhere. I still don't feel like I really understand exactly what's going on, though.
 
  • #18
BiGyElLoWhAt
Gold Member
1,560
113
I hope that rendered properly. My app won't display LaTeX for some reason; I don't know if I need a 3rd-party app for LaTeX code or what. Just got it yesterday.
 
  • #19
BiGyElLoWhAt
Gold Member
1,560
113
Actually, the integral is irrelevant, as I forgot we're looking at the derivative w.r.t. f(x), and the derivative doesn't include the integral (the integral is an operand).
So if we have, after canceling all the terms out in the numerator:

##\lim_{\epsilon\to 0} \frac{pf(x)^{p-1}\epsilon \delta(y-x)\phi(x)}{\epsilon}##
How does that delta function just go away? Unless we're only looking at how our functional J changes with small changes to the function f at the point our delta function is centered around? I hope that is a clear question.
 
  • #20
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
You had it right from the beginning. The integral is part of the functional and is used to perform the integration over the delta.
 
  • #21
BiGyElLoWhAt
Gold Member
1,560
113
Ok, so what would happen if there weren't an integral attached to J?
 
  • #22
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
You would get a delta function as the functional derivative. Note that what you differentiate must still be a functional. For example:
##F[f] = f(0)##
is a mapping from functions to real numbers (assuming f is a function from real numbers to real numbers) and so is a functional. Its functional derivative at ##z## would be
$$
\lim_{\epsilon \to 0}\frac{F[f+\epsilon\delta_z] - F[f]}{\epsilon} = \delta_z(0) = \delta(z).
$$
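As a consistency check, this point-evaluation functional can also be written in integral form, and the same answer drops out:
$$
F[f] = f(0) = \int f(x)\,\delta(x)\,dx
\quad\Longrightarrow\quad
\frac{\delta F}{\delta f(z)} = \delta(z).
$$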
 
  • #23
BiGyElLoWhAt
Gold Member
1,560
113
Ok, so basically, minus the integral, the answer is what I posted a few back.
 
  • #24
Orodruin
Staff Emeritus
Science Advisor
Homework Helper
Insights Author
Gold Member
16,671
6,454
What you showed in post #17 was correct and the "magic step" was simply using the ##\delta## to perform the integral.
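Written out, that step for the ##J[f]## example is just the first-order expansion in ##\epsilon## followed by the integration over the delta:
$$
[f(y)+\epsilon\,\delta(y-x)]^p = [f(y)]^p + p\,\epsilon\,[f(y)]^{p-1}\delta(y-x) + O(\epsilon^2),
$$
$$
\frac{\delta J}{\delta f(x)} = \int p\,[f(y)]^{p-1}\delta(y-x)\,\phi(y)\,dy = p\,[f(x)]^{p-1}\phi(x).
$$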
 
  • #25
BiGyElLoWhAt
Gold Member
1,560
113
Ok, cool. Thanks again. I still need to work on this, obviously, but at least I have a half-a**ed idea of what I'm doing.
 
