Confusion over the definition of a Green's function

Summary
The discussion centers on the confusion regarding the definitions of Green's functions in the context of linear operators. One definition states that ##L^* G(\xi,x) = \delta(\xi-x)##, while another states ##L\,G(x,\xi) = \delta(x-\xi)##. Participants clarify that the key to understanding these definitions lies in recognizing the roles of the variables and the operator ##L##, which must act on the appropriate variable in each case. The conversation emphasizes the importance of properly defining the contracted variable in the inner product to avoid confusion. Ultimately, the exchange leads to a clearer understanding of how Green's functions relate to linear operators and their inverses.
TheFerruccio
This is how I learned about Green's functions:

For the 1-D problem with the linear operator ##L## and the inner product ##(\cdot,\cdot)##,
$$Lu(x) = f(x) \quad\rightarrow\quad u ~=~ \big(f(x),\, G(\xi,x)\big)$$

if the Green's function ##G## is defined such that

$$L^* G(\xi,x) ~=~ \delta(\xi-x)$$

I understand how to arrive at this algebraically. However, most articles I read define the Green's function backwards (?) like this:

$$L\,G(x,\xi) ~=~ \delta(x-\xi)$$

How do I arrive at this definition? As in, how do I work through the algebra to show that the Green's function can be defined like this? I am assuming that the swapping of the variables indicates an equivalence between the two definitions, but I do not immediately see it, and it has been confusing me for quite a while. Does it have something to do with whether we're on the interval ##[a,b]## in ##\xi## vs. ##x##? Could someone walk me through the steps? No, this is not a homework or coursework question. I'm just confused with the definition.
 
TheFerruccio said:
For the 1-D problem with the linear operator ##L## and the inner product ##(\cdot,\cdot)##,
$$Lu(x) = f(x) \quad\rightarrow\quad u ~=~ \big(f(x),\, G(\xi,x)\big)$$

if the Green's function ##G## is defined such that

$$L^* G(\xi,x) ~=~ \delta(\xi-x)$$

I understand how to arrive at this algebraically. However, most articles I read define the Green's function backwards (?) like this:

$$L\,G(x,\xi) ~=~ \delta(x-\xi)$$

How do I arrive at this definition? As in, how do I work through the algebra to show that the Green's function can be defined like this?
Part of the difficulty might be that you haven't indicated explicitly which variable ##L## acts on in each case.

So re-write your earlier equation as
$$u(\xi) ~=~ \Big(f(x),G(\xi,x)\Big)$$
and then apply your operator ##L## to it. But take care: the "x" is a dummy integration variable in the inner product, so your ##L## would need to act on the ##\xi## variable.
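
Sketched out a bit more explicitly, with a subscript on ##L## marking the variable it acts on (and assuming ##L## may be taken inside the integral):
$$L_\xi\, u(\xi) ~=~ \int dx \; f(x)\, L_\xi G(\xi,x) ~=~ \int dx \; f(x)\, \delta(\xi-x) ~=~ f(\xi) ~,$$so demanding ##Lu = f## for arbitrary ##f## is the same as demanding ##L_\xi G(\xi,x) = \delta(\xi-x)##; relabelling the variables turns this into the second definition you quoted, ##L_x G(x,\xi) = \delta(x-\xi)##. The adjoint ##L^*## shows up when the same representation is instead derived by moving the operator across the inner product, ##\big(Lu, G\big) = \big(u, L^* G\big)##.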

No, this is not a homework or coursework question. I'm just confused with the definition.
You'll still have to work through it to get understanding... :biggrin:
 
strangerep said:
Part of the difficulty might be that you haven't indicated explicitly which variable ##L## acts on in each case.

So re-write your earlier equation as
$$u(\xi) ~=~ \Big(f(x),G(\xi,x)\Big)$$
and then apply your operator ##L## to it. But take care: the "x" is a dummy integration variable in the inner product, so your ##L## would need to act on the ##\xi## variable.

You'll still have to work through it to get understanding... :biggrin:

How do you specify what variable an operator acts on? I thought an operator was implicit, and how it behaves is defined based on the function it's operating on, for example, if

$$L ~=~ \frac{\partial}{\partial x}$$

Then wouldn't the result of that operation be apparent depending on what ##u## is? If ##u## is independent of ##x##, then ##Lu## would just be 0. That's why I didn't think I would need to define what variables ##L## is operating on. Also, in the definitions and examples I've worked through, I've never had to be explicit about what variables ##L## was acting on. I had to be explicit about the functions ##L## was acting on, but not the variables, since that depends on a further restriction of ##L##. Wouldn't I lose generality if I explicitly define ##L## like that?

Also, my first equation should be $$u(x) ~=~ \Big(f(x),G(\xi,x)\Big)$$ with ##x## instead of ##\xi## (Line 3 in my first post). So, it's all in functions of ##x##, with the dummy variable of integration in the inner product being ##\xi##.
 
TheFerruccio said:
How do you specify what variable an operator acts on?
It depends on the details of the operator. See below.

I thought an operator was implicit,
That depends on the details of the operator. Integral operators are a bit trickier...
and how it behaves is defined based on the function it's operating on, [...]
Suppose L is just the operator of differentiation. Then you can of course write ##f' = L f## in abstract notation. You could also use concrete notation, e.g., $$f'(x) ~=~ \frac{d}{dx} \; f(x)$$ which contains essentially the same information as $$f'(z) ~=~ \frac{d}{dz} \; f(z) ~,$$ (functionally speaking).
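
A common notational device when several variables are in play (used here purely for illustration) is to attach the acted-on variable as a subscript, e.g.
$$L_x\, G(x,\xi) ~:=~ \frac{\partial}{\partial x}\, G(x,\xi) \,, \qquad L_\xi\, G(x,\xi) ~:=~ \frac{\partial}{\partial \xi}\, G(x,\xi) ~,$$so that the two definitions in your opening post read unambiguously as (presumably) ##L^*_\xi G(\xi,x) = \delta(\xi-x)## and ##L_x G(x,\xi) = \delta(x-\xi)##.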

Also, my first equation should be $$u(x) ~=~ \Big(f(x),G(\xi,x)\Big)$$ with x instead of \xi (Line 3 in my first post). So, it's all in functions of x, with the dummy variable of integration in the inner product being \xi
That wouldn't make sense, since in that case you could pull ##f(x)## outside the integral, i.e.,
$$\int d\xi \, f(x) G(\xi,x) ~=~ f(x) \int d\xi\, G(\xi,x) ~.$$But what's actually needed is to "contract" ##G## with ##f##.

To explain what I mean by "contract", here's another way to think about Green's functions: as a continuous-index generalization of ordinary matrices. Consider a column vector with components ##a_i## and a matrix with components ##M_{ij}##. The action of ##M## on ##a## produces a new vector ##b## as follows: $$b_i~=~ \sum_j M_{ij} a_j ~.$$ Now think of a function ##f## as a column vector with a continuous index ##x##, so I'll write ##f_x := f(x)##, etc. The action of the Green's function is $$u_x \equiv u(x) ~=~ \int d\xi \, G_{x\xi} f_\xi ~\equiv~ \int d\xi \, G(x,\xi) \, f(\xi) ~,$$ where I hope you can see that the Green's function is analogous to the earlier matrix ##M##, but here we're doing a "contraction" over the "index" ##\xi##, implemented as integration instead of discrete summation. A purist might even think of ##G## as an integral operator (instead of just the kernel function of an integral operator, as above), write ##u = Gf##, and then the notation exactly parallels the matrix/vector case -- provided one can keep track of the types of all the symbols.
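
Purely as an illustrative numerical sketch of this matrix picture (the operator ##L = -d^2/dx^2##, the grid, and all names below are my own assumptions for the example, not something from the posts above): discretize ##L## on a grid with zero boundary conditions, and the inverse of the resulting matrix plays exactly the role of ##G_{x\xi}##.

```python
import numpy as np

# Illustrative sketch only (operator, grid, and names are assumptions):
# L = -d^2/dx^2 on [0, 1] with u(0) = u(1) = 0, discretized by central
# differences on n interior grid points.
n = 200
x = np.linspace(0.0, 1.0, n + 2)[1:-1]   # interior points
h = x[1] - x[0]

# Discrete version of L: the matrix M_ij of the analogy.
L = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

# The discrete Green's "function" is just the matrix inverse of L.
# G[i, j] ~ h * G(x_i, xi_j): the integration measure d(xi) is already
# absorbed into the matrix entries.
G = np.linalg.inv(L)

# u_i = sum_j G_ij f_j is the discrete counterpart of
# u(x) = integral of G(x, xi) f(xi) d(xi).
f = np.sin(np.pi * x)
u = G @ f

# Exact solution of -u'' = sin(pi x), u(0) = u(1) = 0, for comparison.
u_exact = np.sin(np.pi * x) / np.pi**2
print("max error:", np.max(np.abs(u - u_exact)))   # small, O(h^2)
```

The point is that ##u = Gf## here is literally ##u = L^{-1}f## with the indices written out; the continuum case just replaces the sum over ##j## by an integral over ##\xi##.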
 
strangerep said:
It depends on the details of the operator. See below.

That depends on the details of the operator. Integral operators are a bit trickier...
Suppose L is just the operator of differentiation. Then you can of course write ##f' = L f## in abstract notation. You could also use concrete notation, e.g., $$f'(x) ~=~ \frac{d}{dx} \; f(x)$$ which contains essentially the same information as $$f'(z) ~=~ \frac{d}{dz} \; f(z) ~,$$ (functionally speaking).

That wouldn't make sense, since in that case you could pull ##f(x)## outside the integral, i.e.,
$$\int d\xi \, f(x) G(\xi,x) ~=~ f(x) \int d\xi\, G(\xi,x) ~.$$But what's actually needed is to "contract" ##G## with ##f##.

To explain what I mean by "contract", here's another way to think about Green's functions: as a continuous-index generalization of ordinary matrices. Consider a column vector with components ##a_i## and a matrix with components ##M_{ij}##. The action of ##M## on ##a## produces a new vector ##b## as follows: $$b_i~=~ \sum_j M_{ij} a_j ~.$$ Now think of a function ##f## as a column vector with a continuous index ##x##, so I'll write ##f_x := f(x)##, etc. The action of the Green's function is $$u_x \equiv u(x) ~=~ \int d\xi \, G_{x\xi} f_\xi ~\equiv~ \int d\xi \, G(x,\xi) \, f(\xi) ~,$$ where I hope you can see that the Green's function is analogous to the earlier matrix ##M##, but here we're doing a "contraction" over the "index" ##\xi##, implemented as integration instead of discrete summation. A purist might even think of ##G## as an integral operator (instead of just the kernel function of an integral operator, as above), write ##u = Gf##, and then the notation exactly parallels the matrix/vector case -- provided one can keep track of the types of all the symbols.

Oh, that's just a brilliant explanation. Yes, I've dealt a great deal with tensors in indicial notation in solid mechanics, and what you just explained thoroughly unified so many concepts in my head. I see where I was erring in my thought process now: I need to make sure to define the contracted variable when specifying the inner product. Thanks!
 
TheFerruccio said:
[...] thoroughly unified so many concepts in my head. [...]
I felt the same way when I first saw this. It means that we can think of
$$L u = f$$ as being solved by $$u = L^{-1} f ~,$$and ##G## is just a concrete implementation of ##L^{-1}##.
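
Spelled out in the same index notation as before, the defining equation for ##G## is the continuous version of "a matrix times its inverse is the identity", with the delta function playing the role of the identity matrix:
$$L_x\, G(x,\xi) ~=~ \delta(x-\xi) \qquad\longleftrightarrow\qquad \sum_k M_{ik}\,\big(M^{-1}\big)_{kj} ~=~ \delta_{ij} ~,$$since ##\int d\xi\, \delta(x-\xi)\,(\cdot)## acts on functions exactly the way ##\delta_{ij}## acts on vector components.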

Happy gardening.
 
