Chain rule for functions of operators?

Summary:
The discussion centers on the differentiation of operator-valued functions in quantum mechanics, specifically the expression for the derivative of f(A(x)). It questions whether the chain rule applies as it does for scalar functions, particularly when the operators A and dA/dx may not commute. The conversation explores the implications of time-dependent Hamiltonians and the need for a corresponding operator Q(t) that satisfies certain conditions. Additionally, the complexities of defining directional derivatives in this context are highlighted, especially when considering the spectral decomposition of operators. The overall consensus is that deriving simple formulas in these cases is challenging and requires careful consideration of operator properties.
pellman
This is strictly a math question, but I figured that since it is something which would show up in QM, the quantum folks might already be familiar with it.

Suppose we have an operator-valued function A(x) of a real parameter x and another function f, both of which have well-defined derivatives.

Consider \frac{d}{dx}f(A(x))

Does this equal

\frac{df}{dA}\frac{dA}{dx}

or

\frac{dA}{dx}\frac{df}{dA}

or something else? Of course, if A and dA/dx commute, then either expression is good. But it is not clear to me that A and dA/dx would necessarily commute.
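For what it's worth, a quick numerical experiment (just a sketch in NumPy/SciPy, with f = exp and an arbitrary 2x2 example in which A and dA/dx fail to commute) suggests that neither ordering is right in general:

import numpy as np
from scipy.linalg import expm

# A(x) = X + x*Y, chosen so that A(x) and dA/dx = Y do not commute
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Y = np.array([[1.0, 0.0], [0.0, -1.0]])

def A(x):
    return X + x * Y

dA = Y  # dA/dx is constant for this family

x, h = 0.3, 1e-6
dfA = (expm(A(x + h)) - expm(A(x - h))) / (2 * h)  # numerical d/dx exp(A(x))

left = expm(A(x)) @ dA   # f'(A) dA/dx   (here f' = f = exp)
right = dA @ expm(A(x))  # dA/dx f'(A)

print(np.allclose(dfA, left), np.allclose(dfA, right))  # False False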
 
How do you define df/dA? (I don't think it's something we even want to define).
 
Fredrik said:
How do you define df/dA? (I don't think it's something we even want to define).

If f is a function R → R with a Taylor series representation

f(u)=\sum_n \frac{1}{n!}f_n u^n

where the f_n are just coefficients. Then

f'(u)=\sum_n \frac{1}{n!}f_{n+1} u^n


We can similarly put

f(A)=\sum_n \frac{1}{n!}f_n A^n

f'(A)=\sum_n \frac{1}{n!}f_{n+1} A^n

For some f this may not work, it may not converge, blah, blah, blah. Let's just assume f is a function for which this works. The actual function I am interested in is f(A)=e^A, so f(A) = df/dA anyway.
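For concreteness, here is a minimal sketch of what I mean, assuming the series converges (NumPy; the truncation at 30 terms and the test matrix are arbitrary, and f_n = 1 for all n gives f(A) = e^A):

import numpy as np
from scipy.linalg import expm

def f_of_A(A, coeffs):
    """Sum over n of (1/n!) f_n A^n, truncated after len(coeffs) terms."""
    out = np.zeros_like(A)
    term = np.eye(A.shape[0])     # holds A^n / n!, starting with n = 0
    for n, fn in enumerate(coeffs):
        if n > 0:
            term = term @ A / n
        out = out + fn * term
    return out

A = np.array([[0.0, 1.0], [2.0, 3.0]])
coeffs = np.ones(30)              # f_n = 1 for all n, i.e. f(u) = e^u
print(np.allclose(f_of_A(A, coeffs), expm(A)))  # True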
 
Maybe I shouldn't be so general. My problem is this:

Suppose we have a time-dependent Hamiltonian H(t). Then we can no longer write

|\Psi(t)\rangle = e^{-iHt}|\Psi(0)\rangle

because dH/dt ≠ 0. What we need is an operator Q(t) such that dQ/dt = H.

Then we would have

i\frac{d}{dt}|\Psi(t)\rangle =i\frac{d}{dt}e^{-iQ(t)}|\Psi(0)\rangle

=He^{-iQ(t)}|\Psi(0)\rangle

or would it be

=e^{-iQ(t)}H|\Psi(0)\rangle?

If H does not commute with Q, then the latter means we are not dealing with a solution to the Schrödinger equation. So what is

\frac{d}{dt}e^{-iQ(t)}=?
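For what it's worth, a quick numerical sketch (NumPy/SciPy; Q(t) and H(t) = dQ/dt below are just toy 2x2 matrices chosen so that Q and dQ/dt do not commute) suggests that neither ordering reproduces d/dt e^{-iQ(t)}:

import numpy as np
from scipy.linalg import expm

X = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
Y = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)

def Q(t):
    return t * X + 0.5 * t**2 * Y   # dQ/dt = X + t*Y = H(t)

def H(t):
    return X + t * Y

U = lambda t: expm(-1j * Q(t))

t, h = 1.0, 1e-6
dU = (U(t + h) - U(t - h)) / (2 * h)   # numerical d/dt e^{-iQ(t)}

print(np.allclose(dU, -1j * H(t) @ U(t)))   # False: the ordering that would give the Schrodinger equation
print(np.allclose(dU, -1j * U(t) @ H(t)))   # False: the other ordering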
 
(I wrote this before I saw your last post).

I think that's a directional derivative in the direction of A:

\lim_{t\rightarrow 0}\frac{f(A+t\frac{A}{\|A\|})-f(A)}{t}

Both the df/dA notation and the f'(A) notation seem very inadequate for directional derivatives. You could use something like D_X f(A) for the directional derivative in direction X, at A. Your df/dA would then be D_A f(A). However, when we take the derivative of exponentials, don't we always do it with respect to a parameter? For example, when we prove that A is self-adjoint if U=exp(itA) is unitary:

U^\dagger U=I

U^\dagger=U^{-1}

e^{-itA^\dagger}=e^{-itA}

Now apply \frac{d}{dt}\bigg|_0 to both sides, and we're done.

Added after I read your post #4: If Q(t) commutes with Q(s) for all t and s, then Q'(t) commutes with Q(t) and therefore with exp(iQ(t)), so the two options are equivalent. I need to think about the possibility that Q(t) doesn't commute with Q(s).
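Here is a quick check of the commuting case (a sketch in NumPy/SciPy; Q(t) = (t + t^3/3) H_0 is just one example of a family with Q(t)Q(s) = Q(s)Q(t)):

import numpy as np
from scipy.linalg import expm

H0 = np.array([[1.0, 2.0], [2.0, -1.0]], dtype=complex)

def Q(t):
    return (t + t**3 / 3.0) * H0   # Q(t) commutes with Q(s) for all t, s

def Qdot(t):
    return (1.0 + t**2) * H0       # Q'(t), which commutes with Q(t)

U = lambda t: expm(-1j * Q(t))

t, h = 0.7, 1e-6
dU = (U(t + h) - U(t - h)) / (2 * h)

print(np.allclose(dU, -1j * Qdot(t) @ U(t)))   # True
print(np.allclose(dU, -1j * U(t) @ Qdot(t)))   # True, same thing in this case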
 
If your A is either self-adjoint or unitary in a (rigged) Hilbert space, then you can easily define a function f(A) by means of the spectral decomposition of A. Then you can compute a derivative, but, of course, under tight conditions of convergence.
 
Fredrik said:
I need to think about the possibility that Q(t) doesn't commute with Q(s).
I don't think there are any simple formulas in this case. Note e.g. that d/dt Q(t)^2 = Q'(t)Q(t) + Q(t)Q'(t). So if we try to apply d/dt to each term of the exponential, things are already weird in the second-order term.
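A quick numerical check of that point (NumPy; Q(t) = X + tY is just a toy family with [Q, Q'] ≠ 0):

import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Y = np.array([[1.0, 0.0], [0.0, -1.0]])

def Q(t):
    return X + t * Y          # Q'(t) = Y, and [Q(t), Q'(t)] != 0

t, h = 0.5, 1e-6
d_Qsq = (Q(t + h) @ Q(t + h) - Q(t - h) @ Q(t - h)) / (2 * h)

print(np.allclose(d_Qsq, Y @ Q(t) + Q(t) @ Y))   # True:  Q'Q + QQ'
print(np.allclose(d_Qsq, 2 * Q(t) @ Y))          # False: the naive 2 Q Q'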
 
I see what you mean. Darn. I was hoping this would have a simple answer.
 
bigubau said:
If your A is either self-adjoint or unitary in a (rigged) Hilbert space, then you can easily define a function f(A) by means of the spectral decomposition of A. Then you can compute a derivative, but, of course, under tight conditions of convergence.

Umm... how does this work when one is dealing with a continuous family of operators such as A(t)?

E.g., for a given time we have an operator A_0 = A(t=0) (assumed to be self-adjoint, say), which we can spectral-decompose in terms of its eigenvalues and eigenstates:

f(A_0) ~=~ \int da_0 \, f(a_0) \, |a_0\rangle \langle a_0| ~~.

But each A(t) will have a different set of eigenvalues and eigenstates in general,

f(A_t) ~=~ \int da_t \, f(a_t) \, |a_t\rangle \langle a_t| ~~,

so how does one take the t derivative of the LHS without first computing
the time-dependent eigenvalues and eigenstates explicitly?

(Or did I misunderstand you?)
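In case it helps, here is a brute-force numerical sketch of the spectral route (NumPy; A(t) below is just an illustrative Hermitian family, and the t derivative is taken by finite differences rather than analytically):

import numpy as np

def f_of_A(A, f):
    """f(A) for Hermitian A via its spectral decomposition."""
    a, V = np.linalg.eigh(A)                # eigenvalues a_t and eigenvectors |a_t>
    return V @ np.diag(f(a)) @ V.conj().T   # sum of f(a_t) |a_t><a_t|

def A(t):
    return np.array([[t, 1.0],
                     [1.0, -t]])            # eigenvalues and eigenstates both depend on t

t, h = 0.4, 1e-6
dfA = (f_of_A(A(t + h), np.exp) - f_of_A(A(t - h), np.exp)) / (2 * h)
print(dfA)   # both the moving eigenvalues and the moving eigenstates contribute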
 
