Preservation of Poisson Bracket Structure upon quantization?

VoxCaelum
When (canonically) quantizing a classical system we promote the Poisson brackets to (anti-)commutators. Now I was wondering how much of the Poisson bracket structure is preserved. For example, for a classical (continuous) system we have
$$ \lbrace \phi(z), f(\Pi(y)) \rbrace = \frac{\delta f(\Pi(y))}{\delta \Pi(z)}, $$
where the derivative is the functional derivative and the bracket denotes the classical Poisson bracket for fields
$$ \lbrace F, G \rbrace := \int \text{d} x \left[\frac{\delta F}{\delta \phi(x)} \frac{\delta G}{\delta \Pi(x)} - \frac{\delta G}{\delta \phi(x)} \frac{\delta F}{\delta \Pi(x)}\right]. $$
Does this mean that when we quantize using the rule
$$ \lbrace \phi, \Pi \rbrace \rightarrow -i \left[ \phi, \Pi \right]_{\pm}, $$
where + stands for the commutator (bosons) and - for the anticommutator (fermions), a convention kept throughout this thread, we automatically obtain
$$ \left[ \phi(x), f(\Pi(y)) \right]_{\pm} = i\frac{\delta f(\Pi(y))}{\delta \Pi(x)},$$
and similar formulae for the other structures of the classical Poisson bracket (say, Hamilton's equations of motion)? I am wondering because I want to compute
$$ \left[ H_{12}, \frac{1}{E-H_{22}}\right], $$
where ##H_{12}## and ##H_{22}## are functions of several different (second-quantized) creation and annihilation operators. I was able to check the preservation of this particular rule in the first-quantized situation
$$[x,p]= i,$$
from which it is easy to check that this leads to
$$[x,f(p)] = i \partial_{p} f(p),$$
either by using a test function and the usual coordinate-space representation of the operators ##x## and ##p##, or simply by plugging in a monomial function for ##f(p)## and using the commutation rules. I am not sure how to prove a generalized version of this, though.
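The monomial check mentioned above can be automated symbolically: representing ##p## as ##-i\,d/dx## acting on a test function, one can verify ##[x, p^n]\psi = i\,n\,p^{n-1}\psi## directly. A minimal sketch using sympy (the helper names are my own):

```python
import sympy as sp

x = sp.symbols('x', real=True)
psi = sp.Function('psi')(x)   # arbitrary test function

def p_pow(f, n):
    # apply the momentum operator p = -i d/dx to f, n times
    for _ in range(n):
        f = -sp.I * sp.diff(f, x)
    return f

def check(n):
    # compare [x, p^n] psi with i * n * p^(n-1) psi
    lhs = x * p_pow(psi, n) - p_pow(x * psi, n)
    rhs = sp.I * n * p_pow(psi, n - 1)
    return sp.simplify(lhs - rhs) == 0

print(all(check(n) for n in range(1, 6)))  # True
```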
Any help would be appreciated.
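For the resolvent commutator above, one useful elementary identity is ##[A, B^{-1}] = -B^{-1}[A,B]\,B^{-1}## (valid whenever ##B## is invertible), which with ##B = E - H_{22}## reduces ##[H_{12}, (E-H_{22})^{-1}]## to ##(E-H_{22})^{-1}[H_{12}, H_{22}](E-H_{22})^{-1}##. A quick numerical sanity check with random matrices (the names and the shift keeping ##B## invertible are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
H12 = rng.standard_normal((n, n))                  # stand-in for H_12
B = rng.standard_normal((n, n)) + 5 * np.eye(n)    # stand-in for E - H_22, shifted to stay invertible

comm = lambda X, Y: X @ Y - Y @ X
Binv = np.linalg.inv(B)

# identity: [A, B^{-1}] = -B^{-1} [A, B] B^{-1}
print(np.allclose(comm(H12, Binv), -Binv @ comm(H12, B) @ Binv))  # True
```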
 
It absolutely does mean that! For a fixed operator ##a##, both the Poisson bracket ##\lbrace a, \cdot \rbrace## and the commutator ##[a, \cdot]## are examples of an algebraic structure called a derivation, which basically means they act like derivatives. They're obviously linear, which gets you most of the way there, and additionally they obey the product rule: ##[a, bc] = b[a,c] + [a, b]c##. With those relations in hand, you can use induction to show that ##[x,p]=i## implies ##[x, f(x, p)] = i \frac{\partial f}{\partial p}## and ##[p, f(x,p)] = -i \frac{\partial f}{\partial x}##. This fact also allows you to derive Hamilton's equations of motion, the Heisenberg operator equation, and pretty much any other quantum mechanics formula that has an ##i## in it, by looking at the analogous Poisson-bracket equation in classical mechanics.
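The product rule is a purely algebraic identity, so it can be checked with noncommutative symbols standing in for arbitrary operators; a minimal sympy sketch:

```python
import sympy as sp

# noncommutative symbols stand in for arbitrary operators
a, b, c = sp.symbols('a b c', commutative=False)

comm = lambda X, Y: X*Y - Y*X

# product rule (Leibniz property): [a, bc] = b[a,c] + [a,b]c
lhs = comm(a, b*c)
rhs = b*comm(a, c) + comm(a, b)*c
print(sp.expand(lhs - rhs))  # 0
```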
 
Thanks! Now imagine that I second-quantize my system and that I am dealing with fermions, i.e. I have the anticommutator
$$ \left[a^{\dagger}_{i},a_{j}\right]_{-} = \delta_{ij}, $$
for the creation and annihilation operators in momentum space. And say I have some other operators, ##b## and ##c##, that obey the same rules. Can I then say
$$ \left[a^{\dagger}_{i}, f(a^{\dagger}_{j},a_{j},b_{k},c_{l})\right]_{-} = \frac{\partial f}{\partial a_{i}}\,? $$
I somehow have the intuition that this should be true, but I am unsure how to prove (or disprove) it. Any hints or references in the right direction would be appreciated :)
 
VoxCaelum said:
Now imagine that I second-quantize my system and that I am dealing with fermions, i.e. I have the anticommutator
$$ \left[a^{\dagger}_{i},a_{j}\right]_{-} = \delta_{ij}, $$
for the creation and annihilation operators in momentum space. And say I have some other operators, ##b## and ##c##, that obey the same rules. Can I then say
$$ \left[a^{\dagger}_{i}, f(a^{\dagger}_{j},a_{j},b_{k},c_{l})\right]_{-} = \frac{\partial f}{\partial a_{i}}\,? $$
I somehow have the intuition that this should be true, but I am unsure how to prove (or disprove) it. Any hints or references in the right direction would be appreciated :)
What Chopin told you applies to commutators.

One must be more careful with anticommutators. To see this, try the following example manually:
$$
\left[a^\dagger_i \,,\, a_i a_j \right]_-
$$
:biggrin:
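strangerep's exercise can be checked numerically in a two-mode Fock space, using a Jordan-Wigner-style matrix representation (the construction below is my own illustration). With the thread's convention that ##[\cdot,\cdot]_-## denotes the anticommutator, the naive derivative guess ##a_j## fails for ##\left[a^\dagger_i, a_i a_j\right]_-##, while the ordinary commutator with the even product ##a_i a_j## does reproduce it:

```python
import numpy as np

# single-mode fermionic annihilator and Pauli-Z
a  = np.array([[0, 1], [0, 0]], dtype=complex)
I2 = np.eye(2)
Z  = np.diag([1.0, -1.0])

# two fermionic modes on a 4-dimensional Fock space (Jordan-Wigner style)
a1 = np.kron(a, I2)
a2 = np.kron(Z, a)

comm  = lambda A, B: A @ B - B @ A   # commutator
acomm = lambda A, B: A @ B + B @ A   # anticommutator ([.,.]_- in this thread)

# canonical anticommutation relations hold in this representation
assert np.allclose(acomm(a1.conj().T, a1), np.eye(4))
assert np.allclose(acomm(a1.conj().T, a2), np.zeros((4, 4)))

# naive "derivative" guess for the anticommutator {a1+, a1 a2} would be a2 ...
print(np.allclose(acomm(a1.conj().T, a1 @ a2), a2))   # False
# ... while the ordinary commutator with the even product a1 a2 does give a2
print(np.allclose(comm(a1.conj().T, a1 @ a2), a2))    # True
```

Working it out by hand, ##\lbrace a_1^\dagger, a_1 a_2\rbrace = (2 a_1^\dagger a_1 - 1)\,a_2##, which is where the extra minus signs come from.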
 
VoxCaelum said:
Thanks! Now imagine that I second-quantize my system and that I am dealing with fermions, i.e. I have the anticommutator
$$ \left[a^{\dagger}_{i},a_{j}\right]_{-} = \delta_{ij}, $$
for the creation and annihilation operators in momentum space. And say I have some other operators, ##b## and ##c##, that obey the same rules. Can I then say
$$ \left[a^{\dagger}_{i}, f(a^{\dagger}_{j},a_{j},b_{k},c_{l})\right]_{-} = \frac{\partial f}{\partial a_{i}}\,? $$
I somehow have the intuition that this should be true, but I am unsure how to prove (or disprove) it. Any hints or references in the right direction would be appreciated :)

There may be a fancier way to do it, but the one I know is by induction. Say we have two operators ##p## and ##q##, with ##[q, p] = i##. (Note that strangerep is right: this only works for commutators. I expect there's some analogous thing you can do with anticommutators, but I'm not sure exactly what that might be.)

Focus first on a function ##P_n(p, q)## that consists of a single term, a product of ##n## of these operators. We want to show that ##[q, P_n(p, q)] = i \frac{\partial}{\partial p} P_n(p, q)##. The base case is terms of length 1:
$$[q, q] = 0 = i \frac{\partial}{\partial p} q, \qquad [q, p] = i = i \frac{\partial}{\partial p} p.$$
So ##[q, P_1(p, q)] = i \frac{\partial}{\partial p} P_1(p, q)## for all terms ##P_1(p, q)## of length 1. For the induction step, we handle terms of length ##n+1## (letting ##x## denote either ##p## or ##q##):
$$[q, xP_n(p, q)] = x[q, P_n(p, q)] + [q, x]P_n(p, q) = i x \frac{\partial}{\partial p} P_n(p, q) + i \left(\frac{\partial}{\partial p} x\right)P_n(p, q) = i \frac{\partial}{\partial p}\big(xP_n(p, q)\big).$$
Therefore ##[q, P_{n+1}(p, q)] = i \frac{\partial}{\partial p} P_{n+1}(p, q)## for all terms ##P_{n+1}(p, q)## of length ##n+1##, and the induction is complete. Since the commutator is also linear, what is true for one polynomial term is true for any sum of polynomial terms. That means the theorem holds for any polynomial, as well as for any other function equal to its Taylor expansion. The same process shows ##[p, P(p, q)] = -i\frac{\partial}{\partial q} P(p, q)##.
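The induction step can be spot-checked symbolically by realizing ##q## as multiplication by ##x## and ##p## as ##-i\,d/dx## on a test function, and comparing ##[q, pqp]\psi## with ##i\,\partial_p(pqp)\psi = i(qp + pq)\psi##, the derivative taken position by position as in the proof. A small sympy sketch (the helper names are my own):

```python
import sympy as sp

x = sp.symbols('x', real=True)
psi = sp.Function('psi')(x)
I = sp.I

Q = lambda f: x * f                  # q: multiplication by x
P = lambda f: -I * sp.diff(f, x)     # p: -i d/dx

def apply_word(ops, f):
    # apply a product of operators (leftmost acts last) to a test function
    for op in reversed(ops):
        f = op(f)
    return f

# check [q, pqp] psi = i (qp + pq) psi
lhs = Q(apply_word([P, Q, P], psi)) - apply_word([P, Q, P], Q(psi))
rhs = I * (apply_word([Q, P], psi) + apply_word([P, Q], psi))
print(sp.simplify(lhs - rhs))  # 0
```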

Note that since the PB is also a derivation (as defined above), this same proof works for it, so showing that the commutator of ##p## and ##q## equals their PB is sufficient to show that the same holds for arbitrary functions of ##p## and ##q##. You can also use the same proof to derive the Heisenberg operator equation: if ##\frac{\partial H}{\partial q} = -\dot{p}## and ##\frac{\partial H}{\partial p} = \dot{q}##, then ##[p, H] = i \dot{p}## and ##[q, H] = i \dot{q}##, and the same induction trick then proves that ##[F(p, q), H] = i\dot{F}(p, q)## for any function ##F## of ##p## and ##q##.
 
Chopin said:
I expect there's got to be some kind of analogous thing you can do with anticommutators, but I'm not sure exactly what that might be.
For any given fermionic mode ##a_i## we have:
$$a_i^2 ~=~ 0 ~=~ (a^\dagger_i)^2 ~.$$
Hence the possible functions of a set of fermionic modes are rather more restricted. Evaluating the anticommutator is then (mostly) an exercise in counting the minus signs. :biggrin:
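The nilpotency ##a_i^2 = 0 = (a^\dagger_i)^2## is easy to confirm numerically in a Jordan-Wigner-style two-mode matrix representation (used here purely as an illustration):

```python
import numpy as np

# two fermionic modes on a 4-dim Fock space (Jordan-Wigner style)
a  = np.array([[0, 1], [0, 0]], dtype=complex)
I2 = np.eye(2)
Z  = np.diag([1.0, -1.0])
a1 = np.kron(a, I2)
a2 = np.kron(Z, a)

for m in (a1, a2):
    assert np.allclose(m @ m, 0)                    # a_i^2 = 0
    assert np.allclose(m.conj().T @ m.conj().T, 0)  # (a_i^dagger)^2 = 0
print("nilpotency holds for both modes")
```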
 