Understanding Operator Dispersion in Sakurai's "Modern Quantum Mechanics"

  • Context: Graduate
  • Thread starter: HubertP
  • Tags: Dispersion, Operator

SUMMARY

The discussion focuses on understanding operator dispersion as defined in Sakurai's "Modern Quantum Mechanics". The operator is defined as \(\Delta A \equiv A - \left\langle A \right\rangle\), where \(A\) is an observable. The dispersion is calculated as \(\sigma^2_A = \left\langle \left( \Delta A \right)^2 \right\rangle\), representing the average squared deviation of measurements from the mean. A common point of confusion arises from subtracting a scalar expectation value from an operator, which is clarified as a standard notational shorthand in quantum mechanics.

PREREQUISITES
  • Understanding of quantum mechanics principles, particularly observables and expectation values.
  • Familiarity with the notation and concepts in Sakurai's "Modern Quantum Mechanics".
  • Knowledge of eigenstates and eigenvalues in quantum systems.
  • Basic proficiency in linear algebra, particularly with matrices and operators.
NEXT STEPS
  • Study the concept of expectation values in quantum mechanics, focusing on their mathematical representation.
  • Explore the derivation of the uncertainty principle and its relation to operator dispersion.
  • Learn about the role of eigenstates and eigenvalues in quantum measurements.
  • Investigate common notational practices in quantum physics literature to enhance comprehension.
USEFUL FOR

Students and researchers in quantum mechanics, particularly those studying operator theory and dispersion in quantum systems. This discussion is beneficial for anyone seeking clarity on the mathematical foundations of quantum observables.

HubertP
I'm trying to get my head around quantum mechanics with the help of Sakurai's "Modern Quantum Mechanics". It's been good so far, but I came across a formula I don't really understand. When discussing the uncertainty relation (in Section 1.4), the author begins by defining an "operator":

$$\Delta A \equiv A - \left\langle A \right\rangle$$

where ##A## is an observable. He then defines the dispersion of ##A## to be the expectation value of the square of this operator: ##\left\langle \Delta A \right\rangle ^ 2##.

I'm pretty sure I understand the concept of observable dispersion correctly, but correct me if I'm wrong: it's the average squared deviation of the measurements from the mean (the variance). Of course, this is all computed for a given state (ket). The results of the same measurement (with the same properly prepared state) performed multiple times will show a certain variance, which is the same as the dispersion we're talking about. Is this ok? I think it is, because I was even able to arrive at the proper formula (which agrees with the author's result) by summing the squared deviations from the 'mean' (the expectation value of the operator ##A##) over all eigenkets of this operator, weighted by the probability of each outcome.
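The two computations described above can be checked numerically. The following is a minimal sketch (not from the thread or from Sakurai), using an illustrative 2x2 Hermitian matrix as the observable and an arbitrary normalized state; it compares the variance obtained from ##\left\langle A^2 \right\rangle - \left\langle A \right\rangle^2## against the sum of squared eigenvalue deviations weighted by outcome probabilities.

```python
import numpy as np

# Illustrative example: a 2x2 Hermitian observable and a normalized state.
# The names A and psi are made up for this sketch, not taken from the book.
A = np.array([[1.0, 0.5],
              [0.5, -1.0]])          # Hermitian, so eigenvalues are real
psi = np.array([0.6, 0.8])           # normalized: 0.36 + 0.64 = 1

mean = psi @ A @ psi                 # <A> = <psi|A|psi>
var_operator = psi @ (A @ A) @ psi - mean**2   # <A^2> - <A>^2

# Same quantity via the eigenbasis: squared deviations of the eigenvalues
# from the mean, weighted by the outcome probabilities |<a_n|psi>|^2.
eigvals, eigvecs = np.linalg.eigh(A)
probs = np.abs(eigvecs.T @ psi) ** 2
var_spectral = np.sum(probs * (eigvals - mean) ** 2)

print(var_operator, var_spectral)    # the two values agree
```

The agreement of the two numbers is exactly the consistency check described in the paragraph above.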

However, what I don't understand is the author's derivation, in particular the definition of this new 'delta operator' - let me write it again:

$$\Delta A \equiv A - \left\langle A \right\rangle$$

How can one subtract the expectation value, which is a number (scalar), from an operator, which is represented by some matrix? This doesn't seem kosher. Is this a common practice? Will I see more examples of such 'flawed' notation? This seems really confusing...
 
It's just short for ##A-\left\langle A \right\rangle\cdot\mathrm{id}##.

HubertP said:
He then defines the dispersion of ##A## to be the expectation value of the square of this operator: ##\left\langle \Delta A \right\rangle ^ 2##.

I think you made a mistake there, since ##\left\langle \Delta A \right\rangle=0##. I guess it's more like ##\sigma^2_A=\left\langle\left( \Delta A\right)^2 \right\rangle##.
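The reply's two points, that the scalar is implicitly multiplied by the identity and that ##\left\langle \Delta A \right\rangle = 0##, can both be seen concretely in a small sketch (my own illustration, with a made-up observable and state):

```python
import numpy as np

# The scalar <A> is promoted to <A> times the identity matrix before the
# subtraction, turning A - <A> into a genuine operator.
A = np.array([[2.0, 1.0],
              [1.0, 0.0]])           # illustrative Hermitian observable
psi = np.array([1.0, 0.0])           # illustrative state |psi>

mean = psi @ A @ psi                 # <A>, a plain number
DeltaA = A - mean * np.eye(2)        # A - <A> * id, a matrix again

# <Delta A> vanishes in the same state, so the dispersion must use the
# square of the operator, not the square of its expectation value.
print(psi @ DeltaA @ psi)            # 0
variance = psi @ (DeltaA @ DeltaA) @ psi   # sigma^2 = <(Delta A)^2>
```

Note that `variance` here equals ##\left\langle A^2 \right\rangle - \left\langle A \right\rangle^2##, the familiar variance formula.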
 
This makes sense now!

To make sure it's consistent, I tried expanding ##\Delta A## in the eigenbasis of ##A##, using ##A = \sum\limits_n a_n \left|a_n\right\rangle \left\langle a_n\right|##, and applying it to a ket vector. I got the correct result (the sum of squared deviations of the eigenvalues from the mean, weighted by probabilities)! This is exciting. Thanks!
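The eigenbasis expansion used above can also be verified numerically. This is a small sketch of my own (the observable is an arbitrary Hermitian example), rebuilding ##A## from its spectral decomposition ##\sum_n a_n \left|a_n\right\rangle \left\langle a_n\right|##:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])               # illustrative Hermitian observable
eigvals, eigvecs = np.linalg.eigh(A)     # columns of eigvecs are the |a_n>

# Rebuild A as the sum of a_n |a_n><a_n| over all eigenpairs.
A_rebuilt = sum(a * np.outer(v, np.conj(v))
                for a, v in zip(eigvals, eigvecs.T))

print(np.allclose(A, A_rebuilt))         # True
```

With the decomposition in hand, applying ##(\Delta A)^2## to a ket and taking the inner product reproduces the weighted sum of squared eigenvalue deviations, as described in the post.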

Of course, I made a typo, writing ##\left\langle \Delta A \right\rangle ^2## instead of ##\left\langle \left( \Delta A \right)^2 \right\rangle##. Thanks for pointing this out.
 
