I am reading a book about quantum mechanics, and the author is trying to show how the uncertainty principle can be obtained from the Schwarz inequality.

Without going into the whole thing, there is one point where I get stuck following what he is doing.

We have an operator called $\Delta A$.

We start with
$$\Delta A = A - \langle A \rangle \tag{1}$$

From this we get:

$$\langle (\Delta A)^2 \rangle = \langle A^2 + \langle A \rangle^2 - 2A\langle A \rangle \rangle \tag{2}$$

I can just about live with that. But then he says: taking the expectation value of the last term in this equation, you get this result:

$$\langle (\Delta A)^2 \rangle = \langle A^2 \rangle - \langle A \rangle^2 \tag{3}$$
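For what it's worth, (3) does check out numerically. Here is a small sketch (my own, not from the book) using a random Hermitian matrix as the observable $A$ and a random normalized state vector: it compares $\langle (\Delta A)^2 \rangle$ computed directly from (1) against $\langle A^2 \rangle - \langle A \rangle^2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 4x4 Hermitian "observable" A (illustrative choice, not from the book)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2

# Random normalized state vector psi
psi = rng.standard_normal(4) + 1j * rng.standard_normal(4)
psi /= np.linalg.norm(psi)

def expval(op):
    """Expectation value <psi|op|psi>; real for a Hermitian op, up to rounding."""
    return (psi.conj() @ op @ psi).real

mean_A = expval(A)                        # <A>, an ordinary number
DA = A - mean_A * np.eye(4)               # DA = A - <A>, equation (1)

lhs = expval(DA @ DA)                     # <(DA)^2>
rhs = expval(A @ A) - mean_A**2           # <A^2> - <A>^2, equation (3)

print(np.isclose(lhs, rhs))  # True: the two sides agree
```

So the identity itself holds; it's only the step from (2) to (3) that the book skips over.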

I just can't see how he goes from (2) to (3). I have checked and found the same thing written by another author. That author's version of (2) is slightly different, but only in the way he puts various brackets around things, and then he too jumps to the same conclusion, giving no explanation of how it was done.

Can anyone help? Thanks, Pete

**Physics Forums | Science Articles, Homework Help, Discussion**


# Slightly confusing equation
