Shankar Questions About Quantum Mechanics (Schwarz Inequality)

SUMMARY

The discussion centers on the complexities of quantum mechanics as presented in Shankar's book, specifically regarding the completeness relation in Hilbert spaces and the Schwarz inequality. Participants clarify the multiplication of kets and inner products, emphasizing that the inner product results in a scalar that scales the ket. The conversation also touches on the challenges of understanding axioms and proofs related to linearity and sesquilinearity in quantum mechanics, particularly in the context of self-adjoint operators and their spectra.

PREREQUISITES
  • Understanding of Hilbert spaces and their properties
  • Familiarity with Dirac notation, including kets and bras
  • Knowledge of self-adjoint operators in quantum mechanics
  • Basic concepts of linear algebra and inner products
NEXT STEPS
  • Study the completeness relation in Hilbert spaces
  • Learn about self-adjoint operators and their spectra in quantum mechanics
  • Explore the concept of sesquilinearity in scalar products
  • Review the Schwarz inequality and its applications in quantum mechanics
USEFUL FOR

Students of quantum mechanics, particularly those using Shankar's textbook, as well as educators and anyone seeking to deepen their understanding of linear algebra in the context of quantum theory.

Dr_Pill
Hello there,

I'm studying QM with Shankar's book.

I'm wrestling my way through the linear algebra now and I have some questions.

Let me start with this one:

[attached image: aajHi.jpg]


I have absolutely no idea where this is coming from or what does it mean.

I don't know how to multiply a ket with an inner product...

Thx in advance
 


This is the completeness relation for a bunch of vectors, usually a complete orthonormal set of Hilbert-space basis vectors $|i\rangle$. These usually occur as eigenvectors of a self-adjoint operator.

Usually you need a slight extension of this concept, namely the case of unbounded, essentially self-adjoint operators, which can have a continuous spectrum (like the position "eigenkets", which are not Hilbert-space vectors but belong to a larger space, i.e., the dual space of the domain of the position and momentum operators). In this case the sum goes over into an integral,
$$|V\rangle = \int_{\mathbb{R}^3} \mathrm{d}^3\vec{x}\; |\vec{x}\rangle\langle\vec{x}|V\rangle.$$
Sometimes the case also occurs where an operator has both a discrete and a continuous spectrum, e.g., the Hamiltonian for the motion of a particle in the Coulomb potential of another heavy charged particle.
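In the discrete case (presumably the relation in the attached image), this reads
$$\sum_i |i\rangle\langle i| = \mathbb{1}, \qquad |V\rangle = \sum_i |i\rangle\langle i|V\rangle,$$
where each $\langle i|V\rangle$ is just a number, the component of $|V\rangle$ along the basis vector $|i\rangle$.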
 


vanhees71 said:
This is the completeness relation for a bunch of vectors, usually a complete orthonormal set of Hilbert-space basis vectors $|i\rangle$. These usually occur as eigenvectors of a self-adjoint operator.

To expand on vanhees's statement a bit, when we say "completeness," we are talking about a basis (usually an eigenbasis of a self-adjoint operator) for a particular vector space $V$. If we have a complete basis for a particular space (in this case, the $|i\rangle$ collectively span the vector space $V$), then we can express any arbitrary vector $v$ (or function $f$, if we're talking about a function space) as a sum of its components multiplied by the basis vectors. For example, if we're working in $\Re^3$, we can choose as a basis the standard basis vectors $e_1=(1,0,0)$, $e_2=(0,1,0)$, and $e_3=(0,0,1)$. In Dirac notation, these would typically be $|1\rangle$, $|2\rangle$, $|3\rangle$.

Now, given an arbitrary vector in $\Re^3$, we can write it as a sum of $|1\rangle$, $|2\rangle$, $|3\rangle$ with the proper coefficients in front of each basis vector. For example, the vector $u=(2,1,0)$ can be expanded in the form $|u\rangle = \sum_{i=1}^{3} |i\rangle\langle i|u\rangle$ as $|u\rangle = 2|1\rangle + 1|2\rangle + 0|3\rangle$.

The term $\langle i|u\rangle$ in the sum is the inner product of the basis vector with the arbitrary vector $u$. This inner product picks out the component of $u$ in the direction of that particular basis vector $|i\rangle$, and it is just a scalar value. The additional $|i\rangle$ is there because we're expanding the vector as a sum of basis vectors, each multiplied by its component in $u$. So when you multiply an inner product by a ket, you're just scaling the ket. That's all.
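As a minimal numerical sketch of this expansion (assuming numpy is available; the variable names are purely illustrative):

import numpy as np

# Standard basis kets |1>, |2>, |3> of R^3
basis = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]

u = np.array([2.0, 1.0, 0.0])   # the vector |u> from the example above

# |u> = sum_i |i><i|u>: each inner product <i|u> is a plain number
# that scales the corresponding basis ket.
u_rebuilt = sum(ket * np.dot(ket, u) for ket in basis)

print(u_rebuilt)   # [2. 1. 0.] -- the same vector we started with

The inner products are just the components 2, 1, 0, and the sum reassembles the original vector.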

I hope this helps!
 


vanhees71 said:
This is the completeness relation for a bunch of vectors, usually a complete orthonormal set of Hilbert-space basis vectors $|i\rangle$. These usually occur as eigenvectors of a self-adjoint operator.

Usually you need a slight extension of this concept, namely the case of unbounded, essentially self-adjoint operators, which can have a continuous spectrum (like the position "eigenkets", which are not Hilbert-space vectors but belong to a larger space, i.e., the dual space of the domain of the position and momentum operators). In this case the sum goes over into an integral,
$$|V\rangle = \int_{\mathbb{R}^3} \mathrm{d}^3\vec{x}\; |\vec{x}\rangle\langle\vec{x}|V\rangle.$$
Sometimes the case also occurs where an operator has both a discrete and a continuous spectrum, e.g., the Hamiltonian for the motion of a particle in the Coulomb potential of another heavy charged particle.

Thanks for the explanation, but it's actually a bit too difficult.
I have never seen Hilbert spaces before.

I was just wondering what happens if you multiply $|i\rangle$ with $\langle i|V\rangle$.

Do you get $\langle i|i|V\rangle$?

That is not clear to me.

@jmcelve, I'll read your post later (Christmas dinner); anyway, thanks for the help.
 


Merry Christmas, hohoho!

$\langle i|V\rangle$ is just a number (since you are taking an inner product). So multiplying $|i\rangle$ by it just means that you have $|i\rangle$ times a number.
 


It is just like writing a vector in terms of its basis. Since the basis vectors are in general infinite in number, it is called a Hilbert space.
 


Thx for the help guys.

However now I'm stuck with this proof:

[attached image: SenOx.jpg — Shankar's proof of the Schwarz inequality, Eqs. 1.3.17–1.3.19]



Apply axiom 1(i), they say, but there is no axiom labelled that way, so I don't even know which axiom to apply.

If you could explain 1.3.18 a bit more, that would be great, because I completely do not understand it.

Sorry for the questions (it's not homework, it's self-study).

Thx in advance.
 


Dr_Pill said:
Apply axiom 1(i), they say, but there is no axiom labelled that way, so I don't even know which axiom to apply.
While not labeled as such, I assume they mean the axioms bullet-listed on page 8 (in my copy). The first one is skew-symmetry, which is what is invoked here.
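For reference, the skew-symmetry axiom (the first one in that bullet list, as far as I recall; up to notation) reads
$$\langle V|W\rangle = \langle W|V\rangle^*,$$
and the other ingredient used in the proof is linearity in the ket, $\langle V|\,(a|W\rangle + b|Z\rangle) = a\langle V|W\rangle + b\langle V|Z\rangle$, which together with skew-symmetry gives antilinearity in the bra.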
 


But why are there 3 terms in the first step and 4 terms in the second step at 1.3.18?

[attached image: pOsyZ.jpg]


Why is this not legit?
 
  • #10


Dr_Pill said:
But why are there 3 terms in the first step and 4 terms in the second step at 1.3.18?
What 3 terms?
 
  • #11


Doc Al said:
What 3 terms?

Next to $\langle Z|Z\rangle$?
 
  • #12


Dr_Pill said:
Next to $\langle Z|Z\rangle$?
I see 4 terms, not 3.
 
  • #13


I really don't understand 1.3.18, Doc Al.

The proof in Griffiths is totally different, and that one I understand, but this one in Shankar, not a single clue :(.

I don't even understand how you get the $\langle Z|Z\rangle$ out of 1.3.17, sigh.
 
  • #14
Proof Schwarz-Inequality help

Hi there,

I'm reading Shankar, but I'm really stuck on this one:

http://i.imgur.com/SenOx.jpg

The equations at 1.3.18 are complete gibberish to me.

Maybe somebody can explain this?
 
  • #15


In the first step, 1.3.17 is used to replace $Z$, and the second step uses linearity (more precisely: sesquilinearity) of the scalar product. With the help of 1.3.19, you can see that this has to be real and at least 0.
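Written out (my reading of 1.3.17, up to notation), the vector defined there is
$$|Z\rangle = |V\rangle - \frac{\langle W|V\rangle}{|W|^2}\,|W\rangle,$$
and 1.3.18 is simply the statement $\langle Z|Z\rangle \ge 0$ (every vector has a non-negative norm), with this expression for $|Z\rangle$ substituted in and multiplied out.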
 
  • #16


mfb said:
In the first step, 1.3.17 is used to replace $Z$, and the second step uses linearity (more precisely: sesquilinearity) of the scalar product. With the help of 1.3.19, you can see that this has to be real and at least 0.

And why is this wrong:

[attached image: pOsyZ.jpg]

What is $\langle Z|$?
 
  • #17


Dr_Pill said:
I really don't understand 1.3.18, Doc Al.

The proof in Griffiths is totally different, and that one I understand, but this one in Shankar, not a single clue :(.

I don't even understand how you get the $\langle Z|Z\rangle$ out of 1.3.17, sigh.
The first line just expresses $\langle Z|Z\rangle$ in terms of what $Z$ is defined as in 1.3.17. Then just multiply it out. You get four terms (not three; I don't know where you got three from). Do you understand where each of the four terms comes from?
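Assuming $|Z\rangle = |V\rangle - \frac{\langle W|V\rangle}{|W|^2}|W\rangle$ as above (my reconstruction of 1.3.17), multiplying out gives the four terms
$$\langle Z|Z\rangle = \langle V|V\rangle \;-\; \frac{\langle V|W\rangle\langle W|V\rangle}{|W|^2} \;-\; \frac{\langle W|V\rangle\langle V|W\rangle}{|W|^2} \;+\; \frac{\langle V|W\rangle\langle W|V\rangle}{|W|^4}\,\langle W|W\rangle \;\ge\; 0,$$
one term for each pairing of the two pieces of the bra with the two pieces of the ket.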
 
  • #18


The second step I got (just worked it out); I now get that you have to complex conjugate it.

The first step I don't get: how you get $\langle Z|$.
 
  • #19


Now I understand where the 4 terms come from. It was the formula for antilinearity :)

But I don't know how you get $\langle Z|Z\rangle$.

I think $\langle Z|$ = http://i.imgur.com/pOsyZ.jpg (second line).

I know I'm doing something impossible here, but not what. Working with bra-kets for the first time ever is very, very confusing.

In other words: I think you have to reverse the left $Z$ in $\langle Z|Z\rangle$, and thus also the equation in 1.3.17, because it's a bra, while the right $Z$ is a ket.
 
  • #20


Moderator's note: I merged the two threads. (Once is enough!)​
 
  • #21
@Dr_Pill: Your $\langle Z|$ is correct apart from a wrong sign for both parts, but you cannot write $\langle Z|Z\rangle$ by copying every single character.

Consider $a = b + c$ (real numbers, if you like). Then
$$a \cdot a \ne b + cb + c = b + c + cb \quad \text{(wrong)}$$
$$a \cdot a = (b+c)(b+c) = b^2 + bc + cb + c^2 \quad \text{(right)}$$
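Concretely (still assuming the form of 1.3.17 written above): turning the ket into a bra conjugates the coefficient, so
$$\langle Z| = \langle V| - \frac{\langle W|V\rangle^*}{|W|^2}\,\langle W| = \langle V| - \frac{\langle V|W\rangle}{|W|^2}\,\langle W|;$$
the denominator $|W|^2 = \langle W|W\rangle$ is real, so only the numerator picks up the complex conjugation.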
 
  • #22
@Doc Al, sorry.

@mfb, OK, I get it.

as for 1.3.19 (almost there)

Is this correct:
[attached image: l2Zfd.jpg]


I have a feeling it's not. How do I get rid of the term with $|W|^4$ in the denominator?
 
  • #23
$$\frac{\langle W|W\rangle}{|W|^4}=\frac{1}{|W|^2}$$
This way, you get rid of two terms in your first line and you don't have to "forget" the 2 afterwards ;).

$$\langle W|V\rangle\langle V|W\rangle = \langle V|W\rangle^*\,\langle V|W\rangle = |\langle V|W\rangle|^2$$

This can be used to get $\langle V|V\rangle\,\langle W|W\rangle \ge |\langle V|W\rangle|^2$.
Take the square root, and you are done.
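As a quick numerical sanity check of that last inequality (a sketch assuming numpy; the vectors and names are arbitrary):

import numpy as np

rng = np.random.default_rng(0)

# Two random complex vectors |V> and |W>
V = rng.standard_normal(5) + 1j * rng.standard_normal(5)
W = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# np.vdot conjugates its first argument, so vdot(V, W) = <V|W>
lhs = abs(np.vdot(V, W)) ** 2                    # |<V|W>|^2
rhs = np.vdot(V, V).real * np.vdot(W, W).real    # <V|V> <W|W>

print(lhs <= rhs)   # True: the Schwarz inequality holds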
 
  • #24
Indeed! Thanks, now it's obvious!

$\langle W|W\rangle$ would be 1 if they were normalized wave functions, right? Guess I was a bit confused.
 
