What is the Operator <m|n> in the Wave Equation for a Particle in a Box?

Markel
Given the wave equation for a particle in a box, I am supposed to calculate <m|n>, but I'm not sure what this notation means. Can someone give me a definition of this operator?
 
If you look at the solutions for the wave equation of a particle in a box you'll get a series of solutions (in 1 dimension) proportional to sine functions:
\psi_n(x) \propto \sin(n k x)
where k is the wave number (radians per unit distance) of the fundamental mode and n is an integer representing how many half-wavelengths of the particle's wave function are in that box.

This series of solutions forms a basis for general solutions. Thus for brevity we express a basis vector as a "ket" |n>, i.e. |1>, |2>, |3>, ... represent the successively higher-frequency wave functions.

Your task (if you should choose to accept it) is to calculate the inner product between basis vectors <m|n>. It will involve integrating one wave function times the complex conjugate of the other.

Now the answers will depend on the normalization you use when defining your basis. So the question presumes a convention and I suggest you go back and review both the solutions for the particle in a box and the definition of the inner product on solutions.
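To make that concrete, here is a minimal symbolic sketch of the calculation (assuming the usual box wavefunctions \psi_n(x) = \sqrt{2/d}\,\sin(n\pi x/d) on [0, d]; check this against your course's normalization convention):

Code:
import sympy as sp

x, d = sp.symbols('x d', positive=True)

def psi(n):
    # Particle-in-a-box eigenfunction on [0, d] (assumed normalization sqrt(2/d))
    return sp.sqrt(2/d) * sp.sin(n*sp.pi*x/d)

def braket(m, n):
    # <m|n>: integrate psi_m* times psi_n over the box (psi is real, so no conjugate needed)
    return sp.integrate(psi(m)*psi(n), (x, 0, d))

# Gram matrix for the first few modes; expect the identity matrix, i.e. <m|n> = delta_mn
print([[sp.simplify(braket(m, n)) for n in range(1, 4)] for m in range(1, 4)])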
 
I'm given the wave function as:
\psi_n(x) = \sqrt{2/d}\,\sin(n\pi x/d)

I didn't realize this was a basis for my solutions. So it's an infinite-dimensional space?

Also, I'm not sure what inner product to use.

But essentially, this bra-ket notation is just an inner product?

Also, could you recommend a good book for an introduction to quantum mechanics? Our professor doesn't seem to lecture from any book. It's frustrating.
 
The inner product in a Hilbert space (where these wave-functions exist) is:

\langle\psi_1\mid\psi_2\rangle=\int_a^b \psi_1^*\psi_2 dx

Where the * means complex conjugation (not simple multiplication).
 
You're working with the set of square-integrable functions from \mathbb R into \mathbb C. This is a complex infinite-dimensional vector space. You can define a semi-inner product on it by

\langle f,g\rangle =\int_{-\infty}^{\infty}f(x)^*g(x)dx

It's not technically an inner product: suppose, for example, that g is the function that takes the value 1 at 0 and 0 everywhere else. Then <g,g> = 0, even though g isn't the 0 vector (i.e. the function that takes every real number to 0). It's called a semi-inner product or a pseudo-inner product because it satisfies all the other requirements of an inner product.

Edit: Matterwave's answer reminded me that in the case of a particle in a box, it's better to think of the wavefunctions as being defined only on the interior of the box, and always =0 at the edge of the box. You still define the semi-inner product in essentially the same way. You just take the integration limits to be the edges of the box instead of the infinities.
 
ok, thank you.


and how exactly does it work with an operator in between?

<n|H|n> for example. The operator acts on the ket? and then I take the inner product?

Also, what does this mean? What am I calculating in this situation?

Thanks for your help everyone.
 
When u and v are functions, <u|A|v> is defined as <u|Av>. The physical interpretation of <u|H|u> is the expected value of the Hamiltonian in state u, i.e. the average value you will get if you measure the energy of a bunch of particles-in-boxes that have each been prepared in the state u. When u is the nth eigenstate of H, i.e. when H|n> = E_n|n>, we have <u|H|u> = <n|H|n> = E_n<n|n> = E_n (using the normalization <n|n> = 1).
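For the particle in a box this can be spot-checked symbolically. A minimal sketch, assuming the Hamiltonian acts as H = -(\hbar^2/2m)\,d^2/dx^2 inside the box, with the wavefunction quoted earlier:

Code:
import sympy as sp

x, d, hbar, mass = sp.symbols('x d hbar m', positive=True)
n = 2  # any positive integer mode number works here

psi = sp.sqrt(2/d) * sp.sin(n*sp.pi*x/d)        # the box eigenfunction |n>
H_psi = -hbar**2/(2*mass) * sp.diff(psi, x, 2)  # the operator acts on the ket first: H|n>

# <n|H|n> = integral of psi* (H psi) over the box (psi is real here)
print(sp.simplify(sp.integrate(psi * H_psi, (x, 0, d))))
# Expect E_n = n**2 pi**2 hbar**2 / (2 m d**2); for n = 2 this prints 2*pi**2*hbar**2/(d**2*m)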

This post on bra-ket notation may be useful. I'm not a big fan of mixing bra-ket notation with ordinary (semi-)inner products the way your teacher appears to be doing, but it seems a lot of people do it.

Textbook suggestions... Griffiths is fairly standard, and looks good to me, although I have only read a few pages of it. Shankar and Zettili are often mentioned as well. You can search the book forum for detailed comments. I think Isham would be a great supplement to whatever book you pick. (You can read it at the same time.) His book is more focused on what the theory says, and not so much on how to calculate stuff with it.
 
Markel said:
and how exactly does it work with an operator in between? <n|H|n> for example. The operator acts on the ket? and then I take the inner product?

One aspect of this notation is that its apparent ambiguity expresses a mathematical equivalence.

To begin with, the kets |n> reside in the Hilbert space in question (call it X), so they are vectors of that space X. Now you define the dual space as the set of linear functionals, i.e. linear maps taking vectors to numbers. A given linear functional is determined uniquely once you know to what number it maps each basis element, so the dual space has the same dimension as the original space. We call the space of linear functionals the dual space X* of X. Its elements are represented by bras in the bra-ket notation.

Next we see that defining an inner product on the space (and a dual inner product on the dual space) is equivalent to defining an adjoint operation mapping vectors to dual vectors and vice versa.

You can start with an inner product and define the adjoint of |x> as the <x| such that <x| maps |y> to the inner product: <x| of |y> = InnerProd( |x> , |y> ). Or you can start with the adjoint and define the inner product of two vectors as the adjoint of one acting on the other: InnerProd( |x> , |y> ) = <x| of |y>. The two become equivalent, and we understand <x|y> to mean either.


Now you can see this manifest simply by considering matrices and row and column vectors. Column vectors are "vectors" or "kets" and row vectors are "dual vectors" or "bras". A row vector times a column vector gives you a number (a 1x1 matrix), and you can see the symmetry in transposing vectors and dual vectors. Indeed, when you expand a vector in a Hilbert-space basis you generally use a column vector of coefficients.

You then find the adjoint of a "ket" is the "bra" corresponding to the conjugate transpose of the column matrix. That's a row matrix of coefficients for the dual basis.

Finally you can then consider something like <a|B|c> as a product of three matrices, a row times a square times a column matrix.

The matrix multiplication is associative so it doesn't matter which you multiply first.
(<a|B) |c> = <a|( B|c> ). This carries through to more general operator algebras.
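Here is a quick numerical illustration of that associativity, with arbitrary made-up values for the vectors and the operator:

Code:
import numpy as np

a = np.array([1+2j, 0, 3-1j])   # |a> as a column vector
c = np.array([2, 1j, -1])       # |c>
B = np.array([[1, 2j, 0],
              [0, 3, 1],
              [1j, 0, 2]])      # the operator B as a square matrix

bra_a = a.conj()                # <a|: the conjugate transpose of |a>

print((bra_a @ B) @ c)          # (<a|B) |c>
print(bra_a @ (B @ c))          # <a|( B|c> ) -- the same number, by associativity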

(Technically the vectors and dual vectors are defined as left and right ideals of the operator algebra, so they are defined within the algebra itself, which importantly is an associative algebra. Picture a column vector replaced with a square matrix whose first column is the column vector and whose other columns are zero. It acts just like the original vector under left multiplication by operators. Do the same with row vectors as the first row of an otherwise zero square matrix. This means that such multiple products are associative by construction. Working with ideals lets us show this associativity (and possibly prove other properties) easily, but once that's done we just keep things separate.)

Well, that's the 50-cent introduction to the linear algebra of quantum mechanics. I hope you find it helpful.
 
Uhm, if you choose to act on the left instead of the right with the operator, don't you have to take the adjoint of the operator first? I do see that in matrix algebra, associativity holds, but...I've always remembered taking the adjoint if I want to operate on the left instead of the right. Now I'm confused! @_@ (Never thought about this before lol!)

Dang, I used to be good at this, now it's like I've unlearned QM...>.>
 
Matterwave said:
Uhm, if you choose to act on the left instead of the right with the operator, don't you have to take the adjoint of the operator first?

Let's use the notation (f,g) for the inner product, and define f=|u>, g=|v>. We have

\langle u|v\rangle=(f,g)=(g,f)^* =\langle v|u\rangle^*

The expression \langle u|A is defined by \Big(\langle u|A\Big)|v\rangle =\langle u|\Big(A|v\rangle\Big). If we combine this with the previous result, we can show that the bra corresponding to the ket A|v\rangle is \langle v|A^*.

\langle u|\Big(A|v\rangle\Big)=(f,Ag)=(A^*f,g)=(g,A^*f)^*=\Big(\langle v|\Big(A^*|u\rangle\Big)\Big)^* =\Big(\Big(\langle v|A^*\Big)|u\rangle\Big)^*
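That last identity is easy to spot-check numerically. A hedged sketch with random complex vectors and a random operator, writing the adjoint A^* as the conjugate transpose:

Code:
import numpy as np

rng = np.random.default_rng(0)
dim = 4
u = rng.normal(size=dim) + 1j*rng.normal(size=dim)                # |u>
v = rng.normal(size=dim) + 1j*rng.normal(size=dim)                # |v>
A = rng.normal(size=(dim, dim)) + 1j*rng.normal(size=(dim, dim))  # operator A

A_adj = A.conj().T                     # the adjoint A^*

lhs = u.conj() @ (A @ v)               # <u|( A|v> )
rhs = (v.conj() @ (A_adj @ u)).conj()  # ( (<v|A^*) |u> )^*
print(np.isclose(lhs, rhs))            # True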
 
Yeah, I was walking around thinking about this, and I realized that what I was thinking of was first converting the dual vector into a vector and acting on it, then converting it back to a dual vector to take the inner product.
 
Hey, just wanted to say thanks for the replies. I found this helpful.
 
You need to insert a complete set of position eigenkets:

\langle m | n \rangle = \int_{-\infty}^{\infty} dx' \, \langle m | x' \rangle \, \langle x' | n \rangle

and identify \langle x' | n \rangle \equiv \psi_{n}(x') as the wave function in the coordinate representation, and use the rules of brackets:

\langle m | x' \rangle = \langle x' | m \rangle^{\ast}
 