Markel said:
ok, thank you.
and how exactly does it work with an operator in between?
<n|H|n>, for example. Does the operator act on the ket, and then I take the inner product?
Also, what does this mean? What am I calculating in this situation?
Thanks for your help, everyone.
One nice aspect of this notation is that its apparent ambiguity actually expresses a mathematical equivalence.
To begin with, the kets |n> reside in the Hilbert space in question (call it X), so they are vectors of that space X. Now define the dual space as the set of linear functionals, i.e. linear maps from vectors to numbers. A given linear functional (dual vector) is determined uniquely once you know to what number it maps each basis element, so the dual space has the same dimension as the original space. We call this space of linear functionals the dual space X* of X. Its elements are what the bras represent in bra-ket notation.
Next we see that defining an inner product on the space (and the corresponding inner product on the dual space) is equivalent to defining an adjoint operation mapping vectors to dual vectors and vice versa.
You can start with an inner product and define the adjoint of |x> as the bra <x|, i.e. the functional that maps |y> to the inner product: <x| acting on |y> = InnerProd(|x>, |y>). Or you can start with the adjoint and define the inner product of two vectors as the adjoint of one acting on the other: InnerProd(|x>, |y>) = <x| acting on |y>. The two definitions are equivalent, and we understand <x|y> to mean either.
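As a quick illustration (a minimal numpy sketch; the vectors here are just made-up examples), both routes give the same number: numpy's vdot conjugates its first argument, which is exactly "the adjoint of |x> acting on |y>".

```python
import numpy as np

# Two kets in C^3, written as 1-D arrays of coefficients (arbitrary examples).
x = np.array([1 + 1j, 0, 2j])
y = np.array([3, 1j, -1])

# Start from the inner product: InnerProd(|x>, |y>) conjugates the first slot.
inner = np.vdot(x, y)            # sum of conj(x_i) * y_i

# Start from the adjoint: <x| is the conjugate (transpose) of |x>, then act on |y>.
bra_x = x.conj()
adjoint_way = bra_x @ y

print(inner, adjoint_way)        # the same complex number either way
assert np.isclose(inner, adjoint_way)
```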
Now you can see this made concrete by considering matrices and row and column vectors. Column vectors are the "vectors" or "kets" and row vectors are the "dual vectors" or "bras". A row vector times a column vector gives you a number (a 1x1 matrix), and you can see the symmetry between vectors and dual vectors in transposition. Indeed, when you expand a vector in a Hilbert space basis you generally collect the coefficients into a column vector.
You then find the adjoint of a "ket" is the "bra" corresponding to the conjugate transpose of the column matrix: a row matrix of coefficients with respect to the dual basis.
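Here is the same idea as a small numpy sketch (the coefficients are arbitrary examples): a ket as a column matrix, its bra as the conjugate transpose, and the bra times the ket giving a 1x1 matrix holding the inner product.

```python
import numpy as np

# A ket as a column matrix of coefficients in some chosen basis.
ket = np.array([[1 + 2j],
                [0],
                [3j]])

# The corresponding bra is the conjugate transpose: a row matrix of
# coefficients for the dual basis.
bra = ket.conj().T

print(bra.shape)        # (1, 3): a row matrix
print(bra @ ket)        # (1, 1) matrix holding <x|x>, here the squared norm
```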
Finally, you can then consider something like <a|B|c> as a product of three matrices: a row matrix times a square matrix times a column matrix.
Matrix multiplication is associative, so it doesn't matter which product you take first:
(<a|B)|c> = <a|(B|c>). This carries through to more general operator algebras.
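A short numpy sketch of that associativity (the particular matrices are arbitrary examples): evaluating <a|B|c> as (<a|B)|c> or as <a|(B|c>) gives the same number.

```python
import numpy as np

bra_a = np.array([[1, 1j, 0]])          # row matrix  <a|
B     = np.array([[0, 1, 0],            # square matrix (an operator)
                  [1, 0, 1j],
                  [0, -1j, 2]])
ket_c = np.array([[2], [0], [1j]])      # column matrix |c>

left_first  = (bra_a @ B) @ ket_c       # (<a|B) |c>
right_first = bra_a @ (B @ ket_c)       # <a| (B|c>)

print(left_first, right_first)          # the same 1x1 matrix either way
assert np.allclose(left_first, right_first)
```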
(Technically the vectors and dual vectors can be defined as left and right ideals of the operator algebra, so they are defined within the algebra itself, which importantly is an associative algebra. Picture a column vector replaced by a square matrix whose first column is the column vector and whose other columns are zero. It acts just like the original vector under left multiplication by operators. Do the same with row vectors as the first row of an otherwise zero square matrix. This means that such multiple products are associative by construction. Working with ideals lets us show this associativity (and possibly prove other properties) easily, but once that's done we just keep the vectors, dual vectors, and operators as separate objects.)
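If you want to see that embedding concretely, here is a rough numpy sketch (purely illustrative): put the column vector into a square matrix whose first column is the vector and whose other entries are zero, and check that left multiplication by an operator acts on it exactly as it acts on the original column vector.

```python
import numpy as np

v = np.array([[1 + 1j], [2], [0]])      # original ket as a column matrix

# Embed it in the operator algebra: first column = v, remaining columns zero.
V = np.zeros((3, 3), dtype=complex)
V[:, 0] = v[:, 0]

B = np.array([[0, 1, 0],
              [1, 0, 1j],
              [0, -1j, 2]])             # some operator (arbitrary example)

# Left multiplication by B acts on V just as it acts on v:
# the first column of B @ V is B @ v, and the other columns stay zero.
print(B @ V)
print(B @ v)
assert np.allclose((B @ V)[:, [0]], B @ v)
```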
Well, that's the 50-cent introduction to the linear algebra of quantum mechanics. I hope you find it helpful.