QM: Formalism and bases

Niles

1. The problem statement, all variables and given/known data
Hi all.

Let's say that we are looking at an operator Q and two states |1> and |2>. The states |1> and |2> are not eigenstates of Q, but of some other operator H.

My question: Are <1|Q|1>, <1|Q|2>, <2|Q|1> and <2|Q|2> independent of the basis used, as long as Q, |1> and |2> are expressed in the same basis?

Sincerely,
Niles.


Redbelly98

Staff Emeritus
Homework Helper
Yes.
Those four terms are the matrix elements of Q with respect to the basis {|1>, |2>}; they come out the same even when Q, |1>, and |2> are expressed using a different basis.

Niles

Thanks for replying. I tried constructing an example myself to get a full understanding of what is happening here, but I cannot get it to work.

I am looking at a matrix (i.e. the operator Q) and a vector (i.e. |1>) in the basis E spanned by (1,0) and (0,1). They are given by:

$$Q = \left( {\begin{array}{*{20}c} 2 & 0 \\ 3 & 1 \\ \end{array}} \right) \quad \text{and}\quad \left| 1 \right\rangle = \left( {\begin{array}{*{20}c} 1 \\ 2 \\ \end{array}} \right).$$

These have just been chosen randomly. Now we compute the above:

$$\left\langle 1 \right|Q\left| 1 \right\rangle_{\text{in E}} = \left( {\begin{array}{*{20}c} 1 & 2 \\ \end{array}} \right)\left( {\begin{array}{*{20}c} 2 & 0 \\ 3 & 1 \\ \end{array}} \right)\left( {\begin{array}{*{20}c} 1 \\ 2 \\ \end{array}} \right) = 2.$$

Now, according to the above, this result is supposed to be independent of the basis used. I now wish to use the eigenbasis of Q, where the transition matrix from the basis E to the eigenbasis of Q is (the inverse of the matrix whose columns are the eigenvectors):

$$S = \left( {\begin{array}{*{20}c} {1/3} & 0 \\ 1 & 1 \\ \end{array}} \right)^{ - 1}.$$

This gives me:

$$\left| 1 \right\rangle _{in\,\,Q} = \left( {\begin{array}{*{20}c} {1/3} & 0 \\ 1 & 1 \\ \end{array}} \right)^{ - 1} \left( {\begin{array}{*{20}c} 1 \\ 2 \\ \end{array}} \right) = \left( {\begin{array}{*{20}c} 3 \\ { - 1} \\ \end{array}} \right),$$

and thus:

$$\left\langle 1 \right|Q\left| 1 \right\rangle _{\text{in eigenbasis of Q}} = \left( {\begin{array}{*{20}c} 3 & { - 1} \\ \end{array}} \right)\left( {\begin{array}{*{20}c} 2 & 0 \\ 0 & 1 \\ \end{array}} \right) \left( {\begin{array}{*{20}c} 3 \\ { - 1} \\ \end{array}} \right) = 19.$$

But this should equal 2, because the result is supposed to be independent of the basis. This is just standard linear algebra and a change of basis, so the error cannot be in the example itself. Have I misunderstood something?
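The computation above can be cross-checked numerically. This is a sketch in Python/NumPy, using the matrices from this post; the eigenvector matrix V is written down by hand from the eigenvectors of Q. (It also exposes the arithmetic slip here: the basis-E value is actually 12, not 2, as ice109 points out below.)

```python
import numpy as np

Q = np.array([[2.0, 0.0],
              [3.0, 1.0]])
x = np.array([1.0, 2.0])           # |1> in the basis E

exp_E = x @ Q @ x                  # <1|Q|1> computed in the basis E

# Columns of V are the eigenvectors of Q (for eigenvalues 2 and 1):
V = np.array([[1/3, 0.0],
              [1.0, 1.0]])
b = np.linalg.inv(V) @ x           # the ket |1> in the eigenbasis: (3, -1)
D = np.diag([2.0, 1.0])            # Q in its eigenbasis

# Naive attempt: use the transpose of the transformed ket as the bra.
naive = b @ D @ b

print(exp_E, naive)                # the two values disagree (12 vs 19)
```

So the mismatch is real, and the culprit is the naive transpose in the last step.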


Redbelly98

I think Q needs to be symmetric in order for its eigenvectors to be orthogonal.

Niles

I have just tried with a symmetric matrix, and it does not match either.

ice109

I can't quite make it sit right with me why this is so, but if you carry out the multiplication in the correct order, it works.

First of all, your arithmetic is wrong: <1|Q|1> = 12, not 2. Then, if we perform a similarity transformation on Q, i.e. diagonalize it and represent it in its eigenbasis, we get

<1|$VDV^{-1}$|1>, but $(\langle 1|V)^{T} \neq V^{-1}|1\rangle$. That really is weird to me, because you're right: since $V^{-1}|1\rangle$ is just |1> written in the other basis, it seems like the left factor should simply be its transpose.


Hurkyl

Staff Emeritus
Gold Member
The problem is that you transformed the coordinate representation of $\langle 1 |$ incorrectly. If the 'change of basis' transformation for coordinate vectors representing a ket is given by

$$v \mapsto T v$$

then the corresponding change of basis transformation for a bra is

$$w \mapsto w T^{-1}$$

Only in the case of a unitary transformation ($T^{-1} = T^*$) can you compute the new coordinates on a bra by simply taking the conjugate transpose of the new coordinates for the ket.

To put it differently, the coordinate representation of $| \psi \rangle^*$ is not the conjugate transpose of the coordinate representation of $\langle \psi |$, except in the special case where you are expressing everything in terms of an orthonormal basis.
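A minimal numerical sketch of these transformation rules (Python/NumPy; T is an arbitrary invertible, non-unitary matrix chosen purely for illustration): transforming the ket with T and the bra with T^{-1} leaves the pairing unchanged.

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])         # invertible, but not unitary
v = np.array([1.0, 2.0])           # coordinates of a ket (column vector)
w = np.array([4.0, -1.0])          # coordinates of a bra (row vector)

v_new = T @ v                      # ket:  v -> T v
w_new = w @ np.linalg.inv(T)       # bra:  w -> w T^{-1}

pairing_old = w @ v                # the scalar <w|v> in the old coordinates
pairing_new = w_new @ v_new        # the same scalar in the new coordinates
```

Here pairing_new equals pairing_old even though T is not unitary; what fails for non-unitary T is only the shortcut of obtaining w_new by transposing v_new.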

ice109

Yeah, that's what I discovered. To be a little clearer:

If the columns of V are the eigenvectors of Q, spanning the space, and x is the arbitrary vector that we wish to represent in this eigenbasis, then we have to solve

$x = Vb$, with b the representation of x in the eigenbasis; hence $b = V^{-1}x$, since the eigenvectors span the space and V is therefore invertible. But then the bra is $x^T = (Vb)^T = b^T V^T$, which is not simply $b^T$.

I think this is where it becomes relevant that transposition is actually a mapping to the dual space, not just a juggling of the symbolic representation of the kets. My professor in linear algebra had no idea why I would want to think about transposition as a mapping to the dual space instead of just "flipping" the vector.
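To see this concretely with the numbers from Niles' example (a sketch in Python/NumPy; the matrices are copied from the earlier post): the ket transforms with V^{-1}, the bra transforms with V, and with both rules applied the expectation value is the same in either basis.

```python
import numpy as np

Q = np.array([[2.0, 0.0],
              [3.0, 1.0]])
x = np.array([1.0, 2.0])
V = np.array([[1/3, 0.0],
              [1.0, 1.0]])         # columns: eigenvectors of Q
D = np.diag([2.0, 1.0])            # Q expressed in its own eigenbasis

ket = np.linalg.inv(V) @ x         # ket in the eigenbasis: transforms with V^{-1}
bra = x @ V                        # bra in the eigenbasis: transforms with V,
                                   # the inverse map, per Hurkyl's rule w -> w T^{-1}

value_eigen = bra @ D @ ket
value_E = x @ Q @ x                # both give the same number
```

Note that bra is not the transpose of ket here, precisely because V is not orthogonal.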

Niles

Ahh, yes. I forgot to normalize the eigenvectors of Q and the ket |1>, since we should always be working in an orthonormal basis.

Thanks to all three of you for helping me.

ice109

No, no: they have to be orthogonal and normalized. I say orthogonal first because you can always normalize an eigenbasis, but the eigenbasis is not always orthogonal. In fact, the eigenbasis here is not orthogonal, so simply "flipping" the eigenvectors will never work for this matrix: it is not unitary/Hermitian/symmetric etc.

Niles

$x = Vb$, hence $b = V^{-1}x$; but then the bra is $x^T = (Vb)^T = b^T V^T$.
I am a little unsure of how you do this. We have to find the transformed ket and the transformed bra (i.e. the ket and the bra in the new basis). The transformed ket is straightforward, so that is OK.

Now the transformed bra is a little tricky, as you write here. We have (I am leaving out complex conjugation):

$$x^{T} = (Vb)^T = b^T V^T.$$

What I cannot see is why I cannot simply transpose x in the first place, regardless of the basis we are using.

ice109

I need Hurkyl to confirm this, but transposition isn't simply turning the vector 90 degrees: it is a map from a vector space to a second vector space, called the dual space of the first.

Hurkyl

Transpose sort of has a double meaning. (Actually, more, but I won't get into that....)

The big idea is that when you have an inner product on vectors, for any vector v you can define a linear function

$$f_v(w) := \langle v, w \rangle$$

This makes $f_v$ a linear functional. In bra-ket notation, v is a ket, and $f_v$ is a bra, and you just write $f_v$ as $\langle v \mid$.

The other idea is the simple one ice109 described as "rotating the vector 90 degrees". This is certainly a transpose operation too. But, a priori, there's no reason it should be connected with the one I described above.

But they are connected; the conjugate transpose (second meaning) is exactly the same as transposing (first meaning) with respect to the standard inner product on n-tuples.

When you are representing everything with respect to an orthonormal basis, the inner product for your vector space becomes the standard inner product on n-tuples. And so, in this special case, you can compute the coordinate vector of a bra by applying the conjugate transpose to the coordinate vector of the corresponding ket.
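A sketch of this special case in Python/NumPy (the symmetric matrix here is chosen purely for illustration): for a real symmetric A, `numpy.linalg.eigh` returns an orthonormal eigenbasis, and then the transpose of the transformed ket really is the transformed bra.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])         # symmetric, so an orthonormal eigenbasis exists
x = np.array([1.0, 2.0])

evals, V = np.linalg.eigh(A)       # columns of V are orthonormal, so V^{-1} = V^T
D = np.diag(evals)
b = V.T @ x                        # ket in the eigenbasis

value_eigen = b @ D @ b            # transpose of the new ket used directly as the bra
value_orig = x @ A @ x             # both give the same number
```

This is exactly why Niles' non-symmetric Q failed: its eigenvector matrix is not orthogonal, so V^{-1} and V^T differ.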

ice109

So how does this break down for our problem? In taking the product $x^T Q x$, are we evaluating the inner product of x with itself with respect to a nonstandard inner product, a nonstandard metric, namely Q?

I realize I've jumbled it a bit, now that I reread the last paragraph, but if you could straighten out that first statement, it would probably be instructive for me.

If we don't represent everything with respect to an orthonormal basis, then what is the inner product?

Hurkyl

Um... let's see.

In the old coordinates, the inner product is given by

$$\langle u, v\rangle = u^* v$$

If T is the change-of-basis matrix, then in the new coordinates, we must have

$$\langle \mu, \nu \rangle = \langle T^{-1} \mu, T^{-1} \nu \rangle = \mu^* (T^*)^{-1} T^{-1} \nu = \mu^* (T T^*)^{-1} \nu$$

and the ket-to-bra transformation has coordinate representation

$$\nu \mapsto \nu^* (T T^*)^{-1}$$

(Hopefully it's obvious, but I'm using u,v to denote coordinate vectors relative to the old basis, and $\mu, \nu$ to denote coordinate vectors relative to the new basis)
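Numerically, this looks as follows (a sketch with a real, non-orthogonal T chosen for illustration, so that conjugation reduces to transposition): the standard inner product of the old coordinate vectors equals the new-coordinate expression with the $(TT^*)^{-1}$ "metric" in the middle.

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])         # real, invertible, not orthogonal
u = np.array([1.0, 2.0])           # old coordinates
v = np.array([3.0, -1.0])

mu = T @ u                         # new coordinates
nu = T @ v

inner_old = u @ v                  # standard inner product in the old basis
metric = np.linalg.inv(T @ T.T)    # (T T^*)^{-1}, with * = transpose for real T
inner_new = mu @ metric @ nu       # same number, computed in the new coordinates
```

For orthogonal (unitary) T the metric collapses to the identity, recovering the standard inner product in the new coordinates as well.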


Niles

I must admit I am a little confused now: you are multiplying a matrix by a row vector from the left?

But I think my original question has been answered: the four numbers <1|Q|1>, <1|Q|2>, <2|Q|1> and <2|Q|2> give the same result no matter which basis we are in, so the basis does not have to be orthogonal, and it does not matter whether Q is Hermitian. The only thing I have to be careful about is that when Q is not Hermitian, the transformed bra is not the transpose of the transformed ket, but has to be found using the method described above. Correct?


Niles

OK, of course it is a bra, and not a ket.

Generally we have:

$$x^{T} = (Vb)^T = b^T V^T.$$

I am using the matrix from the example above, and to find the corresponding bra I use this expression. But it still gives me the wrong answer. Then I tried to use

$$w \mapsto w T^{-1},$$

and this worked, but it isn't supposed to, because you told me that this holds only when the matrix A is Hermitian, which it is not in this case.


ice109

You've misunderstood somewhere: the first case works if A is Hermitian; the second case is the general one.

Niles

Great, then it is not strange that it works. But how does one show the general case, i.e. how does one show the following?

$$w \mapsto w T^{-1}$$

Hurkyl

The easiest way is to use the fact that the product wv is a scalar, and thus invariant under coordinate changes.
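Spelled out in the same notation: the ket transforms as $v \mapsto Tv$, and demanding that the scalar be unchanged for every ket forces the bra's transformation rule:

$$w'(Tv) = wv \;\;\text{for all } v \quad\Longrightarrow\quad w'T = w \quad\Longrightarrow\quad w' = wT^{-1}.$$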
