Hermitian conjugate of outer product


Homework Help Overview

The discussion revolves around the concept of the Hermitian conjugate of outer products in quantum mechanics, particularly as described in Sakurai's Modern Quantum Mechanics. Participants explore the nature of outer products as operators and the implications of taking their adjoint, questioning the relationship between kets, bras, and operators.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss the definition of the adjoint and the complexities involved in transforming kets and bras. There is exploration of the dual correspondence principle and its implications for the relationships between kets and bras when considering operators.

Discussion Status

The question is resolved: by applying the operator to an arbitrary ket and using the dual correspondence, the original poster derives that the Hermitian conjugate of ##\left|\beta\right>\left<\alpha\right|## is ##\left|\alpha\right>\left<\beta\right|##. Along the way, guidance is offered on interpreting kets and bras themselves as operators.

Contextual Notes

Participants express confusion regarding the terminology and the independence of kets and bras from operators, indicating a need for further clarification on these concepts. The discussion also touches on the challenges posed by infinite-dimensional spaces in quantum mechanics.

loginorsinup

Homework Statement


In Sakurai's Modern Quantum Mechanics, the author says, "... consider an outer product acting on a ket: (1.2.32). Because of the associative axiom, we can regard this equally well as (1.2.33), where ##\left<\alpha|\gamma\right>## is just a number. Thus the outer product acting on a ket is just another ket; in other words, ##\left|\beta\right>\left<\alpha\right|## can be regarded as an operator. Because (1.2.32) and (1.2.33) are equal, we may as well omit the dots and let ##\left|\beta\right>\left<\alpha|\gamma\right>## stand for the operator ##\left|\beta\right>\left<\alpha\right|## acting on ##\left|\gamma\right>## or, equivalently, the number ##\left<\alpha|\gamma\right>## multiplying ##\left|\beta\right>##. (On the other hand, if (1.2.33) is written as ##\left(\left<\alpha|\gamma\right>\right)\cdot\left|\beta\right>##, we cannot afford to omit the dot and brackets because the resulting expression would look illegal.) Notice that the operator ##\left|\beta\right>\left<\alpha\right|## rotates ##\left|\gamma\right>## into the direction of ##\left|\beta\right>##. It is easy to see that if (1.2.34) then (1.2.35), which is left as an exercise."

Homework Equations


$$\left(\left|\beta\right>\left<\alpha\right|\right)\cdot\left|\gamma\right>\qquad (1.2.32)$$
$$\left|\beta\right>\cdot\left<\alpha|\gamma\right>\qquad (1.2.33)$$
$$X=\left|\beta\right>\left<\alpha\right|\qquad (1.2.34)$$
$$X^\dagger=\left|\alpha\right>\left<\beta\right|\qquad (1.2.35)$$

The Attempt at a Solution


I know that the definition of an adjoint involves taking the complex conjugate of the transpose of a complex-valued quantity. I can't just turn all the bras into kets and all the kets into bras, because then I end up with an inner product ##\left<\alpha|\beta\right>##, which isn't right since the outer product is an operator (a matrix). What am I missing? Thanks!

Edit: I have noticed that this may be relevant: the dual correspondence principle.

$$c_{\alpha}\left|\alpha\right>+c_{\beta}\left|\beta\right>\stackrel{\text{DC}}{\leftrightarrow}c_{\alpha}^{*}\left<\alpha\right|+c_{\beta}^{*}\left<\beta\right|$$

If I start with all the ##\beta## coefficients being zero, I should just get that the ##\alpha## ket has a corresponding bra. Can I then "multiply" both sides of the equation with ##\left<\beta\right|##? I guess I can't, because then I end up with the same issue. I would get an inner product.
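As a numerical sanity check of (1.2.35), here is a small experiment (a sketch in numpy; the two-dimensional vectors are made-up examples, not from the text):

```python
import numpy as np

# Made-up kets in a 2-dimensional space, represented as 2x1 columns.
beta = np.array([[1 + 2j], [3 - 1j]])   # |beta>
alpha = np.array([[2 - 1j], [0 + 1j]])  # |alpha>

X = beta @ alpha.conj().T   # outer product |beta><alpha|, a 2x2 matrix
X_dagger = X.conj().T       # adjoint: complex conjugate of the transpose

# Compare with |alpha><beta| built directly.
print(np.allclose(X_dagger, alpha @ beta.conj().T))  # True
```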
 
When you take the hermitian conjugate of two operators, you also change the order of the operators, i.e., ##(AB)^\dagger = B^\dagger A^\dagger##. If you look at it from a matrix representation point of view, this follows from the transpose of a product being the product of the transposes in reverse order.
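For instance (a quick numpy sketch with arbitrary random matrices, just to illustrate the rule):

```python
import numpy as np

# Arbitrary complex 3x3 matrices, seeded for reproducibility.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

lhs = (A @ B).conj().T         # (AB)^dagger
rhs = B.conj().T @ A.conj().T  # B^dagger A^dagger, order reversed
print(np.allclose(lhs, rhs))   # True
```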
 
Orodruin said:
When you take the hermitian conjugate of two operators, you also change the order of the operators, i.e., ##(AB)^\dagger = B^\dagger A^\dagger##. If you look at it from a matrix representation point of view, this follows from the transpose of a product being the product of the transposes in reverse order.
Hello, thanks for your response.

I know that the outer product is an operator, so we could call that A. But what would we call B in this case? Could I just introduce the identity matrix or something? Even if I did do that, I would end up with ##\left(A\mathbb{1}\right)^{\dagger}=\mathbb{1}^{\dagger}A^{\dagger}=A^{\dagger}##, but A is just the outer product. So, I don't know where that leads me. A and B are operators, but the outer product is just one operator. Can a bra or a ket be considered an operator by itself? Sorry, I'm new to this all. Am I missing something again?

So, given that I have one operator, the outer product, and I know the dual correspondence principle

$$c_{\alpha}\left|\alpha\right> + c_{\beta}\left|\beta\right> \stackrel{\text{DC}}{\leftrightarrow} c_{\alpha}^* \left<\alpha\right| + c_{\beta}^* \left<\beta\right|$$
Where can I go from there? I know that I have ##\left|\alpha\right>\left<\beta\right|##, which looks like the left side of the dual correspondence principle with a bra introduced. My confusion, then, comes from... what happens if I have the left side, with ##c_{\beta}=c_{\beta}^*=0## and ##c_{\alpha}=c_{\alpha}^*=1##, placed in front of a bra ##\left<\beta\right|##? What should the right side look like then?
 
Well, a ket vector is an operator (it takes complex numbers to ket vectors when multiplying from the left) and a bra vector is also an operator (it takes ket vectors to complex numbers when multiplying from the left). Thus the operator ##|\beta\rangle \langle \alpha|## takes ket vectors to ket vectors and is a composition of the two operators ##|\beta\rangle## and ##\langle \alpha |##. Think of bra vectors as row matrices and ket vectors as column matrices.

Another way of going about business here is to start from ##|\beta\rangle \langle \alpha|\psi\rangle## for an arbitrary ##|\psi\rangle## and express its corresponding bra vector, remembering that ##\langle \alpha|\psi\rangle## is just a complex number.
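To make the row/column picture concrete, here is a minimal numpy sketch (the two-dimensional vectors are arbitrary examples):

```python
import numpy as np

# A ket is an n x 1 column matrix, a bra is a 1 x n row matrix, so
# |beta><alpha| is an n x n matrix, i.e. an operator on kets.
beta = np.array([[1j], [2]])       # |beta>: 2x1 column
alpha_bra = np.array([[1, -1j]])   # <alpha|: 1x2 row
psi = np.array([[3], [1 + 1j]])    # an arbitrary ket |psi>

X = beta @ alpha_bra               # |beta><alpha|, a 2x2 operator

# Associativity: (|beta><alpha|)|psi> equals the number <alpha|psi> times |beta>.
number = (alpha_bra @ psi).item()  # <alpha|psi>, a complex number
print(np.allclose(X @ psi, number * beta))  # True
```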
 
That's interesting. So basically everything in quantum mechanics involves operators, since even states are operators? I'm a little confused about the terminology. I thought column vectors (kets) and row vectors (bras) were independent entities -- separate from matrices (operators). But, I guess you could say a column vector is simply an ##n\times 1## matrix and a row vector is simply a ##1\times n## matrix. You also noted this, so it seems like this is the case.

Since ##\left<\alpha|\psi\right>## is just a complex-valued constant, I could say that ##\left|\beta\right>\left<\alpha|\psi\right>=c_{\beta}\left|\beta\right>##. Its corresponding bra would be ##c_{\beta}^*\left<\beta\right|=\left(\left<\alpha|\psi\right>\right)^*\left<\beta\right|=\left(\left<\psi|\alpha\right>\right)\left<\beta\right|##.
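Writing that out as a single chain for an arbitrary ##\left|\psi\right>## (this fills in the final step of reading the operator off):

$$X\left|\psi\right>=\left|\beta\right>\left<\alpha|\psi\right>\ \stackrel{\text{DC}}{\leftrightarrow}\ \left<\alpha|\psi\right>^{*}\left<\beta\right|=\left<\psi|\alpha\right>\left<\beta\right|=\left<\psi\right|\left(\left|\alpha\right>\left<\beta\right|\right)$$

Since the bra dual to ##X\left|\psi\right>## is ##\left<\psi\right|X^{\dagger}##, and ##\left|\psi\right>## is arbitrary, this gives ##X^{\dagger}=\left|\alpha\right>\left<\beta\right|##, which is exactly (1.2.35).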

I have recently read that it is best to work with operators by applying them to some kind of trial function, bra, or ket, which is what you suggested with that massive hint that I very much appreciate.

So, since the dual correspondence principle was assumed to be true, and I used it to find the relation that the dual of ##\left|\beta\right>\left<\alpha\right|## is ##\left|\alpha\right>\left<\beta\right|##, I'm done.

Thank you very much for your help. (I looked at your profile. I noticed you're grading exams. So, extra thanks for taking time out of your schedule to lend me a hand. :) )
 
loginorsinup said:
That's interesting. So basically everything in quantum mechanics involves operators, since even states are operators? I'm a little confused about the terminology. I thought column vectors (kets) and row vectors (bras) were independent entities -- separate from matrices (operators). But, I guess you could say a column vector is simply an ##n\times 1## matrix and a row vector is simply a ##1\times n## matrix. You also noted this, so it seems like this is the case.


Well, I suggest starting to think about them like that. It is not necessarily always the case that they can be represented like that (in the case of a finite state space it is). If the state space is infinite-dimensional you get a ##1\times\infty## matrix, etc. It may even be that the state space is not separable.

loginorsinup said:
I looked at your profile. I noticed you're grading exams.
Grading exams is generally not a very uplifting task ...
 
