Hermitian conjugate of outer product

In summary, Sakurai's Modern Quantum Mechanics discusses an outer product acting on a ket: the operator \left|\beta\right>\left<\alpha\right| acting on \left|\gamma\right> can equally well be regarded as the number \left<\alpha|\gamma\right> multiplying \left|\beta\right>. Using the dual correspondence principle, which lets one express a bra vector in terms of its corresponding ket vector, it follows that the Hermitian conjugate of X=\left|\beta\right>\left<\alpha\right| is X^\dagger=\left|\alpha\right>\left<\beta\right|.
  • #1
loginorsinup

Homework Statement


In Sakurai's Modern Quantum Mechanics, the author says, "... consider an outer product acting on a ket: (1.2.32). Because of the associative axiom, we can regard this equally well as (1.2.33), where [itex]\left<\alpha|\gamma\right>[/itex] is just a number. Thus the outer product acting on a ket is just another ket; in other words, [itex]\left|\beta\right>\left<\alpha\right|[/itex] can be regarded as an operator. Because (1.2.32) and (1.2.33) are equal, we may as well omit the dots and let [itex]\left|\beta\right>\left<\alpha|\gamma\right>[/itex] stand for the operator [itex]\left|\beta\right>\left<\alpha\right|[/itex] acting on [itex]\left|\gamma\right>[/itex] or, equivalently, the number [itex]\left<\alpha|\gamma\right>[/itex] multiplying [itex]\left|\beta\right>[/itex]. (On the other hand, if (1.2.33) is written as [itex]\left(\left<\alpha|\gamma\right>\right)\cdot\left|\beta\right>[/itex], we cannot afford to omit the dot and brackets because the resulting expression would look illegal.) Notice that the operator [itex]\left|\beta\right>\left<\alpha\right|[/itex] rotates [itex]\left|\gamma\right>[/itex] into the direction [itex]\left|\beta\right>[/itex]. It is easy to see that if (1.2.34) then (1.2.35), which is left as an exercise."

Homework Equations


[tex]\left(\left|\beta\right>\left<\alpha\right|\right)\cdot\left|\gamma\right>\qquad (1.2.32)[/tex]
[tex]\left|\beta\right>\cdot\left<\alpha|\gamma\right>\qquad (1.2.33)[/tex]
[tex]X=\left|\beta\right>\left<\alpha\right|\qquad (1.2.34)[/tex]
[tex]X^\dagger=\left|\alpha\right>\left<\beta\right|\qquad (1.2.35)[/tex]

The Attempt at a Solution


I know that the definition of an adjoint involves taking the complex conjugate of the transpose of a complex-valued quantity. I can't just turn all the bras into kets and all the kets into bras, because then I end up with an inner product [itex]\left<\alpha|\beta\right>[/itex], which isn't right since the outer product is an operator (a matrix). What am I missing? Thanks!

Edit: I have noticed that the dual correspondence principle may be relevant here:

[tex]c_{\alpha}\left|\alpha\right>+c_{\beta}\left|\beta\right>\stackrel{\text{DC}}{\leftrightarrow}c_{\alpha}^{*}\left<\alpha\right|+c_{\beta}^{*}\left<\beta\right|[/tex]

If I start with all the [itex]\beta[/itex] coefficients being zero, I should just get that the [itex]\alpha[/itex] ket has a corresponding bra. Can I then "multiply" both sides of the equation with [itex]\left<\beta\right|[/itex]? I guess I can't because then I end up with the same issue. I would get an inner product.
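
Here is a small numerical sketch of the dual correspondence rule above, representing kets as numpy column vectors and bras as their conjugate transposes (the vectors and coefficients are arbitrary illustrations, not anything from the book):

[code]
# Numerical check of the dual correspondence (DC) rule:
# c_a|alpha> + c_b|beta>  <->  c_a* <alpha| + c_b* <beta|
import numpy as np

rng = np.random.default_rng(0)
alpha = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))  # ket |alpha>
beta = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))   # ket |beta>
c_a, c_b = 2.0 + 1.0j, -0.5 + 3.0j

ket = c_a * alpha + c_b * beta                                            # left side
bra = c_a.conjugate() * alpha.conj().T + c_b.conjugate() * beta.conj().T  # right side

# The bra dual to a ket is its conjugate transpose, so the two should agree.
assert np.allclose(bra, ket.conj().T)
[/code]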
 
  • #2
When you take the hermitian conjugate of two operators, you also change the order of the operators, i.e., ##(AB)^\dagger = B^\dagger A^\dagger##. If you look at it from a matrix representation point of view, this follows from the transpose of a product being the product of the transposes in reverse order.
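
A quick numerical sanity check of this rule, using arbitrary complex matrices (purely illustrative):

[code]
# Verify the product rule (AB)^dagger = B^dagger A^dagger for complex matrices.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

dagger = lambda M: M.conj().T  # Hermitian conjugate = conjugate transpose

assert np.allclose(dagger(A @ B), dagger(B) @ dagger(A))
[/code]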
 
  • #3
Orodruin said:
When you take the hermitian conjugate of two operators, you also change the order of the operators, i.e., ##(AB)^\dagger = B^\dagger A^\dagger##. If you look at it from a matrix representation point of view, this follows from the transpose of a product being the product of the transposes in reverse order.
Hello, thanks for your response.

I know that the outer product is an operator, so we could call that [itex]A[/itex]. But what would we call [itex]B[/itex] in this case? Could I just introduce the identity matrix or something? Even if I did do that, I would end up with [itex]\left(A\mathbb{1}\right)^{\dagger}=\mathbb{1}^{\dagger}A^{\dagger}=A^{\dagger}[/itex], but [itex]A[/itex] is just the outer product, so I don't know where that leads me. [itex]A[/itex] and [itex]B[/itex] are supposed to be operators, but the outer product is just one operator. Can a bra or a ket be considered an operator by itself? Sorry, I'm new to all this. Am I missing something again?

So, given that I have one operator, the outer product, and I know the dual correspondence principle

[tex]c_{\alpha}\left|\alpha\right> + c_{\beta}\left|\beta\right> \stackrel{\text{DC}}{\leftrightarrow} c_{\alpha}^* \left<\alpha\right| + c_{\beta}^* \left<\beta\right|[/tex]
Where can I go from there? I know that I have [itex]\left|\alpha\right>\left<\beta\right|[/itex] which looks like the left side of the dual correspondence principle with a bra introduced. My confusion, then, comes from... what happens if I have the left side, with [itex]c_{\beta}=c_{\beta}^*=0[/itex] and [itex]c_{\alpha}=c_{\alpha}^*=1[/itex] placed in front of a bra [itex]\left<\beta\right|[/itex]? What should the right side look like then?
 
  • #4
Well, a ket vector is an operator (it takes complex numbers to ket vectors when it multiplies them from the left), and bra vectors are also operators (they take ket vectors to complex numbers when multiplying from the left). Thus the operator ##|\beta\rangle \langle \alpha|## takes ket vectors to ket vectors and is a composition of the two operators ##|\beta\rangle## and ##\langle \alpha |##. Think of bra vectors as row matrices and ket vectors as column matrices.

Another way of going about business here is to start from ##|\beta\rangle \langle \alpha|\psi\rangle## for an arbitrary ##|\psi\rangle## and express its corresponding bra vector, remembering that ##\langle \alpha|\psi\rangle## is just a complex number.
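
A small sketch of this row-matrix/column-matrix picture, assuming a three-dimensional state space and arbitrary vectors (the names alpha, beta, psi are just illustrations):

[code]
# |beta><alpha| applied to |psi> gives the number <alpha|psi> times |beta>.
import numpy as np

rng = np.random.default_rng(2)
alpha = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))  # ket, 3x1
beta = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))   # ket, 3x1
psi = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))    # ket, 3x1

bra_alpha = alpha.conj().T          # bra <alpha|, a 1x3 row matrix
X = beta @ bra_alpha                # outer product |beta><alpha|, a 3x3 matrix

inner = (bra_alpha @ psi).item()    # the complex number <alpha|psi>
assert np.allclose(X @ psi, inner * beta)

# The Hermitian conjugate of the outer product swaps the two vectors:
assert np.allclose(X.conj().T, alpha @ beta.conj().T)   # |alpha><beta|
[/code]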
 
  • #5
That's interesting. So basically everything in quantum mechanics involves operators, since even states are operators? I'm a little confused about the terminology. I thought column vectors (kets) and row vectors (bras) were independent entities -- separate from matrices (operators). But I guess you could say a column vector is simply an [itex]n\times 1[/itex] matrix and a row vector is simply a [itex]1\times n[/itex] matrix. You also noted this, so it seems like this is the case.

Since [itex]\left<\alpha|\psi\right>[/itex] is just a complex-valued constant, I could say that [itex]\left|\beta\right>\left<\alpha|\psi\right>=c_{\beta}\left|\beta\right>[/itex]. Its corresponding bra would be [itex]c_{\beta}^*\left<\beta\right|=\left(\left<\alpha|\psi\right>\right)^*\left<\beta\right|=\left<\psi|\alpha\right>\left<\beta\right|[/itex], which is just [itex]\left<\psi\right|[/itex] acting on the operator [itex]\left|\alpha\right>\left<\beta\right|[/itex].

I have recently read that it is best to work with operators by applying them to some kind of trial function, bra, or ket, which is what you suggested with that massive hint that I very much appreciate.

So, since the dual correspondence principle was assumed to be true and I used it to show that the Hermitian conjugate of [itex]\left|\beta\right>\left<\alpha\right|[/itex] is [itex]\left|\alpha\right>\left<\beta\right|[/itex], I'm done.
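
Restating the whole chain in one line (since [itex]\left|\psi\right>[/itex] is arbitrary, the last step identifies the operator):

[tex]\left|\beta\right>\left<\alpha|\psi\right>\ \stackrel{\text{DC}}{\leftrightarrow}\ \left<\alpha|\psi\right>^{*}\left<\beta\right| = \left<\psi|\alpha\right>\left<\beta\right| = \left<\psi\right|\left(\left|\alpha\right>\left<\beta\right|\right) \quad\Rightarrow\quad \left(\left|\beta\right>\left<\alpha\right|\right)^{\dagger}=\left|\alpha\right>\left<\beta\right|[/tex]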

Thank you very much for your help. (I looked at your profile. I noticed you're grading exams. So, extra thanks for taking time out of your schedule to lend me a hand. :) )
 
  • #6
loginorsinup said:
That's interesting. So basically everything in quantum mechanics involves operators, since even states are operators? I'm a little confused about the terminology. I thought column vectors (kets) and row vectors (bras) were independent entities -- separate from matrices (operators). But I guess you could say a column vector is simply an ##n\times 1## matrix and a row vector is simply a ##1\times n## matrix. You also noted this, so it seems like this is the case.

Well, I suggest starting to think about them like that. It is not necessarily always the case that they can be represented like that (in the case of a finite-dimensional state space it is). If the state space is infinite-dimensional, you get a ##1\times\infty## matrix, etc. It may even be that the state space is not separable.

loginorsinup said:
I looked at your profile. I noticed you're grading exams.
Grading exams is generally not a very uplifting task ...
 

FAQ: Hermitian conjugate of outer product

What is the Hermitian conjugate of an outer product?

The Hermitian conjugate of an outer product is the conjugate transpose of the outer product. It is obtained by taking the complex conjugate of each element in the outer product and then transposing the resulting matrix. For the outer product |β><α| the result is |α><β|.

What is the difference between an outer product and an inner product?

An outer product of two vectors is a matrix (an operator), while an inner product of two vectors is a single complex number (a scalar). In both cases the order of the factors matters: swapping the vectors in an outer product gives its Hermitian conjugate, |β><α| versus |α><β|, while swapping them in an inner product gives the complex conjugate of the original number.

How is the Hermitian conjugate of an outer product related to the inner product?

The Hermitian conjugate of an outer product is closely related to the inner product. The trace of the outer product |β><α| is the inner product <α|β>, and the trace of its Hermitian conjugate |α><β| is <β|α>, the complex conjugate of that number.
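
A quick numerical illustration of this trace relation, using arbitrary complex vectors:

[code]
# Check that Tr(|beta><alpha|) = <alpha|beta>.
import numpy as np

rng = np.random.default_rng(3)
alpha = rng.normal(size=4) + 1j * rng.normal(size=4)
beta = rng.normal(size=4) + 1j * rng.normal(size=4)

outer = np.outer(beta, alpha.conj())   # matrix |beta><alpha|
inner = np.vdot(alpha, beta)           # <alpha|beta> (vdot conjugates its first argument)

assert np.isclose(np.trace(outer), inner)
[/code]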

Why is the Hermitian conjugate important in quantum mechanics?

The Hermitian conjugate is important in quantum mechanics because it combines complex conjugation and transposition, both of which are essential operations in quantum mechanics: observables are represented by Hermitian operators, which equal their own Hermitian conjugate, and the conjugate appears throughout the calculation of probabilities and expectation values in quantum systems.

What are some properties of the Hermitian conjugate of an outer product?

Some properties of the Hermitian conjugate include: (1) it is antilinear (constant factors come out complex conjugated), (2) it is involutive (applying it twice gives back the original operator), (3) it reverses products, (AB)† = B†A†, and (4) for an outer product it swaps the two vectors, (|β><α|)† = |α><β|. In particular, the outer product of a vector with itself, |α><α|, is Hermitian (equal to its own Hermitian conjugate) and, for a normalized vector, is a projection operator.
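
These properties can be checked numerically with arbitrary complex vectors and matrices (an illustrative sketch, not a derivation):

[code]
import numpy as np

rng = np.random.default_rng(4)
alpha = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))  # ket |alpha>
beta = rng.normal(size=(3, 1)) + 1j * rng.normal(size=(3, 1))   # ket |beta>
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))      # generic operator
c = 1.5 - 0.7j

dagger = lambda M: M.conj().T
X = beta @ alpha.conj().T                                  # outer product |beta><alpha|

assert np.allclose(dagger(X), alpha @ beta.conj().T)       # (|b><a|)^dagger = |a><b|
assert np.allclose(dagger(dagger(X)), X)                   # involution
assert np.allclose(dagger(c * X), np.conj(c) * dagger(X))  # antilinearity
assert np.allclose(dagger(A @ X), dagger(X) @ dagger(A))   # product reversal

P = alpha @ alpha.conj().T / (alpha.conj().T @ alpha)      # normalized |a><a|
assert np.allclose(dagger(P), P)                           # Hermitian
assert np.allclose(P @ P, P)                               # projection operator
[/code]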
