Matrix representation of operators in Dirac notation

SUMMARY

The discussion focuses on the representation of quantum operators in matrix form using Dirac notation. It highlights the expression for an operator X as X = ∑∑ |a''><a''|X|a'><a'|, whose matrix elements <a''|X|a'> form a square matrix, with a'' labeling rows and a' labeling columns.

PREREQUISITES

  • Understanding of Dirac notation and quantum mechanics
  • Familiarity with matrix multiplication and linear operators
  • Knowledge of orthonormal systems in Hilbert spaces
  • Basic concepts of eigenvectors and eigenvalues
NEXT STEPS
  • Study the properties of linear operators in quantum mechanics
  • Learn about matrix representations of quantum states and operators
  • Explore the implications of orthonormal bases in Hilbert spaces
  • Investigate the relationship between bra-ket notation and matrix algebra
USEFUL FOR

Quantum physicists, students of quantum mechanics, and anyone interested in the mathematical foundations of quantum theory.

Master J
I'm having trouble seeing how an operator can be written in matrix representation.

In Sakurai, for an operator X, we have:

[tex]X = \sum_{a''} \sum_{a'} |a''\rangle \langle a''| X |a'\rangle \langle a'|,[/tex]

since of course [itex]\sum_{a'} |a'\rangle \langle a'|[/itex] is equal to the identity.

Somehow, this all gets multiplied out and you get a square matrix whose elements are of the form
<a''| X |a'>, where a'' labels the rows and a' the columns.

The factor in the middle of the expression for X at the top, <a''| X |a'>, is just a number, so we can move it through the rest of the expression. That leaves

[itex]\sum_{a''} \sum_{a'}[/itex] |a''> <a'| (an outer product), which will be a square matrix, and then we multiply each element of this matrix by the corresponding number <a''| X |a'>. But when I work out this matrix, it turns out to be diagonal if we are talking about orthogonal eigenvectors (yet Sakurai has elements off the diagonal?).

What's more, how do we interpret the multiplication of two state functions? The bras and kets are nothing more than row and column matrix representations of the state functions, and we are not talking about inner products here (which would be zero of course), just ordinary multiplication, so I don't know how to interpret simply multiplying two state functions together.

Anyway, I can't reproduce what Sakurai has written. Would anyone care to show me how to work out the expression at the very top? Thanks!
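As an editorial aside: the double sum in the question can be checked numerically. The outer product |a''><a'| is a column vector times a row vector, and the bare double sum of outer products is not diagonal; it is the weighting by the numbers <a''|X|a'> that reconstructs X. A minimal NumPy sketch (the 3×3 matrix X and the standard basis are arbitrary illustrative choices, not from Sakurai):

```python
import numpy as np

# An arbitrary 3x3 "operator" written in some orthonormal basis (illustrative only).
X = np.array([[1.0, 2.0, 0.5],
              [2.0, 3.0, 1.0],
              [0.5, 1.0, 4.0]])

n = X.shape[0]
basis = [np.eye(n)[:, i] for i in range(n)]  # standard basis kets |a'>

# The bare double sum of outer products |a''><a'| is NOT diagonal:
# for the standard basis it is the all-ones matrix.
S = sum(np.outer(basis[i], basis[j]) for i in range(n) for j in range(n))
assert np.allclose(S, np.ones((n, n)))

# Weighting each outer product by the number <a''|X|a'> reconstructs X:
#   X = sum_{a'' a'} |a''> <a''|X|a'> <a'|
X_rebuilt = sum((basis[i] @ X @ basis[j]) * np.outer(basis[i], basis[j])
                for i in range(n) for j in range(n))
assert np.allclose(X_rebuilt, X)
```

Each number <a''|X|a'> lands in row a'', column a', matching the convention that a'' labels rows and a' labels columns.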
 
You have it all there! Since the [itex]|a \rangle[/itex] are a complete orthonormal system, you get
[tex]X_{aa'}=\langle a |\hat{X} a' \rangle.[/tex]
Applied to an arbitrary ket you find
[tex]\hat{X} |\psi \rangle=\sum_{aa'} |a \rangle \langle a|\hat{X} a' \rangle \langle a'|\psi \rangle =\sum_{a a'} |a \rangle X_{a a'} \psi_{a'}.[/tex]
Thus you get in the representation wrt. this orthonormal system
[tex](\hat{X} \psi)_{a}=\sum_{a'} X_{aa'} \psi_{a'},[/tex]
which is the usual rule for multiplying a square matrix by a column vector in a complex vector space. The only difference is that the Hilbert space of quantum mechanics is usually infinite-dimensional.
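The component rule above is easy to spot-check numerically. A short NumPy sketch (the matrix and vector are random illustrative data, not tied to any particular system):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
X = rng.standard_normal((n, n))   # matrix elements X_{aa'} = <a|X|a'>
psi = rng.standard_normal(n)      # components psi_{a'} = <a'|psi>

# (X psi)_a = sum_{a'} X_{aa'} psi_{a'}  -- written out explicitly...
Xpsi_explicit = np.array([sum(X[a, ap] * psi[ap] for ap in range(n))
                          for a in range(n)])

# ...agrees with the ordinary matrix-vector product.
assert np.allclose(Xpsi_explicit, X @ psi)
```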
 
Consider the equality |u>=X|v> where |u> and |v> are arbitrary kets and X is an arbitrary linear operator. You know that this equality implies that ##\langle a'|u\rangle=\sum_{a''}\langle a'|X|a''\rangle\langle a''|v\rangle##. If you just compare this to the definition of matrix multiplication, ##(AB)_{ij}=\sum_k A_{ik}B_{kj}##, you will see that it can be interpreted as a component of a matrix equality. This is the reason why the correspondence between matrices and linear operators is given by the first equality in Vanhees71's post.

Actually, it's easier to understand this if we don't use bra-ket notation, so I suggest that you also read this old post. (Ignore the quote and the stuff below it).
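The comparison to ##(AB)_{ij}=\sum_k A_{ik}B_{kj}## can also be made concrete: treat the components <a''|v> as an n×1 matrix B, and the component equation for |u>=X|v> is exactly matrix multiplication with j fixed to the single column. A small sketch with arbitrary numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))   # A_{ik} plays the role of <a'|X|a''>
B = rng.standard_normal((n, 1))   # B_{k0} plays the role of <a''|v> (one column)

# (AB)_{i0} = sum_k A_{ik} B_{k0}, i.e. <a'|u> = sum_{a''} <a'|X|a''><a''|v>
u = np.array([[sum(A[i, k] * B[k, 0] for k in range(n))] for i in range(n)])

# The explicit sum is just the matrix product A @ B.
assert np.allclose(u, A @ B)
```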
 
