Pauli matrices forming a basis for 2x2 operators

  Aug 29, 2011 #1
    Hi,

    We know that the Pauli matrices, together with the identity, form a basis for the space of 2x2 matrices: any 2x2 matrix can be written as a linear combination (with complex coefficients) of these four matrices. I know of one proof, where I take

    [itex] a_{0}\sigma_{0}+a_{1}\sigma_{1}+a_{2}\sigma_{2}+a_{3}\sigma_{3}=0[/itex]

    Here, [itex]\sigma_{0}[/itex] is the identity. Setting each matrix entry to zero gives four simultaneous equations in the [itex]a_{i}[/itex], and it is fairly trivial to show that each [itex]a_{i}[/itex] must be zero. This means the four matrices are linearly independent, and since the space of 2x2 matrices is four-dimensional, they therefore form a basis.
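
    For concreteness, with the standard Pauli matrices the linear combination above reads, entrywise,

    [tex]a_{0}\sigma_{0}+a_{1}\sigma_{1}+a_{2}\sigma_{2}+a_{3}\sigma_{3}=\begin{pmatrix}a_{0}+a_{3} & a_{1}-ia_{2}\\ a_{1}+ia_{2} & a_{0}-a_{3}\end{pmatrix}=0,[/tex]

    so the four equations are [itex]a_{0}+a_{3}=a_{0}-a_{3}=0[/itex] and [itex]a_{1}-ia_{2}=a_{1}+ia_{2}=0[/itex], which force every [itex]a_{i}[/itex] to vanish.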

    But I recently discovered a different way to show this. It uses a matrix inner product defined by [itex]\sigma_{i}\cdot\sigma_{j} = \frac{1}{2}\mathrm{Tr}(\sigma_i \sigma_{j}^{\dagger})[/itex]. Here [itex]A^{\dagger}[/itex] is the conjugate transpose, and the 1/2 just removes the factor of 2 that the trace picks up because the matrices are 2x2. The argument goes that this definition gives an inner product equal to 1 if i=j and 0 otherwise, meaning that the four matrices are orthonormal. Since there are four of them, and the orthogonality is interpreted as linear independence, they form a basis.
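
    To spell out where those values come from (again assuming the standard Pauli matrices): each [itex]\sigma_{i}[/itex] is Hermitian, so [itex]\sigma_{j}^{\dagger}=\sigma_{j}[/itex], and for [itex]i,j\in\{1,2,3\}[/itex] the product identity gives

    [tex]\sigma_{i}\sigma_{j}=\delta_{ij}\sigma_{0}+i\sum_{k}\epsilon_{ijk}\sigma_{k},\qquad \frac{1}{2}\mathrm{Tr}(\sigma_{i}\sigma_{j}^{\dagger})=\frac{1}{2}\Big(\delta_{ij}\,\mathrm{Tr}\,\sigma_{0}+i\sum_{k}\epsilon_{ijk}\,\mathrm{Tr}\,\sigma_{k}\Big)=\delta_{ij},[/tex]

    since [itex]\mathrm{Tr}\,\sigma_{0}=2[/itex] and the Pauli matrices are traceless. The remaining cases involving [itex]\sigma_{0}[/itex] reduce to the same two trace facts.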

    My question is: how can I connect these two proofs? The second one seems nice, but I cannot see how it is equivalent, in general, to the first. The definition of the inner product seems arbitrary (it obeys the inner product axioms, yes, but surely there is more to it than that), and I cannot really see why matrices that are orthogonal under this definition are linearly independent in the way the first proof shows. Thank you very much for your help.
     
  Aug 29, 2011 #2

    Fredrik

    Let [itex]\{x_k\}[/itex] be an orthonormal set in a finite-dimensional inner product space V. Suppose that [itex]\sum_k a_k x_k=0[/itex]. Then for all i, [tex]0=\langle x_i,0\rangle=\Big\langle x_i,\sum_k a_k x_k\Big\rangle=\sum_k a_k\langle x_i,x_k\rangle=\sum_k a_k\delta_{ik}=a_i.[/tex] This means that the set is linearly independent.

    It's always easier to use the fact that we're working with an inner product than to use the definition of the specific inner product you've been given.
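
    For what it's worth, in your specific case the same orthonormality also gives you the expansion coefficients of any 2x2 matrix [itex]A[/itex] explicitly:

    [tex]A=\sum_{i=0}^{3}a_{i}\sigma_{i},\qquad a_{i}=\frac{1}{2}\mathrm{Tr}(\sigma_{i}A),[/tex]

    which follows from the trace orthonormality [itex]\tfrac{1}{2}\mathrm{Tr}(\sigma_{i}\sigma_{j})=\delta_{ij}[/itex] (no dagger is needed here because the [itex]\sigma_{i}[/itex] are Hermitian) by the same step-for-step calculation as above.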
     
  Aug 30, 2011 #3
    That's a very nice proof. Thank you Fredrik!
     