What is the General Solution for Finding Orthonormal Bases in C^2?

  • Context: Graduate
  • Thread starter: TimH
  • Tags: Basis

Discussion Overview

The discussion revolves around finding orthonormal bases in the two-dimensional complex vector space C^2, particularly in the context of quantum mechanics and bra-ket notation. Participants explore various combinations of vectors, methods for constructing orthonormal bases, and the implications of these bases in relation to operators and Hilbert spaces.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant initially struggles to find an orthonormal basis for C^2, specifically questioning the validity of the vectors (1,0) and (0,i).
  • Another participant asserts that (1,0) and (0,1) indeed form an orthonormal basis, suggesting a misunderstanding of definitions by the original poster.
  • A later reply clarifies that the ket-bra notation represents a projection operator and discusses the implications of using column and row vector representations in bra-ket notation.
  • The original poster later realizes that their calculator's handling of complex vectors led to confusion, as they successfully verified the orthonormality by hand.
  • One participant suggests a general method for constructing orthogonal bases using any two complex numbers and provides a formula for normalization.
  • Another participant proposes using orthonormalization algorithms or inner-product-preserving transformations to create orthonormal bases.
  • A participant introduces a general solution for the two-state problem, providing a specific form of orthonormal vectors dependent on parameters θ and φ, while acknowledging their ongoing learning about Hilbert spaces.

Areas of Agreement / Disagreement

Participants express differing views on the initial validity of certain vector combinations as orthonormal bases. While some participants assert that specific pairs work, others highlight the need for clarification and exploration of additional methods. The discussion remains unresolved regarding the best approach to finding orthonormal bases without zero coefficients.

Contextual Notes

Some participants reference the completeness relation and the properties of complex matrices, indicating potential limitations in understanding how these concepts apply to the construction of orthonormal bases. There is also mention of the need for further exploration of definitions and methods in the context of Hilbert spaces.

TimH
I'm teaching myself quantum mechanics and am learning about bra-ket notation. There is a particular operator used, the ket-bra (e.g. |X><X|). To understand it, I'm trying to come up with an orthonormal basis for C^2 as a simple case (i.e., the 2-dimensional vector space over the field of complex numbers). That is, I want two vectors, each with two components, each component a complex number, that span C^2 and are orthonormal. I've tried some combinations like (1,0) and (0,i), but no luck. Right now my TI-89 is chugging away looking at a few thousand possible vector combinations, but there has to be a better way.

Can anybody suggest how I would find two such vectors? Thanks.
 
Hi TimH,

I'm a bit confused by your question. It seems to me that you've already found such a basis. Why do you think (1,0) and (0,i) don't work?
 
(1,0) and (0,1) work too: (z,w) = z(1,0) + w(0,1). (You must have misunderstood some definition.)

See this post for more about bra-ket notation.

If |X\rangle is a member of \mathbb C^2, then |X\rangle\langle X| is a linear operator on \mathbb C^2. To be more precise, when |X\rangle is normalized, it's the projection operator onto the one-dimensional subspace spanned by |X\rangle. If you write the vectors as |V\rangle=\begin{pmatrix}V_1\\ V_2\end{pmatrix}, then the bra is the conjugate transpose of the ket, so |X\rangle\langle X|=\begin{pmatrix}X_1\\ X_2\end{pmatrix}\begin{pmatrix}X_1^* & X_2^*\end{pmatrix}.

But I'm not a big fan of the "kets are column vectors, bras are row vectors" approach to bra-ket notation. It will give you the right intuition about bras and kets, but it doesn't explain why the notation still works when the vector space is infinite-dimensional. (See the post I linked to instead).
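The projection operators and the completeness relation discussed above can be checked numerically. The following sketch (using NumPy for the complex linear algebra) verifies that the pair (1,0) and (0,i) from the thread is orthonormal and that its ket-bras sum to the identity; note that the row in each outer product must be conjugated:

```python
import numpy as np

# Candidate orthonormal basis for C^2 (the pair from the thread).
e1 = np.array([1, 0], dtype=complex)
e2 = np.array([0, 1j], dtype=complex)

# |X><X| is the outer product of the column |X> with the conjugated row <X|.
P1 = np.outer(e1, e1.conj())
P2 = np.outer(e2, e2.conj())

# Orthonormality: np.vdot conjugates its first argument, so it computes <x|y>.
print(np.vdot(e1, e2))                    # 0j
print(np.vdot(e1, e1), np.vdot(e2, e2))   # (1+0j) (1+0j)

# Completeness relation: the two projectors sum to the 2x2 identity.
print(np.allclose(P1 + P2, np.eye(2)))    # True
```

Forgetting the conjugation on the row vector is exactly the kind of error that makes the completeness check fail for complex entries while still passing for real ones.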
 
Okay, I figured out what happened. Thank you for your posts. I first tried (1,0) and (0,i) on my TI-89 calculator, using the "completeness relation," i.e. checking that the ket-bras |x><x| of the two vectors, summed, give the identity matrix. It didn't work on the calculator, which is why I didn't think this pair worked. Then I did it by hand and it worked. Then I poked around on the calculator and discovered that when you take the transpose of a complex matrix or vector, it gives you the adjoint (conjugate transpose), which broke my formula. A feature, I guess...

So thank you for persisting and getting me to do it by hand...It would still be nice to find a set of orthonormal vectors in C^2 which don't have any zero-coefficients, i.e. where each component is a full-blown complex number with real and imaginary parts. Are there any well-known examples? I couldn't find anything online.

This is part of my effort to understand the machinery of Hilbert space, even if the space itself isn't visualizable. Thanks.
 
Just pick any two complex numbers a & b.

You can construct an orthogonal basis that consists of two vectors, (a,b) and (b*,-a*).

You can then normalize it by dividing each vector by its common length, sqrt(|a|^2 + |b|^2).
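This construction is easy to verify numerically. A minimal sketch, with arbitrary illustrative values for a and b (note that every component is a genuinely complex number, which addresses the earlier request for a basis without zero coefficients):

```python
import numpy as np

# Arbitrary illustrative choices -- any a, b (not both zero) work.
a, b = 1 + 2j, 3 - 1j

v1 = np.array([a, b])
v2 = np.array([b.conjugate(), -a.conjugate()])

# <v1|v2> = a* b* + b* (-a*) = 0, so the pair is orthogonal.
print(np.vdot(v1, v2))    # 0j

# Both vectors have the same length sqrt(|a|^2 + |b|^2); rescale to normalize.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)
```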
 
Thanks Hamster. I figured there was some general form. I'll play around with that.
 
If you're having trouble creating an orthonormal basis, then why not:
  • Use an orthonormalization algorithm? (e.g. Gram-Schmidt)
  • Use an inner-product-preserving transformation to alter a known orthonormal basis?
  • Write down -- and solve -- a system of equations that expresses exactly what you want?
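The first option above can be sketched in a few lines. This is classical Gram-Schmidt over C^n, using the complex inner product (conjugate-linear in the first slot); the starting vectors are arbitrary illustrative choices:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt over C^n using the complex inner product."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=complex)
        for u in basis:
            # Subtract the component of v along each earlier basis vector.
            w = w - np.vdot(u, w) * u
        norm = np.linalg.norm(w)
        if norm > 1e-12:            # drop linearly dependent inputs
            basis.append(w / norm)
    return basis

# Start from two independent vectors whose entries are all nonzero complex numbers.
u1, u2 = gram_schmidt([np.array([1 + 1j, 2 - 1j]),
                       np.array([2 + 1j, 1 + 3j])])
```

For better numerical stability on larger problems one would use modified Gram-Schmidt or a QR decomposition, but for C^2 the classical version is fine.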
 
TimH said:
Thanks Hamster. I figured there was some general form. I'll play around with that.

Well, I should have thought of this before, but the general solution to the two-state problem is:

\left|\alpha\right\rangle=\begin{pmatrix}\cos\theta \\ \sin\theta\, e^{i\phi}\end{pmatrix},\qquad \left|\beta\right\rangle=\begin{pmatrix}-\sin\theta \\ \cos\theta\, e^{i\phi}\end{pmatrix}

So \left|\alpha\right\rangle and \left|\beta\right\rangle are orthonormal for all choices of \theta and \phi.

As I understand it, this works because in the 2-D case, all orthogonal bases can be obtained by rotation of some initial set of diagonal eigenvectors (the ones Fredrik gave in post #3) in the (complex) 2-D Hilbert space. (This explanation may not be strictly correct ... I am still learning the ins and outs of Hilbert spaces).
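The parameterized pair above can be checked for arbitrary θ and φ. A quick numerical sketch, with arbitrary parameter values:

```python
import numpy as np

theta, phi = 0.7, 1.3    # arbitrary parameter values

alpha = np.array([np.cos(theta), np.sin(theta) * np.exp(1j * phi)])
beta = np.array([-np.sin(theta), np.cos(theta) * np.exp(1j * phi)])

# <alpha|beta> = -cos(t)sin(t) + sin(t)e^{-i phi} cos(t)e^{i phi} = 0
print(np.vdot(alpha, beta))                            # ~0
print(np.vdot(alpha, alpha), np.vdot(beta, beta))      # both ~1
```

The phase factors e^{±iφ} cancel in the inner product, which is why orthonormality holds for every choice of the parameters.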
 
