
#1
Apr20-05, 03:33 PM

P: 460

Is it correct to assume that there is no such thing as a nonorthogonal basis? The orthogonal eigenbasis is the "easiest" to work with, but in general, to be a basis a set of vectors has to be linearly independent and span the space, and being "linearly independent" means orthogonal.
Is that correct? Thanks.



#2
Apr20-05, 03:36 PM

Emeritus
Sci Advisor
PF Gold
P: 16,101

The word "orthogonal" is meaningless until you define an inner product on your vector space. There's no reason basis vectors should be orthogonal.




#3
Apr20-05, 03:50 PM

P: 166

To expand on what Hyrkyl said:
Take the space [tex]C^2[/tex]. The most common basis for this space consists of the vectors [tex]|0\rangle =\left(\begin{array}{c}1\\0\end{array}\right)[/tex] and [tex]|1\rangle =\left(\begin{array}{c}0\\1\end{array}\right)[/tex], so we can reach any point in [tex]C^2[/tex] with the linear combination [tex]|anywhere\rangle = \alpha |0\rangle +\beta |1\rangle[/tex].

Now, as a counterexample to your claim that a basis set has to be orthogonal, let me define the vectors [tex]|g\rangle , |h\rangle[/tex] as [tex]|g\rangle =\left(\begin{array}{c}1\\1\end{array}\right)[/tex] and [tex]|h\rangle =\left(\begin{array}{c}1\\0\end{array}\right)[/tex]. I can still reach any point in [tex]C^2[/tex] using these vectors, but they are not orthogonal. In essence, you can consider it 'wasting' information to use a nonorthogonal basis set: [tex]\left(\begin{array}{c}x\\y\end{array}\right) = a \left(\begin{array}{c}1\\1\end{array}\right) + b \left(\begin{array}{c}1\\0\end{array}\right)[/tex] gives [tex]x=a+b[/tex] and [tex]y=a[/tex]. Since g and h aren't orthogonal, the expression for x also contains information about the position y.

So, basis vectors don't have to be orthogonal, but they are usually chosen to be. This matters when you start working out solutions of linear equations, or doing things like quantum mechanics.



#4
Apr20-05, 03:59 PM

Emeritus
Sci Advisor
PF Gold
P: 16,101

And for the sake of being explicit, James did define an inner product on his vector space: it comes as a "standard" part of the definition of C^{2}.




#5
Apr20-05, 05:22 PM

P: 460

So, I cannot make the jump from linearly independent to orthogonal, but if vectors are orthogonal, they must be linearly independent. Right?




#6
Apr20-05, 05:29 PM

Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,885

Yes, in an "inner product space", if a set of vectors are all orthogonal then it must be linearly independent:

Suppose {v_{1}, v_{2}, . . . , v_{n}} are orthogonal vectors and C_{1}v_{1}+ C_{2}v_{2}+ . . . + C_{n}v_{n}= 0. For each i between 1 and n, take the inner product of each side with v_{i}. You obviously get 0 on the right; what do you get on the left?

Your original statement "there is no such thing as a nonorthogonal basis" is "sort of" right because "orthogonal" depends on your choice of inner product: given any basis there exists an inner product such that the basis is orthogonal with respect to that inner product. You get it like this: given the basis {v_{1}, v_{2}, . . . , v_{n}}, define the inner product <u, v> of vectors u and v as follows. Write u and v in terms of the basis: u= A_{1}v_{1}+ A_{2}v_{2}+ . . . + A_{n}v_{n} and v= B_{1}v_{1}+ B_{2}v_{2}+ . . . + B_{n}v_{n}, and define <u, v>= A_{1}B_{1}+ A_{2}B_{2}+ . . . + A_{n}B_{n}. With that inner product, the basis is orthonormal.




#8
Apr20-05, 06:37 PM

Sci Advisor
HW Helper
P: 11,863

Exclude the null vector and assume all vectors have finite norm.
Daniel. 



#9
Apr20-05, 09:26 PM

Sci Advisor
HW Helper
P: 9,422

dextercioby is giving a counterexample to Halls's statement that a set of orthogonal vectors is independent, since the zero vector could be in the set.
so neither property, orthogonal or independent, implies the other. but for nonzero vectors, orthogonal does imply independent. i.e. two nonzero vectors are independent if they are not parallel, but just saying they are not parallel does not mean they are perpendicular. on the other hand, for nonzero vectors, being perpendicular does imply they are not parallel.



#10
Apr21-05, 08:43 AM

P: 460

Basically I wrote out the linear independence equation. Let B = {P1, P2, P3} be an orthogonal basis for R^3 with all vectors nonzero, which means that P1 · P2 = 0, P2 · P3 = 0, and P1 · P3 = 0. Then the equation looks like this: aP1 + bP2 + cP3 = 0. Next, I took the dot product of both sides of the equation with P1: aP1 · P1 + bP1 · P2 + ... = P1 · 0. So it turns out that a||P1||^2 = 0, hence a = 0, since P1 is nonzero, and so on for the other cases.

I see mathwonk's point, but if the basis is an eigenbasis then the zero vector is excluded. Thanks to all of you! I have a better idea now.
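The argument above can be checked numerically. Here is a small Python sketch (the particular orthogonal set in R^3 is my own illustrative choice): dotting a*P1 + b*P2 + c*P3 with P1 kills the cross terms and leaves a times ||P1||^2, which forces a = 0 when the combination is the zero vector.

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# An orthogonal set of nonzero vectors in R^3, chosen for illustration.
P1, P2, P3 = (1, 1, 0), (1, -1, 0), (0, 0, 2)
assert dot(P1, P2) == dot(P2, P3) == dot(P1, P3) == 0

# For any combination aP1 + bP2 + cP3, dotting with P1
# eliminates the P2 and P3 terms:
a, b, c = 0.5, -2.0, 3.0
combo = tuple(a*x + b*y + c*z for x, y, z in zip(P1, P2, P3))
# <combo, P1> reduces to a * ||P1||^2:
print(dot(combo, P1) == a * dot(P1, P1))  # True
```

So if combo were the zero vector, the left side would be 0, and a = 0 follows since ||P1||^2 > 0.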



#11
Apr21-05, 12:44 PM

Sci Advisor
HW Helper
P: 9,422

Although you are problem solving at the moment, hence more flexibility is allowed, when writing it up, instead of "orthogonal basis" you should perhaps say "orthogonal eigenbasis".

I notice that physicists are fond of introducing hypotheses after the fact to get themselves out of trouble, but mathematicians require them to be stated "up front". (I could never solve physics problems partly for this reason, and was somewhat miffed as a college student to note that the "solution" seemed always to include an additional hypothesis that the solver stated was "obviously true" but which he had not mentioned in stating the problem. This also occurred in trying to read relativity books later on. The writer would state that he was going to deduce some property from some other, different one. I was unable to do so, then read the solution, which began as follows: "since we know space is homogeneous"... (which had not been assumed at all). And I seem to recall I was reading the best physicists, such as Pauli, Einstein, Planck...)

Indeed, as bombadillo mentioned in his analysis of mathematicians' thinking, I allow myself this license when brainstorming, but not when writing up proofs. Since a proof is an attempt to communicate with others, it should leave no essential point in doubt.



#12
Apr21-05, 02:26 PM

P: 460

You're right, I sort of redefined the problem. But the problem does not say that the basis is an eigenbasis; all it says is that the set is orthogonal.
So in an orthogonal set, zero may be included? It's something you mentioned earlier. Doesn't that make it linearly dependent?



#13
Apr21-05, 06:07 PM

P: 166

Mathwonk: I have all sorts of fun along the lines you're mentioning. I am a physicist (final year of my degree in the UK) and as such make assumptions of the kind you mention. However, I'm always careful to prove any such assumptions to myself: although I take them as true, I find it helps understanding on a deeper level than the problem at hand if you fully understand the framework supporting it.

With that in mind, I'm finding it quite interesting taking a 4th year module in Quantum Computing and Quantum Information Theory that is taught by the Mathematics department (we can take this 4th year Maths module in our 3rd year of Physics), as everything is defined very formally. That is different from Physics, where there is a certain element of what seems to be hand waving but is actually saving time by telling you certain things are true. If you want to go and prove these things, then that's fine! A case in point is a post on this subforum on orthogonal basis sets: I gave a counterexample that a physicist would quite happily take, but Hyrkyl added a certain fundamental property of what I was talking about (i.e. the space [itex]C^2[/itex] having an inner product) to 'formalise' things. Bloody mathematicians :)

Edit: Please ignore certain grammatical inconsistencies in the post above, as I'm slightly less than sober right now...



#14
Apr21-05, 09:24 PM

Sci Advisor
HW Helper
P: 9,422

{(0,0), (1,0)} is an example of a dependent, but mutually orthogonal set.
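A one-line numerical check of this example (a Python sketch, not from the post): the pair {(0,0), (1,0)} is mutually orthogonal, yet c·(0,0) + 0·(1,0) = 0 for any nonzero c, so the set is linearly dependent.

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

z, e1 = (0, 0), (1, 0)
print(dot(z, e1))   # 0: the pair is (vacuously) orthogonal

# Yet a nontrivial combination gives zero: 5*z + 0*e1 = (0, 0),
# so the set is linearly dependent.
c = 5
combo = tuple(c * zi + 0 * ei for zi, ei in zip(z, e1))
print(combo)        # (0, 0)
```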







#18
Apr22-05, 09:44 AM

Sci Advisor
HW Helper
P: 9,398

No, you wouldn't; that would be automatically true. But that may be semantics.

