orthogonal basis


by EvLer
Tags: basis, orthogonal
EvLer
#1
Apr20-05, 03:33 PM
P: 460
Is it correct to assume that there is no such thing as a non-orthogonal basis? The orthogonal eigenbasis is the "easiest" to work with, but in general, to be a basis a set of vectors has to be linearly independent and span the space, and being "linearly independent" means being orthogonal.
Is it correct?
Thanks.
Hurkyl
#2
Apr20-05, 03:36 PM
Emeritus
Sci Advisor
PF Gold
P: 16,101
The word "orthogonal" is meaningless until you define an inner product on your vector space. There's no reason basis vectors should be orthogonal.
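For a concrete illustration (a small example of my own, in R^2), the same pair of vectors can be orthogonal under one inner product and not under another. With the standard inner product,
[tex](1,1)\cdot(1,-1) = 1\cdot 1 + 1\cdot(-1) = 0,[/tex]
so the vectors are orthogonal; but with the (perfectly legitimate) inner product [tex]\langle u,v\rangle = u_1 v_1 + 2u_2 v_2[/tex],
[tex]\langle (1,1),(1,-1)\rangle = 1\cdot 1 + 2\cdot 1\cdot(-1) = -1 \neq 0,[/tex]
so they are not.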
James Jackson
#3
Apr20-05, 03:50 PM
P: 166
To expand on what Hurkyl said:

Take the space [tex]C^2[/tex]. The most commonly used basis for this space consists of the vectors

[tex]|0\rangle =\left(\begin{array}{cc}1\\0\end{array}\right)[/tex]
[tex]|1\rangle =\left(\begin{array}{cc}0\\1\end{array}\right)[/tex]

So we can then get to any point in [tex]C^2[/tex] with the linear combination

[tex]|anywhere\rangle = \alpha |0\rangle +\beta |1\rangle[/tex].

Now, as a counterexample to your claim that a basis set has to be orthogonal, let me define the vectors [tex]|g\rangle , |h\rangle[/tex] as:

[tex]|g\rangle =\left(\begin{array}{cc}1\\1\end{array}\right)[/tex]
[tex]|h\rangle =\left(\begin{array}{cc}1\\0\end{array}\right)[/tex]

I can still reach any point in [tex]C^2[/tex] using these vectors, but they are not orthogonal. In essence, you can think of a non-orthogonal basis set as 'mixing' information between the coordinates:

[tex]\left(\begin{array}{cc}x\\y\end{array}\right) = a \left(\begin{array}{cc}1\\1\end{array}\right) + b \left(\begin{array}{cc}1\\0\end{array}\right)[/tex]

Which gives

[tex]x=a+b[/tex]
[tex]y=a[/tex]

So, because g and h aren't orthogonal, the expression for x also contains information about the coordinate y.
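As a concrete check (taking the point (3, 1) just as an example), solving those two equations gives [tex]a = y = 1[/tex] and [tex]b = x - y = 2[/tex], so

[tex]\left(\begin{array}{cc}3\\1\end{array}\right) = 1\cdot\left(\begin{array}{cc}1\\1\end{array}\right) + 2\cdot\left(\begin{array}{cc}1\\0\end{array}\right)[/tex]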

So, basis vectors don't have to be orthogonal, but they are usually chosen to be. This is important when you start working out solutions of linear equations, or doing stuff like quantum mechanics.

Hurkyl
#4
Apr20-05, 03:59 PM
Emeritus
Sci Advisor
PF Gold
P: 16,101

And for the sake of being explicit, James did define an inner product on his vector space -- it comes as a "standard" part of the definition of [itex]C^2[/itex].
EvLer
#5
Apr20-05, 05:22 PM
P: 460
So I cannot make the jump from linearly independent to orthogonal, but if vectors are orthogonal, they must be linearly independent. Right?
HallsofIvy
#6
Apr20-05, 05:29 PM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,885
Yes, if, in an "inner product space", a set of vectors are all orthogonal, then the set must be linearly independent:

Suppose [tex]\{v_1, v_2, \ldots, v_n\}[/tex] are orthogonal vectors and [tex]C_1 v_1 + C_2 v_2 + \cdots + C_n v_n = 0[/tex]. For each i between 1 and n, take the inner product of each side with [tex]v_i[/tex]. You obviously get 0 on the right; what do you get on the left?

Your original statement "there is no such thing as a non-orthogonal basis" is "sort of" right, because "orthogonal" depends on your choice of inner product: given any basis, there exists an inner product such that the basis is orthogonal with respect to that inner product. You get it like this: given the basis [tex]\{v_1, v_2, \ldots, v_n\}[/tex], define the inner product [tex]\langle u, v\rangle[/tex] of vectors u and v by writing u and v in terms of the basis,
[tex]u = A_1 v_1 + A_2 v_2 + \cdots + A_n v_n, \qquad v = B_1 v_1 + B_2 v_2 + \cdots + B_n v_n,[/tex]
and defining
[tex]\langle u, v\rangle = A_1 B_1 + A_2 B_2 + \cdots + A_n B_n.[/tex]
With that inner product, the basis is orthonormal.
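As a small numerical sketch of that construction (my own illustration, nothing more), take the non-orthogonal pair g = (1, 1), h = (1, 0) from post #3 as the basis of R^2 and build the inner product from coordinates:

[code]
# Sketch of the construction above: <u, v> is the ordinary dot product of the
# coordinate vectors of u and v with respect to a chosen (non-orthogonal) basis.
import numpy as np

basis = np.column_stack([(1.0, 1.0), (1.0, 0.0)])  # columns are g = (1,1), h = (1,0)

def coords(u):
    # coordinates (A1, A2) of u in the chosen basis, i.e. solve basis @ A = u
    return np.linalg.solve(basis, u)

def inner(u, v):
    # the constructed inner product: <u, v> = A1*B1 + A2*B2
    return coords(u) @ coords(v)

g = np.array([1.0, 1.0])
h = np.array([1.0, 0.0])
print(inner(g, g), inner(g, h), inner(h, h))  # prints 1.0 0.0 1.0, so g and h are orthonormal
[/code]

With this inner product the "non-orthogonal" basis from post #3 comes out orthonormal, which is exactly the point: orthogonality is relative to the choice of inner product.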
Hurkyl
#7
Apr20-05, 05:58 PM
Emeritus
Sci Advisor
PF Gold
P: 16,101
Quote by EvLer:
So I cannot make the jump from linearly independent to orthogonal, but if vectors are orthogonal, they must be linearly independent. Right?
See if you can prove it. Suppose you're given a linear combination of orthogonal vectors that sums to zero -- can you prove the coefficients must be zero?
dextercioby
#8
Apr20-05, 06:37 PM
Sci Advisor
HW Helper
P: 11,863
Exclude the null vector and assume all vectors have finite norm.

Daniel.
mathwonk
#9
Apr20-05, 09:26 PM
Sci Advisor
HW Helper
P: 9,422
dextercioby is giving a counterexample to halls' statement that a set of orthogonal vectors is independent, since the zero vector could be in the set.

so neither property, orthogonal or independent, implies the other.

but for non zero vectors, orthogonal does imply independent.

i.e. two non zero vectors are independent if they are not parallel. but just saying they are not parallel does not mean they are perpendicular.

on the other hand for non zero vectors, being perpendicular does imply they are not parallel.
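a concrete pair, just to see the distinction: (1,0) and (1,1) are not parallel, hence independent, but their dot product is (1)(1) + (0)(1) = 1, not 0, so they are not perpendicular.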
EvLer
#10
Apr21-05, 08:43 AM
P: 460
Quote by Hurkyl:
See if you can prove it. Suppose you're given a linear combination of orthogonal vectors that sums to zero -- can you prove the coefficients must be zero?
That was the exercise I was working on when I thought of my original question.
Basically, I wrote out the linear independence equation:
Let [tex]B = \{P_1, P_2, P_3\}[/tex] be an orthogonal basis for [tex]R^3[/tex] with all vectors non-zero, which means that
[tex]P_1 \cdot P_2 = 0, \qquad P_2 \cdot P_3 = 0, \qquad P_1 \cdot P_3 = 0.[/tex]
Then the equation looks like this:
[tex]aP_1 + bP_2 + cP_3 = 0[/tex]
Next, I took the dot product of both sides of the equation with [tex]P_1[/tex]:
[tex]aP_1 \cdot P_1 + bP_1 \cdot P_2 + cP_1 \cdot P_3 = P_1 \cdot 0[/tex]
So it turns out that [tex]a|P_1|^2 = 0[/tex], and since [tex]P_1[/tex] is non-zero, [tex]a = 0[/tex]. The same argument with [tex]P_2[/tex] and [tex]P_3[/tex] gives [tex]b = 0[/tex] and [tex]c = 0[/tex].
I see mathwonk's point, but if the basis is an eigenbasis then the zero vector is excluded.
Thanks to all of you! I have a better idea now.
mathwonk
#11
Apr21-05, 12:44 PM
Sci Advisor
HW Helper
P: 9,422
Although you are problem solving at the moment, and hence more flexibility is allowed, when writing it up you should perhaps say "orthogonal eigenbasis" instead of "orthogonal basis".

I notice that physicists are fond of introducing hypotheses after the fact to get themselves out of trouble, but mathematicians require them to be stated "up front".

(I could never solve physics problems partly for this reason, and was somewhat miffed, as a college student, to note that the "solution" always seemed to include an additional hypothesis that the solver stated was "obviously true" but which he had not mentioned in stating the problem.

This also occurred in trying to read relativity books later on. The writer would state that he was going to deduce some property from some other, different one. I was unable to do so, then read the solution, which began as follows: "since we know space is homogeneous"... (which had not been assumed at all). And I seem to recall I was reading the best physicists, such as Pauli, Einstein, Planck...)

Indeed, as bombadillo mentioned in his analysis of mathematicians' thinking, I allow myself this license when brainstorming, but not when writing up proofs. Since a proof is an attempt to communicate with others, it should leave no essential point in doubt.
EvLer
#12
Apr21-05, 02:26 PM
P: 460
You're right, I sort of re-defined the problem. But the problem does not say that the basis is an eigenbasis; all it says is that the set is orthogonal.
So in an orthogonal set, zero may be included? That's something you mentioned earlier. Doesn't that make it linearly dependent?
James Jackson
#13
Apr21-05, 06:07 PM
P: 166
Mathwonk: I have all sorts of fun along the lines you're mentioning. I am a Physicist (final year of my degree in the UK) and as such make assumptions along the lines you mention. However, I'm always careful to prove any such assumptions to myself: although I take them as true, I find it helps understanding at a deeper level than the problem at hand if you fully understand the framework supporting it.

With that in mind, I'm finding it quite interesting taking a 4th year module in Quantum Computing and Quantum Information Theory that is taught by the Mathematics department (we can take this 4th year Maths module in our 3rd year of Physics), as everything is defined very formally. That's different from Physics, where there is a certain element of what seems to be hand waving but is actually saving time by telling you certain things are true. If you want to go and prove these things then that's fine!

A case in point is a post on this sub-forum on orthogonal basis sets. I gave a counterexample to someone's claim that a Physicist would quite happily take, but Hurkyl pointed out that (in this instance) a certain fundamental property of what I was talking about (i.e. the space [itex]C^2[/itex] having an inner product) was needed to 'formalise' things.

Bloody mathematicians :)

Edit: Please ignore certain grammatical inconsistencies in the post above; I'm slightly less than sober right now...
mathwonk
#14
Apr21-05, 09:24 PM
Sci Advisor
HW Helper
P: 9,422
{(0,0), (1,0)} is an example of a dependent, but mutually orthogonal set.
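Spelling that out: [tex](0,0)\cdot(1,0) = 0[/tex], so the two vectors are (vacuously) orthogonal, while [tex]1\cdot(0,0) + 0\cdot(1,0) = (0,0)[/tex] is a linear relation with a non-zero coefficient, so the set is dependent.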
EvLer
#15
Apr21-05, 10:12 PM
P: 460
Quote by mathwonk:
{(0,0), (1,0)} is an example of a dependent, but mutually orthogonal set.
OK, I am not trying to be annoying, but why does Penney (the author of my textbook) define an orthogonal set as {P1, ..., Pn} with Pi != 0?
Lonewolf
#16
Apr22-05, 06:31 AM
P: 333
Quote by EvLer:
OK, I am not trying to be annoying, but why does Penney (the author of my textbook) define an orthogonal set as {P1, ..., Pn} with Pi != 0?
It's an odd restriction that I've never seen before. If the set was orthonormal, then you'd need the condition that Pi != 0.
HallsofIvy
#17
Apr22-05, 09:22 AM
Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,885
Quote by EvLer:
OK, I am not trying to be annoying, but why does Penney (the author of my textbook) define an orthogonal set as {P1, ..., Pn} with Pi != 0?
In order to make my post right, of course!
matt grime
#18
Apr22-05, 09:44 AM
Sci Advisor
HW Helper
P: 9,398
Quote by Lonewolf:
It's an odd restriction that I've never seen before. If the set was orthonormal, then you'd need the condition that Pi != 0.

No you wouldn't; that would be automatically true (an orthonormal vector has norm 1, so it cannot be the zero vector). But that may be semantics.

