
Linear independence and orthogonality of vectors

  1. Apr 8, 2014 #1
    Hi, I'm reading up on linear algebra, and I'm wondering whether the remark after a theorem I'm reading here is complete. The theorem states:
    "If ##\{V_1, V_2, \dots, V_k\}## is an orthogonal set of nonzero vectors, then these vectors are linearly independent."
    The remark after it simply states that if a set of vectors is linearly independent, the vectors are not necessarily orthogonal.

    If the space you're working in is ##\mathbb{R}^n##, I find that if you have a set of ##2n## linearly independent vectors in that space, then they are necessarily orthogonal. Am I thinking about this the wrong way?
     
  3. Apr 8, 2014 #2

    Mark44

    Staff: Mentor

    Do you have a question about this? What it says is that if you have an orthogonal set of nonzero vectors, then they are linearly independent. The converse doesn't have to be true. In other words, if you have a set of linearly independent vectors, they don't have to be orthogonal.
    In ##\mathbb{R}^n##, you can have at most ##n## (not ##2n##) linearly independent vectors. If you have ##n + 1## vectors in ##\mathbb{R}^n##, one of them must be a linear combination of the others.
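
    For the record, the standard one-line proof of the quoted theorem: suppose ##c_1 V_1 + c_2 V_2 + \dots + c_k V_k = 0##. Taking the dot product of both sides with ##V_j## kills every cross term by orthogonality:
    $$0 = \left\langle \sum_{i=1}^{k} c_i V_i,\, V_j \right\rangle = c_j \|V_j\|^2,$$
    and since ##V_j \neq 0##, this forces ##c_j = 0## for every ##j##. Note that both hypotheses (orthogonal and nonzero) get used.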
     
  4. Apr 8, 2014 #3
    Here's a simple counterexample to help your intuition. Imagine two vectors in 2D: one along the x axis, and one along the line y = x (so at a 45-degree angle). They are linearly independent, but not orthogonal.
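
    Concretely, with ##u = (1, 0)## and ##v = (1, 1)##:
    $$u \cdot v = 1 \cdot 1 + 0 \cdot 1 = 1 \neq 0, \qquad \det\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = 1 \neq 0,$$
    so the pair is not orthogonal (nonzero dot product) but is linearly independent (nonzero determinant).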

    Does that help?
     
  5. Apr 8, 2014 #4
    Ah, I failed to note that the number of linearly independent vectors you could have was limited. Thanks Mark44, that clears things up.

    chogg, I understand the theorem - I was wondering under what circumstances the reverse was true.
     
  6. Apr 8, 2014 #5

    Mark44

    Staff: Mentor

    For a vector space of dimension ##n##, any basis will have exactly ##n## vectors in it. If the vectors happen to be orthogonal, or even orthonormal (i.e., mutually perpendicular and of length 1), then great, but there is no requirement for the vectors in a basis to be orthogonal.

    For ##\mathbb{R}^2##, A = {<1, 0>, <0, 1>} and B = {<1, 1>, <1, 2>} are both bases, but only in A are the vectors orthogonal.
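
    A quick check that B really is a basis even though it isn't orthogonal:
    $$\langle 1, 1 \rangle \cdot \langle 1, 2 \rangle = 1 + 2 = 3 \neq 0, \qquad \det\begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix} = 2 - 1 = 1 \neq 0.$$
    The nonzero determinant shows the two vectors are linearly independent, hence a basis of ##\mathbb{R}^2##; the nonzero dot product shows they are not orthogonal.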

    Edit: I might have misunderstood your question. I think you are asking for a situation where you have a set of orthogonal vectors that are linearly dependent.

    C = {<1, 0>, <0, 1>, <0, 0>}
    These vectors are mutually orthogonal, but are linearly dependent.
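
    Explicitly, the dependence is the nontrivial combination
    $$0 \cdot \langle 1, 0 \rangle + 0 \cdot \langle 0, 1 \rangle + 1 \cdot \langle 0, 0 \rangle = \langle 0, 0 \rangle,$$
    which is exactly why the theorem has to assume the vectors are nonzero.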
     
  7. Apr 8, 2014 #6

    Not exactly. My reasoning was that if I had, e.g., C = {<1, 0>, <0, 1>, <-1, 0>, <0, -1>} and the premise was that the vectors were linearly independent, I could state that they were orthogonal. What I failed to notice was that these are of course linearly dependent vectors. This, however, would hold:

    In a set of ##2n## vectors in ##\mathbb{R}^n## where none of the vectors in the set is the null vector and none can be written as a sum of the other vectors multiplied by positive scalars, the vectors are orthogonal.
     
  8. Apr 8, 2014 #7

    Mark44

    Staff: Mentor

    I doubt that this is true for larger values of ##n##, say 3 and up. In two dimensions, it's easy to determine whether two vectors are linearly dependent - one of them will be a scalar multiple of the other. In three or more dimensions it's harder, as you can have a set of vectors in which no vector is a scalar multiple of any other, yet one of them is a linear combination of the others.
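
    For example, in ##\mathbb{R}^3## take
    $$v_1 = (1, 0, 0), \quad v_2 = (0, 1, 0), \quad v_3 = (1, 1, 0).$$
    No vector is a scalar multiple of another, yet ##v_3 = v_1 + v_2##, so the set is linearly dependent.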

    I'm not sure why it's important to say something about a set of ##2n## vectors in ##\mathbb{R}^n##, though, whether they're orthogonal or not, but what do I know?
     
  9. Apr 8, 2014 #8

    AlephZero

    Science Advisor
    Homework Helper

    It only "holds" in the sense that "no such set of vectors exists, therefore no proposition about the members of the set is false."

    Any set of ##k > n## non-zero vectors in ##\mathbb{R}^n## is linearly dependent.

    Proof: any set of ##k## linearly independent vectors in ##\mathbb{R}^n## is a basis for a subspace of ##\mathbb{R}^n## of dimension ##k##, and a subspace of ##\mathbb{R}^n## has dimension at most ##n##. Therefore ##k \le n##, so any set of more than ##n## vectors must be linearly dependent.
     
  10. Apr 8, 2014 #9
    You're right, it's useless for anything other than a thought experiment to help me learn linear algebra :approve:

    I didn't state that they were linearly independent; the positive scalars are the key. This set is an example:
    C = {<1, 0>, <0, 1>, <-1, 0>, <0, -1>}
    Four vectors in ##\mathbb{R}^2## where none is the null vector and none can be written as a sum of the others multiplied by positive scalars; therefore they are orthogonal (according to my broken proposition).
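
    (A concrete reason the proposition is broken, reading "positive scalars" as all coefficients positive: take D = {<1, 0>, <1, 1>, <-1, 0>, <-1, -1>} in ##\mathbb{R}^2##. None is the null vector, and none is a positive combination of the others - for instance ##\langle 1, 0 \rangle = a\langle 1, 1 \rangle + b\langle -1, 0 \rangle + c\langle -1, -1 \rangle## forces ##a = c## from the second coordinate and then ##b = -1## from the first. Yet ##\langle 1, 0 \rangle \cdot \langle 1, 1 \rangle = 1 \neq 0##, so the set is not orthogonal.)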

    Thanks for the insight guys.
     
  11. Jan 1, 2015 #10
    The example with (1, 0) and (1, 1) works in Cartesian coordinates, but how can we prove the statement in general?
    "Linearly independent vectors need not be orthogonal" - is that also true for abstract vectors, i.e. vectors which need not be arrows in space?
    I am just asking.
     