Linear independence and orthogonality of vectors

In summary: an orthogonal set of nonzero vectors is always linearly independent, but linearly independent vectors need not be orthogonal. The original idea that a set of 2n linearly independent vectors in R^n would have to be orthogonal fails because R^n contains at most n linearly independent vectors, so no such set exists.
  • #1
larusi
Hi, I'm reading up on linear algebra and I'm wondering if the remark after a theorem I'm reading here is complete. The theorem states:
"If {V_1,V_2,...,V_k} is an orthogonal set of nonzero vectors then these vectors are linearly independent."
The remark after that simply states that if a set of vectors is linearly independent, the vectors are not necessarily orthogonal.

If the space you're working in is R^n, I find that if you have a set of 2n linearly independent vectors in that space, then they are necessarily orthogonal. Am I thinking about this the wrong way?
 
  • #2
larusi said:
Hi, I'm reading up on linear algebra and I'm wondering if the remark after a theorem I'm reading here is complete. The theorem states:
"If {V_1,V_2,...,V_k} is an orthogonal set of nonzero vectors then these vectors are linearly independent."
The remark after that simply states that if a set of vectors is linearly independent, the vectors are not necessarily orthogonal.
Do you have a question about this? What it says is that if you have an orthogonal set of nonzero vectors, then they are linearly independent. The converse doesn't have to be true. IOW, if you have a set of linearly independent vectors, they don't have to be orthogonal.
larusi said:
If the space you're working in is R^n, I find that if you have a set of 2n linearly independent vectors in that space, then they are necessarily orthogonal. Am I thinking about this the wrong way?
In R^n, you can have at most n (not 2n) linearly independent vectors. If you have n + 1 vectors in R^n, one of them must be a linear combination of the others.
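A quick numerical illustration of this (just a sketch in Python with NumPy, assuming it is available): stack any three vectors from R^2 as the rows of a matrix; the rank can never exceed 2, so the set must be linearly dependent.

Code:
import numpy as np

# Three vectors in R^2 -- one more than the dimension allows for independence.
vectors = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])

# The rank of the matrix equals the number of linearly independent rows.
rank = np.linalg.matrix_rank(vectors)
print(rank)                 # 2
print(rank < len(vectors))  # True: the three vectors are linearly dependent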
 
  • #3
Here's a simple counterexample to help your intuition. Imagine two vectors in 2D: one along the x axis, and one along the y=x line (so at a 45 degree angle). They are linearly independent, but not orthogonal.
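For concreteness, here is a small check of that counterexample (a Python/NumPy sketch, not part of the original post): the two vectors have rank 2, so they are linearly independent, but their dot product is 1 rather than 0.

Code:
import numpy as np

u = np.array([1.0, 0.0])  # along the x axis
v = np.array([1.0, 1.0])  # along the line y = x

# Independent: the matrix with u and v as rows has full rank 2.
print(np.linalg.matrix_rank(np.vstack([u, v])))  # 2

# Not orthogonal: the dot product is 1, not 0.
print(np.dot(u, v))                              # 1.0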

Does that help?
 
  • #4
Ah, I failed to note that the number of linearly independent vectors you could have was limited. Thanks Mark44, that clears things up.

chogg, I understand the theorem - I was wondering under what circumstances the converse holds.
 
  • #5
For a vector space of dimension n, any basis will have exactly n vectors in it. If the vectors happen to be orthogonal, or even orthonormal (i.e., mutually perpendicular and of length 1), then great, but there is no requirement for the vectors in a basis to be orthogonal.

For R^2, A = {<1, 0>, <0, 1>} and B = {<1, 1>, <1, 2>} are both bases, but only in A are the vectors orthogonal.

Edit: I might have misunderstood your question. I think you are asking for a situation where you have a set of orthogonal vectors that are linearly dependent.

C = {<1, 0>, <0, 1>, <0, 0>}
These vectors are mutually orthogonal, but are linearly dependent.
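If it helps, here is a quick check of the three sets above (a Python/NumPy sketch, assuming NumPy is available): A and B both have rank 2 and are therefore bases of R^2, but only A has all pairwise dot products equal to 0, while C is pairwise orthogonal yet has rank 2 < 3 and so is linearly dependent.

Code:
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0]])              # orthogonal basis
B = np.array([[1.0, 1.0], [1.0, 2.0]])              # basis, but not orthogonal
C = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # orthogonal set containing the zero vector

for name, S in [("A", A), ("B", B), ("C", C)]:
    rank = np.linalg.matrix_rank(S)
    # Off-diagonal entries of the Gram matrix S S^T are the pairwise dot products.
    gram = S @ S.T
    orthogonal = np.allclose(gram - np.diag(np.diag(gram)), 0.0)
    print(name, "independent:", rank == len(S), "orthogonal:", orthogonal)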
 
  • #6
Mark44 said:
Edit: I might have misunderstood your question. I think you are asking for a situation where you have a set of orthogonal vectors that are linearly dependent.

C = {<1, 0>, <0, 1>, <0, 0>}
These vectors are mutually orthogonal, but are linearly dependent.


Not exactly. My reasoning was that if I had, e.g., C = {<1, 0>, <0, 1>, <-1, 0>, <0, -1>} and the premise was that they were linearly independent, I could state that they were orthogonal. What I failed to notice was that these are of course linearly dependent vectors. This would however hold:

In a set of 2n vectors in R^n where none of the vectors in the set is the null vector or can be written as a sum of the other vectors multiplied by positive scalars, the vectors are orthogonal.
 
  • #7
larusi said:
Not exactly. My reasoning was that if I had, e.g., C = {<1, 0>, <0, 1>, <-1, 0>, <0, -1>} and the premise was that they were linearly independent, I could state that they were orthogonal. What I failed to notice was that these are of course linearly dependent vectors. This would however hold:

In a set of 2n vectors in R^n where none of the vectors in the set is the null vector or can be written as a sum of the other vectors multiplied by positive scalars, the vectors are orthogonal.
I doubt that this is true for larger values of n, say 3 and up. In two dimensions, it's easy to determine whether two vectors are linearly dependent - one of them will be a scalar multiple of the other. In three or more dimensions it's harder, as you can have a set of vectors where no one of them is a multiple of any of the others, but one of them is a linear combination of the others.

I'm not sure why it's important to say something about a set of 2n vectors in R^n, though, whether they're orthogonal or not, but what do I know?
 
  • #8
larusi said:
This would however hold:

In a set of 2n vectors in R^n where none of the vectors in the set is the null vector or can be written as a sum of the other vectors multiplied by positive scalars, the vectors are orthogonal.

It only "holds" in the sense that "no such set of vectors exists, therefore no proposition about the members of the set is false."

Any set of ##k > n## non-zero vectors in ##\mathbb{R}^n## is linearly dependent.

Proof: any set of ##k## linearly independent vectors is a basis for a subspace of ##\mathbb{R}^n## of dimension ##k##, and a subspace of ##\mathbb{R}^n## has dimension at most ##n##. Therefore ##k \le n##.
 
  • #9
Mark44 said:
I doubt that this is true for larger values of n, say 3 and up. In two dimensions, it's easy to determine whether two vectors are linearly dependent - one of them will be a scalar multiple of the other. In three or more dimensions it's harder, as you can have a set of vectors where no one of them is a multiple of any of the others, but one of them is a linear combination of the others.

I'm not sure why it's important to say something about a set of 2n vectors in R^n, though, whether they're orthogonal or not, but what do I know?

You're right, it's useless for anything other than a thought experiment to help me learn linear algebra :approve:

AlephZero said:
It only "holds" in the sense that "no such set of vectors exists, therefore no proposition about the members of the set is false."

Any set of ##k > n## non-zero vectors in ##\mathbb{R}^n## is linearly dependent.

Proof: any set of ##k## linearly independent vectors is a basis for a subspace of ##\mathbb{R}^n## of dimension ##k##, and a subspace of ##\mathbb{R}^n## has dimension at most ##n##. Therefore ##k \le n##.

I didn't state that they were linearly independent; the positive scalar is the key. This set is an example:
C = {<1, 0>, <0, 1>, <-1, 0>, <0, -1>}
Four vectors in R^2 where none is the null vector or can be written as a sum of the others multiplied by positive scalars; therefore they are orthogonal (according to my broken proposition).

Thanks for the insight guys.
 
  • #10
The example with (1,0) and (1,1) works in Cartesian coordinates... in general, how can we prove that?
Linearly independent vectors need not be orthogonal... is that also true for abstract vectors? I mean vectors which need not be arrows in space...
I am just asking...
 

1. What is the difference between linear independence and orthogonality of vectors?

Linear independence means that no vector in the set can be written as a linear combination of the other vectors in the set. Orthogonality, on the other hand, means that the vectors are mutually perpendicular, i.e., the dot product of every pair of distinct vectors is equal to 0.

2. How do you determine if a set of vectors is linearly independent?

A set of vectors is linearly independent if the only solution to the equation ##c_1 v_1 + c_2 v_2 + \dots + c_n v_n = 0## is ##c_1 = c_2 = \dots = c_n = 0##, where ##c_1, c_2, \dots, c_n## are scalars and ##v_1, v_2, \dots, v_n## are the vectors in the set.
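As a concrete illustration (a Python/NumPy sketch, not part of the original answer; the helper name is_linearly_independent is just for this example), one practical way to run this test is to compare the rank of the matrix formed by the vectors with the number of vectors:

Code:
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given row vectors are linearly independent."""
    M = np.asarray(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == M.shape[0]

print(is_linearly_independent([[1, 0], [1, 1]]))          # True
print(is_linearly_independent([[1, 0], [0, 1], [1, 1]]))  # False: 3 vectors in R^2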

3. What is the significance of linear independence in vector spaces?

Linear independence is significant because it allows us to create a basis for a vector space. A basis is a set of linearly independent vectors that can be used to represent any vector in the vector space.
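To make this concrete (a Python/NumPy sketch, not part of the original answer), here is how the non-orthogonal basis B = {<1, 1>, <1, 2>} mentioned earlier in the thread represents an arbitrary vector of R^2:

Code:
import numpy as np

# Columns are the basis vectors <1, 1> and <1, 2>.
B = np.array([[1.0, 1.0],
              [1.0, 2.0]])

x = np.array([3.0, 5.0])   # an arbitrary vector in R^2

# Solve B c = x for the coordinates of x with respect to the basis.
c = np.linalg.solve(B, x)
print(c)                   # [1. 2.]  because 1*<1, 1> + 2*<1, 2> = <3, 5>
print(B @ c)               # [3. 5.]  reconstructs x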

4. How do you determine if a set of vectors is orthogonal?

A set of vectors is orthogonal if the dot product of every pair of distinct vectors in the set is equal to 0. This can be checked by computing the dot product of each pair of vectors in the set and verifying that each one equals 0.
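For example (again a Python/NumPy sketch; the helper name is_orthogonal_set is only for illustration), the pairwise check described above can be written directly:

Code:
import numpy as np
from itertools import combinations

def is_orthogonal_set(vectors):
    """Return True if every pair of distinct vectors has dot product 0."""
    vs = [np.asarray(v, dtype=float) for v in vectors]
    return all(np.isclose(np.dot(u, v), 0.0) for u, v in combinations(vs, 2))

print(is_orthogonal_set([[1, 0], [0, 1]]))  # True
print(is_orthogonal_set([[1, 0], [1, 1]]))  # False: the dot product is 1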

5. Can a set of orthogonal vectors also be linearly independent?

Yes, a set of orthogonal nonzero vectors is always linearly independent; this is exactly the theorem quoted at the start of the thread. Note that the nonzero condition matters: the set {<1, 0>, <0, 1>, <0, 0>} is orthogonal but linearly dependent.
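A short justification, using the thread's notation: suppose ##c_1 V_1 + c_2 V_2 + \dots + c_k V_k = 0## for an orthogonal set of nonzero vectors. Taking the dot product of both sides with ##V_j## eliminates every term except the ##j##-th, leaving ##c_j (V_j \cdot V_j) = 0##; since ##V_j \neq 0##, this forces ##c_j = 0## for every ##j##, so only the trivial combination gives the zero vector and the set is linearly independent.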
