Question about vector independence

In summary: the theorem only says that more than n vectors in ℝn must be linearly dependent; a smaller set can go either way. To test a particular set, solve k1v1 + k2v2 + ... + krvr = 0: if the only solution is the trivial one, the vectors are linearly independent, and if there are nontrivial solutions, they are linearly dependent.
  • #1
MarcL
So I was reading my textbook and I confused myself about a theorem

Where if S={v1,v2,...,vr} is a set of vectors in ℝn and r>n, then S is linearly dependent.

It doesn't make sense to me because if we look at 2 vectors in ℝ3 (let's say u and v)

we have u=(u1,u2,u3) and v=(v1,v2,v3)
So I do k1(u1,u2,u3) + k2(v1,v2,v3) = 0.
If I use a matrix:

u1 v1 0
u2 v2 0
u3 v3 0

Then it would seem to me that both vectors would be linearly dependent, no?
 
  • #2
MarcL said:
So I was reading my textbook and I confused myself about a theorem

Where if S={v1,v2,...,vr} is a set of vectors in ℝn and r>n, then S is linearly dependent.

It doesn't make sense to me because if we look at 2 vectors in ℝ3 (let's say u and v)

The theorem doesn't say that 2 vectors in R^3 cannot be dependent. It doesn't say that there is dependence "if and only if r > n".
 
  • #3
But if I solve a 2x3 matrix, I would end up with a free variable, no? (The 3rd line, after I put the matrix in REF or RREF.) So that would make it linearly dependent.
 
  • #4
MarcL said:
But if I solve a 2x3 matrix, I would end up with a free variable, no? (The 3rd line, after I put the matrix in REF or RREF.) So that would make it linearly dependent.

The theorem does not say that 2 vectors in [itex]\mathbb{R}^3[/itex] must be independent. There is no contradiction.
 
  • #5
Well, I don't know; what I am trying to say is less that I see a contradiction and more of a "what is wrong in my reasoning" type of thing. I'll give a concrete example I just did; maybe my question will be clearer.

Is this set of vectors linearly dependent?

(8,-1,3),(4,0,1)

8 4 0 --> ( interchange row 2 with row 1, then row 3 with "new" row 2)
-1 0 0
3 1 0

-1 0 0
3 1 0
8 4 0
We end up with 2 rows with leading ones, and one row (the third one) that I can easily reduce to 0 0 0, so how is that not linearly dependent?
 
  • #6
MarcL said:
so how is that not linearly dependent?

The two vectors (8,-1,3),(4,0,1) in your example are not linearly dependent.

To determine if two 3-dimensional vectors [itex](u1,u2,u3)[/itex] and [itex] (v1,v2,v3) [/itex] are dependent, you need to determine whether there are solutions [itex] x [/itex] and [itex] y [/itex], not both zero, to the vector equation:

[itex] x ( u1,u2,u3) + y (v1,v2,v3) = 0 [/itex]

Written in matrix form, this can be expressed as:

[itex] \begin{pmatrix} u1&v1 \\ u2 & v2\\ u3&v3\end{pmatrix} \begin{pmatrix} x\\y \end{pmatrix} = \begin{pmatrix} 0\\ 0 \\0\end{pmatrix} [/itex].

Apparently you analyze this equation by writing the "augmented matrix" for it (where the constants on the right-hand side are put into the matrix), so we are dealing with:

[itex] \begin{pmatrix} u1 & v1 & 0\\ u2 & v2 & 0\\ u3 & v3 & 0\end{pmatrix} [/itex]

If we row-reduce the matrix so the bottom row is zeroes, what does that tell us?

If it reduces to [itex] \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0\end{pmatrix} [/itex] , this implies that [itex] x = 0, \ y = 0 [/itex] is the unique solution to the equations, so the two vectors are linearly independent.

If it reduces to something like [itex] \begin{pmatrix} 1 & -2 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0\end{pmatrix} [/itex] , this implies there are solutions like [itex] x = 2, y = 1 [/itex] and [itex] x = 4, y = 2 [/itex] etc., so the two vectors are dependent.
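For instance, taking the vectors (8,-1,3) and (4,0,1) from post #5 (a worked reduction added here for illustration; one possible sequence of row operations):

[itex] \begin{pmatrix} 8 & 4 & 0\\ -1 & 0 & 0\\ 3 & 1 & 0\end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 0\\ 0 & 4 & 0\\ 0 & 1 & 0\end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0\end{pmatrix} [/itex]

So the only solution is [itex] x = 0, \ y = 0 [/itex], and those two vectors are linearly independent.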
 
  • #7
Oh, I didn't know you guys had a matrix format that I could use... sorry about that.

So, trivial solutions are only determined by whether or not a variable (x or y in this case) has a leading one? It just seems kind of odd that one row could be all zeros without affecting the dependence of the vectors.
 
  • #8
MarcL said:
It just seems kind of odd that one row could be all zeros without affecting the dependence of the vectors.

You might be confusing two different techniques for testing independence. Another (and more common) technique for testing the independence of a set of n-dimensional vectors is to write the vectors as rows in a matrix and try to row-reduce the matrix toward the identity. If you get a row of zeros with that technique, you have managed to express the zero vector as a linear combination of the other rows, so the vectors are dependent.

If you row reduce [itex]\begin{pmatrix} u1 & u2 & u3 \\ v1 & v2 & v3 \end{pmatrix}[/itex] to something like [itex] \begin{pmatrix} 1 &-2 &0 \\ 0 & 0 & 0 \end{pmatrix} [/itex] then the vectors are dependent.

If you row reduce it to something like [itex] \begin {pmatrix} 1 & 0 & -2 \\ 0 & 1 & 7 \end{pmatrix}[/itex] the vectors are independent.
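With the earlier example written as rows (one possible reduction, shown here just to illustrate this technique):

[itex] \begin{pmatrix} 8 & -1 & 3 \\ 4 & 0 & 1 \end{pmatrix} \rightarrow \begin{pmatrix} 1 & 0 & 1/4 \\ 0 & 1 & -1 \end{pmatrix} [/itex]

There is no row of zeros, so (8,-1,3) and (4,0,1) come out independent by this technique as well.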
 
  • #9
If you have two vectors to consider, it's very easy to determine whether they are linearly independent. If neither one is a constant multiple of the other, the two vectors are linearly independent.

It's a lot harder when you have three or more vectors. Even if no one vector is a multiple of any of the others, the set can still be linearly dependent. For example, consider S = {<1, 1, 0>, <1, -1, 0>, <2, 0, 0>}. No one vector is a multiple of any of the others, but the equation c1v1 + c2v2 + c3v3 = 0 has a solution where not all of the constants are zero.
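(One such solution, spelled out for illustration: c1 = 1, c2 = 1, c3 = -1, since <1, 1, 0> + <1, -1, 0> - <2, 0, 0> = <0, 0, 0>.)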
 
  • #10
MarcL said:
if we look at 2 vectors in ℝ3 (let's say u and v)

we have u=(u1,u2,u3) and v=(v1,v2,v3)
So I do k1(u1,u2,u3) + k2(v1,v2,v3) = 0.
That is not always possible: if u=(1,0,0) and v=(0,1,0), it is not possible to find nonzero k1 and k2 that will give 0.
But it can certainly be true: if u=(1,0,0) and v=(-1,0,0), clearly k1=1 and k2=1 would give 0.
So as long as the number of vectors is less than or equal to the dimension of the space, it can go either way.

However, in ℝ3 it is impossible to find 4 vectors that are linearly independent; any set of 4 must be linearly dependent. That is what the theorem says.
 
  • #11
The theorem says that for all positive integers ##n##, every subset of ##\mathbb R^n## with cardinality (=number of elements) greater than ##n## is linearly dependent. So in particular, it says that every subset of ##\mathbb R^3## with at least four elements is linearly dependent. It says nothing about sets with only two elements, like your ##\{u,v\}##.

Some subsets of ##\mathbb R^3## with two elements are linearly dependent, and some are linearly independent. ##\{(1,0,0),(2,0,0)\}## is an example of the former, and ##\{(1,0,0),(0,1,0)\}## is an example of the latter.
 

1. What is vector independence?

Vector independence refers to the property of a set of vectors in a vector space where none of the vectors can be written as a linear combination of the others. In other words, no vector in the set can be formed by scaling the other vectors in the set and adding them together.

2. How do you determine if a set of vectors is linearly independent?

A set of vectors is linearly independent if no vector in the set can be written as a linear combination of the other vectors. This can be checked by setting up the system of equations c1v1 + c2v2 + ... + crvr = 0, where the coefficients are unknown, and solving for them. If the only solution is the trivial one (all coefficients equal to 0), then the vectors are linearly independent. If there are infinitely many solutions (i.e., nontrivial solutions exist), then the vectors are linearly dependent.
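As a minimal sketch of that test in code (assuming NumPy is available; the function name and example vectors are just for illustration, not part of the thread):

import numpy as np

def linearly_independent(vectors):
    # Stack the vectors as columns of a matrix A. The homogeneous system
    # A k = 0 has only the trivial solution exactly when the rank of A
    # equals the number of vectors, which is the independence test above.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

print(linearly_independent([(8, -1, 3), (4, 0, 1)]))             # True: independent
print(linearly_independent([(1, 1, 0), (1, -1, 0), (2, 0, 0)]))  # False: dependent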

3. What is the difference between linear independence and orthogonality?

Linear independence refers to the property of a set of vectors where none of the vectors can be written as a linear combination of the others. Orthogonality, on the other hand, refers to the property of vectors being perpendicular to each other (having zero dot products). A set of nonzero orthogonal vectors is always linearly independent, but linearly independent vectors are not necessarily orthogonal.

4. Can a set of vectors be both dependent and independent?

No, a set of vectors cannot be both dependent and independent. If a set of vectors is dependent, it means that at least one vector in the set can be written as a linear combination of the others, making the set linearly dependent. On the other hand, if a set of vectors is independent, it means that no vector can be written as a linear combination of the others, making the set linearly independent.

5. How does vector independence relate to the rank of a matrix?

The rank of a matrix is equal to the maximum number of linearly independent rows or columns in the matrix. This means the rank is directly related to independence: if you write the vectors of a set as the rows (or columns) of a matrix, the set is linearly independent exactly when the rank equals the number of vectors. If the rank is smaller, the set is linearly dependent.
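For example, the matrix with rows (8, -1, 3) and (4, 0, 1) from the thread above has rank 2, equal to the number of vectors, so those vectors are independent; the matrix with rows (1, 1, 0), (1, -1, 0), (2, 0, 0) has rank 2 < 3, so that set is dependent.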
