How do you determine if vectors are linearly independent?

In summary, linear independence of a set of vectors is determined by the number of solutions to a homogeneous system of equations, where the vectors are arranged as the columns of a matrix. If the matrix is square and its determinant is non-zero, the vectors are independent. If the matrix is not square, row-reduce: the vectors are independent exactly when every column of the row-reduced matrix contains a pivot, that is, when the rank equals the number of vectors. Otherwise, they are dependent.
  • #1
Tweet
Hi all,

I have been studying Linear Algebra for an upcoming exam, and one question has puzzled me slightly! How do you determine if a set of vectors in R4 is linearly independent?

Given three vectors, each with 4 rows, I know you are meant to arrange them into a matrix, like this:

[tex]\[ \left( \begin{array}{ccc}
a & e & i \\
b & f & j \\
c & g & k\\
d & h & l\end{array} \right)\]
[/tex]

In this case you are unable to find the determinant as it is not a square matrix. Are you meant to use row reduction instead? And if so, how do you ascertain whether it is independent or dependent? I'd appreciate any help clearing this up!

Cheers.
 
  • #2
Tweet said:
Hi all,

I have been studying Linear Algebra for an upcoming exam, and one question has puzzled me slightly! How do you determine if a set of vectors in R4 is linearly independent?

Given three vectors, each with 4 rows
You mean with 4 components, don't you?

, I know you are meant to arrange them into a matrix, like this:

[tex]\[ \left( \begin{array}{ccc}
a & e & i \\
b & f & j \\
c & g & k\\
d & h & l\end{array} \right)\]
[/tex]

In this case you are unable to find the determinant as it is not a square matrix. Are you meant to use row reduction instead? And if so, how do you ascertain whether it is independent or dependent? I'd appreciate any help clearing this up!

Cheers.
Think about what "independent" means (that should be the first thing you learned).

A set of vectors (here three) is "independent" if and only if the only solution to the equation [itex]xu+ yv+ zw= 0[/itex] is x= y= z= 0. If the components are <a, b, c, d>, <e, f, g, h>, and <i, j, k, l>, that is the same as the system of equations xa+ ye+ zi= 0, xb+ yf+ zj= 0, xc+ yg+ zk= 0, and xd+ yh+ zl= 0 which, in turn, is the same as the matrix equation
[tex]\begin{bmatrix}a & e & i \\ b & f & j \\ c & g & k \\ d & h & l\end{bmatrix}\begin{bmatrix}x \\ y \\ z\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}[/tex]

Obviously x= y= z= 0 is a solution, so the whole question is whether or not there are any other solutions. You are right that if the matrix were square, we could look at the determinant: if it were non-zero, there would be only one solution, x= y= z= 0, and so the vectors would be independent. But with a non-square matrix, we need to row-reduce as you say. Here, because there are four equations in three unknowns, the last row will reduce to 0 0 0. If the third row does not also reduce to 0 0 0, then we can solve to get x= y= z= 0 as the only solution. If the third row does reduce to 0 0 0, then we have only two equations in three unknowns, and that will have non-zero solutions, so the vectors are dependent.
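As an aside, this row-reduction test is easy to carry out by machine. A minimal sketch in Python using sympy (the vector entries here are made-up example values, not from the thread):

```python
import sympy as sp

# Three vectors in R^4, arranged as the columns of a 4x3 matrix
# (example values chosen purely for illustration).
M = sp.Matrix([
    [1, 0, 1],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
])

# rref() returns the row-reduced echelon form and the pivot columns.
rref, pivots = M.rref()

# The columns are independent exactly when every column has a pivot,
# i.e. the only solution of M*[x, y, z]^T = 0 is x = y = z = 0.
independent = len(pivots) == M.cols
```

With three pivots here the columns are independent; if row reduction left only two non-zero rows, the vectors would be dependent.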
 
  • #3
A set of vectors (here three) is "independent" if and only if the only solution to the equation [itex]xu+ yv+ zw= 0[/itex] is x= y= z= 0. If the components are <a, b, c, d>, <e, f, g, h>, and <i, j, k, l>, that is the same as the system of equations xa+ ye+ zi= 0, xb+ yf+ zj= 0, xc+ yg+ zk= 0, and xd+ yh+ zl= 0 which, in turn, is the same as the matrix equation
[tex]\begin{bmatrix}a & e & i \\ b & f & j \\ c & g & k \\ d & h & l\end{bmatrix}\begin{bmatrix}x \\ y \\ z\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}[/tex]

Minor correction: the right hand side should be

[tex]\left[\begin{array}{c}
0 \\
0 \\
0 \\
0\end{array}\right][/tex]
 
  • #4
Brilliant! That makes perfect sense to me. Thanks very much for clearing that up. So if the last row does reduce to 0 0 0 after row reducing, then it's safe to say that the vectors are dependent?
 
  • #5
In this particular case, since there were only three vectors with four components, the last row will be 0 0 0 even if the vectors are independent. For them to be dependent, the last two rows must both be 0 0 0.

In order for n vectors to be independent, the first n rows of the row-reduced form of the matrix formed by taking the vectors as columns must not be all 0s, no matter how many components (rows of the matrix) there are. Of course, if there are more vectors than components, there are not enough rows in the matrix for that to be true: if m > n, then a set of m vectors in a space of dimension n cannot be independent.
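That last point (m vectors in an n-dimensional space with m > n are always dependent) can be confirmed numerically via the rank. A sketch with numpy, using invented vectors:

```python
import numpy as np

# Four vectors in R^3 as the columns of a 3x4 matrix: more vectors
# than components, so they cannot all be independent.
A = np.array([
    [1.0, 0.0, 2.0, 1.0],
    [0.0, 1.0, 1.0, 3.0],
    [1.0, 1.0, 0.0, 2.0],
])

rank = np.linalg.matrix_rank(A)

# The rank is at most 3 (the number of rows), which is less than the
# 4 columns, so some column must be a combination of the others.
dependent = rank < A.shape[1]
```

The same rank comparison works in general: columns are independent exactly when the rank equals the number of columns.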
 
  • #6
Thanks a million guys, that's really helped me (finally) understand this stuff.
 

1. What does it mean for vectors to be linearly independent?

Linear independence refers to a set of vectors in which none of the vectors can be written as a linear combination of the others. In other words, no vector in the set can be expressed as a sum of multiples of the other vectors.

2. How do you determine if vectors are linearly independent?

One way to determine if vectors are linearly independent is to set up a system of equations using the coefficients of the vectors and solve for them. If the only solution is that all coefficients are equal to zero, then the vectors are linearly independent.
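For instance, that homogeneous system can be solved directly; a short sketch in Python with sympy (the example vectors are invented):

```python
import sympy as sp

# Candidate vectors as the columns of M; we ask whether M x = 0
# has any solution besides x = 0.
M = sp.Matrix([
    [1, 2],
    [0, 1],
    [1, 0],
])

# nullspace() returns a basis for all solutions of M x = 0.
# An empty basis means x = 0 is the only solution, so the columns
# are linearly independent.
independent = (M.nullspace() == [])
```

Any non-empty nullspace basis vector gives explicit non-zero coefficients witnessing the dependence.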

3. What is the difference between linearly independent and linearly dependent vectors?

Linearly dependent vectors can be written as a linear combination of the other vectors in the set, while linearly independent vectors cannot. In other words, a dependent set contains at least one redundant vector that can be built from the others, whereas in an independent set every vector contributes a direction the others cannot produce.

4. Can a set of only two vectors be linearly independent?

Yes, a set of two vectors can be linearly independent if they are not scalar multiples of each other. This means that they are not on the same line and one cannot be obtained by scaling the other.
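In the plane this check reduces to a 2x2 determinant; a small numpy sketch with invented vectors:

```python
import numpy as np

u = np.array([2.0, 4.0])
v = np.array([1.0, 2.0])  # a scalar multiple of u (v = u / 2)
w = np.array([1.0, 3.0])  # not a multiple of u

# Two vectors in R^2 are dependent exactly when the determinant of
# the 2x2 matrix with them as columns is zero.
dep_uv = np.isclose(np.linalg.det(np.column_stack([u, v])), 0.0)
dep_uw = np.isclose(np.linalg.det(np.column_stack([u, w])), 0.0)
```

Here dep_uv is true (u and v lie on the same line) while dep_uw is false, so {u, w} is an independent pair.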

5. How is linear independence related to the dimension of a vector space?

The dimension of a vector space is the number of vectors in a basis, that is, in a linearly independent set that spans the space. It is also the largest possible size of a linearly independent set in that space.
