Determine if linearly independent?

Summary
To determine whether three vectors in R4 are linearly independent, arrange them as the columns of a matrix and row-reduce, since the matrix is not square and has no determinant. A set of vectors is independent if the only solution to the corresponding homogeneous system is the trivial solution (all coefficients equal to zero). With three vectors in R4, the last row of the row-reduced matrix is always zero, even for independent vectors; the vectors are dependent only if the last two rows are zero. In general, n vectors are independent exactly when the first n rows of the row-reduced matrix are not all zeros, so more vectors than components can never be independent. Understanding these concepts is crucial for mastering linear independence in linear algebra.
Tweet
Hi all,

I have been studying Linear Algebra for an upcoming exam, and one question has puzzled me slightly! How do you determine whether a set of vectors in R4 is linearly independent?

Given three vectors, each with 4 rows, I know you are meant to arrange them into a matrix, like this:

\[ \left( \begin{array}{ccc} a & e & i \\ b & f & j \\ c & g & k \\ d & h & l \end{array} \right) \]

In this case you are unable to find the determinant as it is not a square matrix. Are you meant to use row reduction instead? And if so, how do you ascertain whether it is independent or dependent? I'd appreciate any help clearing this up!

Cheers.
 
Tweet said:
Hi all,

I have been studying Linear Algebra for an upcoming exam, and one question has puzzled me slightly! How do you determine whether a set of vectors in R4 is linearly independent?

Given three vectors, each with 4 rows
You mean with 4 components, don't you?

, I know you are meant to arrange them into a matrix, like this:

\[ \left( \begin{array}{ccc} a & e & i \\ b & f & j \\ c & g & k \\ d & h & l \end{array} \right) \]

In this case you are unable to find the determinant as it is not a square matrix. Are you meant to use row reduction instead? And if so, how do you ascertain whether it is independent or dependent? I'd appreciate any help clearing this up!

Cheers.
Think about what "independent" means (that should be the first thing you learned).

A set of vectors (here three) is "independent" if and only if the only solution to the equation xu + yv + zw = 0 is x = y = z = 0. If the components are <a, b, c, d>, <e, f, g, h>, and <i, j, k, l>, that is the same as the system of equations xa + ye + zi = 0, xb + yf + zj = 0, xc + yg + zk = 0, and xd + yh + zl = 0 which, in turn, is the same as the matrix equation
\begin{bmatrix}a & e & i \\ b & f & j \\ c & g & k \\ d & h & l\end{bmatrix}\begin{bmatrix}x \\ y \\ z\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}

Obviously x = y = z = 0 is a solution, so the whole question is whether or not there are any other solutions. You are right that if the matrix were square, we could look at the determinant: if it were non-zero, there would be only one solution, x = y = z = 0, and so the vectors would be independent. But with a non-square matrix, we need to row-reduce, as you say. Here, because there are four equations in only three unknowns, the last row will reduce to 0 0 0. If the third row does not also reduce to 0 0 0, then we can solve to get x = y = z = 0. If the third row does reduce to 0 0 0, then we have only two equations in three unknowns, and that will not have a unique solution.
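In case it helps to check this numerically: the test above amounts to comparing the rank of the matrix with the number of vectors. Here is a minimal sketch using NumPy; the specific vectors u, v, w are made up for illustration (w is deliberately a combination of u and v, so this set is dependent).

```python
import numpy as np

# Three illustrative vectors in R^4 (values chosen arbitrarily for the example).
u = np.array([1.0, 0.0, 2.0, 1.0])
v = np.array([0.0, 1.0, 1.0, 3.0])
w = np.array([2.0, 1.0, 5.0, 5.0])  # w = 2u + v, so the set is dependent

A = np.column_stack([u, v, w])  # 4x3 matrix with the vectors as columns

# The vectors are independent iff Ax = 0 has only the trivial solution,
# i.e. iff the rank of A equals the number of vectors (columns).
rank = np.linalg.matrix_rank(A)
print(rank == A.shape[1])  # False here, since w = 2u + v
```

Rank equal to the number of columns is exactly the condition that the third row of the row-reduced matrix does not vanish.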
 
A set of vectors (here three) is "independent" if and only if the only solution to the equation xu + yv + zw = 0 is x = y = z = 0. If the components are <a, b, c, d>, <e, f, g, h>, and <i, j, k, l>, that is the same as the system of equations xa + ye + zi = 0, xb + yf + zj = 0, xc + yg + zk = 0, and xd + yh + zl = 0 which, in turn, is the same as the matrix equation
\begin{bmatrix}a & e & i \\ b & f & j \\ c & g & k \\ d & h & l\end{bmatrix}\begin{bmatrix}x \\ y \\ z\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}

Minor correction: the right hand side should be

\left[\begin{array}{c} 0 \\ 0 \\ 0 \\ 0 \end{array}\right]
 
Brilliant! That makes perfect sense to me. Thanks very much for clearing that up. So if the last row does reduce to 0 0 0 after row reducing, then it's safe to say that the vectors are dependent?
 
In this particular case, since there were only three vectors and four components, the last row will be 0 0 0 even if the vectors are independent. For them to be dependent, the last two rows must be 0 0 0.

In order for n vectors to be independent, the first n rows of the row-reduced matrix formed by taking the vectors as columns must not be all 0s, no matter how many components (rows of the matrix) there are. Of course, if there are more vectors than components, there are not enough rows in the matrix for that to be true: if m > n, then a set of m vectors in a space of dimension n cannot be independent.
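The dimension argument above can be seen concretely by row-reducing a case with more vectors than components. A small sketch using SymPy's rref (assuming SymPy is available); the four vectors in R^3 are made up for illustration:

```python
from sympy import Matrix

# Four illustrative vectors in R^3, placed as columns: more vectors than
# components, so they cannot possibly be independent.
A = Matrix([[1, 0, 1, 2],
            [0, 1, 1, 0],
            [1, 1, 2, 2]])

rref, pivots = A.rref()  # row-reduced echelon form and pivot columns
print(rref)

# The rank is the number of pivot columns. With 4 vectors in R^3 the rank
# is at most 3 < 4, so the columns are necessarily dependent.
print(len(pivots) < A.cols)  # True
```

Here the row-reduced matrix has only two nonzero rows, so the rank (2) falls short of the number of vectors (4), confirming dependence.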
 
Thanks a million guys, that's really helped me (finally) understand this stuff.
 
