Determine Linear Independence of {[1,2,-1,6], [3,8,9,10],[2,-1,2,-2]}

In summary: the rank of A is the number of linearly independent rows (or columns) of the matrix A. Here, the row echelon form of A has 3 nonzero rows: (1,2,-1,6), (0,1,6,-4), (0,0,1,-1). Therefore, the rank of A is 3.
  • #1
roam
Determine whether the set {[1,2,-1,6], [3,8,9,10],[2,-1,2,-2]} is linearly independent.

3. The Attempt at a Solution

I construct

[tex]A = \left[\begin{array}{cccc} 1 & 2 & -1 & 6 \\ 3 & 8 & 9 & 10 \\ 2 & -1 & 2 & -2 \end{array}\right][/tex]

The row echelon form is

[tex]\left[\begin{array}{cccc} 1 & 2 & -1 & 6 \\ 0 & 1 & 6 & -4 \\ 0 & 0 & 1 & -1 \end{array}\right][/tex]
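(As a quick sanity check of this reduction, here is a sketch using NumPy, assuming it is available; this is not part of the book's method:)

```python
import numpy as np

# Matrix whose rows are the three given vectors.
A = np.array([
    [1, 2, -1, 6],
    [3, 8, 9, 10],
    [2, -1, 2, -2],
])

# matrix_rank computes the rank numerically (via SVD); for this
# small integer matrix the result is exact.
print(np.linalg.matrix_rank(A))  # 3
```

The rank equals the number of rows, which agrees with the echelon form having three nonzero rows.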

Now there is a theorem saying that if the rank of A is smaller than the number of vectors in the set under consideration (i.e., the number of rows of A), then the vectors are linearly dependent; otherwise they are independent.

I can't understand this step: how do we determine the rank of A?


Furthermore, I have another question;
There is a property that states: "if the set contains more vectors than the dimension of its member vectors, the vectors are linearly dependent." They are thus NOT linearly independent.
So, what if the set contains fewer vectors than the dimension of its member vectors? In my problem I have 3 vectors, each with 4 components (vectors in R^4). What does that tell us?

Thanks!

 
  • #2
roam said:
So, what if the set contains fewer vectors than the dimension of its member vectors? In my problem I have 3 vectors, each with 4 components. What does that tell us?

It tells you you need to solve the problem another way (if the number of vectors you have is less than the dimension of the vector space they're in, you can't immediately decide if they're linearly independent or not).

roam said:
Now there is a theorem saying that if the rank of A is smaller than the number of vectors in the set under consideration, then the vectors are linearly dependent; otherwise they are independent.

You're in a bind here, since the rank is the dimension of the span of the row vectors of the matrix (or of the column vectors; the two are always equal). Fortunately, row operations (or column operations, if you're using the column definition) preserve rank, so what you have is useful. The problem boils down to determining whether the vectors

(1,2,-1,6), (0,1,6,-4), (0,0,1,-1) are linearly independent. Use the basic definition of linear independence: if a(1,2,-1,6) + b(0,1,6,-4) + c(0,0,1,-1) = (0,0,0,0), show that a = b = c = 0 must follow.

Look at the first coordinate, then the second coordinate, then the third coordinate.
 
  • #3
Office_Shredder said:
It tells you you need to solve the problem another way (if the number of vectors you have is less than the dimension of the vector space they're in, you can't immediately decide if they're linearly independent or not).

I understand, thanks.


Office_Shredder said:
(1,2,-1,6), (0,1,6,-4), (0,0,1,-1) are linearly independent. Use the basic definition of linear independence: if a(1,2,-1,6) + b(0,1,6,-4) + c(0,0,1,-1) = (0,0,0,0), show that a = b = c = 0 must follow.

Look at the first coordinate, then the second coordinate, then the third coordinate.

I just started reading this topic yesterday from the book and I'm a little confused at the moment.
My book doesn't give a good explanation; it only says "by inspection the rank of A is…" without elaborating on how to "inspect". I would appreciate it if you could demonstrate how to find the rank of A.

Well, if the vectors are linearly dependent, then the condition a(1,2,-1,6) + b(0,1,6,-4) + c(0,0,1,-1) = (0,0,0,0) must hold for some coefficients a, b, c that are not all zero, and the equation can be rewritten as

(1,2,-1,6) = -(b/a) (0,1,6,-4) - (c/a) (0,0,1,-1)

(0,1,6,-4) = -(a/b)(1,2,-1,6) - (c/b) (0,0,1,-1)

(0,0,1,-1) = -(a/c) (1,2,-1,6) - (b/c) (0,1,6,-4)

The first form is possible if [tex]a \neq 0[/tex], the second if [tex]b \neq 0[/tex], and so on.
 
  • #4
a(1,2,-1,6) + b(0,1,6,-4) + c(0,0,1,-1) = (0,0,0,0)

Ok, so the first coordinate gives us the equation:
a = 0

That's pretty easy. So we look at the second coordinate
2a + b = 0

But a=0 (from the first coordinate). So we get b=0

Then we see c=0 also. So if a(1,2,-1,6) + b(0,1,6,-4) + c(0,0,1,-1) = (0,0,0,0), then a=b=c=0
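(The coordinate-by-coordinate back-substitution above can be mirrored in a few lines of plain Python, as a sketch; the variable names are just illustrative:)

```python
# Solve a*(1,2,-1,6) + b*(0,1,6,-4) + c*(0,0,1,-1) = (0,0,0,0)
# coordinate by coordinate, as in the post above.
v1, v2, v3 = (1, 2, -1, 6), (0, 1, 6, -4), (0, 0, 1, -1)

a = 0            # first coordinate:   a = 0
b = -2 * a       # second coordinate:  2a + b = 0      =>  b = 0
c = a - 6 * b    # third coordinate:   -a + 6b + c = 0 =>  c = 0

print(a, b, c)  # 0 0 0: only the trivial combination works

# Sanity check: the combination really is the zero vector.
combo = tuple(a * x + b * y + c * z for x, y, z in zip(v1, v2, v3))
print(combo)  # (0, 0, 0, 0)
```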
 
  • #5
Thank you, I get it now, and I see that the set is linearly independent. But I still want to know how to find the rank of A. The book says "by inspection, the rank of A is 3" without explaining how. I know that a rank of 3 (equal to the number of vectors in the given set) means the set is linearly independent.
 

Related to Determine Linear Independence of {[1,2,-1,6], [3,8,9,10],[2,-1,2,-2]}

1. What is linear independence?

Linear independence refers to the property of a set of vectors in a vector space, where none of the vectors can be written as a linear combination of the others. In other words, the vectors are not redundant and each one adds unique information to the set.

2. How do you determine linear independence?

One way to determine linear independence is to create a matrix with the given vectors as its columns, then perform row operations to reduce the matrix to echelon form. If every column contains a pivot (equivalently, if the rank equals the number of vectors), the vectors are linearly independent; otherwise, they are linearly dependent.

3. What is the process for determining linear independence?

The process for determining linear independence involves creating a matrix with the given vectors as its columns, reducing the matrix to echelon form using row operations, and counting the pivots. If the number of pivots equals the number of vectors, the vectors are linearly independent; if some column has no pivot, they are linearly dependent.
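(This rank test can be sketched in NumPy, assuming it is installed; `linearly_independent` is an illustrative helper name, not a library function:)

```python
import numpy as np

def linearly_independent(vectors):
    # Vectors are independent exactly when the rank of the matrix
    # holding them equals the number of vectors.
    return np.linalg.matrix_rank(np.array(vectors)) == len(vectors)

print(linearly_independent([[1, 2, -1, 6], [3, 8, 9, 10], [2, -1, 2, -2]]))  # True
print(linearly_independent([[1, 2], [2, 4]]))  # False: (2,4) = 2*(1,2)
```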

4. Can you determine linear independence with only two vectors?

Yes. Two vectors are linearly dependent if and only if one is a scalar multiple of the other; otherwise they are linearly independent. The same rank test applies: form a matrix from the two vectors and check whether its rank equals 2.

5. What is the significance of determining linear independence?

Determining linear independence is important in many areas of mathematics and science, particularly in linear algebra and vector calculus. It allows for the identification of unique solutions to systems of equations and plays a crucial role in understanding vector spaces and their properties.
