Determining Linear Dependence of the Vectors u, v, w

In summary, the vectors are not linearly dependent, because the only column vector X satisfying AX = 0 is the zero vector.
  • #1
Pengwuino
I'm a little confused here, and I have a feeling I've forgotten how to do matrix operations. The problem is to determine whether the following vectors are linearly dependent or independent.

[tex]u = (1,-1,2),\quad v = (3,0,1),\quad w = (1,-2,2)[/tex]

I thought I would set up the matrix like this and reduce it to echelon form:

[tex]\begin{pmatrix} 1 & 3 & 1 \\ -1 & 0 & -2 \\ 2 & 1 & 2 \end{pmatrix}[/tex]

I got it down to:

[tex]\begin{pmatrix} 1 & 0 & 1 \\ 0 & -5 & 0 \\ 0 & 3 & -1 \end{pmatrix}[/tex]

and then I got a little confused. I'm not sure what to do next… or maybe I made a mistake earlier?
 
  • #2
If you really want to get it into echelon form, just continue! To get a 0 below that -5, multiply the second row by 3/5 and add to the third row.
Although, with that 0 in the second row, third column, it's already obvious, isn't it, that you will end up with a nonzero third row? That's enough to show these three vectors are not dependent.
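For concreteness, here is a sketch of that single step carried out on the matrix as posted (nothing new is assumed; it is just the row operation described above, R3 → R3 + (3/5)R2):

[tex]\begin{pmatrix} 1 & 0 & 1 \\ 0 & -5 & 0 \\ 0 & 3 & -1 \end{pmatrix} \;\xrightarrow{R_3 \to R_3 + \frac{3}{5}R_2}\; \begin{pmatrix} 1 & 0 & 1 \\ 0 & -5 & 0 \\ 0 & 0 & -1 \end{pmatrix}[/tex]

Every row of the result is nonzero, which is exactly the point made above.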
 
  • #3
OK, I'm still a little confused. What does the matrix have to become in order for me to be able to say it's linearly independent or linearly dependent?

I also ran the matrix through Mathematica, and it was able to reduce it to the identity matrix... did I maybe do the problem wrong?
 
  • #4
The goal of the approach you are using is to find the dimension of the space your vectors span.

Can you use that number to tell if your vectors are linearly independent or not?

How does this number relate to the rank of the matrix you created?

What about the rank of the matrix produced by fully row-reducing it?

Can you tell what the rank of the fully row-reduced matrix is?
 
  • #5
What do rank and span mean? I looked in the book, and it's farther into the book than where this problem is.
 
  • #6
If we call your matrix A, your vectors are linearly independent if and only if the only column vector X satisfying AX = 0 is the zero vector. This is just the definition of linear independence, since AX is just a linear combination of your vectors (the columns of A).

If you can reduce A to the identity matrix, what does this say about solutions to the homogeneous system AX=0?

If you are doubting your result, you might try "plotting" your vectors with a few pencils/straws/sticks/whatever. Do you know what linear independence will mean geometrically here?
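If it helps to check numerically, here is a small Python/NumPy sketch of exactly that criterion (it simply encodes the matrix from post #1; it is an illustration, not part of the assignment):

[code]
import numpy as np

# Columns of A are the vectors u, v, w from post #1
A = np.array([[ 1, 3,  1],
              [-1, 0, -2],
              [ 2, 1,  2]], dtype=float)

# If A row-reduces to the identity, then AX = 0 forces X = 0.
# Equivalently, a nonzero determinant means the zero vector is the only solution.
print(np.linalg.det(A))                 # approximately -5, nonzero
print(np.linalg.solve(A, np.zeros(3)))  # [0. 0. 0.] -- the unique solution of AX = 0
[/code]

Geometrically, a nonzero determinant says the three vectors are not coplanar, which is what the pencil experiment should show.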
 

1. What is linear dependency?

Linear dependency is a mathematical concept that refers to a situation in which one vector in a set can be expressed as a linear combination of the other vectors. In simpler terms, at least one vector in the set can be built from the others (in the two-vector case, one is simply a multiple of the other), making it redundant.

2. Why is solving linear dependency important?

Solving linear dependency is important because it allows us to eliminate redundant information and simplify our calculations. It also helps us determine if a set of vectors is linearly independent, which is a crucial concept in linear algebra.

3. How do you know if a set of vectors is linearly dependent?

A set of vectors is linearly dependent if one vector in the set can be written as a linear combination of the others. This can be checked by setting c_1 v_1 + c_2 v_2 + ... + c_n v_n = 0 and solving for the coefficients. If there is a solution in which at least one coefficient is nonzero, the vectors are linearly dependent; if the only solution is all coefficients equal to zero, they are linearly independent.
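For example, with the vectors from the thread above (purely as an illustration):

[tex]c_1 (1,-1,2) + c_2 (3,0,1) + c_3 (1,-2,2) = (0,0,0)[/tex]

Row-reducing the corresponding coefficient matrix shows that the only solution is c_1 = c_2 = c_3 = 0, so these particular vectors are linearly independent.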

4. What are some methods for solving linear dependency?

One method is Gaussian elimination: reduce the matrix whose columns (or rows) are the vectors to row echelon form and look for rows of zeros; a row of zeros signals dependence. Another method, available when the number of vectors equals the dimension so that the matrix is square, is to compute the determinant: if the determinant is zero, the vectors are linearly dependent.
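Here is a brief sketch of both methods in Python using SymPy, reusing the vectors from the thread (illustrative only; the problem could equally be done by hand):

[code]
from sympy import Matrix

# Columns are the vectors u, v, w from the thread
A = Matrix([[ 1, 3,  1],
            [-1, 0, -2],
            [ 2, 1,  2]])

# Method 1: Gaussian elimination -- reduced row echelon form
rref, pivots = A.rref()
print(rref)         # the 3x3 identity: no row of zeros, so no dependence
print(len(pivots))  # 3 pivots = rank 3 = number of vectors -> independent

# Method 2: determinant (only for a square matrix)
print(A.det())      # -5, nonzero -> independent
[/code]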

5. Can linearly dependent vectors be used to form a basis for a vector space?

No, linearly dependent vectors cannot form a basis for a vector space. A basis requires a set of linearly independent vectors that spans the entire vector space. If a set is linearly dependent, the redundant vectors can be removed without changing its span.
