Linear Algebra - Linear Independence/Dependence

In summary: In this case, the only solution is the trivial one, x = 0. The reduced matrix (1 0; 0 1; 0 0; 0 0) has a pivot in every column, so the vectors (1, 1, -3, 1) and (1, 3, 4, 2) are linearly independent.
  • #1
Pete_01

Homework Statement


I want to know if the columns of the matrix
( 1  1)
( 1  3)
(-3  4)
( 1  2)
are linearly independent or dependent.


Homework Equations


I reduced it to rref and got (1 0; 0 1; 0 0; 0 0). I'm guessing the columns are linearly independent because each nonzero row has a single leading 1 when the system is set equal to 0? Is this correct?


The Attempt at a Solution


My attempt is shown in part 2 above.
 
  • #2
I assume you mean: are the two vectors you used to construct that matrix linearly independent?

They are linearly independent. When you reduce your matrix, you are left with a system of equations. The first and second rows tell you that, writing the coefficients as (a, b), 1a = 0 and 1b = 0. In other words, a = b = 0. Now, by definition, if the coefficients in the formula [tex]a\vec{v}_1 + b\vec{v}_2 = 0[/tex] can only be zero, then your two vectors are linearly independent.
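The reduction described above can be checked quickly in Python with sympy (this snippet is illustrative, not part of the original thread; the matrix is the one from the first post):

```python
from sympy import Matrix

# Stack the two vectors as the columns of a 4x2 matrix.
M = Matrix([[1, 1],
            [1, 3],
            [-3, 4],
            [1, 2]])

# rref() returns the reduced row-echelon form and the pivot columns.
rref_M, pivots = M.rref()
print(rref_M)   # identity block on top, two zero rows below
print(pivots)   # (0, 1): a pivot in every column, so the vectors are independent
```

A pivot in every column means the only solution of a v1 + b v2 = 0 is a = b = 0.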
 
  • #3
Yes, there were two vectors used: (1, 1, -3, 1) and (1, 3, 4, 2), so they would be linearly independent then, correct? Because the formula av1 + bv2 = 0 has only the trivial solution, right? Sorry, I'm still a bit confused.
 
  • #4
Yes, you got it. What you're basically looking for in linear dependence problems is whether any vector can be written in terms of the others. So say you want to see if V1 can be written as a linear combination of V2: you're asking whether V1 = a V2 for some scalar a.

Now if you have something more complex, like 4 vectors, and you want to know if V1 can be constructed as a linear combination of V2, V3, and V4, what you get is V1 = a V2 + b V3 + c V4. If you subtract V1 from both sides, you get 0 = -V1 + a V2 + b V3 + c V4. Since you have 0 on the left-hand side, you can multiply the entire equation by any nonzero number you wish, so in the end, without loss of generality, you can say that 0 = a' V1 + b' V2 + c' V3 + d' V4. And since the labels of the coefficients are arbitrary, you get the general form: 0 = a V1 + b V2 + c V3 + d V4.

Now you can use the machinery of linear algebra: the coefficient vector x = (a, b, c, d) is a solution to that equation, because you can form your vectors into the columns of a matrix A, write Ax = 0, and look for nontrivial solutions x.
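The Ax = 0 setup described above can be sketched with sympy's nullspace, again using the thread's two vectors as columns (a quick illustrative check, not from the original posts):

```python
from sympy import Matrix

# Vectors as columns of A; the solutions x of A x = 0 form the nullspace.
A = Matrix([[1, 1],
            [1, 3],
            [-3, 4],
            [1, 2]])

null = A.nullspace()  # basis for all solutions of A x = 0
print(null)           # empty list: only the trivial solution, so independent
```

An empty nullspace basis means the only coefficient vector with Ax = 0 is x = 0, which is exactly the definition of linear independence.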
 

1. What is linear independence/dependence in linear algebra?

Linear independence/dependence refers to the relationship between a set of vectors in a vector space. A set of vectors is considered linearly independent if none of the vectors can be written as a linear combination of the others. In other words, the only way to obtain the zero vector from the set is by multiplying each vector by a scalar of zero. On the other hand, a set of vectors is considered linearly dependent if at least one vector can be written as a linear combination of the others.

2. How do you determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, you can use the determinant method or the rank method. The determinant method applies when the matrix is square: create a matrix with the vectors as columns and compute the determinant; if the determinant is zero, the vectors are linearly dependent. The rank method works for any number of vectors: create a matrix with the vectors as rows (or columns) and compute the rank; if the rank is less than the number of vectors, the vectors are linearly dependent.
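Both methods can be tried in a few lines with numpy; the tall matrix is the one from this thread, and the 2x2 square matrix is a made-up dependent pair to illustrate the determinant test:

```python
import numpy as np

# Rank method: works for any shape. The thread's 4x2 matrix has rank 2,
# equal to the number of vectors, so the columns are independent.
A = np.array([[1, 1],
              [1, 3],
              [-3, 4],
              [1, 2]], dtype=float)
print(np.linalg.matrix_rank(A))  # 2

# Determinant method: needs a square matrix. Here the second column is
# twice the first, so the determinant is ~0 and the columns are dependent.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(B))  # approximately 0
```

Note that in floating point the determinant of a dependent set comes out only approximately zero, which is one reason the rank test (with its built-in tolerance) is usually preferred numerically.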

3. What is the importance of linear independence/dependence in linear algebra?

Linear independence/dependence is important in linear algebra because it allows us to understand the relationships between vectors in a vector space. It helps us determine if a set of vectors can span the entire space or if they are redundant. It also allows us to solve systems of linear equations and find unique solutions.

4. Can a set of vectors be both linearly independent and linearly dependent?

No, a set of vectors cannot be both linearly independent and linearly dependent. This is because the definitions are mutually exclusive. If a set of vectors is linearly independent, it means that none of the vectors can be written as a linear combination of the others. If a set of vectors is linearly dependent, it means that at least one vector can be written as a linear combination of the others. Therefore, a set of vectors cannot satisfy both conditions.

5. How does linear independence/dependence relate to the dimension of a vector space?

The dimension of a vector space is equal to the number of linearly independent vectors that can span the entire space. This means that the more linearly independent vectors a vector space has, the higher its dimension will be. On the other hand, if a set of vectors is linearly dependent, it means that some of the vectors are redundant and do not add any new information to the vector space. Therefore, the dimension of the vector space will be lower than the number of vectors in the set.
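The relationship between dependence and dimension can be seen in a small example (hypothetical vectors, not from the thread): three vectors in R^3 where the third is the sum of the first two only span a plane.

```python
import numpy as np

# v3 = v1 + v2, so the three vectors are dependent and span only a plane.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))  # 2: the span's dimension < number of vectors
```

The rank of the matrix equals the dimension of the span, so a rank of 2 for 3 vectors shows one of them is redundant.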
