Linearly Independent Sets: Subsets of A Are Also Independent

  • Thread starter: _Bd_
  • Tags: Linear
In summary, a set of vectors is linearly independent if the only solution to the equation c1v1 + c2v2 + ... + cnvn = 0 is c1 = c2 = ... = cn = 0. This means that no vector in the set can be written as a linear combination of the other vectors, making any subset of the set also linearly independent. This can be proven by contradiction using the definition of linear independence.
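Stated symbolically (this is just a restatement of the claim discussed in the thread, with a subset of A written as {v_{i_1}, ..., v_{i_k}}), the goal is to show that

$$c_1 v_{i_1} + c_2 v_{i_2} + \dots + c_k v_{i_k} = 0 \;\Longrightarrow\; c_1 = c_2 = \dots = c_k = 0,$$

given that the corresponding implication holds for all of A.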
  • #1
_Bd_

Homework Statement

Show that if A = {v_1, v_2, v_3, . . ., v_n} is a linearly independent set, then any subset of A is also linearly independent.

Homework Equations

proof.

The Attempt at a Solution

So if A is a set of vectors
[v_1 ]
[v_2 ]
[v_3 ]
[. . . ]
[v_n ]
then for it to be independent the determinant must NOT be zero, and as far as I know determinants are only defined for square matrices.
Therefore A must be a square matrix with a determinant not equal to zero,
therefore it must be possible to row-reduce it to the identity matrix,
and therefore, since the identity matrix has a leading 1 in a different column of each row, you can NOT write any of them as a linear combination of the others, so any subset would also be independent??

I'm not sure about this since it's mostly text and I don't know if I should have more math
. . .
the back of the book just says 'Proof.'
 
  • #2
_Bd_ said:
you can NOT write any of them as a linear combination of the others, so any subset would also be independent??

I'm not sure about this since it's mostly text and I don't know if I should have more math
. . .
the back of the book just says 'Proof.'

Focus on this last bit. If a subset were linearly dependent, you could write an equation for one of the vectors as a linear combination of the others. Is that consistent with A being linearly independent? What assumption did we make that must be false?
 
  • #3
fzero said:
Focus on this last bit. If a subset were linearly dependent, you could write an equation for one of the vectors as a linear combination of the others. Is that consistent with A being linearly independent? What assumption did we make that must be false?

That's kind of what I was pointing at . . . or I'm not sure what you're getting at.
Say it reduces to

[1 0 0 0 ]
[0 1 0 0 ]
[0 0 1 0 ]
[0 0 0 1 ]

so for example vector 1 (1 0 0 0) can NOT be written as any combination of the other 3 vectors . . . because none of them have a nonzero entry in that column . . . so even if you were to only take a subset
[ 1 0 0 0 ]
[ 0 0 0 1 ]
[ 0 0 1 0 ]
then none of those can be written as a combination of the others (therefore they are independent?).
That's what I am thinking, but I don't know if there is a way to do this with more math and less text?
 
  • #4
My question, then, is: "what definition of 'independent' are you using?"

The definition that appears in most texts has nothing to do with matrices!
 
  • #5
You should be able to show this fairly straightforwardly by using the definition of linear independence, which says: {v0, v1, …, vn} is linearly independent iff a0v0 + a1v1 + ... + anvn = 0 implies a0 = a1 = … = an = 0. Try a proof by contradiction.
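In symbols, this definition reads (the indexing from v0 is as in the post; the LaTeX below is just a restatement of it):

$$\{v_0, v_1, \dots, v_n\} \text{ is linearly independent} \iff \big(a_0 v_0 + a_1 v_1 + \dots + a_n v_n = 0 \implies a_0 = a_1 = \dots = a_n = 0\big).$$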
 
  • #6
HallsofIvy said:
My question, then, is: "what definition of 'independent' are you using?"

The definition that appears in most texts has nothing to do with matrices!

The chapter is on vector spaces . . .



JonF said:
You should be able to show this fairly straightforwardly by using the definition of linear independence, which says: {v0, v1, …, vn} is linearly independent iff a0v0 + a1v1 + ... + anvn = 0 implies a0 = a1 = … = an = 0. Try a proof by contradiction.

OK, so . . . if c1v1 + c2v2 + . . . + cnvn = 0, then the set is independent if c1, c2, . . ., cn are zero.

If c1, c2, . . ., cn are NOT zero, then the set contains at least one c_iv_i which is NOT equal to zero, and therefore it does not satisfy the definition of independence . . .
is that cool?
 
  • #7
Make sure you note that in the definition I gave you, the an's have to equal 0 for any coefficients that satisfy the equation, not just for one particular set of them.

I'd formalize it a bit, but that's the general idea. Start with something like: let {v0, v1, …, vn} be a set of linearly independent vectors. Suppose {vp0, vp1, …, vpk} is a subset of those vectors and it is linearly dependent. If it is linearly dependent… thus we reach a contradiction. I'll let you fill in the rest.
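One way the middle step might be filled in (a sketch only, using the notation of the post; the indices p0, …, pk mark which members of the original set belong to the subset):

If $\{v_{p_0}, \dots, v_{p_k}\}$ were linearly dependent, there would be coefficients $b_0, \dots, b_k$, not all zero, with
$$b_0 v_{p_0} + b_1 v_{p_1} + \dots + b_k v_{p_k} = 0.$$
Extend them to coefficients for the whole set by setting $a_{p_j} = b_j$ and $a_i = 0$ for every $v_i$ not in the subset. Then
$$a_0 v_0 + a_1 v_1 + \dots + a_n v_n = 0$$
with not all of the $a_i$ equal to zero, contradicting the assumption that $\{v_0, \dots, v_n\}$ is linearly independent.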
 
  • #8
_Bd_ said:
OK, so . . . if c1v1 + c2v2 + . . . + cnvn = 0, then the set is independent if c1, c2, . . ., cn are zero.

Make sure you understand the subtlety of this definition, which is that the only solution to the equation c1v1 + c2v2 + ... + cnvn = 0 is c1 = c2 = ... = cn = 0.

I can always set any linear combination of vectors to 0, whether they are linearly dependent or linearly independent, and come up with the solution c1 = c2 = ... = cn = 0. The key difference is that for linearly independent vectors, this is the only solution. For a set of linearly dependent vectors it is one of many solutions.

For example, the vectors v1 = <2, 1, 1> and v2 = <4, 2, 2> are obviously linearly dependent.

The equation c1v1 + c2v2 = 0 has the trivial solution c1 = c2 = 0, but because these vectors are linearly dependent, there are many other solutions as well, such as c1 = 2 and c2 = -1.
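As a quick numerical sanity check of this example (a sketch using numpy, which is not part of the original discussion):

Code:
import numpy as np

# The two vectors from the example above.
v1 = np.array([2, 1, 1])
v2 = np.array([4, 2, 2])

# The nontrivial solution c1 = 2, c2 = -1 really does give the zero vector.
print(2 * v1 + (-1) * v2)                          # [0 0 0]

# The matrix with v1 and v2 as rows has rank 1 < 2, confirming dependence.
print(np.linalg.matrix_rank(np.vstack([v1, v2])))  # 1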
 

What is a linearly independent set?

A linearly independent set is a collection of vectors in a vector space in which no vector can be expressed as a linear combination of the other vectors in the set.

What does it mean for subsets of a set to also be independent?

If a set is linearly independent, then any subset of that set will also be linearly independent. This means that no vector in the subset can be expressed as a linear combination of the other vectors in that subset.

How do you determine if a set is linearly independent?

To determine if a set is linearly independent, you can apply the definition of linear independence directly or perform a linear dependence test, such as row reduction (Gaussian elimination) or, when the number of vectors equals the dimension of the space, computing the determinant of the square matrix formed by the vectors; a small computational check is sketched below.
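For instance, a rank-based test might look like the following (a sketch in Python/numpy; the vectors a, b, c are made up for illustration and are not from the thread):

Code:
import numpy as np

def is_linearly_independent(vectors):
    # A list of vectors is independent iff the matrix with those
    # vectors as columns has rank equal to the number of vectors.
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 1.0])
c = np.array([0.0, 0.0, 1.0])

print(is_linearly_independent([a, b, c]))      # True
print(is_linearly_independent([a, c]))         # True -- a subset of an
                                               # independent set is independent
print(is_linearly_independent([a, b, a + b]))  # False: a + b depends on a and b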

What is the significance of linearly independent sets?

Linearly independent sets are important in linear algebra and other areas of mathematics because they allow us to represent and solve systems of linear equations. They also help us to understand the structure and properties of vector spaces.

Can a set be both linearly independent and dependent?

No, a set cannot be both linearly independent and dependent. A set is either linearly independent or dependent; there is no in-between. If a set is linearly dependent, it means that at least one vector in the set can be expressed as a linear combination of the other vectors in the same set.
