# Linear independence proof

1. Jan 29, 2017

### Mr Davis 97

1. The problem statement, all variables and given/known data
Prove that a set S of vectors is linearly independent if and only if each finite subset of S is linearly independent.

2. Relevant equations

3. The attempt at a solution

I think it would be easier to prove the logically equivalent statement: a set S of vectors is linearly dependent if and only if there exists a finite subset of S that is linearly dependent.

First assume that a set S is linearly dependent. Then by definition there exists a linear dependence relation $a_1s_1 + \cdots + a_ns_n = 0$ among finitely many vectors $s_1, \dots, s_n \in S$, where not all of $a_1, \dots, a_n$ are zero. The set $\{s_1, \dots, s_n\}$ is then a finite subset of S that is linearly dependent.

Second, assume that S has a finite subset S' that is linearly dependent. A linear dependence relation among the vectors of S' is also a linear dependence relation among vectors of S, since $S' \subseteq S$. Thus, S is also linearly dependent.

Does this prove the original statement? It seemed a little too easy.
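To make the forward direction concrete, here is a small numerical sketch (the vectors are my own illustration, not from the problem): a dependence relation involves only finitely many vectors, and those vectors already form the finite dependent subset the proof needs.

```python
import numpy as np

# Illustrative vectors (my own example): s1 + s2 - s3 = 0.
S = [np.array([1.0, 0.0]),
     np.array([0.0, 1.0]),
     np.array([1.0, 1.0])]

# A dependence relation with not all coefficients zero.
coeffs = [1.0, 1.0, -1.0]
relation = sum(a * s for a, s in zip(coeffs, S))
assert np.allclose(relation, np.zeros(2))

# The vectors carrying nonzero coefficients form a finite
# linearly dependent subset of S -- the witness the argument needs.
finite_subset = [s for a, s in zip(coeffs, S) if a != 0]
print(len(finite_subset))
```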

2. Jan 29, 2017

### Ray Vickson

I think the result is wrong: consider $S = \{ (1,0,0), (0,1,0), (0,0,1), (1,1,1)\}.$ Any three of the vectors in $S$ are linearly independent, but $S$ itself is not.
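A quick numerical check of this example (a sketch using `numpy.linalg.matrix_rank`; a set of vectors is independent exactly when the matrix they form has rank equal to the number of vectors):

```python
import numpy as np
from itertools import combinations

S = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]

def independent(vectors):
    # Independent iff the rank equals the number of vectors.
    return np.linalg.matrix_rank(np.array(vectors, dtype=float)) == len(vectors)

# Every 3-element subset is linearly independent...
assert all(independent(c) for c in combinations(S, 3))
# ...but S itself is not: (1,1,1) = (1,0,0) + (0,1,0) + (0,0,1).
assert not independent(S)
```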

3. Jan 29, 2017

### Mr Davis 97

But S is a subset of itself.

4. Jan 29, 2017

### Ray Vickson

Of course I understand that; but if the question meant "including itself" then it is trivial and pointless.

5. Jan 29, 2017

### Mr Davis 97

The problem is from Friedberg, a widely used textbook. So I don't think the statement could be wrong. Maybe it is just a trivial exercise...

6. Jan 30, 2017

### StoneTemplePython

I think the idea is to use the trivial case as your base case. That is, S has no linearly dependent vectors if S has no linearly dependent vectors.

Now you want to prove the iff part via induction / recursive cases. The idea is: for vectors with n entries (and a spanning set with n vectors), basically you can split any set into 2 sets, one with k vectors and one with n - k vectors, for k = 0, 1, 2, ..., n-1, n. (You could actually truncate k at n//2 + 1, but that's a low-level optimization that doesn't really do much, so feel free to ignore this comment.)

Each of those subsets must have no linearly dependent vectors. Why? Assume the opposite and take the union of the 2 sets; this leads to a contradiction with your base case. Recursively apply this logic over and over until you've covered the entire power set. Also, for special handling: if you are evaluating an empty set, it evaluates to true... this is a leaf in your recursive tree. In effect you start on row n of Pascal's triangle and work your way to the 0th row.
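If it helps to see the subset-enumeration idea in code, here is a hedged sketch (the example vectors are mine) that walks the whole power set, empty set included, and confirms it agrees with testing S directly:

```python
import numpy as np
from itertools import chain, combinations

def independent(vectors):
    if not vectors:
        # The empty set is vacuously independent -- a leaf case.
        return True
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == len(vectors)

def powerset(S):
    # Rows 0 through n of Pascal's triangle: all subsets of S.
    return chain.from_iterable(combinations(S, k) for k in range(len(S) + 1))

S = [(1, 0, 0), (0, 1, 0), (1, 1, 0)]  # dependent: s1 + s2 = s3

# For a finite S: S independent <=> every subset of S is independent.
assert independent(S) == all(independent(list(sub)) for sub in powerset(S))
```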

I was about to start talking about memoization... which is a sign that it's getting late and that computer code may be seeping into my math.

7. Jan 30, 2017

### Stephen Tashi

How do your text materials define "a set S of vectors is linearly independent," and how does this definition apply to a case where S is an infinite set of vectors?