# Existence of basis

1. Oct 8, 2012

Hi,

I've been trying to prove that every vector space has a basis.
So, starting from the axioms of a vector space, I defined linear independence and span, and then defined a basis to be a linearly independent set that spans the space. I was trying to figure out a direct way to prove the existence of a basis, but the best I managed is a proof by construction, and I'm not sure whether it is valid.

Step 1: Pick an arbitrary vector $v$ from $V^i$, then $S^{i+1} := S^i \cup \{v\}$
Step 2: $V^{i+1} := V^{i} - span(S^{i+1})$

Starting with $V^1$ as the vector space in question and $S^1$ as the empty set, there are two possibilities:
A. After a finite number of steps, $V^i = \{\}$; then it can be shown that $S^i$ is a basis.
B. There is no finite number of steps after which $V^i = \{\}$. In this case I have absolutely no idea what happens.

In hindsight, I can define a finite-dimensional vector space to be a vector space for which this construction terminates. Then I have proven that every finite-dimensional vector space has a basis.
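For concreteness, the construction above can be run in a setting where spans are finite and enumerable. A minimal sketch (the toy space $V = GF(2)^3$ is my own example, not part of the argument):

```python
from itertools import product

# Toy model (my own example): V = GF(2)^3, where every span is a finite,
# enumerable set, so the "case A" termination can be observed directly.

V = set(product((0, 1), repeat=3))           # all 8 vectors of GF(2)^3

def span(S):
    """All GF(2)-linear combinations of the vectors in S."""
    vecs = list(S)
    result = set()
    for coeffs in product((0, 1), repeat=len(vecs)):
        combo = tuple(sum(c * v[i] for c, v in zip(coeffs, vecs)) % 2
                      for i in range(3))
        result.add(combo)
    return result

S = set()                                    # S^1 = {}, the empty set
remaining = V - span(S)                      # V^1 with span(S^1) removed
while remaining:                             # case A: stop when V^i = {}
    v = min(remaining)                       # step 1: pick a vector v
    S = S | {v}                              # ... and set S^{i+1} = S^i + {v}
    remaining = V - span(S)                  # step 2: V^{i+1} = V^i - span(S^{i+1})

print(len(S), span(S) == V)                  # a 3-element set that spans V
```

Here the loop always terminates because the space is finite; the question in case B is exactly what happens when it never does.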

Now the questions:
1) Is this an acceptable definition of a finite-dimensional vector space?
2) Is this a valid proof of the existence of a basis for a finite-dimensional vector space?
3) Does case B still yield a basis?
4) Is this a good proof/definition?

Last edited: Oct 8, 2012
2. Oct 8, 2012

### Erland

Do you assume that your vector space is finite dimensional?

Because if you don't, then you need the Axiom of Choice, or one of its equivalents, to prove that a basis exists.

3. Oct 8, 2012

No, I don't. The dimensionality of the space is not yet defined. I was hoping that this construction would allow me to define whether the space is finite- or infinite-dimensional.

I was thinking about the Axiom of Choice for step 1 of the construction, where you have to pick an arbitrary vector from a set, but I thought that was built into set theory. Is there anything more to it than just stating that you can pick an arbitrary vector?

EDIT: I made a mistake in the OP. It should be fine now.

Last edited: Oct 8, 2012
4. Oct 8, 2012

### Erland

I see now that you got it wrong in step 2. With your procedure, your proposed "basis" will be a subset of Span(v), where v is the vector you choose in the first iteration.

A more natural approach, which could be what you actually meant, would be to put $S_0 = \varnothing$ and then for each $i\ge 0$:

1. Choose $v_{i+1} \in V\setminus Span(S_i)$.
2. Put $S_{i+1}=S_i\cup\{v_{i+1}\}$.

In the finite dimensional case, this procedure will stop after a finite number of steps and then give a basis for V.

It is true that the Axiom of Choice is used in this procedure, but in the finite dimensional case it is not necessary, since in step 1 we can choose from a finite set which spans V instead of from all of V.
In the infinite dimensional case, it is more complicated...
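The remark about avoiding Choice with a finite spanning set can also be made concrete. A hedged sketch (the function name and example vectors are mine): walk once through a given finite spanning set and keep each vector that is independent of those already kept, using Gaussian elimination over the rationals.

```python
from fractions import Fraction

# Sketch (my own illustration): extract a basis from a *finite* spanning set
# by a single deterministic pass -- no choice principle needed.

def extract_basis(spanning_set):
    """Return the sublist of `spanning_set` kept by the greedy procedure."""
    basis, rows = [], []            # rows: reduced copies of the kept vectors
    for v in spanning_set:
        w = [Fraction(x) for x in v]
        for r in rows:              # eliminate w's entry at each row's pivot
            pivot = next(i for i, x in enumerate(r) if x != 0)
            if w[pivot] != 0:
                factor = w[pivot] / r[pivot]
                w = [a - factor * b for a, b in zip(w, r)]
        if any(x != 0 for x in w):  # w independent of rows so far: keep it
            basis.append(v)
            rows.append(w)
    return basis

# A spanning set of R^2 with a redundant vector:
print(extract_basis([(1, 0), (2, 0), (0, 1)]))   # keeps (1, 0) and (0, 1)
```

The pass is finite and deterministic, which is exactly why the Axiom of Choice plays no role in the finite dimensional case.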

EDIT: I see you fixed the error before I posted this. Your procedure is OK in the finite dimensional case.

Last edited: Oct 8, 2012
5. Oct 8, 2012

### Erland

For the infinite dimensional case, we must know what is meant by a linear combination of an infinite set S of vectors. This is defined as a linear combination of any finite subset of S. Span(S) is then the set of all such linear combinations of finite subsets of S. We say that S is linearly independent if, for every such finite linear combination which gives the result 0, all the (finitely many) coefficients in the combination are 0.
It is easy to show that these definitions are consistent with the definitions in the finite dimensional case. A basis is here, too, defined as a linearly independent set which spans V (it is sometimes called a Hamel basis), and the dimension of V as the cardinality of such a basis (which can be shown to be independent of the choice of basis, though this is harder to prove than in the finite dimensional case).
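Written out in symbols, these definitions read (my restatement of the text above):

```latex
% Span of a possibly infinite set S: finite linear combinations only.
\mathrm{Span}(S) = \Bigl\{ \sum_{k=1}^{n} c_k v_k \;:\;
    n \in \mathbb{N},\ c_1,\dots,c_n \text{ scalars},\ v_1,\dots,v_n \in S \Bigr\}

% Linear independence of S: for every finite choice of distinct v_k in S,
\sum_{k=1}^{n} c_k v_k = 0 \;\Longrightarrow\; c_1 = c_2 = \dots = c_n = 0
```

Note that no infinite sums appear anywhere; convergence is never needed.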

It is obvious that your procedure fails if the dimension is uncountable, and even if it is countably infinite it may fail, since it might only give a basis for a proper subspace of V of countably infinite dimension.

But your procedure could be generalized to cover the infinite dimensional case, using some equivalent of the Axiom of Choice, for example Zorn's Lemma.

6. Oct 8, 2012

### Fredrik

Staff Emeritus
The first thing you do is to choose a vector from $V^i$, but you haven't defined $V^i$ yet. So at that point, I already don't understand what you're doing.

A basis is by definition a maximal linearly independent subset. This means that B is a basis of V if and only if B is linearly independent, and for all $x\in V-B$, $B\cup\{x\}$ is linearly dependent. Because of this, the existence of a basis in the finite dimensional case is actually trivial (if you understand the definitions perfectly).

A vector space V is said to be infinite dimensional if for all positive integers n, there's a linearly independent subset of V with cardinality n.

A vector space V that isn't infinite dimensional is said to be finite dimensional.

The dimension of a non-trivial (i.e. $\neq\{0\}$) finite-dimensional vector space V is defined as the largest integer n such that V has a linearly independent subset with cardinality n. This integer is denoted by dim V.

Theorem: Every non-trivial finite-dimensional vector space has a basis.

Proof: Let n be an arbitrary positive integer. Let V be an arbitrary vector space such that dim V=n. Let B be an arbitrary linearly independent subset of V with cardinality n. Clearly, for all $x\in V-B$, $B\cup\{x\}$ must be linearly dependent, because otherwise we would have dim V≥n+1>n. Hence B is a maximal linearly independent subset, i.e. a basis.

The standard proof for the arbitrary case uses Zorn's lemma (which is equivalent to the axiom of choice). You will have to study some definitions to understand it. (In particular, the definition of "partially ordered set").

Theorem: Every non-trivial vector space has a basis.

Proof: Let V be an arbitrary non-trivial vector space. Let S be the set of all linearly independent subsets of V, partially ordered by inclusion. Let T be an arbitrary totally ordered subset of S. Then $\bigcup T$ is linearly independent, hence a member of S, and it is clearly an upper bound of T. Since every totally ordered subset of S has an upper bound in S, Zorn's lemma tells us that S has a maximal element, and a maximal element of S is a maximal linearly independent subset of V, i.e. a basis.
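The step that $\bigcup T$ itself lies in S (i.e. is linearly independent) deserves a line; it is the standard chain argument, spelled out here:

```latex
% Any finite subset of the union of a chain lies in a single member:
\{v_1,\dots,v_n\} \subseteq \bigcup T,\quad v_k \in B_k \in T
\;\Longrightarrow\; \{v_1,\dots,v_n\} \subseteq B := \max(B_1,\dots,B_n) \in T

% (the maximum exists because T is totally ordered and there are
% finitely many B_k), and B is linearly independent, so
\sum_{k=1}^{n} c_k v_k = 0 \;\Longrightarrow\; c_1 = \dots = c_n = 0
```

Since linear independence is a condition on finite subsets only, this shows every finite subset of $\bigcup T$ is linearly independent, hence so is $\bigcup T$.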

7. Oct 8, 2012

$V^i$ and $S^i$ are sets. I should have mentioned that before.

I see that it follows easily with your definition of basis, but with the definition I used (a linearly independent set that spans the space) it wasn't obvious that a basis is maximal. It isn't even obvious that every basis of a vector space has the same cardinality. I can prove that the definitions are equivalent, but first I wanted to prove that a basis actually exists in the first place.

Thank you for the replies, Fredrik and Erland. They've been very helpful. I think I've got it now.

8. Oct 8, 2012

### Fredrik

Staff Emeritus
That part was obvious, but you should have said something about what set $V^i$ is. It looks like it would be sufficient to say what $V^1$ is.

9. Oct 8, 2012

### Erland

It is, since he also gives a recursive definition of $V^{i+1}$ for $i \ge 1$.

10. Oct 8, 2012

I specified $V^1$ afterwards. Sorry if the definition was unclear. I don't have much experience with writing mathematical texts.