Proving Existence of Basis in Vector Spaces

In summary, your procedure for finding a basis for a vector space is valid, but more complex for the infinite dimensional case.
  • #1
Dead Boss
Hi,

I've been trying to prove that every vector space has a basis.
So, starting from the axioms of a vector space, I defined linear independence and span, and then defined a basis to be a linearly independent set that spans the space. I was trying to find a direct way to prove the existence of a basis, but the best I managed is a proof by construction, and I'm not sure whether it's valid.

Step 1: Pick an arbitrary vector [itex]v[/itex] from [itex]V^i[/itex], then [itex]S^{i+1} := S^i \cup \{v\}[/itex]
Step 2: [itex]V^{i+1} := V^{i} - span(S^{i+1})[/itex]

Starting with [itex]V^1[/itex] as the vector space in question and [itex]S^1[/itex] as the empty set, there are two possibilities:
A. After a finite number of steps [itex]V^i = \{\}[/itex], then it can be shown that [itex]S^i[/itex] is a basis.
B. There is no finite number of steps after which [itex]V^i = \{\}[/itex]. In this case I have absolutely no idea what happens. :smile:

In hindsight, I can define a finite dimensional vector space to be a vector space for which this construction terminates. Then I have proven that every finite dimensional vector space has a basis.
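As an illustration (not part of the original post), the construction can be run on the small space [itex](\mathbb{Z}/2)^3[/itex], where [itex]span(S)[/itex] can be enumerated directly; all names here are made up for the sketch:

```python
from itertools import product

# The 8 vectors of the space (Z/2)^3, represented as tuples.
V = set(product((0, 1), repeat=3))

def span(S):
    """Span of S over Z/2: all sums of subsets of S (scalars are 0 or 1)."""
    result = {(0, 0, 0)}
    for v in S:
        result |= {tuple((a + b) % 2 for a, b in zip(u, v)) for u in result}
    return result

S = set()
remaining = V - span(S)          # V^1 minus the span of the empty set
while remaining:                 # case A: stop when V^i is empty
    v = next(iter(remaining))    # step 1: pick an arbitrary vector
    S.add(v)                     # S^{i+1} := S^i ∪ {v}
    remaining = V - span(S)      # step 2: V^{i+1} := V^i − span(S^{i+1})

print(len(S))   # 3: S is a basis, so the construction gives dim = 3
```

Each chosen vector lies outside the span of the previous ones, so S stays linearly independent, and the loop stops exactly when S spans the whole space.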

Now the questions:
1) Is this an acceptable definition of a finite dimensional vector space?
2) Is this a valid proof of the existence of a basis for finite dimensional vector spaces?
3) Does case B still yield a basis?
4) Is this a good proof/definition?
 
  • #2
Do you assume that your vector space is finite dimensional?

Because if you don't, then you need the Axiom of Choice, or one of its equivalents, to prove that a basis exists.
 
  • #3
No, I don't. The dimensionality of the space is not yet defined. I was hoping that this construction would allow me to define whether the space is finite or infinite dimensional.

I was thinking about the axiom of choice for step 1 of the construction, where you have to pick an arbitrary vector from a set, but I thought that was built into set theory. Is there anything more to it than just stating that you can pick an arbitrary vector?

EDIT: I made a mistake in the OP. It should be fine now.
 
  • #4
I see now that you got it wrong in step 2. With your procedure, your proposed "basis" will be a subset of Span(v), where v is the vector you chose in the first iteration.

A more natural approach, which could be what you actually meant, would be to put [itex]S_0 = \varnothing[/itex] and then for each [itex]i\ge 0[/itex]:

1. Choose [itex]v_{i+1} \in V\setminus Span(S_i)[/itex].
2. Put [itex]S_{i+1}=S_i\cup\{v_{i+1}\}[/itex].

In the finite dimensional case, this procedure will stop after a finite number of steps and then give a basis for V.

It is true that the Axiom of Choice is used in this procedure, but in the finite dimensional case it is not necessary, since in step 1 above we can choose from a finite set which spans V instead of from V.
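To illustrate that choice-free finite dimensional variant (my own sketch, not code from this thread; `is_independent` and the example spanning set are made up): run through a fixed finite spanning set of [itex]\mathbb{Q}^3[/itex] and keep each vector that is independent of those already kept.

```python
from fractions import Fraction

def is_independent(vectors):
    """Test linear independence of rational vectors by exact row reduction."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    rank = 0
    for col in range(len(rows[0]) if rows else 0):
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col] != 0), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col] != 0:
                factor = rows[r][col] / rows[rank][col]
                rows[r] = [a - factor * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank == len(vectors)

# A finite spanning set of Q^3 with a redundant vector in the middle:
# (2, 1, 0) = (1, 0, 0) + (1, 1, 0).
spanning = [(1, 0, 0), (1, 1, 0), (2, 1, 0), (0, 0, 1)]

basis = []
for v in spanning:                 # deterministic loop: no Axiom of Choice needed
    if is_independent(basis + [v]):
        basis.append(v)

print(basis)   # [(1, 0, 0), (1, 1, 0), (0, 0, 1)]
```

Because the loop runs over a fixed finite list, no choice principle is invoked; the redundant vector is simply skipped.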
In the infinite dimensional case, it is more complicated...

EDIT: I see you fixed the error before I posted this. Your procedure is OK in the finite dimensional case.
 
  • #5
For the infinite dimensional case, we must know what is meant by a linear combination of an infinite set S of vectors. This is defined as a linear combination of any finite subset of S. Span(S) is then the set of all such linear combinations of finite subsets of S. We say that S is linearly independent if, for every such finite linear combination which gives the result 0, all the (finitely many) coefficients in this combination are 0.
It is easy to show that these definitions are consistent with those in the finite dimensional case. A basis is here too defined as a linearly independent set which spans V (it is sometimes called a Hamel basis), and the dimension of V as the cardinality of such a basis (which can be shown to be independent of the choice of basis, though this is harder to prove than in the finite dimensional case).

It is obvious that your procedure fails if the dimension is uncountable, and even if it is countably infinite it may fail, since it might only give a basis for a proper subspace of V of countably infinite dimension.

But your procedure could be generalized to cover the infinite dimensional case, using some equivalent of the Axiom of Choice, for example Zorn's Lemma.
 
  • #6
The first thing you do is to choose a vector from ##V^i##, but you haven't defined ##V^i## yet. So at that point, I already don't understand what you're doing.

A basis is by definition a maximal linearly independent subset. This means that B is a basis of V if and only if B is linearly independent, and for all ##x\in V-B##, ##B\cup\{x\}## is linearly dependent. Because of this, the existence of a basis in the finite dimensional case is actually trivial (if you understand the definitions perfectly).

A vector space V is said to be infinite dimensional if for all positive integers n, there's a linearly independent subset of V with cardinality n.

A vector space V that isn't infinite dimensional is said to be finite dimensional.

The dimension of a non-trivial (i.e. ##\neq\{0\}##) finite-dimensional vector space V is defined as the largest integer n such that V has a linearly independent subset with cardinality n. This integer is denoted by dim V.

Theorem: Every non-trivial finite-dimensional vector space has a basis.

Proof: Let n be an arbitrary positive integer. Let V be an arbitrary vector space such that dim V=n. Let B be an arbitrary linearly independent subset of V with cardinality n. Clearly, for all ##x\in V-B##, ##B\cup\{x\}## must be linearly dependent, because otherwise we would have dim V≥n+1>n. Hence B is a maximal linearly independent set, i.e., a basis.

The standard proof for the arbitrary case uses Zorn's lemma (which is equivalent to the axiom of choice). You will have to study some definitions to understand it (in particular, the definition of "partially ordered set").

Theorem: Every non-trivial vector space has a basis.

Proof: Let V be an arbitrary non-trivial vector space. Let S be the set of all linearly independent subsets of V, partially ordered by inclusion. Let T be an arbitrary totally ordered subset of S. Then ##\bigcup T## is linearly independent (any finite subset of it lies in some member of T, since T is totally ordered), so it is an upper bound of T in S. Since every totally ordered subset has an upper bound, Zorn's lemma tells us that S has a maximal element B. Such a B is a basis: if some ##x\in V## were not in Span(B), then ##B\cup\{x\}## would be linearly independent, contradicting maximality.
 
  • #7
Fredrik said:
The first thing you do is to choose a vector from ##V^i##, but you haven't defined ##V^i## yet. So at that point, I already don't understand what you're doing.
##V^i## and ##S^i## are sets. I should have mentioned that before.

Fredrik said:
A basis is by definition a maximal linearly independent subset. This means that B is a basis of V if and only if B is linearly independent, and for all ##x\in V-B##, ##B\cup\{x\}## is linearly dependent. Because of this, the existence of a basis in the finite dimensional case is actually trivial (if you understand the definitions perfectly).
I see that it's trivial with your definition of a basis, but with the definition I used (a linearly independent set that spans the space) it wasn't obvious that a basis is maximal. It isn't even obvious that every basis of a vector space has the same cardinality. I can prove that the definitions are equivalent, but first I wanted to prove that a basis actually exists in the first place.

Thank you for the replies, Fredrik and Erland. They've been very helpful. I think I've got it now.
 
  • #8
Dead Boss said:
##V^i## and ##S^i## are sets. I should have mentioned that before.
That part was obvious, but you should have said something about what set ##V^i## is. It looks like it would be sufficient to say what ##V^1## is.
 
  • #9
Fredrik said:
That part was obvious, but you should have said something about what set ##V^i## is. It looks like it would be sufficient to say what ##V^1## is.
It is, since he also gives a recursive definition of ##V^{i+1}## for ##i\ge 1##.
 
  • #10
I specified [itex]V^1[/itex] afterwards. Sorry if the definition was unclear. I don't have much experience with writing mathematical texts.
 

1. What is a vector space?

A vector space is a mathematical structure consisting of a set of vectors together with operations, such as addition and scalar multiplication, that satisfy a set of axioms. These axioms include closure under addition and scalar multiplication, associativity, commutativity of addition, and the existence of a zero vector and additive inverses.

2. Why is it important to prove the existence of a basis in a vector space?

A basis is a set of linearly independent vectors that can be used to represent any vector in a vector space. Proving the existence of a basis is important because it allows us to express any vector in a vector space as a unique linear combination of the basis vectors. This is useful in various applications, such as solving systems of linear equations and finding eigenvalues and eigenvectors.

3. How do you prove the existence of a basis in a vector space?

To prove the existence of a basis in a vector space, we need to show that the set of vectors we are considering satisfies the definition of a basis. This means that the vectors must be linearly independent and span the entire vector space. We can do this by using various methods, such as Gaussian elimination, or by using the properties of vector spaces and linear transformations.
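As a small sketch of one such check (illustrative code, with made-up names, not from the answer above): n vectors form a basis of Q^n exactly when the matrix having them as rows is invertible, i.e. has nonzero determinant, which can be computed exactly with rational arithmetic:

```python
from fractions import Fraction

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    total = Fraction(0)
    for j, entry in enumerate(m[0]):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * entry * det(minor)
    return total

def is_basis(vectors):
    """n vectors are a basis of Q^n iff the matrix they form is invertible."""
    n = len(vectors)
    if any(len(v) != n for v in vectors):
        return False
    return det([[Fraction(x) for x in v] for v in vectors]) != 0

print(is_basis([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))   # True: the standard basis
print(is_basis([(1, 2, 3), (2, 4, 6), (0, 0, 1)]))   # False: first two are parallel
```

A nonzero determinant certifies both linear independence and spanning at once, since for n vectors in an n-dimensional space each property implies the other.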

4. Can a vector space have more than one basis?

Yes, a vector space can have multiple bases. This is because there are often multiple sets of vectors that can be used to represent any vector in the vector space. However, all bases in a given vector space will have the same number of vectors, known as the dimension of the vector space.

5. How is the existence of a basis related to the dimension of a vector space?

The existence of a basis is directly related to the dimension of a vector space. The dimension of a vector space is the number of vectors in any basis for that vector space. So, if a basis exists and has n vectors, then the dimension of the vector space is n. Additionally, if a vector space is finite-dimensional, then any basis for that vector space will have the same number of vectors, which is the dimension of the vector space.
