I am still confused how to prove that a set is a basis other than

  • Context: Undergrad
  • Thread starter: elabed haidar
  • Tags: Basis, Confused, Set
SUMMARY

The discussion centers on proving that a set of vectors forms a basis for a vector space. A basis must be linearly independent and span the space, which can be demonstrated by expressing standard basis vectors as linear combinations of the given set. The process involves forming a matrix from the vectors, performing row operations, and checking for linear independence through row echelon form. If the matrix is square and reduces to the identity matrix, the original set spans the entire space.

PREREQUISITES
  • Understanding of linear independence and spanning sets
  • Familiarity with matrix operations, specifically row reduction
  • Knowledge of vector spaces and basis definitions
  • Ability to perform matrix inversion and understand its implications
NEXT STEPS
  • Learn about row echelon form and reduced row echelon form in matrix theory
  • Study the concept of linear combinations and their applications in vector spaces
  • Explore the relationship between the column space and row space of matrices
  • Investigate the kernel of a matrix and its significance in determining basis status
USEFUL FOR

Students and educators in linear algebra, mathematicians, and anyone seeking to deepen their understanding of vector spaces and basis proofs.

elabed haidar
I am still confused about how to prove that a set is a basis, other than proving that it is linearly independent and a generating set. What does that have to do with matrices? Please help.
 


It is merely a matter of parsing the definition of basis. The vectors must be linearly independent and must span the space for which they are proposed as a basis.

The second part, if you want to be rigorous, is a matter of expressing an arbitrary vector in the space as a linear combination of the basis vectors. It is sufficient to show that each of the standard basis vectors of the space can be expressed in terms of the vectors in the set, and that will involve inverting the matrix you form by using the vectors of the set as columns (using their coefficients in the standard basis).

For example:
Given the set of vectors in 2-space:
$$\vec{u}= \hat{i}, \qquad \vec{v} = \hat{i}+2\hat{j}$$
Their columns of coefficients are:
$$\vec{u} = \left(\begin{array}{c}1\\0\end{array}\right), \qquad \vec{v} = \left(\begin{array}{c}1\\2\end{array}\right).$$

Then the matrix:
$$M = \left(\begin{array}{cc}1 & 1\\ 0 & 2\end{array}\right)$$
gives, by multiplying the column of coefficients on the right, the transformation from coefficients in the basis {u, v} to coefficients in the basis {i, j}:
$$a\vec{u} + b\vec{v}=x\hat{i} + y\hat{j}$$
where
$$\left(\begin{array}{c}x\\y\end{array}\right)= \left(\begin{array}{cc}1 & 1\\ 0 & 2\end{array}\right)\left(\begin{array}{c}a\\b\end{array}\right)$$

Since the matrix is invertible, its inverse will give you back the coefficients (a, b) in terms of the (x, y) coefficients of a vector in the {i, j} basis:
$$\left(\begin{array}{c}a\\b\end{array}\right)=\left(\begin{array}{cc}1 & 1\\ 0 & 2\end{array}\right)^{-1}\left(\begin{array}{c}x\\y\end{array}\right)$$
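The 2×2 example above is easy to check numerically. Here is a minimal sketch (not from the thread) using NumPy; the vector 3i + 4j is just an arbitrary test vector chosen for illustration:

```python
import numpy as np

# Columns are the coordinates of u and v in the standard {i, j} basis.
M = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# A nonzero determinant means M is invertible, so {u, v} is a basis of R^2.
assert abs(np.linalg.det(M)) > 1e-12

# Recover the {u, v} coefficients (a, b) of the vector x*i + y*j = 3i + 4j.
xy = np.array([3.0, 4.0])
ab = np.linalg.solve(M, xy)   # solving M ab = xy, rather than forming M^{-1}
print(ab)                     # a = 1.0, b = 2.0, since 1*u + 2*v = 3i + 4j
```

Using `np.linalg.solve` instead of computing the explicit inverse is the standard numerical practice, but it computes exactly the (a, b) from the inverse formula above.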

Now if you're working with a subspace (e.g. two vectors as a proposed basis of a subspace in 3 or more dimensions), you won't have a square matrix, but you can still carry out row operations to see whether a specific vector is a linear combination of the proposed basis set, and also whether the set is linearly independent.
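For the non-square subspace case, one way to sketch both checks (my example vectors, not from the thread) is with SymPy's exact row reduction: no zero row in the reduced form means the set is linearly independent, and appending a candidate vector as an extra row tests membership in the span via the rank:

```python
import sympy as sp

# Proposed basis of a plane (a 2-dimensional subspace) in R^3, as rows.
u = [1, 0, 1]
v = [0, 1, 1]
A = sp.Matrix([u, v])

# Row reduce; no all-zero row means {u, v} is linearly independent.
R, _ = A.rref()
assert all(any(x != 0 for x in R.row(i)) for i in range(R.rows))

# Is w = (2, 3, 5) in span{u, v}?  Append w as a row: if the rank does
# not grow, w is a linear combination of u and v.
w = [2, 3, 5]
B = sp.Matrix([u, v, w])
print(A.rank(), B.rank())   # equal ranks here: w = 2u + 3v lies in the plane
```

The rank comparison is just the row-operation test stated above, packaged as a single call.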
 


Hi elabed haidar :smile:

I'm quite confused about what you want to hear from us here. The standard way of proving that something is a basis is to prove that it is linearly independent and that it spans the vector space.

Of course, sometimes there are shortcuts. Specifically, if you already know the dimension of your vector space and it happens to be finite, then it becomes a tad easier. For example, if the dimension of the space is n, and you have exactly n vectors, then it suffices to show that the vectors are linearly independent OR that the vectors span the space. Thus it suffices to check only one of these conditions.
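That shortcut is easy to apply in practice. A minimal sketch (my own example vectors): for exactly n vectors in an n-dimensional space, a single nonzero-determinant check certifies linear independence, and hence (by the shortcut) a basis:

```python
import numpy as np

# Three vectors in R^3: exactly dim(R^3) of them, so by the shortcut
# above, checking linear independence alone is enough.
vectors = np.array([[1.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0],
                    [1.0, 0.0, 1.0]])

# A nonzero determinant certifies linear independence, hence a basis.
print(abs(np.linalg.det(vectors)) > 1e-12)   # True for this set
```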

But other than that, I see no other way of proving that something is a basis...
 


Thank you both, but what I am asking about is the last part of what jambaugh said, about subspaces. I am still confused about how to prove that a system is a basis.
 


elabed haidar said:
Thank you both, but what I am asking about is the last part of what jambaugh said, about subspaces. I am still confused about how to prove that a system is a basis.

Again, you know how to determine linear independence. To review: use the vectors' components in some basis as the rows of a matrix (since we'll use row operations; if you prefer, you can use columns and column operations, which is just the transposed case).
What we're doing here is forming a matrix whose row space is by definition the span of our set of vectors.

Then row reduce until you are in row echelon form.

Row operations on a matrix of row vectors replace rows with linear combinations of rows, so each row remains within the span of the original rows. The row space of the matrix is unchanged.

If any row becomes all zeros, then you know your set was not linearly independent:
=> you essentially subtracted a linear combination of the other rows from that row, so that vector equaled a linear combination of the other vectors.

Note: if you have as many vectors as the dimension of the space, and they are all linearly independent, then you will (a) have a square matrix of row vectors, and (b) get the identity matrix when you reduce to reduced row echelon form. This means the span of your original set of vectors equals the span of the standard basis, i.e. the whole space.

Now to answer the last of your question will depend on how you are specifying a subspace for which you wish to check a set of vectors for basis status.

If you wish to compare one set's span to another, you simply form the two matrices (of row vectors) and their spans will be equal if they have the same reduced row echelon form modulo any extra rows of all zeros.
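This equal-RREF test is straightforward to run. A small sketch (the two vector sets are my own illustration): both matrices here are full rank, so there are no zero rows to discard before comparing:

```python
import sympy as sp

# Two sets of vectors in R^3, stored as rows of two matrices.
S1 = sp.Matrix([[1, 0, 1],
                [0, 1, 1]])
S2 = sp.Matrix([[1, 1, 2],    # = row1 + row2 of S1
                [1, -1, 0]])  # = row1 - row2 of S1

# Equal reduced row echelon forms (after discarding any all-zero rows)
# mean equal row spaces, i.e. the two sets span the same subspace.
r1, _ = S1.rref()
r2, _ = S2.rref()
print(r1 == r2)   # True: the spans coincide
```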

If you wish to compare a set's span with the row space of a matrix, well then that is just the above case with the first step done for you. The matrix is already the matrix of row vectors for another set and the process is the same as above.

If you wish to compare a set's span with the column space of a matrix then you just take the matrix's transpose, the column space of M is the row space of transpose(M).

If you wish to compare a set's span with a set of homogeneous linear constraints, i.e. a system of homogeneous linear equations (linear combinations of coordinates set equal to zero), then you are comparing the set's span to the null space (or kernel) of the matrix formed from the coefficients of the homogeneous linear equations.

You must then get a basis for the kernel (see: http://en.wikipedia.org/wiki/Kernel_(matrix) ) and follow the first procedure.
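Getting a kernel basis by hand means row reducing the coefficient matrix and back-solving for the free variables; a quick sketch (my example constraint, not from the thread) using SymPy's built-in null space routine:

```python
import sympy as sp

# One homogeneous constraint:  x + y + z = 0,  as its coefficient matrix.
C = sp.Matrix([[1, 1, 1]])

# SymPy returns a basis of the kernel (null space) as column vectors.
kernel_basis = C.nullspace()
for vec in kernel_basis:
    assert C * vec == sp.zeros(1, 1)   # each basis vector satisfies C v = 0
print(len(kernel_basis))               # the kernel here is 2-dimensional
```

With that basis in hand, comparing a proposed set's span to the kernel is the same equal-row-space test described earlier.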

That's about every case I can think of at the moment without getting into more abstract spaces such as function spaces. Make sure you understand these procedures in terms of the fundamental definition... a basis is a linearly independent spanning set. All of these procedures directly apply this definition at their core.

This should all be outlined in your textbook, and is available on Wikipedia. Just google it.

Regards,
James Baugh
 


thank you sir very much
 
