Is β a Basis for F^n if Det(B) ≠ 0?

SUMMARY

The discussion centers on the proof that a set of vectors β = {u1, u2, ..., un} in F^n forms a basis if and only if the determinant of the matrix B, constructed from these vectors as columns, is non-zero (det(B) ≠ 0). It is established that linear independence of the vectors in β implies that the linear transformation represented by B is one-to-one, which is equivalent to B being invertible. The proof relies on the fact that the only solution to the equation Bc = 0 is the trivial solution, confirming that the kernel of B is {0} and thus det(B) must be non-zero.

PREREQUISITES
  • Understanding of linear independence in vector spaces
  • Knowledge of determinants and their properties in linear algebra
  • Familiarity with the concept of invertible matrices
  • Basic understanding of linear transformations and their kernels
NEXT STEPS
  • Study the properties of determinants in linear algebra
  • Learn about the relationship between linear transformations and their matrices
  • Explore the concept of the kernel of a linear transformation
  • Investigate the implications of a matrix being invertible in various contexts
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, as well as educators looking to clarify concepts related to vector spaces and matrix theory.

rubixcircle
Let β = {u1, u2, ..., un} be a subset of F^n containing n distinct vectors, and let B be the n×n matrix over F having uj as its j-th column.

Prove that β is a basis for F^n if and only if det(B) ≠ 0.

For one direction of the proof I discussed this with a peer:

Since β consists of n vectors, β is a basis if and only if these vectors are linearly independent, which is equivalent to the map L_B being one-to-one. Since the matrix B is square, this is in turn equivalent to B being invertible, hence having a nonzero determinant.

However I do not understand the transition from the vectors being linearly independent to being one to one. Why is this true? Also, how do I prove the reverse direction?
rubixcircle said:
However I do not understand the transition from the vectors being linearly independent to being one to one. Why is this true? Also, how do I prove the reverse direction?

To answer your first question: for the forward direction, β is assumed to be a basis for F^n, and the matrix B = [(u1) (u2) (u3) ... (un)] has as its columns the vectors in β.

Let c = (c1, c2, ..., cn)^T be a column vector of scalars, and consider the equation Bc = 0. The key observation is that Bc = c1·u1 + c2·u2 + ... + cn·un, so Bc = 0 is precisely a linear relation among the columns of B. In other words, ker(L_B) = {0} if and only if the uj are linearly independent, and a linear map is one-to-one exactly when its kernel is trivial. That is the transition you were asking about.

Since β is a basis (by assumption) of an n-dimensional space, its n vectors are linearly independent, so the only solution to Bc = 0 is c = 0. (I.e., c1 = 0, c2 = 0, ..., cn = 0.)

From this, we see that ker(B) = {0}, and since B represents a map from F^n to F^n, a space of the same finite dimension on both sides, rank-nullity makes L_B onto as well as one-to-one. Hence B is invertible, so |B| [itex]\neq[/itex] 0.

For the reverse direction, run the argument backwards: if det(B) ≠ 0, then B is invertible, so Bc = 0 forces c = B^{-1}·0 = 0. By the expansion above, that means u1, ..., un are linearly independent, and n linearly independent vectors in the n-dimensional space F^n automatically span it, so β is a basis.
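As a quick numerical sanity check of the equivalence (this sketch and the helper name `det` are mine, not from the thread), here is a small Python script that computes a determinant exactly by Gaussian elimination over the rationals, then applies it to one independent set of column vectors and one dependent set:

```python
from fractions import Fraction

def det(matrix):
    """Determinant via exact Gaussian elimination over the rationals."""
    a = [[Fraction(x) for x in row] for row in matrix]
    n = len(a)
    sign = 1
    for col in range(n):
        # find a nonzero pivot at or below the diagonal
        pivot = next((r for r in range(col, n) if a[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)            # no pivot: matrix is singular
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            sign = -sign                  # a row swap flips the sign
        for r in range(col + 1, n):
            factor = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= factor * a[col][c]
    result = Fraction(sign)
    for i in range(n):
        result *= a[i][i]                 # product of the diagonal
    return result

# The u_j are written as rows here and transposed so they become columns of B.
u = [[1, 0, 1], [0, 1, 1], [0, 0, 1]]     # linearly independent in Q^3
B = [list(col) for col in zip(*u)]
print(det(B))                             # nonzero, so β is a basis

v = [[1, 2, 3], [2, 4, 6], [0, 1, 1]]     # v2 = 2·v1, so dependent
C = [list(col) for col in zip(*v)]
print(det(C))                             # zero, so not a basis
```

The only solutions of Bc = 0 being trivial is exactly what the elimination detects: a dependent column produces a pivotless column, and the determinant collapses to zero.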