Generalizing Linear Independence: Beyond R^n and into Matrix Spaces

  • Context: Graduate
  • Thread starter: schaefera
  • Tags: Linear
SUMMARY

This discussion focuses on generalizing the concept of linear independence from R^n to matrix spaces, specifically R^(mxn). To test whether a set of matrices in R^(mxn) is linearly independent, one flattens each matrix into a vector of length mn, stacks those vectors into a matrix, and performs a reduced-row echelon reduction to assess its rank. The rank equals the number of linearly independent matrices in the set. The discussion also touches on linear independence in other vector spaces, including spaces of continuous functions and polynomials.

PREREQUISITES
  • Understanding of linear independence in R^n
  • Familiarity with matrix operations, specifically determinants and row echelon forms
  • Knowledge of vector spaces and their dimensions
  • Basic concepts of polynomial functions and continuous functions
NEXT STEPS
  • Explore the concept of rank in matrix theory
  • Learn about reduced-row echelon form and its applications
  • Investigate linear independence in function spaces, particularly polynomial spaces
  • Study the implications of linear independence in infinite-dimensional vector spaces
USEFUL FOR

Mathematicians, educators, and students in linear algebra, particularly those interested in advanced topics related to vector spaces and matrix theory.

schaefera
To test whether n vectors in R^n are linearly independent, you put those vectors in an n x n matrix and take its determinant: the vectors are independent exactly when the determinant is non-zero.
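The thread doesn't include code, but the determinant test for n vectors in R^n can be sketched with NumPy (the array values here are illustrative, not from the thread):

```python
import numpy as np

# Columns are the three vectors being tested for independence in R^3.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# A non-zero determinant means the columns are linearly independent.
independent = not np.isclose(np.linalg.det(A), 0.0)
```

Here the matrix is upper triangular with non-zero diagonal, so its determinant is non-zero and the columns are independent.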

How can this be generalized beyond vectors in R^n-- say to the space of matrices in R^(mxn)?
 
Hey schaefera.

In this new space do you have mn vectors in the mn-dimensional space R^(mxn)? If so you do exactly the same thing, except your matrix is (mn) x (mn): flatten each matrix into a vector of length mn and use those as the rows.

If you want to check whether any set of vectors is linearly independent (even a set with fewer vectors than the dimension of the space), simply put the vectors in a matrix, do a reduced-row echelon reduction on the matrix, and see what its rank is. The rank gives you the number of linearly independent vectors in the set you entered.
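The flatten-and-rank procedure described above can be sketched with NumPy; the three 2x2 matrices below are made up for illustration (the third is the sum of the first two, so the set is deliberately dependent):

```python
import numpy as np

# Three 2x2 matrices, treated as vectors in R^(2x2) by flattening to length 4.
M1 = np.array([[1, 0], [0, 0]])
M2 = np.array([[0, 1], [0, 0]])
M3 = np.array([[1, 1], [0, 0]])   # M3 = M1 + M2, so the set is dependent

# Stack the flattened matrices as rows and compute the rank.
stacked = np.vstack([M.flatten() for M in (M1, M2, M3)])
rank = np.linalg.matrix_rank(stacked)

# rank is 2: only two of the three matrices are linearly independent.
```

`matrix_rank` uses an SVD rather than an explicit row reduction, but it reports the same rank the reduced-row echelon form would.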
 
How about for the space of continuous functions? Polynomials? Does the method ever break down?
 
The question of linear independence of a finite number of vectors can be thought of as asking about solutions to the equation
$c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \dots + c_n\mathbf{v}_n = \mathbf{0}$,
where $\mathbf{v}_i \in V$ are the vectors you are testing for linear independence and $c_i \in F$ are scalars in your field $F$. The vectors $\mathbf{v}_i$ are linearly independent if and only if the only solution is the one where all the scalars $c_1, c_2, \dots, c_n$ are zero. Said another way:
$c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \dots + c_n\mathbf{v}_n = \mathbf{0} \Rightarrow c_1 = c_2 = \dots = c_n = 0.$
So to test a set of vectors for linear independence, you set up the above equation and check whether it forces all the scalars $c_i$ to be zero. How exactly you go about checking that depends on which vector space you're dealing with. As for the case of infinitely many vectors, I'm not 100% sure, so I won't comment.
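For the polynomial spaces raised earlier in the thread, the check in the previous post reduces to a finite-dimensional one: a polynomial of degree at most n is determined by its n+1 coefficients, so testing polynomials for independence amounts to testing their coefficient vectors in R^(n+1). A minimal sketch, with example polynomials chosen here for illustration:

```python
import numpy as np

# Polynomials in P_2, written as coefficient vectors (constant term first):
# p1 = 1, p2 = x, p3 = 1 + x^2
p1 = [1, 0, 0]
p2 = [0, 1, 0]
p3 = [1, 0, 1]

# Independence of the polynomials == independence of the coefficient rows.
coeffs = np.array([p1, p2, p3])
rank = np.linalg.matrix_rank(coeffs)
independent = (rank == len(coeffs))
```

This reduction works because P_n is finite-dimensional; for the full space of continuous functions no such finite coordinate representation exists, which is where the matrix methods stop applying directly.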
 
What Gamble93 gives is the usual definition of "linear independence". Requiring that a matrix have a non-zero determinant is a more specific property, tied to having n vectors in R^n.
 
I realized after my post that I should have mentioned that a non-zero determinant implying linear independence is a special case that applies to $\mathbb{R}^n$, but I was late for class so it slipped my mind. Excuse my ignorance.
 