Measures of Linear Independence?

  • #1
LarryS
Gold Member
My formal education in Linear Algebra was lacking, so I have been studying that subject lately, especially the subject of Linear Independence.

I'm looking for functions that would qualify as measures of linear independence.

Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal; B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)? Suggestions?

Thanks in advance.
 

Answers and Replies

  • #2
fresh_42
Mentor
Insights Author
2022 Award
Not sure why you're looking for one. Normally I would say rank or dimension of the subspace, but since you have two bases here, there is actually no difference and hence no natural measure. It is just a matter of taste that we consider orthonormal bases convenient and other bases not. Mathematically, there is no difference. If you like, you could define an order, e.g. by the lengths of the vectors, and then add the absolute values of the angles they pairwise form. I doubt that this would be of any interest, other than trying to capture what might be called a sense of beauty. Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.
 
  • #3
WWGD
Science Advisor
Gold Member
Linear independence is a yes/no condition. I can't see how to compare two independent sets, other than, as fresh suggested, by the size of the independent set. Moreover, if we have two linearly independent sets of the same size, there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".
 
  • #4
Stephen Tashi
Science Advisor
Moreover, if we have two linearly independent sets there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".

Perhaps we should look at functions of such invertible matrices. If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?
 
  • #5
StoneTemplePython
Science Advisor
Gold Member
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Indeed

If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?

Yes, and for convenience we may want to look at the squared determinants of ##\mathbf A## and ##\mathbf B##. A direct application of Hadamard's inequality tells us that

##1 = \det(\mathbf A)^2 \gt \det(\mathbf B)^2##

(the above relationship holds under any permutation of the vectors in A and any permutation of the vectors in B.)
- - - -
This nicely lines up with the intuition that, once you have an inner product, the 'stuff' that makes a vector linearly independent is the portion of it that is orthogonal to all the other vectors in your collection (i.e. the diagonal entries of ##\mathbf R## in a ##\mathbf{QR}## factorization or another application of Gram-Schmidt).
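As a quick numerical check of the inequality above (a minimal pure-Python sketch, not part of the original post; the 60° angle between the two unit columns is an arbitrary choice):

```python
import math

def det2(u, v):
    # determinant of the 2x2 matrix whose columns are u and v
    return u[0] * v[1] - u[1] * v[0]

theta = math.pi / 3  # 60 degrees between the two unit columns of B

# A: orthonormal pair of columns; B: unit columns, not orthogonal
A = [(1.0, 0.0), (0.0, 1.0)]
B = [(1.0, 0.0), (math.cos(theta), math.sin(theta))]

det_A = det2(*A)  # 1.0
det_B = det2(*B)  # sin(theta), about 0.866

# Hadamard: |det| <= product of the column norms (= 1 here),
# with equality exactly when the columns are orthogonal
assert det_A ** 2 == 1.0
assert det_B ** 2 < 1.0
```

The squared determinant of B equals ##\sin^2\theta##, which shrinks toward 0 as the two columns approach linear dependence.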
 
  • #6
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Linear independence doesn't depend on which inner product you define. A set is either linearly independent or not.

You could measure the relative orthogonality of your vectors; perhaps define something based on the angles obtained from the inner products.
 
  • #7
LarryS
Gold Member
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all vectors are unit vectors, since linear independence only cares about the relative directions of the vectors.
 
  • #8
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
2022 Award
Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all vectors are unit vectors, since linear independence only cares about the relative directions of the vectors.

The concept of a unit vector only makes sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.

Given an inner product, you can talk about how close to orthonormality a basis is. But linear independence itself cannot be quantified; it's a yes or no.
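The claim above, that any linearly independent set can be declared orthonormal by a suitable choice of inner product, can be made concrete: if ##\mathbf M## is the invertible matrix whose columns are the basis vectors, then ##\langle x, y\rangle := x^T(\mathbf M\mathbf M^T)^{-1}y## is an inner product in which those columns are orthonormal. A minimal hand-rolled sketch (not from the thread; the 60° pair is an arbitrary choice):

```python
# Columns of M: unit vectors 60 degrees apart, not orthogonal in the usual sense
c = 0.5
M = [[1.0, c], [0.0, (1 - c * c) ** 0.5]]

def matmul(A, Bm):
    return [[sum(A[i][k] * Bm[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def inv2(A):
    # inverse of a 2x2 matrix via the adjugate formula
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]

# B = (M M^T)^{-1} defines the new inner product <x, y> = x^T B y
B = inv2(matmul(M, transpose(M)))

def inner(x, y):
    return sum(x[i] * B[i][j] * y[j] for i in range(2) for j in range(2))

u = [M[0][0], M[1][0]]  # first column of M
v = [M[0][1], M[1][1]]  # second column of M

# In this inner product the columns are orthonormal: 1, 1, 0 up to rounding
print(inner(u, u), inner(v, v), inner(u, v))
```

The identity behind it: ##\mathbf M^T(\mathbf M\mathbf M^T)^{-1}\mathbf M = \mathbf I## whenever ##\mathbf M## is invertible, so the Gram matrix of the columns in the new inner product is exactly the identity.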
 
  • #9
LarryS
Gold Member
Given an inner product, you can talk about how close to orthonormality a basis is. But linear independence itself cannot be quantified; it's a yes or no.

Maybe "Orthogonality Measure" would be a more accurate name.
 
  • #10
fresh_42
Mentor
Insights Author
2022 Award
Maybe "Orthogonality Measure" would be a more accurate name.
Not really. In this case everything depends on what counts as orthogonal, and therefore on how to measure angels. You probably have the standard example in mind, where orthogonal is what we usually draw as perpendicular. But this holds only if the underlying bilinear form is given by the identity matrix and we use the Euclidean norm. However,
The concept of unit vector only makes any sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.
meant that you're not restricted to this one specific case; one can use other inner products and norms.

To summarize:
  • There are no grades of linear independence: either a set of vectors is linearly independent or it is not.
  • There are many ways to define angles and lengths.
  • We often work with the norm that corresponds to the Pythagorean theorem, the Euclidean norm. In this case one can use the oriented volume of the parallelepiped spanned by the vectors, the determinant, as a measure of deviation from one, the determinant of the identity matrix. However, I do not see any purpose in introducing another name for the standard orthonormal basis, or a measure for those bases which differ from it.
  • The general case is studied via bilinear forms or, more generally, multilinear forms.
  • The study of the behavior of determinants, viewed as algebraic equations, belongs to algebraic geometry, which goes far deeper than the simple comparison of some arbitrary numbers.
 
Last edited:
  • #11
WWGD
Science Advisor
Gold Member
Not really. In this case everything depends on what counts as orthogonal, and therefore on how to measure angels. ...
  • There are many ways to define angels and ...
Hell's or Anaheim's?
 
  • #13
WWGD
Science Advisor
Gold Member
Thanks for correction, and neither. Juice Newton's.
aka "Of the morning"? I prefer to be one of the afternoon, but it is not possible at the moment.
 
  • #14
Stephen Tashi
Science Advisor
Another thing we could look at is the "condition number" of the matrix whose columns are the components of the basis vectors.
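For two unit columns with cosine ##c## between them, the Gram matrix is ##\begin{pmatrix}1 & c\\ c & 1\end{pmatrix}## with eigenvalues ##1\pm|c|##, so the 2-norm condition number of the matrix is ##\sqrt{(1+|c|)/(1-|c|)}##: exactly 1 for an orthonormal pair, and blowing up as the vectors approach linear dependence. A minimal pure-Python sketch of that closed form (not from the thread):

```python
import math

def cond_unit_pair(cos_angle):
    # 2-norm condition number of the 2x2 matrix whose columns are
    # unit vectors with the given cosine between them.  The Gram
    # matrix [[1, c], [c, 1]] has eigenvalues 1 +/- |c|, so the
    # singular values of the matrix are sqrt(1 +/- |c|).
    c = abs(cos_angle)
    return math.sqrt((1 + c) / (1 - c))

print(cond_unit_pair(0.0))                          # orthonormal pair: 1
print(cond_unit_pair(math.cos(math.radians(60))))   # 60 degrees: sqrt(3)
```

Unlike the determinant, the condition number also penalizes a basis that is nearly dependent in only one direction, which is why it is the standard numerical-stability measure for a basis matrix.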
 
