Measures of Linear Independence?

Discussion Overview

The discussion revolves around the concept of linear independence in the context of linear algebra, specifically seeking measures that can differentiate between two linearly independent sets of vectors. The participants explore various mathematical functions and properties that could serve as real-valued measures of linear independence, considering both orthogonal and non-orthogonal bases.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants suggest that traditional measures like rank or dimension do not differentiate between two bases since both are linearly independent.
  • One participant proposes the volume of the parallelepiped spanned by the vectors as a potential measure of linear independence.
  • Another participant mentions the determinant of the matrix formed by the vectors as a measure, particularly when considering unit vectors.
  • There is a discussion about the relationship between the determinant and the orthogonality of the vectors, with references to Hadamard's Inequality.
  • Some participants argue that linear independence is a binary condition and cannot be quantified beyond being either independent or dependent.
  • Others propose that measures of relative orthogonality, such as angles derived from inner products, could provide insight into the structure of the bases.
  • A suggestion is made to consider the "condition number" of the matrix formed by the basis vectors as another potential measure.

Areas of Agreement / Disagreement

Participants express differing views on whether linear independence can be measured in a nuanced way. While some agree that linear independence is a yes/no condition, others explore various mathematical constructs that could serve as measures, indicating that the discussion remains unresolved with multiple competing perspectives.

Contextual Notes

Limitations include the dependence on the choice of inner product for defining concepts like unit vectors and orthogonality, as well as the unresolved nature of how to quantify linear independence beyond its binary classification.

LarryS
My formal education in Linear Algebra was lacking, so I have been studying that subject lately, especially the subject of Linear Independence.

I'm looking for functions that would qualify as measures of linear independence.

Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal and B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)? Suggestions?

Thanks in advance.
 
referframe said:
My formal education in Linear Algebra was lacking, so I have been studying that subject lately, especially the subject of Linear Independence.

I'm looking for functions that would qualify as measures of linear independence.

Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal and B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)? Suggestions?

Thanks in advance.
Not sure why you're looking for one. Normally I would say rank or dimension of the subspace, but as you have two bases here, there is actually no difference and hence no natural measure. It is just a matter of taste that we consider an orthonormal basis convenient and the other not; mathematically, there is no difference. If you like, you could define an order, e.g. by the lengths of the vectors, and then add the absolute values of the angles they pairwise form. I doubt that this would be of any interest, other than trying to capture what might be called a sense of beauty. Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.
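The volume idea can be sketched numerically with NumPy. The vectors below are made up for illustration: an orthonormal set and a skewed set of unit vectors, compared by the absolute determinant of the matrix whose columns they form.

```python
import numpy as np

# Columns of A: the standard orthonormal basis of R^3.
A = np.eye(3)

# Columns of B: unit vectors that are linearly independent but not orthogonal.
B = np.array([
    [1.0, 0.6, 0.6],
    [0.0, 0.8, 0.0],
    [0.0, 0.0, 0.8],
])

# Volume of the spanned parallelepiped = |det| of the column matrix.
vol_A = abs(np.linalg.det(A))
vol_B = abs(np.linalg.det(B))
print(vol_A, vol_B)  # 1.0 for the orthonormal set, ~0.64 for the skewed one
```

The skewed basis spans a flatter parallelepiped, so its volume drops below 1, exactly the kind of m(A) > m(B) the question asks for.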
 
The condition of linear independence is a yes/no condition. I can't see how to compare, other than as fresh_42 suggested, by the size of the independent set. Moreover, if we have two linearly independent sets of the same size, there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".
 
WWGD said:
Moreover, if we have two linearly-independent sets there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".

Perhaps we should look at functions of such invertible matrices. If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?
 
fresh_42 said:
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Indeed

Stephen Tashi said:
If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?

Yes, and for convenience we may want to look at the squared determinant of ##A## vs. ##B##. Direct application of Hadamard's inequality tells us that

##1 = \det(\mathbf A)^2 \gt \det(\mathbf B)^2##

(the above relationship holds under any permutation of the vectors in A and any permutation in B.)
- - - -
This nicely lines up with the intuition that if you have an inner product, the 'stuff' that makes a vector linearly independent is the portion of it that is orthogonal to all the other vectors in your collection (i.e. looking at the diagonal entries of ##\mathbf R## in a ##\mathbf {QR}## factorization or another application of Gram-Schmidt).
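Both points can be checked on a small made-up example: two unit columns in the plane, where Hadamard's bound of 1 is strict, and where the diagonal of ##\mathbf R## recovers the determinant.

```python
import numpy as np

# Two unit vectors in R^2 whose inner product is 0.8 (not orthogonal).
B = np.array([
    [1.0, 0.8],
    [0.0, 0.6],
])  # columns have norm 1

# Hadamard: |det B| <= product of the column norms = 1,
# with equality only for orthogonal columns.
det_sq = np.linalg.det(B) ** 2
print(det_sq)  # ~0.36, strictly below the orthonormal value 1

# Diagonal of R in B = QR: the length of each column's component
# orthogonal to the span of the previous columns.
Q, R = np.linalg.qr(B)
print(abs(np.prod(np.diag(R))))  # equals |det B| = 0.6
```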
 
Linear independence doesn't depend on which inner product you define. A set is either linearly independent or not.

You could, however, measure the relative orthogonality of your vectors, for example by defining something on the angles obtained from the inner products.
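One way to make that concrete (a sketch, with vectors chosen arbitrarily): for unit vectors, the standard inner product is the cosine of the angle between them, so the off-diagonal entries of the Gram matrix measure relative orthogonality directly.

```python
import numpy as np

# Columns are unit vectors; the Gram matrix M^T M collects all
# pairwise inner products, i.e. the cosines of the pairwise angles.
M = np.array([
    [1.0, 0.8],
    [0.0, 0.6],
])

G = M.T @ M
angle = np.degrees(np.arccos(G[0, 1]))
print(angle)  # about 36.87 degrees; an orthogonal pair would give 90
```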
 
fresh_42 said:
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all the vectors are unit vectors, since linear independence only cares about the relative directions of vectors.
 
referframe said:
Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all the vectors are unit vectors, since linear independence only cares about the relative directions of vectors.

The concept of a unit vector only makes sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.

Given an inner product, you can talk about how close to orthonormality a basis is. But, linear independence itself cannot be quantified. It's a yes or no.
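That first point can be verified directly: for any invertible column matrix ##\mathbf B##, the bilinear form with matrix ##(\mathbf B \mathbf B^T)^{-1}## is an inner product under which the columns of ##\mathbf B## come out orthonormal. A small sketch with arbitrary vectors:

```python
import numpy as np

# Any linearly independent set can be declared orthonormal by choosing
# the inner product <x, y> = x^T W y with W = (B B^T)^{-1}.
B = np.array([
    [1.0, 0.6, 0.6],
    [0.0, 0.8, 0.0],
    [0.0, 0.0, 0.8],
])  # columns: a linearly independent, non-orthogonal set

W = np.linalg.inv(B @ B.T)          # symmetric positive definite
G = B.T @ W @ B                     # Gram matrix in the new inner product
print(np.allclose(G, np.eye(3)))   # True: the columns are now orthonormal
```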
 
PeroK said:
Given an inner product, you can talk about how close to orthonormality a basis is. But, linear independence itself cannot be quantified. It's a yes or no.

Maybe "Orthogonality Measure" would be a more accurate name.
 
referframe said:
Maybe "Orthogonality Measure" would be a more accurate name.
Not really. In this case everything depends on what counts as orthogonal, and therefore on how to measure angles. You probably have the standard example in mind, where orthogonal is what we usually draw as perpendicular. But that is the case only if the underlying bilinear form is given by the identity matrix and the Euclidean norm is used. However,
PeroK said:
The concept of unit vector only makes any sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.
meant that you're not restricted to this one specific case; one can use other inner products and norms.

To summarize:
  • There are no grades of linear independence: a set of vectors either is or is not linearly independent.
  • There are many ways to define angles and lengths.
  • We often work with the norm corresponding to the Pythagorean theorem, the Euclidean norm. In this case one can use the oriented volume of the parallelepiped spanned by the vectors, i.e. the determinant, as a measure of the deviation from one, the determinant of the identity matrix. However, I do not see any purpose in introducing another name for the standard orthonormal basis, or a measure for those bases which differ from it.
  • The general case is studied via the behavior and concept of bilinear forms, or more generally multilinear forms.
  • The study of the behavior of determinants, as systems of algebraic equations, is covered by algebraic geometry, which goes far deeper than the simple comparison of a few arbitrary numbers.
 
Last edited:
fresh_42 said:
Not really. In this case all depends on what is orthogonal, and therefore on how to measure angels. ...
  • There are many ways to define angels and ...
Hell's or Anaheim's?
 
WWGD said:
Hell's or Anaheim's?
Thanks for correction, and neither. Juice Newton's.
 
fresh_42 said:
Thanks for correction, and neither. Juice Newton's.
aka "Of the morning"? I prefer to be one of the afternoon, but it is not possible at the moment.
 
Another thing we could look at is the "condition number" of the matrix whose columns are the components of the basis vectors.
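As a sketch of how that behaves (with made-up matrices): the condition number is exactly 1 for an orthonormal basis and grows without bound as the vectors approach linear dependence.

```python
import numpy as np

A = np.eye(3)                       # orthonormal basis
B = np.array([
    [1.0, 0.6, 0.6],
    [0.0, 0.8, 0.0],
    [0.0, 0.0, 0.8],
])                                  # skewed unit-vector basis
nearly_dependent = np.array([
    [1.0, 1.0],
    [0.0, 1e-8],
])                                  # two almost-parallel vectors

print(np.linalg.cond(A))                 # 1.0
print(np.linalg.cond(B) > 1)             # True
print(np.linalg.cond(nearly_dependent))  # very large, on the order of 1e8
```

Unlike the determinant, the condition number punishes near-dependence between any pair of vectors rather than just shrinking the overall volume, which is why it is the standard diagnostic in numerical work.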
 
