# Measures of Linear Independence?

Gold Member
My formal education in linear algebra was lacking, so I have been studying the subject lately, especially linear independence.

I'm looking for functions that would qualify as measures of linear independence.

Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal and B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)? Suggestions?

Mentor
2022 Award
My formal education in linear algebra was lacking, so I have been studying the subject lately, especially linear independence.

I'm looking for functions that would qualify as measures of linear independence.

Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal and B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)? Suggestions?

Not sure why you're looking for one. Normally I would say rank or dimension of the subspace, but since you have two bases here, there is actually no difference and hence no natural measure. It is just a matter of taste that we consider orthonormal bases convenient and others not; mathematically, there is no difference. If you like, you could define an order, e.g. by the lengths of the vectors, and then add the absolute values of the angles they pairwise form. I doubt this would be of any interest, other than as an attempt to quantify what might be called a sense of beauty. Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.
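A quick NumPy sketch of that last idea (the particular vectors are just an illustrative choice, not from the thread): rescale the basis vectors to unit length, and the absolute determinant of the matrix with those vectors as columns is the volume of the parallelepiped they span — 1 for an orthonormal basis, strictly less for a skewed one.

```python
import numpy as np

A = np.eye(3)                          # orthonormal basis of R^3 (columns)
B = np.array([[1.0, 0.9, 0.0],
              [0.0, 1.0, 0.9],
              [0.0, 0.0, 1.0]])        # linearly independent, not orthogonal
B = B / np.linalg.norm(B, axis=0)      # rescale columns to unit length

vol_A = abs(np.linalg.det(A))          # volume of the unit cube: 1.0
vol_B = abs(np.linalg.det(B))          # strictly smaller, since B is skewed
```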

Gold Member
The condition of linear independence is a yes/no condition. I can't see how to compare, other than as fresh suggested, by the size of the independent set. Moreover, if we have two linearly independent sets there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".

Perhaps we should look at functions of such invertible matrices. If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?
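As a small numerical sketch of this question (the matrices below are a made-up example): if ##T## takes the orthonormal basis ##A## to the unit-column basis ##B##, then ##\det T = \det B / \det A##, so by Hadamard's inequality ##|\det T| \le 1##.

```python
import numpy as np

A = np.eye(3)                            # orthonormal basis (columns)
B = np.array([[1.0, 0.6, 0.0],
              [0.0, 0.8, 0.6],
              [0.0, 0.0, 0.8]])          # unit-length columns, not orthogonal

T = B @ np.linalg.inv(A)                 # invertible matrix with T A = B
det_T = abs(np.linalg.det(T))            # = |det B| / |det A|, at most 1 here
```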

Gold Member
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Indeed

If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?

Yes, and for convenience we may want to look at the squared determinant of ##\mathbf A## vs ##\mathbf B##, i.e. the determinant of the Gram matrix. Direct application of Hadamard's inequality tells us that

##1 = \det\big(\mathbf A^T \mathbf A\big) \gt \det\big(\mathbf B^T \mathbf B\big)##

(the above relationship holds under any permutation of the vectors in A and any permutation of those in B.)
- - - -
This nicely lines up with the intuition that, if you have an inner product, the 'stuff' that makes a vector linearly independent is the portion of it that is orthogonal to all the other vectors in your collection (i.e. looking at the diagonal entries of ##\mathbf R## in a ##\mathbf {QR}## factorization, or another application of Gram-Schmidt).
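Both points can be checked numerically, as a sketch (random unit columns, seeded for reproducibility): the Gram determinant ##\det(\mathbf B^T\mathbf B)## stays below 1 for unit columns by Hadamard, and ##|\det \mathbf B|## equals the product of the absolute diagonal entries of ##\mathbf R## in a QR factorization.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
B /= np.linalg.norm(B, axis=0)           # unit-length columns

# Hadamard: det(B^T B) <= product of squared column norms = 1,
# with equality only for an orthogonal set
gram_det = np.linalg.det(B.T @ B)

# QR: |det B| is the product of |diag(R)| -- each diagonal entry is the part
# of a column orthogonal to the columns before it (the Gram-Schmidt residue)
Q, R = np.linalg.qr(B)
prod_diag = np.prod(np.abs(np.diag(R)))
```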

Homework Helper
Gold Member
2022 Award
Linear independence doesn't depend on which inner product you define. A set is either linearly independent or not.

You could measure the relative orthogonality of your vectors. You could perhaps define something based on the angles obtained from the inner products.
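One concrete version of that suggestion (a sketch; the helper name and the example vectors are mine): compute the pairwise angles of the set under the standard dot product, so "relative orthogonality" becomes how close those angles sit to 90°.

```python
import numpy as np

def pairwise_angles_deg(vectors):
    """Angles (degrees) between all pairs of columns, standard dot product."""
    V = vectors / np.linalg.norm(vectors, axis=0)
    cos = np.clip(V.T @ V, -1.0, 1.0)        # guard against rounding drift
    i, j = np.triu_indices(V.shape[1], k=1)  # each unordered pair once
    return np.degrees(np.arccos(cos[i, j]))

angles = pairwise_angles_deg(np.array([[1.0, 1.0],
                                       [0.0, 1.0]]))  # columns 45 degrees apart
```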

LarryS
Gold Member
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all the vectors are unit vectors, since linear independence only cares about the relative directions of the vectors.

Homework Helper
Gold Member
2022 Award
Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all the vectors are unit vectors, since linear independence only cares about the relative directions of the vectors.

The concept of a unit vector only makes sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.

Given an inner product, you can talk about how close to orthonormality a basis is. But linear independence itself cannot be quantified. It's a yes or no.
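That first point can be made concrete (a sketch; the particular matrix is illustrative): with the basis vectors as the columns of ##M##, the bilinear form ##\langle x, y \rangle = x^T (M M^T)^{-1} y## makes exactly that basis orthonormal.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 1.0]])            # linearly independent columns, not
                                      # orthonormal under the usual dot product

G = np.linalg.inv(M @ M.T)            # defines <x, y> = x^T G y
gram = M.T @ G @ M                    # inner products of the basis vectors
# gram is the identity: the columns of M are orthonormal under this inner product
```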

Gold Member
Given an inner product, you can talk about how close to orthonormality a basis is. But, linear independence itself cannot be quantified. It's a yes or no.

Maybe "Orthogonality Measure" would be a more accurate name.

Mentor
2022 Award
Maybe "Orthogonality Measure" would be a more accurate name.
Not really. In this case everything depends on what is orthogonal, and therefore on how to measure angels. You probably have the standard example in mind, where orthogonal is what we usually draw as perpendicular. But this is only the case if the underlying bilinear form is determined by the identity matrix and the Euclidean norm is used. However,
The concept of a unit vector only makes sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.
meant that you're not restricted to this one specific case; one can use other inner products and norms.

To summarize:
• There are no grades of linear independence; a set of vectors either is or is not linearly independent.
• There are many ways to define angles and lengths.
• We often work with the norm that corresponds to the theorem of Pythagoras, the Euclidean norm. In this case one can use the oriented volume of the parallelepiped spanned by the vectors, the determinant, as a measure of deviation from one, the determinant of the identity matrix. However, I do not see any purpose in introducing another name for the standard orthonormal basis, or a measure for those bases which differ from it.
• The general case is studied via the behavior and concept of bilinear forms, or more generally multilinear forms.
• The study of the behavior of determinants, as a concept of algebraic equations, is covered by algebraic geometry, which goes far deeper than the simple comparison of some arbitrary numbers.

Last edited:
LarryS
Gold Member
Not really. In this case everything depends on what is orthogonal, and therefore on how to measure angels.
• There are many ways to define angels and ...
Hell's or Anaheim's?

Mentor
2022 Award
Hell's or Anaheim's?
Thanks for the correction, and neither. Juice Newton's.