Measures of Linear Independence?

In summary, the conversation discusses the concept of linear independence and the search for a measure of linear independence. The participants suggest using the length of vectors, the volume of the parallelepiped spanned by vectors, or the determinant of a matrix as possible measures, but ultimately conclude that linear independence is a yes/no condition and cannot be quantified. They also discuss the role of inner products and norms in defining these measures.
  • #1
LarryS
My formal education in Linear Algebra was lacking, so I have been studying that subject lately, especially the subject of Linear Independence.

I'm looking for functions that would qualify as measures of linear independence.

Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal and B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)? Suggestions?

Thanks in advance.
 
  • #2
referframe said:
Specifically, given a real vector space V of finite dimension N, consider two subsets of V, A and B, both of which are linearly independent and contain N vectors each. A is also orthogonal and B is definitely not orthogonal. What would qualify as a real-valued measure of linear independence, m, for which m(A) > m(B)?
I'm not sure why you're looking for one. Normally I would say the rank or dimension of the spanned subspace, but since you have two bases here, there is no difference and hence no natural measure. It is just a matter of taste that we consider orthonormal bases convenient and other bases not; mathematically, there is no difference. If you like, you could define an order, e.g. by the lengths of the vectors, and then add the absolute values of the angles they pairwise form. I doubt this would be of any interest beyond trying to quantify what might be called a sense of beauty. Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.
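The parallelepiped-volume idea can be sketched quickly in Python with NumPy: |det| of the matrix of column vectors equals the volume (area in 2D) of the parallelepiped they span. The particular pair of unit vectors and the angles below are hypothetical choices, purely for illustration.

```python
import numpy as np

def area(theta):
    """Area spanned by two unit vectors in the plane separated by angle theta.

    Largest when the vectors are orthogonal, shrinking toward zero as they
    approach linear dependence.
    """
    u = np.array([1.0, 0.0])
    v = np.array([np.cos(theta), np.sin(theta)])
    # |det| of the matrix with u, v as columns = area of the parallelogram
    return abs(np.linalg.det(np.column_stack([u, v])))

print(area(np.pi / 2))  # 1.0: orthogonal pair
print(area(np.pi / 6))  # 0.5: less orthogonal, smaller area
```

Both pairs are linearly independent, but the volume measure distinguishes them exactly in the way the original question asks for.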
 
  • #3
Linear independence is a yes/no condition. I can't see how to compare two sets other than, as fresh_42 suggested, by the size of the independent set. Moreover, if we have two linearly independent sets of the same size, there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".
 
  • #4
WWGD said:
Moreover, if we have two linearly independent sets there is an invertible matrix that takes one to the other. In this sense the two sets are in the same "orbit".

Perhaps we should look at functions of such invertible matrices. If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?
 
  • #5
fresh_42 said:
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Indeed

Stephen Tashi said:
If the vectors in both bases each have unit length, how does the determinant of such a matrix behave?

Yes, and for convenience we may want to look at the squared determinants of ##\mathbf A## and ##\mathbf B##. A direct application of Hadamard's inequality tells us that

##1 = \det(\mathbf A)^2 \gt \det(\mathbf B)^2##

(The relationship holds under any permutation of the vectors in A and any permutation in B.)
- - - -
This lines up nicely with the intuition that, given an inner product, the 'stuff' that makes a vector linearly independent is the portion of it that is orthogonal to all the other vectors in your collection (i.e. the diagonal entries of ##\mathbf R## in a ##\mathbf {QR}## factorization, or another application of Gram-Schmidt).
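The Hadamard-inequality comparison above can be checked numerically. A minimal sketch in Python, where the non-orthogonal set B is a hypothetical example (three unit vectors built from a triangular matrix):

```python
import numpy as np

# A: an orthonormal set in R^3 (the standard basis).
A = np.eye(3)

# B: linearly independent but non-orthogonal unit vectors
# (hypothetical example, columns normalized to unit length).
B = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]], dtype=float).T
B = B / np.linalg.norm(B, axis=0)  # normalize each column

det_A_sq = np.linalg.det(A) ** 2   # = 1 for an orthonormal basis
det_B_sq = np.linalg.det(B) ** 2   # < 1 by Hadamard's inequality

# The |diagonal| of R in a QR factorization gives the "orthogonal portion"
# each vector contributes beyond the span of the previous ones;
# their product recovers |det(B)|.
_, R = np.linalg.qr(B)
print(det_A_sq, det_B_sq, np.abs(np.diag(R)))
```

For this particular B the squared determinant comes out to 1/6, strictly below the orthonormal set's value of 1, as the inequality predicts.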
 
  • #6
Linear independence doesn't depend on which inner product you define. A set is either linearly independent or not.

You could measure the relative orthogonality of your vectors, perhaps defining something based on the angles obtained from the inner product.
 
  • #7
fresh_42 said:
Wait, there is one measure: the volume of the parallelepiped spanned by those vectors.

Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all vectors are unit vectors, since linear independence only cares about the relative direction of vectors.
 
  • #8
referframe said:
Sounds interesting. How about the absolute value of the determinant of the matrix having the vectors in the set as its column vectors? I probably should have specified up front that all vectors are unit vectors, since linear independence only cares about the relative direction of vectors.

The concept of a unit vector only makes sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.

Given an inner product, you can talk about how close to orthonormality a basis is. But, linear independence itself cannot be quantified. It's a yes or no.
 
  • #9
PeroK said:
Given an inner product, you can talk about how close to orthonormality a basis is. But, linear independence itself cannot be quantified. It's a yes or no.

Maybe "Orthogonality Measure" would be a more accurate name.
 
  • #10
referframe said:
Maybe "Orthogonality Measure" would be a more accurate name.
Not really. In this case everything depends on what counts as orthogonal, and therefore on how angles are measured. You probably have the standard example in mind, where orthogonal means what we usually draw as perpendicular. But that is only the case if the underlying bilinear form is given by the identity matrix and we use the Euclidean norm. However,
PeroK said:
The concept of a unit vector only makes sense given an inner product or norm. Given any set of linearly independent vectors, you can simply define them to be orthonormal by a suitable choice of inner product.
meant that you're not restricted to this one specific case; one can use other inner products and norms.

To summarize:
  • There are no grades of linear independence: a set of vectors either is or is not linearly independent.
  • There are many ways to define angles and lengths.
  • We often work with the norm corresponding to the theorem of Pythagoras, the Euclidean norm. In that case one can use the oriented volume of the parallelepiped spanned by the vectors, i.e. the determinant, as a measure of deviation from one, the determinant of the identity matrix. However, I see no purpose in introducing another name for the standard orthonormal basis, or a measure for bases that differ from it.
  • The general case is studied via bilinear forms, or more generally multilinear forms.
  • The behavior of determinants, as a matter of algebraic equations, is covered by algebraic geometry, which goes far deeper than comparing a few arbitrary numbers.
 
Last edited:
  • #11
fresh_42 said:
Not really. In this case all depends on what is orthogonal, and therefore on how to measure angels. You have probably the standard example in mind, where orthogonal is ...
  • There are many ways to define angels and ...
Hell's or Anaheim's?
 
  • #12
WWGD said:
Hell's or Anaheim's?
Thanks for correction, and neither. Juice Newton's.
 
  • #13
fresh_42 said:
Thanks for correction, and neither. Juice Newton's.
aka "Of the morning"? I prefer to be one of the afternoon, but it is not possible at the moment.
 
  • #14
Another thing we could look at is the "condition number" of the matrix whose columns are the components of the basis vectors.
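The condition number is straightforward to compute with NumPy: it equals 1 for an orthonormal basis and grows without bound as the columns approach linear dependence. The two example matrices below are hypothetical illustrations.

```python
import numpy as np

# Condition number as a measure of "how close to degenerate" a basis is.
orthonormal = np.eye(2)                       # standard basis of R^2
nearly_dependent = np.array([[1.0, 1.0],
                             [0.0, 1e-6]])    # columns almost parallel

print(np.linalg.cond(orthonormal))       # 1.0: perfectly conditioned
print(np.linalg.cond(nearly_dependent))  # very large: nearly dependent
```

Unlike the determinant, the condition number is insensitive to rescaling the whole set, which is one reason it is the standard diagnostic in numerical linear algebra.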
 

1. What are Measures of Linear Independence?

Measures of Linear Independence are mathematical tools used to determine the degree to which a set of vectors in a vector space is independent. They are used to assess the linear relationships between different variables in a dataset.

2. Why are Measures of Linear Independence important?

Measures of Linear Independence are important because they allow us to determine the dimensionality of a dataset and identify any redundant or highly correlated variables. This helps in simplifying complex datasets and improving the accuracy of statistical models.

3. How are Measures of Linear Independence calculated?

There are different measures of linear independence, such as the determinant of a matrix, the rank of a matrix, and the condition number of a matrix. These measures are calculated using mathematical equations and algorithms that involve operations such as matrix multiplication and inversion.
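The three measures named above can all be computed with standard library routines. A minimal sketch, using a hypothetical 3x3 matrix whose columns are the vectors being tested:

```python
import numpy as np

# Columns of M are the vectors under test (hypothetical example:
# upper triangular with nonzero diagonal, hence independent).
M = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 1.0]])

determinant = np.linalg.det(M)        # nonzero iff columns are independent
rank = np.linalg.matrix_rank(M)       # number of independent columns
condition_number = np.linalg.cond(M)  # sensitivity / near-degeneracy

print(determinant, rank, condition_number)
```

For a triangular matrix the determinant is just the product of the diagonal entries, here 2 * 3 * 1 = 6, and the rank equals the full column count of 3.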

4. What is the difference between linearly dependent and linearly independent variables?

Linearly dependent variables are those that can be expressed as a linear combination of other variables in the dataset. In contrast, linearly independent variables are those that cannot be expressed as a linear combination of other variables. This means that linearly independent variables provide unique information and are not redundant in the dataset.
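The rank makes this distinction concrete: when one vector is a linear combination of the others, the rank drops below the number of vectors. A small sketch with a hypothetical example:

```python
import numpy as np

# A dependent set: the third column equals the sum of the first two,
# so it adds no new information and the rank drops to 2.
dependent = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0],
                      [0.0, 0.0, 0.0]])

print(np.linalg.matrix_rank(dependent))  # 2: only two independent columns
```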

5. How do Measures of Linear Independence relate to machine learning?

Measures of Linear Independence are widely used in machine learning to identify and eliminate irrelevant or highly correlated variables from a dataset. This helps in improving the performance of machine learning algorithms and avoiding overfitting. Additionally, near-dependence among input variables (multicollinearity) can destabilize models such as linear regression, so detecting it is an important preprocessing step.
