Proof that det(M)=0 => Linear Dependence of Columns

SUMMARY

The discussion centers on proving that for a general N×N matrix M, if det(M) = 0, then the columns of M are linearly dependent. Participants emphasize the importance of understanding determinants, particularly through row operations that do not alter the determinant's value. The proof involves demonstrating that a determinant of zero indicates non-trivial solutions to the equation Ax = 0, confirming linear dependence among the column vectors. The conversation also highlights the necessity of relying on foundational definitions of determinants rather than advanced theorems.

PREREQUISITES
  • Understanding of linear algebra concepts, particularly determinants.
  • Familiarity with row operations and their effects on determinants.
  • Knowledge of linear dependence and independence of vectors.
  • Basic proficiency in matrix equations and solutions.
NEXT STEPS
  • Study the properties of determinants in linear algebra.
  • Learn about row operations and their impact on matrix determinants.
  • Explore the concept of linear dependence and how it relates to matrix equations.
  • Review the Laplace expansion for calculating determinants.
USEFUL FOR

Students of linear algebra, educators teaching matrix theory, and anyone seeking to understand the relationship between determinants and linear dependence in matrices.

bananabandana

Homework Statement


Prove that for a general N×N matrix ##M##, ##\det(M)=0 \implies## linear dependence of the columns.

Homework Equations

The Attempt at a Solution


It's not clear to me at all how to approach this. We've just started linear algebra and this was stated without proof in lecture. I have no idea how to solve this. Can someone give me a hint about a starting point?

Thanks! :)
 
bananabandana said:
Prove that for a general N×N matrix ##M##, ##\det(M)=0 \implies## linear dependence of the columns. ... Can someone give me a hint about a starting point?

Depends on what you know about determinants. Do you know there are row operations you can do that don't change the determinant or change it only by a sign? Can you show you can reduce any matrix to upper triangular form with those row operations? Then the determinant depends only on the diagonal elements of the upper triangular matrix. Relate the linear independence to the value of those diagonal elements in the upper triangular form.
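As one possible illustration (an example matrix chosen here, not taken from the problem), consider a ##3\times 3## matrix whose third column is the sum of the first two. The row operations ##R_3 \to R_3 - 2R_1## followed by ##R_3 \to R_3 + R_2## leave the determinant unchanged and give

$$ M = \begin{pmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \\ 2 & 3 & 5 \end{pmatrix} \;\longrightarrow\; \begin{pmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix}, $$

so ##\det(M) = 1\cdot 1\cdot 0 = 0##, and the zero on the diagonal of the upper triangular form reflects the column relation ##\mathbf{a}_3 = \mathbf{a}_1 + \mathbf{a}_2##.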
 
No, I didn't know that. I'll look up the method and try to go from there. Thanks!
 
It also depends on what definition of determinant you have to work from.
My old textbook (D.T. Finkbeiner II, 1966) defines it by three axioms:
1. Linearity with respect to columns; i.e. if a column vector ##v## of ##A## can be expressed as a linear combination of two vectors, ##v = a v_1 + b v_2##, and ##A_1##, ##A_2## are the matrices obtained from ##A## by replacing ##v## with ##v_1##, ##v_2## respectively, then ##\det(A) = a\det(A_1) + b\det(A_2)##.
2. If two adjacent columns are equal, then the determinant is 0.
3. ##\det(I) = 1##.
It's not hard to deduce that swapping two adjacent columns switches the sign of the determinant.
From there, you need to extend to switching non-adjacent columns, and so on.
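For instance, a sketch of that first deduction, using only axioms 1 and 2 and writing ##u, v## for two adjacent columns (all other columns held fixed):

$$ 0 = \det(\dots, u+v,\, u+v, \dots) = \det(\dots,u,u,\dots) + \det(\dots,u,v,\dots) + \det(\dots,v,u,\dots) + \det(\dots,v,v,\dots), $$

and since the first and last terms vanish by axiom 2, this gives ##\det(\dots,u,v,\dots) = -\det(\dots,v,u,\dots)##.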
 
Would this be a valid solution?

##|A|=0## implies there are non-trivial solutions to the equation ##\mathbf{A}\mathbf{x}=\vec{0}##: if ##|A|=0##, the equation has either no solutions or infinitely many solutions, and since ##\mathbf{x}=\vec{0}## is always a solution, there must be infinitely many, i.e. non-trivial solutions exist.

Matrix ##A## can be written in terms of its column vectors:
$$ A = [\mathbf{a}_{1}, \mathbf{a}_{2}, \dots, \mathbf{a}_{n}], $$ where ##\mathbf{a}_{i} \in \mathbb{R}^{N}##.

This implies that

$$ x_{1}\mathbf{a}_{1}+x_{2}\mathbf{a}_{2} + \dots + x_{n}\mathbf{a}_{n} = \vec{0} $$

for some set of ##x_{i}## which are not all zero. Therefore the column vectors of ##A## are linearly dependent if ##|A| = 0##.
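As a concrete check of that step (using the same illustrative matrix as in the earlier reply, not part of the original problem): with ##\mathbf{a}_{1} = (1,0,2)^{T}##, ##\mathbf{a}_{2} = (2,1,3)^{T}##, ##\mathbf{a}_{3} = (3,1,5)^{T}##, the vector ##\vec{x} = (1,1,-1)^{T}## is a non-trivial solution of ##\mathbf{A}\vec{x}=\vec{0}##:

$$ 1\begin{pmatrix}1\\0\\2\end{pmatrix} + 1\begin{pmatrix}2\\1\\3\end{pmatrix} - 1\begin{pmatrix}3\\1\\5\end{pmatrix} = \begin{pmatrix}0\\0\\0\end{pmatrix}, $$

which is exactly a linear dependence among the columns.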
Thanks!
 
bananabandana said:
##|A|=0## implies there are non-trivial solutions to the equation ##\mathbf{A}\mathbf{x}=\vec{0}##
How do you know that? Does it come directly from the definition of determinant that you have been taught, or from some theorem that you are allowed to quote?
 
Sorry, I was rushed and did not post the proof properly. Hopefully it should read as follows:

1. Proof that ##|A|=0 \iff## there exist non-trivial solutions to
$$ \mathbf{A}\vec{x}=\vec{0} \ (*) $$
i) If ##|A|=0##, then (via Cranmer's rule/matrix inversion) there are either no solutions or infinitely many solutions to (*).
Since ##\vec{x} = \vec{0}## is a solution, there must be an infinite number of solutions. ##\therefore |A| = 0 \implies## non-trivial solutions.

ii) Conversely, if (*) has a non-trivial solution then, together with ##\vec{x} = \vec{0}##, the solution is not unique, so ##A## cannot be invertible and therefore ##|A| = 0##.

Hope that is better :)
 
bananabandana said:
... i) If ##|A|=0##, then (via Cranmer's rule/matrix inversion) there are either no solutions or infinitely many solutions to (*). ...
Hope that is better :)
That looks ok if you are allowed to quote Cramer's rule (not Cranmer's; I believe Thomas Cranmer's rule was to keep Henry happy). The danger here is that this may be regarded as a more advanced theorem than the one you are trying to prove. This is often a difficulty when asked to prove something which is generally taken as a well known fact. In my view you should attempt to rely only on facts which are evidently more 'primitive'.
What definition of determinant have you been given?
 
bananabandana said:
The determinant has just been defined as an operation to retrieve a number from a square matrix, via the Laplace expansion (https://en.wikipedia.org/wiki/Laplace_expansion).
Then I feel you should try to derive the result directly from that definition and not appeal to any standard theorems.
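For reference, the column form of the Laplace expansion from the linked article: for any fixed column ##j## of an ##n\times n## matrix ##A##,

$$ \det(A) = \sum_{i=1}^{n} (-1)^{i+j}\, a_{ij}\, \det(A_{ij}), $$

where ##A_{ij}## is the ##(n-1)\times(n-1)## submatrix obtained by deleting row ##i## and column ##j##. One natural (though not the only) route from this definition is induction on ##n##.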
 
