MHB Proving Singular Matrix and Non-Zero Solutions: A Tutorial

Poirot1
How would I prove that if A is singular, then Av = 0 has a non-zero solution?
 
Poirot said:
How would I prove that if A is singular, then Av = 0 has a non-zero solution?

If A is singular then it isn't invertible, so by the invertible matrix theorem the columns of A are not linearly independent.
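To spell out why that settles the original question (just a sketch, writing ##c_1, \dots, c_n## for the columns of ##A##): the columns being linearly dependent means there are scalars ##a_1, \dots, a_n##, not all zero, with
$$a_1 c_1 + a_2 c_2 + \cdots + a_n c_n = 0,$$
and the left-hand side is exactly ##Av## for ##v = (a_1, \dots, a_n)^T##. So a non-trivial dependence relation among the columns is the same thing as a non-zero vector ##v## with ##Av = 0##.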
 
Jameson said:
If A is singular then it isn't invertible, so by the invertible matrix theorem the columns of A are not linearly independent.

How can I prove that the columns of an invertible matrix are linearly independent (from 'first principles')?

Thanks
 
I need to know what you've covered and what tools are available. The proof of the invertible matrix theorem is widely available all over Google so I suggest skimming through some of those proofs and then posting any followup ideas or questions.

Many of these proofs work by establishing a couple of the statements directly and then using those to imply the rest. Since the statements of the IMT are all equivalent, any one of them implies all of the others, so there are many ways to move between these ideas.

Here is an example of an answer to your question:

"Assume that for the matrix A, Row i = Row j. By interchanging these two rows, the determinant changes sign (by Property 2). However, since these two rows are the same, interchanging them obviously leaves the matrix and, therefore, the determinant unchanged. Since 0 is the only number which equals its own opposite, det A = 0"

This uses the property that switching two rows of a matrix will reverse the sign of the determinant.
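As a concrete illustration of the quoted argument (a minimal ##2 \times 2## sketch):
$$\det\begin{pmatrix} a & b \\ a & b \end{pmatrix} = ab - ba = 0,$$
and swapping the two (equal) rows returns the same matrix while flipping the sign of the determinant, which is only possible if that determinant is ##0##.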
 
I'm not quite sure how your answer pertains to my question. I see on Wikipedia there is a list of equivalent statements which comprise the invertible matrix theorem. I suppose what I want is to prove these in a non-circular manner, i.e. without invoking the invertible matrix theorem.
 
The definition of a singular matrix A, as far as I know, is a square matrix that does not have an inverse. This occurs if and only if det(A) = 0. That's my reasoning for starting with the determinant.
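For one concrete instance of that chain (a small made-up example):
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}, \qquad \det(A) = 1 \cdot 4 - 2 \cdot 2 = 0, \qquad A\begin{pmatrix} 2 \\ -1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix},$$
so this singular ##A## has a non-zero solution of ##Av = 0##, and indeed its second column is twice the first.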

Anyway, that's all I have to offer since I don't know how you want to approach it, but a handful of members here are very knowledgeable about linear algebra, so hopefully one of them can comment further.
 