# List of Non-Singular Equivalencies

I was compiling a list of non-singular equivalencies. This is all I have so far. I would appreciate it if you could help me add more.

Let A be an n × n matrix. The following statements are equivalent; that is, for a given A, they are either all true or all false.

1. A is non-singular.
2. A is row equivalent to I_n.
3. Ax = 0 has only the trivial solution.
4. Ax = b has a unique solution for each vector b in R^n.
5. Ax = b has at least one solution for each vector b in R^n.
6. det(A) ≠ 0.
7. The column vectors of A form a linearly independent set in R^n.
8. A^T is non-singular.
9. The column vectors of A span R^n.
10. The column vectors of A form a basis for R^n.
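As an aside, all of these can be checked numerically. A minimal NumPy sketch (the matrix `A` below is just an arbitrary non-singular example):

```python
import numpy as np

# arbitrary example matrix; any non-singular matrix works here
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = A.shape[0]

# (6) det(A) != 0
assert abs(np.linalg.det(A)) > 1e-12

# (7)/(9)/(10) the columns are linearly independent and span R^n
assert np.linalg.matrix_rank(A) == n

# (3) Ax = 0 has only the trivial solution; since A is
# invertible, solve returns exactly that solution
x0 = np.linalg.solve(A, np.zeros(n))
assert np.allclose(x0, 0)

# (4) Ax = b has a unique solution for each b
b = np.array([1.0, -1.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```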

I am thinking in the range of topics like basis, dimension, rank, column space, row space, null space, etc.

Any help is highly appreciated.

## Answers and Replies

Deveno
Science Advisor
some more:

rank(A) = n.
ker(A) (the null space of A) = {0}.
there exists an n × n matrix B such that AB = I_n.
the row space of A is R^n.
there exists an n × n matrix B such that BA = I_n.
if AB = 0, then B = 0.
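These, too, are easy to verify numerically. A small NumPy sketch (again with an arbitrary non-singular example matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # det = -2, so A is non-singular
n = A.shape[0]

# rank(A) = n
assert np.linalg.matrix_rank(A) == n

# ker(A) = {0}: by rank-nullity, dim ker(A) = n - rank(A) = 0
assert n - np.linalg.matrix_rank(A) == 0

# there is a B with AB = BA = I_n, namely the inverse
B = np.linalg.inv(A)
assert np.allclose(A @ B, np.eye(n))
assert np.allclose(B @ A, np.eye(n))
```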

AlephZero
Science Advisor
Homework Helper
The eigenvalues of A are all nonzero.
The singular values of A are all nonzero.
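For instance (using a rotation matrix with θ = π/2 as an example; NumPy computes eigenvalues over C):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # rotation matrix with theta = pi/2; non-singular

# all eigenvalues are non-zero (here the complex pair +i, -i)
eigvals = np.linalg.eigvals(A)
assert np.all(np.abs(eigvals) > 1e-12)

# all singular values are non-zero (for an orthogonal matrix they all equal 1)
svals = np.linalg.svd(A, compute_uv=False)
assert np.all(svals > 1e-12)
```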

Deveno
Science Advisor
the constant term of the characteristic polynomial for A is non-zero.
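As a quick sanity check: the constant term of det(λI − A) is (−1)^n det(A), so it vanishes exactly when A is singular. A NumPy sketch (the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]

# np.poly gives the coefficients of det(lambda*I - A), leading coefficient first
coeffs = np.poly(A)
constant_term = coeffs[-1]

# the constant term equals (-1)^n det(A), so it is non-zero
# exactly when det(A) is non-zero, i.e. when A is non-singular
assert np.isclose(constant_term, (-1) ** n * np.linalg.det(A))
assert abs(constant_term) > 1e-12
```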

> I am thinking in the range of topics like basis, dimension, rank, column space, row space, null space, etc.
What is your thinking with respect to the type of the matrix elements: real, complex, from a Galois (finite) field, ...?

Characterizations in terms of eigenvalues depend on this. There are real 2×2 matrices, the rotations of the Euclidean plane, that have no real eigenvalues at all (whenever the rotation angle is not a multiple of π). So, are all their eigenvalues 0???

AlephZero
Science Advisor
Homework Helper
> There are real 2×2 matrices, the rotations of the Euclidean plane, that have no real eigenvalues at all (whenever the rotation angle is not a multiple of π). So, are all their eigenvalues 0???

Sure, the eigenvalues of a matrix may lie in an algebraic extension of the field of the matrix elements (e.g. matrices with real entries may have complex eigenvalues), but that doesn't affect "0 is an eigenvalue iff the matrix is singular".

The rotation matrix
$$\begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}$$ has eigenvalues $e^{\pm i\theta}$, which are always non-zero (and non-real unless $\theta$ is a multiple of $\pi$).

Or have I misunderstood the point you are making?
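This is easy to see numerically: working over C, the eigenvalues of the rotation matrix come out as the pair e^{±iθ}, both of modulus 1. A NumPy sketch (`theta` is an arbitrary example angle):

```python
import numpy as np

theta = 0.7  # arbitrary example angle; any value works
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

# NumPy works over C, so it returns the complex pair e^{+i theta}, e^{-i theta}
eigvals = np.linalg.eigvals(R)
assert np.allclose(sorted(eigvals, key=lambda z: z.imag),
                   [np.exp(-1j * theta), np.exp(1j * theta)])

# both have modulus 1, so in particular they are non-zero
assert np.allclose(np.abs(eigvals), 1.0)
```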

> Sure, the eigenvalues of a matrix may lie in an algebraic extension of the field of the matrix elements (e.g. matrices with real entries may have complex eigenvalues), but that doesn't affect "0 is an eigenvalue iff the matrix is singular".
I had in mind the view held in most mathematics textbooks, where linear spaces are discussed as associated with some field K of scalars, and all matrices under consideration have entries from K. Then, when it comes to eigenvalues, these are defined to belong to K (since otherwise multiplication of vectors by eigenvalues would not be defined).

Within such a framework our rotations in R^2 would have no eigenvalues, and a widespread (careless?) application of logic allows one to make any statement about 'every element of the (empty) set of eigenvalues' (including that it equals 0) without being formally wrong. This shows, at least, that the direct test of whether all eigenvalues of your rotation matrix are non-zero becomes a bit confusing.

As you mention, the version of the criterion stated in terms of singularity is not touched by this problem, since 0 is always an element of K and can be multiplied with all vectors under consideration. Also, there is no problem with singular values (at least for K = R and K = C; for other fields the concept may well make no sense).

AlephZero
Science Advisor
Homework Helper
> I had in mind the view held in most mathematics textbooks, where linear spaces are discussed as associated with some field K of scalars, and all matrices under consideration have entries from K. Then, when it comes to eigenvalues, these are defined to belong to K (since otherwise multiplication of vectors by eigenvalues would not be defined).

OK, I admit it's about 40 years since I read that kind of math textbook, so I'll take your word for it that that's the "standard" definition these days.

But I'm not going to stop finding and using complex eigenpairs of real matrices just because some math textbook tells me I shouldn't. (FWIW I have a math degree.)

Deveno
Science Advisor
> I had in mind the view held in most mathematics textbooks, where linear spaces are discussed as associated with some field K of scalars, and all matrices under consideration have entries from K. Then, when it comes to eigenvalues, these are defined to belong to K (since otherwise multiplication of vectors by eigenvalues would not be defined). Within such a framework our rotations in R^2 would have no eigenvalues, and a widespread (careless?) application of logic allows one to make any statement about 'every element of the (empty) set of eigenvalues' (including that it equals 0) without being formally wrong. This shows, at least, that the direct test of whether all eigenvalues of your rotation matrix are non-zero becomes a bit confusing. As you mention, the version of the criterion stated in terms of singularity is not touched by this problem, since 0 is always an element of K and can be multiplied with all vectors under consideration. Also, there is no problem with singular values (at least for K = R and K = C; for other fields the concept may well make no sense).

some authors emphasize the underlying field; some don't. some go so far as to state explicitly from the outset that they are working in a subfield of C. truly first-rate authors note when the characteristic of the field affects the results.

if one enlarges the field, then since we have a subfield, all of the vectors become vectors in our new, improved vector space, and the same rules apply. this is often necessary for real or rational matrices, since the roots of the characteristic polynomial of a matrix in F^{n×n} need not lie in F unless F is algebraically closed. i've seen many texts that open their discussion of eigenvalues (and the Jordan normal form) with a brief statement to the effect: "assume, for the purposes of this section, that we are working over C".