Square matrix with no eigenvectors?

In summary, whether a square matrix has eigenvectors depends on the field of scalars. Over the complex numbers (or any algebraically closed field) the characteristic polynomial of an n×n matrix has n roots counted with multiplicity, so every such matrix has at least one eigenvalue, and the corresponding eigenspace has dimension at least 1. Those eigenvalues need not be distinct or non-zero, and a repeated eigenvalue need not supply independent eigenvectors. Over the real numbers the answer changes: a rotation matrix has only complex eigenvalues, so it has no real eigenvectors at all.
  • #1
psholtz
Is there such a thing as a square matrix with no eigenvectors?

I'm thinking not ... since even if you have:

[tex]\left[\begin{array}{cc} 0 & 0 \\ 0 & 0 \end{array}\right][/tex]

you could just as well say that the eigenvalue(s) are 0 (w/ algebraic multiplicity 2) and the eigenvectors are:

[tex]u_1 = \left[\begin{array}{cc}1 & 0 \end{array}\right]^T, u_2 = \left[\begin{array}{cc}0 & 1\end{array}\right]^T[/tex]
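A quick NumPy check bears this out (a minimal sketch; np.linalg.eig is the standard eigensolver):

Code:
import numpy as np

A = np.zeros((2, 2))           # the zero matrix
vals, vecs = np.linalg.eig(A)  # columns of vecs are eigenvectors

print(vals)   # [0. 0.] -- eigenvalue 0 with algebraic multiplicity 2
print(vecs)   # identity columns: u1 = (1,0)^T, u2 = (0,1)^T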
 
  • #2


Every square matrix of degree n does have n eigenvalues and n corresponding eigenvectors. These eigenvalues are not necessarily distinct or non-zero.

An eigenvalue represents the amount of expansion in the corresponding dimension. This expansion may be zero, smaller than 1, equal to 1, larger than 1, or even complex. But since there is an expansion, this amount of expansion must be represented by an eigenvalue.
 
  • #3


hkBattousai said:
Every square matrix of degree n does have n eigenvalues and n corresponding eigenvectors. These eigenvalues are not necessarily distinct or non-zero.
What do you mean by "n eigenvectors"? There necessarily exist an infinite number of eigenvectors, since any multiple of an eigenvector is also an eigenvector. If you mean "n independent eigenvectors", that is not true. The 2 by 2 matrix
[tex]\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}[/tex]
has eigenvalue 1 (a double eigenvalue) but all eigenvectors are multiples of
[tex]\begin{bmatrix}1 \\ 0\end{bmatrix}[/tex]
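Numerically, a minimal NumPy sketch (added for illustration) shows that both computed eigenvector columns come out parallel to (1, 0):

Code:
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
vals, vecs = np.linalg.eig(A)

print(vals)   # [1. 1.] -- eigenvalue 1, algebraic multiplicity 2
print(vecs)   # both columns are parallel to (1, 0)^T, up to rounding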

 
  • #4


HallsofIvy said:
What do you mean by "n eigenvectors"? There necessarily exist an infinite number of eigenvectors, since any multiple of an eigenvector is also an eigenvector. If you mean "n independent eigenvectors", that is not true. The 2 by 2 matrix
[tex]\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}[/tex]
has eigenvalue 1 (a double eigenvalue) but all eigenvectors are multiples of
[tex]\begin{bmatrix}1 \\ 0\end{bmatrix}[/tex]

Sorry, but your example also has two eigenvalue-eigenvector pairs :smile:

Please look at this image (it is not displaying for some reason):
http://img227.imageshack.us/img227/7890/eigen.png

Code:
teta = 0	x=(1, 0);			b=(1, 0);
teta = 7.2	x=(0.992115, 0.125333);		b=(1.11745, 0.125333);
teta = 14.4	x=(0.968583, 0.24869);		b=(1.21727, 0.24869);
teta = 21.6	x=(0.929776, 0.368125);		b=(1.2979, 0.368125);
teta = 28.8	x=(0.876307, 0.481754);		b=(1.35806, 0.481754);
teta = 36	x=(0.809017, 0.587785);		b=(1.3968, 0.587785);
teta = 43.2	x=(0.728969, 0.684547);		b=(1.41352, 0.684547);
teta = 50.4	x=(0.637424, 0.770513);		b=(1.40794, 0.770513);
teta = 57.6	x=(0.535827, 0.844328);		b=(1.38015, 0.844328);
teta = 64.8	x=(0.425779, 0.904827);		b=(1.33061, 0.904827);
teta = 72	x=(0.309017, 0.951057);		b=(1.26007, 0.951057);
teta = 79.2	x=(0.187381, 0.982287);		b=(1.16967, 0.982287);
teta = 86.4	x=(0.0627905, 0.998027);	b=(1.06082, 0.998027);
teta = 93.6	x=(-0.0627905, 0.998027);	b=(0.935236, 0.998027);
teta = 100.8	x=(-0.187381, 0.982287);	b=(0.794906, 0.982287);
teta = 108	x=(-0.309017, 0.951057);	b=(0.64204, 0.951057);
teta = 115.2	x=(-0.425779, 0.904827);	b=(0.479048, 0.904827);
teta = 122.4	x=(-0.535827, 0.844328);	b=(0.308501, 0.844328);
teta = 129.6	x=(-0.637424, 0.770513);	b=(0.133089, 0.770513);
teta = 136.8	x=(-0.728969, 0.684547);	b=(-0.0444215, 0.684547);
teta = 144	x=(-0.809017, 0.587785);	b=(-0.221232, 0.587785);
teta = 151.2	x=(-0.876307, 0.481754);	b=(-0.394553, 0.481754);
teta = 158.4	x=(-0.929776, 0.368125);	b=(-0.561652, 0.368125);
teta = 165.6	x=(-0.968583, 0.24869);		b=(-0.719893, 0.24869);
teta = 172.8	x=(-0.992115, 0.125333);	b=(-0.866781, 0.125333);
teta = 180	x=(-1, -1.36639e-015);		b=(-1, -1.36639e-015);
teta = 187.2	x=(-0.992115, -0.125333);	b=(-1.11745, -0.125333);
teta = 194.4	x=(-0.968583, -0.24869);	b=(-1.21727, -0.24869);
teta = 201.6	x=(-0.929776, -0.368125);	b=(-1.2979, -0.368125);
teta = 208.8	x=(-0.876307, -0.481754);	b=(-1.35806, -0.481754);
teta = 216	x=(-0.809017, -0.587785);	b=(-1.3968, -0.587785);
teta = 223.2	x=(-0.728969, -0.684547);	b=(-1.41352, -0.684547);
teta = 230.4	x=(-0.637424, -0.770513);	b=(-1.40794, -0.770513);
teta = 237.6	x=(-0.535827, -0.844328);	b=(-1.38015, -0.844328);
teta = 244.8	x=(-0.425779, -0.904827);	b=(-1.33061, -0.904827);
teta = 252	x=(-0.309017, -0.951057);	b=(-1.26007, -0.951057);
teta = 259.2	x=(-0.187381, -0.982287);	b=(-1.16967, -0.982287);
teta = 266.4	x=(-0.0627905, -0.998027);	b=(-1.06082, -0.998027);
teta = 273.6	x=(0.0627905, -0.998027);	b=(-0.935236, -0.998027);
teta = 280.8	x=(0.187381, -0.982287);	b=(-0.794906, -0.982287);
teta = 288	x=(0.309017, -0.951057);	b=(-0.64204, -0.951057);
teta = 295.2	x=(0.425779, -0.904827);	b=(-0.479048, -0.904827);
teta = 302.4	x=(0.535827, -0.844328);	b=(-0.308501, -0.844328);
teta = 309.6	x=(0.637424, -0.770513);	b=(-0.133089, -0.770513);
teta = 316.8	x=(0.728969, -0.684547);	b=(0.0444215, -0.684547);
teta = 324	x=(0.809017, -0.587785);	b=(0.221232, -0.587785);
teta = 331.2	x=(0.876307, -0.481754);	b=(0.394553, -0.481754);
teta = 338.4	x=(0.929776, -0.368125);	b=(0.561652, -0.368125);
teta = 345.6	x=(0.968583, -0.24869);		b=(0.719893, -0.24869);
teta = 352.8	x=(0.992115, -0.125333);	b=(0.866781, -0.125333);

Eigenvalue 1 = 1.0
Eigenvector 1 = (1,0)

Eigenvalue 2 = 1.0
Eigenvector 2 = (-1,0)

The issue which is confusing you is the fact that both eigenvectors point in the same direction: both lie along the x-axis. Just as two eigenvalues can have the same value, eigenvectors can also have the same value. In other words, they do not need to be distinct.
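The table above can be reproduced with a short Python sketch (the program behind the printout was not posted, so this is a guess at its logic; the matrix is the one from post #3). Note that teta = 0 and teta = 180 are exactly the rows where b is a multiple of x:

Code:
import numpy as np

A = np.array([[1.0, 1.0],    # the matrix from post #3
              [0.0, 1.0]])

for k in range(50):
    teta = 7.2 * k
    t = np.radians(teta)
    x = np.array([np.cos(t), np.sin(t)])   # unit vector at angle teta
    b = A @ x                              # its image under the matrix
    print(f"teta = {teta:g}\tx=({x[0]:g}, {x[1]:g});\tb=({b[0]:g}, {b[1]:g});")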
 
  • #5


hkBattousai, I think HallsOfIvy is correct...

In the example he gave, the matrix has only one distinct eigenvalue (which is 1, w/ algebraic multiplicity of 2), and there is only one independent eigenvector corresponding to this eigenvalue (so the geometric multiplicity of the eigenvalue is 1).

I suppose we can conclude that the example matrix is not diagonalizable... indeed, it is already in its "canonical" form.

At any rate, the vectors [1,0] and [-1,0] are both in the same eigenspace.

Getting back to my original question, would it be correct to say that every square matrix has at least one eigenvector?

Or as HallsOfIvy pointed out, if a matrix has one eigenvector, then it has infinitely many eigenvectors, since any multiple of the eigenvector will also be an eigenvector.

So maybe the correct way to phrase my question is: is the dimension of the eigenspace of any square matrix always at least 1?
 
  • #6


psholtz said:
Is there such a thing as a square matrix with no eigenvectors?

I'm thinking not ... since even if you have:

[tex]\left[\begin{array}{cc} 0 & 0 \\ 0 & 0 \end{array}\right][/tex]

you could just as well say that the eigenvalue(s) are 0 (w/ algebraic multiplicity 2) and the eigenvectors are:

[tex]u_1 = \left[\begin{array}{cc}1 & 0 \end{array}\right]^T, u_2 = \left[\begin{array}{cc}0 & 1\end{array}\right]^T[/tex]

The matrix of a rotation of the Cartesian plane by 90 degrees has no eigenvectors: every vector in the plane is moved to a vector orthogonal to it.
The roots of the characteristic polynomial of the matrix are [itex]\pm i[/itex]. These are not scalars in the plane, since it is a vector space over the real numbers.
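A minimal NumPy sketch makes the same point; np.linalg.eig works over the complex numbers, so the "missing" real eigenvectors show up as complex ones:

Code:
import numpy as np

R = np.array([[0.0, -1.0],   # rotation of the plane by 90 degrees
              [1.0,  0.0]])
vals, vecs = np.linalg.eig(R)

print(vals)   # [0.+1.j 0.-1.j] -- the roots +i and -i, which are not real scalars
print(vecs)   # complex eigenvectors; no non-zero real solution exists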
 
  • #7


It's true that, in general, the rotation matrix does not have real eigenvalues... however, for the general 2-dimensional rotation matrix:

[tex]\left[\begin{array}{cc} \cos \phi & -\sin \phi \\ \sin \phi & \cos \phi\end{array}\right][/tex]

it will, in general, have the two (complex) eigenvalues:

[tex]\lambda_1 = e^{i\phi}, \lambda_2 = e^{-i\phi}[/tex]

to which we can find the corresponding eigenvectors:

[tex]u_1 = \left[\begin{array}{cc} i & 1\end{array}\right]^T, u_2=\left[\begin{array}{cc} -i & 1 \end{array}\right]^T[/tex]

So for instance, if the rotation matrix is through 90 degrees, as in your example:

[tex]\left[\begin{array}{cc} 0 & -1 \\ 1 & 0\end{array}\right]
\left[\begin{array}{c} i \\ 1 \end{array}\right]
=\left[\begin{array}{c}-1 \\ i \end{array}\right]
= i \cdot \left[\begin{array}{c} i \\ 1 \end{array}\right]
[/tex]

where [tex]\lambda_1 = i[/tex] is the eigenvalue corresponding to this eigenvector.

Similarly for the other eigenvalue.

Similarly for any other rotation matrix.
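This can be verified numerically as well (a small sketch using NumPy's complex arithmetic):

Code:
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
u1 = np.array([1j, 1.0])             # the claimed eigenvector for lambda_1 = i

print(R @ u1)                        # [-1.+0.j  0.+1.j]
print(1j * u1)                       # [-1.+0.j  0.+1.j]
print(np.allclose(R @ u1, 1j * u1))  # True: R u1 = i u1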

So the rotation matrix does have eigenvectors... which is what prompts my question: is the dimension of the eigenspace of any square matrix always guaranteed to be at least 1?
 
  • #8


psholtz said:
It's true that, in general, the rotation matrix does not have real eigenvalues... however, for the general 2-dimensional rotation matrix:

[tex]\left[\begin{array}{cc} \cos \phi & -\sin \phi \\ \sin \phi & \cos \phi\end{array}\right][/tex]

it will, in general, have the two (complex) eigenvalues:

[tex]\lambda_1 = e^{i\phi}, \lambda_2 = e^{-i\phi}[/tex]

to which we can find the corresponding eigenvectors:

[tex]u_1 = \left[\begin{array}{cc} i & 1\end{array}\right]^T, u_2=\left[\begin{array}{cc} -i & 1 \end{array}\right]^T[/tex]

So for instance, if the rotation matrix is through 90 degrees, as in your example:

[tex]\left[\begin{array}{cc} 0 & -1 \\ 1 & 0\end{array}\right]
\left[\begin{array}{c} i \\ 1 \end{array}\right]
=\left[\begin{array}{c}-1 \\ i \end{array}\right]
= i \cdot \left[\begin{array}{c} i \\ 1 \end{array}\right]
[/tex]

where [tex]\lambda_1 = i[/tex] is the eigenvalue corresponding to this eigenvector.

Similarly for the other eigenvalue.

Similarly for any other rotation matrix.

So the rotation matrix does have eigenvectors... which is what prompts my question: is the dimension of the eigenspace of any square matrix always guaranteed to be at least 1?

When you move from the reals to the complexes you change vector spaces. Every polynomial over the complexes can be factored into linear factors, so over the complexes you always get eigenvectors; but if the field is not the complexes, this is in general not true.
 
  • #9


Everyone seems to be overcomplicating this discussion... every matrix by definition has eigenvectors, because every matrix can be thought of as a linear transformation: http://en.wikipedia.org/wiki/Matrix_Transformation (read this to make it clear, specifically "Finding the matrix of a transformation"). Because a matrix is nothing more than a change of basis, you can think of a vector being "sent" to its corresponding position in a vector space after it has been "transformed". Finally, because of the axioms of linear algebra, we know the vector space is closed, therefore the vector can always be "found" by scalar multiple (even if it's complex).

An equivalent form of your question goes like this:
Does there exist a matrix A such that Av ≠ λv for every scalar λ and every (non-zero) vector v?
We require v ≠ 0 because if v = 0 then any eigenvalue is a trivial solution for any matrix, and the theory is non-existent. Anyway, clever question!
 
  • #10


brydustin said:
Everyone seems to be overcomplicating this discussion... every matrix by definition has eigenvectors, because every matrix can be thought of as a linear transformation: http://en.wikipedia.org/wiki/Matrix_Transformation (read this to make it clear, specifically "Finding the matrix of a transformation"). Because a matrix is nothing more than a change of basis, you can think of a vector being "sent" to its corresponding position in a vector space after it has been "transformed". Finally, because of the axioms of linear algebra, we know the vector space is closed, therefore the vector can always be "found" by scalar multiple (even if it's complex).

An equivalent form of your question goes like this:
Does there exist a matrix A such that Av ≠ λv for every scalar λ and every (non-zero) vector v?
We require v ≠ 0 because if v = 0 then any eigenvalue is a trivial solution for any matrix, and the theory is non-existent. Anyway, clever question!

What you are saying here seems wrong. Even over the complex numbers, linear transformations are not simple scalar multiplications. Eigenvectors exist because over the complex numbers the characteristic polynomial can always be factored, but over the rational numbers this is not true. And why use a field at all? Why not think of matrices over a principal ideal domain, such as the integers, or over some field that may not be a subfield of the complex numbers?

There are other things you say here that are wrong.

Take the linear transformation of the plane that maps (x,y) to (x,-y). Now find me a matrix, A, that turns this into scalar multiplication.
 
  • #11


I have been thinking about this thread and decided to give a sketch of the theory rather than just answer the original question.

An nxn matrix represents a linear transformation of an n-dimensional vector space with respect to some basis. The entries of the matrix lie in some field, and the vector space is usually defined as having scalars in a field that contains the matrix entries as elements. Within that field of scalars, the linear transformation may not have any eigenvectors. For instance the transformation (x,y) -> (y,-x) has no eigenvectors if the field is the rational numbers, but has two eigenvectors over the rationals with the square root of minus 1 adjoined. It also has no eigenvectors if the field is the real numbers.

If there is an eigenvector, then the equation (zI - M)x = 0 has a non-zero solution x for some z. For that value of z, the determinant of the matrix zI - M is zero. Since this determinant is a polynomial in z, solving the eigenvector problem is the same as finding a zero of this polynomial. Over an arbitrary field a polynomial may not have a zero; it is only guaranteed to have one if the field is algebraically closed. The complex numbers are algebraically closed, so if the field of scalars is the complex numbers one can always find a solution, and there is always an eigenvector.
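A SymPy sketch makes the field-dependence concrete, factoring the characteristic polynomial of the transformation above over the rationals and then over the rationals with i adjoined:

Code:
import sympy as sp

z = sp.symbols('z')
M = sp.Matrix([[0, 1],        # the map (x, y) -> (y, -x)
               [-1, 0]])
p = (z * sp.eye(2) - M).det() # characteristic polynomial det(zI - M)

print(sp.factor(p))                  # z**2 + 1 -- irreducible over Q (and over R)
print(sp.factor(p, extension=sp.I))  # (z - I)*(z + I) -- splits once i is adjoined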
 
  • #12


lavinia said:
What you are saying here seems wrong. Even over the complex numbers, linear transformations are not simple scalar multiplications. Eigenvectors exist because over the complex numbers the characteristic polynomial can always be factored, but over the rational numbers this is not true. And why use a field at all? Why not think of matrices over a principal ideal domain, such as the integers, or over some field that may not be a subfield of the complex numbers?

There are other things you say here that are wrong.

Take the linear transformation of the plane that maps (x,y) to (x,-y). Now find me a matrix, A, that turns this into scalar multiplication.

Okay... I will find a matrix corresponding to the map (x,y) |-> (x,-y).
Let A = [[1, 0], [0, -1]].
Then the eigenvalues are {+1, -1} and the corresponding eigenvectors are {[1, 0], [0, 1]}.
Because [[1, 0], [0, -1]] [[0], [1]] = (-1) * [[0], [1]],
and likewise [[1, 0], [0, -1]] [[1], [0]] = (+1) * [[1], [0]].
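(The same check in NumPy, as a small sketch:)

Code:
import numpy as np

A = np.array([[1.0,  0.0],
              [0.0, -1.0]])

print(A @ np.array([0.0, 1.0]))   # [ 0. -1.] = (-1) * (0, 1)
print(A @ np.array([1.0, 0.0]))   # [ 1.  0.] = (+1) * (1, 0)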

So I in fact did find you a matrix A, such that it became a scalar multiplication for SOME eigenvectors.

So whenever we have a matrix we can always find a corresponding eigenvector-eigenvalue pair. Conversely, given any vector and its product (after transformation) we can always construct a matrix, and find appropriate eigenvectors and eigenvalues which fit that matrix construction.
QED
 
  • #13


brydustin said:
So I in fact did find you a matrix A, such that it became a scalar multiplication for SOME eigenvectors.

So whenever we have a matrix we can always find a corresponding eigenvector-eigenvalue pair. Conversely, given any vector and its product (after transformation) we can always construct a matrix, and find appropriate eigenvectors and eigenvalues which fit that matrix construction.
QED

What cannot be proved, because it is false (see #3 and #8 above):

[tex]\forall A \in \mathbb{K}^{n \times n}[/tex]

[tex]\enspace (\exists x_1, x_2 \in \mathbb{K}^n, \lambda_1, \lambda_2 \in \mathbb{K} [/tex]

[tex]\enspace\enspace ((\neg(x_1 \propto x_2) \vee \lambda_1 \neq \lambda_2) \wedge \forall i \in \left \{ 1,2 \right \}[/tex]

[tex]\enspace\enspace\enspace (Ax_i = \lambda_i x_i))),[/tex]

where [itex]\mathbb{K}[/itex] is an arbitrary field.

What you actually showed:

[tex]\exists A \in \mathbb{R}^{2 \times 2}[/tex]

[tex]\enspace (\exists x_1, x_2 \in \mathbb{R}^2, \lambda_1, \lambda_2 \in \mathbb{R} [/tex]

[tex]\enspace\enspace ((\neg(x_1 \propto x_2) \vee \lambda_1 \neq \lambda_2) \wedge \forall i \in \left \{ 1,2 \right \}[/tex]

[tex]\enspace\enspace\enspace (Ax_i = \lambda_i x_i))).[/tex]

 
  • #14


lavinia said:
I have been thinking about this thread and decided to give a sketch of the theory rather than just answer the original question.

An nxn matrix represents a linear transformation of an n-dimensional vector space with respect to some basis. The entries of the matrix lie in some field, and the vector space is usually defined as having scalars in a field that contains the matrix entries as elements. Within that field of scalars, the linear transformation may not have any eigenvectors. For instance the transformation (x,y) -> (y,-x) has no eigenvectors if the field is the rational numbers, but has two eigenvectors over the rationals with the square root of minus 1 adjoined. It also has no eigenvectors if the field is the real numbers.

If there is an eigenvector, then the equation (zI - M)x = 0 has a non-zero solution x for some z. For that value of z, the determinant of the matrix zI - M is zero. Since this determinant is a polynomial in z, solving the eigenvector problem is the same as finding a zero of this polynomial. Over an arbitrary field a polynomial may not have a zero; it is only guaranteed to have one if the field is algebraically closed. The complex numbers are algebraically closed, so if the field of scalars is the complex numbers one can always find a solution, and there is always an eigenvector.


It's funny listening to you, because you can't admit when you are wrong... the transformation (x,y) --> (y,-x) has the matrix [[1, 0], [0, -1]], and because it's a diagonal matrix the entries are the eigenvalues and the eigenvectors are the standard basis vectors {e1, e2}, and these are certainly contained in the field of rationals or reals (as you put it). You are overcomplicating this and making it into utter nonsense, and simply confusing the person who first posted the question.
Perhaps you should review some of this: http://en.wikipedia.org/wiki/Diagonal_matrix
 
  • #15


brydustin said:
every matrix by definition has eigenvectors, because every matrix can be thought of as a linear transformation:
Whether you use matrices or linear maps, the question is the same: does every matrix (linear map) have an eigenvector? The answer is no, at least if you are not being sloppy about the field over which the vector space you are considering is defined. See below.
Because a matrix is nothing more than a change of basis
This is not true. A change of basis is a special kind of linear map: an automorphism, i.e. an invertible linear endomorphism. The zero matrix/zero linear map is not a change of basis, because its image, {0}, is not a basis.
you can think of a vector being "sent" to its corresponding position in a vector space after it has been "transformed". Finally, because of the axioms of linear algebra, we know the vector space is closed, therefore the vector can always be "found" by scalar multiple (even if it's complex).
I don't understand what this means.
An equivalent form of your question goes like this:
Does there exist a matrix A such that Av ≠ λv for every scalar λ and every (non-zero) vector v?
We require v ≠ 0 because if v = 0 then any eigenvalue is a trivial solution for any matrix, and the theory is non-existent.
And the answer is yes, such a matrix/linear map exists, as has already been pointed out.

Let [tex]\mathbb{R}^2[/tex] be the plane, which is a real vector space. The linear map [tex]R:\mathbb{R}^2\to \mathbb{R}^2[/tex] given by [tex](x,y)\mapsto (-y,x)[/tex], i.e. counter-clockwise 90 degree rotation, does not have any eigenvectors. Indeed, each vector is perpendicular to its image; in particular they are not multiples of each other. Explicitly, any candidate eigenvector (x,y) and eigenvalue c must satisfy (-y,x)=c(x,y), i.e. cx=-y and cy=x, which is only possible (for a real c) if x=0 and y=0.
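A symbolic version of that computation (a small SymPy sketch, solving the two eigenvector equations over the reals):

Code:
import sympy as sp

x, y, c = sp.symbols('x y c', real=True)
# eigenvector condition for R(x, y) = (-y, x): require (-y, x) = c*(x, y)
sols = sp.solve([sp.Eq(-y, c*x), sp.Eq(x, c*y)], [x, y], dict=True)
print(sols)   # [{x: 0, y: 0}] -- only the trivial solution, for any real c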

What is true is that if V is a non-zero finite-dimensional vector space over an algebraically closed field, then every linear endomorphism [itex]L:V\to V[/itex] has at least one eigenvector.

(modulo some details about whether to consider 0 as an eigenvector, if not you might want to exclude zero-dimensional vector spaces)
 
  • #16


Can any of you who claim that an nxn square matrix does not necessarily have n eigenvalues and eigenvectors show me an example?
 
  • #17


hkBattousai said:
Can any of you who claim that an nxn square matrix does not necessarily have n eigenvalues and eigenvectors show me an example?
The claim is, more precisely, that an nxn matrix with coefficients in some field K does not necessarily have n eigenvalues in the field K and eigenvectors whose components lie in that field K.

An example has been given, namely n=2, the field K=[tex]\mathbb{R}[/tex], and the matrix

[tex]\left[\begin{array}{cc} 0 & -1 \\ 1 & 0\end{array}\right].[/tex]

Its characteristic polynomial is [tex]p(\lambda)=\lambda^2+1[/tex], which has no real roots. Of course it has two complex roots, namely i and -i, but these do not lie in [tex]\mathbb{R}[/tex]. Hence this matrix has no eigenvalues in [tex]\mathbb{R}[/tex].
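(Numerically, as a quick sketch:)

Code:
import numpy as np

# roots of the characteristic polynomial p(lambda) = lambda**2 + 1
print(np.roots([1.0, 0.0, 1.0]))   # [0.+1.j 0.-1.j] -- purely imaginary, none real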
 
  • #18


hkBattousai said:
Can any of you who claim that an nxn square matrix does not necessarily have n eigenvalues and eigenvectors show me an example?
Several have already been given.

First, as has been said, whether or not a matrix (or linear transformation) has eigenvalues depends upon the field. If we are dealing with matrices of the real numbers, then the matrix
[tex]\begin{bmatrix}0 & 1 \\ -1 & 0\end{bmatrix}[/tex]
has no eigenvalues. Over the complex numbers, of course, its eigenvalues would be i and -i.

In fact, the set of all rational numbers forms a field so we could talk about matrices over the rational numbers. In that case, the matrix
[tex]\begin{bmatrix}0 & 2 \\ 1 & 0\end{bmatrix}[/tex]
has no eigenvalues. Over the real or complex numbers, of course, its eigenvalues would be [itex]\sqrt{2}[/itex] and [itex]-\sqrt{2}[/itex].
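(A small SymPy sketch of the rational-field example:)

Code:
import sympy as sp

B = sp.Matrix([[0, 2],
               [1, 0]])
print(B.eigenvals())   # {sqrt(2): 1, -sqrt(2): 1} -- irrational, so none lie in Q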

If you are assuming the matrices are over the complex numbers, then, of course, any n by n matrix has n eigenvalues (counting algebraic multiplicity) because the characteristic equation is a polynomial of degree n and, by the "Fundamental Theorem of Algebra", can be factored into n linear factors with complex coefficients.

And, of course, by definition, every eigenvalue has at least one "eigenvector". I assumed that when you say "n eigenvectors" you mean n independent eigenvectors, since otherwise, we would have an infinite number of eigenvectors. The number of independent eigenvectors corresponding to an eigenvalue is its "geometric multiplicity".

By definition of "eigenvalue", every eigenvalue has geometric multiplicity at least 1. If an n by n matrix has n distinct eigenvalues, then it must have n independent eigenvectors. That would allow us to construct a basis of eigenvectors, and the representation of the matrix in such a basis would be a "diagonal matrix". But not every matrix is diagonalizable.

For example the matrix
[tex]\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}[/tex]
that I gave before, and that you objected to, apparently not having understood my point about independent eigenvectors, has eigenvalue 1 with algebraic multiplicity 2, but all eigenvectors are multiples of (1, 0), so it has geometric multiplicity 1. Yes, there are an infinite number of them, but to say that, therefore, there are "n eigenvectors" would be silly. If you are not going to talk about independent eigenvectors, why put a specific number at all? Why not say "2n eigenvectors"? Or "n^3 eigenvectors"?

Would you argue that it would be reasonable to say, in a discussion of the population of New York City, "there are 5 people living in New York City"? I would argue that it is true but an extremely misleading statement!

Now, so that we are not misled, when you assert that "every n by n matrix has n eigenvectors", are you simply asserting that every n by n matrix has an infinite number of eigenvectors, possibly all multiples of one another, and that this includes "n" eigenvectors?

If that is what you intended to say all along, then I must say it was a trivial thing to argue about (as trivial as arguing whether "there are 5 people living in New York City"). I, unlike you, repeatedly made it clear that I was talking about independent eigenvectors.

If that was not what you meant to say, either you are wrong or I have no clue what you are saying.
 

1. What is a square matrix with no eigenvectors?

A square matrix with no eigenvectors is one for which no non-zero vector over the given field of scalars is mapped to a scalar multiple of itself. Equivalently, its characteristic polynomial has no roots in that field.

2. Can a square matrix have no eigenvectors?

Yes, if the field of scalars is not algebraically closed. For example, a real 2x2 rotation matrix (other than a rotation by a multiple of 180 degrees) has no real eigenvectors. Over the complex numbers, every square matrix has at least one eigenvalue and hence at least one eigenvector.

3. How can you tell if a square matrix has no eigenvectors?

Compute its characteristic polynomial and check whether it has any roots in the field you are working over. Every root in the field is an eigenvalue, and every eigenvalue has at least one eigenvector; if the polynomial has no roots in the field, the matrix has no eigenvectors over that field.
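For a concrete test, here is a small SymPy sketch that checks the 90-degree rotation matrix over the reals and over the complexes:

Code:
import sympy as sp

lam = sp.symbols('lam')
A = sp.Matrix([[0, -1],
               [1,  0]])
p = A.charpoly(lam).as_expr()   # characteristic polynomial

print(p)                                           # lam**2 + 1
print(sp.solveset(p, lam, domain=sp.S.Reals))      # EmptySet -- no real eigenvalues
print(sp.solveset(p, lam, domain=sp.S.Complexes))  # {-I, I} -- eigenvalues over C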

4. What is the significance of a square matrix with no eigenvectors?

Such a matrix cannot be diagonalized over its field of scalars, since diagonalization requires a basis of eigenvectors. Geometrically, the transformation leaves no direction fixed (up to scaling), as with a rotation of the real plane.

5. How are square matrices with no eigenvectors used in real-world applications?

Rotation matrices are the standard example, and they appear throughout physics, engineering, and computer graphics. When their eigenstructure is needed, one passes to the complex numbers, where the eigenvalues and eigenvectors that are missing over the reals become available.
