# Formula for unit eigenvector

## Main Question or Discussion Point

I have never seen a formula for an eigenvector before; however, given a generic matrix (http://www.wolframalpha.com/input/?i={{A%2C+B}%2C+{C%2C+D}}), Wolfram Alpha is able to find the eigenvectors...

So, is there a formula for the unit eigenvectors?


HallsofIvy
Homework Helper
?? Take any eigenvector and divide by its length.

> ?? Take any eigenvector and divide by its length.
And what is the formula for the eigenvector?

^ What do you mean by that? A (right) eigenvector of A, x, is a (nonzero) solution to Ax=λx, and λ is the corresponding eigenvalue. Any vector fulfilling the condition can be divided by its length ||x||, so that the resulting vector is still an eigenvector, since the eigenvector is not the zero vector.

(EDIT: Ok, technically talking about ||x|| obviously means that we have to be able to define a norm, but I don't think that was the issue?)
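The point about dividing by the length can be illustrated with a short sketch (plain Python, function name mine): any nonzero scalar multiple of an eigenvector is still an eigenvector, so normalizing gives a unit eigenvector.

```python
import math

def normalize(v):
    """Divide a vector by its Euclidean length.

    Any nonzero scalar multiple of an eigenvector is still an
    eigenvector, so v / ||v|| is a unit eigenvector (up to sign).
    """
    length = math.sqrt(sum(x * x for x in v))
    if length == 0:
        raise ValueError("the zero vector is not an eigenvector")
    return [x / length for x in v]

# (1, 1) is an eigenvector of [[2, 1], [1, 2]] for eigenvalue 3;
# normalizing it gives the unit eigenvector (1/sqrt(2), 1/sqrt(2)).
unit = normalize([1, 1])
```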

Again: what is the formula for the eigenvectors?

What's wrong with Ax=λx? Given A and λ, it can be used to solve for x (numerically or analytically).
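For the 2×2 case, solving (A − λI)v = 0 can indeed be written out explicitly. A minimal sketch (function name mine), assuming λ really is an eigenvalue of A:

```python
def eigenvector_2x2(A, lam):
    """Solve (A - lam*I) v = 0 for a 2x2 matrix A.

    Assumes lam is an eigenvalue, so A - lam*I is singular and its
    rows are linearly dependent; any nonzero row (r1, r2) then gives
    the eigenvector (r2, -r1), which is orthogonal to both rows.
    """
    m11, m12 = A[0][0] - lam, A[0][1]
    m21, m22 = A[1][0], A[1][1] - lam
    if (m11, m12) != (0, 0):
        return (m12, -m11)
    if (m21, m22) != (0, 0):
        return (m22, -m21)
    return (1, 0)  # A = lam*I: every nonzero vector is an eigenvector
```

For example, for A = [[2, 1], [1, 2]] and λ = 3 this returns (1, 1), and A·(1, 1) = (3, 3) = 3·(1, 1).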

HallsofIvy
Homework Helper
Your original question was about unit eigenvectors and that is what I responded to. There are a number of ways of finding eigenvectors but there is no "formula" you can just plug numbers into. Finding eigenvalues and eigenvectors is one of the harder problems in Linear Algebra.

> Your original question was about unit eigenvectors and that is what I responded to. There are a number of ways of finding eigenvectors but there is no "formula" you can just plug numbers into. Finding eigenvalues and eigenvectors is one of the harder problems in Linear Algebra.
If I want to express an eigenvector as (cos(Θ), sin(Θ)), is this form a good way to constrain the expression, so that Θ is a function of the eigenvalues?
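On the (cos Θ, sin Θ) form: for a 2×2 matrix A = [[a, b], [c, d]] with b ≠ 0, the vector (b, λ − a) lies in the eigenspace of λ, so Θ = atan2(λ − a, b) is indeed a function of the eigenvalue. A hedged sketch (function name mine, assuming b ≠ 0):

```python
import math

def eigen_angle(A, lam):
    """Angle theta such that (cos(theta), sin(theta)) is a unit
    eigenvector of the 2x2 matrix A for the eigenvalue lam.

    Assumes A[0][1] != 0; then (b, lam - a) spans the eigenspace,
    since the first row of A - lam*I is (a - lam, b) and
    (a - lam)*b + b*(lam - a) = 0.
    """
    a, b = A[0][0], A[0][1]
    if b == 0:
        raise ValueError("this parametrization assumes b != 0")
    return math.atan2(lam - a, b)

# A = [[2, 1], [1, 2]] with eigenvalue 3 has eigenvector (1, 1),
# so the angle is pi/4.
theta = eigen_angle([[2, 1], [1, 2]], 3)
```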

By definition:

$A\vec{v} =\lambda \vec{v}$
$(A - \lambda I)\vec{v} = \vec{0}$

So any eigenvector $\vec{v}$ is:
$\vec{v} = (A - \lambda I)^{-1}\vec{0} = \vec{0}$
$\vec{v} = \vec{0}$

What is wrong?

A square matrix M has an inverse iff $|M|\neq0$. To obtain the eigenvalues $\lambda$, you solve the equation $|A-\lambda I|=0$. In your post, you use the expression $(A-\lambda I)^{-1}$, which is meaningless, because the eigenvalues are exactly the values for which the inverse doesn't exist.
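This singularity is easy to check numerically for a 2×2 example (illustrative matrix mine): the determinant of A − λI vanishes exactly at the eigenvalues, which is why the inverse used in the post above does not exist.

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def shift(A, lam):
    """Return A - lam*I for a 2x2 matrix A."""
    return [[A[0][0] - lam, A[0][1]], [A[1][0], A[1][1] - lam]]

A = [[2, 1], [1, 2]]            # eigenvalues are 1 and 3
assert det2(shift(A, 1)) == 0   # singular at an eigenvalue
assert det2(shift(A, 3)) == 0   # singular at the other eigenvalue
assert det2(shift(A, 2)) != 0   # invertible away from the eigenvalues
```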

Matterwave
Gold Member
Also, although the math used was wrong, the 0 vector really is technically an eigenvector of all matrices...it's the trivial eigenvector, with an ill-defined eigenvalue. A*0=lambda*0 for all A and all lambda.

AlephZero
Homework Helper
> the 0 vector really is technically an eigenvector of all matrices...it's the trivial eigenvector, with an ill-defined eigenvalue. A*0=lambda*0 for all A and all lambda.
No, eigenvectors are defined to be non-zero vectors.

Definition: A scalar λ is called an eigenvalue of the n × n matrix A if there is a nontrivial solution x of Ax = λx. Such an x is called an eigenvector corresponding to the eigenvalue λ.
http://www.math.harvard.edu/archive/20_spring_05/handouts/ch05_notes.pdf

Eigenvectors may not be equal to the zero vector.
http://mathworld.wolfram.com/Eigenvector.html

An eigenvector of a square matrix A is a non-zero vector v that ...
http://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors

The notion of the zero vector as a "trivial eigenvector with an ill-defined eigenvalue" doesn't have any practical (or even theoretical) value.

Matterwave