Clues for the eigenstuff proof

quasar987 (Homework Helper, Gold Member):
In a physics book that I have, the author states a very important result but without proving it. He does provide clues for the proof though! It goes like this..

Consider the eigenvalue equation

$$(M^{-1}K-\omega ^2 I)\vec{v} = \vec{0}$$

where M is an n x n diagonal matrix whose elements are all positive (first clue) and K is a symmetric (second clue) n x n matrix. Then the matrix $M^{-1}K$ has exactly n linearly independent eigenvectors.

Edit: The elements of the K matrix are all positive too!

Well, it depends over which field you ask this...

for counter example : $$M=I_2\quad K=\left(\begin{array}{cc} 1 & 2\\2&1\end{array}\right)$$

Then you get the equation $$\omega^2=1\pm2$$, which gives a negative eigenvalue $\omega^2 = -1$... which means you can solve over $\mathbb{C}$ but not over $\mathbb{R}$...
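The counterexample above is easy to check numerically; a minimal sketch with numpy (the matrix K is taken from the post):

```python
import numpy as np

# K from the counterexample: symmetric, but with a negative eigenvalue
K = np.array([[1.0, 2.0],
              [2.0, 1.0]])
evals = np.linalg.eigvalsh(K)  # eigenvalues of a symmetric matrix, ascending
print(evals)  # -> [-1.  3.]
```

The eigenvalue $-1$ is the one that breaks the $\omega^2$ ansatz: it would require an imaginary frequency.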

In fact, a symmetric matrix is always orthogonally diagonalizable over R... but the eigenvalues can be negative (which is not possible with your ansatz above).

Suppose the eigenvalues $\omega_i^2$ are all positive. Could you please show me how this implies that $M^{-1}K$ is diagonalizable?

It was a long proof... I don't remember all of it, but here are the most important steps:

Let A be a symmetric n x n real matrix. Its characteristic polynomial has n complex roots, so A can be triangularized over C (Theorem a, by induction on the eigenvalues). Since A is a symmetric matrix over R, it is a Hermitian matrix over C, so all of its eigenvalues are real (on the diagonal of the triangular form, conj(d) = d implies imag(d) = 0). Since all the eigenvalues are real, A is triangularizable over R (same argument as Theorem a). Moreover it is unitarily (orthogonally) triangularizable into a matrix B (Schur's lemma), and since A is symmetric, B is too; but a triangular matrix that is symmetric is diagonal. Hence A is diagonalizable over R.

Something like that.

But wait, the product $M^{-1}K$ is not a symmetric matrix (unless the elements of $M^{-1}$ are all equal).

quasar987:

Not all elements of K have to be positive. The off-diagonal elements can be negative. However, for any displacement vector x,

$$V(x) = \frac{1}{2} \ x^T K x \geq 0$$

because it represents the increase in potential energy from the minimum of the potential, which is at x = 0 by definition.

Oops, that's true.

quasar987 said:
Consider the eigenvalue equation

$$(M^{-1}K-\omega ^2 I)\vec{v} = \vec{0}$$

where M is an n x n diagonal matrix whose elements are all positive (first clue) and K is a symmetric (second clue) n x n matrix. Then the matrix $M^{-1}K$ has exactly n linearly independent eigenvectors.

Here's what you do. Let the diagonal elements of M be $m_1, ... , m_n$. Construct the diagonal matrix P whose elements are $p_i = 1 / \sqrt{m_i}$. This is why the elements of M have to be positive, so we can take their square roots. Notice that P, being diagonal, is symmetric: $P = P^T$. P is also invertible--just take the reciprocal of the elements.

We have our eigenvalue equation (rearranged a bit):

$$K \vec{\xi} = \omega^2 M \vec{\xi}$$

now stick $PP^{-1} = I$ in both sides, and multiply both sides on the left by $P = P^T$:

$$P^{T} K P P^{-1} \vec{\xi} = \omega^2 P^{T} M P P^{-1} \vec{\xi}$$

Then we note that $P^{T} K P = K^\prime$ is symmetric since K is symmetric. And $P^{T} M P = I$, the identity matrix. If we set $P^{-1}\vec{\xi} = \vec{e}$, then we are left with the eigenvalue equation

$$K' \ \vec{e} = \omega^2 \ \vec{e}$$

Since K' is symmetric, it has n orthonormal eigenvectors:

$$K' \ \vec{e}_k = \omega_k^2 \ \vec{e}_k$$

$$(\vec{e}_j, \vec{e}_k) \ = \ \delta_{jk}$$

Now we can show that the n eigenvectors $\vec{\xi}_j = P \vec{e}_j$ of the original equation are independent. Suppose that

$$c_1 \vec{\xi}_1 + c_2 \vec{\xi}_2 + \ ... \ + c_n \vec{\xi}_n \ = \ 0$$

Multiply through by $P^{-1}$ and use $P^{-1} \vec{\xi}_j = \vec{e}_j$:

$$c_1 \vec{e}_1 + c_2 \vec{e}_2 + \ ... \ + c_n \vec{e}_n \ = \ 0$$

The e_i 's are orthogonal, so they must be independent. So all the c_i 's must be zero. That means the original eigenvectors are independent too.
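The whole change of variables can be verified numerically; here is a minimal numpy sketch (the particular M and K below are hypothetical test data, chosen so that M is diagonal with positive entries and K is symmetric positive definite):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical test data: M diagonal with positive entries,
# K symmetric positive definite (like a stiffness matrix)
M = np.diag(rng.uniform(0.5, 2.0, size=n))
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)

# P has 1/sqrt(m_i) on the diagonal, so P^T M P = I
P = np.diag(1.0 / np.sqrt(np.diag(M)))
Kp = P.T @ K @ P                       # K' = P^T K P is symmetric
assert np.allclose(Kp, Kp.T)
assert np.allclose(P.T @ M @ P, np.eye(n))

# Orthonormal eigenvectors e_k of the symmetric K'
w2, E = np.linalg.eigh(Kp)

# Map back: xi_k = P e_k are eigenvectors of M^{-1} K
Xi = P @ E
assert np.allclose(np.linalg.inv(M) @ K @ Xi, Xi * w2)

# The xi_k are linearly independent: [xi_1 ... xi_n] has full rank
assert np.linalg.matrix_rank(Xi) == n
```

The last two assertions are exactly the claim of the proof: the $\vec{\xi}_k = P\vec{e}_k$ are eigenvectors of $M^{-1}K$ and there are n independent ones.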

You're my hero! Most definitely.

Did you figure all of this out yourself, or have you seen it in a book? If the latter, which one?

It is a special case of a more general theorem that says if you have two real symmetric NxN matrices A and B, and B is positive definite, then the "eigenvalue equation"

$$A \vec{\xi} = \lambda B \vec{\xi}$$

has N independent eigenvector solutions. In your problem, K played the role of A, and M played the role of B. A symmetric matrix B is called positive definite if all its eigenvalues are strictly positive. This condition on B is needed so that it can be transformed to the identity matrix with the following kind of transformation

$$P^T B P = I$$

Note that in general P is not symmetric, as it happened to be for your case.

I *think* the theorem is proved in most books on linear algebra, but I'm not sure. Sometimes they state the result in a different way. I got it from "Lectures on Linear Algebra" by Gelfand. He said it something like: Two (symmetric) quadratic forms A(x, x) and B(x, x), with B positive definite, can be brought simultaneously into diagonal form (a sum of squares) by a transformation of coordinates. The connection between quadratic forms and matrices is this: the quadratic form A(x,x) is associated with a symmetric matrix A by the relation

$$A(x, x) = \sum_{i,j} a_{ij}x_i x_j = x^T A x$$

where x is a column vector.

The theorem is also stated in a couple different ways in "A Survey of Modern Algebra" by Garrett Birkhoff and Saunders MacLane. They even mention that it is important in the theory of small vibrations.
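The generalized theorem is also available off the shelf: `scipy.linalg.eigh` solves $A\vec{\xi} = \lambda B\vec{\xi}$ directly when B is symmetric positive definite. A minimal sketch (A and B below are hypothetical random test matrices):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n = 3

# Hypothetical test data: A symmetric, B symmetric positive definite
A = rng.standard_normal((n, n))
A = A + A.T
C = rng.standard_normal((n, n))
B = C @ C.T + n * np.eye(n)

# Generalized symmetric eigenproblem A x = lambda B x
lam, X = eigh(A, B)

# n independent eigenvectors, normalized so that X^T B X = I
assert np.allclose(X.T @ B @ X, np.eye(n))
assert np.allclose(A @ X, B @ X * lam)
```

Note that the eigenvectors come out B-orthonormal ($X^T B X = I$), which is the natural generalization of orthonormality for this problem.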

Thanks for reminding me I have to take a look at what those quadratic forms are about sometime this summer.

Ok, so now we know that the set of normal modes is an orthogonal basis of $\mathbb{R}^n$. The main idea however, was to prove that any solution to the set of equations of motion

$$M\frac{d^2}{dt^2}\vec{X} = -K\vec{X}$$

could be written as a superposition of the normal modes:

$$\vec{X} = \sum_{i=1}^n \vec{\xi}_i \cos(\omega_i t)$$

But $\sum_{i=1}^n \vec{\xi}_i \cos(\omega_i t)$ can't be the general solution, because it contains only n arbitrary constants (the relative amplitudes $\xi_i$), while the general solution requires 2n. Grr.

Each mode will have a different phase in general. So the time evolution will be more like

$$\vec{X} = \sum_{i=1}^n \vec{\xi}_i \cos(\omega_i t + \delta_i)$$

where the deltas are phases. I'm assuming that you've lumped the amplitude of the i_th mode into eigenvector $\vec{\xi}_i$. Also, while the normal modes do form a basis, the $\vec{\xi}_i$s aren't necessarily orthogonal.

My linear algebra is too rusty to even attempt a general proof, but it wouldn't surprise me if they were, because the $\hat{e}_i$'s of Post #8 are, and between the $\hat{e}_i$'s and the $\xi_i$'s there is only a linear transformation. It seems natural that a linear transformation preserves orthogonality, since it preserves linear independence.

Also, we did a lot of problems on normal modes in my wave course that had as a subquestion "verify that the normal modes are orthogonal".. and they always were.

As for,

$$\vec{X} = \sum_{i=1}^n \vec{\xi}_i \cos(\omega_i t + \delta_i)$$

thanks for pointing that out. I discarded this option too quickly.

quasar987 said:
It seems natural that a linear transformation preserves orthogonality since it preserves linear independence.
A linear transformation does not generally preserve orthogonality. Even if the transformation is nonsingular, and thus preserves independence, it usually transforms a set of orthogonal vectors into a set of inclined vectors.

Try it for a two degree-of-freedom system that is not completely symmetrical. For instance, two masses and three springs, where the two outer springs are attached to immovable walls. Let all the springs have the same stiffness, say k = 1, but let one mass be twice the other. I think that if you find the two eigenvectors of this system, their dot product will not be zero. But they will be independent.
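The suggested two-mass, three-spring example can be worked out in a few lines of numpy. With k = 1 everywhere, the stiffness matrix is K = [[2, -1], [-1, 2]] and M = diag(1, 2):

```python
import numpy as np

# Two masses, three unit springs between fixed walls:
# K is the stiffness matrix, M the mass matrix with m1 = 1, m2 = 2
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])
M = np.diag([1.0, 2.0])

lam, V = np.linalg.eig(np.linalg.inv(M) @ K)
v1, v2 = V[:, 0], V[:, 1]

# Independent (nonzero determinant) but not orthogonal
# in the ordinary dot product
assert abs(np.linalg.det(V)) > 1e-12
assert abs(v1 @ v2) > 1e-12

# They are, however, orthogonal with respect to the mass matrix M
assert abs(v1 @ M @ v2) < 1e-10
```

This confirms the point above: the eigenvectors of $M^{-1}K$ are independent but inclined, and what survives is M-orthogonality, $\vec{\xi}_1^T M \vec{\xi}_2 = 0$.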

1. What is the eigenstuff proof?

The "eigenstuff" proof discussed here is closely related to the spectral theorem, which states that a real symmetric matrix is always orthogonally diagonalizable; more generally, a square matrix can be diagonalized if and only if it has a complete set of linearly independent eigenvectors. This result is essential in linear algebra and has many applications in various fields of science.

2. What are the key components of the eigenstuff proof?

The proof relies on several key components: the concepts of eigenvalues and eigenvectors, diagonalization of matrices, and the property of orthogonality. These are used to show that a square matrix A with a full set of eigenvectors can be decomposed as $A = S \Lambda S^{-1}$, where $\Lambda$ is diagonal and the columns of S are eigenvectors.

3. How is the eigenstuff proof used in scientific research?

The eigenstuff proof has many applications in scientific research, particularly in physics, engineering, and computer science. It is used to solve systems of linear equations, analyze the behavior of dynamical systems, and perform image processing and data compression.

4. What are some challenges in understanding the eigenstuff proof?

The eigenstuff proof can be challenging for some people to understand due to its abstract nature and complex mathematical concepts. It also requires a strong understanding of linear algebra and matrix operations. Visual aids and examples can help in understanding the proof.

5. Are there any practical implications of the eigenstuff proof?

Yes, the eigenstuff proof has many practical implications, particularly in fields such as physics and engineering. It allows for the efficient manipulation and analysis of large amounts of data, making it a valuable tool in scientific research and real-world applications.
