Function scales eigenvalues, but what happens to eigenvectors?


Discussion Overview

The discussion revolves around the effects of applying a function to a matrix on its eigenvalues and eigenvectors. Participants explore the relationship between the eigenvalues of a matrix and the eigenvectors after a function is applied, particularly focusing on polynomial functions and the implications of the spectral mapping theorem.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose that applying a function to a matrix will affect the eigenvalues in a predictable manner, particularly when the function is a polynomial or power series.
  • Others argue that the eigenvectors may not necessarily remain unchanged, and the relationship between eigenvalues and eigenvectors requires further exploration.
  • A participant mentions the spectral mapping theorem, suggesting that if a polynomial is applied to a matrix, the eigenvalues transform according to the polynomial, but the behavior of the eigenvectors is less clear.
  • Another participant states that if a polynomial function is applied, the eigenvalues change according to the polynomial, while the eigenvectors remain the same, although this assertion is not universally accepted.
  • Some participants discuss the implications of Jordan Normal form and conjugation invariant polynomials, indicating that certain properties depend on the structure of the matrix.
  • There is a request for clarification on the implications of conjugation invariant functions and their relationship to eigenvalues and eigenvectors.

Areas of Agreement / Disagreement

Participants do not reach a consensus on whether the eigenvectors remain unchanged when a function is applied to the matrix. Multiple competing views exist regarding the behavior of eigenvectors in relation to eigenvalues under the application of functions.

Contextual Notes

Participants note limitations in their assumptions, such as the number of distinct eigenvalues and the nature of the functions applied. The discussion highlights the complexity of the relationships involved and the need for careful consideration of definitions and mathematical properties.

johnpjust
Statement: I can prove that if I apply a function to my matrix (let's call it "A"), whatever that function does to A, it will do the same thing to the eigenvalues (I can prove this with a similarity transformation, I think), so long as the function is basically a linear combination of the powers of A, or something like that.

Question: How do I prove what this function does to the eigenvectors, though? Do they remain the same? Do they change? Thanks!
 
You have a matrix A and you have eigenvalues so that you know ##Av_i = a_iv_i## where ##a_i## is the ith eigenvalue with ##v_i## as the corresponding eigenvector. (I take it A is nxn and there are n different eigenvalues?)

You have some function f(A) ... which is a matrix equation ... so B=f(A) - is that correct?
Can you represent the effect of f on A by a matrix so that B=FA?

Then when you find the eigenvalues ... you get ##Bu_i = b_iu_i## (##b_i## and ##u_i## are the eigenvalues and vectors for B) ... and you find that, when they are ordered a certain way, ##b_i=sa_i## where s is a constant scale factor between "corresponding" eigenvalues?

So - to show the relationship between the eigenvectors - if any, you need to be explicit about how you proved the relationship for the eigenvalues.
 
Thank you for the response. A is indeed an nxn matrix, but we cannot assume n different eigenvalues. Also, f(A) is more like a polynomial function or power series, and so cannot be assumed to be a matrix transformation equation. I incorrectly used the word "scales" in the question title, which is probably why you assumed these things above (my fault!). The similarity transformation that I mention involves Jordan Normal form, which allows us to prove that whatever the function does to A, it also does to the eigenvalues of A.

I was trying to think about how to use the Av=av equation you show above to prove that the eigenvectors do not change, but I'm not sure how...
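One way to build intuition here is a quick numerical experiment. The sketch below (my own example, not from the thread) approximates a power series of A by a truncated Taylor polynomial and checks that, in the eigenbasis of A, the result is diagonal, i.e. the eigenvectors of A are still eigenvectors of f(A):

```python
import math

import numpy as np

# Hypothetical experiment: approximate e^A by a truncated power series and
# check that the eigenvectors of A diagonalize f(A).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])          # symmetric, eigenvalues 3 and -1
lam, V = np.linalg.eig(A)           # columns of V are eigenvectors of A

# truncated series: I + A + A^2/2! + A^3/3!
fA = sum(np.linalg.matrix_power(A, k) / math.factorial(k) for k in range(4))

# change to the eigenbasis of A; off-diagonal entries should vanish,
# and the diagonal entries should be the same truncated series in lambda
D = np.linalg.inv(V) @ fA @ V
f_scalar = [sum(l**k / math.factorial(k) for k in range(4)) for l in lam]
assert np.allclose(D, np.diag(f_scalar))
```

This is only a numerical sanity check for one matrix, of course, not a proof, but it suggests the eigenvectors survive at least for polynomial (and truncated power-series) functions.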
 
Conjugation invariant polynomials will work, for instance the trace and the determinant.
 
Hi lavinia - can you please elaborate for me regarding your response? Thank you!
 
johnpjust said:
Hi lavinia - can you please elaborate for me regarding your response? Thank you!
Hi johnpjust
If I understood you right, a function like the trace depends only on the eigenvalues of the matrix. While it is the sum of the diagonal entries of the matrix, it is also the sum of the eigenvalues. By conjugation any matrix can be put in Jordan canonical form, and then the diagonal entries are the eigenvalues with multiplicity.

I think - but correct me if I am wrong - that any conjugation invariant function will depend only on the eigenvalues because in Jordan canonical form the only entries are the eigenvalues on the diagonal and 1's on the super diagonal. This is certainly true of symmetric invariant polynomials such as the trace and determinant.
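This is easy to confirm numerically for the trace and determinant; the sketch below (example matrix my own) checks both facts and the conjugation invariance ##A \to PAP^{-1}##:

```python
import numpy as np

# Check: trace and determinant depend only on the eigenvalues, and both
# are invariant under conjugation A -> P A P^{-1}.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])                 # eigenvalues 5 and 2
eigvals = np.linalg.eigvals(A)

assert np.isclose(np.trace(A), eigvals.sum())        # trace = sum of eigenvalues
assert np.isclose(np.linalg.det(A), eigvals.prod())  # det = product of eigenvalues

P = np.array([[1.0, 1.0],
              [0.0, 1.0]])                 # any invertible P
conj = P @ A @ np.linalg.inv(P)
assert np.isclose(np.trace(conj), np.trace(A))
assert np.isclose(np.linalg.det(conj), np.linalg.det(A))
```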

Your statement of your question confused me a little, so I interpreted it to mean functions that are determined by the eigenvalues.
 
johnpjust said:
Statement: I can prove that if I apply a function to my matrix (lets call it) "A"...whatever that function does on A, it will do the same thing to the eigenvalues (I can prove this with a similarity transformation I think), so long as the function is basically a linear combination of the powers of "A" or something like that.

Question: How do I prove what this function does to the eigen vectors though? Do they remain the same? Do they change? Thanks!
This is a special case of a theorem called "the spectral mapping theorem" in books on functional analysis. Note that if there's a non-zero vector x such that ##Ax=\lambda x##, then ##(A-\lambda I)x=0##, and ##A-\lambda I## isn't invertible (because if it were, we would get x=0, contradicting that x is non-zero). Because of this, the spectrum of A is defined as the set of all ##\lambda\in\mathbb C## such that ##A-\lambda I## is not invertible. The spectrum of A is denoted by ##\sigma(A)##. One version of the spectral mapping theorem says that if f is a polynomial, then we have ##\sigma(f(A))=f(\sigma(A))##. The right-hand side is (by definition of the notation) equal to ##\{f(\lambda)|\lambda\in\sigma(A)\}##.

I don't know if there's a simple proof for the case where A is a linear operator on a finite-dimensional vector space.
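In the finite-dimensional case, at least, the polynomial version is easy to check numerically. A toy example of my own (the matrix and polynomial are arbitrary choices):

```python
import numpy as np

# Check sigma(f(A)) = f(sigma(A)) for a polynomial f on a small matrix.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # sigma(A) = {-1, 1}

def f_matrix(M):
    return 3.0 * (M @ M) + M        # f(x) = 3x^2 + x, applied to a matrix

def f_scalar(x):
    return 3.0 * x**2 + x

spec_A = np.linalg.eigvals(A)
spec_fA = np.linalg.eigvals(f_matrix(A))

# the two spectra agree as sets: {f(-1), f(1)} = {2, 4}
assert np.allclose(sorted(spec_fA), sorted(f_scalar(spec_A)))
```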
 
If ##A## is a square matrix, ##\lambda## is its eigenvalue with a corresponding eigenvector ##v##, then if ##p(x) = \sum_{k=0}^n a_k x^k## is a polynomial, then $$p(A) v = p(\lambda) v, $$ i.e. eigenvalues change by the rule ##\lambda\to p(\lambda)## and the eigenvectors remain the same.
That is a very easy statement; it is almost trivial for ##p(x)=x^k##, and then you just take the linear combination of the identities ##A^k v= \lambda^k v##.
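The linear-combination argument can be checked power by power (example matrix is my own, purely illustrative): ##A^k v = \lambda^k v## for every ##k \ge 0##, so any polynomial p gives ##p(A)v = p(\lambda)v## with the same eigenvector v.

```python
import numpy as np

# Verify A^k v = lambda^k v for each power k, the building block of
# the identity p(A) v = p(lambda) v.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # eigenvalues 3 and 1
lam_all, V = np.linalg.eig(A)
lam, v = lam_all[0], V[:, 0]        # one eigenpair of A

for k in range(5):
    # matrix_power(A, 0) is the identity, matching lambda^0 = 1
    assert np.allclose(np.linalg.matrix_power(A, k) @ v, lam**k * v)
```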

A more non-trivial statement is the so-called spectral mapping theorem, which says that if ##\mu## is an eigenvalue of ##p(A)##, then there exists ##\lambda\in\sigma(A)## (i.e. ##\lambda## is an eigenvalue of ##A##) such that ##\mu=p(\lambda)##. You can find a proof in any more or less advanced linear algebra text; see for example "Linear Algebra Done Wrong", Ch. 9, s. 2.

As for eigenvectors, if $$p(A) v = \mu v,$$ then $$v= \sum_{k=1}^r \alpha_k v_k,$$ where ##\alpha_k## are arbitrary scalars, ##Av_k = \lambda_k v_k##, and ##\lambda_k##, ##k=1, 2, \ldots, r## are all eigenvalues of ##A## such that ##p(\lambda_k)=\mu##. This statement can easily be seen from the Jordan decomposition of a matrix. I am not aware of any "elementary" proof.
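The last point has a concrete consequence worth seeing in a tiny example (my own, not from the post): when two distinct eigenvalues of A map to the same value under p, eigenvectors of p(A) can be combinations that are not eigenvectors of A.

```python
import numpy as np

# Take p(x) = x^2 on A = diag(1, -1): then p(1) = p(-1) = 1, so the two
# eigenspaces of A merge into a single eigenspace of p(A).
A = np.diag([1.0, -1.0])     # eigenvalues 1 and -1
pA = A @ A                   # p(A) = A^2 = I

v = np.array([1.0, 1.0])     # sum of the two eigenvectors of A
assert np.allclose(pA @ v, v)            # v is an eigenvector of p(A), mu = 1

Av = A @ v                   # A v = (1, -1): not a scalar multiple of v
assert not np.allclose(Av, Av[0] * v)    # so v is NOT an eigenvector of A
```

Here every non-zero vector is an eigenvector of ##p(A) = I##, which is exactly the "linear combination" freedom in the formula above.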
 
