MHB Eigenvalues and Eigenvectors over a Polynomial Ring

Summary
The discussion centers on proving that if \(v\) is an eigenvector of a linear transformation \(f\) with eigenvalue \(\lambda\), then \(v\) is also an eigenvector of \(P(f)\) for any polynomial \(P(t)\in F[t]\). The posted solution uses the matrix representation of \(f\) and shows that the corresponding eigenvalue of \(P(f)\) is \(P(\lambda) = a_0 + a_1 \lambda + a_2 \lambda^2 + \cdots + a_n \lambda^n\). Participants note that the solution could be shortened by avoiding the matrix representation and working with \(f\) directly. The conversation highlights the importance of clear notation and the flexibility of approaches in linear algebra.
Sudharaka
Hi everyone, :)

Here's another question that I solved. Let me know if you see any mistakes or if you have any other comments. Thanks very much. :)

Problem:

Prove that the eigenvector \(v\) of \(f:V\rightarrow V\) over a field \(F\), with eigenvalue \(\lambda\), is an eigenvector of \(P(f)\) where \(P(t)\in F[t]\). What is the eigenvalue of \(v\) with respect to \(P(f)\)?

My Solution:

Let \(A\) be the matrix representation of the linear transformation \(f\). Then we can write, \(Av=\lambda v\). Now let, \(P(t)=a_0+a_1 t+a_2 t^2+\cdots+a_n t^n\) where \(a_i\in F\,\forall\,i\). Then,

\[P(A)=a_0+a_1 A+a_2 A^2+\cdots+a_n A^n\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 (Av)+a_2 (A^2 v)+\cdots+a_n (A^n v)\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 \lambda v+a_2 \lambda^2 v+\cdots+a_n \lambda^n v\]

\[\therefore P(A)(v)=(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n )(v)\]

Hence \(v\) is an eigenvector for \(P(f)\) and the corresponding eigenvalue is, \(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n\).
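The computation above is easy to sanity-check numerically. Below is a minimal sketch assuming NumPy; the matrix \(A\), the eigenpair, and the polynomial coefficients are invented purely for illustration:

```python
import numpy as np

# A hypothetical symmetric matrix with a known eigenpair:
# v = (1, 1) is an eigenvector of A with eigenvalue 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0

# An arbitrary polynomial P(t) = 1 + 2t + t^2.
coeffs = [1.0, 2.0, 1.0]  # a_0, a_1, a_2

# P(A) v computed term by term: a_0 v + a_1 (A v) + a_2 (A^2 v) + ...
PAv = sum(a * np.linalg.matrix_power(A, k) @ v for k, a in enumerate(coeffs))

# P(lambda) = a_0 + a_1 lam + a_2 lam^2 + ...
Plam = sum(a * lam**k for k, a in enumerate(coeffs))

# v should be an eigenvector of P(A) with eigenvalue P(lambda).
assert np.allclose(PAv, Plam * v)
print(Plam)  # → 16.0
```

Here `np.linalg.matrix_power(A, 0)` is the identity, which handles the \(a_0\) term of the sum uniformly.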
 
Sudharaka said:
Hi everyone, :)

Here's another question that I solved. Let me know if you see any mistakes or if you have any other comments. Thanks very much. :)

Problem:

Prove that the eigenvector \(v\) of \(f:V\rightarrow V\) over a field \(F\), with eigenvalue \(\lambda\), is an eigenvector of \(P(f)\) where \(P(t)\in F[t]\). What is the eigenvalue of \(v\) with respect to \(P(f)\)?

My Solution:

Let \(A\) be the matrix representation of the linear transformation \(f\). Then we can write, \(Av=\lambda v\). Now let, \(P(t)=a_0+a_1 t+a_2 t^2+\cdots+a_n t^n\) where \(a_i\in F\,\forall\,i\). Then,

Here's a conceptual error. You have written $Av=\lambda v$. Here $A$ is a matrix and $v$ is a vector. Here you should have used the column vector representation of $v$ with respect to the same basis you used to represent $f$ as the matrix $A$.

Sudharaka said:
\[P(A)=a_0+a_1 A+a_2 A^2+\cdots+a_n A^n\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 (Av)+a_2 (A^2 v)+\cdots+a_n (A^n v)\]

\[\Rightarrow P(A)(v)=a_0 v+a_1 \lambda v+a_2 \lambda^2 v+\cdots+a_n \lambda^n v\]

\[\therefore P(A)(v)=(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n )(v)\]

Hence \(v\) is an eigenvector for \(P(f)\) and the corresponding eigenvalue is, \(a_0 +a_1 \lambda +a_2 \lambda^2 +\cdots+a_n \lambda^n\).
This is fine (keeping in mind the above comment) although you didn't need to use the matrix representation of $f$. You could have done what you have done with $f$ itself. The solution would be shorter too.

:)
 
caffeinemachine said:
Here's a conceptual error. You have written $Av=\lambda v$. Here $A$ is a matrix and $v$ is a vector. Here you should have used the column vector representation of $v$ with respect to the same basis you used to represent $f$ as the matrix $A$.

This is fine (keeping in mind the above comment) although you didn't need to use the matrix representation of $f$. You could have done what you have done with $f$ itself. The solution would be shorter too.

:)

Thank you very much for the ideas. I was too lazy to write down "let \(v\) be the column vector representation of ..." and thought it was rather implied when I used the matrix representation of \(f\).

Your other idea sounds good, but for some weird reason, I always like to deal with matrices rather than the functional form of linear transformations. :)

Thanks again for your generous help. :)
 
Sudharaka said:
... for some weird reason, I always like to deal with matrices rather than the functional form of linear transformations. :)
There is no problem if $V$ has dimension $n$ (finite), because we can use the well-known isomorphism of algebras $\phi:\operatorname{End}(V)\to \mathbb{K}^{n\times n},$ $\phi (f)=A=[f]_B,$ where $B$ is a fixed basis of $V.$
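That isomorphism can be exercised numerically: the matrix of a composition equals the product of the matrices. A sketch assuming NumPy, with the two linear maps invented for illustration and $B$ taken as the standard basis of $\mathbb{R}^2$:

```python
import numpy as np

# Two hypothetical linear maps on R^2, given functionally.
def f(v):
    x, y = v
    return np.array([2 * x + y, x - y])

def g(v):
    x, y = v
    return np.array([x + 3 * y, 4 * x])

def matrix_of(h, n=2):
    """[h]_B: columns are the images of the standard basis vectors."""
    return np.column_stack([h(np.eye(n)[:, j]) for j in range(n)])

F, G = matrix_of(f), matrix_of(g)

# phi(f ∘ g) = phi(f) phi(g): composition maps to matrix product.
FG = matrix_of(lambda v: f(g(v)))
assert np.allclose(FG, F @ G)
```

The same check works for sums and scalar multiples, which is what makes $\phi$ an isomorphism of algebras rather than just of vector spaces.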
 
I could be wrong, but I think what caffeinemachine was getting at is this:

Suppose $P(t) = a_0 + a_1t +\cdots + a_nt^n$.

By definition, $P(f)$ is the linear transformation:
$P(f)(v) = (a_0I + a_1f + \cdots + a_nf^n)(v) = a_0v + a_1f(v) + \cdots + a_nf^n(v)$

(in other words, we use the "point-wise" sum for linear transformations, and composition for multiplication, as is usual for a ring of endomorphisms of an abelian group. The scalar multiplication is also "pointwise": $(af)(v) = a(f(v))$).

Presumably, you have already proved that if $\lambda$ is an eigenvalue for $f$ with eigenvector $v$, then (for natural numbers $k$):

$f^k(v) = \lambda^kv$

(note the exponent on the left refers to k-fold composition, and the exponent on the right refers to exponentiation in the field). If you have not done so, it's easy to prove using induction (you may wish to use the common convention that $f^0 = I = \text{id}_V$, the identity function on $V$).

Thus, for an eigenvector $v$ with eigenvalue $\lambda$, substituting $f^k(v) = \lambda^kv$ into the expression above gives $P(f)(v) = a_0v + a_1\lambda v + \cdots + a_n\lambda^n v = P(\lambda)v$, so $v$ is likewise an eigenvector for $P(f)$ with eigenvalue $P(\lambda)$.

The advantage to this is that no mention is made of the dimensionality of the vector space $V$, and no assumptions are made about any basis.
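This basis-free argument can be mirrored directly in code by treating the operator as a plain function, using pointwise sums and k-fold composition, and never forming a matrix. A sketch; the operator and eigenpair below are invented for illustration:

```python
# A hypothetical linear operator on R^2, written functionally:
# f(x, y) = (2x + y, x + 2y); v = (1, 1) satisfies f(v) = 3 v.
def f(v):
    x, y = v
    return (2 * x + y, x + 2 * y)

def scale(c, v):
    return tuple(c * vi for vi in v)

def add(u, w):
    return tuple(ui + wi for ui, wi in zip(u, w))

def apply_poly(coeffs, f, v):
    """Compute P(f)(v) = a_0 v + a_1 f(v) + a_2 f(f(v)) + ...
    using pointwise sums and k-fold composition, no matrices."""
    result = scale(0, v)
    w = v  # w holds f^k(v), starting from f^0 = identity
    for a in coeffs:
        result = add(result, scale(a, w))
        w = f(w)
    return result

coeffs = [1, 2, 1]          # P(t) = 1 + 2t + t^2
v, lam = (1, 1), 3
P_lam = sum(a * lam**k for k, a in enumerate(coeffs))  # P(3) = 16

# P(f)(v) agrees with P(lambda) v.
assert apply_poly(coeffs, f, v) == scale(P_lam, v)
```

Nothing here depends on the dimension of the space or on a choice of basis, which is exactly the advantage noted above.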
 
