MHB Proving Root Space Invariance of a Linear Transformation

  • Thread starter: Sudharaka
  • Tags: Root Space

Summary
The discussion centers on understanding the concept of root spaces in the context of linear transformations, specifically regarding the invariance of a root space \(V(\lambda)\) under another transformation \(g\) that commutes with \(f\). Participants clarify that a root space is akin to an eigenspace, comprising all eigenvectors associated with a specific eigenvalue \(\lambda\), including the zero vector. There is some confusion due to varying definitions of root spaces in different texts, with a reference to Datta's book suggesting a standard definition. The original poster decides to adopt the eigenspace definition for simplicity in their studies. Overall, the conversation highlights the importance of consistent terminology in linear algebra.
Sudharaka
Hi everyone, :)

Here's a question that I don't quite understand.

Given \(f:\, V\rightarrow V\) and a root space \(V(\lambda)\) for \(f\), prove that \(V(\lambda)\) is invariant under \(g:\, V\rightarrow V\) whenever \(g\) commutes with \(f\).

What I don't understand here is what is meant by a root space in the context of a linear transformation. Can somebody please explain this to me or direct me to a link where it's explained?
 
It looks as though a root space is what I would call an eigenspace, in other words the subspace of all eigenvectors corresponding to a given eigenvalue $\lambda$. (Strictly speaking, an eigenvector has to be nonzero, so the eigenspace is the set of all eigenvectors corresponding to $\lambda$, together with the zero vector.)
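
With that definition, the invariance claim in the original question has a short proof. A sketch, assuming only that \(fg = gf\) and that \(V(\lambda)\) is the eigenspace of \(f\) for \(\lambda\): for \(v \in V(\lambda)\),
\[
f(gv) = g(fv) = g(\lambda v) = \lambda\,(gv),
\]
so \(gv \in V(\lambda)\), i.e. \(V(\lambda)\) is invariant under \(g\).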
 
Opalg said:
It looks as though a root space is what I would call an eigenspace, in other words the subspace of all eigenvectors corresponding to a given eigenvalue $\lambda$. (Strictly speaking, an eigenvector has to be nonzero, so the eigenspace is the set of all eigenvectors corresponding to $\lambda$, together with the zero vector.)

Thanks very much for your valuable reply. :) There was some doubt in my mind as to what this root space is all about. It seems to me that there is some ambiguity here depending on the author. For example, I was reading the following, and it gives a slightly different definition of the root space.

Matrix And Linear Algebra 2Nd Ed. - Datta - Google Books

However, I don't know what my prof. had in mind when writing down this question. So to keep matters simple I shall take the root space to be the eigenspace. Thanks again, and have a nice day. :)
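
For completeness: the other definition one often sees (and possibly the one Datta uses, though that is an assumption on my part) is the generalized eigenspace, or root subspace,
\[
V(\lambda) = \{\, v \in V : (f - \lambda\,\mathrm{id})^k v = 0 \ \text{for some } k \ge 1 \,\}.
\]
The same commuting argument goes through: if \((f - \lambda\,\mathrm{id})^k v = 0\), then \((f - \lambda\,\mathrm{id})^k (gv) = g\,(f - \lambda\,\mathrm{id})^k v = 0\), because \(g\) commutes with \(f\) and hence with every polynomial in \(f\). So \(V(\lambda)\) is \(g\)-invariant under either definition.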
 