Question from a proof in Axler 2nd Ed, 'Linear Algebra Done Right'

SUMMARY

The discussion centers on the proof of Theorem 5.13 in the 2nd edition of "Linear Algebra Done Right" by Sheldon Axler, specifically the structure of the matrix of a linear operator ##T## on a finite-dimensional complex vector space ##V##. The participants analyze the roles of the eigenvalue ##\lambda## and the subspace ##U##, the range of ##T - \lambda I##, and conclude that in the constructed basis the matrix of ##T## does take the proposed upper triangular form; an initial worry about why generalized eigenvectors are still needed is resolved along the way.

PREREQUISITES
  • Understanding of linear operators and eigenvalues in linear algebra.
  • Familiarity with the concepts of vector spaces and subspaces.
  • Knowledge of matrix representations of linear transformations.
  • Proficiency in interpreting upper triangular matrices and their properties.
NEXT STEPS
  • Study the implications of Theorem 5.12 in "Linear Algebra Done Right" regarding upper triangular matrices.
  • Explore the concept of generalized eigenvectors and their role in linear algebra.
  • Review the proof techniques used in linear algebra, particularly those involving eigenvalues and eigenvectors.
  • Examine the differences between the 2nd and 4th editions of "Linear Algebra Done Right" for deeper insights.
USEFUL FOR

Students and educators in linear algebra, mathematicians focusing on operator theory, and anyone seeking to deepen their understanding of eigenvalues and matrix representations in finite-dimensional vector spaces.

Stephen Tashi
TL;DR
A question about the final step in a proof (by induction) that for every linear transformation on a finite-dimensional complex vector space there is a basis in which its matrix is upper triangular.
My question is motivated by the proof of Theorem 5.13 on p. 84 in the 2nd edition of Linear Algebra Done Right. (This proof differs from the one in the 4th edition, available online at https://linear.axler.net/index.html, Chapter 5.)

In the proof we arrive at the following situation:
##T## is a linear operator on a finite-dimensional complex vector space ##V## and ##\lambda## is an eigenvalue of ##T##. The subspace ##U## is the range of the linear operator ##T - \lambda I##. The set of vectors ##\{u_1, u_2, \ldots, u_m, v_1, v_2, \ldots, v_k\}## is a basis for ##V## such that ##\{u_1, u_2, \ldots, u_m\}## is a basis for ##U## and such that the matrix of ##T## restricted to ##U## (note that ##U## is invariant under ##T##) is upper triangular in that basis.

We have the identity ## Tv_j = (T - \lambda I) v_j + \lambda v_j##. Since ##(T-\lambda I) v_j \in U##, this exhibits ##Tv_j## as the sum of two vectors where the first can be expressed as a linear combination of the vectors ##u_i## and the second is ##\lambda v_j##.
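
To make the bookkeeping explicit (borrowing the entry names ##a_{i,j}## from the matrix in the next paragraph): since ##(T - \lambda I)v_j \in U##, we can write ##(T - \lambda I)v_j = a_{1,m+j}\,u_1 + \cdots + a_{m,m+j}\,u_m## for some scalars ##a_{i,m+j}##, so the identity becomes ##Tv_j = a_{1,m+j}\,u_1 + \cdots + a_{m,m+j}\,u_m + \lambda v_j##.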

My question: (for example, in the case ##\dim U = m = 2##, ##\dim V = 5##) Does this show the matrix of ##T## has the form
##\begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & a_{1,4} & a_{1,5} \\ 0 & a_{2,2} & a_{2,3} & a_{2,4} & a_{2,5} \\ 0 & 0 & \lambda & 0 & 0 \\ 0 & 0 & 0 & \lambda & 0 \\ 0 & 0 & 0 & 0 & \lambda \end{pmatrix}## ?

This is not how Axler ends the proof. He makes the less detailed observation that ##Tv_j \in \operatorname{Span}\{u_1, u_2, \ldots, u_m, v_1, \ldots, v_j\}## for ##j = 1, \ldots, k##. That property characterizes an upper triangular matrix by a previously proved theorem, Theorem 5.12.
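
For what it's worth, the claimed form is easy to check numerically. Below is a minimal numpy sketch of my own (the particular ##T##, the value ##\lambda = 2##, and the use of orthonormal bases are arbitrary choices, not part of the proof): it builds ##U = \operatorname{range}(T - \lambda I)##, extends a basis of ##U## to a basis of ##V##, and verifies that the ##v##-columns have entries only in the ##U##-rows plus ##\lambda## on the diagonal. Making the upper-left ##m \times m## block upper triangular is the separate induction step, which this check does not perform.

```python
import numpy as np

lam = 2.0
n, m = 5, 2
# An arbitrary T = lam*I + N with N of rank m, so that lam is an
# eigenvalue and U = range(T - lam*I) has dimension m = 2, as above
rng = np.random.default_rng(0)
N = rng.standard_normal((n, m)) @ rng.standard_normal((m, n))
T = lam * np.eye(n) + N
assert np.linalg.matrix_rank(T - lam * np.eye(n)) == m

# u_1,...,u_m: an orthonormal basis of U from the SVD of T - lam*I;
# the remaining left singular vectors extend it to a basis v_1,...,v_k of V
B = np.linalg.svd(T - lam * np.eye(n))[0]   # columns: u_1, u_2, v_1, v_2, v_3

M = np.linalg.inv(B) @ T @ B                # matrix of T in that basis

assert np.allclose(M[m:, :m], 0)                    # T maps U into U
assert np.allclose(M[m:, m:], lam * np.eye(n - m))  # lower-right block is lam*I
print(np.round(M, 3))
```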
 
Pardon me, I did not try to read Axler, but offhand this theorem seems to have a trivial proof: just take ##v_1## to be an eigenvector for ##T##, then mod out ##V## by the span of ##v_1##, and take ##v_2## to be a vector representing an eigenvector of the induced map on ##V/\langle v_1 \rangle##. Then take ##v_3## to be a vector representing an eigenvector of the induced map on ##V/\langle v_1, v_2 \rangle##, and so on. I.e., at each stage ##v_k## is an eigenvector mod the previous vectors, so ##T(v_k) - c_k v_k## is a linear combination of ##v_1, \ldots, v_{k-1}##. Is this nonsense?

But I see your question is otherwise. I think the answer to it is yes.
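
The quotient construction above can even be run numerically. Here is a rough numpy rendering of it (my own sketch: the function name is made up, and the quotient ##V/\langle v_1, \ldots, v_{k-1} \rangle## is realized as the orthogonal complement, which is just one convenient choice of lift). At each stage it takes an eigenvector of the induced map and lifts it back, and the matrix of ##T## in the resulting basis comes out upper triangular, as the argument predicts.

```python
import numpy as np

def quotient_triangularize(T):
    """Numerical rendering of the quotient-space induction: v_k lifts an
    eigenvector of the map T induces on V / span(v_1, ..., v_{k-1}),
    with the quotient realized as the orthogonal complement."""
    n = T.shape[0]
    T = T.astype(complex)
    B = np.zeros((n, 0), dtype=complex)          # columns will be v_1, v_2, ...
    for k in range(n):
        if k == 0:
            Qc = np.eye(n, dtype=complex)
        else:
            Q = np.linalg.qr(B, mode='complete')[0]
            Qc = Q[:, k:]                        # orthonormal basis of the complement
        C = Qc.conj().T @ T @ Qc                 # matrix of the induced map
        w = np.linalg.eig(C)[1][:, 0]            # any eigenvector of it
        v = Qc @ w                               # lift back to V
        B = np.hstack([B, v.reshape(n, 1)])
    M = B.conj().T @ T @ B                       # matrix of T in basis v_1, ..., v_n
    return B, M

T = np.random.default_rng(1).standard_normal((4, 4))
B, M = quotient_triangularize(T)
print(np.round(M, 6))
assert np.allclose(np.tril(M, -1), 0, atol=1e-8)  # upper triangular, as predicted
```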
 
Here's my discomfort with the zeroes: if all those zeroes appear, why do we ever need generalized eigenvectors? This is just an intuitive discomfort; I haven't figured out whether a defective matrix could still have all those zeroes.
 
I see why my intuition is wrong. For example, ##Tv_1 = \begin{pmatrix} a_{1,3} \\ a_{2,3} \\ \lambda \\ 0 \\ 0 \end{pmatrix}##, which isn't equal to ##\lambda v_1##. So ##v_1## isn't necessarily an eigenvector.
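
To see concretely that the zeroes coexist with defectiveness, the ##2 \times 2## Jordan block is the smallest example (mine, not from the thread's ##5##-dimensional setting): here ##U = \operatorname{range}(T - \lambda I) = \operatorname{span}(e_1)##, so ##m = 1##, ##u_1 = e_1##, ##v_1 = e_2##, and the matrix of ##T## in the basis ##(u_1, v_1)## already has the form in question, yet the eigenspace of ##\lambda## is one-dimensional, so a generalized eigenvector is still genuinely needed. A quick numpy check:

```python
import numpy as np

lam = 2.0
T = np.array([[lam, 1.0],
              [0.0, lam]])       # a single Jordan block: defective

# U = range(T - lam*I) = span(e1), so m = 1, u1 = e1, v1 = e2,
# and the matrix of T in the basis (u1, v1) is T itself.
v1 = np.array([0.0, 1.0])
print(T @ v1)                    # [1. 2.], not lam * v1: v1 is not an eigenvector

# The eigenspace for lam is only 1-dimensional (nullity of T - lam*I),
# so despite all the zeroes, T has no basis of eigenvectors.
print(np.linalg.matrix_rank(T - lam * np.eye(2)))   # 1, hence nullity 1
```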
 
