Undergrad question about a proof in Axler's 'Linear Algebra Done Right', 2nd ed.

Summary
The discussion concerns a proof in Axler's "Linear Algebra Done Right" (2nd ed., TH 5.13) that every operator on a finite-dimensional complex vector space has an upper-triangular matrix with respect to some basis. The question is whether, at the final step of the proof, the matrix of ##T## must have the form proposed in the opening post: generic entries in the block coming from the subspace ##U##, the eigenvalue ##\lambda## repeated on the rest of the diagonal, and zeros below. The answer given is yes. The opening poster is initially uneasy about all those zeros, wondering why generalized eigenvectors would ever be needed, and then resolves the discomfort: the basis vectors ##v_j## need not be eigenvectors of ##T##, because ##Tv_j## can have nonzero components along the ##u_i##.
Stephen Tashi
TL;DR
A question about the final step in a proof (by induction) that each linear transformation on a finite-dimensional complex vector space has a basis in which its matrix is upper triangular.
My question is motivated by the proof of TH 5.13 on p. 84 in the 2nd edition of Linear Algebra Done Right. (This proof differs from the one in the 4th edition, available online at https://linear.axler.net/index.html, chapter 5.)

In the proof we arrive at the following situation:
##T## is a linear operator on a finite-dimensional complex vector space ##V## and ##\lambda## is an eigenvalue of ##T##. The subspace ##U## is the range of the operator ##T - \lambda I##. The set of vectors ##\{u_1, u_2, \dots, u_m, v_1, v_2, \dots, v_k\}## is a basis for ##V## such that ##\{u_1, u_2, \dots, u_m\}## is a basis for ##U## and such that the matrix of ##T - \lambda I## restricted to ##U## is upper triangular in that basis.

We have the identity ## Tv_j = (T - \lambda I) v_j + \lambda v_j##. Since ##(T-\lambda I) v_j \in U##, this exhibits ##Tv_j## as the sum of two vectors where the first can be expressed as a linear combination of the vectors ##u_i## and the second is ##\lambda v_j##.

My question: (for example, in the case ##\dim U = m = 2##, ##\dim V = 5##) does this show that the matrix of ##T## has the form
##\begin{pmatrix} a_{1,1} & a_{1,2} & a_{1,3} & a_{1,4} & a_{1,5} \\ 0 & a_{2,2} & a_{2,3} & a_{2,4} & a_{2,5} \\ 0 & 0 & \lambda & 0 & 0 \\ 0 & 0 & 0 & \lambda & 0 \\ 0 & 0 & 0 & 0 & \lambda \end{pmatrix}## ?
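As a sanity check, here is a small numpy computation (the entries ##a_{i,j}## are made up for illustration) that verifies the identity ##Tv_j = (T-\lambda I)v_j + \lambda v_j## column by column for a matrix of exactly this form:

```python
import numpy as np

lam = 3.0
m, n = 2, 5  # dim U = 2, dim V = 5

# A matrix of the proposed form, with arbitrary made-up entries a_{i,j}.
A = np.array([
    [1.0, 2.0, 4.0, 5.0, 6.0],
    [0.0, 7.0, 8.0, 9.0, 1.0],
    [0.0, 0.0, lam, 0.0, 0.0],
    [0.0, 0.0, 0.0, lam, 0.0],
    [0.0, 0.0, 0.0, 0.0, lam],
])

for j in range(n - m):                   # j runs over v_1, ..., v_k
    e = np.zeros(n); e[m + j] = 1.0      # coordinates of v_j in this basis
    shifted = (A - lam * np.eye(n)) @ e  # (T - lambda I) v_j
    # (T - lambda I) v_j lies in U = span(u_1, ..., u_m):
    assert np.allclose(shifted[m:], 0.0)
    # the identity T v_j = (T - lambda I) v_j + lambda v_j:
    assert np.allclose(A @ e, shifted + lam * e)
print("every v_j column has the claimed form")
```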

This is not how Axler ends the proof. He makes the less detailed observation that ##Tv_j \in## Span##\{u_1, u_2, \dots, u_m, v_1, \dots, v_j\}## for ##j = 1, \dots, k##. That property characterizes an upper triangular matrix by a previously proved theorem, TH 5.12.
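In coordinates, the span condition of TH 5.12 says exactly that every entry below the diagonal vanishes. A short check makes that concrete (the helper name here is mine, not Axler's):

```python
import numpy as np

def satisfies_th_5_12(A, tol=1e-12):
    """True iff A b_j lies in span(b_1, ..., b_j) for each standard
    basis vector b_j -- i.e. every entry below the diagonal is zero,
    which is the upper-triangular condition of TH 5.12."""
    n = A.shape[0]
    return all(np.all(np.abs(A[j + 1:, j]) < tol) for j in range(n))

U = np.array([[1.0, 2.0],
              [0.0, 3.0]])
print(satisfies_th_5_12(U), satisfies_th_5_12(U.T))  # True False
```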
 
Pardon me, I did not try to read Axler, but this theorem seems offhand to have a trivial proof: just take ##v_1## to be an eigenvector for ##T##, then mod out ##V## by the space spanned by ##v_1##, and take ##v_2## to be a vector representing an eigenvector for the induced map on ##V/\langle v_1 \rangle##. Then take ##v_3## to be a vector representing an eigenvector for the induced map on ##V/\langle v_1, v_2 \rangle##, and so on. I.e., at each stage ##v_k## is an eigenvector mod the previous vectors, so ##T(v_k) - c_k v_k## is a linear combination of ##v_1, \dots, v_{k-1}##. Is this nonsense?
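For what it's worth, this deflation idea can be tried out numerically. Here is a minimal numpy sketch (the example matrix and the function name are illustrative choices, not from Axler):

```python
import numpy as np

def triangularize(A):
    """Return P such that inv(P) @ A @ P is upper triangular,
    following the quotient idea: pick an eigenvector, change basis
    so it is the first basis vector, then recurse on the block
    representing the induced map on V modulo that vector."""
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex)
    # An eigenvector v_1 exists because we work over C.
    _, eigvecs = np.linalg.eig(A.astype(complex))
    v = eigvecs[:, 0]
    # Complete v to a basis: QR of [v | I] yields an invertible S
    # whose first column is parallel to v.
    S, _ = np.linalg.qr(np.column_stack([v, np.eye(n)]))
    M = np.linalg.inv(S) @ A @ S        # first column is (c, 0, ..., 0)^T
    B = M[1:, 1:]                       # matrix of the induced map on V/<v_1>
    P = np.eye(n, dtype=complex)
    P[1:, 1:] = triangularize(B)
    return S @ P

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0]])
P = triangularize(A)
T = np.linalg.inv(P) @ A @ P
assert np.allclose(np.tril(T, -1), 0.0)   # upper triangular
```

Each step's ##S## is unitary, so the combined ##P## is as well; this is essentially the construction behind the Schur decomposition, built one eigenvector at a time.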

But I see your question is otherwise. I think the answer to it is yes.
 
Here's my discomfort with the zeroes: if all those zeroes appear, why do we ever need generalized eigenvectors? This is just an intuitive discomfort; I haven't figured out whether a defective matrix couldn't also have them.
 
I see why my intuition is wrong. For example, ##Tv_1 = \begin{pmatrix} a_{1,3} \\ a_{2,3} \\ \lambda \\ 0 \\ 0 \end{pmatrix}##, which isn't equal to ##\lambda v_1## unless ##a_{1,3} = a_{2,3} = 0##. So ##v_1## isn't necessarily an eigenvector.
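A quick numeric version of this, with made-up values ##a_{1,3} = 4##, ##a_{2,3} = 8##, plus a ##2 \times 2## Jordan block showing that an upper-triangular matrix can still be defective:

```python
import numpy as np

lam = 3.0
A = np.array([
    [1.0, 2.0, 4.0, 5.0, 6.0],
    [0.0, 7.0, 8.0, 9.0, 1.0],
    [0.0, 0.0, lam, 0.0, 0.0],
    [0.0, 0.0, 0.0, lam, 0.0],
    [0.0, 0.0, 0.0, 0.0, lam],
])
v1 = np.zeros(5); v1[2] = 1.0            # coordinates of v_1 in this basis
print(A @ v1)                            # [4. 8. 3. 0. 0.]
print(np.allclose(A @ v1, lam * v1))     # False: v_1 is not an eigenvector

# An upper-triangular matrix can still be defective: a Jordan block
# has lam on the diagonal but only a 1-dimensional eigenspace, so
# generalized eigenvectors are still needed to get a full basis.
J = np.array([[lam, 1.0],
              [0.0, lam]])
geo_mult = 2 - np.linalg.matrix_rank(J - lam * np.eye(2))
print(geo_mult)                          # 1, not 2
```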
 
