Triangular Form for Linear Transformations: Finding Basis for Null Spaces

  • Context: Graduate
  • Thread starter: Ioiô
  • Tags: Form, Matrix

Discussion Overview

The discussion revolves around finding a matrix B in triangular form for a linear transformation T, given a specific minimal polynomial m(x) = (x+1)^2 (x-2). Participants explore the process of determining a suitable basis for the null spaces associated with T, particularly focusing on the implications of the minimal polynomial on the structure of the transformation's matrix representation.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose finding a basis {w1, w2} for null(T+1)^2 such that (T+1)w1 = 0 and (T+1)w2 is in the subspace generated by w1, questioning if this is necessary due to the degree of the polynomial.
  • Others argue that the union of bases for null(T+1)^2 and null(T-2) will form a basis for the space V, leading to a block diagonal form of T.
  • A participant provides a specific transformation matrix A and discusses finding vectors in the null spaces, noting that different choices for w1 can lead to different bases and potentially different matrix forms.
  • There is a challenge regarding whether choosing an eigenvector for T corresponding to -1 guarantees that the matrix for the eigenvalue 2 will also be in block form.
  • Some participants clarify that the direct sum decomposition of V into null spaces explains the structure of the resulting matrix blocks, but there is uncertainty about the implications of different choices of basis vectors on the matrix form.

Areas of Agreement / Disagreement

Participants express a mix of agreement and disagreement regarding the implications of the minimal polynomial on the structure of the transformation's matrix. While some concepts are accepted, there remains uncertainty about the effects of different basis choices and the guarantees of upper triangular forms.

Contextual Notes

There are limitations regarding the assumptions made about the minimal polynomial and the conditions under which the transformation can be represented in triangular form. The discussion also highlights the dependency on the choice of basis vectors and the unresolved nature of how these choices affect the resulting matrix structure.

Ioiô
I am confused on how to find a matrix B in triangular form for some linear transformation T over a basis [tex]\{v_1,v_2, v_3\}[/tex].

Suppose we are given a minimal polynomial [tex]m(x) = (x+1)^2 (x-2)[/tex].

Do I want to find a basis [tex]\{w_1,w_2\}[/tex] for [tex]null(T+1)^2[/tex] such that [tex](T+1) w_1 = 0[/tex] and [tex](T+1) w_2 \in S(w_1)[/tex]? Is this because [tex](x+1)^2[/tex] has degree two? This is the part I'm not sure about.

For [tex]w_3[/tex], should I just let it be a basis for [tex]null(T-2)[/tex]?

I tried this for a specific transformation T and got the correct matrix B. (I checked the work by computing the matrix S that relates the old basis (v's) to the new basis (w's) and used the relation [tex]B = S^{-1} A S[/tex] where A is the matrix of T.)

Thanks for the help!
 
Ioiô said:
I am confused on how to find a matrix B in triangular form for some linear transformation T over a basis [tex]\{v_1,v_2, v_3\}[/tex].

Suppose we are given a minimal polynomial [tex]m(x) = (x+1)^2 (x-2)[/tex].

Do I want to find a basis [tex]\{w_1,w_2\}[/tex] for [tex]null(T+1)^2[/tex] such that [tex](T+1) w_1 = 0[/tex] and [tex](T+1) w_2 \in S(w_1)[/tex]?
What is S?

The general idea is as follows. If you take a basis for null(T+1)^2 and a basis for null(T-2), then their union will be a basis for your space V with respect to which T is block diagonal. In this case the first block will be 2x2 and the second one will be 1x1 (with just a '2' in it). So to make sure that T becomes upper triangular, we're going to have to see to it that the 2x2 block is upper triangular. One way to do this is to let w_1 be an eigenvector for T corresponding to -1 (i.e. pick a w_1 in null(T+1)) and then extend {w_1} to a basis {w_1, w_2} for null(T+1)^2. This choice of w_1 ensures that we get the 0s that we want on the first column.
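The recipe above (pick an eigenvector w1 for -1, extend to a basis of null(T+1)^2, append a basis of null(T-2)) can be sketched symbolically. Here is a minimal SymPy illustration; the 3x3 matrix is a made-up example with minimal polynomial (x+1)^2(x-2), not the one discussed later in the thread:

```python
import sympy as sp

# A hypothetical matrix with minimal polynomial (x+1)^2 (x-2),
# used only to illustrate the basis-building recipe.
A = sp.Matrix([[-1, 1, 0],
               [0, -1, 0],
               [0, 0, 2]])
I3 = sp.eye(3)

# Step 1: pick w1 in null(T+1), i.e. an eigenvector for -1.
w1 = (A + I3).nullspace()[0]

# Step 2: extend {w1} to a basis {w1, w2} of null (T+1)^2 by choosing
# any generalized-nullspace vector independent of w1.
gen_null = ((A + I3)**2).nullspace()    # 2-dimensional
w2 = next(v for v in gen_null
          if sp.Matrix.hstack(w1, v).rank() == 2)

# Step 3: w3 spans null(T-2), the eigenspace for 2.
w3 = (A - 2*I3).nullspace()[0]

# Change of basis: B = S^{-1} A S is upper triangular, diagonal (-1, -1, 2).
S = sp.Matrix.hstack(w1, w2, w3)
B = S.inv() * A * S
```

The 0s in the first column of the 2x2 block appear precisely because (T+1)w1 = 0, which is what choosing w1 as an eigenvector buys you.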
 
In [tex](T+1) w_2 \in S(w_1)[/tex], S means the subspace spanned by [tex]w_1[/tex]. (I know it's confusing since S is also used for the matrix relating the two bases.)

The space V is a direct sum of V1 = null(T+1)^2 and V2 = null(T-2). Is this the reason that the 2x2 block has -1's on its diagonal and the 1x1 block is just the entry 2?

To make the example concrete, suppose the matrix corresponding to the transformation T is
[tex]A = \begin{bmatrix} -1 & 0 & 2 \\ 3 & 2 & 1 \\ 0 & 0 & -1 \end{bmatrix}[/tex].

A vector in null(A + I) is x = v1 - v2 (just by solving the equation (A + I)x=0 for x). Likewise, a vector in null(A + I)^2 is y = v2-v3.

This gives me a new basis w1 = v1 - v2, w2 = v2 - v3, and w3 = v3. I checked this and it works. However, another vector in null(A + I)^2 is y' = v3 - v2, so I'd get a different set of w1, w2, and w3, and I didn't get the matrix in block form as before. Shouldn't I still get a matrix in triangular form (but with different numbers in the upper triangular block)?

I don't see how choosing a vector w1 such that T(w1) = -w1 also guarantees that the block for the eigenvalue 2 will come out in block form.
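As a sanity check on the question above, one can simply compute B = S^{-1} A S for both choices of w2 and compare; here is a quick NumPy sketch (my own verification, assuming the matrix A posted earlier in the thread):

```python
import numpy as np

# The matrix A from the post, written in the standard basis v1, v2, v3.
A = np.array([[-1.0, 0.0, 2.0],
              [3.0, 2.0, 1.0],
              [0.0, 0.0, -1.0]])

def conjugate(w1, w2, w3):
    """Return B = S^{-1} A S for the basis {w1, w2, w3}."""
    S = np.column_stack([w1, w2, w3])   # change-of-basis matrix
    return np.linalg.inv(S) @ A @ S

w1 = np.array([1.0, -1.0, 0.0])         # v1 - v2, an eigenvector for -1
w3 = np.array([0.0, 0.0, 1.0])          # v3

# Two choices of w2 inside null((A+I)^2):
B1 = conjugate(w1, np.array([0.0, 1.0, -1.0]), w3)   # w2  = v2 - v3
B2 = conjugate(w1, np.array([0.0, -1.0, 1.0]), w3)   # w2' = v3 - v2
```

Both B1 and B2 come out upper triangular with diagonal (-1, -1, 2); only the entries above the diagonal differ between the two choices.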
 
Ioiô said:
The space V is a direct sum of V1 = null(T+1)^2 and V2 = null(T-2). Is this the reason for the 2x2 matrix block, the diagonals are all -1 and the 1x1 matrix block, the diagonal is 2?
No. This is the reason that we can write T as a direct sum of a 2x2 block and a 1x1 block with respect to this decomposition of V. Of course the 1x1 block will have to be a '2', because Tx = 2x for every vector x in V2.

I don't really follow what you're doing in the rest of the post. If the minimal polynomial of A is m(x) = (x+1)^2(x-2) (I haven't checked), then A won't be diagonalizable. But it will be upper triangular if you choose a good basis for null(A+1)^2. Think about why picking an eigenvector is a good idea.
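For what it's worth, the minimal-polynomial claim for the concrete A can be checked directly: m(A) must vanish while no proper divisor of m does. A small SymPy sketch (assuming the A posted earlier in the thread):

```python
import sympy as sp

# The concrete matrix A from the thread.
A = sp.Matrix([[-1, 0, 2],
               [3, 2, 1],
               [0, 0, -1]])
I3 = sp.eye(3)

# m(x) = (x+1)^2 (x-2) annihilates A ...
assert (A + I3)**2 * (A - 2*I3) == sp.zeros(3, 3)
# ... but the proper divisor (x+1)(x-2) does not,
# so the factor (x+1) really occurs squared in the minimal polynomial:
assert (A + I3) * (A - 2*I3) != sp.zeros(3, 3)

# A repeated factor in the minimal polynomial means A is not diagonalizable.
assert not A.is_diagonalizable()
```

The repeated factor (x+1)^2 is exactly why the best one can hope for here is triangular, not diagonal, form.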
 
