Jordan Normal Form Issues

In summary, with respect to the Jordan basis {v, (A - λI)v, ..., (A - λI)ⁿ⁻¹v} of a cyclic subspace, the restriction of A is represented by a matrix with λ on the diagonal and 1's on the superdiagonal. The question is how to see this explicitly, and how a direct sum of cyclic subspaces is represented by a matrix with such blocks on its diagonal.
  • #1
sponsoredwalk
I just can't figure out how you arrive at having a diagonal matrix consisting of Jordan blocks.

Going by Lang, a vector v is (A - λI)-cyclic with period n if (A - λI)ⁿv = 0 and n is the smallest natural number with this property (so (A - λI)ⁿ⁻¹v ≠ 0).
It can be proven that v, (A - λI)v, ..., (A - λI)ⁿ⁻¹v are linearly independent, & so
{v, (A - λI)v, ..., (A - λI)ⁿ⁻¹v} forms a basis, called the Jordan basis, for what is now known
as a cyclic vector space.

Furthermore, since A = (A - λI) + λI, for each (A - λI)ᵏv we have that A(A - λI)ᵏv = (A - λI)ᵏ⁺¹v + λ(A - λI)ᵏv.

Now for the life of me I just don't see how the matrix associated to this basis is a matrix
consisting of λ on the diagonal & 1's on the superdiagonal.

But assuming that works, I don't see how taking the direct sum of cyclic subspaces can
be represented as a matrix consisting of matrices on the diagonal.

Basically I'm just asking to see explicitly how you form the matrix w.r.t. the Jordan basis
& to see how you form a matrix representation of a direct sum of subspaces, appreciate
any & all help. :cool:
 
  • #2
Let's take the simple example where the matrix, A, is 3 by 3 and has the single eigenvalue 3. Then the characteristic equation is [itex](x- 3)^3= 0[/itex]. Since every matrix satisfies its own characteristic equation, it must be true that for every vector, v, [itex](A- 3I)^3v= 0[/itex]. It might be the case that (A- 3I)v= 0 for every vector v. In that case there are three independent vectors such that Av= 3v and we can use those three vectors as a basis for the vector space. Written in that basis, A would be diagonal:
[tex]\begin{bmatrix}3 & 0 & 0 \\ 0 & 3 & 0\\ 0 & 0 & 3\end{bmatrix}[/tex].

Or, it might be that (A- 3I)u= 0 only for multiples of a single vector v, or only for linear combinations of two independent vectors. Consider the first case. It must still be true that [itex](A- 3I)^3u= 0[/itex] for all vectors u, so there is some vector, w, with [itex](A- 3I)^2w\ne 0[/itex] but [itex](A- 3I)^3w= 0[/itex]. Of course that is the same as saying that [itex](A- 3I)^2w[/itex] is a multiple of v, the eigenvector, and so, scaling w appropriately and letting x be (A- 3I)w, we get (A- 3I)x= v. Those two vectors, x such that (A- 3I)x= v, and w such that (A- 3I)w= x, are called "generalized eigenvectors".
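To make this concrete, here is a quick Python/SymPy sketch. The particular 3 by 3 matrix below is made up purely for illustration (it is not from the thread); it has the single eigenvalue 3, and the sketch builds the chain x= (A- 3I)w, v= (A- 3I)x and checks that v really is an eigenvector.
[code]
from sympy import Matrix, eye

# Illustrative matrix with single eigenvalue 3 (not diagonalizable)
A = Matrix([[3, 1, 1],
            [0, 3, 1],
            [0, 0, 3]])
N = A - 3*eye(3)     # nilpotent part: N**3 == 0 but N**2 != 0

# Start from a vector w with N**2 * w != 0 and work down the chain
w = Matrix([0, 0, 1])
x = N*w              # generalized eigenvector: (A - 3I)w = x
v = N*x              # eigenvector:             (A - 3I)x = v

print(N*v)           # zero vector, i.e. Av = 3v
[/code]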

Of course, to find the matrix representation of a linear transformation in a given basis, we apply the matrix to each basis vector in turn, writing the result as a linear combination of the basis vectors, so that the coefficients are the columns of the matrix.

(If you are not aware of that [very important!] fact, suppose A is a linear transformation from a three dimensional vector space to itself. Further, suppose [itex]\{v_1, v_2, v_3\}[/itex] is a basis for that vector space. Then [itex]Av_1[/itex] is a vector in the space and so can be written as a linear combination of the basis vectors, say, [itex]Av_1= av_1+ bv_2+ cv_3[/itex]. In that basis, [itex]v_1[/itex] itself is written as the column
[tex]\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix}[/tex]
since [itex]v_1= (1)v_1+ (0)v_2+ (0)v_3[/itex].
Since [itex]Av_1= av_1+ bv_2+ cv_3[/itex], we must have
[tex]Av_1= A\begin{bmatrix}1 \\ 0 \\ 0\end{bmatrix}= \begin{bmatrix}a \\ b \\ c\end{bmatrix}[/tex]
so obviously, the first column of A must be
[tex]\begin{bmatrix}a \\ b \\ c\end{bmatrix}[/tex].)

Here, our basis vectors are v, x, and w such that (A- 3I)v= 0, (A- 3I)x= v, and (A- 3I)w= x. From (A- 3I)v= 0, which is the same as Av= 3v+ 0x+ 0w, we see that the first column of the matrix is
[tex]\begin{bmatrix}3 \\ 0 \\ 0\end{bmatrix}[/tex]

From (A- 3I)x= v, which is the same as Ax= v+ 3x+ 0w, we see that the second column of the matrix is
[tex]\begin{bmatrix}1 \\ 3 \\ 0\end{bmatrix}[/tex]

Finally, from (A- 3I)w= x, which is the same as Aw= 0v+ x+ 3w, we see that the third column of the matrix is
[tex]\begin{bmatrix}0 \\ 1 \\ 3\end{bmatrix}[/tex]

so that, in this ordered basis, the linear transformation is represented by the matrix
[tex]\begin{bmatrix}3 & 1 & 0 \\ 0 & 3 & 1 \\ 0 & 0 & 3\end{bmatrix}[/tex]
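As a sanity check, the same made-up matrix as in the sketch above can be used to confirm this numerically: putting v, x, w as the columns of a change-of-basis matrix P, the product P⁻¹AP comes out as exactly the matrix displayed above.
[code]
from sympy import Matrix, eye

A = Matrix([[3, 1, 1],
            [0, 3, 1],
            [0, 0, 3]])        # illustrative matrix, single eigenvalue 3
N = A - 3*eye(3)

w = Matrix([0, 0, 1])          # chosen so that (A - 3I)^2 w != 0
x = N*w                        # (A - 3I)w = x
v = N*x                        # (A - 3I)x = v, and (A - 3I)v = 0

P = Matrix.hstack(v, x, w)     # columns are the ordered basis v, x, w
print(P.inv() * A * P)         # Matrix([[3, 1, 0], [0, 3, 1], [0, 0, 3]])
[/code]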
 
  • #3
Thanks HallsofIvy, that was an interesting read. I enjoyed the construction of the matrix from the generalized eigenvectors - I don't remember seeing it that way before.
 
  • #4
Great stuff, thanks Halls. Figured out the importance of direct sums and invariant subspaces
today & am alright now.
 
  • #5


I understand your confusion and frustration with the Jordan Normal Form. It is a complex concept that can be difficult to grasp at first. However, with some further explanation and examples, I hope to clarify how a diagonal matrix consisting of Jordan blocks is formed.

First, let's define what a Jordan block is. A Jordan block is a square matrix with a constant value (λ) on the diagonal and 1's on the superdiagonal. For example, a 3x3 Jordan block with λ = 2 would look like this:

[tex]\begin{bmatrix}2 & 1 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2\end{bmatrix}[/tex]
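If you want to generate Jordan blocks like this programmatically, a minimal Python/SymPy helper could look like the following (the helper name jordan_block is just my own choice for illustration):
[code]
from sympy import Matrix

def jordan_block(lam, n):
    """Return the n x n Jordan block: lam on the diagonal, 1's on the superdiagonal."""
    return Matrix(n, n, lambda i, j: lam if i == j else (1 if j == i + 1 else 0))

print(jordan_block(2, 3))
# Matrix([[2, 1, 0], [0, 2, 1], [0, 0, 2]])
[/code]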

Now, let's go back to the definition of a cyclic vector space. A vector v is (A - λI)-cyclic with period n if (A - λI)ⁿv = 0 and n is the smallest such natural number. This means that repeatedly applying (A - λI) to v eventually gives the zero vector, after exactly n applications. The elements of the Jordan basis are linearly independent for the following reason: if c₀v + c₁(A - λI)v + ... + cₙ₋₁(A - λI)ⁿ⁻¹v = 0, then applying (A - λI)ⁿ⁻¹ to both sides kills every term except c₀(A - λI)ⁿ⁻¹v, and since (A - λI)ⁿ⁻¹v ≠ 0 this forces c₀ = 0; applying (A - λI)ⁿ⁻² next forces c₁ = 0, and so on.
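Here is a small Python/SymPy check of these two facts; the matrix and the starting vector are my own made-up example, not something from the post. The chain ends in the zero vector, and stacking the chain vectors as columns gives a full-rank matrix, i.e. they are linearly independent.
[code]
from sympy import Matrix, eye

lam = 3
A = Matrix([[3, 1, 1],
            [0, 3, 1],
            [0, 0, 3]])               # example matrix with single eigenvalue lam
v = Matrix([0, 0, 1])                 # a cyclic vector of period 3
N = A - lam*eye(3)

chain = [v, N*v, N*N*v]               # the Jordan basis generated by v
print(N*N*N*v)                        # zero vector: (A - lam*I)**3 * v = 0
print(Matrix.hstack(*chain).rank())   # 3, so the chain vectors are independent
[/code]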

Now, how do we form a matrix representation of a direct sum of cyclic subspaces? Let's say we have two cyclic subspaces, generated by vectors v₁ and v₂. The direct sum of these subspaces is a larger space that contains both of them, and any vector in this space can be written uniquely as the sum of a vector from the first subspace and a vector from the second. With respect to the basis obtained by concatenating the two Jordan bases, this can be represented in matrix form as:

[tex]\begin{bmatrix}A_1 & 0 \\ 0 & A_2\end{bmatrix}[/tex]

where A₁ is the matrix representing A on the cyclic subspace generated by v₁ and A₂ is the matrix representing A on the cyclic subspace generated by v₂. This is essentially taking the two Jordan blocks and placing them on the diagonal of a larger matrix.
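In Python/SymPy this direct-sum picture is just a block diagonal matrix; for example (the two blocks below, with eigenvalues 2 and 5, are made up for illustration):
[code]
from sympy import Matrix, diag

# Two Jordan blocks, one for eigenvalue 2 and one for eigenvalue 5
A1 = Matrix([[2, 1],
             [0, 2]])
A2 = Matrix([[5, 1, 0],
             [0, 5, 1],
             [0, 0, 5]])

# diag() with matrix arguments places the blocks along the diagonal
print(diag(A1, A2))
[/code]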

Finally, let's see how this relates to the Jordan Normal Form. The Jordan Normal Form is a block diagonal matrix consisting of Jordan blocks on the diagonal. For each eigenvalue λ of the original matrix A there are one or more corresponding Jordan blocks, one for each cyclic subspace belonging to that eigenvalue, and the size of each block is the period of the cyclic vector generating that subspace.

To summarize, the Jordan Normal Form is formed by finding the cyclic subspaces associated with each eigenvalue, writing the restriction of A to each subspace as a Jordan block with respect to its Jordan basis, and placing those blocks along the diagonal of one larger matrix.
 

1. What is Jordan Normal Form?

Jordan Normal Form is a way to represent a square matrix, after a change of basis, as a block diagonal matrix built from Jordan blocks. This form simplifies certain calculations and reveals important properties of the matrix.

2. How do I find the Jordan Normal Form of a matrix?

To find the Jordan Normal Form of a matrix, you need to first find its eigenvalues and the corresponding eigenvectors and generalized eigenvectors. These vectors, arranged in chains, form the columns of a change-of-basis matrix P, and P⁻¹AP is the Jordan Normal Form. The result is diagonal only when the matrix has a full set of linearly independent eigenvectors; otherwise it is block diagonal with Jordan blocks.
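SymPy can carry out this computation directly: Matrix.jordan_form() returns the change-of-basis matrix P together with the Jordan form J, so that A = PJP⁻¹. A quick example with an arbitrary non-diagonalizable matrix:
[code]
from sympy import Matrix

A = Matrix([[3, 1, 1],
            [0, 3, 1],
            [0, 0, 3]])          # single eigenvalue 3, only one eigenvector

P, J = A.jordan_form()           # A == P * J * P.inv()
print(J)                         # Matrix([[3, 1, 0], [0, 3, 1], [0, 0, 3]])
print(P * J * P.inv() == A)      # True
[/code]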

3. What does the Jordan Normal Form tell us about a matrix?

The Jordan Normal Form reveals important properties of a matrix, such as its eigenvalues and eigenvectors. It also tells us about the matrix's rank, nullity, and the number of linearly independent eigenvectors it has.

4. Can any matrix be transformed into Jordan Normal Form?

Any square matrix over an algebraically closed field, such as the complex numbers, can be transformed into Jordan Normal Form; a real matrix with non-real eigenvalues will need complex entries in its Jordan form. The transformation matrix is not unique, and some matrices have a more complicated Jordan Normal Form than others.

5. What are some applications of Jordan Normal Form?

Jordan Normal Form is commonly used in linear algebra and differential equations to simplify calculations and solve systems of equations. It also has applications in physics, engineering, and computer science, such as in the analysis of systems with repeated eigenvalues.
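As an illustration of the differential-equations use, the matrix exponential of a Jordan block picks up polynomial-in-t factors, which is where the familiar t·e^(λt) terms for repeated roots come from. A short SymPy sketch (the 2x2 block with λ = 3 is chosen arbitrarily):
[code]
from sympy import Matrix, symbols

t = symbols('t')
J = Matrix([[3, 1],
            [0, 3]])     # 2x2 Jordan block with eigenvalue 3

print((J*t).exp())       # Matrix([[exp(3*t), t*exp(3*t)], [0, exp(3*t)]])
[/code]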
