How to find the matrix of the derivative endomorphism?

  • Context: Undergrad 
SUMMARY

The discussion focuses on finding the matrix representation of the derivative endomorphism on the polynomial space ##\mathbb{R}_3[X]## with respect to the basis ##B=(1, X, X^2, X^3)##. The matrix of the first derivative, denoted ##A##, is established as ##A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix}##, which correctly yields the second-derivative matrix when multiplied by itself. Confusion arises when differentiating the polynomial ##P=X^3+X^2+X+1##, because the order of the basis elements determines the transformation matrix. Writing the polynomial in the same order as the basis, ##P=1+X+X^2+X^3##, and multiplying ##A## by its coordinate vector gives the correct derivative ##P'=1+2X+3X^2##.

PREREQUISITES
  • Understanding of polynomial spaces, specifically ##\mathbb{R}_3[X]##.
  • Familiarity with linear transformations and matrix representations.
  • Knowledge of differentiation in the context of linear algebra.
  • Basic understanding of Jordan form and similarity transformations in linear algebra.
NEXT STEPS
  • Explore the concept of Jordan form in linear algebra.
  • Learn about the relationship between linear transformations and their matrix representations.
  • Study the properties of derivative operators in polynomial spaces.
  • Investigate the implications of the characteristic polynomial in determining matrix forms.
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in understanding the matrix representation of derivative operators in polynomial spaces.

Cathr
We have ##B=(1, X, X^2, X^3)## as a basis of ##\mathbb{R}_3[X]## and the endomorphisms ##d/dX## and ##d^2/dX^2## such that:

##d/dX (P) = P'## and ##d^2/dX^2 (P) = P''##.

Calculating the matrix in class, the teacher found the following matrix, call it ##A##:
$$A = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix}$$

This is the matrix of the first derivative, and it has the expected property that, multiplied by itself, it gives the matrix of the second derivative:
$$A^2 = \begin{bmatrix} 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 6 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}$$
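The property that ##A^2## is the second-derivative matrix is easy to verify numerically. A quick NumPy sketch (not from the original thread):

```python
import numpy as np

# Derivative matrix in the basis (1, X, X^2, X^3):
# column j holds the coordinates of d/dX applied to the j-th basis vector.
A = np.array([
    [0, 1, 0, 0],
    [0, 0, 2, 0],
    [0, 0, 0, 3],
    [0, 0, 0, 0],
])

# The second-derivative matrix is A squared.
A2 = A @ A
print(A2)  # superdiagonal-shifted: entries 2 and 6 in the top rows
```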

However, when I try to calculate the derivative of a polynomial, say ##P=X^3+ X^2 + X + 1##, I don't find the right answer.
I found that the transpose ##C## of the matrix ##A## seems to work: if I multiply the coordinate vector of ##P## by
$$C = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 3 & 0 \end{bmatrix}$$
I find ##P' = 3X^2 + 2X + 1##, which is the derivative.

However, this doesn't work for all vectors, and, multiplied by itself, it doesn't give the matrix of the second derivative, so it must still be wrong.

Could you please give me a hint towards the solution?
 
You write the basis in the order ##B=\{1,X,X^2,X^3\}##, but then you write the polynomial in the order ##P=X^3+X^2+X+1##, which suggests you are taking the basis elements in the opposite order. I think this is the cause of the confusion, since the transformation matrix depends on the order of the basis elements. If you write ##P=1+X+X^2+X^3## and differentiate by multiplying the transformation matrix by the coordinate vector, you obtain the correct result ##P'=1+2X+3X^2##.
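The fix can be checked numerically: list the coordinates of ##P## in the same order as the basis and apply ##A##. A short NumPy sketch (not part of the original thread):

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3],
              [0, 0, 0, 0]])

# P = 1 + X + X^2 + X^3, with coordinates listed in the SAME
# order as the basis (1, X, X^2, X^3).
p = np.array([1, 1, 1, 1])

print(A @ p)  # [1 2 3 0]  ->  P' = 1 + 2X + 3X^2
```

Reversing the coordinate order while keeping ##A## fixed is exactly what produces the wrong answer in the question above.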
 
An even nicer basis is obtained by tweaking this one a little to {1, X, X^2/2!, X^3/3!}. Then the nonzero entries are all equal to 1, and the matrix for D is in "Jordan form". The operator (D-c) acting on the similar space {e^ct, te^ct, t^2/2! e^ct, t^3/3! e^ct} has a similar matrix, but with the constant c on the diagonal. The amazing theorem is that this is, up to "similarity", the most general possible matrix! I.e., given any linear transformation at all on a finite-dimensional space, as long as the roots of its characteristic polynomial all lie in the scalar field, then in some basis it has a matrix made up of copies of blocks like this.

So in some sense every linear transformation looks like copies of the derivative operator (D-c) acting on the solution space of the differential equation (D-c)^n y = 0. This is discussed in my free linear algebra notes:

http://alpha.math.uga.edu/%7Eroy/laprimexp.pdf
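The rescaling described above can be sketched numerically. Here ##S## is a name I am introducing for the diagonal change-of-basis matrix whose columns are the coordinates of the rescaled basis {1, X, X^2/2!, X^3/3!} in the original basis (a NumPy sketch, not from the original post):

```python
from math import factorial

import numpy as np

A = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3],
              [0, 0, 0, 0]], dtype=float)

# Columns of S: coordinates of 1, X, X^2/2!, X^3/3!
# in the original basis (1, X, X^2, X^3).
S = np.diag([1 / factorial(k) for k in range(4)])

# Matrix of D in the rescaled basis: a single nilpotent
# Jordan block, with ones on the superdiagonal.
J = np.linalg.inv(S) @ A @ S
print(J)
```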
 
