Finding the Matrix of a Linear Transformation

In summary, the linear transformation \(f(X)=X^t\) on \(M_2(\mathbb{R})\), with respect to the basis of matrix units, has matrix \(\begin{pmatrix}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{pmatrix}\) and is diagonalizable. The eigenvalues are \(\lambda_1=-1\) and \(\lambda_2=1\), with linearly independent eigenvectors \((0,\,1,\,-1,\,0)\), \((1,\,0,\,0,\,0)\), \((0,\,0,\,0,\,1)\), and \((0,\,1,\,1,\,0)\).
  • #1
Sudharaka
Hi everyone, :)

Here's another question I encountered recently. I am writing the question and my full solution. Many thanks if you can go through it and find a mistake, or confirm whether it's correct, or can contribute with any other useful comments. :)

Problem:

Find the matrix of a linear transformation \(f:M_{2}(\mathbb{R})\rightarrow M_{2}(\mathbb{R})\) given by \(f(X)=X^t\), with respect to a basis of matrix units \(E_{ij}\). Is \(f\) diagonalizable?

My Solution:

So first we shall represent the basis of matrix units by column vectors as follows.

\[\begin{pmatrix}1&0\\0&0\end{pmatrix}\rightarrow \begin{pmatrix}1\\0\\0\\0\end{pmatrix}\]

\[\begin{pmatrix}0&1\\0&0\end{pmatrix}\rightarrow \begin{pmatrix}0\\1\\0\\0\end{pmatrix}\]

\[\begin{pmatrix}0&0\\1&0\end{pmatrix}\rightarrow \begin{pmatrix}0\\0\\1\\0\end{pmatrix}\]

\[\begin{pmatrix}0&0\\0&1\end{pmatrix}\rightarrow \begin{pmatrix}0\\0\\0\\1\end{pmatrix}\]
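In other words, a \(2\times 2\) matrix is identified with the column vector of its entries read row by row:

\[\begin{pmatrix}a&b\\c&d\end{pmatrix}\rightarrow \begin{pmatrix}a\\b\\c\\d\end{pmatrix}\]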

Now we have,

\[f\begin{pmatrix}1\\0\\0\\0\end{pmatrix}= \begin{pmatrix}1\\0\\0\\0\end{pmatrix}\]

\[f\begin{pmatrix}0\\1\\0\\0\end{pmatrix}= \begin{pmatrix}0\\0\\1\\0\end{pmatrix}\]

\[f\begin{pmatrix}0\\0\\1\\0\end{pmatrix}= \begin{pmatrix}0\\1\\0\\0\end{pmatrix}\]

\[f\begin{pmatrix}0\\0\\0\\1\end{pmatrix}= \begin{pmatrix}0\\0\\0\\1\end{pmatrix}\]
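For instance, the second and third of these record that transposing swaps \(E_{12}\) and \(E_{21}\),

\[f\begin{pmatrix}0&1\\0&0\end{pmatrix}=\begin{pmatrix}0&0\\1&0\end{pmatrix}, \qquad f\begin{pmatrix}0&0\\1&0\end{pmatrix}=\begin{pmatrix}0&1\\0&0\end{pmatrix}\]

which in coordinates swaps the second and third basis vectors.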

Hence the matrix of linear transformation is,

\[M=\begin{pmatrix}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{pmatrix}\]

I have basically used the method outlined in the Wikipedia article.
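As an editorial aside, here is a minimal SymPy sketch (not part of the original post) that builds the same matrix by transposing each matrix unit and vectorizing the result; the helper name `vec` and the basis ordering are just the conventions used above.

```python
import sympy as sp

# Matrix units E11, E12, E21, E22 of M_2(R), in the order used above
basis = [sp.Matrix(2, 2, lambda r, c: 1 if (r, c) == (i, j) else 0)
         for i in range(2) for j in range(2)]

def vec(A):
    """Read the entries of a 2x2 matrix row by row into a column vector."""
    return sp.Matrix([A[r, c] for r in range(2) for c in range(2)])

# Column k of M is the coordinate vector of f(E_k) = E_k^T
M = sp.Matrix.hstack(*[vec(E.T) for E in basis])
print(M)  # Matrix([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])
```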

Now to find whether \(M\) is diagonalizable we shall check whether it has four linearly independent eigenvectors. Unfortunately it has only one linearly independent eigenvector (the dimension of the eigenspace is one). Therefore \(M\) is not diagonalizable. :)
 
  • #2
Sudharaka said:
Here's another question I encountered recently. I am writing the question and my full solution. Many thanks if you can go through it and find a mistake, or confirm whether it's correct, or can contribute with any other useful comments.

Looks pretty good. :)

Sudharaka said:
\[M=\begin{pmatrix}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{pmatrix}\]

I have basically used the method outlined in the Wikipedia article.

Now to find whether \(M\) is diagonalizable we shall check whether it has four linearly independent eigenvectors. Unfortunately it has only one linearly independent eigenvector (the dimension of the eigenspace is one). Therefore \(M\) is not diagonalizable.

Hmm, which single linearly independent eigenvector is that?
What are the eigenvalues for that matter?
 
  • #3
I like Serena said:
Looks pretty good. :)

Hmm, which single linearly independent eigenvector is that?
What are the eigenvalues for that matter?

Thanks for the confirmation. Sorry, I got the wrong answer for the last part of the question; I scribbled it down in a hurry. In fact there are two eigenvalues, \(\lambda_1=-1\) and \(\lambda_2=1\), and four linearly independent eigenvectors.

\[v_1=(0,\,1,\,-1,\,0)\]

\[v_2=(1,\,0,\,0,\,0)\]

\[v_3=(0,\,0,\,0,\,1)\]

\[v_4=(0,\,1,\,1,\,0)\]

Here \(v_1\) belongs to \(\lambda_1=-1\) and \(v_2,\,v_3,\,v_4\) to \(\lambda_2=1\), so \(M\) in fact is diagonalizable. :)
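Explicitly, taking \(P\) with columns \(v_1, v_2, v_3, v_4\), one can check that

\[P=\begin{pmatrix}0&1&0&0\\1&0&0&1\\-1&0&0&1\\0&0&1&0\end{pmatrix},\qquad P^{-1}MP=\begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}\]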
 
  • #4
Sudharaka said:
Thanks for the confirmation. Sorry, I got the wrong answer for the last part of the question; I scribbled it down in a hurry. In fact there are two eigenvalues, \(\lambda_1=-1\) and \(\lambda_2=1\), and four linearly independent eigenvectors.

\[v_1=(0,\,1,\,-1,\,0)\]

\[v_2=(1,\,0,\,0,\,0)\]

\[v_3=(0,\,0,\,0,\,1)\]

\[v_4=(0,\,1,\,1,\,0)\]

Here \(v_1\) belongs to \(\lambda_1=-1\) and \(v_2,\,v_3,\,v_4\) to \(\lambda_2=1\), so \(M\) in fact is diagonalizable.

Yep. That looks much better. :cool:
 
  • #5
I like Serena said:
Yep. That looks much better. :cool:

Thanks for helping me out. I truly appreciate it. :)
 
  • #6
As I see it, the trick is not to be too hasty in calculating the characteristic polynomial.

Expanding by minors along the first column we get:

$\det(xI - M) = (x - 1)\begin{vmatrix}x&-1&0\\-1&x&0\\0&0&x-1 \end{vmatrix}$

$= (x - 1)[x^2(x - 1) + 0 + 0 - 0 - 0 - (-1)(-1)(x - 1)] = (x - 1)[x^2(x - 1) - (x - 1)]$

$= (x - 1)(x^2 - 1)(x - 1) = (x - 1)^3(x + 1)$
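The same factorization can be checked mechanically; a short SymPy sketch (editorial, with $M$ as computed above):

```python
import sympy as sp

x = sp.symbols('x')
M = sp.Matrix([
    [1, 0, 0, 0],
    [0, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
])

# Characteristic polynomial det(xI - M), then factored
char_poly = (x * sp.eye(4) - M).det()
print(sp.factor(char_poly))  # (x - 1)**3*(x + 1)
```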

If you stop and think about it, all the transpose map does is swap the $(1,2)$ entry with the $(2,1)$ entry. Therefore, we clearly have:

$E_{11}, E_{22}$ as linearly independent eigenvectors belonging to 1.

In fact, the transpose map preserves all SYMMETRIC matrices, and thus preserves the third (usual) basis vector of the subspace of symmetric matrices:

$E_{12} + E_{21}$.
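Indeed, transposing just swaps the two terms:

$f(E_{12} + E_{21}) = E_{21} + E_{12} = E_{12} + E_{21}$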

So even if you had concluded you had just the eigenvalue 1, you still should have come up with at least 3 linearly independent eigenvectors.

The transpose also sends any skew-symmetric matrix to its negative, so the subspace of skew-symmetric matrices is an invariant subspace of the transpose map. For 2x2 matrices, this subspace has just one (usual) basis vector:

$E_{12} - E_{21}$ (the diagonal entries must be 0, and the lower off-diagonal entry is completely determined by the upper one).
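Concretely, this basis vector is an eigenvector belonging to $-1$:

$f(E_{12} - E_{21}) = E_{21} - E_{12} = -(E_{12} - E_{21})$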

Explicitly, these are 4 linearly independent elements of $M_2(\Bbb R)$; the first 3 clearly belong to the eigenvalue 1, and the last belongs to the eigenvalue -1 (I am stating it like this to show that you need not pass to a "vectorized" representation of $M_2(\Bbb R)$).

Succinctly, one can express the above as:

$M_2(\Bbb R) = \text{Sym}_2(\Bbb R) \oplus \text{Skew}_2(\Bbb R)$, and I leave it to you to consider how you might generalize this result to $n \times n$ matrices.
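A dimension count makes the generalization plausible: for $n \times n$ matrices (over a field of characteristic $\neq 2$),

$\dim \text{Sym}_n(\Bbb R) = \frac{n(n+1)}{2}, \quad \dim \text{Skew}_n(\Bbb R) = \frac{n(n-1)}{2}, \quad \frac{n(n+1)}{2} + \frac{n(n-1)}{2} = n^2$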

If we turn our reasoning "upside down" (provided the underlying field has characteristic not equal to 2; otherwise symmetric = skew-symmetric), we can conclude that every matrix has a UNIQUE representation as the sum of a symmetric and a skew-symmetric one, namely:

$M = \frac{1}{2}(M + M^T) + \frac{1}{2}(M - M^T)$

which relies on an algebraic trick you may well encounter in other settings (such as even and odd functions).
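As a concrete instance of this trick, take $A = \begin{pmatrix}1&2\\3&4\end{pmatrix}$; then

$\frac{1}{2}(A + A^T) = \begin{pmatrix}1&\frac{5}{2}\\\frac{5}{2}&4\end{pmatrix}, \qquad \frac{1}{2}(A - A^T) = \begin{pmatrix}0&-\frac{1}{2}\\\frac{1}{2}&0\end{pmatrix}$

and these sum back to $A$.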
 

Related to Finding the Matrix of a Linear Transformation

1. What is a linear transformation?

A linear transformation is a map between vector spaces that preserves addition and scalar multiplication: \(f(u+v)=f(u)+f(v)\) and \(f(cv)=c\,f(v)\). Once bases are chosen, it can be represented by a matrix, and it is often used to describe changes in geometric shapes and patterns.

2. Why is it important to find the matrix of a linear transformation?

Finding the matrix of a linear transformation allows us to easily perform calculations and understand the effects of the transformation on vectors. It also helps us to generalize the transformation and apply it to different vectors and scenarios.

3. How do you find the matrix of a linear transformation?

To find the matrix of a linear transformation, we first choose a basis for the source vector space (and one for the target space). Then we apply the transformation to each basis vector and record the coordinate vector of each result, with respect to the target basis, as a column of the matrix. This matrix represents the transformation and can be used to perform calculations.
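In symbols, if \(b_1,\dots,b_n\) is the chosen basis and \([\,\cdot\,]\) denotes coordinate vectors with respect to the target basis, then

\[[f] = \begin{pmatrix}[f(b_1)] & [f(b_2)] & \cdots & [f(b_n)]\end{pmatrix},\]

which is exactly how the matrix \(M\) of the transpose map was assembled in the thread above.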

4. Can the matrix of a linear transformation be different for different coordinate systems?

Yes. The matrix of a linear transformation depends on the bases (coordinate systems) chosen: the underlying transformation is the same, but its matrix representation changes with the basis. Matrices of the same transformation with respect to different bases are related by a change-of-basis (similarity) transformation.

5. How is the matrix of a linear transformation related to eigenvectors and eigenvalues?

The matrix of a linear transformation can be used to find its eigenvectors and eigenvalues. Eigenvectors are nonzero vectors that the transformation merely scales (possibly reversing their direction), and the corresponding eigenvalue is that scaling factor. Both are important in understanding the behavior of the transformation and its effects on vectors.
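In symbols, a nonzero vector \(v\) is an eigenvector of \(M\) with eigenvalue \(\lambda\) precisely when

\[Mv = \lambda v.\]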
