Finding the Kernel of a Matrix Map

  • Context: Undergrad
  • Thread starter: WWGD
  • Tags: Kernel, Map, Matrix
SUMMARY

The discussion focuses on finding the kernel of the matrix map defined by the equation \( f(A) = A^n - I \), where \( A \) is a \( k \times k \) matrix and \( n \) is a positive integer, i.e. on describing all matrices satisfying \( A^n = I \). Participants confirm that the solutions include the identity matrix \( I \) for odd \( n \), and both \( I \) and \( -I \) for even \( n \). The conversation mentions the Fundamental Theorem of Linear Algebra and explores methods for calculating powers of matrices efficiently, such as repeated squaring, which reduces the number of matrix multiplications to \( O(\log(n)) \). Additionally, it clarifies that not all solutions are rotations, emphasizing the distinction between real and complex matrices.

PREREQUISITES
  • Understanding of linear algebra concepts, particularly eigenvalues and eigenvectors.
  • Familiarity with matrix operations, including matrix multiplication and diagonalization.
  • Knowledge of the Fundamental Theorem of Linear Algebra.
  • Basic understanding of complex numbers and roots of unity.
NEXT STEPS
  • Research the Fundamental Theorem of Linear Algebra in detail.
  • Learn about eigenvalues and eigenvectors in the context of matrix transformations.
  • Study efficient algorithms for matrix exponentiation, such as exponentiation by squaring.
  • Explore the properties of rotation matrices and their applications in linear transformations.
USEFUL FOR

Mathematicians, computer scientists, and engineers working with linear algebra, particularly those involved in matrix theory, eigenvalue problems, and computational mathematics.

WWGD (Science Advisor, Homework Helper):
Hi All,
I am trying to see if there is a "nice" (relatively straightforward) way of finding the solution set/kernel of the map ##f(A) = A^n - Id##, where ##A## is a ##k \times k## matrix and ##n## is a positive integer. Question: what is the kernel of this map? Cranking out matrix coefficients gets messy and very slow as ##n## grows. I can think of a solution using the Fundamental Theorem of Linear Algebra, relating the row space, column space, and their orthogonal complements. Is there some other "reasonable" way of finding all solutions (clearly ##Id## and ##-Id## are solutions for ##n## even, and ##Id## is a solution for ##n## odd)?
 
Some more details would be of interest. Since you mention orthogonality, I'm assuming your scalars are in ##\mathbb R## or ##\mathbb C##.

What's the motivation here -- are you trying to calculate a (finite) von Neumann series?

To confirm: is ##Id## the identity matrix ##I##? If so, the kernel is really a question of whether or not ##\mathbf A## has eigenvalues that are ##n##th roots of unity (in particular, of magnitude 1), and what the geometric multiplicities of the associated eigenvectors are...

Another thought: if you want to go the direct multiplication route, there is the approach of first finding ##\mathbf A^2##, then squaring that to get ##\mathbf A^4##, then ##\mathbf A^8##, then ##\mathbf A^{16}##, and so on until you get 'close' to ##\mathbf A^n##, then cleverly finishing off the residual... as I recall this cuts the number of multiplications down to ##O\big(\log(n)\big)##, in a manner similar to bisection.
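For concreteness, here is a minimal NumPy sketch of that repeated-squaring idea (the function name matpow_bin and the rotation test matrix are illustrative choices, not from the thread):

[CODE]
import numpy as np

def matpow_bin(A, n):
    """Compute A**n with O(log n) matrix multiplications
    using binary (repeated-squaring) exponentiation."""
    result = np.eye(A.shape[0])
    base = A.copy()
    while n > 0:
        if n & 1:              # this binary digit of n is 1: fold in the factor
            result = result @ base
        base = base @ base     # square: A^(2^k) -> A^(2^(k+1))
        n >>= 1
    return result

# Sanity check: a rotation through 2*pi/8 should satisfy A^8 = I
theta = 2 * np.pi / 8
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(matpow_bin(A, 8), np.eye(2)))   # True
[/CODE]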
 
StoneTemplePython said:
Some more details would be of interest. Since you mention orthogonality, I'm assuming your scalars are in ##\mathbb R## or ##\mathbb C##.

What's the motivation here -- are you trying to calculate a (finite) von Neumann series?

To confirm: is ##Id## the identity matrix ##I##? If so, the kernel is really a question of whether or not ##\mathbf A## has eigenvalues that are ##n##th roots of unity (in particular, of magnitude 1), and what the geometric multiplicities of the associated eigenvectors are...

Another thought: if you want to go the direct multiplication route, there is the approach of first finding ##\mathbf A^2##, then squaring that to get ##\mathbf A^4##, then ##\mathbf A^8##, then ##\mathbf A^{16}##, and so on until you get 'close' to ##\mathbf A^n##, then cleverly finishing off the residual... as I recall this cuts the number of multiplications down to ##O\big(\log(n)\big)##, in a manner similar to bisection.
Yes, thanks, good points. We would indeed need a notion of orthogonality to appeal to the Fundamental Theorem, but I think it applies only to real-valued matrices. Still, rotation by ##2\pi/n## is a solution, despite having no real eigenvalues.
 
In the class of complex matrices, all the solutions of the equation ##A^n=I## are given by ##A=SDS^{-1}##, where ##S## is an invertible matrix and ##D## is a diagonal matrix whose diagonal entries are ##n##th roots of unity, ##d_{j}^n=1##.

In the class of real matrices, all the solutions are given by ##A=SRS^{-1}##, where ##S## is a real invertible matrix and ##R## is a "rotation" matrix, i.e. ##R## is a block diagonal matrix whose diagonal blocks are either rotations through an angle ##2\pi j/n##, ##j=0, 1, 2, \ldots, n-1## (these are ##2\times 2## blocks), or real ##n##th roots of unity (##1\times 1## blocks). The real ##n##th roots of unity are ##\pm 1## if ##n## is even, and ##1## if ##n## is odd.

The matrix of a rotation through the angle ##\alpha## is given by $$\left( \begin{array}{cc} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{array}\right).$$
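To make this concrete, here is a small NumPy sketch of the real case (the choices ##n = 6##, the random ##S##, and the block sizes are illustrative, not from the thread) that builds such a "rotation" ##R## and checks that ##A = SRS^{-1}## solves ##A^n = I##:

[CODE]
import numpy as np

n = 6
theta = 2 * np.pi / n

# Block diagonal "rotation" R: one 2x2 rotation through 2*pi/n,
# plus a 1x1 block holding the real root of unity -1 (allowed since n is even)
R = np.zeros((3, 3))
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]
R[2, 2] = -1.0

rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3))   # a random matrix is invertible generically
A = S @ R @ np.linalg.inv(S)      # a real solution of A^n = I

print(np.allclose(np.linalg.matrix_power(A, n), np.eye(3)))  # True
[/CODE]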
 
Hawkeye18 said:
In the class of complex matrices, all the solutions of the equation ##A^n=I## are given by ##A=SDS^{-1}##, where ##S## is an invertible matrix and ##D## is a diagonal matrix whose diagonal entries are ##n##th roots of unity, ##d_{j}^n=1##.

In the class of real matrices, all the solutions are given by ##A=SRS^{-1}##, where ##S## is a real invertible matrix and ##R## is a "rotation" matrix, i.e. ##R## is a block diagonal matrix whose diagonal blocks are either rotations through an angle ##2\pi j/n##, ##j=0, 1, 2, \ldots, n-1## (these are ##2\times 2## blocks), or real ##n##th roots of unity (##1\times 1## blocks). The real ##n##th roots of unity are ##\pm 1## if ##n## is even, and ##1## if ##n## is odd.

The matrix of a rotation through the angle ##\alpha## is given by $$\left( \begin{array}{cc} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{array}\right).$$
Thanks, Hawkeye. Are all permutation matrices then also rotation matrices? The antidiagonal matrix ##A## with ##a_{ii}=0## and ##a_{ij}=-1## for ##i \neq j## satisfies ##A^2=Id##, but it cannot be a rotation matrix, since matching its entries to the rotation form would require ##\sin \alpha = 1## and ##\sin \alpha = -1## simultaneously. Did I make a mistake? EDIT: Or is this matrix conjugate to a rotation? EDIT2: I don't think it can be conjugate to a rotation matrix; a conjugate of a rotation would still have determinant ##1##, but this matrix has determinant ##-1##.
 
WWGD said:
Thanks, Hawkeye. Are all permutation matrices then also rotation matrices? The antidiagonal matrix ##A## with ##a_{ii}=0## and ##a_{ij}=-1## for ##i \neq j## satisfies ##A^2=Id##, but it cannot be a rotation matrix, since matching its entries to the rotation form would require ##\sin \alpha = 1## and ##\sin \alpha = -1## simultaneously. Did I make a mistake? EDIT: Or is this matrix conjugate to a rotation? EDIT2: I don't think it can be conjugate to a rotation matrix; a conjugate of a rotation would still have determinant ##1##, but this matrix has determinant ##-1##.

I believe you are talking here about a ##2\times 2## matrix. Then your antidiagonal matrix is not a rotation matrix, but it is a matrix of the form ##SDS^{-1}##, where $$D=\left(\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array} \right).$$
 
Hawkeye18 said:
I believe you are talking here about a ##2\times 2## matrix. Then your antidiagonal matrix is not a rotation matrix, but it is a matrix of the form ##SDS^{-1}##, where $$D=\left(\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array} \right).$$
Right, but I thought all solutions to ##A^n = Id## consisted of rotations (up to conjugation), and the matrix I mentioned (antidiagonal entries ##-1##, ##0## otherwise) is a solution (specifically, to ##A^2 = Id##), but not a rotation? But I agree that if ##A## is a rotation then its eigenvalues must be ##n##th roots of unity.
 
Let me elaborate. In the class of real matrices, all the solutions are given by ##A=SRS^{-1}##, where ##R## is a "rotation" (note the quotes; it is not always a genuine real rotation) and ##S## is invertible. "Rotation" here means that ##R## is a block diagonal matrix, where each block is either a ##2\times 2## rotation matrix (through an allowed angle ##2\pi j/n##) or a ##1\times 1## block containing a real ##n##th root of unity. I probably did not make it clear, but you do not have to have both types of blocks: your matrix can have only ##2\times2## blocks, only ##1\times1## blocks, or any combination of the two (whose sizes add up to the dimension).

The matrix $$R=D=\left(\begin{array}{cc} 1 & 0\\ 0& -1\end{array}\right)$$ is of that type, and it is not a rotation (that is why I used the quotes). But your matrix can be represented as ##SDS^{-1}##, so your example does not contradict my statement.
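A quick numerical check of this claim for WWGD's ##2\times 2## example (a sketch; the eigenvector matrix ##S## below is one illustrative choice):

[CODE]
import numpy as np

A = np.array([[0., -1.],
              [-1., 0.]])    # the antidiagonal matrix: A^2 = I, det(A) = -1

D = np.diag([1., -1.])       # the 1x1-block "rotation" D from the post above
S = np.array([[1., 1.],
              [-1., 1.]])    # columns: eigenvectors of A for eigenvalues 1, -1

print(np.allclose(S @ D @ np.linalg.inv(S), A))               # True: A = S D S^{-1}
print(np.allclose(np.linalg.matrix_power(A, 2), np.eye(2)))   # True: A^2 = I
print(round(np.linalg.det(A)))                                # -1, so A is not a rotation
[/CODE]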
 
I think you two may be talking past each other.

WWGD said:
Right, but I thought all solutions to ##A^n = Id## consisted of rotations

This is wrong. For example, any matrix with ##A^2 = I## is an involution. Such a matrix does not need to be normal (i.e., it is diagonalizable, but need not be unitarily diagonalizable).

Permutation matrices are a special kind of orthogonal matrix (which is a special kind of unitary matrix). Rotation matrices are also orthogonal matrices. All orthogonal matrices are normal.

Another way to think about it is: pick your favorite orthogonal matrix ##P## with ##P^n = I##. Now do a non-unitary similarity transform, i.e. consider an invertible ##S## with ##S^{-1} \neq S^*##.

Now take a look at ##S^{-1}PS##. Its spectrum is the same as that of ##P## because the matrices are similar, but ##P## is normal while the new matrix (in general) is not.

Yet we still have ##P^n = I##, so $$\big(S^{-1}PS\big)^n = S^{-1}P^n S = S^{-1}IS = S^{-1}S = I.$$
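A numerical illustration of this argument (a sketch; the cyclic permutation ##P## and the shear ##S## are illustrative choices, not from the thread):

[CODE]
import numpy as np

# P: a cyclic permutation matrix -- orthogonal (hence normal), with P^3 = I
P = np.array([[0., 0., 1.],
              [1., 0., 0.],
              [0., 1., 0.]])

# S: an invertible but non-unitary shear, so S^{-1} != S*
S = np.array([[1., 1., 0.],
              [0., 1., 0.],
              [0., 0., 1.]])

B = np.linalg.inv(S) @ P @ S

print(np.allclose(np.linalg.matrix_power(B, 3), np.eye(3)))  # True:  B^3 = I
print(np.allclose(B @ B.T, B.T @ B))                         # False: B is not normal
[/CODE]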
 
  • #10
StoneTemplePython said:
I think you two may be talking past each other.
This is wrong. For example, any matrix with ##A^2 = I## is an involution. Such a matrix does not need to be normal (i.e., it is diagonalizable, but need not be unitarily diagonalizable).

Permutation matrices are a special kind of orthogonal matrix (which is a special kind of unitary matrix). Rotation matrices are also orthogonal matrices. All orthogonal matrices are normal.

Another way to think about it is: pick your favorite orthogonal matrix ##P## with ##P^n = I##. Now do a non-unitary similarity transform, i.e. consider an invertible ##S## with ##S^{-1} \neq S^*##.

Now take a look at ##S^{-1}PS##. Its spectrum is the same as that of ##P## because the matrices are similar, but ##P## is normal while the new matrix (in general) is not.

Yet we still have ##P^n = I##, so $$\big(S^{-1}PS\big)^n = S^{-1}P^n S = S^{-1}IS = S^{-1}S = I.$$
Thanks, I am aware that not all solutions are rotations, by a straightforward counting argument, but I thought someone here had stated that they are rotations up to similarity.
 
  • #11
WWGD said:
Thanks, I am aware that not all solutions are rotations, by a straightforward counting argument, but I thought someone here had stated that they are rotations up to similarity.

I used "rotation" in quotes for the lack of better term, it is not always a real rotation. Your example is up to similarity what I call "rotation" see post # 8 above for the details.
 
  • #12
Hawkeye18 said:
I used "rotation" in quotes for the lack of better term, it is not always a real rotation. Your example is up to similarity what I call "rotation" see post # 8 above for the details.
Got it Hawk, thanks.
 
