Undergrad: Finding the Kernel of a Matrix Map

Summary
The discussion centers on finding the kernel of the matrix map defined by ##f(A) = A^n - Id##, where ##A## is a ##k \times k## matrix and ##n## is a positive integer. Participants explore various approaches, including the Fundamental Theorem of Linear Algebra and the implications of eigenvalues, particularly whether ##A## has an eigenvalue that is an ##n##th root of unity. They note that powers of ##A## can be computed efficiently by repeated squaring, which cuts the number of multiplications to ##O(\log n)##. The conversation also clarifies that not all solutions of ##A^n = Id## are rotation matrices: every solution is similar to a block diagonal "rotation" matrix, but need not itself be one. The thread concludes with an acknowledgment of the nuances involved in classifying the solutions.
WWGD
Hi All,
I am trying to see if there is a "nice" (relatively straightforward) way of finding the solution set/kernel of the map ##f(A) = A^n - Id##, where ##A## is a ##k \times k## matrix and ##n## is a positive integer. Question: what is the kernel of this map? Cranking out matrix coefficients gets messy and very slow as ##n## grows. I can think of a solution using the Fundamental Theorem of Linear Algebra, which relates the row and column spaces to their orthogonal complements. Is there some other "reasonable" way of finding all solutions? (Clearly ##Id## is always a solution, and ##-Id## is also a solution when ##n## is even.)
 
StoneTemplePython
Some more details would be of interest. Since you mention orthogonality, I'm assuming your scalars are in ##\mathbb R## or ##\mathbb C##.

What's the motivation here: are you trying to calculate a (finite) von Neumann series?

To confirm: is ##Id## the identity matrix ##I##? If so, the kernel is really a question of whether or not ##\mathbf A## has an eigenvalue that is an ##n##th root of unity (in particular, of magnitude 1), and what the geometric multiplicities of the associated eigenvectors are...

Another thought: if you want to go the direct multiplication route, there is the approach of first finding ##\mathbf A^2##, then squaring that to get ##\mathbf A^4##, then ##\mathbf A^8##, then ##\mathbf A^{16}##, and so on until you get 'close' to ##\mathbf A^n##, then cleverly finishing off the residual with the remaining binary digits of ##n##. As I recall, this cuts the number of multiplications down to ##O\big(\log n\big)## — the standard exponentiation-by-squaring trick.
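In code the idea looks something like this — a rough NumPy sketch (the helper name mat_pow is just my illustration, not a library routine):

```python
import numpy as np

def mat_pow(A, n):
    """Compute A**n with O(log n) matrix multiplications by repeated squaring."""
    result = np.eye(A.shape[0])
    base = A.copy()
    while n > 0:
        if n & 1:              # this binary digit of n contributes a factor
            result = result @ base
        base = base @ base     # A, A^2, A^4, A^8, ...
        n >>= 1
    return result

# e.g. a rotation through 2*pi/12 satisfies R^12 = I
theta = 2 * np.pi / 12
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(mat_pow(R, 12), np.eye(2)))   # True
```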
 
WWGD
StoneTemplePython said:
Since you mention orthogonality, I'm assuming your scalars are in ##\mathbb R## or ##\mathbb C##. [...] the kernel is really a question of whether or not ##\mathbf A## has an eigenvalue that is an ##n##th root of unity [...]
Yes, thanks, good points. We would need a notion of orthogonality to appeal to the Fundamental Theorem, but I think it applies only to real matrices. Still, rotation by ##2\pi/n## is a solution, despite having no real eigenvalues.
 
Hawkeye18
In the class of complex matrices all the solutions of the equation ##A^n=I## are given by ##A=SDS^{-1}##, where ##S## is an invertible matrix and ##D## is a diagonal matrix whose diagonal entries are ##n##th roots of unity, ##d_{j}^n=1##.

In the class of real matrices all the solutions are given by ##A=SRS^{-1}##, where ##S## is a real invertible matrix and ##R## is a "rotation" matrix, i.e. ##R## is a block diagonal matrix whose diagonal blocks can be either rotations through an angle ##2\pi j/n##, ##j=0, 1, 2, \ldots, n-1## (these are ##2\times 2## blocks) or just real roots of unity (##1\times 1## blocks). The real roots of unity are ##\pm 1## if ##n## is even, and ##1## if ##n## is odd.

The matrix of a rotation through the angle ##\alpha## is $$\left( \begin{array}{cc} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{array}\right).$$
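A quick numerical sanity check of the real case (my own NumPy sketch, with an arbitrarily chosen invertible ##S##):

```python
import numpy as np

n = 6
theta = 2 * np.pi / n                       # allowed angle 2*pi*j/n with j = 1
R = np.array([[np.cos(theta), -np.sin(theta), 0.],
              [np.sin(theta),  np.cos(theta), 0.],
              [0.,             0.,           -1.]])   # 1x1 block: -1 (n is even)

S = np.array([[2., 1., 0.],                 # any real invertible matrix
              [0., 1., 1.],
              [1., 0., 3.]])
A = S @ R @ np.linalg.inv(S)

print(np.allclose(np.linalg.matrix_power(A, n), np.eye(3)))   # True: A^6 = I
```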
 
WWGD
Hawkeye18 said:
In the class of real matrices all the solutions are given by ##A=SRS^{-1}##, where ##S## is a real invertible matrix and ##R## is a "rotation" matrix [...]
Thanks, Hawkeye. Are all permutation matrices then also rotation matrices? The ##2\times 2## antidiagonal matrix ##A## with ##a_{ii}=0## and ##a_{ij}=-1## for ##i \neq j## satisfies ##A^2=Id##, but it cannot be a rotation matrix, since that would require ##\sin \alpha = -\sin \alpha = -1##. Did I make a mistake? EDIT: Or is this matrix conjugate to a rotation? EDIT 2: I don't think it can be conjugate to a rotation matrix: similarity preserves the determinant, which would then have to equal ##1##, but here ##\det A = -1##.
 
Hawkeye18
WWGD said:
The antidiagonal matrix [...] satisfies ##A^2=Id## but it cannot be a rotation matrix [...] Did I make a mistake?

I believe you are talking about a ##2\times 2## matrix here. Then your antidiagonal matrix is not a rotation matrix, but it is a matrix of the form ##SDS^{-1}##, where $$D=\left(\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array} \right).$$
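For concreteness (this particular ##S## is my choice, read off from the eigenvectors ##(1,-1)^T## and ##(1,1)^T## of your matrix, with eigenvalues ##1## and ##-1## respectively):
$$\left(\begin{array}{cc} 0 & -1 \\ -1 & 0 \end{array}\right) = S D S^{-1}, \qquad S=\left(\begin{array}{cc} 1 & 1 \\ -1 & 1 \end{array}\right), \quad S^{-1}=\frac{1}{2}\left(\begin{array}{cc} 1 & -1 \\ 1 & 1 \end{array}\right).$$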
 
WWGD
Hawkeye18 said:
Then your antidiagonal matrix is not a rotation matrix, but it is a matrix of the form ##SDS^{-1}## [...]
Right, but I thought all the solutions to ##A^n = Id## consisted of rotations (up to conjugation), and the matrix I mentioned (antidiagonal entries ##-1##, ##0## otherwise) is a solution, specifically to ##A^2=Id##, but not a rotation. But I agree: if ##A## is a rotation satisfying ##A^n = Id##, then its eigenvalues must be ##n##th roots of unity.
 
Hawkeye18
Let me elaborate. In the class of real matrices, all the solutions are given by ##A=SRS^{-1}##, where ##R## is a "rotation" (note the quotes; it is not always a genuine rotation) and ##S## is invertible. "Rotation" here means that ##R## is a block diagonal matrix in which each block is either a ##2\times 2## rotation matrix (through an allowed angle ##2\pi j/n##) or a ##1\times 1## block containing a real ##n##th root of unity. I probably did not make it clear, but you do not have to have both types of blocks: your matrix can have only ##2\times 2## blocks, only ##1\times 1## blocks, or any combination of the two (as long as the block sizes add up to the dimension).

The matrix $$R=D=\left(\begin{array}{cc} 1 & 0\\ 0& -1\end{array}\right)$$ is of that type, and it is not a rotation (that is why I used the quotes). But your matrix can be represented as ##SDS^{-1}##, so your example does not contradict my statement.
 
StoneTemplePython
I think you two may be talking past each other.

WWGD said:
Right, but I thought all solutions to ##A^n =Id ## consisted of rotations

This is wrong. For example, any matrix with ##A^2 = I## is an involution. Such a matrix does not need to be normal (it is diagonalizable, but need not be unitarily diagonalizable).

Permutation matrices are a special kind of orthogonal matrix (which is a special kind of unitary matrix). Rotation matrices are also orthogonal matrices. All orthogonal matrices are normal.

Another way to think about it: pick your favorite orthogonal matrix ##P## with ##P^n = I##. Now do a non-unitary similarity transform, i.e. choose an invertible ##S## with ##S^{-1} \neq S^*##.

Now take a look at ##S^{-1}PS##. Its spectrum is the same as that of ##P##, because the matrices are similar, but ##P## is normal while the new matrix in general is not. Yet we still have ##P^n = I##, so

##\big(S^{-1}PS\big)^n =\big(S^{-1}P^n S\big) = \big(S^{-1}IS\big) = \big(S^{-1}S\big)= I##
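A quick numerical illustration (my own NumPy sketch; the particular ##P## and ##S## are arbitrary choices):

```python
import numpy as np

# P is a permutation matrix (a 3-cycle), so P^3 = I; P is orthogonal, hence normal.
P = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

S = np.array([[1., 2., 0.],    # invertible but not unitary: S^{-1} != S^*
              [0., 1., 0.],
              [0., 0., 1.]])
A = np.linalg.inv(S) @ P @ S

print(np.allclose(np.linalg.matrix_power(A, 3), np.eye(3)))   # True: still a solution
print(np.allclose(A @ A.T, A.T @ A))                          # False: A is not normal
```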
 
WWGD
StoneTemplePython said:
I think you two may be talking past each other. [...] Pick your favorite orthogonal matrix ##P##, where ##P^n = I##. Now do a non-unitary similarity transform [...]
Thanks, I am aware that not all solutions are rotations (by a straightforward counting argument), but I thought someone here had stated they are rotations up to similarity.
 
Hawkeye18
WWGD said:
Thanks, I am aware that not all solutions are rotations [...] but I thought someone here had stated they are rotations up to similarity.

I used "rotation" in quotes for the lack of better term, it is not always a real rotation. Your example is up to similarity what I call "rotation" see post # 8 above for the details.
 
WWGD
Hawkeye18 said:
I used "rotation" in quotes for lack of a better term [...]
Got it Hawk, thanks.
 
