Proving only 1 normalized unitary vector for normal matrix

  • Context: Undergrad
  • Thread starter: swampwiz
  • Tags: Matrix, Normal, Vector
Discussion Overview

The discussion revolves around the properties of normal matrices, specifically focusing on the uniqueness of normalized unitary vectors associated with these matrices. Participants explore the implications of eigenvector solutions, diagonalization, and the conditions under which these properties hold, including considerations of uniqueness and the role of permutation matrices.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • Some participants assert that every normal matrix has a full eigenvector solution and that there is only one normalized modal matrix, assuming unique eigenvalues to avoid degeneracy.
  • One participant questions the assertion of a single unitary matrix for diagonalization, suggesting that eigenvectors can be rescaled and that permutation matrices may also play a role in the diagonalization process.
  • There is a discussion about the definition of unitary vectors, with some participants clarifying that the term may not be appropriate and that unitary matrices are formed from mutually orthonormal vectors.
  • Concerns are raised regarding the implications of eigenvalues having geometric multiplicity greater than one, which could lead to more linearly independent eigenvectors than the dimension allows.
  • Participants discuss the normalization of the modal matrix and the potential for different arrangements of eigenvectors, particularly in the context of real versus complex eigenvalues.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the uniqueness of the normalized unitary vectors or the conditions under which they exist. Multiple competing views remain regarding the implications of eigenvector scaling and the role of permutation matrices in the diagonalization of normal matrices.

Contextual Notes

Participants note that the discussion is limited by assumptions about eigenvalue uniqueness and the implications of geometric multiplicity. The definitions and properties of unitary matrices and vectors are also points of contention.

swampwiz
AIUI, every normal matrix has a full eigenvector solution, and there is only 1 *normalized* modal matrix as the solution (let's presume unique eigenvalues so as to avoid the degenerate case of shared eigenvalues), and the columns of the modal matrix, which are the (normalized) eigenvectors, are unitary vectors. (I am presuming that there is only 1 such solution, a proof of which I don't think I am familiar with.)

But I'd like to prove this singleness from the opposite direction. I know that the sesquilinear quadratic product, for the case of unitary matrices as the side matrices, is such that the normality of the product is the same as the normality of the central matrix. Thus there must exist a unitary matrix that diagonalizes any particular normal matrix, since the sesquilinear product of a diagonal matrix and a unitary matrix (namely the inverse of the original one, which is the transpose because it is unitary, and thus unitary itself) must be normal, following the normality of a diagonal matrix (which is de facto normal). But I can't seem to prove that there exists only 1 unitary matrix that accomplishes this.

EDIT: Each column can be multiplied by either + or -, so there are ##2^n## modal matrices, but if the columns are limited such that the element that corresponds to the pivot element always has the same sign, then this situation goes away.

So I am hung up as to why there is only 1 solution (unless I am wrong about this!) for the unitary diagonalization, as well as the singleness of the eigenvector solution. Obviously, once it is proven that there is only 1 for each, then it is proven that this matrix is one and the same.
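To make the setup concrete, here is a quick numerical sketch (the ##2 \times 2## matrix is just an example I picked; I'm using SciPy's Schur decomposition, whose complex form is diagonal precisely for a normal matrix):

```python
import numpy as np
from scipy.linalg import schur

# An example normal matrix (a rotation, so A A^* = A^* A but A != A^*).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.allclose(A @ A.conj().T, A.conj().T @ A)   # A is normal

# For a normal matrix the complex Schur form T is diagonal,
# so A = U T U^* is a unitary diagonalization.
T, U = schur(A, output='complex')
assert np.allclose(T, np.diag(np.diag(T)))           # T is diagonal
assert np.allclose(U @ T @ U.conj().T, A)

# Flipping the sign of a column of U gives another valid modal matrix --
# the sign ambiguity in question.
U2 = U.copy()
U2[:, 0] *= -1
assert np.allclose(U2 @ T @ U2.conj().T, A)
```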
 
There's a lot of jargon in your post and it's not quite right. I assume we're talking about n-dimensional vectors and in general the scalars are in ##\mathbb C##.
E.g.
swampwiz said:
(namely the inverse of the original one, which is the transpose because it is unitary, and thus unitary itself) must be normal to follow the normality of a diagonal matrix (which is de facto normal).

This is technically wrong -- you mean to say conjugate transpose. It's a technical point and matters in cases like dealing with the DFT which is symmetric but not Hermitian.

swampwiz said:
AIUI, every normal matrix has a full eigenvector solution, and there is only 1 *normalized* modal matrix as the solution (let's presume unique eigenvalues so as to avoid the degenerate case of shared eigenvalues), and the columns of the modal matrix, which are the (normalized) eigenvectors, are unitary vectors. (I am presuming that there is only 1 such solution, a proof of which I don't think I am familiar with,)

I'm not sure there's such a thing as 'unitary vectors' (and while I know matrices can be treated as a vector space, that is not what we're talking about here). In general, when you have ##n## mutually orthonormal vectors, with at least one having a non-zero imaginary component, and you collect them in an ##n \times n## matrix, you call the matrix unitary.

swampwiz said:
So I am hung up as to why there is only 1 solution (unless I am wrong about this!) for the unitary diagonalization, as well as the singleness of the eigenvector solution. Obviously, once it is proven that there is only 1 for each, then it is proven that this matrix is one and the same.

I'm not really sure that there is 'one solution'. You've already identified that the eigenvectors can be rescaled. What about the use of permutation matrices?

I.e. suppose you have ##\mathbf A = \mathbf {UDU}^*##

where ##\mathbf A## is normal and has been unitarily diagonalized. Well we can also say

##\mathbf A = \mathbf {UIDIU}^* = \big(\mathbf U\mathbf P\big) \big(\mathbf P^* \mathbf D \mathbf P\big) \big(\mathbf P^* \mathbf U^*\big) = \big(\mathbf U\mathbf P\big) \big(\mathbf P^* \mathbf D \mathbf P\big) \big(\mathbf U\mathbf P\big)^*##

where ##\mathbf P## is any permutation matrix that you like. So you may infer that ordering doesn't matter.
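A quick numerical check of this reordering identity (the ##3 \times 3## matrices here are arbitrary examples, nothing canonical):

```python
import numpy as np

# Build a normal A = U D U^* from an arbitrary orthogonal U and a diagonal D.
D = np.diag([1.0, 2.0, 3.0])
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # U is orthogonal, hence unitary
A = U @ D @ U.conj().T

# A permutation matrix P (here: reverse the column order).
P = np.eye(3)[:, [2, 1, 0]]

# (U P)(P^* D P)(U P)^* is the same A with the eigenvalues reordered.
UP = U @ P
DP = P.conj().T @ D @ P
assert np.allclose(UP @ DP @ UP.conj().T, A)
assert np.allclose(np.diag(DP), [3.0, 2.0, 1.0])   # diagonal entries permuted
```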

Ok what does the set of eigenvectors look like? You've already said to assume all eigenvalues are unique, so there must be ##n## linearly independent eigenvectors, one for each eigenvalue. (Why? Note that this creates a pigeonhole problem: a basis has exactly ##n## linearly independent vectors, each with positive length. If even one of your eigenvalues had extra eigenvectors -- i.e. geometric multiplicity ##\gt 1## -- you'd have more linearly independent eigenvectors than is possible for forming a basis, since a given set of linearly independent vectors with positive length has cardinality of at most ##n##.)

So the nullspace of ##\big(\mathbf A - \lambda_k \mathbf I\big)## is, for each eigenvalue, one-dimensional: it contains the zero vector and the scalar multiples of exactly one non-zero vector. You may scale that non-zero vector as you choose. That is all.
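As a sketch of that rank/nullity claim (the Hermitian ##2 \times 2## example is mine, chosen for its distinct eigenvalues):

```python
import numpy as np

# For a normal matrix with all eigenvalues distinct, each nullspace
# N(A - lambda_k I) is one-dimensional, i.e. rank(A - lambda_k I) = n - 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # Hermitian, eigenvalues 1 and 3

for lam in (1.0, 3.0):
    M = A - lam * np.eye(2)
    assert np.linalg.matrix_rank(M) == 1   # nullspace dimension = 2 - 1 = 1
```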
 
StoneTemplePython said:
There's a lot of jargon in your post and it's not quite right. I assume we're talking about n-dimensional vectors and in general the scalars are in ##\mathbb C##.
E.g.

This is technically wrong -- you mean to say conjugate transpose. It's a technical point and matters in cases like dealing with the DFT which is symmetric but not Hermitian.

Yes, I meant conjugate transpose. What is DFT?

StoneTemplePython said:
I'm not sure there's such a thing as 'unitary vectors' (and while I know matrices can be treated as a vector space, that is not what we're talking about here). In general, when you have ##n## mutually orthonormal vectors, with at least one having a non-zero imaginary component, and you collect them in an ##n \times n## matrix, you call the matrix unitary.

Yes, I meant the eigenvectors are such that the matrix is unitary.

StoneTemplePython said:
I'm not really sure that there is 'one solution'. You've already identified that the eigenvectors can be rescaled. What about the use of permutation matrices?

I.e. suppose you have ##\mathbf A = \mathbf {UDU}^*##

where ##\mathbf A## is normal and has been unitarily diagonalized. Well we can also say

##\mathbf A = \mathbf {UIDIU}^* = \big(\mathbf U\mathbf P\big) \big(\mathbf P^* \mathbf D \mathbf P\big) \big(\mathbf P^* \mathbf U^*\big) = \big(\mathbf U\mathbf P\big) \big(\mathbf P^* \mathbf D \mathbf P\big) \big(\mathbf U\mathbf P\big)^*##

where ##\mathbf P## is any permutation matrix that you like. So you may infer that ordering doesn't matter.

The modal matrix here is normalized, so there is no arbitrary scaling other than the signs. I didn't consider the fact that the modes could be in any order, but I presumed that they would be in order as per the arithmetic value, which removes the permutation. Of course, this only works if the eigenvalues are real, since complex numbers can't be ordered as such - although they could be, by sorting by real component first, then imaginary component.

StoneTemplePython said:
Ok what does the set of eigenvectors look like? You've already said to assume all eigenvalues are unique, so there must be ##n## linearly independent eigenvectors, one for each eigenvalue. (Why? Note that this creates a pigeonhole problem: a basis has exactly ##n## linearly independent vectors, each with positive length. If even one of your eigenvalues had extra eigenvectors -- i.e. geometric multiplicity ##\gt 1## -- you'd have more linearly independent eigenvectors than is possible for forming a basis, since a given set of linearly independent vectors with positive length has cardinality of at most ##n##.)

So the nullspace of ##\big(\mathbf A - \lambda_k \mathbf I\big)## is, for each eigenvalue, one-dimensional: it contains the zero vector and the scalar multiples of exactly one non-zero vector. You may scale that non-zero vector as you choose. That is all.

Yes, right after I posted this, I thought that the fact that this all derives from the terms of the nullspace solution explains my question in some way. Thanks
 
swampwiz said:
Yes, I meant conjugate transpose. What is DFT?
Yes, I meant the eigenvectors are such that the matrix is unitary.
The modal matrix here is normalized, so there is no arbitrary scaling other than the signs. I didn't consider the fact that the modes could be in any order, but I presumed that they would be in order as per the arithmetic value, which removes the permutation. Of course, this only works if the eigenvalues are real, since complex numbers can't be ordered as such - although they could be, by sorting by real component first, then imaginary component.
Yes, right after I posted this, I thought that the fact that this all derives from the terms of the nullspace solution explains my question in some way. Thanks
DFT = Discrete Fourier Transform matrix.

It may be easier to think about this stuff in the more narrow confines of Hermitian matrices --- then you get real eigenvalues, which certainly makes ordering seem more natural. You also have the fact that for any ##\mathbb C^{n \times n}## matrix, the left eigenvectors and right eigenvectors are orthogonal if the eigenvalues are different -- i.e. for any matrix ##\mathbf A## in ##\mathbb C^{n \times n}##, you have left eigenpairs ##(\lambda_j, \mathbf v_j)## and right eigenpairs ##(\lambda_k, \mathbf x_k)##; when all eigs are unique and ##k \neq j##, then

##\big(\mathbf v_j^* \mathbf A\big) \mathbf x_k = \lambda_j \mathbf v_j^* \mathbf x_k = \lambda_k \mathbf v_j^* \mathbf x_k = \mathbf v_j^* \big(\mathbf A \mathbf x_k\big)##

but ##\lambda_j \neq \lambda_k## so ##\mathbf v_j^* \mathbf x_k =0##
- - - -
But a Hermitian matrix has the same left and right eigenvectors, hence all eigenvectors are mutually orthogonal in your matrix example with unique eigs.
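A numerical sketch of that left/right orthogonality (the non-normal ##2 \times 2## matrix is an arbitrary example; SciPy's `eig` can return both left and right eigenvectors):

```python
import numpy as np
from scipy.linalg import eig

# A non-normal matrix with distinct eigenvalues (arbitrary example).
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # eigenvalues 1 and 3

# vl[:, i] and vr[:, i] are the left and right eigenvectors for w[i];
# SciPy's convention: vl[:, i].conj().T @ A = w[i] * vl[:, i].conj().T
w, vl, vr = eig(A, left=True, right=True)

# Left/right eigenvectors belonging to DIFFERENT eigenvalues are orthogonal.
for j in range(2):
    for k in range(2):
        if j != k:
            assert abs(vl[:, j].conj() @ vr[:, k]) < 1e-10
```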
 
