Do Gramian Matrices Have Only One Non-Zero Eigenvalue?

  • Context: Graduate
  • Thread starter: I_am_learning
  • Tags: Eigenvalues, Matrix
SUMMARY

The discussion confirms that a Gramian matrix of the form G = x * x^T, where x is a non-zero column vector, has exactly one non-zero eigenvalue, equal to the squared norm of x, namely λ = x^T * x. Participants note that finding the eigenvalues from the characteristic polynomial is tedious even for small matrices, and that a much simpler proof exists: any eigenvector belonging to a non-zero eigenvalue must be a multiple of x itself. The conclusion is that Gramian matrices of this form consistently exhibit this eigenvalue property.

PREREQUISITES
  • Understanding of Gramian matrices and their definition
  • Knowledge of eigenvalues and eigenvectors
  • Familiarity with characteristic polynomials
  • Basic linear algebra concepts
NEXT STEPS
  • Study the properties of Gramian matrices in linear algebra
  • Learn about eigenvalue decomposition and its applications
  • Explore simpler methods for calculating eigenvalues of matrices
  • Investigate the implications of eigenvector multiplicity in linear transformations
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in the properties of matrices and eigenvalues will benefit from this discussion.

I_am_learning
If ##\mathbf x## is a column vector, then the matrix ##G = \mathbf x \mathbf x^T## is a Gramian matrix.
When I tried calculating the matrix ##G## and its eigenvalues for the cases ##\mathbf x = [x_1\ x_2]^T## and ##\mathbf x = [x_1\ x_2\ x_3]^T##
by actually working out the algebra, it turned out (if I didn't make any mistakes) that the eigenvalues are all zero except one, which equals ##x_1^2+x_2^2## or ##x_1^2+x_2^2+x_3^2##, depending on the case.
Is it a standard result that a Gramian matrix has a single non-zero eigenvalue? If yes, is there a simpler proof?
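For reference, the ##2\times 2## algebra comes out as
$$G = \begin{pmatrix} x_1^2 & x_1 x_2 \\ x_1 x_2 & x_2^2 \end{pmatrix}, \qquad \det(G - \lambda I) = (x_1^2-\lambda)(x_2^2-\lambda) - x_1^2 x_2^2 = \lambda^2 - (x_1^2+x_2^2)\lambda,$$
so the eigenvalues are ##0## and ##x_1^2+x_2^2##.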

Thank you.
 
Yes, that is a standard and pretty simple fact. How did you compute the eigenvalues? Did you compute characteristic polynomials and then find their roots? If so, then yes, there is a much simpler proof.
 
Hawkeye18 said:
Yes, that is a standard and pretty simple fact. How did you compute the eigenvalues? Did you compute characteristic polynomials and then find their roots? If so, then yes, there is a much simpler proof.
Yes, I solved for the roots of the characteristic equation, and it was a nasty business even for a 3x3 matrix. :D Would love to know the simpler method.
 
Let ##\lambda \ne 0## be an eigenvalue, and ##\mathbf v\ne\mathbf 0## be the corresponding eigenvector. That means ##G\mathbf v =\lambda\mathbf v##. But $$G \mathbf v = \mathbf x (\mathbf x^T \mathbf v)$$ and ##(\mathbf x^T \mathbf v)## is a scalar. Therefore $$(\mathbf x^T \mathbf v) \mathbf x = \lambda \mathbf v,$$ **so the eigenvector ##\mathbf v## must be a non-zero multiple of ##\mathbf x##**. Substituting ##a\mathbf x## (where ##a\ne0## is a scalar) in the above equation, we get that indeed ##a\mathbf x## is an eigenvector corresponding to ##\lambda= \mathbf x^T\mathbf x##.
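A quick numerical sanity check of this fact (just a sketch using NumPy; the example vector below is an arbitrary choice, not from the thread):

```python
import numpy as np

# Arbitrary example vector (any non-zero x gives the same picture)
x = np.array([[1.0], [2.0], [3.0]])   # column vector, shape (3, 1)

G = x @ x.T                           # G = x x^T, a 3x3 rank-one Gramian matrix

# G is symmetric, so eigvalsh returns its real eigenvalues in ascending order
eigenvalues = np.linalg.eigvalsh(G)

print(eigenvalues)                    # two eigenvalues near 0, one equal to x^T x
print((x.T @ x).item())               # x^T x = 1 + 4 + 9 = 14
```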
 
Hawkeye18 said:
Let ##\lambda \ne 0## be an eigenvalue, and ##\mathbf v\ne\mathbf 0## be the corresponding eigenvector. That means ##G\mathbf v =\lambda\mathbf v##. But $$G \mathbf v = \mathbf x (\mathbf x^T \mathbf v)$$ and ##(\mathbf x^T \mathbf v)## is a scalar. Therefore $$(\mathbf x^T \mathbf v) \mathbf x = \lambda \mathbf v,$$ **so the eigenvector ##\mathbf v## must be a non-zero multiple of ##\mathbf x##**. Substituting ##a\mathbf x## (where ##a\ne0## is a scalar) in the above equation, we get that indeed ##a\mathbf x## is an eigenvector corresponding to ##\lambda= \mathbf x^T\mathbf x##.
I cannot see why the bold part should be true.
But doing the underlined part, i.e. substituting v = ax, I can see that it gives a solution. Is it from this that you inferred that the bold part should hold?

Thank you for your help.
 
I_am_learning said:
I cannot see why the bold part should be true.
But doing the underlined part, i.e. substituting v = ax, I can see that it gives a solution. Is it from this that you inferred that the bold part should hold?

Thank you for your help.
No, the "bold" part is true independently of the "underlined" part; the two prove different parts of the statement.

For the "bold" part: we know that ##(\mathbf x^T\mathbf v)## is a number; let us call it ##\beta##. Then the equation can be rewritten as ##\beta\mathbf x = \lambda\mathbf v##, and solving it for ##\mathbf v## gives ##\mathbf v = (\beta/\lambda) \mathbf x##.

Now, the constant ##\beta## depends on the unknown ##\mathbf v##, and we do not know what ##\lambda## is, so we cannot say from here that ##\mathbf v = (\beta/\lambda) \mathbf x= a\mathbf x## is an eigenvector. But what we can say is that if ##\mathbf v## is an eigenvector corresponding to a non-zero eigenvalue ##\lambda##, then it must be a non-zero multiple of ##\mathbf x##.

Substituting ##\mathbf v=a\mathbf x## we then get that it is indeed an eigenvector and find ##\lambda##. So the "underlined" part gives you that ##\mathbf v=a\mathbf x## is an eigenvector, together with the corresponding eigenvalue. The "bold" part shows that there are no other eigenvectors corresponding to a non-zero eigenvalue.
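Spelled out, the substitution reads
$$G(a\mathbf x) = \mathbf x\,\bigl(\mathbf x^T (a\mathbf x)\bigr) = a\,(\mathbf x^T\mathbf x)\,\mathbf x = (\mathbf x^T\mathbf x)\,(a\mathbf x),$$
so ##a\mathbf x## is an eigenvector with eigenvalue ##\lambda = \mathbf x^T\mathbf x##, while any vector ##\mathbf v## orthogonal to ##\mathbf x## satisfies ##G\mathbf v = \mathbf x(\mathbf x^T\mathbf v) = \mathbf 0## and thus belongs to the eigenvalue ##0##.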
 
