Eigenvalues and Eigenvectors of Non-Singular Matrices: A Proof and Explanation

  • Thread starter BiBByLin
  • Start date
  • Tags
    Matrix
In summary, for a non-singular matrix A, the product A^T A is symmetric positive definite. This can be proven using the singular value decomposition, or directly from the fact that a symmetric matrix M is positive definite if and only if x^T M x > 0 for all nonzero vectors x.
  • #1
BiBByLin
Suppose A is a non-singular matrix. A^T is the transpose of A. Therefore, the eigenvalue of A can be expressed as:
[tex]S^{-1}AS = \Lambda[/tex]
[tex](S^{-1}AS)^T = \Lambda^T = \Lambda[/tex]
[tex](AS)^T S^{-T} = S^T A^T S^{-T}[/tex]
So, A^T and A share the same eigenvalue and eigenvector.
Here, x is the base eigenvector of A. Hence, span{x} is the eigenvector of A.
[tex]A^T A x = A^T \Lambda x[/tex]
We consider the new eigenvector Λx as the eigenvector of A or A^T.
Therefore:
[tex]A^T A x = A^T \Lambda x = \Lambda\Lambda x[/tex]
So, A^T A is symmetric positive definite.

Is my proof right?

It is very urgent for my program, please help me. Thanks a lot!
 
  • #2
I can see a dozen different things wrong with it. First, you talk about "the" eigenvalue when a matrix may have many eigenvalues. Second, you say "S^{-1}AS = Λ" without saying what S or Λ are!

You say "Here, x is the base eigenvector of A. Hence, span{x} is the eigenvector of A."
What do you mean by "the base eigenvector"? span{x} is a subspace spanned by x, not a vector at all, so it cannot be "the eigenvector of A" - and A is unlikely to have only one eigenvector, so it makes no sense to talk about "the eigenvector of A".

Some of these (the use of "the" in particular) are probably just English language problems but I suspect some basic misunderstanding as well.
 
  • #3
Consider the SVD.
 
  • #4
Thanks HallsofIvy.
I just want to know whether A^T A is symmetric positive definite if A is an N×N non-singular matrix.
 
  • #5
Do you know about the singular value decomposition?
 
  • #6
No, I don't know it.

I will study it. Can you give me a hint?
 
  • #7
Hmm. If you don't know the SVD, it may be easier to prove this in a more direct manner, so keep that in mind. The SVD is a decomposition that every matrix has, of the form M = UEV^*, where U and V are unitary and E is diagonal with nonnegative entries on the diagonal in decreasing order. The SVD is unique up to phase factors in U and V, and can be made unique if a convention is chosen.

Already we see the SVD is very similar to the eigenvalue decomposition, but it is not quite the same - U and V are always unitary, and not necessarily inverses of each other, whereas in the eigenvalue decomposition (PDP^{-1}), P and P^{-1} are inverses that might not be unitary. The entries of E are nonnegative real numbers, whereas D could have negative or complex entries on the diagonal. Also, every matrix has an SVD, whereas not all matrices have an eigenvalue decomposition.

The geometric intuition behind the SVD is that a matrix maps the unit sphere to a hyperellipse. The columns of U are the principal axes of the hyperellipse, and the columns of V are the vectors on the unit sphere that get mapped to those axes. The values on the diagonal of E are the scaling factors for each ellipse axis.

To apply the SVD to your problem, try:
[tex]A^*A = (UEV^*)^*(UEV^*) = VEU^*UEV^* = VE^2V^*[/tex]

Since E^2 is diagonal and V^{-1} = V^*, this is the eigenvalue decomposition of A^*A. I will leave it to you to verify that VE^2V^* is symmetric positive definite.
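As a quick numerical sanity check (an editorial sketch, not part of the original reply; the matrix A below is a made-up example), NumPy can confirm both the identity A^*A = VE^2V^* and the positivity of the eigenvalues:

[code]
import numpy as np

# A made-up non-singular real matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# NumPy's SVD returns U, the singular values s (decreasing), and V* (as Vh).
U, s, Vh = np.linalg.svd(A)
V = Vh.conj().T

# Reassemble A*A from the SVD: A*A = V E^2 V*.
print(np.allclose(A.conj().T @ A, V @ np.diag(s**2) @ Vh))  # True

# The eigenvalues of A*A are the squared singular values, all strictly
# positive because A is non-singular (no zero singular values).
print(np.linalg.eigvalsh(A.conj().T @ A))  # ascending order
print(np.sort(s**2))                       # matches
[/code]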
 
  • #8
It's hard to tell exactly what you are assuming and what you are deducing. It's also not clear whether you are working with a vector space over the REAL field. If not, you need to use the conjugate (Hermitian) transpose below.

BiBByLin said:
Suppose A is a non-singular matrix. A^T is the transpose of A. Therefore, the eigenvalue of A can be expressed as:
[tex]S^{-1}AS = \Lambda[/tex]

This is certainly NOT true in general. It is possible if and only if A is diagonalizable, which is true if and only if there exists a full set of linearly independent eigenvectors. If you're working over the real field, there's not even any guarantee that your matrix has ANY eigenvalues or eigenvectors, even if it's nonsingular. (Example: rotation matrix.)
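To spell out that parenthetical example: the 2×2 rotation matrix

[tex]R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}[/tex]

has characteristic polynomial [tex]\lambda^2 - 2\lambda\cos\theta + 1[/tex], whose discriminant [tex]4\cos^2\theta - 4[/tex] is negative for [tex]\theta \neq 0, \pi[/tex]. So [tex]R_\theta[/tex] is non-singular (its determinant is 1) yet has no real eigenvalues at all.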

Second, your manipulation doesn't show that A and [tex]A^T[/tex] have the same eigenvectors. You seem to be assuming that S is symmetric, but why should it be?

Fortunately, your question can be answered without reference to eigenvalues or eigenvectors. A symmetric matrix (in a real vector space - otherwise replace with "Hermitian") is positive definite if and only if

[tex]x^T M x > 0[/tex] for all nonzero vectors x

Why don't you use this test directly for [tex]M = A^T A[/tex]? The answer should pop right out at you.
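(For completeness, spelling out the computation hinted at above: for any nonzero x,

[tex]x^T (A^T A) x = (Ax)^T (Ax) = \|Ax\|^2 > 0,[/tex]

since A non-singular means [tex]Ax \neq 0[/tex]; and [tex]A^T A[/tex] is symmetric because [tex](A^T A)^T = A^T (A^T)^T = A^T A[/tex].)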
 
  • #9
Great! Thank you very much, Maze and JBunniii!
 

1. What is a matrix?

A matrix is a rectangular array of numbers or symbols that are arranged in rows and columns. It is a fundamental tool used in mathematics and science for solving equations and representing data.

2. How do I solve a matrix problem?

To solve a matrix problem, you first need to identify the type of problem it is (e.g. addition, multiplication, etc.). Then you can use the appropriate operations and rules for that type of problem to manipulate the entries of the matrices and arrive at a solution.

3. What are the basic operations on matrices?

The basic operations on matrices include addition, subtraction, and multiplication. Addition and subtraction are performed by adding or subtracting corresponding entries of the matrices, while multiplication forms each entry of the product by multiplying the entries of a row of the first matrix with the corresponding entries of a column of the second and summing the products, as illustrated below.
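A minimal NumPy illustration of these three operations (the matrices here are made-up examples):

[code]
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A + B)   # elementwise addition
print(A - B)   # elementwise subtraction
print(A @ B)   # matrix product: row-by-column multiply-and-sum
[/code]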

4. How do I know if a matrix problem has a solution?

A matrix problem has a solution if the dimensions (number of rows and columns) of the matrices involved are compatible with the operation being performed. For example, to add two matrices, they must have the same number of rows and columns; to multiply them, the number of columns of the first must equal the number of rows of the second. If the dimensions are not compatible, the operation is undefined, as the sketch below shows.
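A short NumPy sketch of these compatibility rules (shapes chosen arbitrarily for illustration):

[code]
import numpy as np

A = np.zeros((2, 3))  # 2 rows, 3 columns
B = np.zeros((3, 4))  # 3 rows, 4 columns

# Multiplication is defined: inner dimensions (3 and 3) match.
print((A @ B).shape)  # (2, 4)

# Addition is not defined: the shapes differ.
try:
    A + B
except ValueError as e:
    print("incompatible shapes:", e)
[/code]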

5. What are some real-world applications of matrix problems?

Matrix problems have many real-world applications, such as in computer graphics, economics, and engineering. They can be used to represent data and perform calculations in various fields, such as image processing, financial analysis, and circuit design.
