Proving that the determinant of a matrix exponential is the exponential of the trace

In summary: the identity det(e^{A})=e^{tr(A)} can be proven by writing A in Schur decomposition form and using a Taylor series expansion to show that the determinant of the matrix exponential equals the exponential of the trace of A; it can also be shown using the Jordan canonical form. The thread also discusses why sources should always be attributed rather than passed off as one's own.
  • #1
Demon117

Homework Statement


Prove that for any matrix [itex]A[/itex], the following relation is true:

[itex]det(e^{A})=e^{tr(A)}[/itex]


The Attempt at a Solution


PROOF: Let [itex]A[/itex] be in Jordan Canonical form, then

[itex]A=PDP^{-1}[/itex]

where [itex]D[/itex] is the diagonal matrix whose entries are the eigenvalues of [itex]A[/itex]. Then,

[itex]e^{A}=Pe^{D}P^{-1}[/itex]

By application of the determinant operator we have

[itex]det(e^{A})=det(Pe^{D}P^{-1})=det(P)det(e^{D})det(P^{-1})=det(e^{D})[/itex]

Since the diagonal matrix has the eigenvalues of [itex]A[/itex] along its main diagonal, it follows that the determinant of its exponential is given by

[itex]det(e^{D})=e^{\lambda_{1}} \cdot e^{\lambda_{2}} \cdots e^{\lambda_{n}}[/itex]

Since the product of exponentials is the exponential of the sum, we have

[itex]det(e^{D})=e^{\lambda_{1}+\lambda_{2}+ \cdots +\lambda_{n}}[/itex]

This of course is simply the exponential of the trace of [itex]D[/itex]. Therefore,

[itex]det(e^{A})=e^{tr(D)}[/itex]

Now, this is where I get messed up. My question is simply this: Because [itex]A[/itex] is similar to [itex]D[/itex], does it follow that the trace of [itex]D[/itex] is the same as the trace of [itex]A[/itex]? I heard this somewhere but I cannot verify this statement anywhere in my notes or textbooks. If this is a true statement, then the proof is complete.
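Before worrying about the trace question, it may help to sanity-check the identity itself numerically. Below is a minimal sketch in plain Python (the matrix A and the 60-term series cutoff are arbitrary illustrative choices, not part of the problem):

```python
import math

def mat_mul(X, Y):
    # Product of two 2x2 matrices given as nested lists.
    return [[sum(X[i][m] * Y[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(A, terms=60):
    # Truncated Taylor series exp(A) = sum_k A^k / k! for a 2x2 matrix.
    result = [[1.0, 0.0], [0.0, 1.0]]   # k = 0 term: the identity
    power = [[1.0, 0.0], [0.0, 1.0]]    # running power A^k
    fact = 1.0                          # running factorial k!
    for k in range(1, terms):
        power = mat_mul(power, A)
        fact *= k
        result = [[result[i][j] + power[i][j] / fact for j in range(2)]
                  for i in range(2)]
    return result

A = [[1.0, 2.0], [0.5, -1.0]]           # tr(A) = 0, so e^{tr(A)} = 1
E = mat_exp(A)
det_E = E[0][0] * E[1][1] - E[0][1] * E[1][0]
exp_tr = math.exp(A[0][0] + A[1][1])
print(abs(det_E - exp_tr) < 1e-9)       # the two sides agree numerically
```

Note that this A is not triangular (and not symmetric), so the check exercises the general case the proof has to cover.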
 
  • #2
matumich26 said:
Now, this is where I get messed up. My question is simply this: Because [itex]A[/itex] is similar to [itex]D[/itex], does it follow that the trace of [itex]D[/itex] is the same as the trace of [itex]A[/itex]? I heard this somewhere but I cannot verify this statement anywhere in my notes or textbooks. If this is a true statement, then the proof is complete.

Well, this was pointless. I found that similar matrices do indeed have the same trace. But does that also hold for diagonal matrices?
 
  • #3
matumich26 said:
Well, this was pointless. I found that similar matrices do indeed have the same trace. But does that also hold for diagonal matrices?

If two matrices are similar, their traces are equal whether one is diagonal or not. But that's not your biggest worry. Your proof looks fine for diagonal matrices, but not all matrices have Jordan form that is diagonal. What about them?
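To see the issue concretely, the simplest matrix whose Jordan form is not diagonal is a nilpotent Jordan block. A quick hand computation, sketched in plain Python below (the specific matrix is chosen only for illustration), shows the identity still holds for it:

```python
import math

# A classic non-diagonalizable matrix: the 2x2 nilpotent Jordan block.
A = [[0.0, 1.0],
     [0.0, 0.0]]

# Since A^2 = 0, the series exp(A) = I + A terminates exactly.
expA = [[1.0, 1.0],
        [0.0, 1.0]]

det_expA = expA[0][0] * expA[1][1] - expA[0][1] * expA[1][0]
exp_tr = math.exp(A[0][0] + A[1][1])    # tr(A) = 0, so this is 1
print(det_expA == exp_tr)
```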
 
  • #4
Dick said:
If two matrices are similar, their traces are equal whether one is diagonal or not. But that's not your biggest worry. Your proof looks fine for diagonal matrices, but not all matrices have Jordan form that is diagonal. What about them?

But every matrix can be put into Jordan normal form, correct? If so (and the statement of the problem should really say "every n x n matrix"), then the proof would not be altogether different; we could just write it in terms of the Jordan normal form?
 
  • #5
matumich26 said:
But every matrix can be put into Jordan normal form, correct? If so (and the statement of the problem should really say "every n x n matrix"), then the proof would not be altogether different; we could just write it in terms of the Jordan normal form?

No, it's not altogether different. Any matrix is similar to D+N, where D is your diagonal matrix of eigenvalues and N is strictly upper triangular. You want to show that, even with N nonzero, the determinant of exp(D+N) equals the determinant of exp(D).
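The point can be illustrated numerically: for an upper triangular D+N, the determinant of exp(D+N) depends only on the diagonal part D, not on N. A plain-Python sketch (the matrix entries and the series cutoff are arbitrary illustrative choices):

```python
import math

def expm2(T, terms=60):
    # exp of a 2x2 matrix via a truncated Taylor series (illustrative only).
    R = [[1.0, 0.0], [0.0, 1.0]]   # accumulator, starts at the identity
    P = [[1.0, 0.0], [0.0, 1.0]]   # running power T^k
    f = 1.0                        # running factorial k!
    for k in range(1, terms):
        P = [[sum(P[i][m] * T[m][j] for m in range(2)) for j in range(2)]
             for i in range(2)]
        f *= k
        R = [[R[i][j] + P[i][j] / f for j in range(2)] for i in range(2)]
    return R

T = [[0.5, 3.0],    # D has eigenvalues 0.5 and -0.2 on the diagonal;
     [0.0, -0.2]]   # N contributes the off-diagonal 3.0

E = expm2(T)
det_E = E[0][0] * E[1][1] - E[0][1] * E[1][0]
print(abs(det_E - math.exp(0.5 - 0.2)) < 1e-9)  # det sees only the diagonal
```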
 
Last edited:
  • #6
Hi matumich26! :smile:


Sorry for being a bit of a nitpicker... but...


matumich26 said:
PROOF: Let [itex]A[/itex] be in Jordan Canonical form, then

Huh? I thought A was just any matrix?


matumich26 said:
[itex]A=PDP^{-1}[/itex]

where [itex]D[/itex] is the diagonal matrix whose entries are the eigenvalues of [itex]A[/itex]. Then,

How can you make a Jordan Canonical form similar to a diagonal matrix?


matumich26 said:
[itex]e^{A}=Pe^{D}P^{-1}[/itex]

How did you get this?
It's not true.
 
  • #7
I like Serena said:
Huh? I thought A was just any matrix? [...] How can you make a Jordan Canonical form similar to a diagonal matrix? [...] How did you get this? It's not true.

I'll just ignore this because I realize the error in my assumptions, but thank you. I have a different proof that I'd like to submit for your critique.

PROOF: Let A be any matrix. By the Schur decomposition, A is unitarily similar to an upper triangular matrix; since similarity preserves both the determinant and the trace, we may assume A is upper triangular and write

[itex]A=D+N[/itex]

where D is the diagonal matrix of the eigenvalues of A, and N is a strictly upper triangular matrix. Then, by the Taylor series expansion,

[itex]e^{A}=\sum_{k=0}^{\infty} \frac{A^{k}}{k!}[/itex]

By properties of diagonal matrices and strictly upper triangular matrices we know that both [itex]ND[/itex] and [itex]DN[/itex] will also be strictly upper triangular matrices and so will their sum.

Thus the powers of [itex]A[/itex] are:

[itex]A=(D+N)=D+N_{1}[/itex]
[itex]A^{2}=(D+N)(D+N)=D^{2}+N_{2}[/itex]
[itex]\vdots[/itex]
[itex]A^{k}=D^{k}+N_{k}[/itex]
[itex]\vdots[/itex]

The [itex]N_{i}[/itex] matrices are all strictly upper triangular. Since [itex]A^{k+1}=(D+N)(D^{k}+N_{k})[/itex], the recursion formula is given by [itex]N_{k+1}=DN_{k}+N_{1}D^{k}+N_{1}N_{k}[/itex], where [itex]N_{1}=N[/itex]. We can write the exponential of the matrix as follows:

[itex]e^{A}=e^{D}+\widetilde{N}[/itex]

where the matrix [itex]\widetilde{N}=\sum \frac{N_{k}}{k!}[/itex] is strictly upper triangular. The matrix exponential [itex]e^{D}=diag(e^{\lambda_{1}},e^{\lambda_{2}},...,e^{\lambda_{n}})[/itex] where [itex]D=diag(\lambda_{1},\lambda_{2},...,\lambda_{n})[/itex].

By all of these arguments it follows that [itex]e^{A}[/itex] will be upper triangular, with the entries [itex]e^{\lambda_{i}}[/itex] along its main diagonal. Since the determinant of an upper triangular matrix is the product of its diagonal elements, it follows that

[itex]det(e^{A})=\prod_{i=1}^{n} e^{\lambda_{i}}=e^{\sum_{i=1}^{n}\lambda_{i}}=e^{tr(A)}[/itex]

Thus, the relation holds.
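The power decomposition can be checked numerically: expanding [itex]A^{k+1}=(D+N)(D^{k}+N_{k})[/itex] gives [itex]N_{k+1}=DN_{k}+ND^{k}+NN_{k}[/itex], which the plain-Python sketch below verifies for an arbitrary 2x2 example (all numbers chosen purely for illustration):

```python
def mul(X, Y):
    # Product of two 2x2 matrices as nested lists.
    return [[sum(X[i][m] * Y[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

D = [[2.0, 0.0], [0.0, 3.0]]   # diagonal part (eigenvalues)
N = [[0.0, 5.0], [0.0, 0.0]]   # strictly upper triangular part
A = add(D, N)

Nk, Dk, Ak = N, D, A           # the k = 1 case: A = D + N_1
for k in range(1, 6):
    Nk = add(add(mul(D, Nk), mul(N, Dk)), mul(N, Nk))  # N_{k+1}
    Dk = mul(Dk, D)                                    # D^{k+1}
    Ak = mul(Ak, A)                                    # A^{k+1}
    assert Ak == add(Dk, Nk)   # A^{k+1} = D^{k+1} + N_{k+1}, exactly

print("power decomposition verified up to k = 6")
```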
 
  • #8
matumich26 said:
I'll just ignore this because I realize the error in my assumptions, but thank you. I have a different proof that I'd like to submit for your critique.

PROOF: Let A be any matrix. [...] Thus, the relation holds.

Of course it's right. It's a pretty literal rewriting of the proof on the PlanetMath website. There's nothing wrong with looking something up, but you might at least want to digest it and then put it in your own words and notation. That's really too close to the exact form of what you looked up. It might make somebody think you really don't understand it and are just parroting.
 
Last edited:
  • #9
Dick said:
Of course it's right. It's a pretty literal rewriting of the proof on the PlanetMath website. There's nothing wrong with looking something up, but you might at least want to digest it and then put it in your own words and notation. That's really too close to the exact form of what you looked up.

In the words of a wise friend: even if a solution looks good, before you make use of it (i.e., quote it), make sure you have it checked by at least two good sources so that you don't look like a fool. Passing it off as my own was never my intention here, but now that I have seen the general idea I can at least use some of the ideas that I hadn't understood before. Thank you for helping me keep a level head about this, sir.
 
  • #10
matumich26 said:
In the words of a wise friend, even if a solution looks good before you make use of it (i.e. quote it) make sure you have it checked by at least two or more good sources so that you don't look like a fool. Passing it off as my own was never my intention here, but now that I have seen the general idea I can at least use some of the ideas that I hadn't understood before. But thank you for helping me keep a level head about this sir.

No problem; everybody looks ideas up when they can't solve something after thinking about it for a while. That's just research. Just use them to make your own version. For instance, if you know the Jordan form, use that. You don't have to use the Schur decomposition just because the model proof did. Use the ideas, not the literal text of the other proof.
 
  • #11
Hmm, it was not my intention to burn you down or anything. :frown:
Sorry for that.

As for the trace being the same for similar matrices:
perhaps you'd like to know that, just as the determinant of any matrix is the product of its eigenvalues, the trace is the sum of its eigenvalues.
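For a 2x2 matrix both facts drop straight out of the characteristic polynomial [itex]\lambda^{2}-tr(A)\lambda+det(A)[/itex]; here is a quick plain-Python check (the matrix is just an arbitrary example):

```python
import math

A = [[4.0, 1.0],
     [2.0, 3.0]]

tr = A[0][0] + A[1][1]                        # 7.0
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 10.0

# Roots of lambda^2 - tr*lambda + det via the quadratic formula:
disc = math.sqrt(tr * tr - 4.0 * det)         # sqrt(9) = 3
lam1, lam2 = (tr + disc) / 2.0, (tr - disc) / 2.0

print(lam1 + lam2 == tr)                 # trace = sum of eigenvalues
print(abs(lam1 * lam2 - det) < 1e-12)    # determinant = product of eigenvalues
```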
 
  • #12
I like Serena said:
[...] How did you get this? It's not true.

I was just wondering why, according to your last claim, if \begin{equation}A = PDP^{-1}\end{equation} where D is a diagonal matrix, the statement \begin{equation}e^A = Pe^DP^{-1}\end{equation} is not true. To me it seems to hold, since clearly \begin{equation}A^k = PD^kP^{-1}\end{equation} holds for any natural number k (by cancellation of all the in-between P^{-1}P pairs). Hence

\begin{equation}
e^A = \sum_{k=0}^{\infty} \frac{A^k}{k!} = \lim_{N \rightarrow \infty} \sum_{k=0}^{N} \frac{A^k}{k!} = P \left(\lim_{N \rightarrow \infty} \sum_{k=0}^{N} \frac{D^k}{k!}\right)P^{-1} = Pe^DP^{-1}.
\end{equation}
Is there anything wrong with this?
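The cancellation step is easy to check concretely; the plain-Python sketch below compares A^3 computed directly against P D^3 P^{-1} for a small hand-picked example (the matrices are illustrative only):

```python
def mul(X, Y):
    # Product of two 2x2 matrices as nested lists.
    return [[sum(X[i][m] * Y[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1.0, 1.0], [1.0, 2.0]]       # invertible, det(P) = 1
Pinv = [[2.0, -1.0], [-1.0, 1.0]]  # its exact inverse
D = [[3.0, 0.0], [0.0, 5.0]]       # diagonal

A = mul(mul(P, D), Pinv)           # A = P D P^{-1}

A3_direct = mul(mul(A, A), A)      # A^3 by repeated multiplication
D3 = [[27.0, 0.0], [0.0, 125.0]]   # D^3: just cube the diagonal
A3_similar = mul(mul(P, D3), Pinv) # P D^3 P^{-1}

print(A3_direct == A3_similar)     # the in-between P^{-1} P pairs cancel
```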
 
  • #13
Just a remark: there is no reason why D needs to be diagonal in the previous post (though of course it matters very much for the main discussion in this thread!), and that is why I chose to keep it this way. As some of you might well point out, the above calculation goes through for any pair of similar matrices.
 
  • #14
Demon117 said:
I'll just ignore this because I realize the error in my assumptions, but thank you. I have a different proof that I'd like to submit for your critique.

PROOF: Let A be any matrix. [...] Thus, the relation holds.

This is way too complicated. If J is the Jordan form and f is any analytic function, the matrix f(J) is upper-triangular with elements ##f(\lambda_i)## along the diagonal. The determinant of an upper-triangular matrix is the product of its diagonal elements.
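The point can be seen concretely on a 2x2 Jordan block with eigenvalue [itex]\lambda[/itex]: writing [itex]J=\lambda I+N[/itex] with N nilpotent gives the closed form [itex]e^{J}=e^{\lambda}(I+N)[/itex], which is upper triangular with [itex]e^{\lambda}[/itex] on the diagonal. A plain-Python sketch (the value 0.7 is arbitrary):

```python
import math

lam = 0.7                          # eigenvalue of the Jordan block
# exp(J) for J = [[lam, 1], [0, lam]]: since J = lam*I + N and N^2 = 0,
# exp(J) = e^lam * (I + N), upper triangular with e^lam on the diagonal.
e = math.exp(lam)
expJ = [[e, e],
        [0.0, e]]

det_expJ = expJ[0][0] * expJ[1][1] - expJ[0][1] * expJ[1][0]
tr_J = 2.0 * lam
print(abs(det_expJ - math.exp(tr_J)) < 1e-12)  # det(exp J) = e^{tr J}
```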
 

1. What is a matrix exponential determinant and why is it important?

The determinant of a matrix exponential is det(e^A) for a square matrix A; it equals the product of the exponentials of the eigenvalues of A. It is important because it provides information about the behavior of the matrix exponential: in particular, since e^{tr(A)} is never zero, e^A is always invertible.

2. How can I prove that the matrix exponential determinant is equal to the exponential of the trace?

One standard approach uses the facts that the trace of a matrix is the sum of its eigenvalues and the determinant is their product. By the Schur (or Jordan) decomposition, A is similar to a triangular matrix, and e^A is then triangular with the exponentials of the eigenvalues on its diagonal. The determinant of e^A is therefore the product of those exponentials, which equals the exponential of the sum of the eigenvalues, i.e. the exponential of the trace.

3. Can this proof be extended to non-square matrices?

No, this proof only applies to square matrices, because the trace, the determinant, and the matrix exponential itself are only defined for square matrices.

4. Is there a practical application of this theorem?

Yes. In the theory of linear differential equations x' = Ax, this identity underlies Liouville's formula, which expresses the determinant of a fundamental matrix (the Wronskian) in terms of the integrated trace of A. It also shows immediately that e^A is always invertible, a fact used throughout linear algebra and differential equations.

5. Are there any other related theorems that involve the matrix exponential determinant?

Yes. A closely related result is Jacobi's formula, which states that the derivative of det(B(t)) equals det(B(t)) times the trace of B(t)^{-1}B'(t); applying it to B(t) = e^{tA} gives an alternative proof that det(e^{tA}) = e^{t tr(A)}. The Cayley-Hamilton theorem, which states that a matrix satisfies its own characteristic polynomial, is another classical result connecting a matrix to its eigenvalue data.
