What is the most efficient method for computing exp(tA)?


Discussion Overview

The discussion revolves around methods for computing the matrix exponential of a real matrix that lacks special structure, particularly focusing on the challenges posed by singular matrices. Participants explore various approaches, including eigenvalue decomposition and alternative methods for evaluating the matrix exponential efficiently.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks an efficient method to compute the matrix exponential of a non-special real matrix, noting it is the most time-consuming step in their program.
  • Another participant suggests using eigenvalue decomposition if eigenvalues and eigenvectors can be found efficiently, proposing that the matrix can be expressed in terms of its eigenvalues and eigenvectors.
  • A participant points out that the matrix is not invertible, which complicates the use of eigenvalue decomposition.
  • Further clarification reveals that the matrix is constructed such that its diagonal elements equal the negative sum of the off-diagonal elements in each row, confirming its singularity.
  • Some participants argue that having a zero eigenvalue does not necessarily prevent the existence of a non-singular eigenvector matrix, while others contend that it complicates the decomposition process.
  • One participant provides an example of a singular matrix, demonstrating the calculation of its eigenvalues and eigenvectors, and discusses the implications of the singularity on the decomposition.
  • Another participant raises concerns about the reliability of eigenvalue decomposition when the matrix has linearly dependent rows or columns.
  • There is a discussion about the evaluation of the matrix exponential in MATLAB, with one participant suggesting the use of the expm function instead of exp for accurate results.
  • A later post introduces the idea of periodically re-evaluating exp(tA) for different values of t and discusses methods like Lagrange interpolation and Schur decomposition for potentially improving efficiency.

Areas of Agreement / Disagreement

Participants express differing views on the implications of having a zero eigenvalue and the reliability of eigenvalue decomposition for singular matrices. The discussion remains unresolved regarding the best method for computing the matrix exponential efficiently.

Contextual Notes

Limitations include the dependence on the specific structure of the matrix and the unresolved nature of the mathematical steps involved in the proposed methods.

brydustin
I need to compute the matrix exponential of a real matrix that has no special structure (is not symmetric, positive definite, nilpotent, normal, etc...). Currently, this is the most time consuming step in my program; I need to be able to transform this matrix, I suppose, into something that can have its exponential easily computed. Thanks in advance for any ideas.
 
If you can find the eigenvalues and eigenvectors of the matrix efficiently, then you can write:
[tex] A \, U = U \, \Lambda[/tex]
where [itex]\Lambda[/itex] is a diagonal matrix containing the eigenvalues, and [itex]U[/itex] is a matrix whose columns are the corresponding eigenvectors. If the eigenvectors are linearly independent, then [itex]U^{-1}[/itex] exists, and you can write:
[tex] A = U \, \Lambda \, U^{-1} \Rightarrow \exp(A) = U \, \exp(\Lambda) \, U^{-1}[/tex]
The matrix [itex]\exp(\Lambda)[/itex] is a diagonal matrix with the diagonal elements being the exponential of the corresponding eigenvalue.

This method works for an arbitrary function of a matrix.
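The route above can be sketched in code. This is a hypothetical pure-Python illustration for the 2x2 case (not from the thread; in practice one would use a library routine such as SciPy's `expm`), and it assumes real, distinct eigenvalues and a nonzero off-diagonal entry:

```python
import math

def expm_2x2(A):
    """exp(A) for a 2x2 matrix via A = U L U^{-1}.

    Assumes real, distinct eigenvalues and A[0][1] != 0.
    """
    a, b, c, d = A[0][0], A[0][1], A[1][0], A[1][1]
    tr, det = a + d, a * d - b * c
    # Roots of the characteristic polynomial l^2 - tr*l + det = 0.
    disc = math.sqrt(tr * tr - 4.0 * det)
    l1, l2 = (tr + disc) / 2.0, (tr - disc) / 2.0
    # (A - l I) v = 0 gives the eigenvector v = (b, l - a) when b != 0.
    U = [[b, b], [l1 - a, l2 - a]]
    dU = U[0][0] * U[1][1] - U[0][1] * U[1][0]
    Uinv = [[U[1][1] / dU, -U[0][1] / dU],
            [-U[1][0] / dU, U[0][0] / dU]]
    e1, e2 = math.exp(l1), math.exp(l2)
    # exp(A) = U * diag(e^{l1}, e^{l2}) * U^{-1}, multiplied out entrywise.
    return [[U[i][0] * e1 * Uinv[0][j] + U[i][1] * e2 * Uinv[1][j]
             for j in range(2)]
            for i in range(2)]
```

For example, `expm_2x2([[1.0, 2.0], [3.0, 6.0]])` gives approximately [[157.52, 313.04], [469.56, 940.11]].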
 
Sorry, I forgot to mention that the matrix is not invertible; so this method will not work.
 
To be more specific... my matrix is such that the diagonal bits are equal to the negative of the sum of the off diagonal elements associated with that given row. That is why my matrix will never be invertible, its singular by construction. But can this construction be manipulated?
 
It doesn't matter if the matrix is singular. That only means that 0 is an eigenvalue. What you need is for the matrix whose columns are the eigenvectors of your matrix to be non-singular. This is quite different.
 
So, does your matrix satisfy:
[tex] A_{i i} = -\sum_{k \neq i}{A_{i k}}[/tex]
 
Dickfore said:
So, does your matrix satisfy:
[tex] A_{i i} = -\sum_{k \neq i}{A_{i k}}[/tex]

Yes, that is correct.

In response to your post above this, I think that it does matter that an eigenvalue is zero. If an eigenvalue is zero, there is no unique associated eigenvector (any will do). This will ruin the basis, and therefore the inverse does not exist, so there is no motivation for the equation:
i.e. A V = V Λ → A = VΛV^-1 ⇔ V^-1 exists ⇔ 0 isn't an eigenvalue... right?
 
No, not any vector is an eigenvector for the zero eigenvalue. There is a vector [itex]\vert x \rangle[/itex], such that:
[tex] A \, \vert x \rangle \neq 0[/tex]
Otherwise, the matrix is identically equal to zero. But, then, this vector is not an eigenvector.

BTW, your matrix satisfies the following:
[tex] \sum_{k}{A_{i k} \cdot 1} = 0[/tex]
so the vector [itex]\vert x_0 \rangle = \frac{1}{\sqrt{N}} \, \left( 1, 1 , \ldots, 1 \right)^{\top}[/itex] is the normalized eigenvector corresponding to the zero eigenvalue.
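This structural claim is easy to check numerically. A pure-Python sketch (an editorial illustration, not from the thread) that builds a random matrix with the stated diagonal rule and verifies that the all-ones vector is annihilated:

```python
import random

random.seed(0)  # deterministic for the check below

N = 4
A = [[random.uniform(0.0, 1.0) for _ in range(N)] for _ in range(N)]
for i in range(N):
    # A_ii = -sum_{k != i} A_ik, as described in the thread.
    A[i][i] = -sum(A[i][k] for k in range(N) if k != i)

ones = [1.0] * N
# Each component of A·(1,...,1)^T is a row sum, which is zero by construction.
Ax = [sum(A[i][k] * ones[k] for k in range(N)) for i in range(N)]
print(all(abs(v) < 1e-12 for v in Ax))
```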
 
Example of diagonalizing. Take the obviously singular 2x2 matrix:
[tex] A = \left( \begin{array}{cc} 1 & 2 \\ 3 & 6 \end{array}\right)[/tex]

The characteristic equation is:
[tex] \left\vert \begin{array}{cc} 1 - \lambda & 2 \\ 3 & 6 - \lambda \end{array}\right\vert = (1 - \lambda)(6 - \lambda) - 6 = \lambda^2 - 7 \lambda = \lambda(\lambda - 7) = 0[/tex]
Thus, one eigenvalue is [itex]\lambda_1 = 0[/itex]. This is why the determinant of the matrix is zero, i.e. it is singular. But, the eigenvector is:
[tex] \vert \lambda_1 \rangle = \left(\begin{array}{c} A_1 \\ B_1 \end{array}\right), \ A_1 + 2 B_1 = 0 \Rightarrow A_1 = -2 B_1 \Rightarrow \vert \lambda_1 \rangle = \left( \begin{array}{c} 2 \\ -1 \end{array}\right) [/tex]

The eigenvector corresponding to [itex]\lambda_2 = 7[/itex] is:
[tex] \vert \lambda_2 \rangle = \left(\begin{array}{c} A_2 \\ B_2 \end{array}\right), \ -6 A_2 + 2 B_2 = 0 \Rightarrow B_2 = 3 A_2 \Rightarrow \vert \lambda_2 \rangle = \left( \begin{array}{c} 1 \\ 3 \end{array}\right) [/tex]
Thus, the matrix U is:
[tex] U = \left(\begin{array}{cc} 2 & 1 \\ -1 & 3 \end{array}\right)[/tex]

The determinant of this matrix is [itex]2 \cdot 3 - 1 \cdot (-1) = 7 \neq 0[/itex], i.e. this matrix is non-singular! Its inverse is:
[tex] U^{-1} = \frac{1}{7} \left(\begin{array}{cc} 3 & -1 \\ 1 & 2 \end{array}\right)[/tex]
Then, you may represent the original matrix as a product (check it!):
[tex] A = \frac{1}{7} \, \left(\begin{array}{cc} 2 & 1 \\ -1 & 3 \end{array}\right) \cdot \left(\begin{array}{cc} 0 & 0 \\ 0 & 7 \end{array}\right) \cdot \left(\begin{array}{cc} 3 & -1 \\ 1 & 2 \end{array}\right)[/tex]

Then, you can easily write the exponential of the matrix as:
[tex] \exp(A) = \frac{1}{7} \, \left(\begin{array}{cc} 2 & 1 \\ -1 & 3 \end{array}\right) \cdot \left(\begin{array}{cc} e^0 & 0 \\ 0 & e^7 \end{array}\right) \cdot \left(\begin{array}{cc} 3 & -1 \\ 1 & 2 \end{array}\right)[/tex]
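The worked example above can be verified numerically. A pure-Python check (an editorial sketch, not from the thread), using the fact that for this particular matrix A² = 7A, so exp(A) = I + A(e⁷ − 1)/7:

```python
import math

U = [[2.0, 1.0], [-1.0, 3.0]]              # columns: eigenvectors for l=0, l=7
Uinv = [[3.0 / 7, -1.0 / 7], [1.0 / 7, 2.0 / 7]]
expL = [[math.exp(0.0), 0.0], [0.0, math.exp(7.0)]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

expA = matmul(matmul(U, expL), Uinv)

# Cross-check against the closed form exp(A) = I + A (e^7 - 1)/7, which
# follows from A^2 = 7 A for A = [[1, 2], [3, 6]].
c = (math.exp(7.0) - 1.0) / 7.0
ref = [[1 + 1 * c, 2 * c], [3 * c, 1 + 6 * c]]
print(all(abs(expA[i][j] - ref[i][j]) < 1e-6 for i in range(2) for j in range(2)))
```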
 
  • #10
Okay, I see what you mean. Although, if zero is an eigenvalue then the rank of A is (N-1), where N is the number of rows/columns. But one of the requirements for doing an eigenvalue decomposition is that the rank is full and that the rows/columns are linearly independent.

We take a simple example:

A = [-a,a;b,-b]. In this example, the top row is -(a/b) times the bottom row, so there is linear dependence. I'm sure this is true for the other cases (more rows/columns). Because of this, I don't see how the decomposition could be reliable.
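For what it's worth, this 2x2 case can be checked directly. The following sketch (an editorial addition with illustrative values a = 2, b = 5, not from the thread) shows that A = [-a,a;b,-b] is singular yet still has two independent eigenvectors, so the eigenvector matrix is invertible whenever a + b ≠ 0:

```python
a, b = 2.0, 5.0                      # illustrative values (assumed, a + b != 0)
A = [[-a, a], [b, -b]]

# Eigenpairs worked out by hand:
#   l1 = 0      with v1 = (1, 1)
#   l2 = -(a+b) with v2 = (a, -b)
v1, v2 = (1.0, 1.0), (a, -b)
Av1 = (A[0][0] * v1[0] + A[0][1] * v1[1], A[1][0] * v1[0] + A[1][1] * v1[1])
Av2 = (A[0][0] * v2[0] + A[0][1] * v2[1], A[1][0] * v2[0] + A[1][1] * v2[1])
l2 = -(a + b)
print(Av1 == (0.0, 0.0))                    # zero eigenvalue (A is singular)
print(Av2 == (l2 * v2[0], l2 * v2[1]))      # second eigenpair
detU = 1.0 * (-b) - a * 1.0                 # det of [v1 v2] as columns
print(detU != 0.0)                          # eigenvector matrix is invertible
```

So linear dependence among the rows of A (singularity) does not by itself prevent diagonalization; what matters is whether the eigenvectors span the space.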

------------------

Now, to try your example, (this was a print out from the MATLAB environment):

First off, the expressions don't even commute

>> inv(U)*[1,0;0,exp(7)]*U

ans =

157.5190 -469.5571
-313.0380 940.1141

>> U*[1,0;0,exp(7)]*inv(U)

ans =

157.5190 313.0380
469.5571 940.1141

>> U

U =

2 1
-1 3
>> A=[1,2;3,6]

A =

1 2
3 6
>> exp(A)

ans =

2.7183 7.3891
20.0855 403.4288

So unfortunately, it does not work.

[tex] \exp(A) \neq \frac{1}{7} \, \left(\begin{array}{cc} 2 & 1 \\ -1 & 3 \end{array}\right) \cdot \left(\begin{array}{cc} e^0 & 0 \\ 0 & e^7 \end{array}\right) \cdot \left(\begin{array}{cc} 3 & -1 \\ 1 & 2 \end{array}\right)[/tex]
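One likely explanation for the mismatch in the printout above (an editorial note, in line with the later suggestion in the thread to use `expm` rather than `exp`): MATLAB's `exp` is the *elementwise* exponential, while the matrix exponential requires `expm`, and the factors must be ordered U · exp(Λ) · U⁻¹, not U⁻¹ · exp(Λ) · U. A pure-Python sketch of the distinction:

```python
import math

A = [[1.0, 2.0], [3.0, 6.0]]

# Elementwise exponential -- this is what exp(A) prints in MATLAB.
elementwise = [[math.exp(x) for x in row] for row in A]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# True matrix exponential via the truncated Taylor series I + A + A^2/2! + ...
def expm_series(A, terms=60):
    n = len(A)
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = matmul(term, A)
        term = [[x / k for x in row] for row in term]
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

print(elementwise[0][0])     # e^1 ~ 2.7183, matching the exp(A) printout above
print(expm_series(A)[0][0])  # ~157.52, matching U * exp(L) * inv(U)
```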
 
  • #13
brydustin said:
What's a dickfore?

If you don't know, you're doin' it wrong!
 
  • #14
Dickfore said:
If you don't know, you're doin' it wrong!

Ba dum ching!
 
  • #15
I'm wondering now about the case exp(tA): is it best to re-evaluate exp(tD) at every iteration (for each given t)? Assume exp(tA) = Q exp(tD) Q^-1, where t runs from 0 to infinity.

We need to evaluate periodically until ∂exp(tA)/∂t → 0.
I've been reading through: http://www.cs.cornell.edu/cv/researchpdf/19ways+.pdf
For example, on page 17 there is an evaluation (Lagrange interpolation) which reduces the matrix problem to evaluating a different scalar for every given time (very fast). Yet it's probably not the most stable method, right? Also, later (top of p. 25) they discuss a method like the one we've been talking about, except that they use the Schur decomposition and discuss ways of finding the upper triangle in terms of the diagonal, which is a function of t...

What is the best course of action for evaluating exp(tA) for various t? Symbolic evaluation?
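One way to amortize the cost over many values of t, assuming the eigendecomposition A = U Λ U⁻¹ exists (an editorial sketch, not a definitive answer from the thread): factor once up front, after which each new t costs only the scalar exponentials e^{tλᵢ}. Shown here for the thread's example A = [[1, 2], [3, 6]]:

```python
import math

# Precomputed once: eigendecomposition of A = [[1, 2], [3, 6]].
U = [[2.0, 1.0], [-1.0, 3.0]]                  # eigenvectors for l = 0, l = 7
Uinv = [[3.0 / 7, -1.0 / 7], [1.0 / 7, 2.0 / 7]]
lams = [0.0, 7.0]

def exp_tA(t):
    # exp(tA) = U * diag(e^{t l_i}) * U^{-1}; only scalar exponentials per t.
    return [[sum(U[i][k] * math.exp(t * lams[k]) * Uinv[k][j] for k in range(2))
             for j in range(2)]
            for i in range(2)]

print(exp_tA(0.0))       # t = 0 should give the identity (up to rounding)
print(exp_tA(1.0)[0][0]) # t = 1 recovers exp(A)
```

For matrices where the eigenvector matrix is ill-conditioned, the same idea applies with a Schur decomposition A = Q T Q*, recomputing only the exponential of the (triangular) factor per t.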
 
