Numerical Solution to Random Linear Non-Homogeneous ODE

  • Thread starter Avatrin
Summary:

Trying to learn applied optimal estimation
Hi

I am trying to learn optimal estimation by reading Gelb's Applied Optimal Estimation, and I am having a hard time finding [itex]\Gamma[/itex], defined as follows:
$$ \Gamma_k w_k = \int_{t_k}^{t_{k+1}} e^{F(t_{k+1} - \sigma)} G(\sigma) w(\sigma) d\sigma$$
Here F and G are known matrices, and w is a random function.

Things are fine so far. Gelb then deduces that, to calculate this, I have to calculate the following:

$$ \Gamma_k Q_k \Gamma_k^T= \int_{t_k}^{t_{k+1}} e^{F(t_{k+1} - \sigma)} G(\sigma) Q_k G(\sigma)^T (e^{F(t_{k+1}-\sigma)})^T d\sigma$$

Here Q is a covariance matrix, which in my case is known. So I can numerically approximate the integral, but then I am stuck. If Q = I, I could use a Cholesky decomposition, but that only works when the integral produces a positive-definite Hermitian matrix, and that is not the case for me. What other options do I have to figure out what [itex]\Gamma[/itex] is equal to?
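For concreteness, here is a minimal numpy/scipy sketch of the step described above: approximate the integral with a simple midpoint quadrature, then attempt a Cholesky factorization of the result. All the matrices (F, G, Q, the interval) are made-up examples, not the ones from the book:

```python
import numpy as np
from scipy.linalg import expm

# Made-up example data (NOT from Gelb) -- just to illustrate the computation.
F = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # known dynamics matrix
G = np.eye(2)                # known noise-input matrix (constant here)
Q = np.eye(2)                # Q = I for this sketch
t_k, t_k1 = 0.0, 0.1
n = 200                      # quadrature points

# Midpoint rule for
#   int_{t_k}^{t_{k+1}} e^{F(t_{k+1}-s)} G Q G^T (e^{F(t_{k+1}-s)})^T ds
ds = (t_k1 - t_k) / n
QGamma = np.zeros_like(F)
for i in range(n):
    si = t_k + (i + 0.5) * ds
    Phi = expm(F * (t_k1 - si))          # e^{F(t_{k+1}-s)}
    QGamma += Phi @ G @ Q @ G.T @ Phi.T * ds

# If QGamma is symmetric positive definite, Cholesky gives ONE valid factor
# L with L @ L.T == QGamma.  Note Gamma is only determined up to a right
# orthogonal factor by this equation, so L is a choice, not "the" Gamma.
L = np.linalg.cholesky(QGamma)
print(np.allclose(L @ L.T, QGamma))
```

As an aside, when F, G, and Q are all constant over the interval, Van Loan's matrix-exponential method (1978) evaluates this integral exactly (to machine precision) from one `expm` of a 2n-by-2n block matrix, avoiding the quadrature entirely.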

I am thinking of two issues I can encounter:
1) Q is not equal to the identity matrix
2) The right-hand side of the equation does not produce a positive-definite Hermitian matrix.

What should I do?
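On issue 2, a common workaround (not something from Gelb, just standard numerical practice) is to replace Cholesky with an eigendecomposition-based square root: symmetrize the computed matrix, clip any small negative eigenvalues to zero, and form a factor from the eigenvectors. This also tolerates a merely positive semi-definite result, where Cholesky fails outright. A sketch, with a made-up rank-deficient example matrix:

```python
import numpy as np

def psd_factor(S, tol=0.0):
    """Return B with B @ B.T ~= S, for symmetric S that may be only
    positive SEMI-definite, or slightly indefinite from round-off."""
    S = 0.5 * (S + S.T)                  # symmetrize against round-off
    w, V = np.linalg.eigh(S)
    w = np.clip(w, tol, None)            # discard small negative eigenvalues
    return V @ np.diag(np.sqrt(w))

# Made-up rank-deficient example (eigenvalues 4 and 0):
# np.linalg.cholesky would raise LinAlgError here.
S = np.array([[2.0, 2.0],
              [2.0, 2.0]])
B = psd_factor(S)
print(np.allclose(B @ B.T, S))           # True
```

On issue 1, note that Q only ever enters the filter through the product [itex]\Gamma Q \Gamma^T[/itex], so Q ≠ I need not be handled separately: factoring the whole integral with `psd_factor` yields an effective matrix B with B Bᵀ equal to [itex]\Gamma Q \Gamma^T[/itex], which is all the covariance propagation needs.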
 
