# Numerical Solution to Random Linear Non-Homogeneous ODE


## Summary:

Trying to learn applied optimal estimation
Hi

I am trying to learn optimal estimation by reading Gelb's Applied Optimal Estimation, and I am having a hard time finding $\Gamma$, defined as follows:
$$\Gamma_k w_k = \int_{t_k}^{t_{k+1}} e^{F(t_{k+1} - \sigma)} G(\sigma) w(\sigma) d\sigma$$
Here F is a known matrix, as is G, and w is a random function.

Things are fine so far. Then Gelb deduces that to calculate this, I have to calculate the following:

$$\Gamma_k Q_k \Gamma_k^T= \int_{t_k}^{t_{k+1}} e^{F(t_{k+1} - \sigma)} G(\sigma) Q_k G(\sigma)^T (e^{F(t_{k+1}-\sigma)})^T d\sigma$$

Here Q is a covariance matrix, which in my case is known. So I can numerically approximate the integral, but then I am stuck. If Q = I, I could take a Cholesky decomposition of the result to recover $\Gamma$, but that only works when the integral produces a positive-definite Hermitian matrix, and in my case it does not. What other options do I have to figure out what $\Gamma$ is equal to?
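For concreteness, here is the quadrature step I have in mind, as a minimal sketch. The matrices F, G, Qk and the time interval below are placeholders for illustration, not my actual system:

```python
import numpy as np
from scipy.linalg import expm

# Placeholder system (assumed values for illustration, not my actual model)
F = np.array([[0.0, 1.0],
              [0.0, 0.0]])
G = np.array([[0.0],
              [1.0]])
Qk = np.array([[1.0]])           # known noise covariance
t_k, t_k1 = 0.0, 0.1
n = 1000                         # number of quadrature intervals

# Trapezoidal rule for the integrand Phi(s) G Qk G^T Phi(s)^T,
# where Phi(s) = exp(F (t_{k+1} - s))
sigmas = np.linspace(t_k, t_k1, n + 1)
h = (t_k1 - t_k) / n
GQG = np.zeros_like(F)
for i, s in enumerate(sigmas):
    Phi = expm(F * (t_k1 - s))
    M = Phi @ G @ Qk @ G.T @ Phi.T
    w = 0.5 * h if i in (0, n) else h
    GQG += w * M

# With Q = I and a positive-definite result, Cholesky would give a Gamma.
# It succeeds for this toy system but fails when GQG is only semidefinite.
L = np.linalg.cholesky(GQG)
```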

I am thinking of two issues I can encounter:
1) Q is not equal to the identity matrix.
2) The right-hand side of the equation does not produce a positive-definite Hermitian matrix.

What should I do?
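One workaround I am wondering about (this is my own attempt, not something from Gelb): replace Cholesky with an eigendecomposition-based symmetric square root, which tolerates a positive-semidefinite result and does not need Q = I, since Q is already folded into the integrand. A sketch:

```python
import numpy as np

def psd_sqrt(M, tol=0.0):
    """Symmetric square root of a symmetric positive-semidefinite matrix.

    Unlike Cholesky, this works when M is only semidefinite; small
    negative eigenvalues caused by round-off are clipped to zero.
    """
    Ms = 0.5 * (M + M.T)           # symmetrize against numerical error
    w, V = np.linalg.eigh(Ms)
    w = np.clip(w, tol, None)      # discard tiny negative eigenvalues
    return V @ np.diag(np.sqrt(w)) @ V.T

# Example: a rank-deficient matrix, where np.linalg.cholesky would fail
M = np.array([[1.0, 1.0],
              [1.0, 1.0]])
S = psd_sqrt(M)
print(np.allclose(S @ S, M))       # True
```

Since only $\Gamma_k Q_k \Gamma_k^T$ enters the covariance propagation, it seems to me that any factor S with $S S^T$ equal to the numerically computed integral should serve as $\Gamma_k$ (with the discrete noise covariance taken as I). Is that reasoning correct?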
